- Click on the "Use this template" button and then "Create a new repository" in your GitHub account for submission.
- Add one of the following open source licenses to your submission repository: MIT, Apache 2.0, or BSD 3-Clause.
- Once your repository is ready for evaluation, send an email to ennovatex.io@samsung.com with the subject "AI Challenge Submission - Team name". The body of the email must contain only the team name, team leader name, and your GitHub project repository link.
- All submission project materials outlined below must be added to the GitHub repository; nothing should be attached to the submission email.
- In case of any query, please feel free to reach out to us at ennovatex.io@samsung.com.
| Project Aspect | Weight |
|---|---|
| Novelty of Approach | 25% |
| Technical implementation & Documentation | 25% |
| UI/UX Design or User Interaction Design | 15% |
| Ethical Considerations & Scalability | 10% |
| Demo Video (10 mins max) | 25% |
-------------------------- Your Project README.md should start from here -----------------------------
- Problem Statement - Classify User Application Traffic at the Network in a Multi-UE Connected Scenario
- Team Name - Palt
- Team Leader - Pulast S Tiwari
- Team Members - Sarthak Vats, Yash Kumar
Sentinel-QoS is an advanced network traffic classification system that combines machine learning and large language models to provide real-time quality of service monitoring and intelligent traffic analysis in multi-user environments.
- Hybrid AI engine: A low-latency LightGBM classifier handles most traffic; ambiguous cases are escalated to an LLM for richer contextual classification and human-readable explanations.
- Live observability: Real-time charts, per-flow metrics, and alerting with playback for recent events.
- Explainable investigations: For each incident, receive a concise summary, confidence score, and recommended policy changes or QoS adjustments.
- Scenario modeling: Simulate traffic shifts, policy changes, or failure modes to predict downstream effects on SLA and throughput.
- Production ready: Containerized services, docker-compose examples, health endpoints, and logging integrations for easy ops onboarding.
- Developer friendly: TypeScript/Next.js frontend, FastAPI backend, optional Hugging Face model integration, and dataset publishing scripts.
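The fast-path/escalation split behind the hybrid AI engine can be sketched in a few lines. This is an illustrative sketch, not the project's actual code: the 0.85 confidence threshold and the stub functions (`fast_classify`, `classify_with_llm`) are assumptions for demonstration only.

```python
# Hybrid routing sketch: a fast classifier handles confident predictions;
# ambiguous flows are escalated to an LLM for contextual classification.
# The threshold value and both stub functions are hypothetical.

CONFIDENCE_THRESHOLD = 0.85  # assumed cut-off, tuned per deployment


def fast_classify(flow_features: dict) -> tuple[str, float]:
    """Stand-in for the LightGBM fast path: returns (label, confidence).

    A real implementation would call model.predict_proba(features).
    """
    score = flow_features.get("score", 0.0)
    return ("video_streaming", score)


def classify_with_llm(flow_features: dict) -> str:
    """Stand-in for the LLM fallback (e.g. a call to an Ollama server)."""
    return "video_streaming (LLM-verified)"


def route_flow(flow_features: dict) -> str:
    label, confidence = fast_classify(flow_features)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label  # low-latency fast path
    return classify_with_llm(flow_features)  # escalate the ambiguous case


print(route_flow({"score": 0.95}))  # confident flow stays on the fast path
print(route_flow({"score": 0.40}))  # ambiguous flow is escalated
```

The key design point is that the expensive LLM call only happens below the confidence threshold, so the common case keeps classifier-level latency.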
- Technical Documentation - `docs/` - Comprehensive technical documentation including system architecture, API documentation, and deployment guides
- Source Code - `src/` - Complete source code with organized backend and frontend components
- Backend API - `src/backend/` - FastAPI server with ML model integration
- Frontend Dashboard - `src/frontend/` - Next.js React application with real-time monitoring
- Machine Learning Models - Hugging Face model repository
- Published Models - Sentry model on Hugging Face
- Training Data - `training_data.csv` - Synthetic network traffic dataset for model training, published on Hugging Face as `Pulast/sentry_training_data`
- Model Artifacts - `models/` - Trained model files and publishing scripts
- Deployment Configuration - Docker Compose files, Vercel configuration, and CI/CD workflows
- Python 3.8+
- Node.js 18+
- pnpm (recommended) or npm
- Navigate to the backend directory and set up the environment:

  ```bash
  cd src/backend
  python3 -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  pip install -r requirements.txt
  ```

- Start the FastAPI backend:

  ```bash
  python -m uvicorn orchestrator:app --reload --port 8000
  ```

- Navigate to the frontend directory and install dependencies:

  ```bash
  cd src/frontend
  pnpm install  # or npm install
  ```

- Start the development server:

  ```bash
  pnpm dev  # or npm run dev
  ```

- Open http://localhost:3000 in your browser.
For easy deployment, use Docker Compose:

```bash
# Development
docker-compose -f docker-compose.dev.yml up --build

# Production
docker-compose up --build
```

Create a `.env` file in the root directory based on `.env.example`:

```bash
cp .env.example .env
# Edit .env with your configuration
```

Key environment variables:

- `NEXT_PUBLIC_API_URL` - Backend API URL (default: http://localhost:8000)
- `ADMIN_USERNAME` - Admin dashboard username
- `ADMIN_PASSWORD` - Admin dashboard password
- `LLM_MODEL` - Ollama model for LLM classification
- `LLM_BASE_URL` - Ollama server URL
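As an illustration, a filled-in `.env` might look like the following. All values here are placeholders, not defaults shipped with the project; the Gemma model tag and the Ollama URL (Ollama's standard port is 11434) are assumptions for demonstration.

```env
NEXT_PUBLIC_API_URL=http://localhost:8000
ADMIN_USERNAME=admin
ADMIN_PASSWORD=change-me
LLM_MODEL=gemma:2b
LLM_BASE_URL=http://localhost:11434
```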
You can load the published dataset directly from Hugging Face in a few ways.

Using the Hugging Face `datasets` library (recommended):

```python
from datasets import load_dataset

# Load the dataset as a DatasetDict and convert the train split to pandas
ds = load_dataset("Pulast/sentry_training_data")
df = ds["train"].to_pandas()
```

Or download the raw CSV (useful for scripts):

```python
import requests

url = "https://huggingface.co/datasets/Pulast/sentry_training_data/resolve/main/training_data.csv"
resp = requests.get(url)
resp.raise_for_status()
with open("training_data.csv", "wb") as f:
    f.write(resp.content)
```

- FastAPI - High-performance async API framework
- LightGBM - Primary traffic classification model (Sentry)
- Ollama + Gemma - LLM fallback system (Vanguard) for complex cases
- Scikit-learn - Data preprocessing and model pipeline
- Pandas - Data manipulation and analysis
- Next.js 14 - React framework with app router
- TypeScript - Type-safe development
- Tailwind CSS - Modern utility-first styling
- Shadcn/ui - Beautiful and accessible UI components
- Chart.js - Interactive data visualization
- Docker & Docker Compose - Containerized deployment
- Vercel - Frontend hosting with CI/CD
- GitHub Actions - Automated testing and deployment
- Contributing - Please read CONTRIBUTING.md for contribution guidelines and CODE_OF_CONDUCT.md for community expectations.
This project is licensed under the MIT License - see the LICENSE file for details.
This project demonstrates a novel hybrid AI approach to network traffic classification, combining the speed and efficiency of traditional ML models with the reasoning capabilities of large language models. The system is designed for real-world deployment with comprehensive monitoring, intuitive UI/UX, and ethical AI considerations including model transparency and fair resource allocation.
- Team Palt
- Team Leader: Pulast S Tiwari
- Members: Sarthak Vats, Yash Kumar
- Repository: github.com/PulastTiwari/sample-template-samsung-ennovatex