An advanced AI-powered driver monitoring system leveraging facial recognition and pose estimation to detect driver fatigue, distraction, and anomalies in real time. Designed to enhance road safety through proactive alerts.
🌐 Live Demo: drivesafe.manikantadarapureddy.in
- Eye Tracking & Blink Analysis: Monitors Eye Aspect Ratio (EAR) and blink patterns to detect prolonged eye closure, effectively identifying drowsiness.
- Yawn Detection: Analyzes Mouth Aspect Ratio (MAR) and yawn duration to recognize fatigue-related yawning.
- Head Pose Estimation: Tracks head movements and orientation (yaw, pitch, roll) to detect loss of focus and physical distraction.
- Object Detection: Identifies the presence of mobile phones or other objects that may cause driver distraction.
- Dynamic Risk Scoring: Provides a real-time risk assessment categorized by safety levels (Safe, Warning, Critical).
- Multi-Modal Alert System: Triggers visual, audio, and voice notifications upon detecting critical behavioral patterns.
- Real-Time Edge Processing: Runs client-side machine learning at up to 60 FPS; all inference happens in the browser, so no video data is ever transmitted to a server, keeping processing private, secure, and fast.
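As a rough illustration of how EAR-based drowsiness detection works, the ratio compares the eye's vertical openings to its horizontal width (the 6-point eye model and the sample coordinates below are illustrative assumptions, not the project's actual landmark indices):

```typescript
// Eye Aspect Ratio (EAR): vertical eye openings divided by horizontal width.
// Uses the classic 6-point eye model: p[0]/p[3] are the horizontal corners,
// (p[1], p[5]) and (p[2], p[4]) are the two vertical landmark pairs.

interface Point { x: number; y: number }

const dist = (a: Point, b: Point): number =>
  Math.hypot(a.x - b.x, a.y - b.y);

function eyeAspectRatio(p: Point[]): number {
  const vertical = dist(p[1], p[5]) + dist(p[2], p[4]);
  const horizontal = dist(p[0], p[3]);
  return vertical / (2 * horizontal);
}

// An open eye keeps the vertical distances large; closing the eye
// collapses them, driving EAR toward 0.
const openEye: Point[] = [
  { x: 0, y: 0 }, { x: 1, y: -0.6 }, { x: 2, y: -0.6 },
  { x: 3, y: 0 }, { x: 2, y: 0.6 }, { x: 1, y: 0.6 },
];
console.log(eyeAspectRatio(openEye).toFixed(2)); // ≈ 0.40, well above a 0.25-style closure threshold
```

A sustained EAR below a fixed threshold (the document cites 0.25 for over 2 seconds) is what the system interprets as prolonged eye closure.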
- Framework: Next.js & React
- Language: TypeScript
- Styling: Tailwind CSS
- Machine Learning Models: MediaPipe Tasks Vision (Face mesh & landmark detection)
- UI Components: Radix UI, Framer Motion
- Data Visualization: Recharts
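The MediaPipe Tasks Vision stack listed above is typically initialized along these lines in the browser (a minimal sketch using MediaPipe's publicly documented CDN and model URLs; the project's actual loading code may bundle assets differently):

```typescript
import { FaceLandmarker, FilesetResolver } from "@mediapipe/tasks-vision";

// Load the WASM runtime, then create a face landmarker configured for
// per-frame video detection of a single face (the driver).
async function createLandmarker(): Promise<FaceLandmarker> {
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
  );
  return FaceLandmarker.createFromOptions(vision, {
    baseOptions: {
      modelAssetPath:
        "https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task",
    },
    runningMode: "VIDEO", // per-frame detection for webcam streams
    numFaces: 1,
  });
}

// In a requestAnimationFrame loop, webcam frames are fed to the model:
//   const result = landmarker.detectForVideo(videoElement, performance.now());
// result.faceLandmarks[0] then holds the normalized facial landmarks used
// for EAR, MAR, and head-pose calculations.
```

This setup runs entirely client-side, which is what enables the private, no-server-transmission inference described above.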
To run this project locally, ensure you have the following installed:
- Node.js (v18 or newer)
- npm, yarn, or pnpm
- A modern web browser with webcam access (Google Chrome or Microsoft Edge recommended for optimal MediaPipe performance)
```bash
git clone https://github.com/chinni-d/driver_safety_ai.git
cd driver_safety_ai
npm install
npm run dev
```

Navigate to http://localhost:3000 in your browser to view the application.
To build and serve an optimized production bundle:

```bash
npm run build
npm run start
```

Project structure:

```
driver_safety_ai/
├── app/                        # Application routes and layouts
├── components/                 # Reusable React components
│   ├── ui/                     # Radix-based UI primitives
│   ├── detection-dashboard.tsx # Core real-time detection UI
│   └── analytics-dashboard.tsx # Analytics visualization UI
├── hooks/                      # Custom React hooks for state and ML logic
├── lib/                        # Utility functions and constants
└── public/                     # Static assets
```
- Navigate to the Detect page (/detect).
- Grant camera permissions when prompted by the browser.
- The system will start analyzing your facial landmarks to track:
  - Eye closure: EAR < 0.25 sustained for over 2 seconds increments the drowsiness alert counter.
  - Yawning: MAR > 0.6 triggers fatigue counters.
  - Distraction: head deviation angle > 25° triggers distraction warnings.
- Observe the dynamic risk level categorizing your focus state as Safe (0–39%), Warning (40–69%), or Critical (70–100%).
- Switch to the Analytics page (/analytics) to review session history and safety trends.
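The risk bands above map straightforwardly from a 0–100 score to a safety level (a minimal sketch; the project's actual scoring may weight the EAR, MAR, and head-pose signals before producing the score):

```typescript
type RiskLevel = "Safe" | "Warning" | "Critical";

// Map a 0–100 risk score to the bands described above:
// Safe (0–39), Warning (40–69), Critical (70–100).
function riskLevel(score: number): RiskLevel {
  if (score >= 70) return "Critical";
  if (score >= 40) return "Warning";
  return "Safe";
}

console.log(riskLevel(25)); // "Safe"
console.log(riskLevel(55)); // "Warning"
console.log(riskLevel(85)); // "Critical"
```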
This project is licensed under the MIT License.