Snap Ingredients • Get Recipes • Start Cooking
NUTRISNAP is a mobile application designed for students and individuals living alone who want to make the most of their available ingredients. Simply snap a photo of what you have in your kitchen, and NUTRISNAP will suggest recipes you can make, ranked by how well your ingredients match each recipe.
Built with a custom-trained YOLOv8n model for ingredient recognition and a curated database of Nepali and Western recipes, NUTRISNAP helps you discover cooking possibilities without the hassle of manual ingredient entry.
- 🔐 User Authentication - Secure login and session management with Supabase
- 📸 Ingredient Recognition - Snap photos to automatically identify ingredients
- 🎯 Smart Recipe Matching - Recipes ranked by ingredient match percentage
- 🍲 Curated Recipe Database - 30-40 Nepali and Western recipes
- ✏️ Editable Results - Refine recognized ingredients before searching
- 📋 Scan History - View your last 10-15 ingredient scans
- 📱 Cross-Platform - Works on both iOS and Android
- ⚡ Fast & Lightweight - Optimized for quick results
```mermaid
graph LR
    A[Sign In] --> B[Take Photo]
    B --> C[YOLOv8n Recognition]
    C --> D[Edit Ingredients]
    D --> E[Match Recipes]
    E --> F[View Results]
    F --> G[Start Cooking!]
```
1. **Sign In**
   - Create an account or log in with existing credentials
   - Secure authentication via Supabase
2. **Capture Ingredients**
   - Open the app and tap the camera button
   - Snap a photo of your ingredients
   - AI identifies what's in the image
3. **Review & Refine**
   - Check recognized ingredients
   - Edit or remove any incorrect items
   - Add missing ingredients if needed
4. **Get Recipe Suggestions**
   - View recipes sorted by match percentage
   - Recipes requiring your ingredients appear first
   - Partial matches shown below
5. **Cook Your Meal**
   - Follow step-by-step instructions
   - Access your scan history anytime
- React Native - Cross-platform mobile framework
- TypeScript - Type-safe development
- Expo (SDK 52) - Build and deployment tools
- React Navigation - Screen navigation
- Expo ImagePicker - Camera and gallery access
- Flask - Python web framework
- Supabase - Database and authentication service
- YOLOv8n - Custom-trained object detection model
- OpenCV - Image processing
- NumPy - Numerical computations
- Supabase - PostgreSQL database for user data, recipes, and authentication
- AsyncStorage - Local history persistence
- Custom ML Model - 13 ingredient classes (custom-annotated dataset)
- Axios - HTTP client for API requests
- RESTful API - Flask backend endpoints
- Base64 Encoding - Image transfer format
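Since photos travel to the Flask backend as base64 text inside the request body, the transfer boils down to an encode/decode round trip. A minimal sketch using only the Python standard library; the helper names and sample bytes are illustrative, not the app's actual code:

```python
import base64


def encode_image(image_bytes: bytes) -> str:
    """Encode raw image bytes as a base64 string for JSON transport (client side)."""
    return base64.b64encode(image_bytes).decode("ascii")


def decode_image(payload: str) -> bytes:
    """Decode the base64 payload back into raw image bytes (server side)."""
    return base64.b64decode(payload)


if __name__ == "__main__":
    # A stand-in for real JPEG data captured by the camera
    fake_jpeg = b"\xff\xd8\xff\xe0" + b"pixel-data"
    payload = encode_image(fake_jpeg)
    assert decode_image(payload) == fake_jpeg
    print("round trip OK")
```

Base64 inflates payloads by roughly a third compared with raw bytes, but keeps the image safely embeddable in a JSON field.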
- Node.js (v16 or higher)
- npm or yarn
- Expo CLI (`npm install -g expo-cli`)
- iOS Simulator (Mac) or Android Studio
- Python 3.11+ (for backend)
- Supabase account (for database and authentication)
1. Clone the repository

   ```bash
   git clone https://github.com/Boredoom17/SmartCooking.git
   cd SmartCooking/SmartCookingStable
   ```

2. Install dependencies

   ```bash
   npm install
   # or
   yarn install
   ```

3. Configure environment variables

   Create a `.env` file:

   ```
   API_URL=http://your-backend-url:5000
   EXPO_PUBLIC_API_URL=http://your-backend-url:5000
   SUPABASE_URL=your_supabase_project_url
   SUPABASE_ANON_KEY=your_supabase_anon_key
   ```

4. Start the app

   ```bash
   npm start
   # or
   expo start
   ```

5. Run on device
   - Scan the QR code with the Expo Go app (iOS/Android)
   - Press `i` for iOS Simulator
   - Press `a` for Android Emulator
1. Clone the backend repository

   ```bash
   git clone https://github.com/Boredoom17/smartcooking-flask-backend.git
   cd smartcooking-flask-backend
   ```

2. Create a virtual environment

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install dependencies

   ```bash
   pip install -r requirements.txt
   ```

4. Configure environment

   Create a `.env` file:

   ```
   FLASK_ENV=development
   PORT=5000
   SUPABASE_URL=your_supabase_project_url
   SUPABASE_KEY=your_supabase_service_key
   ```

5. Run the server

   ```bash
   python server.py
   ```

The backend will start on http://localhost:5000.
With Expo Go:

```bash
npm start
```

- Install Expo Go from the App Store (iOS) or Play Store (Android)
- Scan the QR code to launch the app
Note: Ensure your mobile device and development machine are on the same network.
Test if the backend is accessible:
```bash
cd smartcooking-flask-backend
python test_api.py
```

```
SmartCooking/
├── SmartCookingStable/        # Frontend application
│   ├── app/                   # App screens
│   │   ├── (tabs)/            # Tab navigation
│   │   │   ├── index.tsx      # Home/Camera screen
│   │   │   └── history.tsx    # Scan history
│   │   ├── auth/              # Authentication screens
│   │   ├── recipe/            # Recipe details
│   │   └── _layout.tsx        # Root layout
│   ├── components/            # Reusable components
│   │   ├── CameraView.tsx
│   │   ├── RecipeCard.tsx
│   │   └── IngredientList.tsx
│   ├── services/              # API integration
│   │   ├── api.ts             # Backend communication
│   │   └── supabase.ts        # Supabase client
│   ├── types/                 # TypeScript types
│   ├── assets/                # Images and resources
│   ├── app.json               # Expo configuration
│   ├── package.json           # Dependencies
│   └── .env                   # Environment config
└── README.md

smartcooking-flask-backend/
├── server.py                  # Flask application
├── test_api.py                # API testing script
├── requirements.txt           # Python dependencies
├── .env                       # Backend configuration
└── test_images/               # Sample test images
```
- Supabase Authentication - Secure user registration and login
- Session Management - Persistent user sessions across app restarts
- User Data Isolation - Each user's scan history stored separately
- Secure Token Storage - Authentication tokens encrypted locally
- Powered by YOLOv8n (You Only Look Once v8 Nano)
- Custom-trained on 13 ingredient categories
- Recognizes common Nepali and Western cooking ingredients
- Fast inference time (~200ms average)
- Bounding box visualization for detected items
Supported Ingredients (13 total): The model is trained to recognize common ingredients, such as vegetables and fruits, that are frequently used in Nepali cuisine.
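Before recipes are matched, the raw detections must be reduced to a clean ingredient list: low-confidence boxes are dropped and duplicate labels collapsed. A hedged Python sketch, assuming detections arrive as (label, confidence) pairs; the function name and the 0.5 threshold are illustrative, not the backend's actual values:

```python
def detections_to_ingredients(detections, conf_threshold=0.5):
    """Turn raw (label, confidence) detections into a deduplicated
    ingredient list, keeping each label's highest confidence."""
    best = {}
    for label, conf in detections:
        if conf >= conf_threshold:
            best[label] = max(conf, best.get(label, 0.0))
    # Highest-confidence ingredients first
    return sorted(best, key=best.get, reverse=True)


detections = [("tomato", 0.91), ("onion", 0.62), ("tomato", 0.55), ("potato", 0.31)]
print(detections_to_ingredients(detections))  # ['tomato', 'onion']
```

Deduplicating here is what lets the user-facing "Review & Refine" step show each ingredient once, regardless of how many times it appears in the photo.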
- Percentage-based ranking - Recipes sorted by ingredient match
- Flexible matching - Shows recipes even with partial ingredient lists
- Prioritized display - Best matches appear first
- Recipe database - 30-40 curated recipes (Nepali and Western fusion)
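The percentage-based ranking above can be sketched as a set intersection over ingredient names. A minimal Python illustration; the recipe data and function names are hypothetical, not the backend's actual schema:

```python
def match_percentage(recipe_ingredients, available):
    """Share of a recipe's ingredients present in the user's scan."""
    recipe = {i.lower() for i in recipe_ingredients}
    have = {i.lower() for i in available}
    return 100.0 * len(recipe & have) / len(recipe) if recipe else 0.0


def rank_recipes(recipes, available):
    """Sort recipes by descending match percentage; partial matches stay in."""
    scored = [(match_percentage(ings, available), name) for name, ings in recipes.items()]
    return sorted(scored, reverse=True)


recipes = {
    "Aloo Tama": ["potato", "bamboo shoot", "black-eyed peas"],
    "Tomato Soup": ["tomato", "onion"],
}
print(rank_recipes(recipes, ["tomato", "onion", "potato"]))
```

Keeping partial matches in the list (rather than filtering to 100% only) is what makes the app useful with a sparsely stocked kitchen.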
- Stores last 10-15 scans per user
- Quick access to previous ingredient searches
- View what ingredients were detected in each scan
- Persistent across app sessions
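The capped history behaves like a fixed-length queue: once the limit is reached, the oldest scan is silently dropped. A minimal Python sketch using `collections.deque`; the real app persists history via AsyncStorage, and `MAX_SCANS = 15` is an assumption taken from the 10-15 figure above:

```python
from collections import deque

MAX_SCANS = 15  # assumed history cap per user


def record_scan(history, ingredients):
    """Append a scan; the deque drops the oldest entry past the cap."""
    history.append(list(ingredients))


history = deque(maxlen=MAX_SCANS)
for i in range(20):
    record_scan(history, [f"ingredient-{i}"])

print(len(history))  # 15: only the most recent scans survive
print(history[0])    # ['ingredient-5'] — oldest retained scan
```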
If the backend is running on a different device or network, update the API URL.

For local network (device testing):

```
# Use your computer's local IP
API_URL=http://192.168.1.100:5000
```

For production deployment:

```
API_URL=https://your-domain.com
```

- Create a new project at supabase.com
- Create the necessary tables for users, recipes, and scan history
- Enable authentication in the Supabase dashboard
- Copy your project URL and keys to the `.env` file
The app automatically requests camera permissions on first use. Make sure to allow camera access when prompted.
The current model supports 13 ingredient categories. This is a demo version with limited ingredient recognition. The model can be expanded with additional training data to support more ingredients.
Note: Ingredient recognition accuracy depends on image quality, lighting conditions, and how clearly ingredients are visible in the photo.
- Secure Authentication - User credentials managed by Supabase with industry-standard encryption
- User Data Isolation - Each user's data (scan history, preferences) stored separately
- No cloud storage of images - Ingredient photos processed in real-time, not stored
- Local session tokens - Authentication tokens encrypted and stored on device
- No third-party tracking - No analytics or data sharing with external services
- Limited to 13 ingredient categories
- Recognition accuracy varies with lighting and image quality
- Recipe database is relatively small (30-40 recipes)
- History limited to last 10-15 scans per user
- Requires active internet connection for ingredient recognition and authentication
Version: Demo.Final (v0.9.0)
This is a demonstration version developed as a student project. The app showcases core functionality with a limited ingredient database and recipe collection. With additional resources and time, the system can be significantly expanded.
- Expand ingredient recognition to 50+ categories
- Larger recipe database (100+ recipes)
- Cloud sync for cross-device history
- Recipe ratings and favorites
- Shopping list generation
- Offline ingredient recognition
- Nutritional information
- Dietary preference filters
- Social features and recipe sharing
This is a student demonstration project. Contributions, suggestions, and feedback are welcome!
- Fork the repository
- Create a feature branch (`git checkout -b feature/improvement`)
- Commit your changes (`git commit -m "Add improvement"`)
- Push to the branch (`git push origin feature/improvement`)
- Open a Pull Request
This project is open source and available under the MIT License.
Development Team
- Aadarsha Chhetri - @Boredoom17
- Vipassi V - @Vipassi-V
Both developers worked collaboratively on the entire project, contributing to both frontend and backend development.
Repositories:
- Frontend: SmartCooking Mobile App
- Backend: SmartCooking Flask Backend
- YOLOv8 by Ultralytics for object detection framework
- Supabase for providing excellent backend infrastructure
- React Native and Expo teams for excellent mobile development tools
- Flask community for lightweight backend framework
- All testers who provided valuable feedback
Need help or found a bug?
- 📧 Open an issue on GitHub
- 📖 Check the backend repository for API documentation
- 💬 Review closed issues for common solutions
⭐ Star this repo if you found it helpful!
Made with ❤️ for smarter cooking decisions
📱 Frontend Repo • 🔧 Backend Repo • 🐛 Report Bug
Demo Version • Built by Students • Open for Expansion