Useful links:
- DockerHub: https://hub.docker.com/repositories/jadonay123
- Render Deployment: https://social-media-api-project-dglt.onrender.com
- Ubuntu web deployment: https://jadonsapi.me/docs
Reference Link for tutorial video:
Description: Followed the tutorial linked above, which gave me deep insight into the following topics:
- Software/tools
- Postman
- Postgres/pgAdmin 4
- Render
- DigitalOcean/Ubuntu
- DockerHub
- Git
- Pull/Push
- GitHub Actions
- GitHub Marketplace
- Linux Command terminal
- Command Prompt (CMD)
- NGINX
- Languages & Libraries
- Python
- Libraries: Refer to the requirements.txt file for a comprehensive overview
- Short Summary:
- FastAPI
- Pydantic
- SQLAlchemy
- jose
- pytest
- Topics that this project covered & taught me:
- Python
- FastAPI/Pydantic
- pydantic validation
- serialization/deserialization
- schemas
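To make the validation and serialization points concrete, here is a minimal Pydantic sketch; the schema and field names are illustrative, not copied from this project:

```python
from pydantic import BaseModel, ValidationError

# Illustrative request schema, similar in spirit to the project's schemas
class PostCreate(BaseModel):
    title: str
    content: str
    published: bool = True

# Deserialization: raw dict in -> validated model object
post = PostCreate(**{"title": "Hello", "content": "First post"})
print(post.published)  # default was applied

# Validation: missing/invalid fields are rejected
try:
    PostCreate(title="only a title")  # 'content' is required
except ValidationError:
    print("validation failed")

# Serialization: model object -> plain dict, ready to send as JSON
print(dict(post))
```

FastAPI uses exactly this mechanism to validate request bodies and serialize responses against the declared schema.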
- CRUD operations
- Create users/posts
- Delete users/posts
- Update users/posts
- Get users/posts
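In the project these four operations are FastAPI path operations backed by Postgres; purely as a sketch of the underlying logic, here is the same create/read/update/delete cycle against an in-memory SQLite table (table and column names are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT, content TEXT)")

# Create
cur = conn.execute(
    "INSERT INTO posts (title, content) VALUES (?, ?)", ("hello", "first post")
)
post_id = cur.lastrowid

# Read (Get)
row = conn.execute(
    "SELECT title, content FROM posts WHERE id = ?", (post_id,)
).fetchone()

# Update
conn.execute("UPDATE posts SET title = ? WHERE id = ?", ("updated title", post_id))
updated = conn.execute(
    "SELECT title FROM posts WHERE id = ?", (post_id,)
).fetchone()[0]

# Delete
conn.execute("DELETE FROM posts WHERE id = ?", (post_id,))
remaining = conn.execute("SELECT COUNT(*) FROM posts").fetchone()[0]
```

Each step maps to one HTTP verb in the API: POST, GET, PUT, and DELETE respectively.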
- ORMs
- Used an ORM to define database tables as Python classes instead of writing them by hand in SQL (refer to models.py for a comprehensive overview)
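A minimal sketch of the SQLAlchemy declarative pattern this describes; the class and column names here are illustrative, not copied from models.py:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    email = Column(String, nullable=False, unique=True)

class Post(Base):
    __tablename__ = "posts"
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)
    # Foreign key forming the relationship between posts and users
    owner_id = Column(Integer, ForeignKey("users.id", ondelete="CASCADE"))

# create_all emits the CREATE TABLE statements for you -- no manual SQL needed
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
```

In the real project the engine points at Postgres and table creation is handed off to Alembic migrations instead of `create_all`.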
- authentication
- JWT token
- token validation/creation
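The project uses the jose library for this; to make the token format concrete, here is a stdlib-only sketch of how an HS256 JWT is created and verified (the secret and claims are made up, and real code should use a vetted library like jose rather than rolling its own):

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = "illustrative-secret"  # placeholder; never hardcode secrets in real code

def b64url(data: bytes) -> str:
    # JWT uses unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def create_token(payload: dict) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(SECRET.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_token(token: str) -> bool:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(SECRET.encode(), signing_input, hashlib.sha256).digest())
    # Constant-time comparison guards against timing attacks
    return hmac.compare_digest(sig, expected)

token = create_token({"user_id": 1, "exp": int(time.time()) + 3600})
```

Any change to the payload invalidates the signature, which is what token validation checks on every authenticated request.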
- Routers
- Databases
- relational database
- Used foreign keys to form relationships between tables
- Used SQL commands to query database
- later switched to querying the database from Python using SQLAlchemy
- SQL joins
- Alembic
- Completed Alembic revisions and migrations on the Postgres database
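The project did this against Postgres; as a self-contained sketch, the same foreign-key relationship and join can be shown with SQLite (the schema here is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE posts (
        id INTEGER PRIMARY KEY,
        title TEXT,
        owner_id INTEGER REFERENCES users(id)  -- foreign key to users
    );
    INSERT INTO users VALUES (1, 'a@example.com');
    INSERT INTO posts VALUES (1, 'first', 1), (2, 'second', 1);
""")

# SQL join: follow the foreign key to pair each post with its owner
rows = conn.execute("""
    SELECT users.email, posts.title
    FROM posts
    JOIN users ON users.id = posts.owner_id
    ORDER BY posts.id
""").fetchall()
```

SQLAlchemy expresses the same join in Python (e.g. `query(...).join(...)`), which is what the project switched to later on.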
- Deployment
- Render
- Note: Used Render as a free alternative to Heroku
- Linked postgres database with Render
- started an active server which can be accessed in the links above
- DigitalOcean
- Started an Ubuntu Server
- Created a personal user to avoid using 'root'
- modified files to include environment variables
- made various changes to files to fit the API's needs
- Utilized Alembic migrations to create tables in Postgres automatically
- Used Gunicorn to create workers to boost efficiency and maximize performance
- Enabled NGINX to start web server which can be accessed in the links above
- enabled a firewall for basic security
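A typical invocation for this Gunicorn + NGINX setup looks like the following; the module path, worker count, and port are assumptions for illustration, not values taken from the project's service file:

```shell
# 4 Uvicorn workers managed by Gunicorn; NGINX proxies port 80 to this port
gunicorn -w 4 -k uvicorn.workers.UvicornWorker app.main:app --bind 0.0.0.0:8000
```

Multiple workers let the server handle requests in parallel, which is the efficiency boost described above.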
- Docker
- Created a docker-compose development & production file
- The files hold all the commands required to build the chosen image, with the given ports, environment variables, and volumes.
- Pushed the images used onto DockerHub (refer to the links above for DockerHub)
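A stripped-down sketch of what such a compose file looks like; the service names, port, and volume are placeholders rather than the project's actual values:

```yaml
# docker-compose sketch: builds the API image, maps a port, injects env vars
services:
  api:
    build: .
    ports:
      - "8000:8000"
    env_file:
      - .env
    depends_on:
      - postgres
  postgres:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=example
    volumes:
      - postgres-db:/var/lib/postgresql/data
volumes:
  postgres-db:
```

The development and production files differ mainly in whether the image is built locally or pulled from DockerHub, and in which environment variables are injected.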
- Testing
- PyTest
- Tested various functions in the users and posts files
- Utilized pytest's 'fixture' decorator & learned how to use the 'parametrize' mark
- Created a Testing environment
- Used the 'override_get_db' function to create a testing database
- Created a TestClient object for testing
- Made the client fixture depend on the session fixture, guaranteeing test-database isolation: the TestClient always uses the testing database, and tables are created and cleaned up automatically
- used the -s and -v flags during testing to see detailed error responses
- Created a conftest file to store all fixtures
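A small sketch of the two pytest features mentioned above; these examples are generic stand-ins, not tests copied from the project (where the fixtures live in conftest.py and build a TestClient against the test database):

```python
import pytest

# Fixture: reusable setup that pytest injects into any test naming it
@pytest.fixture
def sample_post():
    return {"title": "hello", "content": "world"}

def test_post_has_title(sample_post):
    assert sample_post["title"] == "hello"

# parametrize: run one test body against several input/expected pairs
@pytest.mark.parametrize("a, b, expected", [(1, 2, 3), (5, 5, 10)])
def test_addition(a, b, expected):
    assert a + b == expected
```

Running `pytest -v` reports each parametrized case separately, which pairs well with the -s/-v flags noted above for detailed output.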
- CI/CD Pipelines (GitHub Actions)
- created a .yaml file storing all commands and tools required to automate the delivery process from development to deployment
- Created a 'Build' and 'Deploy' job for development and production
- CI
- Setup environment variables in the GitHub secrets tab
- Connected the Postgres testing database with the testing portion of the integration
- stops the pipeline if tests aren't passed, preventing broken code from being deployed
- Integrated docker
- CD
- deployed changes onto ubuntu server via 'Deploy' job
- Used SSH Remote Commands by 'appleboy' found on GitHub Marketplace
- Verifies that 'Build' runs successfully before running 'Deploy', since GitHub Actions runs jobs in parallel by default
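A skeletal version of such a workflow; job names, secret names, version tags, and the deploy script are placeholders (see the repo's actual .yaml for the real pipeline):

```yaml
# Skeletal CI/CD workflow: 'deploy' waits on 'build' via `needs`
name: CI/CD
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install -r requirements.txt
      - run: pytest -v   # failing tests stop the pipeline here
  deploy:
    needs: build         # without this, GitHub Actions runs the jobs in parallel
    runs-on: ubuntu-latest
    steps:
      - uses: appleboy/ssh-action@v1
        with:
          host: ${{ secrets.PROD_HOST }}
          username: ${{ secrets.PROD_USER }}
          key: ${{ secrets.PROD_SSH_KEY }}
          script: cd app && git pull && sudo systemctl restart api
```

The `secrets.*` references are the environment variables set up in the GitHub secrets tab, and the SSH step is the appleboy SSH Remote Commands action from the GitHub Marketplace mentioned above.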