Newsletter Wizard is a tool that aggregates information from various sources such as Reddit and Discord, processes it, and uses an LLM to generate a blog-style website with curated articles.
Ensure you have the following installed:
Backend:
- Python 3.12+
- Poetry
Website (Frontend):
- Node.js 22+
- pnpm
Database:
- Docker (Recommended)
- Alternatively: PostgreSQL 17 installed directly
Environment Variables:

- Copy the example environment file:

  ```bash
  cp .env.example .env
  ```

- Edit the `.env` file with your specific configuration (API keys, database credentials, etc.). Use your preferred editor:

  ```bash
  nano .env
  ```

- Create symbolic links so the backend and frontend apps can access the central `.env` file:

  ```bash
  ln -s ../.env app/.env && ln -s ../.env backend/.env
  ```
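The actual variable names are defined in `.env.example`; purely as an illustration (every key below is a hypothetical placeholder, not necessarily what the project uses), a dotenv file for this kind of setup typically looks like:

```shell
# Hypothetical placeholders — consult .env.example for the real variable names.
DATABASE_URL=postgresql://postgres:postgres@localhost:5433/newsletter_wizard
OPENAI_API_KEY=your-openai-key
REDDIT_CLIENT_ID=your-reddit-app-id
REDDIT_CLIENT_SECRET=your-reddit-app-secret
DISCORD_TOKEN=your-discord-token
```

Because of the symbolic links created above, both `app/` and `backend/` read this same file.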
Database Setup (Choose one):

Option A: Docker (Recommended)

Run the official PostgreSQL container:

```bash
docker run -it --rm \
  --name newsletter-wizard-db \
  -p 5433:5432 \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=newsletter_wizard \
  -v ./db/data:/var/lib/postgresql/data \
  postgres:latest \
  -c log_statement=all
```

Option B: Bare Metal PostgreSQL

Ensure you have PostgreSQL 17 installed and running. Create a database and user matching the credentials specified in your `.env` file.
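If you used Option A, the container's settings map to a standard PostgreSQL connection URL; composing it explicitly is a quick sanity check (the URL format is standard PostgreSQL, but whether the app consumes one URL or separate variables depends on your `.env`):

```shell
# Values matching the Docker flags above (note the host port 5433 from -p 5433:5432)
DB_USER=postgres
DB_PASS=postgres
DB_HOST=localhost
DB_PORT=5433
DB_NAME=newsletter_wizard
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
echo "$DATABASE_URL"
```

With the `psql` client installed, `psql "$DATABASE_URL" -c 'SELECT 1;'` should succeed once the container is up.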
Install Frontend Dependencies:

```bash
cd app
pnpm install
```

Install Backend Dependencies:

```bash
cd backend
poetry install
```

Database Migration: Apply the database schema:

```bash
cd app
npx drizzle-kit migrate
```
Install DiscordChatExporter:
This tool is required for extracting Discord messages.
- Navigate to the DiscordChatExporter Releases Page.
- Download the appropriate version for your Operating System (e.g., `DiscordChatExporter.Cli.linux-x64.zip` for Linux).
- Extract the downloaded archive.
- Rename the extracted folder to `discord_chat_exporter`.
- Move this `discord_chat_exporter` folder into the `backend` directory of the project.
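The backend is assumed to invoke the CLI from this folder. As an illustration of what such a call looks like (the `export` subcommand and the `-t`/`-c`/`-f` flags are real DiscordChatExporter.Cli options, but the token and channel ID below are placeholders, and the exact invocation the backend uses may differ), this dry run only prints the command:

```shell
# Dry run: print a typical DiscordChatExporter.Cli invocation without executing it.
TOKEN="your-bot-token"   # placeholder — use your real Discord token
CHANNEL="123456789"      # placeholder channel ID
CMD="./discord_chat_exporter/DiscordChatExporter.Cli export -t $TOKEN -c $CHANNEL -f Json"
echo "$CMD"
```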
Initialize Newsletter Configuration (One-time setup): Run the initialization script located in the backend:

```bash
# Ensure you are in the backend directory or provide the full path
python backend/dev/init_db.py
```

Extract Newsletter Sources: Execute the extraction script:

```bash
# Ensure you are in the backend directory or provide the full path
python backend/libs/extract_all_newsletters.py
```

Generate Newsletter Content: Run the generation script:

```bash
# Ensure you are in the backend directory or provide the full path
python backend/libs/newsletter_generator.py
```
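The three backend steps above run in a fixed order: initialize once, then extract, then generate. A small helper script, sketched here as a dry run (it is not part of the repository), makes that ordering explicit:

```shell
# Dry run: print the three backend steps in the order they must run.
# Uncomment the python line to actually execute each one.
BACKEND_DIR="backend"
for script in dev/init_db.py libs/extract_all_newsletters.py libs/newsletter_generator.py; do
  echo "python $BACKEND_DIR/$script"
  # python "$BACKEND_DIR/$script"
done
```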
Start Frontend Development Server: Navigate to the `app` directory and start the development server:

```bash
cd app
pnpm dev
```

The website should now be accessible, typically at `http://localhost:5173` (or the port configured by Vite/your setup).

Useful Links:
- LLM Integration: OpenAI API Reference (Used for various LLM interactions)
- Frontend Routing: React Router v7
- Frontend File-Based Routing: React Router File Route Conventions
- Frontend UI Components: Shadcn/ui
- CSS Framework: Tailwind CSS
- Database ORM: Drizzle ORM
- Reddit Data Extraction: PRAW (Python Reddit API Wrapper)
- Discord Data Extraction: DiscordChatExporter Documentation
Configure Production Environment: Copy the example environment file specifically for production:

```bash
cp .env.example .env
```

Edit the `.env` file with your production database credentials, API keys, and other settings:

```bash
nano .env
```
Start Services: From the root directory of the project, use Docker Compose to build and start the frontend container in detached mode:

```bash
docker compose up -d
```

This will start the frontend service and Caddy web server as defined in your `docker-compose.yml` file.
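The authoritative service definitions live in the repository's `docker-compose.yml`. For orientation only, here is a minimal sketch of what a frontend-plus-Caddy compose file generally looks like — the service names, build context, and volume paths are illustrative assumptions, not the project's actual configuration:

```yaml
# Illustrative sketch — the real configuration lives in docker-compose.yml.
services:
  frontend:
    build: ./app              # assumed build context for the frontend
    restart: unless-stopped
  caddy:
    image: caddy:latest
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile   # assumed reverse-proxy config
    restart: unless-stopped
```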
The newsletter generation is automated using GitHub Actions. The workflow runs daily at 13:00 UTC and can also be triggered manually.
- Fork this repository
- Go to your repository's Settings > Secrets and variables > Actions
- Copy the workflow:

  ```bash
  mkdir -p .github/workflows/ && cp newsletter_workflow.example.yml .github/workflows/newsletter.yml
  ```

- Add all the required secrets from the `.env.example` file in the root of the project
- The workflow will automatically run daily at 13:00 UTC
- You can also trigger it manually from the Actions tab
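The shipped `newsletter_workflow.example.yml` is the source of truth; the scheduling behavior described above corresponds to a trigger block like this sketch (only the triggers are implied by the text — any job steps would be additional assumptions):

```yaml
# Sketch of the triggers implied above: daily at 13:00 UTC plus manual runs.
on:
  schedule:
    - cron: "0 13 * * *"   # 13:00 UTC every day
  workflow_dispatch: {}     # enables manual runs from the Actions tab
```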