An equity order matching engine simulating US stock exchange mechanics with an interactive dashboard, built in Python. Features price-time priority matching, automated trade execution, real-time market data processing, advanced analytics, and full REST API access via FastAPI.
- Order matching engine – price-time priority matching for limit and market orders
- Real-time market data – live order book updates, trade analytics, and market statistics
- Advanced analytics – VWAP calculations, trade metrics, market activity tracking, and historical data analysis
- Interactive web-based Streamlit dashboard – market monitoring, analysis, and order submission
- REST API via FastAPI – full programmatic access with comprehensive documentation
- Market simulation tool – automated order generation for demonstration purposes
- Scalable, event-driven architecture – Redis caching and Kafka messaging for real-time processing
- Containerised deployment – easy deployment and local development via Docker Compose
The order book services handle order processing, matching, and market data dissemination. A FastAPI gateway exposes them through a REST API for programmatic access.
The system consists of the following services:
- Gateway Service: REST API service that handles incoming orders and market data requests
- Matching Engine: Processes orders and executes trades using price-time priority
- Market Data Service: Manages market data dissemination and analytics
- Database: Stores order and trade history
- Streamlit UI: Interactive web dashboard providing real-time market data visualisation, order book analysis, trade history, and order submission
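Price-time priority can be illustrated with a minimal, self-contained sketch (not the engine's actual implementation): better-priced resting orders fill first, and orders at the same price fill oldest-first.

```python
import heapq
from itertools import count

seq = count()  # arrival counter: earlier orders win ties at the same price

def add_ask(book, price, qty):
    # Min-heap keyed on (price, arrival): the lowest ask sorts first, and the
    # oldest order wins among equal prices -- i.e. price-time priority.
    heapq.heappush(book, [price, next(seq), qty])

def match_market_buy(book, qty):
    """Fill an incoming market buy against resting asks; return (price, qty) fills."""
    fills = []
    while qty and book:
        best = book[0]
        traded = min(qty, best[2])
        fills.append((best[0], traded))
        qty -= traded
        best[2] -= traded
        if best[2] == 0:
            heapq.heappop(book)  # fully filled: drop from the book
    return fills

book = []
for price, size in [(10.1, 5), (10.0, 3), (10.1, 4)]:
    add_ask(book, price, size)

fills = match_market_buy(book, 6)
print(fills)  # → [(10.0, 3), (10.1, 3)]
```

Note how the buy sweeps the best price (10.0) first, then fills against the *earlier* of the two 10.1 orders, leaving the later one untouched.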
The market simulator generates synthetic order flow for the order book services to process, demonstrating how the system behaves under realistic market activity.
You can find the `MarketSimulator` class in `src/order_book_simulator/simulator/market_simulator.py`. We've also provided an example script, `examples/market_simulator_usage.py`, that shows how to use the `MarketSimulator` class to simulate market activity.
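As a rough illustration of what such a simulator does, synthetic order flow can be generated by sampling random orders around a mid price. The field names and ranges below are invented for the example and are not the project's actual order schema:

```python
import random

def generate_order(rng: random.Random, mid: float = 100.0) -> dict:
    """Produce one random order around a mid price -- the kind of synthetic
    flow a market simulator feeds into the matching engine."""
    order_type = rng.choice(["LIMIT", "MARKET"])
    order = {
        "side": rng.choice(["BUY", "SELL"]),
        "type": order_type,
        "quantity": rng.randint(1, 100),
    }
    if order_type == "LIMIT":
        # Limit prices cluster near the mid with small random offsets.
        order["price"] = round(mid + rng.uniform(-0.5, 0.5), 2)
    return order

rng = random.Random(42)  # seeded for reproducible demo runs
orders = [generate_order(rng) for _ in range(5)]
for o in orders:
    print(o)
```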
- FastAPI: REST API framework for the gateway service
- Polars: Data processing and analysis
- SQLAlchemy: ORM for the database
- Pydantic: Data modelling and validation
- Kafka: Message broker for order flow and market data
- PostgreSQL: Persistent storage for orders and trades
- Redis: Caching for real-time market data
- Streamlit: UI for the interactive web dashboard
- Docker: Containerisation and deployment
Run the following command from the project root directory:

```shell
uv sync --all-extras --dev
```

Use Docker Compose to build and run the services locally:

```shell
docker compose up --build
```

From there, you can interact with the services:
- Streamlit UI: http://localhost:8501 – Interactive dashboard for market monitoring, analysis, and user-friendly order submission
- FastAPI Documentation: http://localhost:8000/docs – REST API interface for programmatic, full-featured access to the order book services
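For programmatic access, an order can be submitted with a plain HTTP POST to the gateway. The `/orders` path and payload fields below are assumptions for illustration; consult the live docs at http://localhost:8000/docs for the actual endpoints and schema:

```python
import json
from urllib.request import Request, urlopen

def build_order_payload(ticker: str, side: str, quantity: int, price: float) -> dict:
    """Assemble a limit-order payload (field names are illustrative)."""
    return {
        "ticker": ticker,
        "side": side,
        "order_type": "LIMIT",
        "quantity": quantity,
        "price": price,
    }

def submit_order(payload: dict, base_url: str = "http://localhost:8000") -> dict:
    # POST the payload to a hypothetical /orders endpoint on the gateway.
    req = Request(
        f"{base_url}/orders",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urlopen(req) as resp:
        return json.load(resp)

payload = build_order_payload("AAPL", "BUY", 10, 189.50)
print(payload)
# submit_order(payload)  # requires the services to be running
```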
To reset the order book services, you can stop the services and remove the containers, images, and volumes:

```shell
docker compose down -v
```

To run the market simulator, you can use the following command:

```shell
uv run python examples/market_simulator_usage.py
```

The project includes two categories of benchmarks:
```shell
# Benchmark the core order book data structure (pure Python, synchronous).
python benchmarks/unit/order_book_benchmark.py

# Benchmark the full matching engine with mocked I/O.
python benchmarks/unit/matching_engine_benchmark.py

# Run all unit benchmarks together.
python benchmarks/unit/run_all.py
```

The order book benchmark measures the raw performance of the matching logic and data structures (higher throughput). The matching engine benchmark measures end-to-end throughput with mocked dependencies (lower throughput), showing async orchestration overhead.
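A unit benchmark of this kind boils down to timing a tight loop of order submissions. The harness below is a minimal sketch in that spirit, with a toy workload standing in for the real matching logic; it is not the project's actual benchmark code:

```python
import time

def run_throughput_benchmark(process_order, n_orders: int = 100_000) -> float:
    """Time n_orders calls to process_order and return orders per second."""
    start = time.perf_counter()
    for i in range(n_orders):
        process_order(i)
    elapsed = time.perf_counter() - start
    return n_orders / elapsed

# A trivial stand-in workload; swap in real matching logic to benchmark it.
book = []
def toy_process(i):
    book.append(i)
    if len(book) > 1000:
        book.clear()

throughput = run_throughput_benchmark(toy_process, 50_000)
print(f"{throughput:,.0f} orders/sec")
```

Comparing the same harness over the pure data structure versus the full engine is what separates raw matching speed from orchestration overhead.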
```shell
# Integration benchmark with real Redis and Kafka.
docker compose up -d redis kafka
sleep 5
python benchmarks/integration/integration_benchmark.py
docker compose down -v
```

The integration benchmark runs against real Redis and Kafka instances (lowest throughput). Because it includes actual I/O, it is the most realistic measure of production performance.