# Technical Foundations and Modern Architectures in Neural Networks
This repository covers the theoretical and practical sides of neural networks, with a focus on both foundational models and extensible architectures. It provides:
- Mathematical and algorithmic implementations of core neural network concepts, starting with the perceptron as the fundamental building block for binary classification and linear separability.
- Hands-on code for constructing, training, and evaluating neural networks from scratch, emphasizing transparency and educational value.
- Extensible design to facilitate experimentation with learning rules, activation functions, and network topologies, supporting both research and teaching use cases.
- Modern Python packaging and testing practices, ensuring reproducibility and ease of integration into larger machine learning workflows.
The project is structured to help users understand the step-by-step mechanics of neural computation, weight updates, convergence, and the transition from single-layer to multi-layer architectures.
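As a concrete illustration of those mechanics, the classic perceptron learning rule updates each weight by w ← w + η(y − ŷ)x, where η is the learning rate, y the target, and ŷ the prediction. The following is a minimal, self-contained sketch of that rule; it is illustrative only and does not claim to mirror the exact code in `perceptron.py`:

```python
# Minimal sketch of the classic perceptron learning rule.
# Illustrative only; the repository's perceptron.py may differ in detail.

def step(z: float) -> int:
    """Heaviside step activation: fires 1 when the weighted sum is non-negative."""
    return 1 if z >= 0 else 0

def train_perceptron(X, y, learning_rate=0.1, epochs=20):
    """Train weights and a bias with the rule w <- w + lr * (target - prediction) * x."""
    n_features = len(X[0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            prediction = step(sum(w * x for w, x in zip(weights, xi)) + bias)
            error = target - prediction  # zero when the sample is already classified correctly
            weights = [w + learning_rate * error * x for w, x in zip(weights, xi)]
            bias += learning_rate * error
    return weights, bias
```

For linearly separable data, the perceptron convergence theorem guarantees this loop reaches a separating hyperplane after finitely many updates; for non-separable data it never settles, which is what motivates the multi-layer architectures mentioned above.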
Project layout:

```
deep-learning/
├── perceptron/
│   ├── __init__.py
│   ├── perceptron.py
│   ├── readme.md
│   └── single-layer.ipynb
├── tests/
│   └── perceptron/
│       └── test_perceptron.py
├── README.md
├── pyproject.toml
└── ...
```

This project uses PEP 621 metadata and Hatchling for packaging.
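The repository's actual `pyproject.toml` is not reproduced here; as a rough sketch, a PEP 621 project built with Hatchling typically declares its metadata and build backend like this (the name, version, and Python requirement below are placeholders, not the project's real values):

```toml
# Hypothetical sketch of a PEP 621 + Hatchling pyproject.toml;
# the project name and version below are placeholders.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "deep-learning"
version = "0.1.0"
requires-python = ">=3.9"
```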
To install dependencies:

```bash
uv sync
```

You can use the perceptron implementation directly in your Python code:
```python
from perceptron.perceptron import Perceptron

# Example: AND logic gate
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]

p = Perceptron(2, learning_rate=0.1)
p.train(X, y, epochs=20)

predictions = [p.predict(x) for x in X]
print(predictions)  # Output: [0, 0, 0, 1]
```
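Conversely, a single-layer perceptron cannot represent functions that are not linearly separable. Reusing the same `Perceptron` API as above (a sketch, assuming `train` simply applies the update rule for a fixed number of epochs), XOR demonstrates the limitation that motivates multi-layer architectures:

```python
from perceptron.perceptron import Perceptron

# XOR is not linearly separable, so no single-layer perceptron can fit it,
# regardless of learning rate or epoch count.
X_xor = [[0, 0], [0, 1], [1, 0], [1, 1]]
y_xor = [0, 1, 1, 0]

p_xor = Perceptron(2, learning_rate=0.1)
p_xor.train(X_xor, y_xor, epochs=100)

print([p_xor.predict(x) for x in X_xor])  # will never match [0, 1, 1, 0]
```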
To run the perceptron unit tests:

```bash
uv run pytest tests/perceptron
```
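The contents of the test suite are not shown here; as a rough sketch, a pytest-style check mirroring the AND example above might look like the following (the repository's actual `test_perceptron.py` may differ):

```python
# Hypothetical pytest-style check, mirroring the AND-gate example;
# the repository's actual tests in test_perceptron.py may differ.
from perceptron.perceptron import Perceptron

def test_perceptron_learns_and_gate():
    X = [[0, 0], [0, 1], [1, 0], [1, 1]]
    y = [0, 0, 0, 1]
    p = Perceptron(2, learning_rate=0.1)
    p.train(X, y, epochs=20)
    assert [p.predict(x) for x in X] == y
```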