NumPy_ML is a personal education and exploration project that implements a grammar of deep-learning tools.
- Built from scratch
- 100% NumPy
- Extensible architecture with Auto-Grad
- Simple and modular
- Expressive, type-hinted, human-readable code
Please see the MNIST demo as a 'Tutorial by Example'.
The code is useful for ML students who want to see essential ANN algorithms implemented from scratch.
Algorithms such as:
- 2D Convolution (Numba accelerated)
- Softmax
- Backpropagation / Chain Rule
- Adam Optimisation
- Auto-Initialisation (e.g. ReLU -> Kaiming; Softmax -> Xavier)
- Cross-Entropy Loss
- Confusion Matrix
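As a flavour of what the list above covers, here is a minimal NumPy sketch of softmax followed by cross-entropy loss. The function names and shapes are illustrative, not this project's actual API:

```python
import numpy as np

def softmax(z):
    # Shift by the row max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, targets, eps=1e-12):
    # Mean negative log-likelihood of the one-hot targets.
    return -np.mean(np.sum(targets * np.log(probs + eps), axis=-1))

logits = np.array([[2.0, 1.0, 0.1]], dtype=np.float32)
targets = np.array([[1.0, 0.0, 0.0]], dtype=np.float32)

probs = softmax(logits)
loss = cross_entropy(probs, targets)

# The combined softmax + cross-entropy gradient w.r.t. the logits
# simplifies to (probs - targets), the form used in backpropagation.
grad = probs - targets
```

This simplification of the backward pass is one reason softmax and cross-entropy are usually fused into a single layer.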
Network operations are float32-based. The design is suited to quick prototyping and deployment of small networks of roughly 12-16 layers at most, unless you are good at keeping gradients alive (which may require a non-sequential architecture).
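Keeping gradients alive is where the auto-initialisation rule above (ReLU -> Kaiming; Softmax -> Xavier) earns its keep. A minimal sketch of the idea, with illustrative names rather than this library's API: scaling weights by sqrt(2 / fan_in) keeps the second moment of ReLU activations roughly constant across layers instead of collapsing.

```python
import numpy as np

rng = np.random.default_rng(0)

def kaiming(fan_in, fan_out):
    # He/Kaiming: std = sqrt(2 / fan_in), suited to ReLU layers.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, (fan_in, fan_out)).astype(np.float32)

def xavier(fan_in, fan_out):
    # Glorot/Xavier: std = sqrt(2 / (fan_in + fan_out)), suited to
    # softmax / tanh output layers.
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, (fan_in, fan_out)).astype(np.float32)

# Push a batch through a 12-layer ReLU stack: the activation scale stays
# of order one rather than vanishing or exploding, which is what keeps
# gradients usable at this depth.
x = rng.normal(size=(256, 128)).astype(np.float32)
for _ in range(12):
    x = np.maximum(x @ kaiming(128, 128), 0.0)
```

With a naive small constant std instead, the activations (and hence the gradients) shrink geometrically with depth, which is the failure mode the 12-16 layer guideline is hedging against.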
The full project was originally intended for non-linear deep reinforcement-learning workflows, so it is somewhat over-engineered for its current capabilities, but the essential roadmap is all laid out should I ever revisit this old project. There are, of course, many more features I would have liked to add.

Please see pyproject.toml for build requirements. Note that Torchvision is a dependency of this project only for convenience and reproducibility when fetching the MNIST dataset for the MNIST classification demo.
All work within this repository is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license
(see license.md or visit https://creativecommons.org/licenses/by-nc-sa/4.0/).
Thanks for reading!
If my work was useful in any way, please support it with a star ⭐️