Project Overview
An automatic differentiation engine, similar in spirit to PyTorch, that computes the derivative/gradient of any computable function by running reverse-mode automatic differentiation (backpropagation) over a dynamically built computational graph (a DAG).
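The core idea can be sketched in a few lines: each operation records its inputs and a local backward rule as the graph is built, and `backward()` topologically sorts the DAG and applies the chain rule in reverse. This is a minimal illustrative sketch, not the project's actual API (the class and method names here are assumptions):

```python
class Value:
    """One node in the dynamically built computational DAG (illustrative sketch)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # local chain-rule step, set by each op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def backward():
            # product rule: gradients flow scaled by the other factor
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward
        return out

    def backward(self):
        # Topologically sort the DAG, then apply local rules in reverse order.
        topo, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                topo.append(v)
        visit(self)
        self.grad = 1.0  # d(self)/d(self)
        for v in reversed(topo):
            v._backward()


x = Value(3.0)
y = x * x + x   # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
```

After `y.backward()`, `x.grad` holds 7.0, matching the analytic derivative.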
Features
- Activation functions: ReLU, Sigmoid, Tanh
- Optimizers: SGD, Adam
- Layers: Linear, BatchNorm1d, BatchNorm2d, Dropout, Conv1d, Conv2d, MaxPool2d, AvgPool2d
- Loss functions: CrossEntropyLoss, mean squared error (MSE)
- RNN, GRU, and a computational-graph visualizer
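To make the components above concrete, here is what one training step of a Linear layer with ReLU, an MSE loss, and an SGD update computes under the hood, written with explicit NumPy gradients. All names are illustrative, and the manual backward pass is exactly the bookkeeping the engine automates:

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.normal(0.0, 0.1, (4, 1))  # Linear layer weights (illustrative shapes)
b = np.zeros(1)                   # Linear layer bias
x = rng.normal(size=(8, 4))       # a batch of 8 inputs
t = rng.normal(size=(8, 1))       # regression targets

lr = 0.1
losses = []
for _ in range(100):
    h = x @ W + b                  # Linear forward
    y = np.maximum(h, 0.0)         # ReLU activation
    loss = np.mean((y - t) ** 2)   # MSE loss
    losses.append(loss)
    # Manual backward pass (chain rule) -- what autodiff derives for you.
    dy = 2.0 * (y - t) / t.size    # dL/dy
    dh = dy * (h > 0)              # ReLU gradient masks non-positive inputs
    dW = x.T @ dh                  # dL/dW
    db = dh.sum(axis=0)            # dL/db
    W -= lr * dW                   # SGD update
    b -= lr * db
```

Running the loop drives the loss down; with the engine, only the forward lines would be written by hand.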
Tools Used
- Python
- NumPy
- Graphviz
- Jupyter Notebook