A simple neural network written in Rust.
This neural network with gradient-descent training is written from the ground up in Rust. You can specify the shape of the network as well as its learning rate. Additionally, you can choose from several predefined datasets, for example the XOR and CIRCLE datasets, which represent the corresponding functions inside the unit square, as well as more complex datasets like RGB_DONUT, which represents a donut-like shape with a rainbow-like color transition.
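To make the dataset idea concrete, here is a hedged sketch (not the crate's actual code) of how an XOR-style dataset over the unit square could be defined: a point `(x, y)` in `[0, 1]²` is labeled 1.0 when exactly one coordinate lies above 0.5, and 0.0 otherwise. The function name `xor_label` is illustrative, not taken from the project.

```rust
// Hypothetical sketch: XOR over the unit square.
// A point is "on" when exactly one of its coordinates exceeds 0.5.
fn xor_label(x: f64, y: f64) -> f64 {
    if (x > 0.5) != (y > 0.5) { 1.0 } else { 0.0 }
}

fn main() {
    // Evaluate the label in each quadrant of the unit square.
    for &(x, y) in &[(0.25, 0.25), (0.25, 0.75), (0.75, 0.25), (0.75, 0.75)] {
        println!("({x}, {y}) -> {}", xor_label(x, y));
    }
}
```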
Below, you can see a training process where the network tries to learn the color values of the RGB_DONUT dataset.
The following features are currently implemented:
- Optimizers
  - Adam
  - RMSProp
  - SGD
- Loss Functions
  - Quadratic
- Activation Functions
  - Sigmoid
  - ReLU
- Layers
  - Dense
- Plotting
  - Plotting the cost history during training
  - Plotting the final predictions, either in grayscale or RGB
The process of creating and training the neural network is straightforward:
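Since the crate's exact API is not reproduced here, the following is a minimal, self-contained sketch of the same idea: a tiny 2-2-1 sigmoid network trained on XOR with plain gradient descent and the quadratic loss. All names and the fixed initial weights are illustrative, not the project's actual interface.

```rust
// Minimal from-scratch sketch (not the crate's API): a 2-2-1 sigmoid
// network trained on XOR via gradient descent on the quadratic loss.
fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }

struct Net {
    w1: [[f64; 2]; 2], b1: [f64; 2], // hidden (dense) layer
    w2: [f64; 2],      b2: f64,      // output layer
}

impl Net {
    // Forward pass: returns hidden activations and the output.
    fn forward(&self, x: [f64; 2]) -> ([f64; 2], f64) {
        let mut h = [0.0; 2];
        for j in 0..2 {
            h[j] = sigmoid(self.w1[j][0] * x[0] + self.w1[j][1] * x[1] + self.b1[j]);
        }
        let o = sigmoid(self.w2[0] * h[0] + self.w2[1] * h[1] + self.b2);
        (h, o)
    }

    // One gradient-descent step on a single (input, target) pair,
    // using the quadratic loss 0.5 * (o - t)^2; returns the loss.
    fn train_step(&mut self, x: [f64; 2], t: f64, lr: f64) -> f64 {
        let (h, o) = self.forward(x);
        let delta_o = (o - t) * o * (1.0 - o); // gradient at the output pre-activation
        for j in 0..2 {
            let delta_h = delta_o * self.w2[j] * h[j] * (1.0 - h[j]);
            self.w2[j] -= lr * delta_o * h[j];
            self.w1[j][0] -= lr * delta_h * x[0];
            self.w1[j][1] -= lr * delta_h * x[1];
            self.b1[j] -= lr * delta_h;
        }
        self.b2 -= lr * delta_o;
        0.5 * (o - t) * (o - t)
    }
}

fn main() {
    let data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
                ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)];
    // Fixed asymmetric initial weights for reproducibility.
    let mut net = Net {
        w1: [[0.5, -0.4], [-0.3, 0.6]], b1: [0.1, -0.1],
        w2: [0.7, -0.8], b2: 0.05,
    };
    for epoch in 0..5000 {
        let mut loss = 0.0;
        for &(x, t) in &data {
            loss += net.train_step(x, t, 0.5);
        }
        if epoch % 1000 == 0 {
            println!("epoch {epoch}: loss {loss:.4}");
        }
    }
}
```

The real crate presumably wraps these steps (layer setup, optimizer choice, training loop) behind its own types; the sketch only shows the underlying mechanics.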
Below, you can see how the network learns: