
Neural Networks for Recognition 

CMU 16-720, Computer Vision

November 2020

Abstract: 

For this project, my objective was to implement a fully connected neural network from scratch, including the forward/backward propagation, loss function, weight updates, and batching. The goal was to identify handwritten letters given a set of sample data - the "Hello World" of neural nets. 
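Here's roughly what that forward pass looks like in NumPy. This is a sketch, not the exact assignment code: the 32x32 inputs, 36 output classes, sigmoid hidden layer, and names like forward are my illustrative assumptions.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # subtract the row-wise max for numerical stability
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(X, W1, b1, W2, b2):
    # X: (batch, 1024) flattened 32x32 letter crops (assumed input size)
    h = sigmoid(X @ W1 + b1)      # hidden activations, (batch, 64)
    probs = softmax(h @ W2 + b2)  # class probabilities, (batch, 36)
    return h, probs

def cross_entropy(probs, Y):
    # Y is one-hot; mean negative log-likelihood over the batch
    return -np.mean(np.sum(Y * np.log(probs + 1e-12), axis=1))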

Process: 

The backpropagation algorithm was implemented using gradient descent. Given the current weights of each layer and the intermediate results from the forward pass, this function returns the gradient of the loss with respect to each weight. From there, the network was trained with a single hidden layer of 64 neurons to find the best weights possible. The confusion matrix to the left demonstrates the most common mix-ups during training.
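And here's a sketch of the corresponding backward pass and update step, reusing forward from the sketch above. The sigmoid derivative, learning rate, and batch variables Xb/Yb are assumptions for illustration, not the tuned values:

def backward(X, Y, h, probs, W2):
    # gradient of softmax + cross-entropy w.r.t. the output scores
    n = X.shape[0]
    d_scores = (probs - Y) / n
    dW2 = h.T @ d_scores
    db2 = d_scores.sum(axis=0)
    # chain back through the hidden layer's sigmoid
    d_h = (d_scores @ W2.T) * h * (1.0 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    return dW1, db1, dW2, db2

# one vanilla gradient-descent step on a batch (Xb, Yb)
h, probs = forward(Xb, W1, b1, W2, b2)
dW1, db1, dW2, db2 = backward(Xb, Yb, h, probs, W2)
lr = 1e-2  # assumed learning rate for illustration
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2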

Result: 

After segmenting the letters from the sample images, each image was compressed in size and passed as input to the neural net. An example of the neural net's best output can be found below. 

Original Image


NN's Best Guess

Segmented Letters

As you can see, the NN is subject to many of the mistakes captured in the confusion matrix, although it did a decent job recreating the sentences from the original images. 
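For the curious, the segmentation-and-compression step described above boils down to something like this sketch (preprocess is a made-up name; it assumes white-background float images in [0, 1] and a 32x32 network input):

import numpy as np
from skimage.transform import resize

def preprocess(crop, size=32):
    # pad the letter crop to a square so resizing doesn't distort it;
    # 1.0 assumes a white background in [0, 1]
    h, w = crop.shape
    pad = abs(h - w) // 2
    if h > w:
        crop = np.pad(crop, ((0, 0), (pad, pad)), constant_values=1.0)
    else:
        crop = np.pad(crop, ((pad, pad), (0, 0)), constant_values=1.0)
    # shrink to the network's input resolution and flatten
    return resize(crop, (size, size)).flatten()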

Some other neat results:

Visualization of the Neurons' Weights Upon Initialization

Visualization of the Neurons' Weights After Training
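These plots come from reshaping each hidden neuron's weight vector back into image form. A sketch of how such a grid might be generated, assuming 64 hidden units, 32x32 inputs, and an illustrative show_weights name:

import matplotlib.pyplot as plt

def show_weights(W1, size=32):
    # each column of W1 connects the full 32x32 input to one hidden
    # unit, so reshaping it into an image shows what that unit
    # has learned to respond to
    fig, axes = plt.subplots(8, 8, figsize=(8, 8))  # 64 hidden units
    for ax, col in zip(axes.flat, W1.T):
        ax.imshow(col.reshape(size, size), cmap='gray')
        ax.axis('off')
    plt.show()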

Example of an input image vs. the NN's recreation of the letters V and O
