Digit Recognition
Overview
This project is a Python neural network that recognizes and evaluates a drawn digit using scikit-learn. The user draws a digit on a Kivy window, and the program displays the predicted digit along with the confidence scores for each possible digit in the form of a graph.
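At prediction time, the drawn image has to be reduced to the same 64-feature format as the training data before it is handed to the classifier. The helper below is a minimal sketch of that step; `predict_drawn_digit`, `drawn_pixels`, and the use of `predict_proba` for the confidence scores are illustrative assumptions, not the exact code in this repository.

```python
import numpy as np

def predict_drawn_digit(clf, drawn_pixels):
    """Return the predicted digit and per-digit confidence scores.

    clf is assumed to be a trained scikit-learn MLPClassifier, and
    drawn_pixels a hypothetical 8x8 array of pixel intensities
    produced by downsampling the Kivy drawing canvas.
    """
    # Flatten the 8x8 drawing into the (1, 64) shape the model expects.
    sample = np.asarray(drawn_pixels, dtype=float).reshape(1, -1)

    prediction = clf.predict(sample)[0]          # most likely digit
    confidences = clf.predict_proba(sample)[0]   # one score per digit 0-9
    return prediction, confidences
```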
Data
The training data for this project comes from the UCI ML hand-written digits dataset, which contains 1,797 images of the digits 0 through 9, each 8x8 pixels in size. Each image is preprocessed and flattened into a NumPy array of 64 features representing its pixel values.
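The dataset ships with scikit-learn, so it can be loaded without a separate download. The snippet below is a minimal sketch of loading and flattening it; the 75/25 train/test split is an assumption for illustration, not something specified above.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Load the UCI hand-written digits dataset (1,797 samples of 8x8 images).
digits = load_digits()

# Flatten each 8x8 image into a 64-feature vector of pixel intensities.
X = digits.images.reshape((len(digits.images), -1))
y = digits.target

# Hold out part of the data for evaluation (split ratio assumed).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1
)

print(X_train.shape, X_test.shape)  # e.g. (1347, 64) (450, 64)
```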
Model
The model for this project is a multilayer perceptron (MLP) classifier from the scikit-learn library. The MLP classifier is a type of artificial neural network that consists of an input layer, one or more hidden layers, and an output layer. The MLP classifier learns the weights and biases of the network by using backpropagation and gradient descent algorithms.
The MLP classifier has the following parameters (a training sketch with this configuration follows the list):
- `hidden_layer_sizes`: a tuple of integers specifying the number of neurons in each hidden layer. For this project I used (15,), meaning one hidden layer with 15 neurons.
- `activation`: the activation function for the hidden layer. I used the logistic sigmoid function.
- `alpha`: the strength of the L2 regularization term, which is divided by the sample size when added to the loss. I used 1e-4.
- `solver`: the solver for weight optimization. I used stochastic gradient descent.
- `tol`: tolerance for the optimization. When the loss or score does not improve by at least `tol` for `n_iter_no_change` consecutive iterations, convergence is considered reached and training stops.
- `random_state`: determines the random number generation for weight and bias initialization and for batch sampling. Using 1, I was able to create reproducible results across multiple function calls.
- `learning_rate_init`: the initial learning rate, which controls the step size in updating the weights. I used a relatively high value of 0.1, which led to faster training.
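Putting these parameters together, training the model might look roughly like the sketch below. It reuses the `X_train`/`y_train` split from the data snippet; the explicit `tol` value is an assumption, since the description above only explains its role.

```python
from sklearn.neural_network import MLPClassifier

# MLP configured with the parameters described above: one hidden layer of
# 15 neurons, logistic sigmoid activation, SGD solver, L2 regularization of
# 1e-4, and a fixed random seed for reproducible initialization and batching.
clf = MLPClassifier(
    hidden_layer_sizes=(15,),
    activation="logistic",
    alpha=1e-4,
    solver="sgd",
    tol=1e-4,            # assumed: the scikit-learn default; no value is stated above
    random_state=1,
    learning_rate_init=0.1,
)

clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))
```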