This paper is published in Volume-4, Issue-3, 2018
Area
Artificial Neural Networks
Author
Karishma Dasgaonkar, Swati Chopade
Org/Univ
Veermata Jijabai Technological Institute, Mumbai, Maharashtra, India
Pub. Date
30 June, 2018
Paper ID
V4I3-2038
Publisher
International Journal of Advance Research, Ideas and Innovations in Technology (IJARIIT)
Keywords
Neural networks, Multi-layered perceptron, MLP, Radial basis function, RBF, Convolutional neural network, CNN, MNIST, ANN

Citations

IEEE
Karishma Dasgaonkar, Swati Chopade. Analysis of multi-layered perceptron, radial basis function and convolutional neural networks in recognizing handwritten digits, International Journal of Advance Research, Ideas and Innovations in Technology, vol. 4, no. 3, 2018, www.IJARIIT.com.

APA
Karishma Dasgaonkar, Swati Chopade (2018). Analysis of multi-layered perceptron, radial basis function and convolutional neural networks in recognizing handwritten digits. International Journal of Advance Research, Ideas and Innovations in Technology, 4(3) www.IJARIIT.com.

MLA
Karishma Dasgaonkar, Swati Chopade. "Analysis of multi-layered perceptron, radial basis function and convolutional neural networks in recognizing handwritten digits." International Journal of Advance Research, Ideas and Innovations in Technology 4.3 (2018). www.IJARIIT.com.

Abstract

Identification of handwritten digits is one of the major areas of research in the field of character recognition. Artificial Neural Networks help in computer vision, which deals with how a computer can achieve a high-level understanding of digital images or videos. Thus, neural networks prove to be a boon in recognizing handwritten digits that are scanned as images. This paper studies the working of three neural networks in particular: the Multi-Layered Perceptron (MLP), the Radial Basis Function (RBF) network, and the Convolutional Neural Network (CNN). To focus mainly on the implementation of these three neural networks rather than the complexity of the dataset, we use the MNIST (Modified National Institute of Standards and Technology) dataset from the Keras library. The MNIST dataset contains 70,000 black-and-white images of handwritten digits (60,000 training images and 10,000 testing images). In our study of the three networks, we use ReLU as the activation function in the hidden layers, softmax as the activation function in the final layer, Adam as the optimizer, and cross-entropy as the loss function. We observe that all three networks achieve accuracy above 95%; the major differences lie in their training times and error rates.
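
The following is a minimal sketch (not the authors' exact code) of the training setup described in the abstract: MNIST loaded from the Keras library, ReLU in the hidden layer, softmax in the final layer, the Adam optimizer, and cross-entropy loss. The hidden-layer size, batch size, and number of epochs are illustrative assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Load the MNIST dataset (60,000 training and 10,000 testing images).
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Flatten the 28x28 images and scale pixel values to [0, 1].
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# A simple multi-layered perceptron: ReLU hidden layer, softmax output.
model = keras.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(512, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# Adam optimizer and cross-entropy loss, as described in the abstract.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)
test_loss, test_acc = model.evaluate(x_test, y_test)
print(f"Test accuracy: {test_acc:.4f}")
```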