The problem we’re trying to solve here is to classify grayscale images of handwritten digits (28 × 28 pixels) into their 10 categories (0 through 9). We’ll use the MNIST dataset, a classic in the machine-learning community, which has been around almost as long as the field itself and has been intensively studied. It’s a set of 60,000 training images, plus 10,000 test images, assembled by the National Institute of Standards and Technology (the NIST in MNIST) in the 1980s. You can think of “solving” MNIST as the “Hello World” of deep learning—it’s what you do to verify that your algorithms are working as expected.
MNIST stands for Modified National Institute of Standards and Technology, the organization that assembled the images. The MNIST dataset is a collection of images representing the handwritten numerals 0 through 9.
Loading the MNIST dataset in Keras.
# Import the required libraries.
from keras.datasets import mnist
from keras import models
from keras import layers
from keras.utils import to_categorical
# Load the MNIST data into training and test sets.
(train_images, train_labels), (test_images, test_labels) = mnist.load_data()
# Check the shape of the array: 60,000 training images, each 28 x 28 pixels.
train_images.shape
# Check the shape of the array: 10,000 test images, each 28 x 28 pixels.
test_images.shape
The network architecture
network = models.Sequential()
network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))
network.add(layers.Dense(10, activation='softmax'))
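Each Dense layer computes an affine transformation of its input followed by an activation function. As a rough illustration of what the two layers above do (a NumPy sketch with toy weights, not part of the Keras model), a batch of flattened images flows through relu and then softmax:

```python
import numpy as np

# Toy stand-ins for the layer weights; Keras learns these during training.
rng = np.random.default_rng(0)
x = rng.random((4, 784)).astype('float32')             # batch of 4 flattened images
W1 = rng.standard_normal((784, 512)).astype('float32') * 0.01
W2 = rng.standard_normal((512, 10)).astype('float32') * 0.01

hidden = np.maximum(0, x @ W1)                         # Dense(512) with relu
logits = hidden @ W2                                   # Dense(10), pre-activation

# softmax turns the 10 logits into a probability distribution over digits.
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

print(hidden.shape)                         # (4, 512)
print(probs.shape, round(float(probs[0].sum()), 6))    # (4, 10) 1.0
```

The softmax output row for each image sums to 1, which is why the last layer's 10 values can be read as class probabilities.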
The compilation step
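The compile call itself is missing from the listing. Consistent with the one-hot labels prepared below and the accuracy metric in the training log, a minimal compilation would look like the following (the choice of rmsprop as optimizer is an assumption; any standard optimizer would work):

```python
from keras import models, layers

# Rebuild the model from the previous listing so this snippet stands alone.
network = models.Sequential()
network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))
network.add(layers.Dense(10, activation='softmax'))

# categorical_crossentropy matches the one-hot-encoded labels;
# rmsprop is an assumed optimizer choice, not dictated by the text.
network.compile(optimizer='rmsprop',
                loss='categorical_crossentropy',
                metrics=['accuracy'])
```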
Preparing the image data
train_images = train_images.reshape((60000, 28 * 28))
train_images = train_images.astype('float32') / 255
test_images = test_images.reshape((10000, 28 * 28))
test_images = test_images.astype('float32') / 255
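This step flattens each 28 x 28 image into a 784-element vector and rescales the uint8 pixel values from [0, 255] to floats in [0, 1]. The effect can be checked on a tiny array (a NumPy sketch with made-up pixel values):

```python
import numpy as np

# A toy 2x2 "image" with uint8 pixel values, flattened and
# rescaled exactly as the MNIST images are above.
img = np.array([[0, 128], [64, 255]], dtype='uint8')
flat = img.reshape((1, 4)).astype('float32') / 255

print(flat.dtype)                                 # float32
print(flat.min() >= 0.0 and flat.max() <= 1.0)    # True
```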
Preparing the labels
train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)
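to_categorical one-hot encodes the labels: digit k becomes a length-10 vector with a 1 at index k and 0s elsewhere, matching the 10-way softmax output. The same transformation can be sketched in plain NumPy:

```python
import numpy as np

# One-hot encode digit labels the way to_categorical does:
# label k -> a length-10 vector with a 1 at index k.
labels = np.array([5, 0, 3])
one_hot = np.eye(10, dtype='float32')[labels]

print(one_hot.shape)        # (3, 10)
print(one_hot[0].argmax())  # 5
```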
network.fit(train_images, train_labels, epochs=5, batch_size=128)
Epoch 1/5
60000/60000 [================] - 6s 107us/step - loss: 0.2578 - acc: 0.9257
Epoch 2/5
60000/60000 [================] - 7s 110us/step - loss: 0.1034 - acc: 0.9699
Epoch 3/5
60000/60000 [================] - 6s 101us/step - loss: 0.0682 - acc: 0.9792
Epoch 4/5
60000/60000 [================] - 6s 103us/step - loss: 0.0493 - acc: 0.9851
Epoch 5/5
60000/60000 [================] - 6s 107us/step - loss: 0.0367 - acc: 0.9890
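With batch_size=128, each epoch sweeps all 60,000 training images in mini-batches, so one epoch corresponds to ceil(60000 / 128) = 469 gradient updates:

```python
import math

# Number of gradient updates per epoch:
# 60,000 samples split into mini-batches of 128.
steps_per_epoch = math.ceil(60000 / 128)
print(steps_per_epoch)  # 469
```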
test_loss, test_acc = network.evaluate(test_images, test_labels)
10000/10000 [==============================] - 1s 73us/step
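evaluate returns the test loss and test accuracy, where accuracy is the fraction of images whose highest-probability predicted class matches the true label. A NumPy sketch of that computation on hypothetical softmax outputs:

```python
import numpy as np

# Toy softmax outputs for 3 "images" (3 classes for brevity)
# and their one-hot true labels; values are illustrative only.
preds = np.array([[0.1, 0.8, 0.1],
                  [0.7, 0.2, 0.1],
                  [0.3, 0.3, 0.4]])
true = np.eye(3)[[1, 0, 1]]   # true classes: 1, 0, 1

# Accuracy = fraction of rows where the argmax prediction is correct.
acc = float((preds.argmax(axis=1) == true.argmax(axis=1)).mean())
print(round(acc, 3))  # 2 of 3 correct -> 0.667
```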