# | Likes | Tech tags | Title | Creator | Created date
---|---|---|---|---|---
1 | 0 | TensorFlow, Keras | | | 2022-10-01 23:02
Constructs a simple 3-layer neural network for classifying the MNIST dataset. Uses sigmoid activation functions and achieves 95% accuracy.
```python
from tensorflow import keras

# Download the MNIST dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data(path="mnist.npz")

inputs = keras.Input(shape=x_train.shape[1:])  # 28x28 grayscale images
hidden = keras.layers.Flatten()(inputs)        # Flatten each image into a 784-element vector
hidden = keras.layers.Dense(128, activation='sigmoid')(hidden)
hidden = keras.layers.Dense(64, activation='sigmoid')(hidden)
outputs = keras.layers.Dense(10, activation='softmax')(hidden)  # One probability per digit class
model = keras.Model(inputs, outputs, name="simple_feedforwards_sample")

model.compile(
    optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
    # Note: CategoricalCrossentropy expects one-hot labels, so the integer
    # labels from load_data must be converted (e.g. with
    # keras.utils.to_categorical) before training.
    loss=keras.losses.CategoricalCrossentropy(),
    metrics=["acc"],
)
```
To make meaningful predictions, you need to train this neural network first. The following sample shows how:

- Train a Neural Network on Data: Short Code - algoteka.com

A full example of how a model was trained and evaluated on MNIST:

- MNIST Digit Classification: Simple 3-layer Neural Network 96% acc. - algoteka.com
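The linked samples cover training in full; as a minimal self-contained sketch, training and evaluating the model constructed above might look like the following. The pixel scaling, batch size, and epoch count are choices of this sketch, not taken from the linked samples:

```python
import numpy as np
from tensorflow import keras

# Rebuild the same 3-layer network as in the sample above
inputs = keras.Input(shape=(28, 28))
hidden = keras.layers.Flatten()(inputs)
hidden = keras.layers.Dense(128, activation='sigmoid')(hidden)
hidden = keras.layers.Dense(64, activation='sigmoid')(hidden)
outputs = keras.layers.Dense(10, activation='softmax')(hidden)
model = keras.Model(inputs, outputs)
model.compile(
    optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
    loss=keras.losses.CategoricalCrossentropy(),
    metrics=["acc"],
)

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data(path="mnist.npz")
x_train = x_train.astype("float32") / 255.0  # Scale pixel values to [0, 1]
x_test = x_test.astype("float32") / 255.0
y_train = keras.utils.to_categorical(y_train, 10)  # One-hot labels for CategoricalCrossentropy
y_test = keras.utils.to_categorical(y_test, 10)

model.fit(x_train, y_train, batch_size=64, epochs=5, validation_split=0.1)
test_loss, test_acc = model.evaluate(x_test, y_test)
print(f"Test accuracy: {test_acc:.3f}")

# Predict: each row of the output is a probability distribution over the 10 digits
preds = model.predict(x_test[:5])
print(np.argmax(preds, axis=1))  # Predicted digit for the first 5 test images
```

Scaling the inputs to [0, 1] matters here because sigmoid units saturate quickly on raw 0-255 pixel values, which stalls training.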
You can read up more on the theory from the following links:

- Neural Networks: Lecture 4: Feed-forward Neural Networks slides - courses.cs.ut.ee
- Neural Networks: Lecture 4: Feed-forward Neural Networks video - courses.cs.ut.ee
- Feedforward neural network - wikipedia.org
Classes | Reference
---|---
tensorflow.keras.Input | tensorflow.org
tensorflow.keras.Model | tensorflow.org
tensorflow.keras.layers.Dense | tensorflow.org
tensorflow.keras.layers.Flatten | tensorflow.org
tensorflow.keras.losses.CategoricalCrossentropy | tensorflow.org
tensorflow.keras.optimizers.RMSprop | tensorflow.org

Functions | Reference
---|---
tensorflow.keras.Model.compile | tensorflow.org
tensorflow.keras.datasets.mnist.load_data | tensorflow.org
Construct a feedforward neural network with at least 3 layers and sensible activation functions. Strive for simplicity in your samples.