Stacked autoencoder in Keras

Now let's build the same autoencoder in Keras.

We clear the graph in the notebook using the following commands, so that we can build a fresh graph that does not carry over any of the memory from the previous session or graph:

tf.reset_default_graph()
keras.backend.clear_session()
  1. First, we import the keras libraries and define hyperparameters and layers:
import keras
from keras.layers import Dense
from keras.models import Sequential

learning_rate = 0.001
n_epochs = 20
batch_size = 100
n_batches = int(mnist.train.num_examples/batch_size)
# number of pixels in the MNIST image as number of inputs
n_inputs = 784
n_outputs = n_inputs
# number of hidden layers
n_layers = 2
# neurons in each hidden layer
n_neurons = [512,256]
# add decoder ...
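The excerpt cuts off at the decoder definition. As a minimal sketch of where these hyperparameters lead, the stacked autoencoder can be assembled with the Sequential API by mirroring the encoder sizes for the decoder; the sigmoid activations, MSE loss, and Adam optimizer below are assumptions for illustration, not necessarily the book's exact choices:

import keras
from keras.layers import Dense
from keras.models import Sequential

learning_rate = 0.001
n_inputs = 784
n_neurons = [512, 256]
# mirror the encoder sizes to build the decoder (assumed layout)
n_neurons = n_neurons + n_neurons[::-1]

model = Sequential()
# first encoder layer declares the input shape
model.add(Dense(n_neurons[0], activation='sigmoid', input_shape=(n_inputs,)))
# remaining encoder and decoder layers
for units in n_neurons[1:]:
    model.add(Dense(units, activation='sigmoid'))
# final layer reconstructs the 784-pixel input
model.add(Dense(n_inputs, activation='sigmoid'))
model.compile(loss='mse', optimizer=keras.optimizers.Adam(lr=learning_rate))
model.summary()

Since an autoencoder learns to reproduce its input, training passes the same images as both inputs and targets, for example model.fit(X_train, X_train, batch_size=batch_size, epochs=n_epochs).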
