First steps with Keras

This post was originally intended for the students of my DLAI course, although I think it may be of interest to others as well. In this blog I will share the teaching material I am preparing for the part of the DLAI course that covers the basic principles of Deep Learning from a computational perspective.

This post provides a fast-paced introduction to the Keras API required to follow the DLAI Labs (Master Course at UPC – Autumn 2017). In this part we will review the basics of Keras, a high-level neural networks API written in Python and capable of running on top of TensorFlow, CNTK, or Theano. We will also introduce TensorBoard, a visualization tool included with TensorFlow.

Keras

Keras is a Python library that provides a clean and convenient way to create a range of deep learning models on top of powerful libraries such as TensorFlow, Theano or CNTK. Keras is developed and maintained by François Chollet, a Google engineer, and is released under the permissive MIT license.
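To give an idea of what this looks like in practice, here is a minimal sketch (not taken from the lab code) of a small fully connected model defined with the Keras Sequential API, assuming Keras 2.x with a TensorFlow backend as provided in the lab Docker image:

from keras.models import Sequential
from keras.layers import Dense

# Define a small fully connected network for 784-dimensional inputs
# (e.g. flattened 28x28 MNIST images) and 10 output classes.
model = Sequential()
model.add(Dense(32, activation='relu', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))

# Print the layer structure and parameter counts.
model.summary()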

Below are the tasks of this lab session. If you don’t finish all of them during the session, please read the last task before leaving the classroom.

Task 1:  Update DLAI lab docker image

We need to be sure that we have the latest version of the DLAI lab Docker image. Update or download it:

docker pull jorditorresbcn/dlai-met:latest

Next we need to create a new container with the following options:

  • Forward port 6006
  • Forward port 8888
  • Share the container folder /app/code

Mac/Linux example:

docker run -it -p 8888:8888 -p 6006:6006 -v /home/user/newfolder:/app/code jorditorresbcn/dlai-met:latest

Windows example:

docker run -it -p 8888:8888 -p 6006:6006 -v /c/Users/youruser/newfolder:/app/code jorditorresbcn/dlai-met:latest

newfolder is an empty folder created by you somewhere inside your home folder.

In your container, clone the course repository inside the /app/code folder:

cd /app/code
git clone https://github.com/jorditorresBCN/dlaimet.git

Start a jupyter notebook using this command:

jupyter notebook --ip=0.0.0.0 --allow-root

Open a browser and go to http://localhost:8888; the password is dlaimet.

If you are on Windows and you are experiencing connectivity issues, please check THIS.

In this lab we will use ports 8888 and 6006; if you are working on a remote computer, please open these ports.

Task 2: Run your first Keras program

Run your first Keras program following these instructions:

First of all, using your browser with jupyter, open the Keras examples folder and locate the mnist-keras-book file. Try to run all the blocks in order to check your Keras installation.

The output should be something like (You can stop it anytime):

Using TensorFlow backend.
Downloading data from 
https://s3.amazonaws.com/img-datasets/mnist.npz
   8192/11490434 [.] - ETA: 164 ...
60000 train samples
10000 test samples
Train on 60000 samples, validate on 10000 samples
Epoch 1/12
128/60000 [.] - ETA: 102s - loss: 2.2928 - acc: 0.0938
...

Task 3: Analyzing the code

Using your browser with jupyter, look for these parts in the code (a rough sketch of what such fragments typically look like follows the list):

  1. Identify how Keras reads the data.
  2. Identify where the neural net definition is.
  3. Which layers are used in this network? Which activation functions do they use?
  4. Identify the loss and optimizer functions.
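
For orientation only, a typical Keras MNIST script contains fragments roughly like the ones below; this is just a sketch and the variable names in the notebook may differ:

from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import RMSprop

# (1) Reading the data: Keras downloads MNIST and returns NumPy arrays.
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# (2)/(3) Network definition: a stack of layers with their activation functions.
model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))
model.add(Dropout(0.2))
model.add(Dense(10, activation='softmax'))

# (4) The loss and the optimizer are chosen when the model is compiled.
model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(),
              metrics=['accuracy'])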

TensorBoard

TensorBoard is a visualization tool included with TensorFlow. You can use TensorBoard to visualize your TensorFlow graph, plot quantitative metrics about the execution of your graph, and show additional data like images that pass through it. In this lab we will use it to visualise information about our Keras network.

The code contains the variables tensorboard_dir and tensorboard_active that enable the TensorBoard execution using the Keras callbacks. If you set tensorboard_active to True, Keras will save TensorBoard data to tensorboard_dir after every epoch.
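
Under the hood this relies on the standard keras.callbacks.TensorBoard callback. A sketch of how such a setup typically looks (the variable names follow the description above; the folder path is just an example):

from keras.callbacks import TensorBoard

tensorboard_dir = '/app/code/tb_logs'   # example folder for the TensorBoard logs
tensorboard_active = True

callbacks = []
if tensorboard_active:
    # Write loss/accuracy summaries to tensorboard_dir after every epoch.
    callbacks.append(TensorBoard(log_dir=tensorboard_dir))

# The list is then passed to the training call, e.g. model.fit(..., callbacks=callbacks).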

Task 4: TensorBoard

Modify the tensorboard_dir  value to a folder for saving the TensorBoard data. Change the tensorboard_active  value to True. Before running the script, clear the jupyter kernel (Kernel -> Restart and clear output).

Hint: You will need another terminal for running TensorBoard and Jupyter at the same time. Open a new terminal and then use these commands:

docker ps
docker exec -it $DOCKER_ID /bin/bash
cd /app/code/dlaimet/keras

Run the code with TensorBoard activated and then start TensorBoard using the following command:

tensorboard --logdir=$tensorboard_dir
Starting TensorBoard 0.1.6 at http://localhost:6006
(Press CTRL+C to quit)

Go to http://localhost:6006 in your browser and TensorBoard will open. We recommend Google Chrome or Chromium in order to avoid compatibility and lag problems.

You can run TensorBoard and Keras at the same time; TensorBoard will update the data after every epoch.

Task 5: How to use TensorBoard

TensorBoard can be a very useful tool during your DLAI project. Now you are ready to learn by yourself the features that could help you; the official TensorBoard documentation explains how to use them.

Going through the jupyter notebook, modify some hyperparameters (batch size, number of epochs, learning rate), the optimizer, the loss function, or the layers of the model. Use TensorBoard and the logs to observe the new accuracy. Before running the script, clear the jupyter kernel (Kernel -> Restart and clear output) and change the TensorBoard folder. Take a look at the TensorBoard docs to see how to merge different runs in one chart.
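
One convenient pattern for comparing runs (a sketch, not the notebook’s exact code) is to give every experiment its own subfolder of a common log directory and then point tensorboard --logdir at the parent folder, so each run shows up as a separate curve in the same chart:

import os
from keras.callbacks import TensorBoard

base_log_dir = '/app/code/tb_logs'        # the folder you pass to tensorboard --logdir
run_name = 'batch128_rmsprop_12epochs'    # change this string for every experiment

# Each run writes to its own subfolder; TensorBoard overlays all the runs
# it finds under base_log_dir.
tensorboard_callback = TensorBoard(log_dir=os.path.join(base_log_dir, run_name))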

Task 6: Lab Report

Build a lab report with

  • Your names and contact emails
  • Task 2: Include a screenshot of the output and a brief explanation of it.
  • Task 3: Include brief answers to the four questions.
  • Task 5: Include a screenshot of your TensorBoard view and add some discussion about your changes.

You can submit your lab report @ATENEA intranet. Deadline: 31/oct/2017.

My thanks to Francesc Sastre for helping me with the preparation of this lab.
