Computing Workflows, Data Science, and such


TensorFlow and skflow weight recovery

In addition to supporting a handful of built-in models, skflow allows you to supply a custom graph, typically capped off with one of the common models (like a fully connected layer). The example shows how this actually works with a convolutional neural net. There’s just one problem: weights are no longer accessible. I struggled through this in the first video last week.

But perseverance paid off! This week I studied the log file normally used for TensorBoard. Running strings on that file, I extracted the necessary incantation to access all of the weights. And I submitted a Pull Request to show other users how this is done.
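The `strings` trick is worth a quick illustration. As a hedged sketch (the byte blob below is made up for demonstration), here is a minimal Python re-implementation of what `strings` does: scan a binary file for runs of printable ASCII, which is how tensor names like the one below turn up inside the TensorBoard log:

```python
import re

def printable_strings(data, min_len=4):
    """Return runs of printable ASCII at least min_len bytes long,
    similar in spirit to the Unix `strings` tool."""
    return re.findall(rb"[ -~]{%d,}" % min_len, data)

# A made-up binary blob with one tensor name buried in it.
blob = b"\x00\x01conv_layer1/convolution/filters:0\x00\xff\x03ok\x00"
names = [s.decode() for s in printable_strings(blob)]
```

Short runs like `ok` fall below the length threshold, so what survives is mostly the human-readable tensor names.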

classifier.get_tensor_value('conv_layer1/convolution/filters:0')

Here’s the triumph in action:

TensorFlow with skflow Introduction

TensorFlow is pretty cool, but if you want a higher-level interface, you should check out skflow. skflow lets you quickly specify common machine learning algorithms like linear regression, logistic classification, and of course neural nets (of various flavors). What makes it so easy to use is that it abstracts away a lot of the variable initialization and handling required for lower-level libraries like TensorFlow or Theano. You can literally specify a Deep Neural Net in one line of code.

classifier = skflow.TensorFlowDNNClassifier(hidden_units=[15, 20, 25, 15], n_classes=len(mnist['target_names']))

If you’re familiar with scikit-learn, the interface is designed to feel very similar. And while skflow is nominally for educational purposes to get you up to speed with TensorFlow, I have a feeling many people will still make some impressive models.
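To show what that scikit-learn-style interface feels like in practice, here is the same fit/predict/score pattern, sketched with scikit-learn's own MLPClassifier (chosen because it ships with scikit-learn and runs anywhere; the hidden-layer sizes just mirror the skflow one-liner above and are otherwise arbitrary):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

# Same one-liner spirit: hidden layer sizes mirror the skflow example.
clf = MLPClassifier(hidden_layer_sizes=(15, 20, 25, 15),
                    max_iter=500, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Swap the estimator for a skflow one and the surrounding fit/predict code doesn't change, which is exactly the point of the design.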

Dreaming of a Convolutional Neural Network

After spending forever yakking about Convolutional Neural Nets last week, I thought it would be best to complete the “show” part of “show and tell”. So this week I built a simple CNN to classify handwritten digits, modifying an example from this excellent tutorial. In addition to actually building the dang net, I also made some quick plots of the weights and activations of various layers.
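The weight plots boil down to tiling the filters into one image. A minimal sketch with NumPy, assuming the filters come back in TensorFlow's (height, width, in_channels, out_channels) layout (random values stand in for real learned weights here):

```python
import numpy as np

# Stand-in filter bank: 32 filters of size 5x5 over 1 input channel,
# in TensorFlow's (height, width, in_channels, out_channels) layout.
filters = np.random.randn(5, 5, 1, 32)

# Arrange the 32 filters into a 4x8 grid of 5x5 tiles.
rows, cols = 4, 8
fh, fw = filters.shape[:2]
grid = np.zeros((rows * fh, cols * fw))
for k in range(filters.shape[-1]):
    r, c = divmod(k, cols)
    grid[r * fh:(r + 1) * fh, c * fw:(c + 1) * fw] = filters[:, :, 0, k]

# grid is now a single 20x40 image, ready for plt.imshow(grid, cmap='gray')
```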

In for a penny, in for a pound. After whipping up the weight image, I tried to mimic Google’s Deep Dreams and let my network imagine what it thinks a ‘2’ looks like. The results were entertaining if not helpful.
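The "dreaming" idea itself is just gradient ascent on the input image. A toy version of the same trick, shown here with a plain logistic regression on scikit-learn's digits so it stays self-contained (for a linear model the gradient with respect to the input is simply that class's weight vector):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

digits = load_digits()
clf = LogisticRegression(max_iter=1000).fit(digits.data, digits.target)

# Start from a flat gray image and repeatedly nudge the pixels up the
# gradient of the class-'2' score; clip to the valid pixel range 0..16.
target = 2
x = np.full(64, 8.0)
for _ in range(25):
    x += 0.5 * clf.coef_[target]
    x = np.clip(x, 0, 16)

dream = x.reshape(8, 8)  # the model's idea of a '2'; plt.imshow(dream)
```

With a real CNN the gradient comes from backprop instead of a weight row, but the loop is the same shape.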

Theano Convolutional Neural Network

Convolutional Neural Nets (CNNs) are where machine learning really starts to ramp up the challenge in understanding. Accordingly, more of this video is spent explaining rather than coding. That unfortunately means I didn’t actually build much in the way of CNNs. I think next week I’ll try to do a walkthrough of a simple MNIST example, carefully explaining each step. It’s like one of my mantras:

Anything I understand is trivially simple.
Anything I don't is hopelessly complex.
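In that spirit, the core operation a convolutional layer performs is trivially simple once written out. A minimal NumPy sketch of a "valid" 2D convolution (strictly speaking this is cross-correlation, which is what most deep learning libraries actually compute):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide kernel over image, summing elementwise products ('valid' mode)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
edge = np.array([[1.0, -1.0]])  # horizontal difference filter
result = conv2d_valid(image, edge)
```

A CNN learns many such kernels per layer; everything else is bookkeeping around this loop.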
Microsoft CNTK Installation

Fortuitously, Microsoft released their own machine learning library this week, CNTK. Unlike TensorFlow, Theano, and other popular frameworks, CNTK doesn’t yet support Python or C++ bindings. So far, it looks like it only has configuration files and an ndl file format for specifying nets. I streamed trying to install it and do the simplest computation.

Since I tried to do this the day it was released, you shouldn’t be surprised that there were issues. I was a little irked that although they supported Linux, they didn’t have a binary already compiled. And trying to compile it myself quickly led to dependency hell. Thankfully, I found an old binary on their codeplex page. While it was a CPU-only version, it allowed me to start playing with the language.
