Computing Workflows, Data Science, and such


Keras In Motion Release and Sabbatical Update

My latest video course, Keras In Motion, is now live at Manning Publications. You may have heard me mention this “other project” on the stream over the past six months or so; it’s been taking up a lot of my free time. The course itself is a walkthrough of the Keras library built around real applied projects. It keeps the live-construction feel of my streaming videos, but with a scripted, edited final product for cleanliness. If you’ve enjoyed my YouTube channel, I think you’ll like the content and structure of this course. Plus, if you use the code vlboxeltw, you can get a great discount.

So am I doing anything else on my sabbatical besides selling videos? I’ve been traveling, doing some sightseeing, and trying to decompress. But also trying to live more simply for a while. I had accumulated a lot of stuff, so while traveling, it’s actually nice to live with less. Something about the scrimping, saving, and scavenging for basic household goods is a bit of fun. Maybe, like in a video game, it’s the continual “upgrade” of basic tools that’s possible when starting from nothing again.

One thing I’m glad to have back is a high-speed internet connection. I was operating on purely mobile data for a while, which, in addition to being slow, is paid by the byte. Even if it weren’t more expensive, the psychological toll of knowing every byte was costing me money limited my connectivity. That’s good and bad for decompressing, but now I’m glad to be part of the 21st century again.

Keras Line by Line MNIST IRNN

It’s time for another Line by Line, now with a recurrent neural net. Keras includes a curious example of an RNN for MNIST, so I wanted to take a look at it. The premise is that we should be able to classify digits by doing the same computation on pixels one at a time and finally making our prediction at the end. It’s not so different from a convolutional neural net, except it’s exactly one pixel at a time, and we only get the final output to make our prediction. Well, that and the recurrence carries some history between successive pixels.

What I learned by working through this is that another key is initializing the recurrent weights to the identity matrix. That way, at the start of training, a given node passes its activation only to itself in the next time step. The off-diagonal weights won’t necessarily stay zero, of course, but it seems like a reasonable starting point, and the research bears this out.
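To make that concrete, here’s a minimal sketch along the lines of the Keras mnist_irnn example (which follows the IRNN idea from Le, Jaitly, and Hinton): a ReLU SimpleRNN fed one pixel per time step, with the recurrent weights initialized to the identity. The hidden-unit count and optimizer here are illustrative, not necessarily the example’s exact values.

```python
import keras
from keras import initializers
from keras.datasets import mnist
from keras.layers import SimpleRNN, Dense
from keras.models import Sequential

(x_train, y_train), (x_test, y_test) = mnist.load_data()
# Each 28x28 image becomes a sequence of 784 one-pixel time steps.
x_train = x_train.reshape(-1, 784, 1).astype('float32') / 255
x_test = x_test.reshape(-1, 784, 1).astype('float32') / 255
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

model = Sequential()
# The IRNN trick: ReLU activation plus identity-initialized
# recurrent weights, so each unit starts by talking only to itself.
model.add(SimpleRNN(100,
                    kernel_initializer=initializers.RandomNormal(stddev=0.001),
                    recurrent_initializer=initializers.Identity(gain=1.0),
                    activation='relu',
                    input_shape=(784, 1)))
model.add(Dense(10, activation='softmax'))
model.compile(loss='categorical_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])
# model.fit(x_train, y_train, batch_size=32, epochs=1,
#           validation_data=(x_test, y_test))
```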

One thing to be aware of, however, is that this is a terrible way to solve MNIST. It’s slow as heck (there’s minimal parallelization with RNNs), and the accuracy is crummy. But as an instructive tool, it’s highly valuable. I see people bash MNIST, and while there are good reasons for disliking it, its educational value is huge.

Keras Line by Line Scikit-learn GridSearch

I felt like the Project Euler videos weren’t as interesting as I’d hoped (and the topics were already well covered by others), so by request I’m doing another Line by Line series. This time, we’re looking at various Keras examples. Specifically, I decided to check out the sklearn GridSearch example. GridSearch is really just an exhaustive search over possible hyperparameter combinations you declare in a dictionary. This is neat if you have a limited number of varied models to try and don’t mind waiting.
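To sketch the mechanics, here’s a minimal example of wrapping a Keras model so sklearn’s GridSearchCV can tune it, in the spirit of the Keras example. The architecture and the parameter grid below are placeholders, just to show how the pieces fit together.

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def make_model(hidden_units=32):
    # Tiny dense classifier; the architecture is just a placeholder.
    model = Sequential()
    model.add(Dense(hidden_units, activation='relu', input_shape=(784,)))
    model.add(Dense(10, activation='softmax'))
    model.compile(loss='sparse_categorical_crossentropy',
                  optimizer='adam', metrics=['accuracy'])
    return model

# The wrapper makes a Keras model look like an sklearn estimator.
clf = KerasClassifier(build_fn=make_model, verbose=0)

# GridSearchCV exhaustively tries every combination in the dict,
# cross-validating each one -- hence the waiting.
param_grid = {'hidden_units': [32, 64],
              'epochs': [5, 10],
              'batch_size': [32, 128]}
search = GridSearchCV(clf, param_grid, cv=3)
# search.fit(x_train, y_train); then inspect search.best_params_
```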

While I wasn’t previously familiar with sklearn’s GridSearch, the Keras example was a gentle introduction to the process. Even if I don’t end up using this specific scikit-learn tool, it’s great that the compatibility is built into Keras. Finding the right specific model is a frustration of mine, so I’m looking forward to deploying tools like these on Keras models.

Project Euler Probs 1-3

We’re taking a break from machine learning to do some fundamental mathematics. Project Euler is a website that hosts mathematics problems of increasing difficulty. In general, they can be solved within a few minutes on a typical computer if programmed properly (many can also be solved by hand with clever tricks that avoid laborious computation). While I find they lean a little heavily on factoring problems, they’re a great way to get comfortable with a language and with programmatic math. I worked through the first few problems this week.

For the first two problems, I think this worked very well. The problems aren’t particularly difficult, but solving them efficiently can still trip you up. I started with the most naive solution and worked up to better ones. If you’ve seen the problems before, you’ll probably jump directly to the better approaches, but it’s helpful to bootstrap your way there. And I’ve found that this kind of incremental iteration is about the only reliable way humans learn.
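For reference, here’s one straightforward pass at the first three problems. These aren’t the cleverest solutions (problem 1, for instance, has a closed-form answer via arithmetic series and inclusion-exclusion); they’re a reasonable middle ground between naive and optimized.

```python
# Problem 1: sum of the natural numbers below 1000 divisible by 3 or 5.
def problem1(limit=1000):
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

# Problem 2: sum of the even Fibonacci numbers not exceeding four million.
def problem2(limit=4000000):
    total, a, b = 0, 1, 2
    while a <= limit:
        if a % 2 == 0:
            total += a
        a, b = b, a + b
    return total

# Problem 3: largest prime factor of 600851475143, by trial division.
# Dividing out each factor as soon as it's found means only primes
# ever divide n, so what's left at the end is the largest prime factor.
def problem3(n=600851475143):
    factor = 2
    while factor * factor <= n:
        if n % factor == 0:
            n //= factor
        else:
            factor += 1
    return n

print(problem1(), problem2(), problem3())
```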

Bardic Lore Kids Stories

While last time I set up the code to enact my epic transfer learning plan, this week was about actually putting some data in. Thankfully, I was able to find a Dr. Seuss corpus as well as a set of Aesop’s Fables. I thought it would make sense to train on Dr. Seuss first, being nominally simpler language, and then transfer up to the more complex material.
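The transfer step itself is simple in Keras: train a model on the first corpus, then keep the weights and keep fitting on the second. Here’s a minimal sketch with a character-level LSTM; the vocabulary size, window length, and the seuss_/aesop_ arrays are all placeholders, not my actual setup.

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_chars = 64   # vocabulary size -- placeholder
seq_len = 40   # characters of context per sample -- placeholder

# Character-level next-character predictor.
model = Sequential()
model.add(LSTM(256, input_shape=(seq_len, n_chars)))
model.add(Dense(n_chars, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')

# Stage 1: the nominally simpler language.
# model.fit(seuss_x, seuss_y, epochs=50)

# Stage 2: same weights, new corpus -- this is the transfer step.
# model.fit(aesop_x, aesop_y, epochs=50)
```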

During the stream, there’s not enough time to train these models to completion. But offline, I let the biggest model train overnight on Dr. Seuss. The result? Well, it was something:

oh theer called a call. there they call all we likes. i hall now house do not hall. what a lot og they is nod, and the good! they should then a shall. i was for they are dear. and then this well feet of a fish. i like thing ase ase things and sis. you call tat a say. not is a good! and the good gust a lido! and then he this one hood. herd! i have a down. and when we said, oh u pat he to os of the, he head come old. i have no shad was to then moush box of therr will sing. whele theyee nerettleeee tleetle bedttle butp as a they… homle ha heold. then he say! i wish not his, and she call a pat feat the there. here. then thing house like. whele, thele things whes to! hees they thisgs they gave howtele seeetletle beetles. but, what theve lae fleas. whis is she theid head a shoull. thenee stord a chat the good fish, said thing sand and would bat bere
