With the Bardic Lore model spitting out Shakespearean-sounding (if nonsensical) text, I wanted to step back and try an iterative training process. I learned to read as a kid by reading, well, kids’ books. They use simple language, so they should be easier for a machine to learn from as well. And ideally, we can use a simple model trained on the simple data as a starting point for training a more complex model on more complex data (like chapter books). While I didn’t have time to find a suitable data set, I did implement the scaffolding for this kind of transfer learning. It’s like advancing through grade school for our machine learning reader.
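The "grade school" idea boils down to warm-starting: train on an easy corpus, then keep training the same model on a harder one instead of starting from scratch. Here's a toy sketch of that flow; the `CharBigramModel` class and the corpora are made up for illustration, not the actual model from this project:

```python
from collections import defaultdict

class CharBigramModel:
    """Toy character-level model to illustrate warm-start / transfer training.

    Hypothetical stand-in for a real language model -- the point is only
    that train() accumulates rather than resets, so a second, harder
    corpus builds on what the first one taught.
    """
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, text):
        # Update counts in place, so training on a second corpus
        # continues from (rather than replaces) the first.
        for a, b in zip(text, text[1:]):
            self.counts[a][b] += 1

    def predict(self, ch):
        # Most likely next character, or None if we've never seen ch.
        nxt = self.counts.get(ch)
        if not nxt:
            return None
        return max(nxt, key=nxt.get)

# The "curriculum": a kids'-book stage, then a chapter-book stage,
# both feeding the same model.
model = CharBigramModel()
model.train("the cat sat on the mat")
model.train("thou art the thing itself")
print(model.predict("t"))  # 'h' -- "th" is the most common pair starting with 't'
```

A real version would do the same thing with network weights instead of bigram counts: save the model trained on the simple corpus, reload it, and resume training on the harder one.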

If you have a good data set of kids’ books in a text format, I’d love to give this model a whirl. I had wanted to use Dr. Seuss (and so get some amusing phrases out of the model), but his work isn’t in the public domain. Maybe children’s poems or fairy tales would be suitable. Hmm, or perhaps an open-source work written by a human in the style of Dr. Seuss would be good enough to get a model going. All ideas welcome!