While I’m no longer producing videos, I am actively pursuing research. As an Applied Math PhD student at the University of Arizona, I’m researching new machine learning methods.

While neural networks are still extremely popular, I’m looking at how to better integrate them with classical ideas like Random Forests and, especially, Bayesian Additive Regression Trees (BART). To that end, I replicated and extended some work on Neural Random Forests. Biau et al. came up with a clever method of rewriting a decision tree (such as one from a Random Forest) as a numerically equivalent neural network. I applied this methodology to BART, because a forest is just trees, and I’m just making an artificial orchard.
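To give a flavor of the idea, here is a minimal sketch of that kind of tree-to-network translation. This is my own illustrative construction, not Biau et al.’s exact architecture or code: it encodes a fitted scikit-learn decision tree as three layers with hard threshold activations (one unit per split, one unit per leaf, leaf values as output weights), and checks that the forward pass matches the tree’s predictions. The helper names (`tree_to_network`, `network_predict`) are hypothetical.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

def tree_to_network(tree, n_features):
    t = tree.tree_
    internal = [i for i in range(t.node_count) if t.children_left[i] != -1]
    leaves   = [i for i in range(t.node_count) if t.children_left[i] == -1]
    idx = {n: k for k, n in enumerate(internal)}

    # Layer 1: one unit per internal node, outputting +1 if x[feature] > threshold, else -1.
    W1 = np.zeros((len(internal), n_features))
    b1 = np.zeros(len(internal))
    for n in internal:
        W1[idx[n], t.feature[n]] = 1.0
        b1[idx[n]] = -t.threshold[n]

    # Record each root-to-leaf path as (internal node, direction) pairs.
    paths = {}
    def walk(node, path):
        if t.children_left[node] == -1:
            paths[node] = path
            return
        walk(t.children_left[node],  path + [(node, -1.0)])   # left  = x <= threshold
        walk(t.children_right[node], path + [(node, +1.0)])   # right = x >  threshold
    walk(0, [])

    # Layer 2: one unit per leaf, firing only when every split on its path agrees.
    W2 = np.zeros((len(leaves), len(internal)))
    b2 = np.zeros(len(leaves))
    for k, leaf in enumerate(leaves):
        for node, direction in paths[leaf]:
            W2[k, idx[node]] = direction
        b2[k] = -len(paths[leaf]) + 0.5

    # Output layer: the leaf predictions.
    w3 = np.array([t.value[leaf].ravel()[0] for leaf in leaves])
    return W1, b1, W2, b2, w3

def network_predict(X, W1, b1, W2, b2, w3):
    h1 = np.where(X @ W1.T + b1 > 0, 1.0, -1.0)   # hard sign of each split decision
    h2 = np.where(h1 @ W2.T + b2 > 0, 1.0, 0.0)   # exactly one leaf unit fires per sample
    return h2 @ w3

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
params = tree_to_network(tree, X.shape[1])
# Matches tree.predict(X) exactly, up to floating-point ties at split thresholds.
print(np.allclose(network_predict(X, *params), tree.predict(X)))
```

The point of the construction is that the resulting network reproduces the tree exactly at initialization, so its weights can then be relaxed (e.g., by replacing the hard thresholds with smooth activations) and fine-tuned by gradient descent.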

I found, however, that this recovers results comparable to simply training a new neural network or forest from scratch. In fact, this is true even for the Random Forests used by Biau et al.: their table of results reports standard deviations (i.e., error bars), and the Neural Random Forest results always fall within those error bars. The same happens for the networks derived from BART, even when trying a variety of methods.

I’m still investigating better ways to marry BART-like approaches with neural networks. I hope to have something promising and publishable this year.