Talking Machines

If you haven’t come across specialized, technical podcasts, then Talking Machines is going to pleasantly surprise you. It is a show devoted to machine learning. Unlike podcasts such as Radiolab, which have a general mission to make science accessible to the public, Talking Machines has a much more precise focus and assumes that you already have a basic background in the field. It manages to provide sophisticated overviews of different sub-fields of machine learning and never shies away from technical details. Its creators, Katherine Gorman and Ryan Adams (who is a machine learning professor), are able to simplify concepts without dumbing them down. They’ve done an excellent job, and I hope they keep making this gem for a long time.

Support vector machines

Support vector machines are yet another classification method used in machine learning. Here is the most accessible tutorial I found on this topic, by Tristan Fletcher at UCL. It might also be useful to see how SVMs can be used for regression, i.e. to predict continuous variables instead of classes. This technical report by Steve Gunn at the University of Southampton gave me the most intuition of all the tutorials I found on SVMs, along with Geoff Hinton’s notes. And if you are interested in a library that implements different SVMs, you might want to take a look at Shogun. It provides interfaces to Matlab, R, Octave and Python. It seems like a pretty neat library, at least judging from its website.
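
Since both classification and regression with SVMs come up above, here is a minimal sketch of each. I’m using scikit-learn rather than Shogun (purely my choice for brevity; I can’t vouch for Shogun’s Python bindings across versions), and the toy data is made up:

```python
# Minimal sketch of SVMs for classification and regression.
# Uses scikit-learn instead of Shogun, just to illustrate the ideas.
import numpy as np
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(0)

# --- Classification: separate two Gaussian blobs ---
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf", C=1.0)   # soft-margin SVM with an RBF kernel
clf.fit(X, y)
print("train accuracy:", clf.score(X, y))
print("support vectors per class:", clf.n_support_)

# --- Regression (SVR): fit a noisy sine wave ---
Xr = np.linspace(0, 2 * np.pi, 100).reshape(-1, 1)
yr = np.sin(Xr).ravel() + rng.normal(0, 0.1, 100)

reg = SVR(kernel="rbf", C=1.0, epsilon=0.1)  # epsilon-insensitive loss
reg.fit(Xr, yr)
print("R^2 on training data:", reg.score(Xr, yr))
```

The `epsilon` parameter is what makes SVR different from ordinary regression: errors smaller than epsilon cost nothing, so the fit depends only on the points outside that tube, the support vectors.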

Bagging and boosting

Bagging and boosting are two ensemble methods in machine learning that combine several classifiers into a single classifier that outperforms any of its components. AdaBoost seems to be the most widely used boosting algorithm, or at least the one profs seem to be most excited about. A very detailed description of how and why it works can be found here. However, I find chapter 14 of Chris Bishop’s Pattern Recognition and Machine Learning much more accessible if you haven’t seen this material before.
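
To make the “how and why it works” concrete, here is a minimal from-scratch sketch of binary AdaBoost with decision stumps as the weak learners. The function names and toy data are my own; only the stump itself comes from scikit-learn:

```python
# Sketch of (binary, discrete) AdaBoost with decision stumps.
# Labels are assumed to be in {-1, +1}.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)           # start with uniform example weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y))  # weighted training error
        if err >= 0.5:                 # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / (err + 1e-10))
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified points
        w /= w.sum()                    # renormalize to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Weighted majority vote of the weak learners.
    agg = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(agg)

# Toy usage: two noisy clusters with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)
stumps, alphas = adaboost_fit(X, y)
print("train accuracy:", np.mean(adaboost_predict(stumps, alphas, X) == y))
```

The key design point is the weight update: examples the current stump gets wrong are up-weighted, so the next stump is forced to focus on them. Bagging, by contrast, trains each component on a bootstrap resample of the data and combines them with equal weights.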

I have the feeling that the more algorithms I am introduced to in my machine learning class, the fewer ways I know to solve problems, or at least, the less confident I am about which method is most appropriate. I guess that will come with time and experience, but still, I have to admit I now have more respect for statisticians.

Thankfully, Prof. Aaron Hertzmann’s notes for CSC411 at UofT are really helpful and accessible for first-time visitors to the machine learning jungle.

The backpropagation algorithm

One of the lectures in my machine learning class briefly described the backpropagation algorithm used to train neural networks. I wasn’t very satisfied with how the lecture notes described it, and there were many details I didn’t understand. Fortunately, this chapter from Raul Rojas’ Neural Networks: A Systematic Introduction does a much better and more detailed job of describing the algorithm. Hope it helps!
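
For a concrete picture, here’s a minimal NumPy sketch of backpropagation on a tiny one-hidden-layer network trained on XOR. All the names and hyperparameters here are my own; it just illustrates the forward and backward passes that Rojas derives:

```python
# Backpropagation by hand on a 2-4-1 sigmoid network, trained on XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for a 2-4-1 network.
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: compute activations layer by layer.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: for squared error L = 0.5 * (out - y)^2 and
    # sigmoid units, delta is dL/dz at each layer, propagated back
    # through the weights using the chain rule.
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)

    # Gradient descent step on all parameters.
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_h
    b1 -= lr * delta_h.sum(axis=0)

h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
print(out.round(3).ravel())  # should approach [0, 1, 1, 0]
```

The whole algorithm is those two `delta` lines: the output error is pushed backwards through the weights, and each layer’s gradient is just its input times its delta.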
