Bagging and boosting

Bagging and boosting are two ensemble methods in machine learning that aim to combine several classifiers into a single classifier that outperforms its components. AdaBoost seems to be the most widely used boosting algorithm, or at least the one profs seem to be most excited about. A very detailed description of how and why it works can be found here. However, if you haven't seen this material before, I find chapter 14 of Chris Bishop's Pattern Recognition and Machine Learning much more accessible.
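To make the boosting idea concrete, here is a minimal sketch of discrete AdaBoost with decision stumps as the weak learners, written in plain NumPy. The function names and the toy data are my own illustration, not taken from any of the references above: each round fits the best single-threshold stump on the current example weights, gives it a vote proportional to its accuracy, and upweights the examples it got wrong.

```python
import numpy as np

def fit_stump(X, y, w):
    """Find the weighted-error-minimizing decision stump.

    A stump predicts +1 when pol * (x_j - thr) >= 0, else -1.
    Returns (error, feature index, threshold, polarity)."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, rounds=10):
    """Train an ensemble of weighted stumps on labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        err, j, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-10)        # avoid division by zero / log of zero
        alpha = 0.5 * np.log((1 - err) / err)   # this stump's vote weight
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Classify by the sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)
```

On a 1-D toy problem like labels `[-1,-1,-1,+1,+1,+1,-1,-1,-1]` over points 0 through 8, no single stump can be right everywhere, but the weighted vote of several stumps can carve out the middle interval, which is exactly the point of boosting weak learners. Bagging differs in that it trains each component on a bootstrap resample and gives every vote equal weight, rather than reweighting examples sequentially.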

I have the feeling that the more algorithms I am introduced to in my machine learning class, the fewer ways I know to solve problems, or at least, the less confident I am about which method is most appropriate. I guess that will come with time and experience, but still, I have to admit I have more respect now for statisticians.

Thankfully, Prof. Aaron Hertzmann’s notes for CSC411 at UofT are really helpful and accessible for people who visit the machine learning jungle for the first time.
