Logistic and Softmax Regression, and their Relation to the Neural Network World

If you have a question about this talk, please contact Advait Sarkar.

Logistic Regression, along with its generalized counterpart Softmax Regression, is one of the most popular and best-performing generalized linear classification algorithms currently used in Machine Learning. In this lecture, I will explain some of the intuitions behind using and training these classifiers, and I will show how they are related to Neural Networks.
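The abstract does not include code, but the two functions it names can be sketched in a few lines of plain Python (function names here are illustrative, not from the talk):

```python
import math

def logistic(z):
    # Squashes a real-valued score z into a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def softmax(zs):
    # Generalizes the logistic function to K classes: exponentiate
    # each score, then normalize so the outputs sum to 1
    m = max(zs)  # subtract the max score for numerical stability
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]
```

Note that the two-class softmax recovers the logistic function: softmax([z, 0])[0] equals logistic(z).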

Using cleverly assembled examples from the harsh and unforgiving world of dating, I will reveal the probabilistic concepts that lie behind the logistic function. I will demonstrate how to train a binary logistic regression classifier using gradient descent, and I will show how those intuitions generalize naturally to the multi-class problem. Last but not least, we will see how these classifiers can be thought of as very simple Artificial Neural Networks, and thus can be used as layer components in more complicated Neural Network architectures.
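As a rough illustration of the training procedure mentioned above (a minimal batch gradient descent sketch, not the talk's own material; the data and hyperparameters are made up), binary logistic regression can be fitted by repeatedly stepping the weights against the gradient of the negative log-likelihood:

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    # xs: list of feature vectors; ys: list of 0/1 labels.
    # Batch gradient descent on the negative log-likelihood; the
    # gradient for weight j is sum_i (p_i - y_i) * x_ij, where
    # p_i = logistic(w . x_i + b) is the predicted probability.
    n_features = len(xs[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * n_features
        grad_b = 0.0
        for x, y in zip(xs, ys):
            p = logistic(sum(wj * xj for wj, xj in zip(w, x)) + b)
            err = p - y
            for j in range(n_features):
                grad_w[j] += err * x[j]
            grad_b += err
        w = [wj - lr * gj for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b
    return w, b
```

The same update generalizes to the multi-class case by replacing the logistic function with the softmax and keeping one weight vector per class.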

The slides can be viewed here.

This talk is part of the Computer Laboratory Research Students' Lectures 2014 series.

