Data science

In the next two lectures, we discuss neural networks, a general framework for learning.

History and recent surge

From Wang and Raj (2017):

The current AI wave began in 2012, when AlexNet (60 million parameters) cut the error rate in the ImageNet competition (classifying 1.2 million natural images) roughly in half.

Learning sources

This lecture draws heavily on the following sources.

Single-layer neural network (SLP)

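As an illustration (a minimal numpy sketch, not code from the lecture), a single-layer network is just an affine map followed by a nonlinear activation; the weights below are made up:

```python
import numpy as np

def sigmoid(z):
    """Logistic activation: squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def slp_forward(X, W, b):
    """Forward pass of a single-layer network: affine map, then nonlinearity."""
    return sigmoid(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # 5 observations, 3 input features
W = rng.normal(size=(3, 1))   # weights: 3 inputs -> 1 output unit
b = np.zeros(1)               # bias ("intercept") term
print(slp_forward(X, W, b))   # predicted probabilities, shape (5, 1)
```

With a sigmoid output and cross-entropy loss, this model is exactly logistic regression.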

Multi-layer neural network (MLP)
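
Stacking several such layers, with a nonlinearity between them, gives a multi-layer network. A minimal numpy sketch of the forward pass (the hidden-layer width of 8 is arbitrary):

```python
import numpy as np

def relu(z):
    """Rectified linear unit, the most common hidden-layer activation."""
    return np.maximum(z, 0.0)

def mlp_forward(X, params):
    """Forward pass: ReLU on each hidden layer, linear output layer."""
    h = X
    for W, b in params[:-1]:
        h = relu(h @ W + b)
    W_out, b_out = params[-1]
    return h @ W_out + b_out

rng = np.random.default_rng(0)
params = [(rng.normal(size=(3, 8)), np.zeros(8)),   # input (3) -> hidden (8)
          (rng.normal(size=(8, 1)), np.zeros(1))]   # hidden (8) -> output (1)
X = rng.normal(size=(5, 3))
print(mlp_forward(X, params).shape)                 # (5, 1)
```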

Expressivity of neural networks

Universal approximation properties
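
One common statement of the result (Cybenko 1989; Hornik 1991): for any continuous function $f$ on a compact set $K \subset \mathbb{R}^d$ and any $\varepsilon > 0$, there exist a width $N$ and parameters $a_j, b_j \in \mathbb{R}$, $w_j \in \mathbb{R}^d$ such that

$$\sup_{x \in K} \left| f(x) - \sum_{j=1}^{N} a_j \, \sigma(w_j^\top x + b_j) \right| < \varepsilon,$$

where $\sigma$ is a sigmoidal activation. That is, a single hidden layer already suffices for approximation; the theorem says nothing about how large $N$ must be or how to find the weights.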

Practical issues

Neural networks are not the fully automatic tool they are sometimes advertised to be; as with all statistical models, subject matter knowledge should be, and often is, used to improve their performance.

Convolutional neural networks (CNN)

Source: https://colah.github.io/posts/2014-07-Conv-Nets-Modular/

Source: https://indoml.com/2018/03/07/student-notes-convolutional-neural-networks-cnn-introduction/
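
The core operation is a small filter slid across the image. A minimal numpy sketch (independent of the linked notes) of the "valid" cross-correlation that deep learning libraries call convolution:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: at each location, take the
    elementwise product of the kernel with the patch under it and sum."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25.0).reshape(5, 5)
edge_kernel = np.array([[1.0, -1.0]])   # crude horizontal-edge detector
print(conv2d(image, edge_kernel))       # output shape (5, 4)
```

Because the same small kernel is reused at every location, a convolutional layer has far fewer parameters than a dense layer on the same input.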

Example: MNIST and LeNet-5
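
A common Keras-style rendering of the LeNet-5 architecture (layer sizes follow LeCun et al. 1998; the padding and activation choices below are one of several modern adaptations for 28x28 MNIST inputs):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                  # grayscale MNIST digits
    layers.Conv2D(6, 5, padding="same", activation="tanh"),
    layers.AveragePooling2D(pool_size=2),
    layers.Conv2D(16, 5, activation="tanh"),
    layers.AveragePooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(120, activation="tanh"),
    layers.Dense(84, activation="tanh"),
    layers.Dense(10, activation="softmax"),          # 10 digit classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```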

Example: ImageNet and AlexNet

Source: http://cs231n.github.io/convolutional-networks/
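
Assuming torchvision is available, a pretrained AlexNet can be loaded in a few lines, which also lets us check the parameter count quoted earlier:

```python
from torchvision import models

# Download an AlexNet with ImageNet-pretrained weights (fetched on first use).
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
alexnet.eval()

# Total parameter count: about 61 million, close to the 60 million quoted above.
n_params = sum(p.numel() for p in alexnet.parameters())
print(f"{n_params / 1e6:.0f}M parameters")
```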

Recurrent neural networks (RNN)
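
A minimal numpy sketch of a vanilla RNN cell: the same weights are applied at every time step, and the hidden state carries information forward through the sequence:

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h),
    with the same weights reused at every time step."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:                        # xs: sequence of input vectors
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.array(states)             # hidden state at each time step

rng = np.random.default_rng(0)
xs = rng.normal(size=(7, 3))            # sequence of length 7, 3 features
H = rnn_forward(xs,
                0.1 * rng.normal(size=(4, 3)),   # input -> hidden weights
                0.1 * rng.normal(size=(4, 4)),   # hidden -> hidden weights
                np.zeros(4))
print(H.shape)                          # (7, 4)
```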

Generative Adversarial Networks (GANs)

"The coolest idea in deep learning in the last 20 years."
- Yann LeCun, on GANs
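
The idea, introduced by Goodfellow et al. (2014), is a two-player game: a generator $G$ maps noise $z$ to synthetic samples, while a discriminator $D$ tries to tell real data from generated data. Training solves the minimax problem

$$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\bigl[\log D(x)\bigr] + \mathbb{E}_{z \sim p_z}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr].$$

At the game's equilibrium, the generator's distribution matches the data distribution and the discriminator can do no better than guessing.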