CS565500 Large-Scale Machine Learning

Fundamentals of machine learning, scalable techniques, and deep learning.

Description

This class brings machine learning theory, tools, and real-world datasets together to teach students how to analyze massive data effectively and efficiently. It is designed to be self-contained and consists of 3 parts:

In the first part, we review the required mathematics. In the second part, we introduce fundamental machine learning concepts, models, and algorithms. Lastly, in the third part, we discuss how large-scale machine learning differs from small-scale learning tasks. In particular, we focus on deep learning techniques (big models) and present some recent, exciting advances in the field.

Instructor

Teaching Assistants

Time & Location

  • Lecture: Tue. 10am-12pm at Delta 109
  • Lab: Thu. 9-11am at Delta 109
  • Office hour: Wed. 3-5pm at Delta 723/724

Grading Policy

  • Assignments (x4): 40%
  • Midterm exam: 20%
  • Competitions (x2): 40%

Prerequisites

This course is intended for senior undergraduate and junior graduate students who understand

  • Computer Programming,
  • Calculus,
  • Linear Algebra, and
  • Probability.
We use Python 3 as the main programming language throughout the course. Although not required, a background in scientific computing will be helpful.

Announcement

Curriculum

The class was offered in Fall 2016 and has ended. However, we will continue updating the materials. If you have any feedback, feel free to contact: shwu [AT] cs.nthu.edu.tw

Lecture 01

Introduction

What's ML? | About This Course... | FAQ

Slides Notation

Scientific Python 101

This lab guides you through the setup of a scientific Python environment and provides useful references for self-study.
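As a quick sanity check of a working scientific Python setup (a minimal sketch, not part of the lab notebook), the snippet below creates a small matrix with NumPy and uses broadcasting to center its columns:

```python
import numpy as np

# Create a 4x3 matrix (4 samples, 3 features) and compute per-feature means.
X = np.arange(12, dtype=float).reshape(4, 3)
col_means = X.mean(axis=0)       # shape (3,)

# Broadcasting subtracts the mean vector from every row at once.
X_centered = X - col_means

print(col_means)                 # [4.5 5.5 6.5]
print(X_centered.mean(axis=0))   # all (numerically) zero
```

If this runs and prints zero column means, NumPy is installed and working.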

Notebook

Lecture 02

Linear Algebra

Span & Linear Dependence | Norms | Eigendecomposition | Singular Value Decomposition | Traces | Determinant

Video Slides

Data Exploration & PCA

This lab guides you through the process of Exploratory Data Analysis (EDA) and discusses how you can leverage Principal Component Analysis (PCA) to visualize and understand high-dimensional data.

Notebook

Lecture 03

Probability & Information Theory

Random Variables & Probability Distributions | Multivariate & Derived Random Variables | Bayes’ Rule & Statistics | Principal Components Analysis | Information Theory | Decision Trees & Random Forest

Video Slides

Decision Trees & Random Forest

In this lab, we will apply the Decision Tree and Random Forest algorithms to the classification and dimension reduction problems using the Wine dataset.
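A minimal Scikit-learn sketch of the lab's setting (the exact notebook steps may differ): fit a tree and a forest on the Wine dataset, then read feature importances, which can drive dimension reduction by keeping only high-scoring features.

```python
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)           # 178 samples, 13 features, 3 classes
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print(tree.score(X_te, y_te))               # single tree accuracy
print(forest.score(X_te, y_te))             # ensemble is usually stronger

# Indices of the 3 most important features, usable for dimension reduction.
print(forest.feature_importances_.argsort()[::-1][:3])
```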

Notebook

Lecture 04

Numerical Optimization

Numerical Computation | Optimization Problems | Unconstrained Optimization | Stochastic Gradient Descent | Perceptron | Adaline | Constrained Optimization | Linear & Polynomial Regression | Duality

Video Slides

Perceptron & Adaline

In this lab, we will guide you through the implementation of the Perceptron and Adaline, two of the earliest machine learning algorithms for classification. We will also discuss how to train these models using optimization techniques.
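The Perceptron update rule fits in a few lines of NumPy (a sketch of the idea, not the lab's implementation; the toy data below is hypothetical): on each misclassified sample, nudge the weights toward that sample's label.

```python
import numpy as np

def perceptron(X, y, epochs=10, eta=0.1):
    """Train a perceptron on labels in {-1, +1}; w[0] is the bias."""
    w = np.zeros(X.shape[1] + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = np.sign(w[1:] @ xi + w[0]) or 1.0  # treat sign(0) as +1
            if pred != yi:                            # update only on mistakes
                w[1:] += eta * yi * xi
                w[0] += eta * yi
    return w

# A tiny linearly separable toy problem.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
preds = np.sign(X @ w[1:] + w[0])
print(preds)   # matches y once the data is separated
```

Adaline differs only in updating on the continuous activation (a gradient step on squared error) rather than on the thresholded prediction.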

Notebook

Regression

This lab guides you through linear and polynomial regression using the Housing dataset. We will also extend the Decision Tree and Random Forest classifiers to solve the regression problem.
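One key takeaway can be sketched on synthetic data (hypothetical, standing in for the Housing dataset): polynomial regression is just linear regression on expanded features.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Noise-free samples from the quadratic y = 2x^2 - 3x + 1.
rng = np.random.RandomState(0)
x = rng.uniform(-3, 3, size=(50, 1))
y = 2 * x[:, 0] ** 2 - 3 * x[:, 0] + 1

lin = LinearRegression().fit(x, y)
poly = make_pipeline(PolynomialFeatures(degree=2),
                     LinearRegression()).fit(x, y)

print(lin.score(x, y))   # a straight line cannot capture the curvature
print(poly.score(x, y))  # degree-2 features recover the true model
```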

Notebook

Lecture 05

Learning Theory & Regularization

Point Estimation | Bias & Variance | Consistency | Decomposing Generalization Error | Weight Decay | Validation

Video Slides

Regularization

In this lab, we will guide you through some common regularization techniques such as weight decay, sparse weight, and validation.
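Weight decay can be illustrated with Ridge regression on synthetic data (a sketch with hypothetical data, not the lab's exact setup): when features outnumber informative signals, the L2 penalty shrinks the weights and tames overfitting.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split

# 50 samples, 40 features, but only 3 features carry signal.
rng = np.random.RandomState(0)
X = rng.randn(50, 40)
w_true = np.zeros(40)
w_true[:3] = [3.0, -2.0, 1.0]
y = X @ w_true + 0.5 * rng.randn(50)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=30, random_state=0)

ols = LinearRegression().fit(X_tr, y_tr)    # unregularized: prone to overfit here
ridge = Ridge(alpha=10.0).fit(X_tr, y_tr)   # L2 penalty = weight decay

print(ols.score(X_te, y_te))
print(ridge.score(X_te, y_te))
print(np.abs(ridge.coef_).sum(), np.abs(ols.coef_).sum())  # shrunken weights
```

With 30 training samples and 40 features, the unregularized fit interpolates the training data, while weight decay keeps the coefficients small.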

Notebook

Lecture 06

Probabilistic Models

Maximum Likelihood Estimation | Maximum A Posteriori Estimation | Bayesian Estimation

Video Slides

Logistic Regression & Metrics

In this lab, we will guide you through the practice of Logistic Regression. We will also introduce some common evaluation metrics other than the accuracy we have used so far.
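A minimal sketch of the workflow (assuming Scikit-learn; the Breast Cancer dataset stands in for the lab's data): fit a logistic regression on standardized features and report several metrics, since accuracy alone can mislead on imbalanced data.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, f1_score,
                             precision_score, recall_score)
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

scaler = StandardScaler().fit(X_tr)          # fit on training data only
clf = LogisticRegression(max_iter=1000).fit(scaler.transform(X_tr), y_tr)
y_pred = clf.predict(scaler.transform(X_te))

print(accuracy_score(y_te, y_pred))
print(precision_score(y_te, y_pred))  # of predicted positives, how many are right
print(recall_score(y_te, y_pred))     # of true positives, how many are found
print(f1_score(y_te, y_pred))         # harmonic mean of precision and recall
```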

Notebook

Lecture 07

Non-Parametric Methods & SVMs

KNNs | Parzen Windows | Local Models | Support Vector Classification (SVC) | Nonlinear SVC | Kernel Trick

Video Slides

SVMs & Scikit-Learn Pipelines

In this lab, we will classify nonlinearly separable data using the KNN and SVM classifiers. We will show how to pack multiple data preprocessing steps into a single Pipeline in Scikit-learn to simplify the training workflow.
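A sketch of the Pipeline idea (the moons data here is a stand-in for the lab's dataset): chaining the scaler and the classifier keeps fit/predict a single call and ensures the scaler is fit only on training data, avoiding test-set leakage.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Two interleaving half-circles: not linearly separable.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

pipe = Pipeline([("scale", StandardScaler()),
                 ("svc", SVC(kernel="rbf", gamma="scale", C=1.0))])
pipe.fit(X_tr, y_tr)                  # scaler and SVM are fit in one step

print(pipe.score(X_te, y_te))         # RBF kernel handles the nonlinearity
```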

Notebook

Lecture 08

Cross Validation & Ensembling

CV | How Many Folds? | Voting | Bagging | Boosting | Why AdaBoost Works?

Video Slides

CV & Ensembling

In this lab, we will guide you through the cross validation technique for hyperparameter selection. We will also practice and compare some ensemble learning techniques.
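Both ideas fit in a short Scikit-learn sketch (Iris stands in for the lab's data; the parameter grid is illustrative): grid search with cross validation picks a hyperparameter without touching a test set, and a soft-voting ensemble averages predicted probabilities of diverse models.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# 5-fold CV over candidate tree depths.
grid = GridSearchCV(DecisionTreeClassifier(random_state=0),
                    param_grid={"max_depth": [1, 2, 3, 5]}, cv=5)
grid.fit(X, y)
print(grid.best_params_)

# Soft voting: average class probabilities of two different model families.
vote = VotingClassifier([("lr", LogisticRegression(max_iter=1000)),
                         ("dt", DecisionTreeClassifier(random_state=0))],
                        voting="soft")
scores = cross_val_score(vote, X, y, cv=5)
print(scores.mean())
```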

Notebook

Midterm Competition

Predicting News Popularity

In this competition, you are provided with raw news articles and the goal is to use everything you have learned so far to predict whether a news article will be intensively shared in online social networking services. Good luck!

Notebook

Lecture 09

Large-Scale Machine Learning

When ML Meets Big Data... | Representation Learning | Curse of Dimensionality | Trade-Offs in Large-Scale Learning | SGD-Based Optimization

Video Slides

Lecture 10

Neural Networks: Design

NN Basics | Learning the XOR | Back Propagation | Cost Function & Output Neurons | Hidden Neurons | Architecture Design

Video Slides

Keras & Word2Vec

In this lab, we will show how to train a neural network (NN) for text classification using the Keras library. Then we train another neural network, called word2vec, that embeds words into a dense vector space where semantically similar words are mapped to nearby points.
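The embedding idea can be illustrated without any training (a toy sketch, not the lab's Keras code; the vectors below are hypothetical hand-made embeddings): words live in a dense vector space, and cosine similarity measures semantic closeness.

```python
import numpy as np

# Hypothetical 3-d embeddings; real word2vec vectors are learned, not hand-set.
emb = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.88, 0.82, 0.15]),
    "apple": np.array([0.10, 0.20, 0.95]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal ones."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(emb["king"], emb["queen"]))  # near 1: semantically similar
print(cosine(emb["king"], emb["apple"]))  # much smaller: unrelated words
```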

Notebook

Lecture 11

Neural Networks: Optimization & Regularization

Momentum & Nesterov Momentum | AdaGrad & RMSProp | Batch Normalization | Continuation Methods & Curriculum Learning | Weight Decay | Data Augmentation | Dropout | Manifold Regularization | Domain-Specific Model Design

Video Slides

NN Regularization

In this lab, we will apply some regularization techniques to neural networks over the CIFAR-10 dataset and see how they improve the generalizability.

Notebook

Lecture 12

Convolutional Neural Networks

ConvNet Architecture | Kernel/Filter & Stride & Padding | Pooling | Dilated Convolutions | LeNet | AlexNet | VGGNet | GoogLeNet & Inception Modules | Residual Networks | DenseNets | Stacked Hourglass Networks | Deep Compression

Guest Lecture by Prof. Hwann-Tzong Chen

Slides

Visualizing CNNs

TBA

Notebook

Lecture 13

Recurrent Neural Networks

Vanilla RNNs | Design Alternatives | Backprop through Time (BPTT) | LSTM | Parallelism & Teacher Forcing | Attention | Explicit Memory | Adaptive Computation Time (ACT) | Memory Networks | Google Neural Machine Translation

Video Slides

Seq2Seq Learning for Machine Translation

TBA

Notebook

Lecture 14

Unsupervised Learning

Clustering | Recommendation & Factorization | Dimension Reduction | Predictive Learning | Autoencoders | Manifold Learning | Synthesis & Generation | Generative Adversarial Networks (GANs)

Video Slides

Autoencoders & GANs

TBA

Notebook

Lecture 15

Semisupervised/Transfer Learning and the Future

Label Propagation | Semisupervised GANs | Semisupervised Clustering | Multitask Learning | Weight Initialization & Fine-Tuning | Domain Adaptation | Zero-Shot Learning | Unsupervised Transfer Learning | Future at a Glance

Video Slides

Final Competition

Image Captioning

Given the Microsoft COCO dataset, your task is to devise and train a model that generates a suitable sentence describing an image. Here is an example:

A boy who just submitted his code and is waiting for the ranking in the final competition.

Notebook

Resources

The following provides links to some useful online resources. If this course starts your ML journey, don't stop here. Enroll yourself in the advanced courses shown below to learn more.

Other Course Materials

For more course materials (such as assignments, score sheets, etc.) and the online forum, please refer to the iLMS system.

iLMS System

Reference Books

  • Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep Learning, MIT Press, 2016, ISBN: 0262035618

  • Trevor Hastie, Robert Tibshirani, Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition, Springer, 2009, ISBN: 0387848576

  • Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006, ISBN: 0387310738

  • Sebastian Raschka, Python Machine Learning, Packt Publishing, 2015, ISBN: 1783555130

Online Courses