COURSE OUTLINE
Session 1
Neural networks: history and basic idea. The relationship between biology and mathematics. The perceptron: basic construction, training, activation functions.
Practice: intro to Deep Learning frameworks
Neural network basics
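A minimal sketch of the perceptron and its training rule from this session (illustrative only, not the course's practice code; NumPy and the AND toy problem are assumptions):

```python
# Classic perceptron with a step activation, trained on AND (toy example).
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)          # AND targets

w, b, lr = np.zeros(2), 0.0, 0.1
for epoch in range(20):
    for xi, yi in zip(X, y):
        pred = 1.0 if xi @ w + b > 0 else 0.0    # step activation
        # Perceptron rule: nudge weights toward misclassified examples
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

print(w, b)   # weights of a separating hyperplane for AND
```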
Session 2
Feedforward neural networks. Gradient descent basics. The computation graph and computing gradients on it (backpropagation).
Practice: a feedforward neural network on classic datasets
Feedforward neural networks
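A minimal sketch of backpropagation on a computation graph, as covered in this session (illustrative; NumPy, the one-hidden-layer graph, and all sizes are assumptions):

```python
# Forward pass through a tiny graph: x -> W1 -> tanh -> W2 -> squared loss,
# then a backward pass applying the chain rule node by node in reverse.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3); t = np.array([1.0])
W1 = rng.normal(size=(4, 3)); W2 = rng.normal(size=(1, 4))

h_pre = W1 @ x                    # linear node
h = np.tanh(h_pre)                # activation node
y = W2 @ h                        # output node
loss = 0.5 * np.sum((y - t) ** 2)

dy = y - t                        # dL/dy
dW2 = np.outer(dy, h)             # dL/dW2
dh = W2.T @ dy                    # dL/dh (gradient flows backward)
dh_pre = dh * (1 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2
dW1 = np.outer(dh_pre, x)         # dL/dW1
```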
Session 3
Gradient descent: motivation and problems. Modifications and ideas: momentum, Nesterov’s momentum, Adagrad, RMSProp, Adam. Second-order methods.
Practice: comparing gradient descent variations
Optimization in neural networks
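A minimal sketch of two of this session's update rules, momentum and Adam, on an ill-conditioned quadratic (illustrative; NumPy and all constants are assumptions):

```python
import numpy as np

def grad(w):                           # gradient of f(w) = 0.5 * w^T A w
    return np.diag([1.0, 10.0]) @ w    # ill-conditioned: plain SGD zigzags

# SGD with (heavy-ball) momentum: accumulate a velocity vector
w, v, lr, beta = np.array([1.0, 1.0]), np.zeros(2), 0.02, 0.9
for _ in range(100):
    v = beta * v + grad(w)
    w = w - lr * v
print(w)

# Adam: per-coordinate first/second moment estimates with bias correction
w, m, s = np.array([1.0, 1.0]), np.zeros(2), np.zeros(2)
b1, b2, eps = 0.9, 0.999, 1e-8
for t in range(1, 101):
    g = grad(w)
    m = b1 * m + (1 - b1) * g
    s = b2 * s + (1 - b2) * g ** 2
    m_hat, s_hat = m / (1 - b1 ** t), s / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(s_hat) + eps)
print(w)
```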
Session 4
Regularisation: L1, L2, early stopping. Dropout. Data augmentation.
Practice: applying different regularisation approaches
Regularisation in neural networks
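A minimal sketch combining three of this session's techniques: L2 via weight decay, dropout, and early stopping on a validation set (illustrative; PyTorch, the synthetic data, and all hyperparameters are assumptions):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
X, y = torch.randn(256, 20), torch.randint(0, 2, (256,))   # synthetic data
Xtr, ytr, Xva, yva = X[:192], y[:192], X[192:], y[192:]

# Dropout inside the model; L2 via the optimizer's weight_decay argument
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(),
                      nn.Dropout(p=0.5), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
loss_fn = nn.CrossEntropyLoss()

best, bad, patience = float("inf"), 0, 5
for epoch in range(200):
    model.train()
    opt.zero_grad(); loss_fn(model(Xtr), ytr).backward(); opt.step()
    model.eval()
    with torch.no_grad():
        val = loss_fn(model(Xva), yva).item()
    if val < best:
        best, bad = val, 0
    else:
        bad += 1
        if bad >= patience:    # early stopping: validation stopped improving
            break
```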
Session 5
Weight initialisation: the supervised pre-training idea, why straightforward random initialisation fails, Xavier initialisation. Covariate shift and batch normalisation.
Practice: putting everything together
Weight initialisation and batch normalisation
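A minimal sketch of Xavier initialisation and batch normalisation (illustrative; PyTorch and the layer sizes are assumptions):

```python
import torch
import torch.nn as nn

# Xavier (Glorot) init draws weights with Var(w) = 2 / (fan_in + fan_out),
# keeping activation variance roughly constant across layers.
layer = nn.Linear(512, 512)
nn.init.xavier_uniform_(layer.weight)
nn.init.zeros_(layer.bias)

# BatchNorm normalises each feature over the mini-batch, then applies a
# learnable scale and shift, counteracting internal covariate shift.
bn = nn.BatchNorm1d(512)
x = torch.randn(64, 512)
h = bn(torch.tanh(layer(x)))
print(h.mean().item(), h.std().item())   # roughly 0 and 1
```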
Session 6
Convolutional architectures: idea and structure. Examples. Deconvolution and visualisation in CNNs.
Practice: CNNs for MNIST
Convolutional neural networks I
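A minimal sketch of the kind of CNN this session builds for MNIST (illustrative, not the official practice notebook; PyTorch and the channel counts are assumptions):

```python
import torch
import torch.nn as nn

# Convolutions learn local filters shared across spatial positions;
# pooling downsamples; a final linear layer classifies the 10 digits.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                        # 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                        # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),
)
x = torch.randn(8, 1, 28, 28)               # a dummy MNIST-shaped batch
print(model(x).shape)                       # torch.Size([8, 10])
```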
Session 7
Modern convolutional architectures: AlexNet, VGG, Network in Network, Inception. Residual connections and ResNet.
Practice: image classification
Convolutional neural networks II
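A minimal sketch of the residual connection that defines ResNet (illustrative; PyTorch is an assumption, and this is the basic block without the projection shortcut used when channel counts change):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic ResNet block: output = relu(F(x) + x). The identity shortcut
    lets gradients flow around the convolutions, enabling very deep nets."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)           # the residual connection

block = ResidualBlock(64)
print(block(torch.randn(2, 64, 32, 32)).shape)   # shape is preserved
```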
Session 8
Single-stage detectors: YOLO, SSD, YOLOv2, and YOLOv3.
Practice: single-stage object detection
Object detection I
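A minimal sketch of the single-stage idea behind YOLO and SSD: one convolutional head predicts boxes for every grid cell in a single pass (illustrative; PyTorch, and the anchor/class/grid counts are invented for the example):

```python
import torch
import torch.nn as nn

# For each of the 13x13 grid cells and each of B anchor boxes, predict
# 4 box offsets + 1 objectness score + C class scores all at once.
B, C = 3, 20
head = nn.Conv2d(256, B * (5 + C), kernel_size=1)

features = torch.randn(1, 256, 13, 13)       # backbone feature map
pred = head(features).view(1, B, 5 + C, 13, 13)
print(pred.shape)                            # [1, 3, 25, 13, 13]
```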
Session 9
Mid-term test
Session 10
Two-stage detectors: R-CNN, Fast R-CNN, Faster R-CNN, R-FCN, Feature Pyramid Networks (FPN), focal loss, and RetinaNet.
Practice: two-stage object detection
Object detection II
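A minimal sketch of the focal loss from this session (the RetinaNet loss; PyTorch and the toy numbers are assumptions):

```python
import torch

def focal_loss(p, target, alpha=0.25, gamma=2.0):
    """Down-weights easy examples by (1 - p_t)^gamma so training focuses
    on hard ones; alpha balances foreground vs background."""
    p_t = p * target + (1 - p) * (1 - target)
    alpha_t = alpha * target + (1 - alpha) * (1 - target)
    return (-alpha_t * (1 - p_t) ** gamma * torch.log(p_t)).mean()

p = torch.tensor([0.9, 0.1, 0.6])    # predicted foreground probabilities
t = torch.tensor([1.0, 0.0, 1.0])    # ground-truth labels
print(focal_loss(p, t))              # easy examples contribute almost nothing
```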
Session 11
Classical approaches: edge detection, region growing, graph-based image segmentation, N4-Fields. Fully convolutional networks: FCN, DeconvNet, SegNet, U-Net, TernausNet. Instance segmentation: FCIS, DeepMask, Mask R-CNN.
Practice: deep learning for segmentation
Segmentation
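A minimal sketch of the U-Net pattern central to this session: an encoder-decoder whose skip connection concatenates encoder features into the decoder (illustrative; PyTorch, a single resolution level, and all channel counts are assumptions):

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """U-Net in miniature: downsample, upsample, then concatenate the
    encoder feature map with the decoder's (the skip connection)."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Conv2d(1, 16, 3, padding=1)
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Conv2d(16, 16, 3, padding=1)
        self.up = nn.ConvTranspose2d(16, 16, 2, stride=2)
        self.out = nn.Conv2d(32, 2, 1)        # 32 = 16 skip + 16 upsampled

    def forward(self, x):
        e = torch.relu(self.enc(x))
        m = torch.relu(self.mid(self.down(e)))
        u = self.up(m)
        return self.out(torch.cat([u, e], dim=1))   # per-pixel class logits

net = TinyUNet()
print(net(torch.randn(1, 1, 64, 64)).shape)   # [1, 2, 64, 64]
```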
Session 12
Style transfer: problem setting, models for style transfer. A neural algorithm of artistic style. Perceptual losses. Variations.
Practice: style transfer model
Style transfer
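A minimal sketch of the style representation in the neural algorithm of artistic style: channel-wise feature correlations captured by a Gram matrix (illustrative; PyTorch, with random tensors standing in for VGG feature maps):

```python
import torch

def gram_matrix(features):
    # Style as channel-by-channel correlations of a feature map,
    # normalised by the number of entries.
    b, c, h, w = features.shape
    f = features.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

style = torch.randn(1, 64, 32, 32)       # stand-in for a style feature map
generated = torch.randn(1, 64, 32, 32)   # stand-in for the generated image's
style_loss = ((gram_matrix(style) - gram_matrix(generated)) ** 2).mean()
print(style_loss)
```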
Session 13
Generative models and neural networks. Types of generative models. Generative adversarial networks: the basic idea, DCGAN, adversarial autoencoders (AAE), modern applications.
Practice: AAE on MNIST
Generative adversarial networks I
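A minimal sketch of the adversarial game itself on toy 2-D data (illustrative; PyTorch, the tiny MLPs, and the "real" distribution are assumptions):

```python
import torch
import torch.nn as nn

# D learns to tell real from fake; G learns to fool D.
G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))
D = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.randn(32, 2) + 3.0           # toy "real" distribution
    fake = G(torch.randn(32, 16))
    # Discriminator step: push D(real) -> 1, D(fake) -> 0
    d_loss = (bce(D(real), torch.ones(32, 1))
              + bce(D(fake.detach()), torch.zeros(32, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator step: make D output 1 on fakes
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```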
Session 14
GANs for image generation. Conditional GANs. Wasserstein GANs. Various loss functions in GANs. Stacked GAN. #thispersondoesnotexist.
Practice: GAN for image generation
Generative adversarial networks II
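A minimal sketch of the Wasserstein GAN critic objective from this session (illustrative; PyTorch, the toy critic, and random stand-in batches are assumptions):

```python
import torch
import torch.nn as nn

# WGAN: the critic maximises E[D(real)] - E[D(fake)], so its loss is the
# negation; there is no sigmoid or log loss. The original WGAN enforces
# the required Lipschitz constraint by clipping the critic's weights.
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(32, 2) + 3.0
fake = torch.randn(32, 2)                     # stand-in for G(z)
loss_c = -(critic(real).mean() - critic(fake).mean())
opt_c.zero_grad(); loss_c.backward(); opt_c.step()
for p in critic.parameters():
    p.data.clamp_(-0.01, 0.01)                # weight clipping
```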
Session 15
Final exam