NEURAL NETWORKS AND COMPUTER VISION
SERGEY NIKOLENKO
HARBOUR.SPACE
We offer innovative university degrees taught in English by industry leaders from around the world, aimed at giving our students meaningful and creatively satisfying top-level professional futures. We think the future is bright if you make it so.
Deep learning, i.e., training multi-layered neural architectures, is one of the oldest tools in machine learning, but it has revolutionised the industry over the last decade. In this course, we begin with the fundamentals of deep learning and then proceed to modern architectures for basic computer vision problems: image classification, object detection, segmentation, and others.
Modern computer vision is almost entirely based on deep convolutional neural networks, so this is a natural fit that lets us explore interesting architectures while at the same time staying focused and not going into too wide a survey of the entire field of deep learning. Computer vision is also a key element in robotics: vision systems are necessary for navigation, localisation and mapping, and scene understanding, which are all key problems for creating industrial and home robots.
The course is supported by Neuromation and features practical assignments done over the Neuromation platform.
ABOUT SERGEY
PhD, CRO at Neuromation, Laboratory Head at Steklov Mathematical Institute,
Associate Professor at St. Petersburg State University
Sergey Nikolenko is a computer scientist with vast experience in machine learning and data analysis, algorithm design and analysis, theoretical computer science, and algebra. He graduated from St. Petersburg State University in 2005, majoring in algebra (Chevalley groups), and earned his Ph.D. at the Steklov Mathematical Institute at St. Petersburg in 2009 in theoretical computer science (circuit complexity and theoretical cryptography). Since then, Sergey has been interested in machine learning and probabilistic modeling, producing theoretical results and working on practical projects for industry.
Sergey Nikolenko currently serves as the Chief Research Officer at Neuromation, leads the Artificial Intelligence Lab at the Steklov Mathematical Institute at St. Petersburg, and teaches at St. Petersburg State University and the Higher School of Economics. Dr. Nikolenko has published more than 150 research papers, including papers in top computer science journals and conferences, as well as several books, among them a bestselling "Deep Learning" textbook (in Russian).
WHAT YOU WILL LEARN
As a result of the course, the students will:
- Learn to apply Deep Learning techniques in practice
- Understand the theory behind Deep Learning, from the basics to state-of-the-art approaches
- Learn how to train various deep neural architectures
- Understand a wide variety of neural architectures suited for real-life computer vision problems
- Gain essential experience with main Deep Learning frameworks
SKILLS:
- Machine Learning
- Algorithms for Networking
- Bioinformatics
- Mathematical Modeling
DATE: 20 May - 7 Jun, 2019
DURATION: 3 Weeks
LECTURES: 3 Hours per day
LANGUAGE: English
LOCATION: Barcelona, Harbour.Space Campus
COURSE TYPE: Offline
COURSE OUTLINE
Session 1
Neural network basics
Neural networks: history and basic idea. Relationship between biology and mathematics. The perceptron: basic construction, training, activation functions.
Practice: intro to Deep Learning frameworks
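To make the session concrete, here is a minimal perceptron sketch in plain NumPy. It is illustrative only: the toy OR dataset, learning rate, and epoch count are our own choices, and the actual course assignments run on the Neuromation platform with full Deep Learning frameworks.

```python
# A minimal perceptron sketch on the toy OR problem (illustrative
# choices throughout, not course material).
import numpy as np

def step(z):
    # Heaviside step activation: 1 where z >= 0, else 0.
    return (z >= 0).astype(float)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)  # logical OR labels

rng = np.random.default_rng(0)
w, b, lr = rng.normal(size=2), 0.0, 0.1

# Perceptron learning rule: move the weights toward each
# misclassified example; converges for linearly separable data.
for epoch in range(20):
    for xi, yi in zip(X, y):
        error = yi - step(xi @ w + b)
        w += lr * error * xi
        b += lr * error

print("predictions:", step(X @ w + b))  # expected: [0. 1. 1. 1.]
```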
Session 2
Feedforward neural networks
Feedforward neural networks. Gradient descent basics. Computation graph and computing gradients on the computation graph (backpropagation).
Practice: a feedforward neural network on classic datasets
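As an illustration of computing gradients on the computation graph, here is backpropagation written out by hand for a two-layer sigmoid network on XOR: the forward pass builds the graph node by node, and the backward pass applies the chain rule to the same nodes in reverse. The layer sizes, learning rate, and iteration count are illustrative assumptions.

```python
# Hand-written backpropagation for a two-layer sigmoid network on XOR.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR labels

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for it in range(5000):            # more iterations may help other seeds
    # Forward pass: one line per node of the computation graph.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    p = sigmoid(h @ W2 + b2)      # output probabilities
    loss = np.mean((p - y) ** 2)  # mean squared error

    # Backward pass: chain rule, node by node, in reverse order.
    dp = 2 * (p - y) / len(X)     # dL/dp
    dz2 = dp * p * (1 - p)        # through the output sigmoid
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dh = dz2 @ W2.T               # propagate back to the hidden layer
    dz1 = dh * h * (1 - h)        # through the hidden sigmoid
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)

    # Plain gradient descent step on every parameter.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("loss:", round(float(loss), 4), "predictions:", p.ravel().round(3))
```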
Session 3
Optimisation in neural networks
Gradient descent: motivation, problems. Modifications and ideas: momentum, Nesterov's momentum, Adagrad, RMSProp, Adam. Second-order methods.
Practice: comparing gradient descent variations
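The update rules from this session can be compared side by side on a small ill-conditioned quadratic. The sketch below uses common textbook hyperparameters chosen for illustration, not a tuned benchmark.

```python
# Three update rules on f(x, y) = x^2 + 10 y^2 (minimum at the origin).
import numpy as np

def grad(p):
    # Gradient of f(x, y) = x^2 + 10 y^2.
    return np.array([2.0 * p[0], 20.0 * p[1]])

start = np.array([5.0, 5.0])

# Plain gradient descent.
p = start.copy()
for _ in range(100):
    p -= 0.05 * grad(p)
print("GD:      ", p)

# Momentum: accumulate a velocity that smooths the zig-zagging.
p, v = start.copy(), np.zeros(2)
for _ in range(100):
    v = 0.9 * v - 0.05 * grad(p)
    p += v
print("momentum:", p)

# Adam: per-coordinate step sizes from running moment estimates.
p, m, s = start.copy(), np.zeros(2), np.zeros(2)
beta1, beta2, eps, lr = 0.9, 0.999, 1e-8, 0.5
for t in range(1, 101):
    g = grad(p)
    m = beta1 * m + (1 - beta1) * g        # first moment (mean)
    s = beta2 * s + (1 - beta2) * g * g    # second moment (scale)
    m_hat = m / (1 - beta1 ** t)           # bias corrections
    s_hat = s / (1 - beta2 ** t)
    p -= lr * m_hat / (np.sqrt(s_hat) + eps)
print("Adam:    ", p)
```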
Session 4
Regularisation in neural networks
Regularisation: L1, L2, early stopping. Dropout. Data augmentation.
Practice: Applying different regularisation approaches
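Two of the regularisers from this session, an L2 weight penalty and (inverted) dropout, fit in a few lines of NumPy. The shapes, dropout probability, penalty weight, and placeholder data-loss value below are illustrative assumptions.

```python
# L2 weight penalty and inverted dropout, sketched in plain NumPy.
import numpy as np

rng = np.random.default_rng(0)

def l2_penalty(weights, lam=1e-3):
    # L2 regularisation term: lam * sum of squared weights.
    return lam * sum(float((W ** 2).sum()) for W in weights)

def dropout(h, p_drop=0.5, train=True):
    # Inverted dropout: zero each unit with probability p_drop at
    # train time and rescale survivors by 1 / (1 - p_drop), keeping
    # the expected activation unchanged; identity at test time.
    if not train:
        return h
    mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)
    return h * mask

h = rng.normal(size=(4, 8))                  # a batch of hidden activations
print("train mean:", dropout(h).mean())       # noisy, but similar scale
print("test mean: ", dropout(h, train=False).mean())

W1, W2 = rng.normal(size=(2, 8)), rng.normal(size=(8, 1))
data_loss = 0.42                              # placeholder data term
print("regularised loss:", data_loss + l2_penalty([W1, W2]))
```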
BIBLIOGRAPHY
"Deep Learning (Adaptive Computation and Machine Learning series)" by Ian Goodfellow,Yoshua Bengio & Aaron Courville (The MIT Press, 2016)
ALEXEY DAVYDOV
ABOUT ALEXEY
Alexey Davydov is a computer scientist experienced in algorithm design and machine learning. He received his bachelor's degree in physics from the Moscow Institute of Physics and Technology and his master's degree from St. Petersburg Academic University. His main research interests are the design of competitive scheduling algorithms and the use of synthetic data in deep learning.
He has been teaching at St. Petersburg Academic University, the Computer Science Center, and St. Petersburg State University since 2012. Alexey Davydov is currently a researcher at the Steklov Mathematical Institute, where he works on theoretical research, and at Neuromation, where he applies it in practice.
SKILLS:
- Model Theory
- Algorithms
- Machine Learning
- Graph Theory
- Discrete Mathematics