Course detail
Neural Networks and Deep Learning
FSI-VSC Acad. year: 2020/2021 Summer semester
The course introduces basic approaches to Machine Learning and Deep Learning and the classical methods used in the field. Practical use of these methods is demonstrated by solving simple engineering problems.
Language of instruction
Czech
Number of ECTS credits
5
Supervisor
Department
Learning outcomes of the course unit
Understanding of the basic methods of Artificial Neural Networks and the ability to implement them.
Prerequisites
Knowledge of the basics of optimization, statistics, graph theory, and programming.
Planned learning activities and teaching methods
The course is taught through lectures explaining the basic principles and theory of the discipline. Exercises focus on the practical application of the topics presented in the lectures.
Assessment methods and criteria linked to learning outcomes
Course-unit credit requirements: submission of a functional software project that implements a selected AI method. The project is specified in the first seminar; systematic checks and consultations take place during the semester. Each student must pass one test and complete all assigned tasks. A student can obtain 100 marks in total: 40 marks during the seminars (20 for the project and 20 for the test, with at least 20 required) and 60 marks in the exam (with at least 30 required).
Aims
The course objective is to familiarize students with the basic techniques of Artificial Neural Networks and with the potential and suitability of their use in solving engineering problems.
Specification of controlled education, way of implementation and compensation for absences
Attendance at lectures is recommended; attendance at seminars is obligatory. Teaching follows the weekly schedule. The form of compensation for missed seminars is fully at the discretion of the tutor.
The study programmes with the given course
Programme N-AIŘ-P: Applied Computer Science and Control, Master's, compulsory
Type of course unit
Lecture
26 hours, optional
Teacher / Lecturer
Syllabus
1. Introduction to Machine Learning and Soft Computing in the Context of Artificial Intelligence.
2. Evolutionary algorithms I. (genetic algorithms, evolutionary strategies, differential evolution).
3. Evolutionary algorithms II. (grammatical evolution, genetic programming).
4. Selected optimization metaheuristics (HC, HC12, THC, simulated annealing).
5. Swarm intelligence (PSO, ACO, SOMA).
6. Architectures and classification of neural networks. Perceptron.
7. Feedforward neural networks, single-layer and multilayer networks. ADALINE. Back Propagation Algorithm (see the sketch after this syllabus). Optimization methods used in ANN design.
8. RBF and RCE neural networks. Topologically organized neural networks (competitive learning, Kohonen maps).
9. Cluster analysis. Dimensionality reduction. Principal component analysis. LVQ neural networks, ART neural networks.
10. Associative neural networks (Hopfield, BAM): behavior, state diagram, attractors, learning. Neocognitron.
11. Deep neural networks. CNN. Transfer learning.
12. Spiking neural networks.
13. Case studies. Deterministic chaos and its control.
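As an illustration of syllabus items 6 and 7, the following is a minimal sketch of a feedforward network trained with the backpropagation algorithm, written in Python/NumPy. The 2-3-1 architecture, the XOR task, and all hyperparameters are illustrative assumptions, not course materials.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training data (illustrative toy task)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights of a 2-3-1 feedforward network with sigmoid activations
W1 = rng.normal(scale=1.0, size=(2, 3))
b1 = np.zeros((1, 3))
W2 = rng.normal(scale=1.0, size=(3, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate (arbitrary choice)
for epoch in range(20000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: deltas of the squared error w.r.t. pre-activations
    d_out = (out - y) * out * (1 - out)    # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)     # hidden-layer delta

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # outputs should approach [0, 1, 1, 0]
```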
Computer-assisted exercise
26 hours, compulsory
Teacher / Lecturer
Syllabus
Seminars follow up on the lectures of the previous week. Topics:
- Implementation of basic metaheuristics (see the sketch after this list)
- Solving global optimization problems
- Use of the global optimization toolbox
- Use of the deep neural network toolbox
- Creation of nonlinear models using neural networks
- Deep learning in computer vision for image classification
- Detection of objects in images using deep learning (R-CNN)
- Semantic image segmentation using deep learning (SegNet)
- Validation of CNN learning and inspection of trained networks using the deep dream method
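As an illustration of the first two seminar topics (implementing a basic metaheuristic and applying it to a global optimization problem), here is a minimal sketch of simulated annealing in Python. The Rastrigin benchmark, the cooling schedule, and all parameter values are illustrative assumptions and do not use the toolboxes mentioned above.

```python
import math
import random

def rastrigin(x):
    # Classic multimodal test function; global minimum 0 at x = (0, ..., 0)
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def simulated_annealing(f, dim=2, bounds=(-5.12, 5.12), t0=10.0, cooling=0.995, steps=5000):
    lo, hi = bounds
    x = [random.uniform(lo, hi) for _ in range(dim)]
    fx = f(x)
    best, fbest = list(x), fx
    t = t0
    for _ in range(steps):
        # Propose a random neighbour by perturbing one coordinate
        cand = list(x)
        i = random.randrange(dim)
        cand[i] = min(hi, max(lo, cand[i] + random.gauss(0, 0.5)))
        fc = f(cand)
        # Always accept improvements; accept worse solutions with Boltzmann probability
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

best, fbest = simulated_annealing(rastrigin)
print(best, fbest)
```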