Course details
Classification and recognition
KRD, Academic year 2024/2025, Summer semester
Estimation of parameters by Maximum Likelihood (ML) and Expectation-Maximization (EM), formulation of the objective function for discriminative training, the Maximum Mutual Information (MMI) criterion, adaptation of GMM models, feature transforms for recognition, modelling of the feature space using discriminative sub-spaces, factor analysis, kernel techniques, calibration and fusion of classifiers, and applications in the recognition of speech, video and text.
State doctoral exam - topics:
- Maximum Likelihood estimation of parameters of a model
- Probability distribution from the exponential family and sufficient statistics
- Linear regression model and its probabilistic interpretation
- Bayesian models considering the probability distribution (uncertainty) of model parameters
- Conjugate priors and their significance in Bayesian models
- Fisher's linear discriminant analysis
- Difference between generative and discriminative classifiers; their pros and cons
- Perceptron and its learning algorithm as an example of linear classifiers
- Generative linear classifier - Gaussian classifier with shared covariance matrix
- Discriminative classifier based on linear logistic regression
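One of the exam topics above, the perceptron learning algorithm, can be sketched in a few lines. This is a minimal illustrative example, not course material; the toy data (logical AND with labels in {-1, +1}) and all function names are invented for the sketch:

```python
def perceptron_train(samples, labels, epochs=20):
    """Train a linear classifier with the perceptron rule.

    samples: list of feature tuples; labels: +1 or -1.
    Converges whenever the data are linearly separable.
    """
    dim = len(samples[0])
    w = [0.0] * dim  # weight vector
    b = 0.0          # bias term
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:  # misclassified (or on the boundary): update
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

def perceptron_predict(w, b, x):
    # Sign of the linear score decides the class
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy linearly separable data: logical AND with labels in {-1, +1}
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [-1, -1, -1, 1]
w, b = perceptron_train(X, Y)
```

The mistake-driven update (add y*x to the weights only on errors) is what the perceptron convergence theorem analyses: on separable data the number of updates is bounded, so a fixed epoch budget suffices here.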
Guarantor
Language of instruction
Completion
Time span
- 39 hrs lectures
Assessment points
- 100 pts final exam
Department
Learning objectives
To understand advanced classification and recognition techniques and to learn how to apply the algorithms and methods to problems in speech recognition, computer graphics and natural language processing. To get acquainted with discriminative training and building hybrid systems.
The students will get acquainted with advanced classification and recognition techniques and learn how to apply basic methods in the fields of speech recognition, computer graphics and natural language processing.
The students will learn to solve general problems of classification and recognition.
Prerequisite knowledge and skills
Basic knowledge of statistics, probability theory, mathematical analysis and algebra.
Study literature
- Ian Goodfellow, Yoshua Bengio, Aaron Courville: Deep Learning, MIT Press, 2016.
- Simon Haykin: Neural Networks and Learning Machines, Pearson Education; Third edition, 2016.
Syllabus of lectures
- Estimation of parameters of Gaussian probability distribution by Maximum Likelihood (ML)
- Estimation of parameters of a Gaussian Mixture Model (GMM) by Expectation-Maximization (EM)
- Discriminative training, introduction, formulation of the objective function
- Discriminative training with the Maximum Mutual Information (MMI) criterion
- Adaptation of GMM models - Maximum A Posteriori (MAP), Maximum Likelihood Linear Regression (MLLR)
- Transforms of features for recognition - basics, Principal Component Analysis (PCA)
- Discriminative transforms of features - Linear Discriminant Analysis (LDA) and Heteroscedastic Linear Discriminant Analysis (HLDA)
- Modeling of feature space using discriminative sub-spaces - factor analysis
- Kernel techniques, SVM
- Calibration and fusion of classifiers
- Applications in recognition of speech, video and text
- Student presentations I
- Student presentations II
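The first two lectures (ML estimation of a Gaussian, EM for a GMM) can be condensed into a short sketch. This is a hypothetical illustration under invented data and initial values, not the course's reference implementation; a one-dimensional, two-component mixture keeps the E- and M-steps readable:

```python
import math

def gauss_pdf(x, mu, var):
    # Density of a univariate Gaussian N(mu, var) at x
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(xs, mus, variances, weights, n_iter=30):
    """Fit a 1-D Gaussian Mixture Model by Expectation-Maximization."""
    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point
        resp = []
        for x in xs:
            ps = [w * gauss_pdf(x, m, v)
                  for w, m, v in zip(weights, mus, variances)]
            total = sum(ps)
            resp.append([p / total for p in ps])
        # M-step: ML re-estimation from the responsibility-weighted
        # sufficient statistics (zeroth, first and second moments)
        for k in range(len(mus)):
            nk = sum(r[k] for r in resp)
            mus[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            variances[k] = sum(r[k] * (x - mus[k]) ** 2
                               for r, x in zip(resp, xs)) / nk
            weights[k] = nk / len(xs)
    return mus, variances, weights

# Toy data: two well-separated clusters around 0 and 5
xs = [-0.2, 0.0, 0.2, 4.8, 5.0, 5.2]
mus, variances, weights = em_gmm(xs, [1.0, 4.0], [1.0, 1.0], [0.5, 0.5])
```

With a single component the M-step reduces to the closed-form ML estimates (sample mean and biased variance); the EM loop recovers exactly that update, softened by the responsibilities.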
Progress assessment
Oral exam.
Course inclusion in study plans
- Programme DIT, any year of study, Compulsory-Elective group T
- Programme DIT-EN (in English), any year of study, Compulsory-Elective group T
- Programme VTI-DR-4, field DVI4, any year of study, Elective
- Programme VTI-DR-4 (in English), field DVI4, any year of study, Elective