Course details
Natural Language Processing (in English)
ZPJa, Acad. year 2021/2022, Winter semester, 5 credits
Foundations of natural language processing: a historical perspective, statistical NLP, and the modern era dominated by machine learning, specifically deep neural networks. Meaning of individual words, lexicology and lexicography, word senses and neural architectures for computing word embeddings, word sense classification and inference. Constituency and dependency parsing, syntactic ambiguity, neural dependency parsers. Language modeling and its applications in general architectures. Machine translation, a historical perspective on the statistical approach, neural translation and evaluation scores. End-to-end models, attention mechanisms, limits of current seq2seq models. Question answering based on neural models, information extraction components, text understanding challenges, learning by reading and machine comprehension. Text classification and its modern applications, convolutional neural networks for sentence classification. Language-independent representations, non-standard texts from social networks, representing parts of words, subword models. Contextual representations and pretraining for context-dependent language modules. Transformers and self-attention for generative models. Communication agents and natural language generation. Coreference resolution and its interconnection with other text understanding components.
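To give a concrete taste of the word-embedding topics above, the following is a minimal Python sketch, with made-up toy vectors (real embeddings are learned from large corpora by models such as word2vec or GloVe), of how semantic similarity between words reduces to cosine similarity between their vectors:

    import numpy as np

    # Toy 4-dimensional word embeddings with hypothetical values chosen for
    # illustration; real models learn hundreds of dimensions from text corpora.
    embeddings = {
        "king":  np.array([0.8, 0.1, 0.7, 0.2]),
        "queen": np.array([0.7, 0.2, 0.8, 0.1]),
        "apple": np.array([0.1, 0.9, 0.0, 0.6]),
    }

    def cosine_similarity(u, v):
        """Cosine of the angle between two vectors; 1.0 = same direction."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.98)
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low  (~0.25)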
Guarantor
Course coordinator
Language of instruction
English
Completion
Time span
- 26 hrs lectures
- 26 hrs projects
Assessment points
- 51 pts final exam
- 9 pts mid-term test
- 40 pts projects
Department
Department of Computer Graphics and Multimedia (DCGM)
Lecturer
Fajčík Martin, Ing., Ph.D. (DCGM)
Kesiraju Santosh, Ph.D. (DCGM)
Ondřej Karel, Ing.
Smrž Pavel, doc. RNDr., Ph.D. (DCGM)
Subject specific learning outcomes and competences
The students will get acquainted with natural language processing and will understand a range of neural network models commonly applied in the field. They will also grasp the basics of neural implementations of attention mechanisms and sequence embedding models, and how these modular components can be combined to build state-of-the-art NLP systems. They will be able to implement and evaluate common neural network models for various NLP applications.
Students will improve their programming skills and gain knowledge of and practical experience with tools for deep learning, as well as with general processing of textual data.
Learning objectives
To understand natural language processing and learn how to apply modern machine learning methods in this field. To get acquainted with advanced deep learning architectures that have proved successful in various NLP tasks.
Why is the course taught
More and more people use natural language processing (NLP) in their everyday lives: machine translators, virtual assistants, etc. Most NLP tasks have recently been addressed by means of deep neural networks. Students of this course will learn how a computer translates texts between languages, how it recognizes what a review author likes or dislikes about a new product, and how virtual assistants can answer questions based on Wikipedia texts.
Prerequisite knowledge and skills
Good knowledge of artificial neural network models and Python programming.
Study literature
- Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, 2017.
- Raaijmakers, Stephan. Deep Learning for Natural Language Processing. Manning, 2019.
- Goldberg, Yoav. Neural Network Methods for Natural Language Processing. Synthesis Lectures on Human Language Technologies 10(1), 2017.
- Deng, Li, and Yang Liu, eds. Deep Learning in Natural Language Processing. Springer, 2018.
Syllabus of lectures
- Introduction, history of NLP, and modern approaches based on deep learning
- Word senses and word vectors
- Dependency parsing
- Language models
- Machine translation
- Seq2seq models and attention
- Question answering
- Convolutional neural networks for sentence classification
- Information from parts of words: Subword models
- Modeling contexts of use: Contextual representations and pretraining
- Transformers and self-attention for generative models (see the attention sketch after this list)
- Natural language generation
- Coreference resolution
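The seq2seq attention and transformer self-attention lectures above share one core operation. Below is a minimal Python sketch of scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, run on hypothetical random inputs:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
        scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
        return weights @ V                              # weighted mixture of value vectors

    # Self-attention on 3 tokens of dimension 4 (random toy representations).
    rng = np.random.default_rng(0)
    x = rng.normal(size=(3, 4))
    print(scaled_dot_product_attention(x, x, x).shape)  # (3, 4)

In a real transformer, Q, K, and V are separate linear projections of the token representations, and several such attention heads run in parallel.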
Syllabus - others, projects and individual work of students
- Individually assigned project
Progress assessment
- Mid-term test - up to 9 points
- Individual project - up to 40 points
- Written final exam - up to 51 points
Controlled instruction
The evaluation includes a mid-term test, an individual project, and a final exam. The mid-term test cannot be retaken; the final exam has two possible resit dates.
Exam prerequisites
- Completed individual project
Course inclusion in study plans
- Programme IT-MGR-2, field MBI, any year of study, Compulsory-Elective group S
- Programme IT-MGR-2, field MBS, MGM, MIN, MIS, MMM, MPV, MSK, any year of study, Elective
- Programme IT-MGR-2 (in English), field MGMe, any year of study, Elective
- Programme MITAI, field NADE, NBIO, NCPS, NEMB, NGRI, NHPC, NIDE, NISD, NISY, NISY up to 2020/21, NMAL, NMAT, NNET, NSEC, NSEN, NVER, NVIZ, any year of study, Elective
- Programme MITAI, field NSPE, any year of study, Compulsory