Multilingual natural language processing

Course objectives

General Objectives
The goal of the course is to provide an overview of state-of-the-art natural language processing (NLP) techniques and their applications.

Specific Objectives
Students will learn the principles of automatic language processing, understanding how machines can interpret, generate and respond to human language. Topics include word representation, word and sense embeddings, neural architectures for NLP, machine translation, and text generation in general.

Knowledge and Understanding
  • Knowledge of neural network architectures, such as recurrent neural networks and Transformers, used for natural language processing.
  • Knowledge of supervised and unsupervised learning methods in NLP.
  • Knowledge of lexical and phrasal computational semantics techniques.
  • Understanding of language models for interpreting and generating text.

Applying Knowledge and Understanding
  • How to develop models for understanding language.
  • How to develop models for generating language.
  • How to use neural architectures for NLP.

Autonomy of Judgment
Students will be able to evaluate the effectiveness of NLP techniques in different applications.

Communication Skills
Students will be able to explain the principles and techniques of natural language processing.

Further Study Abilities
Students interested in research will discover the main open challenges in the area of NLP, gaining the foundation needed for more in-depth studies in the field.

Channel 1
ROBERTO NAVIGLI Lecturers' profile

Program - Frequency - Exams

Course program
  • Introduction to Natural Language Processing
  • N-gram language models; smoothing; interpolation; backoff
  • Deep learning for NLP
  • Introduction to PyTorch and PyTorch Lightning
  • Monolingual and multilingual word embeddings; sense and concept embeddings
  • Neural language models: recurrent and Transformer-based (BERT, XLM-RoBERTa, GPT-x, etc.)
  • Large Language Models: pretraining and finetuning
  • Computational lexical semantics
  • Computational lexicons: WordNet
  • Multilingual semantic networks: BabelNet
  • Word embeddings vs. contextualized word embeddings vs. sense embeddings
  • Word Sense Disambiguation and Entity Linking
  • Multilinguality in Natural Language Processing
  • Computational sentence-level semantics
  • Neural Semantic Role Labeling and Semantic Parsing
  • Natural Language Generation and Question Answering
  • Neural Machine Translation
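The n-gram portion of the program (language models with smoothing, interpolation and backoff) can be illustrated with a minimal bigram model using linear interpolation. This is a toy sketch, not course material: the corpus and the interpolation weight `lam` are illustrative assumptions.

```python
from collections import Counter

# Toy corpus; in practice counts come from a large training corpus.
corpus = "the cat sat on the mat the cat ate".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
total = sum(unigrams.values())

def p_unigram(w):
    """Maximum-likelihood unigram probability."""
    return unigrams[w] / total

def p_bigram(prev, w):
    """Maximum-likelihood bigram probability P(w | prev)."""
    return bigrams[(prev, w)] / unigrams[prev] if unigrams[prev] else 0.0

def p_interpolated(prev, w, lam=0.7):
    # Linear interpolation: mix the bigram estimate with the unigram
    # estimate so that unseen bigrams still get nonzero probability.
    return lam * p_bigram(prev, w) + (1 - lam) * p_unigram(w)

# A seen bigram gets most of its mass from the bigram estimate,
# while an unseen one ("cat on") falls back on the unigram term.
print(p_interpolated("the", "cat"))
print(p_interpolated("cat", "on"))
```

Backoff, also listed in the program, differs in that it uses the lower-order estimate only when the higher-order n-gram is unseen, rather than always mixing the two.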
Prerequisites
No prerequisites.
Books
Jurafsky and Martin. Speech and Language Processing, Prentice Hall, third edition.
Frequency
In class attendance.
Exam mode
Homework submission + oral presentation.
Lesson mode
In class attendance.
  • Lesson code: 10606869
  • Academic year: 2025/2026
  • Course: Artificial Intelligence and Robotics
  • Curriculum: Single curriculum
  • Year: 1st year
  • Semester: 2nd semester
  • SSD: INF/01
  • CFU: 6