Multilingual natural language processing

Channel 1
Lecturer: Roberto Navigli


Course program
  • Introduction to Natural Language Processing
  • N-gram language models; smoothing; interpolation; backoff
  • Deep learning for NLP
  • Introduction to PyTorch and PyTorch Lightning
  • Monolingual and multilingual word embeddings, sense and concept embeddings
  • Neural language models: recurrent and Transformer-based (BERT, XLM-RoBERTa, GPT-x, etc.)
  • Large Language Models: pretraining and finetuning
  • Computational lexical semantics
  • Computational lexicons: WordNet
  • Multilingual semantic networks: BabelNet
  • Word embeddings vs. contextualized word embeddings vs. sense embeddings
  • Word Sense Disambiguation and Entity Linking
  • Multilinguality in Natural Language Processing
  • Computational sentence-level semantics
  • Neural Semantic Role Labeling and Semantic Parsing
  • Natural Language Generation and Question Answering
  • Neural Machine Translation
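As an illustration of one of the opening topics (n-gram language models with smoothing), here is a minimal sketch of a bigram model with add-one (Laplace) smoothing. This is illustrative only, not course material; all function names are made up for the example.

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Count context unigrams and bigrams over tokenized sentences,
    padding each sentence with <s> and </s> boundary markers."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for tokens in sentences:
        padded = ["<s>"] + tokens + ["</s>"]
        vocab.update(padded)
        unigrams.update(padded[:-1])            # contexts only
        bigrams.update(zip(padded, padded[1:]))
    return unigrams, bigrams, vocab

def bigram_prob(w_prev, w, unigrams, bigrams, vocab):
    """Add-one smoothed probability:
    P(w | w_prev) = (c(w_prev, w) + 1) / (c(w_prev) + |V|)."""
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + len(vocab))

# Example: on this toy corpus, vocab = {<s>, the, cat, dog, sat, </s>},
# so P(cat | the) = (1 + 1) / (2 + 6) = 0.25.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
unigrams, bigrams, vocab = train_bigram_lm(corpus)
```

Smoothing guarantees that every bigram, even one never seen in training, receives nonzero probability, and the distribution over the vocabulary for a fixed context still sums to 1.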
Prerequisites
No prerequisites.
Books
Daniel Jurafsky and James H. Martin. Speech and Language Processing, 3rd edition, Prentice Hall.
Frequency
In class attendance.
Exam mode
Homework submission + oral presentation.
Lesson mode
In class attendance.
  • Lesson code: 10606869
  • Academic year: 2025/2026
  • Course: Engineering in Computer Science and Artificial Intelligence
  • Curriculum: Single curriculum
  • Year: 2nd year
  • Semester: 2nd semester
  • SSD: INF/01
  • CFU: 6