FUNDAMENTALS OF STATISTICAL LEARNING II

Course objectives

Learning goals
Fundamentals of Statistical Learning II is the second part (worth 3 out of 12 credits) of a two-semester course which overall aims at providing the fundamental tools for:
- setting up probabilistic models for observable phenomena;
- understanding the basic principles of the main inferential problems: estimation, hypothesis testing, model checking and forecasting;
- understanding and contrasting the two main inferential paradigms, namely frequentist and Bayesian statistics;
- implementing inference on observed data through both optimization and simulation-based (approximation) techniques such as the Bootstrap, Monte Carlo (MC) and Markov Chain Monte Carlo (MCMC);
- understanding the comparative merits of alternative strategies and developing statistical computations within a suitable software environment such as R (www.r-project.org), JAGS (https://mcmc-jags.sourceforge.io), OpenBUGS (http://openbugs.net/w/FrontPage) and STAN (http://mc-stan.org/).
In particular, the second part of the course focuses mainly on the Bayesian inferential framework, with emphasis on MC and MCMC techniques and on the R and JAGS software.

Knowledge and understanding
On successful completion of the second part of the course, students will know:
- how to set up Bayesian inference;
- the theoretical ground for solving inferential goals such as point estimation, interval estimation and hypothesis testing by means of the posterior distribution;
- the theoretical ground for predicting future observations by means of the posterior predictive distribution;
- how to set up a conjugate Bayesian model and obtain point estimates, interval estimates, hypothesis tests and predictions of future observations;
- how to obtain approximations of the theoretical tools for point and set estimation, hypothesis testing and prediction by means of simulations of i.i.d. copies from the posterior distribution or the posterior predictive distribution, as well as by means of simulations from a suitable ergodic Markov chain with invariant distribution corresponding to the posterior distribution (a minimal R sketch of this simulation-based approach is given at the end of this section).

Applying knowledge and understanding
Besides the understanding of theoretical aspects, thanks to applied homework and a dedicated laboratory, students will be constantly challenged to use and evaluate all the techniques they have learned, as well as to propose new models suitable for the specific tasks at hand. Students will be able to carry out the inferential tasks by means of a suitable probabilistic programming language such as JAGS.

Making judgements
On successful completion of this course, students will develop a positive critical attitude towards conceiving suitable statistical models for observed data and providing empirical and theoretical evaluations of statistical methodologies and results.

Communication skills
In preparing the report and oral presentation for the final project, students will learn how to effectively communicate information, ideas, problems and solutions both to specialists and to a general audience.

Learning skills
In this course students will develop the skills necessary for a successful understanding and application of new statistical methodologies, together with their effective implementation. The goal is to foster an active attitude towards continued learning throughout a professional career.
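As a minimal illustration of the simulation-based approach mentioned above, the following R sketch approximates posterior and posterior predictive summaries of a conjugate Beta-Binomial model by drawing i.i.d. copies from the posterior. It is illustrative only and not part of the official course material; the prior parameters and the data are made up.

    ## Illustrative sketch (hypothetical prior and data): conjugate Beta-Binomial model,
    ## with Monte Carlo approximation of posterior and posterior predictive summaries.
    set.seed(123)
    a <- 1; b <- 1                        # Beta(1, 1) prior on the success probability theta
    y <- 7; n <- 20                       # observed data: 7 successes out of 20 trials
    a_post <- a + y; b_post <- b + n - y  # conjugacy: posterior is Beta(a + y, b + n - y)
    theta_sim <- rbeta(10000, a_post, b_post)          # i.i.d. draws from the posterior
    mean(theta_sim)                                    # MC approximation of the posterior mean
    quantile(theta_sim, c(0.025, 0.975))               # 95% equal-tailed credible interval
    y_new <- rbinom(10000, size = n, prob = theta_sim) # draws from the posterior predictive
    mean(y_new >= 10)                                  # predictive probability of at least 10 successes

The same Monte Carlo summaries (means, quantiles, tail probabilities) can be computed from draws produced by any of the simulation schemes covered in the course, whether i.i.d. or Markov chain based.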

Channel 1
LUCA TARDELLA

Program - Frequency - Exams

Course program
• introduction to the basics of Bayesian inference
• conjugate Bayesian models
• examples of non-conjugate models
• introduction to Monte Carlo methods as an approximation strategy
• Monte Carlo methods for Bayesian inference
• pseudo-random number generation
• uniform distributions and common classes of parametric distributions
• general classes of algorithms for simulating from a known density: inverse transform sampling, acceptance-rejection algorithm, fundamental theorem of simulation
• classical asymptotic theorems and Monte Carlo methods: convergence and error control
• importance sampling techniques
• alternative Monte Carlo strategies for approximating the marginal likelihood and the Bayes Factor
• introduction to Markov chains on a finite state space
• introduction to Markov chains on general state spaces
• transition kernels and transition densities
• Markov chains, stationarity, invariant measures
• limiting distributions and rate of convergence
• general algorithms for Markov chain simulation with a prescribed invariant distribution
• Gibbs sampling (GS)
• Metropolis-Hastings (MH) (a minimal MH sketch is given after this list)
• MH: alternative proposal distributions, tuning
• basic examples of GS
• basic examples of MH
• reversibility
• hybrid methods: kernel composition, kernel mixtures
• GS and MH implementation on real data examples
• Bayesian hierarchical linear models, generalized linear models (examples)
• linear mixed-effects models
• multimodel inference: model choice (via marginal likelihood and DIC only) and model averaging
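As a minimal, hedged sketch of the kind of algorithm listed above (illustrative only, not the course's reference implementation), the following R code implements a random-walk Metropolis-Hastings sampler for an arbitrary unnormalized log-target; the standard normal target and the proposal standard deviation used here are chosen purely for illustration.

    ## Illustrative random-walk Metropolis-Hastings sampler (toy target, made-up settings).
    log_target <- function(theta) dnorm(theta, mean = 0, sd = 1, log = TRUE)  # stand-in log-density

    mh_sample <- function(n_iter, theta0, sd_prop) {
      theta <- numeric(n_iter)
      theta[1] <- theta0
      for (t in 2:n_iter) {
        prop <- rnorm(1, mean = theta[t - 1], sd = sd_prop)       # symmetric random-walk proposal
        log_alpha <- log_target(prop) - log_target(theta[t - 1])  # log acceptance ratio
        theta[t] <- if (log(runif(1)) < log_alpha) prop else theta[t - 1]
      }
      theta
    }

    set.seed(1)
    chain <- mh_sample(n_iter = 5000, theta0 = 0, sd_prop = 1)    # sd_prop is the tuning parameter
    mean(chain); quantile(chain, c(0.025, 0.975))                 # ergodic averages along the chain

The invariant distribution of the simulated chain is the (normalized) target, so long-run averages along the chain approximate the corresponding posterior summaries.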
Books
Teaching material will also be delivered through the Sapienza e-learning platform Moodle at the following address: https://elearning.uniroma1.it/course/view.php?id=13089
Course Material and Main Reference Books
• Course lecture notes (slides available at https://elearning2.uniroma1.it/course/view.php?id=7253)
• R or R+JAGS commented codes
• Peter Hoff, A First Course in Bayesian Statistical Methods. Springer, 2009.
• Jean-Michel Marin and Christian P. Robert, Bayesian Core: A Practical Approach to Computational Bayesian Statistics. Springer, 2007.
• Christian P. Robert and George Casella, Monte Carlo Statistical Methods (2nd ed.). Springer, 2004.
• Ioannis Ntzoufras, Bayesian Modeling Using WinBUGS. Wiley, 2009.
• Peter Congdon, Bayesian Statistical Modelling (2nd ed.). Wiley, 2006.
  • Academic year: 2025/2026
  • Course: Data Science
  • Curriculum: Single curriculum
  • Year: 1st year
  • Semester: 2nd semester
  • SSD: SECS-S/01
  • CFU: 3