- Lecturer: Andrea Mentrelli
- Credits: 6
- SSD: MAT/07
- Language: English
- Teaching Mode: Traditional lectures
- Campus: Bologna
- Programme: Second cycle degree programme (LM) in Automation Engineering (cod. 8891)
- Course period: Sep 16, 2024 to Dec 16, 2024
Learning outcomes
At the end of the course, students are familiar with the framework of the modern theory of probability: they know its foundations and, in particular, its applications to automation engineering, including the basics of reliability theory and the most important probability distributions in this field. They are familiar with the fundamentals of stochastic processes, both discrete and continuous, and with the basics of unconstrained and constrained optimization theory: the main definitions, the optimality conditions, and the basic algorithms. They have the expertise to further develop their knowledge in the areas of decision, planning, and control according to future needs.
Course contents
Introduction to the modern theory of probability. Deterministic and random experiments; sample spaces and events; the algebra of events; overview of the various approaches to the study of probability; the axioms of probability; the measure of probability.
Combinatorics. The basic principle of counting; simple permutations; simple dispositions; permutations with repetitions; dispositions with repetitions; cyclic permutations; sampling; binomial coefficients and multinomial coefficients; simple combinations; combinations with repetitions; binomial theorem; number of integer solutions of linear equations.
Conditional probability. Definitions; theorem of total probability; Bayes' formula; independent events; diagnostic tests; Bayesian filters.
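As an illustrative sketch (not part of the official course material), Bayes' formula applied to a diagnostic test can be computed as follows; the prevalence, sensitivity, and specificity values are assumed purely for illustration:

```python
# Illustrative sketch: Bayes' formula for a diagnostic test.
# All numerical values (prevalence, sensitivity, specificity) are assumed.
def posterior_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' formula."""
    p_pos_given_d = sensitivity                 # P(+ | disease)
    p_pos_given_not_d = 1.0 - specificity       # P(+ | no disease)
    # Theorem of total probability: P(+)
    p_pos = p_pos_given_d * prevalence + p_pos_given_not_d * (1.0 - prevalence)
    return p_pos_given_d * prevalence / p_pos

print(round(posterior_positive(0.01, 0.99, 0.95), 4))  # 0.1667
```

Even with a highly sensitive test, a low prevalence makes the posterior probability of disease given a positive result surprisingly small, which is the classic lesson of such examples.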
Random variables. Definitions of random variable; distribution function of probability; cumulative distribution function; density function; expected value; variance; skewness; kurtosis; Chebyshev's inequality.
Distributions of probability. Bernoulli distribution; binomial distribution; geometric distribution; negative binomial distribution; hypergeometric distribution; Poisson distribution; discrete uniform distribution; continuous uniform distribution; exponential distribution; Rayleigh distribution; gamma distribution; Erlang distribution; Weibull distribution; Gaussian distribution.
Introduction to the reliability theory. Failure rate and reliability/survival functions; mean time between failures; the role of the exponential, gamma and Weibull distributions; the bathtub curve.
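As a small illustrative sketch (with an assumed failure rate), the exponential reliability model relates the failure rate, the survival function, and the mean time between failures:

```python
import math

# Illustrative sketch: reliability under a constant failure rate lam
# (exponential model). The value of lam is assumed for illustration.
lam = 2e-4            # failures per hour (assumed)
mtbf = 1 / lam        # mean time between failures = 1 / lam
R = lambda t: math.exp(-lam * t)   # survival function R(t) = exp(-lam t)

print(round(mtbf), round(R(1000), 4))  # 5000 0.8187
```

A constant failure rate corresponds to the flat middle section of the bathtub curve; the gamma and Weibull distributions extend this model to the early-failure and wear-out phases.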
Multiple random variables. Definitions; distribution function; joint and marginal probability density functions; conditional distribution functions; independent random variables; means, covariances, and moments of pairs of random variables; correlation. Linear regression and correlation analysis. Extension to the case of multiple random variables with any number of components.
Functions of random variables. Expected value and variance of the sum and product of two random variables; linear combination of random variables. Sum of discrete and continuous random variables (convolution). Sum of exponential/normal random variables. Probability density function for functions of one or more random variables. Lognormal distribution.
Limit theorems. Laws of large numbers and limit theorems; convergence of sequences of random variables; weak laws of large numbers; the central limit theorem; applications of the central limit theorem.
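As an illustrative simulation sketch (parameters chosen for illustration), the central limit theorem can be observed by standardizing sums of i.i.d. uniform random variables:

```python
import random
import statistics

# Illustrative sketch: the central limit theorem in simulation.
# Standardized sums of n i.i.d. Uniform(0,1) variables approach N(0,1).
random.seed(0)
n, trials = 30, 2000
mu, sigma = 0.5, (1 / 12) ** 0.5   # mean and std of Uniform(0,1)

z = [(sum(random.random() for _ in range(n)) - n * mu) / (sigma * n ** 0.5)
     for _ in range(trials)]

# Sample mean near 0 and sample std near 1, as the CLT predicts.
print(abs(statistics.mean(z)) < 0.1, abs(statistics.stdev(z) - 1) < 0.1)
```

A histogram of `z` would show the familiar bell shape even though the summands are uniform, not Gaussian.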
Markov Chains. Definition of Markov chain and transition probabilities; representation of a Markov chain by means of graph and transition matrix; absorbing and transient states; absorbing chain; the drunkard's walk; canonical form and fundamental matrix; time to absorption and absorption probabilities; ergodic and regular chains; the Ehrenfest model; limiting matrix for regular chains; fixed vector; equilibrium state; mean first passage time; mean recurrence time; reversibility.
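As an illustrative sketch (the transition matrix is assumed for illustration), the limiting matrix and fixed vector of a regular Markov chain can be computed numerically:

```python
import numpy as np

# Illustrative sketch: limiting behaviour of a regular 2-state Markov chain.
# P is an assumed transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# For a regular chain, P^n converges to the limiting matrix, whose
# identical rows give the fixed (stationary) vector w satisfying w P = w.
Pn = np.linalg.matrix_power(P, 100)
w = Pn[0]

print(np.round(w, 4))          # equilibrium state [0.75 0.25]
print(np.allclose(w, w @ P))   # w is left-invariant: True
```

The fixed vector can equivalently be obtained as the left eigenvector of `P` for eigenvalue 1; power iteration is shown here because it mirrors the limiting-matrix construction in the syllabus.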
Stochastic Processes. Definitions and fundamentals of stochastic processes, with a focus on discrete-time processes; realizations; first and second order functions; expected value and variance; autocovariance function, autocorrelation function and autocorrelation coefficient; processes with a trend; cross-correlation. White noise; random walk; counting processes; Poisson process and its properties; sum and difference of Poisson processes. Weak-sense/Strong-sense stationary processes (WSS/SSS).
Time series. Introduction to time series: approximation of mean value, variance, autocorrelation function, autocovariance function and autocorrelation coefficient; mean-square convergence and convergence in probability; weak law of large numbers; ergodic theorem.
Gaussian Variables and Gaussian Processes. Gaussian random variables; Gaussian vectors; Gaussian processes and Gaussian spaces; Gaussian white noise.
Nonlinear Optimization. Definition and main elements of an optimization problem. Unconstrained nonlinear optimization: global and local minima. First-order and second-order necessary conditions for optimality. Sufficient conditions for optimality. Convex sets and convex functions. Convex sets defined via inequality and equality constraints. Minimization of convex functions. Properties of convex problems, necessary and sufficient conditions for optimality.
Quadratic programs: necessary conditions of optimality. Iterative descent methods: introduction, two-step procedure. Gradient methods. Gradient methods via quadratic minimization. Newton's method (for zero finding). Newton's method for unconstrained optimization.
Step-size selection rules: constant step-size, diminishing step-size, minimisation rule, Armijo's rule. Convergence of gradient methods with Armijo's rule. Convergence of gradient methods with constant and diminishing step-size. Remarks on limit points and subsequences. Sufficient condition for the existence of minima (coercive function). Remark on the gradient method for convex problems. Optimization over convex sets: introduction and necessary conditions for optimality. Definition of projection over a convex set. Projected gradient method. Feasible direction method.
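As an illustrative sketch of Armijo's rule (the parameters `sigma`, `beta`, and the test function are assumed for illustration), the step-size is reduced by backtracking until a sufficient-decrease condition holds:

```python
import numpy as np

# Illustrative sketch of Armijo's step-size rule (backtracking line search).
# Parameters sigma, beta, alpha0 are assumed typical values.
def armijo_step(f, grad, x, d, sigma=0.1, beta=0.5, alpha0=1.0):
    """Largest alpha = alpha0 * beta^m satisfying the sufficient-decrease test
    f(x + alpha d) <= f(x) + sigma * alpha * grad(x)' d."""
    alpha, g = alpha0, grad(x)
    while f(x + alpha * d) > f(x) + sigma * alpha * g @ d:
        alpha *= beta
    return alpha

f = lambda x: float(x @ x)     # assumed test function f(x) = ||x||^2
grad = lambda x: 2 * x
x = np.array([1.0, 1.0])
alpha = armijo_step(f, grad, x, -grad(x))
print(alpha)  # 0.5
```

For a descent direction `d`, the inner product `grad(x)' d` is negative, so the loop is guaranteed to terminate for any continuously differentiable `f`.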
Exercises on the gradient method for unconstrained optimization. Python implementation of the gradient method: constant step-size. Example of gradient method for an unconstrained quadratic program. Equality and inequality constrained optimization: introduction. Definitions: set of active constraints, regular point, Lagrangian function.
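A minimal sketch of the constant step-size gradient method of the kind covered in the exercises (the quadratic objective and step-size are assumed for illustration):

```python
import numpy as np

# Illustrative sketch: gradient method with constant step-size,
# applied to an assumed quadratic f(x) = 1/2 x' Q x - b' x.
def gradient_method(grad, x0, alpha, iters=500):
    """Constant step-size gradient descent: x_{k+1} = x_k - alpha * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - alpha * grad(x)
    return x

Q = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 1.0])
x_star = gradient_method(lambda x: Q @ x - b, [0.0, 0.0], alpha=0.2)
print(np.round(x_star, 4))  # minimizer of the quadratic is Q^{-1} b = [1, 1]
```

Convergence requires the step-size to be small relative to the largest eigenvalue of `Q` (here `alpha = 0.2 < 2/3`); a larger constant step would make the iteration diverge.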
Karush-Kuhn-Tucker (KKT) necessary conditions for optimality. Complementary slackness. Optimality conditions for quadratic programs. Optimality conditions for equality constrained problems. Newton's method for equality constrained minimization. Equality constrained quadratic programming. Sequential quadratic programming. Barrier functions for inequality constrained optimization problems.
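As an illustrative sketch (problem data assumed for illustration), the optimality conditions of an equality-constrained quadratic program reduce to a single linear system in the variable and the multiplier:

```python
import numpy as np

# Illustrative sketch: KKT system of an assumed equality-constrained QP,
#   minimize 1/2 x' Q x - b' x   subject to   A x = c.
# Stationarity (Q x + A' lam = b) and feasibility (A x = c) combine
# into one linear system in (x, lam).
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([2.0, 4.0])
A = np.array([[1.0, 1.0]])
c = np.array([1.0])

K = np.block([[Q, A.T],
              [A, np.zeros((1, 1))]])
sol = np.linalg.solve(K, np.concatenate([b, c]))
x, lam = sol[:2], sol[2]
print(np.round(x, 4), round(float(lam), 4))  # [0. 1.] 2.0
```

Because the problem is convex with linear constraints, the KKT conditions are both necessary and sufficient here, so solving this system yields the global minimizer directly.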
Readings/Bibliography
- S. M. Ross, “Introduction to Probability and Statistics for Engineers and Scientists”, 4th Edition, Academic Press
- H. Hsu, “Probability, Random Variables, and Random Processes”, 2nd Edition, Schaum's Outline Series, McGraw-Hill
- A. Papoulis, S. U. Pillai, “Probability, Random Variables, and Stochastic Processes”, 4th Edition, McGraw-Hill
- D. Bertsekas, "Nonlinear programming", 2nd Edition, Athena Scientific
Teaching methods
Lectures explaining the basic theory are combined with several examples and exercises.
Assessment methods
Written examination. The written test includes: 1) exercises whose solution requires the student to apply the theory learned during the course; 2) questions aimed at verifying the student's knowledge of the theoretical part of the course. The final grade takes into account the evaluation of both the exercises and the theoretical questions.
During the course, examples of written exams (including commented solutions) are provided.
Teaching tools
Notebook or tablet PC and projector
Office hours
See the website of Andrea Mentrelli