The Bayesian method of inference is introduced for estimating model parameters and their uncertainties. Bayesian inference is rooted in Bayes' theorem, which encapsulates our state of knowledge about a parameter in probability distributions, updated in light of the information contained in measured data. Probability is interpreted as a measure of plausibility.
Lecture 01 - Introduction to models and their uncertainty. Use of probability to quantify uncertainty with probability distributions. Important theorems from probability theory. Mathematical and computational models of systems and their uncertainty are introduced. The common sources of uncertainty are described, including modeling, parametric, computational, and measurement uncertainty. Probability, in the Cox interpretation, is used to quantify uncertainty through probability distributions. A brief overview of probability theory is presented, along with its most important theorems (Bayes' theorem, the marginalization theorem, etc.) for discrete and continuous variables. Probability distributions are used to quantify the plausibility of each possible value of an uncertain parameter. Finally, the concept of uncertainty propagation is demonstrated with an example.
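The idea of uncertainty propagation described above can be sketched with a short Monte Carlo example. The model, the parameter values, and the stiffness/mass interpretation below are all illustrative assumptions, not taken from the lecture material:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: natural frequency of a spring-mass system,
# omega = sqrt(k / m), with uncertain stiffness k (all numbers made up).
m = 2.0                                    # known mass
k_samples = rng.normal(100.0, 5.0, 20_000) # uncertain stiffness, k ~ N(100, 5^2)

# Propagate the parameter uncertainty through the model by sampling
omega = np.sqrt(k_samples / m)

# Summarize the induced uncertainty in the output quantity of interest
omega_mean = omega.mean()
omega_std = omega.std()
```

The distribution of `omega` is the uncertainty in the output induced by the uncertainty in `k`; its mean and standard deviation are simple summary measures of that uncertainty.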
Lecture 02 - Measures of uncertainty in output quantities of interest. Bayes' theorem using observations, with an example. Uncertainty propagation for the simple linear model. The Gaussian distribution. The study of uncertainty propagation is continued. Measures of uncertainty in output quantities of interest are introduced, along with analytical and computational tools for evaluating them. Bayes' theorem is introduced to calculate the posterior pdf of the model parameters using the information contained in measured data (observations) of the real system. The posterior pdf is interpreted as quantifying the plausibility of each possible value of the parameters in light of the available observations. The concept is demonstrated with a simple example. Uncertainty propagation is investigated for the simple case of a linear model. The Gaussian (normal) distribution is studied.
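Bayes' theorem for a single uncertain parameter can be illustrated numerically on a grid. The linear model, the synthetic observations, and the flat prior below are illustrative assumptions chosen for the sketch:

```python
import numpy as np

# Grid-based illustration of Bayes' theorem (all numbers are made up):
# model y = theta * x + e, e ~ N(0, sigma^2), one uncertain parameter theta.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])   # synthetic observations
sigma = 0.5                          # known measurement noise

theta = np.linspace(0.0, 4.0, 2001)  # grid of candidate parameter values
prior = np.ones_like(theta)          # flat prior over the grid

# Gaussian likelihood of the data for each candidate theta
resid = y[None, :] - theta[:, None] * x[None, :]
log_like = -0.5 * np.sum(resid**2, axis=1) / sigma**2
like = np.exp(log_like - log_like.max())   # shift for numerical stability

# Posterior ~ likelihood * prior, normalized to integrate to one
posterior = like * prior
posterior /= posterior.sum() * (theta[1] - theta[0])

theta_map = theta[np.argmax(posterior)]    # most probable value
```

With a flat prior the most probable value coincides with the least-squares estimate, and the spread of `posterior` quantifies the plausibility of each candidate value of the parameter given the observations.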
Lecture 03 - The multivariate Gaussian distribution and its properties. Linear transformations of Gaussian vectors. Quadratic forms. Uncertainty propagation for nonlinear models. The univariate Gaussian distribution is extended to the multivariate Gaussian distribution for the case of more than one variable, and its properties are examined. The linear transformation of Gaussian random vectors is studied. Next, quadratic forms are studied and their geometric interpretation is given in terms of eigenvalues and eigenvectors. Further properties of the multivariate Gaussian distribution are examined, including the cumulative, marginal, and conditional distributions. Finally, uncertainty propagation through nonlinear models is examined, and analytical approximations based on Taylor series expansions are introduced.
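The key property of linear transformations of Gaussian vectors is that if x ~ N(mu, Sigma), then y = A x + b is again Gaussian with mean A mu + b and covariance A Sigma Aᵀ. A minimal sketch, with an arbitrary illustrative mean, covariance, and transformation, checked against sampling:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Gaussian vector x ~ N(mu, Sigma) and linear map y = A x + b
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
A = np.array([[1.0, 1.0],
              [0.5, -1.0]])
b = np.array([0.0, 3.0])

# Theoretical moments of the transformed vector
mean_y = A @ mu + b          # A mu + b
cov_y = A @ Sigma @ A.T      # A Sigma A^T

# Monte Carlo check of the two formulas
x = rng.multivariate_normal(mu, Sigma, size=200_000)
y = x @ A.T + b
```

The sample mean and covariance of `y` match `mean_y` and `cov_y` up to Monte Carlo error, confirming the closure of the Gaussian family under linear transformations.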
Lecture 04 - Uncertainty propagation for nonlinear models using Taylor series. Posterior system analysis. Likelihood function estimation. Uncertainty propagation for nonlinear models is continued for the multidimensional case. Truncated Taylor series expansions are used to approximate the nonlinear model and to calculate expected values of quantities that depend on the uncertain parameters. Next, posterior system analysis is studied, in which data are used to update the uncertainties in the system parameters, in contrast to prior system analysis. The problem of estimating the likelihood function is investigated through three examples.
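A second-order truncated Taylor expansion gives E[g(θ)] ≈ g(μ) + ½ g''(μ) σ² for a scalar parameter θ ~ N(μ, σ²). The exponential model below is an illustrative choice (its exact mean is known in closed form, which makes the approximation easy to check):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative nonlinear model g(theta) = exp(theta), theta ~ N(mu, s^2).
# Second-order Taylor approximation of the mean of the output:
#   E[g(theta)] ~ g(mu) + 0.5 * g''(mu) * s^2
mu, s = 0.5, 0.2

g = np.exp
taylor_mean = g(mu) + 0.5 * g(mu) * s**2   # g'' = exp for this model

# Exact (lognormal) mean for comparison: exp(mu + s^2 / 2)
exact_mean = np.exp(mu + s**2 / 2)

# Monte Carlo reference
mc_mean = g(rng.normal(mu, s, 100_000)).mean()
```

For small parameter uncertainty the Taylor approximation is nearly exact; as σ grows, higher-order terms of the expansion become important.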
Lecture 05 - Construction of the likelihood function. Bayesian parameter estimation for one-dimensional models. Taylor series approximation of the posterior pdf. The Bayesian Central Limit Theorem. The problem of constructing the likelihood function is continued and demonstrated with further examples, including linear and nonlinear difference equations. Next, Bayesian parameter estimation for the one-dimensional case is explained. Derivatives are used to find the most probable value, and Taylor series are used to approximate the local behavior of the posterior pdf close to the most probable value. The Bayesian Central Limit Theorem is introduced to approximate the posterior pdf by a Gaussian for a large number of data. The process of Bayesian estimation of the mean of a Gaussian process is then explained with an extended example, and the exponential model is discussed.
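For the estimation of the mean of a Gaussian process with known noise level and a flat prior, the Gaussian approximation suggested by the Bayesian Central Limit Theorem is in fact exact: the most probable value is the sample mean, and the posterior standard deviation follows from the curvature of the log posterior. A minimal sketch with synthetic data (the true mean 5.0 and noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(3)

# Bayesian estimation of the mean mu of a Gaussian process with known
# noise sigma and a flat prior: the posterior is N(x_bar, sigma^2 / N).
sigma = 2.0
data = rng.normal(5.0, sigma, 400)   # synthetic observations
N = len(data)

mu_map = data.mean()                 # most probable value (zero of d log p / d mu)
# Curvature of the log posterior: d^2 log p / d mu^2 = -N / sigma^2,
# so the Gaussian approximation has variance sigma^2 / N.
post_std = np.sqrt(sigma**2 / N)
```

The posterior standard deviation shrinks like 1/sqrt(N), which is the quantitative content of the Bayesian Central Limit Theorem in this case.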
Lecture 06 - Multi-dimensional Bayesian parameter estimation and the Bayesian Central Limit Theorem. Taylor series expansions in many variables. Eigenvalue and eigenvector analysis of the Hessian matrix. The Bayesian parameter estimation framework is extended to the multi-dimensional case. Derivatives and Taylor series expansions are generalized to multiple variables, and, as in the one-dimensional case, the posterior pdf is approximated by a multivariate Gaussian for a large number of data. Eigenvalues and eigenvectors of the Hessian matrix are used to visualize the contours of the posterior pdf. The marginalization theorem is then used to obtain the marginal distributions of the parameters. Finally, an example is given in which the uncertain parameters are the mean and variance of a Gaussian process.
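In the multi-dimensional case the posterior is approximated by N(θ_map, H⁻¹), where H is the Hessian of the negative log posterior at the most probable value, and the eigenstructure of H describes the posterior contours. A sketch with an illustrative (made-up) Hessian:

```python
import numpy as np

# Illustrative 2x2 Hessian of -log p(theta | data) at the most probable value.
H = np.array([[5.0, 2.0],
              [2.0, 3.0]])

# Gaussian approximation of the posterior covariance
cov = np.linalg.inv(H)

# Eigenvalues/eigenvectors of H: contour ellipses of the posterior have
# axes along the eigenvectors, with 1-sigma semi-axis lengths 1/sqrt(eigenvalue).
evals, evecs = np.linalg.eigh(H)
semi_axes = 1.0 / np.sqrt(evals)
```

Large eigenvalues of H correspond to directions in parameter space that the data constrain tightly (short contour axes); small eigenvalues correspond to poorly identified directions.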
Lecture 07 - Application of multidimensional Bayesian parameter estimation to curve fitting. A detailed multidimensional Bayesian parameter estimation problem is solved, in which the parameters of a curve are found by fitting the curve to data. The posterior pdf, along with other quantities important for inference, is calculated and demonstrated.
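The curve-fitting problem can be sketched for the simplest case, a straight line with Gaussian noise, known noise level, and a flat prior (the true coefficients and noise level below are illustrative). Under these assumptions the posterior of the coefficients is Gaussian, centered at the least-squares fit, with covariance σ²(XᵀX)⁻¹:

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative curve model y = a + b * x with Gaussian noise of known sigma.
sigma = 0.3
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma, x.size)   # synthetic data

X = np.column_stack([np.ones_like(x), x])            # design matrix
theta_map = np.linalg.solve(X.T @ X, X.T @ y)        # most probable (a, b)
post_cov = sigma**2 * np.linalg.inv(X.T @ X)         # posterior covariance
```

The diagonal of `post_cov` gives the marginal posterior variances of the intercept and slope; the off-diagonal term quantifies their posterior correlation.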
Lecture 08 - An example of uncertainty propagation using Gaussian approximations. Robust posterior predictions. Laplace asymptotic approximations of integrals. The example of the previous lecture is continued with uncertainty propagation. Special cases in which the posterior pdf is simplified using Gaussian approximations are considered. Simple measures of uncertainty are derived for an output quantity of interest that depends on the uncertain parameters (robust posterior predictions), using their posterior pdf. Laplace asymptotic approximations are introduced to calculate the resulting integrals. Finally, an illustrative example is solved using asymptotic approximations.
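The Laplace asymptotic approximation replaces an integral of the form I(N) = ∫ exp(−N f(θ)) dθ by exp(−N f(θ*)) √(2π / (N f''(θ*))), where θ* minimizes f. A sketch with an illustrative f whose minimizer and curvature are known in closed form, compared against numerical integration:

```python
import numpy as np

# Illustrative function with minimum at t* = 0 and curvature f''(0) = 2.
def f(t):
    return t**2 + 0.3 * t**4

N = 50

# Numerical reference for I(N) = integral of exp(-N f(t)) dt
t = np.linspace(-3.0, 3.0, 20_001)
dt = t[1] - t[0]
exact = np.exp(-N * f(t)).sum() * dt

# Laplace asymptotic approximation:
#   I(N) ~ exp(-N f(t*)) * sqrt(2 pi / (N f''(t*)))
t_star, fpp = 0.0, 2.0
laplace = np.exp(-N * f(t_star)) * np.sqrt(2.0 * np.pi / (N * fpp))
```

The approximation keeps only the Gaussian behavior of the integrand near the minimum, so its relative error shrinks as N grows; here it is already below one percent.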
|
Theory #01a
Theory #01b
Theory #02
Theory #03
Theory #04
Theory #05
Theory #06
Theory #07a
Theory #07b
Theory #08a
Theory #08b
Theory #09
Theory #10
Example #01
Example #02
Example #03
Example #04
Example #05
Example #06