Using elements of Information Theory, the problem of designing experiments in an optimal way is considered. The goal is to determine which experiment will yield the most informative data for inferring the model parameters. Information Entropy is used as a scalar measure of the uncertainty.
Lecture 09 - Shannon Information Entropy. Principle of Maximum Information Entropy for assigning probabilities. The problem of assigning probabilities is examined using elements of information theory, specifically the Shannon Information Entropy. The information entropy of the univariate and multivariate Gaussian distributions is derived analytically. Next, the Principle of Maximum Information Entropy is introduced to assign probabilities to the prior pdf based on the available information. It is proved that, according to the Principle of Maximum Information Entropy, given only the mean and variance of a random variable, the best distribution to describe its uncertainty is the normal distribution.
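The analytic result mentioned above, h = ½ ln(2πeσ²) for a univariate Gaussian, can be verified numerically. The sketch below compares the closed-form entropy against a trapezoid-rule estimate of −∫ p(x) ln p(x) dx; the value of σ and the integration window are illustrative choices, not from the lecture.

```python
import numpy as np

def gaussian_entropy_analytic(sigma):
    # Differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2)
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def entropy_numeric(pdf, lo, hi, n=200001):
    # Trapezoid-rule estimate of -integral p(x) ln p(x) dx on [lo, hi]
    x = np.linspace(lo, hi, n)
    p = pdf(x)
    safe_p = np.where(p > 0, p, 1.0)          # avoid log(0) at the tails
    integrand = np.where(p > 0, -p * np.log(safe_p), 0.0)
    dx = x[1] - x[0]
    return float(np.sum((integrand[:-1] + integrand[1:]) * dx / 2))

sigma = 1.5  # illustrative value
pdf = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

h_exact = gaussian_entropy_analytic(sigma)
h_num = entropy_numeric(pdf, -12 * sigma, 12 * sigma)
print(h_exact, h_num)  # the two values agree closely
```

The agreement between the two numbers is a quick sanity check that the analytic derivation carries over to computation; the same comparison works for any σ.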
Lecture 10 - Bayesian optimal experimental design for estimation of model parameters. First, Bayesian parameter estimation is reviewed as preparation for Bayesian Optimal Experimental Design. Next, elements of Information Theory are presented to lay the foundation for OED. The most important quantity is the Information Entropy, which is described; the optimal sensor location methodology is built on it.
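To make the entropy-based design criterion concrete, the sketch below scores two candidate experiments for a linear-Gaussian model y = Gθ + e by the posterior entropy of θ: the better design is the one leaving the smaller posterior entropy. The observation matrices, prior, and noise level are hypothetical toy values, not from the lecture.

```python
import numpy as np

def posterior_entropy(G, prior_cov, noise_var):
    """Posterior entropy of theta for the linear-Gaussian model
    y = G theta + e, with e ~ N(0, noise_var * I) and theta ~ N(0, prior_cov)."""
    k = prior_cov.shape[0]
    # Posterior covariance: (G^T G / noise_var + prior_cov^{-1})^{-1}
    post_cov = np.linalg.inv(G.T @ G / noise_var + np.linalg.inv(prior_cov))
    # Entropy of a k-variate Gaussian: 0.5 * ln((2*pi*e)^k * det(cov))
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(post_cov)))

prior_cov = np.eye(2)
# Two hypothetical designs: A observes only theta_1 twice, B observes both parameters
G_A = np.array([[1.0, 0.0], [1.0, 0.0]])
G_B = np.array([[1.0, 0.0], [0.0, 1.0]])

hA = posterior_entropy(G_A, prior_cov, 0.1)
hB = posterior_entropy(G_B, prior_cov, 0.1)
print(hB < hA)  # True: design B leaves less posterior uncertainty
```

Design A repeats an observation that is informative about θ₁ only, so θ₂ retains its prior uncertainty; design B spreads the same number of observations over both parameters and achieves lower entropy. Ranking designs by such a scalar uncertainty measure is the essence of the OED formulation.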
Lecture 11 - Optimal sensor placement for structural dynamics applications. The study of the optimal sensor location methodology is continued and expanded. The effect of prediction error correlation is discussed, and several prediction error correlation models are introduced. The implementation of optimal sensor location for structural dynamics applications is described.
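A common computational pattern in sensor placement for structural dynamics is a greedy search over candidate degrees of freedom, scoring each candidate set by the determinant of an information matrix built from the mode shapes. The sketch below is a minimal illustration of that pattern (uncorrelated prediction error, toy sine-based "mode shapes"); it is not the specific algorithm of the lecture.

```python
import numpy as np

def greedy_sensor_placement(Phi, n_sensors):
    """Greedily select sensor DOFs that maximize the log-determinant of the
    information matrix Phi_s^T Phi_s assembled from the mode-shape matrix Phi
    (rows = DOFs, columns = modes). Assumes uncorrelated prediction error."""
    n_dof, n_modes = Phi.shape
    chosen, remaining = [], list(range(n_dof))
    for _ in range(n_sensors):
        best, best_val = None, -np.inf
        for d in remaining:
            rows = chosen + [d]
            # Small regularization keeps the matrix nonsingular for few sensors
            Q = Phi[rows].T @ Phi[rows] + 1e-12 * np.eye(n_modes)
            _, logdet = np.linalg.slogdet(Q)
            if logdet > best_val:
                best, best_val = d, logdet
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy structure: 6 DOFs along a chain, first two sine "mode shapes"
x = np.linspace(0, np.pi, 6)
Phi = np.column_stack([np.sin(x), np.sin(2 * x)])

sensors = greedy_sensor_placement(Phi, 2)
print(sensors)  # two interior DOFs where the modes are distinguishable
```

Note that the greedy search avoids the end DOFs, where both mode shapes vanish, and picks locations that keep the information matrix well conditioned. Correlated prediction error models, as discussed in the lecture, would replace Φₛᵀ Φₛ by Φₛᵀ Σ⁻¹ Φₛ with a nondiagonal covariance Σ.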
Theory #11
Theory #12
Theory #13