Epistemology
Extracting information

How much information is extractable from a uniform distribution with one constraint? A uniform distribution is one in which all allowed values have equal probability; for example, the allowed values could lie on the interval {$[a,b]$}. One constraint means that the allowed values satisfy some condition, for example, that they lie on a geometric shape.
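As a minimal worked example (an assumption, not from the source, taking the constraint to simply restrict the support): if {$X$} is uniform on {$[a,b]$} and the constraint confines the allowed values to a sub-interval {$[a',b'] \subset [a,b]$}, then the information gained is the Kullback-Leibler divergence between the constrained and unconstrained uniform densities, {$D_{\mathrm{KL}} = \log \frac{b-a}{b'-a'}$} nats. A constraint that halves the interval thus yields {$\log 2$} nats, i.e. exactly one bit.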
Cramér-Rao lower bound

Suppose {$\theta$} is an unknown deterministic parameter that is to be estimated from {$n$} independent observations (measurements) {$x$}, each drawn from a distribution with probability density function {$f(x;\theta)$}. The variance of any unbiased estimator {$\hat{\theta}$} of {$\theta$} is then bounded below by the reciprocal of the Fisher information {$I(\theta)$}:

{$\operatorname{var}(\hat{\theta}) \geq \frac{1}{I(\theta)}$}

where the Fisher information is

{$I(\theta) = n \operatorname{E}_{X;\theta}\left[\left(\frac{\partial \ell(X;\theta)}{\partial \theta}\right)^{2}\right]$}

Here {$\ell(x;\theta) = \log f(x;\theta)$} is the natural logarithm of the likelihood function for a single sample {$x$}, and {$\operatorname{E}_{X;\theta}$} denotes the expected value with respect to the density {$f(x;\theta)$} of {$X$}. Fisher information is thus a way of measuring the amount of information that an observable random variable {$X$} carries about an unknown parameter {$\theta$} of a distribution that models {$X$}.
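The bound can be checked numerically. Below is a minimal Monte Carlo sketch (an illustration under assumed parameter values, not from the source) in Python/NumPy: for the Gaussian location model {$N(\theta, \sigma^2)$}, the per-sample Fisher information is {$1/\sigma^2$}, so the bound is {$\sigma^2/n$}, and the sample mean, being an efficient unbiased estimator, attains it.

[@
import numpy as np

# Assumed setup for illustration: estimate the mean theta of a
# Gaussian N(theta, sigma^2) from n i.i.d. samples. Here the
# per-sample Fisher information is 1/sigma^2, so I(theta) = n/sigma^2
# and the Cramer-Rao lower bound on the variance of any unbiased
# estimator is sigma^2 / n.
rng = np.random.default_rng(0)
theta, sigma, n, trials = 2.0, 3.0, 50, 20_000

# The sample mean is an unbiased estimator of theta; compute it
# over many independent experiments of n observations each.
estimates = rng.normal(theta, sigma, size=(trials, n)).mean(axis=1)

print("empirical variance of estimator:", estimates.var())
print("Cramer-Rao bound sigma^2 / n   :", sigma**2 / n)
# The sample mean attains the bound, so the two values should agree
# up to Monte Carlo error (both approximately 0.18 here).
@]

Any other unbiased estimator of {$\theta$} plugged into the same experiment would show an empirical variance at or above the printed bound.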
Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component.