Orthogonality principle
In statistics and signal processing, the orthogonality principle is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, it says that the estimation error of the optimal estimator (in the mean-square-error sense) is orthogonal to every possible estimator in the class under consideration. The principle is most commonly stated for linear estimators, but more general formulations exist. Because it is both necessary and sufficient for optimality, it can be used to derive the minimum mean square error (MMSE) estimator.
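The linear case can be checked numerically. The sketch below is a minimal illustration under an assumed model of our own choosing (a scalar signal observed in additive independent noise, with all variances made up for the example): the best linear estimate h·y satisfies the normal equation h = E[xy]/E[y²], and the resulting error e = x − h·y is uncorrelated with (orthogonal to) the data y, and hence with every linear estimator a·y.

```python
import numpy as np

# Assumed model for illustration: zero-mean signal x with variance 4,
# observed as y = x + n with independent zero-mean noise n of variance 1.
rng = np.random.default_rng(0)
N = 200_000
x = rng.normal(0.0, 2.0, N)   # signal
n = rng.normal(0.0, 1.0, N)   # noise, independent of x
y = x + n                     # observation

# Linear MMSE estimator x_hat = h*y; solving E[(x - h*y) y] = 0 for h
# gives h = E[xy] / E[y^2] (sample averages stand in for expectations).
h = np.mean(x * y) / np.mean(y * y)
x_hat = h * y
e = x - x_hat                 # estimation error

# Orthogonality principle: E[e * (a*y)] = 0 for any linear estimator a*y,
# which reduces to E[e * y] = 0. Here it holds exactly (in sample) by
# construction of h; theoretically h = Var(x)/(Var(x)+Var(n)) = 0.8.
print(h)
print(np.mean(e * y))
```

The inner product E[e·y] vanishing is precisely the orthogonality condition; any other choice of gain a ≠ h leaves a component of the error correlated with y and therefore a larger mean square error.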