Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent in the variable's possible outcomes. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is sometimes called Shannon entropy in his honour. As an example, consider a biased coin with probability p of landing on heads and probability 1 − p of landing on tails. The surprise is maximal for p = 1/2, when there is no reason to expect one outcome over the other, and in this case a coin flip has an entropy of one bit. The surprise is minimal when p = 0 or p = 1, when the outcome is known in advance and the entropy is zero bits. Other values of p give entropies between zero and one bit.
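The coin example follows directly from the binary entropy function, H(p) = −p log₂ p − (1 − p) log₂ (1 − p). A minimal sketch in Python of evaluating it at a few values of p; the function name binary_entropy is illustrative, not from the source:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a biased coin with P(heads) = p."""
    if p == 0.0 or p == 1.0:
        return 0.0  # outcome is certain: zero surprise, by convention 0*log(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy peaks at one bit for a fair coin (p = 1/2) and falls to
# zero as the outcome becomes certain (p = 0 or p = 1).
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  H(p) = {binary_entropy(p):.4f} bits")
```

Running this prints H(0.5) = 1.0000 and H(0.0) = H(1.0) = 0.0000, matching the extremes described above; intermediate values of p land strictly between zero and one bit.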