
Information theory


Information theory studies the quantification, storage, and communication of information. A key measure in information theory is "entropy". Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, grey system theory, and measures of information.

Information theory also studies the transmission, processing, extraction, and utilization of information. Beyond source and channel codes, a third class of information-theoretic codes consists of cryptographic algorithms (both codes and ciphers). Concepts, methods, and results from coding theory and information theory are widely used in cryptography and cryptanalysis. Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition.

Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory. Information theory is based on probability theory and statistics, and often concerns itself with measures of information of the distributions associated with random variables. Important quantities of information are entropy, a measure of information in a single random variable, and mutual information, a measure of information in common between two random variables. A common unit of information is the bit, based on the binary logarithm.

If \mathbb{X} is the set of all messages \{x_1, \ldots, x_n\} that X could be, and p(x) is the probability of some x \in \mathbb{X}, then the entropy, H, of X is defined:

H(X) = \mathbb{E}_X[-\log p(x)] = -\sum_{x \in \mathbb{X}} p(x) \log p(x)
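
As a quick numerical illustration of this definition, here is a minimal Python sketch (the biased-coin distribution, the function name, and the base-2 logarithm are assumptions for illustration, not taken from the source):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p(x) log p(x), skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Example: a biased coin with p(heads) = 0.9, p(tails) = 0.1
print(entropy([0.9, 0.1]))   # ~0.469 bits, well below the 1 bit of a fair coin
```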

The joint entropy of two discrete random variables X and Y is the entropy of their pairing (X, Y):

H(X, Y) = \mathbb{E}_{X,Y}[-\log p(x, y)] = -\sum_{x, y} p(x, y) \log p(x, y)

The conditional entropy or conditional uncertainty of X given random variable Y (also called the equivocation of X about Y) is the average conditional entropy over Y:

H(X \mid Y) = \mathbb{E}_Y[H(X \mid y)] = -\sum_{y \in \mathbb{Y}} p(y) \sum_{x \in \mathbb{X}} p(x \mid y) \log p(x \mid y) = -\sum_{x, y} p(x, y) \log p(x \mid y)

A basic property of this form of conditional entropy is that:

H(X \mid Y) = H(X, Y) - H(Y)
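
The identity above is easy to check numerically. Here is a minimal Python sketch, assuming an arbitrary illustrative joint distribution over two binary variables (the distribution and variable names are not from the source):

```python
import math

def H(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution p(x, y) over two binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

p_y = {y: sum(p for (x, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

H_XY = H(list(p_xy.values()))   # joint entropy H(X, Y)
H_Y = H(list(p_y.values()))     # marginal entropy H(Y)
# Conditional entropy from the definition: -sum p(x, y) log p(x | y)
H_X_given_Y = -sum(p * math.log2(p / p_y[y]) for (x, y), p in p_xy.items() if p > 0)

print(H_X_given_Y, H_XY - H_Y)  # the two values agree: H(X|Y) = H(X,Y) - H(Y)
```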

Mutual information measures the amount of information that can be obtained about one random variable by observing another. The mutual information of X relative to Y is given by:

I(X; Y) = \mathbb{E}_{X,Y}\left[\log \frac{p(x, y)}{p(x)\, p(y)}\right] = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}

A basic property of the mutual information is that:

I(X; Y) = H(X) - H(X \mid Y)
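
A minimal Python sketch of the double-sum definition, reusing the illustrative joint distribution from the previous sketch (again an assumption, not from the source); with those numbers it also reproduces I(X; Y) = H(X) - H(X | Y), roughly 0.125 bits:

```python
import math

def mutual_information(p_xy):
    """I(X; Y) = sum p(x, y) log2( p(x, y) / (p(x) p(y)) ) for a dict {(x, y): prob}."""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0) + p
        p_y[y] = p_y.get(y, 0) + p
    return sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items() if p > 0)

# Same illustrative joint distribution as above.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
print(mutual_information(p_xy))   # ~0.125 bits of information shared between X and Y
```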

Mutual information is symmetric:

I(X; Y) = I(Y; X) = H(X) + H(Y) - H(X, Y)

Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) between the posterior probability distribution of X given the value of Y and the prior distribution on X:

I(X; Y) = \mathbb{E}_{p(y)}\left[ D_{\mathrm{KL}}\bigl( p(X \mid Y = y) \,\|\, p(X) \bigr) \right]

The Kullback–Leibler divergence (or relative entropy) of a distribution q(X) from a distribution p(X) is defined as:

D_{\mathrm{KL}}\bigl( p(X) \,\|\, q(X) \bigr) = \sum_{x \in \mathbb{X}} -p(x) \log q(x) \, - \, \sum_{x \in \mathbb{X}} -p(x) \log p(x) = \sum_{x \in \mathbb{X}} p(x) \log \frac{p(x)}{q(x)}
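
A minimal Python sketch of this definition (the two example distributions are assumptions chosen purely for illustration):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in bits for two distributions given as aligned lists of probabilities."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions over the same three outcomes.
p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
print(kl_divergence(p, q))   # > 0: extra bits per symbol paid for coding p as if it were q
print(kl_divergence(p, p))   # 0: the divergence of a distribution from itself vanishes
```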

Other important information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.

Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
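
As an illustration of this statement, the following Python sketch builds a binary Huffman code (a standard source code, used here purely as an example; the symbol probabilities and function names are assumptions, not from the source) and compares its expected codeword length with the source entropy; the optimal prefix code satisfies H <= L < H + 1:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given symbol probabilities."""
    # Each heap entry: (probability, tie-breaker, list of symbol indices in that subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:   # each merge adds one bit to every symbol in the merged subtrees
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.15, 0.1]                      # assumed source distribution
entropy = -sum(p * math.log2(p) for p in probs)
avg_len = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(entropy, avg_len)   # entropy <= average code length < entropy + 1
```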

The subset of information theory dealing with lossy data compression is called rate–distortion theory. Network information theory refers to multi-agent communication models such as the multiple-access channel, the broadcast channel, and the relay channel.

Properties of information sources such as stationarity and ergodicity are well studied in their own right outside information theory.

Information rate is the average entropy per symbol of a source. For communication over a noisy channel, the appropriate measure is the mutual information between the channel input and output, and its maximum over input distributions is called the channel capacity, given by:

C = \max_{p(x)} I(X; Y)

In the binary erasure channel, the erasure represents complete loss of information about an input bit.

Nauta defined semiotic information theory as the study of "the internal processes of coding, filtering, and information processing."
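
As a numerical sketch of this maximization, the following Python code grid-searches the input distribution of a binary erasure channel with an assumed erasure probability of 0.2 (the channel choice, grid resolution, and names are illustrative assumptions); the maximum matches the known closed form C = 1 - p_e:

```python
import math

def h_b(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bec_mutual_information(q, pe):
    """I(X; Y) for a binary erasure channel with input P(X=1) = q and erasure probability pe.

    Y takes values {0, 1, erasure}; an erasure carries no information about X,
    so I(X; Y) = H(X) - H(X | Y) = (1 - pe) * H_b(q).
    """
    return (1 - pe) * h_b(q)

pe = 0.2
# Channel capacity C = max over input distributions of I(X; Y), here by a coarse grid search.
C = max(bec_mutual_information(q / 1000, pe) for q in range(1001))
print(C)   # ~0.8 bits per channel use, matching the closed form C = 1 - pe
```

For the binary symmetric channel, the same kind of search would recover the closed form C = 1 - H_b(p).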
