In information theory, the noisy-channel coding theorem establishes that, for any given degree of noise contamination of a communication channel, it is possible to communicate digital data (information) nearly error-free up to a computable maximum rate through the channel. This surprising result, sometimes called the fundamental theorem of information theory, or just Shannon's theorem, was first presented by Claude Shannon in 1948. The Shannon limit or Shannon capacity of a communication channel is the theoretical maximum information transfer rate of the channel for a particular noise level.
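For the common special case of a band-limited channel with additive white Gaussian noise, the Shannon limit is given by the Shannon–Hartley formula C = B·log2(1 + S/N). The short Python sketch below evaluates that formula; the 3 kHz bandwidth and 30 dB SNR figures are illustrative assumptions, not values from the text.

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of an AWGN channel, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def db_to_linear(snr_db: float) -> float:
    """Convert a signal-to-noise ratio from decibels to a linear ratio."""
    return 10.0 ** (snr_db / 10.0)

if __name__ == "__main__":
    # Illustrative example: a 3 kHz telephone-grade channel at 30 dB SNR.
    bandwidth = 3000.0          # Hz (assumed for illustration)
    snr = db_to_linear(30.0)    # 30 dB corresponds to a linear ratio of 1000
    c = awgn_capacity(bandwidth, snr)
    print(f"Shannon limit: {c:.0f} bit/s")  # roughly 29,900 bit/s
```

Any attempt to transmit above this rate on such a channel must incur a nonzero error probability, while rates below it can, in principle, be achieved with arbitrarily small error by suitable coding.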