In information theory, the Shannon–Hartley theorem is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity: an upper bound on the rate of error-free digital data (that is, information) that can be transmitted over such a communication link of a specified bandwidth in the presence of noise, under the assumption that the signal power is bounded and the Gaussian noise process is characterized by a known power or power spectral density. The theorem is named after Claude Shannon and Ralph Hartley.
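The capacity bound described above is given by C = B log₂(1 + S/N), where B is the channel bandwidth in hertz and S/N is the linear signal-to-noise power ratio. A minimal sketch of the calculation (the function name and example figures are illustrative, not from the source):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits per second for an AWGN channel.

    Implements C = B * log2(1 + S/N), where B is the bandwidth in
    hertz and S/N is the *linear* (not dB) signal-to-noise ratio.
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative example: a 3 kHz telephone-grade channel with a
# 30 dB SNR (30 dB corresponds to a linear power ratio of 1000).
snr = 10 ** (30 / 10)                  # dB -> linear ratio (1000.0)
capacity = shannon_capacity(3000.0, snr)
print(round(capacity))                 # about 29,902 bits per second
```

Note that capacity grows only logarithmically with signal power: doubling S/N adds at most one extra bit per second per hertz of bandwidth.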