abstract
| - In electrical engineering, computer science and information theory, channel capacity is the tightest upper bound on the amount of information that can be reliably transmitted over a communications channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability. Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution: $C = \max_{p_X} I(X;Y)$.
- Channel capacity is the limit of the transmission rate that can be delivered to the destination with arbitrarily small error. For the Gaussian channel, the capacity is given by $C = \frac{1}{2}\log_2\left(1 + \frac{P}{N}\right)$ bits per channel use, which shows that the channel capacity of the Gaussian channel is determined by the signal-to-noise ratio (SNR) $P/N$. We can derive the capacity of the Gaussian channel by assuming the received signal is represented as $Y = X + Z$, where $X$ and $Z$ are Gaussian signals with powers $P$ and $N$, respectively. If the length $n$ of the input and output signal sequences is infinitely long, the volume of the $n$-dimensional sphere containing the received sequences $Y$ (radius $\sqrt{n(P+N)}$) is $\left(\frac{P+N}{N}\right)^{n/2}$ times bigger than the volume of the sphere containing the noise sequences $Z$ (radius $\sqrt{nN}$). Hence, we predict that the number of maximally distinguishable transmission codewords is $M = \left(1 + \frac{P}{N}\right)^{n/2} = 2^{nC}$, where $C = \frac{1}{2}\log_2\left(1 + \frac{P}{N}\right)$.
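  The relationship above can be checked numerically. The sketch below (illustrative values; the SNR and block length are assumptions, not from the text) computes the Gaussian channel capacity and the sphere-packing estimate of the codeword count, and confirms that $M = 2^{nC}$:

  ```python
  import math

  def gaussian_capacity(snr: float) -> float:
      """Capacity of a discrete-time Gaussian channel, in bits per channel use:
      C = (1/2) * log2(1 + P/N)."""
      return 0.5 * math.log2(1.0 + snr)

  def codeword_count(snr: float, n: int) -> float:
      """Sphere-packing estimate of the number of distinguishable codewords
      for block length n: M = (1 + P/N)^(n/2), which equals 2^(n*C)."""
      return (1.0 + snr) ** (n / 2)

  snr = 15.0                       # assumed example: P = 15, N = 1
  n = 10                           # assumed example block length
  C = gaussian_capacity(snr)       # 0.5 * log2(16) = 2.0 bits per channel use
  M = codeword_count(snr, n)       # 16^5 = 2^20 codewords
  print(C, M, 2 ** (n * C))        # M and 2^(n*C) agree
  ```

  Doubling the block length squares the codeword count while the rate $\frac{1}{n}\log_2 M$ stays fixed at $C$, which is the sense in which capacity is a rate limit.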