Grocery on Distributed Algorithms

T1: Coding Theory (2)

Reminder: This post contains 1314 words · 4 min read · by Xianbin

Introduction

This post considers the following channel-coding problem in a point-to-point system.

A sender wants to reliably send a message \(M\) at rate \(R\) bits per transmission to a receiver over a noisy communication channel. To this end, the sender first encodes the message into a codeword \(X^n\) and transmits it over the channel. Once the decoder receives the noisy sequence \(Y^n\), it decodes it into an estimate \(\hat M\).

The goal is to find the channel capacity: the highest rate \(R\) at which the probability of decoding error can be made to decay to 0 as the code block length \(n\) grows.

Discrete Memoryless Channels (DMC)

A DMC consists of three parts: a finite input alphabet \(\mathcal{X}\), a finite output alphabet \(\mathcal{Y}\), and a collection of conditional probability mass functions \(p(y \mid x)\) on \(\mathcal{Y}\), one for each \(x \in \mathcal{X}\).
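Concretely, a DMC can be stored as a row-stochastic matrix: row \(x\) holds the pmf \(p(\cdot \mid x)\). A minimal sketch, using a binary symmetric channel with crossover probability 0.1 as an illustrative example (that particular channel and parameter are my choice, not from the post):

```python
import numpy as np

# A DMC as a |X| x |Y| row-stochastic matrix:
# row x holds the conditional pmf p(y | x).
# Example: binary symmetric channel (BSC) with crossover probability 0.1.
eps = 0.1
W = np.array([
    [1 - eps, eps],      # p(y | x = 0)
    [eps, 1 - eps],      # p(y | x = 1)
])

# Each row is a pmf, so each row must sum to 1.
assert np.allclose(W.sum(axis=1), 1.0)
print(W)
```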

\(\textbf{Theorem 1}\) (Channel Coding Theorem). The capacity of the DMC \(p(y \mid x)\) is given by the information capacity formula

\[C = \max_{p(x)}I(X;Y)\]
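The maximization over \(p(x)\) can be carried out numerically with the Blahut–Arimoto algorithm. A sketch for the BSC with crossover probability 0.1 (the channel and parameter are illustrative choices; for the BSC the answer is also known in closed form, \(C = 1 - H(\varepsilon)\), which gives a sanity check):

```python
import numpy as np

def blahut_arimoto(W, iters=200):
    """Numerically maximize I(X;Y) over the input pmf p(x)
    for a channel matrix W (rows = p(y|x)). Returns (C in bits, p*)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)                 # start from the uniform input pmf
    for _ in range(iters):
        q = p @ W                             # induced output pmf q(y)
        d = np.sum(W * np.log(W / q), axis=1) # D(W(.|x) || q) in nats, per x
        p = p * np.exp(d)                     # multiplicative update
        p /= p.sum()
    q = p @ W
    C = np.sum(p[:, None] * W * np.log2(W / q))  # I(X;Y) at p*, in bits
    return C, p

eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
C, p_star = blahut_arimoto(W)
h = -(eps * np.log2(eps) + (1 - eps) * np.log2(1 - eps))
print(C, 1 - h)   # both ≈ 0.531 bits per channel use
```

For a symmetric channel the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels the updates move \(p\) toward the capacity-achieving input distribution.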

Memoryless means that

\[p(y^n \mid x^n) = \prod^n_{i=1}p(y_i\mid x_i)\]

The overall communication system is

\[M \to \textup{Encoder} \to X^n \to \textup{Channel } p(y\mid x)\to Y^n \to \textup{Decoder} \to \hat M\]
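Memorylessness makes the channel easy to simulate: each output symbol is drawn from \(p(\cdot \mid x_i)\) independently of everything sent or received before. A sketch, again using an illustrative BSC with crossover probability 0.1:

```python
import numpy as np

rng = np.random.default_rng(0)

def transmit(x, W, rng):
    """Pass codeword x through a DMC W symbol by symbol: memorylessness
    means each y_i is sampled from p(y | x_i) independently of the past."""
    return np.array([rng.choice(W.shape[1], p=W[xi]) for xi in x])

eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
x = rng.integers(0, 2, size=2000)   # a random binary codeword
y = transmit(x, W, rng)
frac = np.mean(x != y)
print("flipped fraction:", frac)    # concentrates near eps = 0.1
```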

Shannon’s second theorem shows that the information channel capacity equals the operational channel capacity, i.e. the highest rate (in bits per channel use) at which information can be sent with arbitrarily low probability of error.

Key Questions

  1. How fast can we transmit information over a channel?
  2. Can the probability of error be made arbitrarily close to 0?
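A naive answer to the second question is repetition coding: send each bit \(n\) times and decode by majority vote. The error probability can indeed be driven toward 0, but the rate \(1/n\) also goes to 0; Shannon's theorem says the error can vanish while the rate stays at a positive constant \(C\). A small simulation over an illustrative BSC (crossover probability 0.1, my choice of parameter):

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 0.1  # BSC crossover probability (illustrative)

def repetition_error(n, trials=20000):
    """Empirical error probability of the n-fold repetition code over a BSC:
    each bit is sent n times and decoded by majority vote (use odd n).
    The rate is 1/n, which tends to 0 as the error does."""
    bits = rng.integers(0, 2, size=trials)
    flips = rng.random((trials, n)) < eps          # i.i.d. bit flips
    received = bits[:, None] ^ flips
    decoded = (received.sum(axis=1) * 2 > n).astype(int)
    return np.mean(decoded != bits)

errs = {n: repetition_error(n) for n in (1, 3, 5, 11)}
print(errs)   # error probability shrinks as n grows, but rate is 1/n
```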