===========================================================================
CSC 310                      Lecture 16 Summary                  Fall 2007
===========================================================================

Topics:
  - Decoding a message
  - Maximum likelihood decoding
  - References

------------------
Decoding a message
------------------

Ideas:
- In general, when the encoder sends the N-symbol message w through the
  channel, the decoder might receive any N-symbol message b.
- The decoder's job is to take the received message b and infer which
  message the encoder could have sent. It should then pick the most likely
  message w (unless we have the option of reporting "too many errors").
- It is important to realize that, for a general channel (more
  specifically, one where for every input symbol every output symbol has
  some non-zero probability of being observed, e.g. the BSC), it is always
  possible for the noise to alter w in such a way that the decoder is
  tricked into inferring a different input message from b. The only thing
  we can ask is that the probability of this happening be small; it cannot
  be 0.
- The job of the perfect/optimal decoder is to compute Pr[X = w | Y = b]
  for all possible input strings w, and output the one with the highest
  such probability.
- In general, this is practically infeasible: if the source message has
  K = 1000 bits, we would have to consider 2^1000 possible input strings.
- By Bayes' rule and the definition of conditional probability,

    Pr[X = w | Y = b] = Pr[X = w] * Pr[Y = b | X = w] / Pr[Y = b]

References: section 9.4

---------------------------
Maximum likelihood decoding
---------------------------

Idea:
- IF all input strings w are equally likely, THEN maximizing the above is
  equivalent to maximizing Pr[Y = b | X = w] (again over the choice of w),
  since Pr[X = w] is then constant and Pr[Y = b] does not depend on w.
- Given b, Pr[Y = b | X = w] is the "likelihood" of w.
- So, if all input strings are equally likely, optimal decoding is maximum
  likelihood decoding (see the small sketch at the end of this summary).

Reference: definitions in section 9.6 ("optimal decoder" in particular).
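
As a small illustration (not from the lecture), here is a brute-force
maximum-likelihood decoder for a BSC with flip probability f. The list of
codewords and the length-3 repetition code in the example are assumptions
made for this sketch. For a BSC, Pr[Y = b | X = w] = f^d * (1-f)^(N-d),
where d is the number of positions in which b and w differ, so (for
f < 1/2) maximizing the likelihood amounts to picking the codeword closest
to b in Hamming distance.

    # Sketch: brute-force ML decoding over a BSC (assumes flip probability
    # f < 1/2 and a small enough code that listing all codewords is feasible;
    # with 2^K codewords this enumeration is exactly the impractical part
    # noted above).

    def bsc_likelihood(w, b, f):
        """Pr[Y = b | X = w] for a BSC: f^d * (1-f)^(N-d), d = # differing bits."""
        d = sum(wi != bi for wi, bi in zip(w, b))
        return (f ** d) * ((1 - f) ** (len(w) - d))

    def ml_decode(b, codewords, f):
        """Return the codeword w maximizing Pr[Y = b | X = w].

        When all codewords are equally likely a priori, this is also the w
        maximizing Pr[X = w | Y = b]: by Bayes' rule, the factors Pr[X = w]
        and Pr[Y = b] do not affect the argmax."""
        return max(codewords, key=lambda w: bsc_likelihood(w, b, f))

    # Example: length-3 repetition code (codewords 000 and 111), f = 0.1.
    codewords = [(0, 0, 0), (1, 1, 1)]
    print(ml_decode((0, 1, 0), codewords, f=0.1))   # -> (0, 0, 0)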