By encoding data redundantly, one can ensure that errors are corrected with high probability, given an assumed model for the noise. Finding good codes for this is easy: just picking one at random is almost sure to work! The challenge is to find good codes that can be decoded nearly optimally in a reasonable amount of time.
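To illustrate the idea of correcting errors through redundancy, here is a small sketch (not an LDPC code, but the classic (7,4) Hamming code, which corrects any single flipped bit). The matrix H and the function names are mine, chosen for illustration:

```python
import numpy as np

# Parity-check matrix H for the (7,4) Hamming code: column j is the
# binary representation of j, so the syndrome of a single-bit error,
# read as a binary number, directly gives the flipped position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def correct(received):
    """Correct at most one flipped bit in a 7-bit word."""
    r = np.array(received) % 2
    syndrome = H @ r % 2                # all arithmetic is modulo 2
    pos = int(syndrome @ [4, 2, 1])     # syndrome read as a binary number
    if pos:                             # nonzero syndrome: flip that bit
        r[pos - 1] ^= 1
    return r

codeword = np.zeros(7, dtype=int)       # the all-zero word is a valid codeword
noisy = codeword.copy()
noisy[4] ^= 1                           # the channel flips bit 5
print(correct(noisy))                   # -> [0 0 0 0 0 0 0]
```

LDPC codes use much larger, sparser parity-check matrices, and are decoded by probabilistic methods rather than by table lookup on the syndrome, but the underlying principle of checking parities of overlapping subsets of bits is the same.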
The following papers with David MacKay deal with codes that can be decoded by probabilistic methods:
MacKay, D. J. C. and Neal, R. M. (1996) ``Near Shannon limit performance of low density parity check codes'', Electronics Letters, vol. 32, pp. 1645-1646. Reprinted with printing errors corrected in vol. 33, pp. 457-458: abstract.
MacKay, D. J. C. and Neal, R. M. (1995) ``Good codes based on very sparse matrices'', in C. Boyd (editor) Cryptography and Coding: 5th IMA Conference, Lecture Notes in Computer Science No. 1025, pp. 100-111. Springer-Verlag: abstract.
You can view slides from two of my talks on LDPC codes by clicking below:
``Monte Carlo decoding of LDPC codes'', ICTP Workshop on Statistical Physics and Capacity-Approaching Codes, May 2001: postscript, pdf.
``Faster encoding for low-density parity check codes using sparse matrix methods'', IMA workshop on Codes, Systems and Graphical Models, Minneapolis, 1999: postscript, pdf.
I now distribute software for Low Density Parity Check (LDPC) codes. This software is meant to support research and education in this area. It also includes modules for operations on dense and sparse modulo-2 matrices, and for random number generation.
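The core operation such modules support is linear algebra over GF(2). As a hypothetical sketch (this is not the actual interface of my software), computing the syndrome of a received word amounts to a matrix-vector product with all arithmetic reduced modulo 2:

```python
import numpy as np

# A small, dense parity-check matrix, chosen arbitrarily for illustration.
H = np.array([[1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1],
              [1, 0, 1, 1, 1]])

def syndrome(H, x):
    """Compute H x over GF(2); a zero syndrome means x satisfies all checks."""
    return H.dot(x) % 2

x = np.array([1, 0, 1, 1, 0])
print(syndrome(H, x))   # -> [0 1 1]
```

For the very sparse matrices used in LDPC codes, a practical implementation would store only the positions of the 1s in each row and column rather than a dense array, which is what makes decoding and encoding feasible for long block lengths.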
Pointers to many more recent and classical papers on Low Density Parity Check codes can be found via the web site of the IMA 1999 workshop on Codes, Systems and Graphical Models. Also of interest is David MacKay's web site.