About this Item
65-103, [1-blank] pages. 8 15/16 x 6 inches. Self-wrappers stapled at the spine. Wraps. This paper was first published in "Information and Control," Vol. 10, No. 1, February 1967 (pp. 65-103), and is here offered in offprint form. So far as we are aware, this offprint was issued without separate wrappers; the reprint statement is printed at the upper left of the first page of the paper.

"The noisy channel coding theorem (Shannon, 1948) states that for a broad class of communication channels, data can be transmitted over the channel in appropriately coded form at any rate less than channel capacity with arbitrarily small error probability. Naturally, there is a rub in such a delightful sounding theorem, and the rub here is that the error probability can, in general, be made small only by making the coding constraint length large; this, in turn, introduces complexity into the encoder and decoder. Thus, if one wishes to employ coding on a particular channel, it is of interest to know not only the capacity but also how quickly the error probability can be made to approach zero with increasing constraint length." (pp. 65-66)

"New lower bounds are presented for the minimum error probability that can be achieved through the use of block coding on noisy discrete memoryless channels. Like previous upper bounds, these lower bounds decrease exponentially with the block length N. The coefficient of N in the exponent is a convex function of the rate. From a certain rate of transmission up to channel capacity, the exponents of the upper and lower bounds coincide. Below this particular rate, the exponents of the upper and lower bounds differ, although they approach the same limit as the rate approaches zero. Examples are given, and various incidental results and techniques relating to coding theory are developed. The paper is presented in two parts: the first, appearing here, summarizes the major results and treats the case of high transmission rates in detail; the second, to appear in the subsequent issue, treats the case of low transmission rate." (abstract)

PROVENANCE: The personal files of Claude E. Shannon (unmarked). There were multiple examples of this item in Shannon's files.

REFERENCES: Sloane and Wyner, "Claude Elwood Shannon: Collected Papers," #122. Reprinted in D. Slepian, editor, "Key Papers in the Development of Information Theory," IEEE Press, New York, 1974, pp. 194-204.

Seller Inventory # 28680
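The exponential behavior the abstract describes can be summarized, in the standard modern notation of the reliability function (these symbols are the usual textbook conventions, not quoted from the offprint itself), as:

```latex
% Sketch in standard notation, not taken from the offprint:
% N = block length, R = transmission rate, C = channel capacity,
% P_e(N,R) = smallest error probability achievable by block codes.
\[
  P_e(N, R) \approx e^{-N E(R)},
  \qquad
  E(R) = \lim_{N \to \infty} \left( -\frac{1}{N} \ln P_e(N, R) \right) .
\]
% The paper's upper and lower bounds bracket the convex exponent E(R);
% for rates between a critical rate and capacity C the two exponents
% coincide, determining E(R) exactly in that range.
```

This is only an orienting sketch of the quantity the bounds in the paper concern; the paper itself states the bounds in full for discrete memoryless channels.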