
  • Martin, W. T. (Ed.); et al.; Bruck, R. H.; Wiener, Norbert; Fenchel, Werner; et al.

    Published by American Mathematical Society, Menasha, WI, U.S.A., 1951

    Seller: SUNSET BOOKS, Newark, OH, U.S.A.

    Seller Rating: 5-star

    First Edition

    US$ 5.50 Shipping

    Within U.S.A.

    Quantity: 1

    Library Rebound. Condition: Very Good. No Jacket. 1st Edition. Library rebound without original covers. No dust jacket. Has or may have all standard library markings, pocket, labels, stamps, wear and soil to covers. Clean text! Thank you for your purchase from Sunset Books! Help promote world literacy - give a book as a gift! In stock, ships from Ohio. We combine shipping on multiple purchases. See pictures; any odd/green tones on the scans are caused by my scanner. All of our technical/textbook/ex-library volumes were obtained legally through public or auction sales. This volume was purchased through DMRO from the Wright-Patterson Technical Library in the late 1990s. The copyright date is 1951 for this printing. Size: 4to. Journal.

  • Wiener, Norbert, et al.

    Published by Cambridge/London: M.I.T. Press, 1966

    Seller: de Wit Books, HUTCHINSON, KS, U.S.A.

    Seller Rating: 5-star

    US$ 5.00 Shipping

    Within U.S.A.

    Quantity: 1

    VG+, unmarked Hardback; no DJ. x + 176 pp.

  • Wiener, Norbert & Alex W. Rathe, et al.

    Published by Executive Techniques, 1951

    Seller: NUDEL BOOKS, New York, NY, U.S.A.

    Seller Rating: 4-star

    Book

    US$ 7.00 Shipping

    Within U.S.A.

    Quantity: 1

    Soft cover. Condition: Very Good. 8vo, stapled wrappers, 40pp, a bit spotted. Photo of Wiener, who is the main contributor and raison d'être of this conference, and a 10pp. essay by him. Typed letter from Rathe presenting the book and discussing Wiener laid in. (VV1/6).

  • Communication / Information Theory: A Collection

    Seller: Manhattan Rare Book Company, ABAA, ILAB

    US$ 90,000.00

    US$ 6.00 Shipping

    Within U.S.A.

    Quantity: 1

    various. Condition: Very Good. First editions. A remarkably complete collection of works documenting the history of the theory of communication of information - what 'information' actually is, and what the theoretical restrictions are on the accurate transmission of information from source to receiver. Note: The numbers in brackets correspond to the titles listed in the accompanying pdf, accessible via the link below the images. The first group of works details the development and proof of what is now called the 'Nyquist-Shannon sampling theorem'. If an analog signal (e.g., voice or music) has to be converted to a digital signal, consisting of binary zeros and ones ('bits'), the theorem states that sampling at twice the highest signal frequency captures the signal perfectly, making it possible to reconstruct the original signal. This theorem laid the foundation for many advances in telecommunications. The first evidence for the sampling theorem was found experimentally by Miner in 1903 [8]. It was formally proposed by Nyquist in 1924 [9, 10] and by Küpfmüller in 1928 [8], but first proved by Nyquist [12] and later by Küpfmüller's student Raabe [8]. In 1941, Bennett [15] referred to Raabe's work and generalized it. A result equivalent to the sampling theorem had, however, been proved by Whittaker as early as 1915 [8, 14] in the context of interpolation theory. Finally, in 1948 Shannon [8, 19] published a proof of both the sampling theorem and the interpolation formula as one part of his broader development of information theory. The term 'information', as a precise concept susceptible of measurement, was coined by Hartley in 1928 [11]. "Hartley distinguished between meaning and information. The latter he defined as the number of possible messages independent of whether they are meaningful. He used this definition of information to give a logarithmic law for the transmission of information in discrete messages …
Hartley had arrived at many of the most important ideas of the mathematical theory of communication: the difference between information and meaning, information as a physical quantity, the logarithmic rule for transmission of information, and the concept of noise as an impediment in the transmission of information" (Origins of Cyberspace 316). In the following year, the physicist Szilard established the connection between information and the thermodynamic quantity 'entropy'. "Szilard described a theoretical model that served both as a heat engine and an information engine, establishing the relationship between information (manipulation and transmission of bits) and thermodynamics (manipulation and transfer of energy and entropy). He was one of the first to show that 'Nature seems to talk in terms of information'" (Seife, Decoding the Universe, 2007, p. 77). Another physicist, Gabor, pointed out the relation between the sampling theorem and the uncertainty principle in quantum mechanics [16]: "Signals do not have arbitrarily precise time and frequency localization. It doesn't matter how you compute a spectrum, if you want time information, you must pay for it with frequency information. Specifically, the product of time uncertainty and frequency uncertainty must be at least 1/4π." In 1942 Wiener issued a classified memorandum (published in 1949 [23]) which combined ideas from statistics and time-series analysis, and used Gauss's method of shaping the characteristic of a detector to allow for the maximal recognition of signals in the presence of noise. This method came to be known as the 'Wiener filter'. In his Mathematical Theory of Communication (1948) [19], Shannon notes: "Communication theory is heavily indebted to Wiener for much of its basic philosophy and theory.
His classic NDRC report 'The Interpolation, Extrapolation, and Smoothing of Stationary Time Series', to appear soon in book form, contains the first clear-cut formulation of communication theory as a statistical problem, the study of operations on time series." Many of the developments in communications theory up to 1948 were summarized and systematized in Wiener's famous book on cybernetics [17]. It is this work of Shannon's that represents the real birth of modern information theory. "Claude Shannon's creation in the 1940s of the subject of information theory is one of the great intellectual achievements of the twentieth century" (Sloane & Wyner, Claude Elwood Shannon Collected Papers, 1993, p. 3). "Probably no single work in this century has more profoundly altered man's understanding of communication than C. E. Shannon's article, 'A mathematical theory of communication', first published in 1948" (Slepian, Key papers in the development of information theory, 1974). "Th[is] paper gave rise to 'information theory', which includes metaphorical applications in very different disciplines, ranging from biology to linguistics via thermodynamics or quantum physics on the one hand, and a technical discipline of mathematical essence, based on crucial concepts like that of channel capacity, on the other … The 1948 paper rapidly became very famous; it was published one year later as a book, with a postscript by Warren Weaver regarding the semantic aspects of information" (DSB). "The revolutionary elements of Shannon's contribution were the invention of the source-encoder-channel-decoder-destination model, and the elegant and remarkably general solution of the fundamental problems which he was able to pose in terms of this model. Particularly significant is the demonstration of the power of coding with delay in a communication system, the separation of the source and channel coding problems, and the establishment of fundamental natural limits on communication.
"Shannon created several original mathematical concepts. Primary among these is the notion of the 'entropy' of a random variable (and by extension of a random sequence), the 'mutual information' between two random variables or sequences, and an algebra that relates these quantities and their derivatives."
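The sampling theorem described in the listing above has a quick numerical illustration. The sketch below (plain NumPy; not drawn from any of the listed works) samples a 3 Hz sine at 10 Hz, above its Nyquist rate of 6 Hz, and reconstructs it at off-grid instants with the Whittaker-Shannon interpolation formula; the frequencies and test points are arbitrary choices for the demonstration.

```python
import numpy as np

f = 3.0        # signal frequency (Hz)
fs = 10.0      # sampling rate (Hz), above the Nyquist rate 2*f = 6 Hz
T = 1.0 / fs
n = np.arange(-500, 501)               # long sample window to tame truncation error
samples = np.sin(2 * np.pi * f * n * T)

def reconstruct(t):
    """Whittaker-Shannon interpolation: x(t) = sum_n x[n] * sinc((t - nT)/T)."""
    return np.sum(samples * np.sinc((t - n * T) / T))

# Reconstruction at instants that fall between the samples agrees with the
# original signal (up to the error of truncating the infinite sinc series)
for t in [0.013, 0.205, -0.111]:
    exact = np.sin(2 * np.pi * f * t)
    print(f"t={t:+.3f}  exact={exact:+.6f}  reconstructed={reconstruct(t):+.6f}")
```

Sampling the same sine below 6 Hz would instead alias it onto a lower frequency, which is the restriction the theorem makes precise.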
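Shannon's 'entropy' and 'mutual information', named in the quotation above, take only a few lines to compute. The sketch below recovers Hartley's logarithmic law as the uniform special case of Shannon's entropy; the joint distribution is an invented toy example, not anything from the collection.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum p*log2(p), in bits; zero-probability terms drop out."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hartley's measure: log2 of the number of equally likely messages.
# Eight equiprobable messages carry log2(8) = 3 bits.
print(entropy([1/8] * 8))   # 3.0

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) for a toy joint distribution
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
px = joint.sum(axis=1)      # marginal of X
py = joint.sum(axis=0)      # marginal of Y
mi = entropy(px) + entropy(py) - entropy(joint.ravel())
print(mi)                   # ≈ 0.278 bits shared between X and Y
```

With an independent joint distribution (each cell 0.25) the mutual information would be exactly zero, matching Hartley's separation of information from meaning: the measure counts shared distinguishability, nothing more.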