A Mathematical Theory of Communication by Claude E. Shannon
Claude Elwood Shannon
In his master's thesis, Shannon demonstrated the equivalence between Boolean algebra and
logic gates (AND, OR, ...), establishing a mathematical tool (Boolean algebra) with which
electronic circuits can be modeled, and accelerating the development of all of electronics,
computing, and informatics.
FGS, August 2010
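As a concrete illustration of that equivalence (not taken from Shannon's thesis), the following minimal Python sketch models a small relay/switch network as a Boolean expression and enumerates its truth table; the circuit and the names used are purely illustrative.

```python
# Illustrative sketch: a relay/switch network expressed as a Boolean formula,
# in the spirit of Shannon's 1937 thesis (the circuit itself is made up).

def circuit(a: bool, b: bool, c: bool) -> bool:
    """Two switches in series (AND) placed in parallel (OR) with a third switch."""
    return (a and b) or c

# Enumerating the truth table is the Boolean-algebra analogue of checking the
# relay network contact by contact.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            print(int(a), int(b), int(c), "->", int(circuit(a, b, c)))
```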
"...La teoría formal de la información nació de los artículos
publicados en 1948 por el matemático estadounidense Claude E.
Shannon. En ellos enunció: la medida de la información más
ampliamente usada hoy en día es la entropía. La entropía había
venido siendo un concepto central de la termodinámica, la rama
de la física que trata del calor. Suele decirse que la entropía
termodinámica expresa el desorden de un sistema físico"
http://www.iieh.org/Informacion/articulos_informacion01.php#(NOTA%205)
Regresar a Personajes Fundamentales de la Informática
Regresar a www.fgalindosoria.com
Claude Elwood Shannon
Wikipedia (20100514)
“Claude Elwood Shannon (April 30, 1916 – February 24, 2001), an American
mathematician and electronic engineer, is known as "the father of information theory".[1]
Shannon is famous for having founded information theory with one landmark paper
published in 1948. But he is also credited with founding both digital computer and digital
circuit design theory in 1937, when, as a 21-year-old master's student at MIT, he wrote a
thesis demonstrating that electrical application of Boolean algebra could construct and
resolve any logical, numerical relationship. It has been claimed that this was the most
important master's thesis of all time.”
http://en.wikipedia.org/wiki/Claude_E._Shannon
The vacuum tube
Wikipedia (20100515)
"In 1937, Claude Shannon wrote his master's thesis at MIT, which implemented Boolean
algebra using electronic relays and switches for the first time in history. Titled "A Symbolic
Analysis of Relay and Switching Circuits," Shannon's thesis essentially founded practical
digital circuit design."
http://es.wikipedia.org/wiki/Primera_generaci%C3%B3n_de_computadoras#El_tubo_de_vac.C3.ADo
Claude Elwood Shannon
Wikipedia (20100514)
"...In his master's thesis at MIT, he showed how Boolean algebra could be used in the
analysis and synthesis of switching and digital circuits. The thesis attracted considerable
interest when it appeared in 1938 in the specialized journals. In 1940 he was awarded the
Alfred Noble Prize of the American engineering societies of the United States, an award
given each year to a person no more than thirty years old. A quarter of a century later
H. H. Goldstine, in his book The Computer from Pascal to von Neumann, cited his thesis
as one of the most important in history, one that helped change digital circuit design..."
http://es.wikipedia.org/wiki/Claude_Shannon
"En su tesis de maestría en el MIT, dirigida por Frank L. Hitchcock, demostró como el
álgebra boleana se podía utilizar en el análisis y la síntesis de la conmutación y de los
circuitos digitales. La tesis despertó un interés considerable al ser publicada en 1938. En
1940 le fue concedido gracias a ella el Premio Alfred Noble de las sociedades de ingeniería
de los Estados Unidos, otorgado anualmente a una persona de no más de treinta años. Un
cuarto de siglo más tarde H. H. Goldstine, en su libro Las computadoras desde PASCAL
hasta Von Neumann, la citó como una de las tesis de maestría más importantes de la
historia (...) que ayudó a cambiar el diseño de circuitos digitales.
......
Durante el verano de 1938 efectuó trabajos de investigación en el MIT y le fue concedida la
beca Bolles cuando trabajaba como ayudante de enseñanza mientras realizaba un doctorado
en matemáticas, de nuevo bajo la dirección de Hitchcock. En 1940 presentó su tesis, donde
proponía un álgebra para problemas de genética teórica, un tema propuesto por Bush.
......
Durante este período Shannon trabajó en muchas áreas, siendo lo mas notable todo lo
referente a la teoría de la información, un desarrollo que fue publicado en 1948 bajo el
nombre de Una teoría matemática de la comunicación. En este trabajo se demostró que
todas las fuentes de información (telégrafo, teléfono, radio, el hablante humano, las
cámaras de televisión, etc.) se pueden medir y que los canales de comunicaciones tienen
una unidad de medida similar. Mostró también que la información se puede transmitir sobre
un canal si y solamente si la magnitud de la fuente no excede la capacidad de transmisión
del canal que la conduce y sentó las bases de la corrección de errores, supresión de ruidos y
redundancia."
http://enciclopedia.us.es/index.php/Claude_Shannon
https://es.wikipedia.org/wiki/Claude_Elwood_Shannon
Entropy and Information / Entropía e Información
http://www.fgalindosoria.com/informatica/aspects/e_i/entropy_information/
Información y Entropía (Information and Entropy)
Guillermo Agudelo Murguía, José Guillermo Alcalá Rivero
"...La teoría formal de la información nació de los artículos publicados en 1948 por el
matemático estadounidense Claude E. Shannon. En ellos enunció: la medida de la
información más ampliamente usada hoy en día es la entropía. La entropía había venido
siendo un concepto central de la termodinámica, la rama de la física que trata del calor.
Suele decirse que la entropía termodinámica expresa el desorden de un sistema físico"
http://www.iieh.org/Informacion/articulos_informacion01.php#(NOTA%205)
"Shannon introduces an H function of the following form:
where K is a positive constant. Shannon then states that "any quantity of this form, where K
merely amounts to a choice of a unit of measurement, plays a central role in information
theory as measures of information, choice, and uncertainty." Then, as an example of how
this expression applies in a number of different fields, he references R.C. Tolman's 1938
Principles of Statistical Mechanics, stating that "the form of H will be recognized as that of
entropy as defined in certain formulations of statistical mechanics where pi is the
probability of a system being in cell i of its phase space…"
Wikipedia 20140314
http://en.wikipedia.org/wiki/History_of_entropy - Information_theory
Information theory
Wikipedia 20140314
"An analog to thermodynamic entropy is information entropy. In 1948, while working at
Bell Telephone Laboratories electrical engineer Claude Shannon set out to mathematically
quantify the statistical nature of "lost information" in phone-line signals. To do this,
Shannon developed the very general concept of information entropy, a fundamental
cornerstone of information theory. Although the story varies, initially it seems that Shannon
was not particularly aware of the close similarity between his new quantity and earlier work
in thermodynamics. In 1949, however, when Shannon had been working on his equations
for some time, he happened to visit the mathematician John von Neumann. During their
discussions, regarding what Shannon should call the "measure of uncertainty" or
attenuation in phone-line signals with reference to his new information theory, according to
one source:[10]
"My greatest concern was what to call it. I thought of calling it ‘information’,
but the word was overly used, so I decided to call it ‘uncertainty’. When I
discussed it with John von Neumann, he had a better idea. Von Neumann told
me, ‘You should call it entropy, for two reasons. In the first place your
uncertainty function has been used in statistical mechanics under that name, so
it already has a name. In the second place, and more important, nobody knows
what entropy really is, so in a debate you will always have the advantage"
According to another source, when von Neumann asked him how he was getting on with
his information theory, Shannon replied:[11]
"The theory was in excellent shape, except that he needed a good name for
"missing information". "Why don’t you call it entropy", von Neumann
suggested. "In the first place, a mathematical development very much like yours
already exists in Boltzmann's statistical mechanics, and in the second place, no
one understands entropy very well, so in any discussion you will be in a
position of advantage."
In 1948 Shannon published his famous paper A Mathematical Theory of Communication, in
which he devoted a section to what he calls Choice, Uncertainty, and Entropy.[12] In this
section, Shannon introduces an H function of the following form:

H = -K \sum_{i=1}^{n} p_i \log p_i

where K is a positive constant. Shannon then states that "any quantity of this form, where K
merely amounts to a choice of a unit of measurement, plays a central role in information
theory as measures of information, choice, and uncertainty." Then, as an example of how
this expression applies in a number of different fields, he references R.C. Tolman's 1938
Principles of Statistical Mechanics, stating that "the form of H will be recognized as that of
entropy as defined in certain formulations of statistical mechanics where p_i is the
probability of a system being in cell i of its phase space… H is then, for example, the H in
Boltzmann's famous H theorem." As such, over the last fifty years, ever since this statement
was made, people have been overlapping the two concepts or even stating that they are
exactly the same.
Shannon's information entropy is a much more general concept than statistical
thermodynamic entropy. Information entropy is present whenever there are unknown
quantities that can be described only by a probability distribution. In a series of papers by
E. T. Jaynes starting in 1957,[13][14] the statistical thermodynamic entropy can be seen as
just a particular application of Shannon's information entropy to the probabilities of
particular microstates of a system occurring in order to produce a particular macrostate."
http://en.wikipedia.org/wiki/History_of_entropy#Information_theory
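As a concrete check of the H function quoted above, here is a minimal Python sketch (not part of either source) that evaluates H = -K \sum_i p_i \log p_i for a discrete distribution, with the constant K absorbed into the choice of logarithm base:

```python
import math

def shannon_entropy(probs, base=2.0):
    """H = -sum(p_i * log(p_i)); the log base plays the role of the constant K,
    fixing the unit of measurement (base 2 gives bits)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less information per toss.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```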
Reprinted with corrections from The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948.
A Mathematical Theory of Communication
By C. E. SHANNON
"INTRODUCTION
The recent development of various methods of modulation such as PCM and PPM which
exchange bandwidth for signal-to-noise ratio has intensified the interest in a general theory
of communication. A basis for such a theory is contained in the important papers of
Nyquist [1] and Hartley [2] on this subject. In the present paper we will extend the theory to
include a number of new factors, in particular the effect of noise in the channel, and the
savings possible due to the statistical structure of the original message and due to the nature
of the final destination of the information.
The fundamental problem of communication is that of reproducing at one point either
exactly or approximately a message selected at another point. Frequently the messages have
meaning; that is they refer to or are correlated according to some system with certain
physical or conceptual entities. These semantic aspects of communication are irrelevant to
the engineering problem. The significant aspect is that the actual message is one selected
from a set of possible messages. The system must be designed to operate for each possible
selection, not just the one which will actually be chosen since this is unknown at the time of
design.
If the number of messages in the set is finite then this number or any monotonic function of
this number can be regarded as a measure of the information produced when one message
is chosen from the set, all choices being equally likely. As was pointed out by Hartley the
most natural choice is the logarithmic function. Although this definition must be
generalized considerably when we consider the influence of the statistics of the message
and when we have a continuous range of messages, we will in all cases use an essentially
logarithmic measure.
The logarithmic measure is more convenient for various reasons:
1. It is practically more useful. Parameters of engineering importance such as time,
bandwidth, number of relays, etc., tend to vary linearly with the logarithm of the number
of possibilities. For example, adding one relay to a group doubles the number of possible
states of the relays. It adds 1 to the base 2 logarithm of this number. Doubling the time
roughly squares the number of possible messages, or doubles the logarithm, etc.
2. It is nearer to our intuitive feeling as to the proper measure. This is closely related to
(1) since we intuitively measure entities by linear comparison with common standards.
One feels, for example, that two punched cards should have twice the capacity of one for
information storage, and two identical channels twice the capacity of one for
transmitting information.
3. It is mathematically more suitable. Many of the limiting operations are simple in
terms of the logarithm but would require clumsy restatement in terms of the number of
possibilities.
The choice of a logarithmic base corresponds to the choice of a unit for measuring
information. If the base 2 is used the resulting units may be called binary digits, or more
briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a
relay or a flip-flop circuit, can store one bit of information. N such devices can store N bits,
since the total number of possible states is 2^N and \log_2 2^N = N.
If the base 10 is used the units may be called decimal digits. Since
\log_2 M = \log_{10} M / \log_{10} 2 = 3.32 \log_{10} M,
a decimal digit is about 3 1/3 bits. A digit wheel on a desk computing machine has ten
stable positions and therefore has a storage capacity of one decimal digit. In analytical work
where integration and differentiation are involved the base e is sometimes useful. The
resulting units of information will be called natural units.
Change from the base a to base b merely requires multiplication by \log_b a.
[1] Nyquist, H., “Certain Factors Affecting Telegraph Speed,” Bell System Technical Journal, April 1924, p. 324; “Certain Topics in
Telegraph Transmission Theory,” A.I.E.E. Trans., v. 47, April 1928, p. 617.
[2] Hartley, R. V. L., “Transmission of Information,” Bell System Technical Journal, July 1928, p. 535."
http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
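The arithmetic in the passage quoted above (bits versus decimal digits, 2^N states for N two-state devices, and change of logarithm base) can be checked with a few lines of Python; this is only an illustrative sketch, not part of Shannon's paper.

```python
import math

# log2 M equals log10 M / log10 2; e.g. M = 1000 requires about 9.97 bits.
M = 1000
print(math.log2(M), math.log10(M) / math.log10(2))

# One decimal digit corresponds to log2(10) ≈ 3.32 bits ("about 3 1/3 bits").
print(math.log2(10))

# N two-state devices (relays, flip-flops) have 2**N states and store N bits,
# since log2(2**N) = N.
N = 8
print(math.log2(2 ** N))   # 8.0
```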
"A Mathematical Theory of Communication by Claude E. Shannon
A Note on the Edition
Claude Shannon's "A mathematical theory of communication" was first published in two
parts in the July and October 1948 editions of the Bell System Technical Journal [1]. The
paper has appeared in a number of republications since:
- The original 1948 version was reproduced in the collection Key Papers in the
Development of Information Theory [2]. The paper also appears in Claude Elwood
Shannon: Collected Papers [3]. The text of the latter is a reproduction from the Bell
Telephone System Technical Publications, a series of monographs by engineers and
scientists of the Bell System published in the BSTJ and elsewhere. This version has
correct section numbering (the BSTJ version has two sections numbered 21), and as
far as we can tell, this is the only difference from the BSTJ version.
- Prefaced by Warren Weaver's introduction, "Recent contributions to the
mathematical theory of communication," the paper was included in The
Mathematical Theory of Communication, published by the University of Illinois
Press in 1949 [4]. The text in this book differs from the original mainly in the
following points:
  o the title is changed to "The mathematical theory of communication" and
    some sections have new headings,
  o Appendix 4 is rewritten,
  o the references to unpublished material have been updated to refer to the
    published material.
The text we present here is based on the BSTJ version with a number of corrections. (The
version on this site before May 18th 1998 was based on the University of Illinois Press
version.)
Here you can find a PostScript (460 Kbytes), gzipped PostScript (146 Kbytes) and pdf (358
Kbytes) version of Shannon's paper. PDF files can be viewed by Adobe's acrobat reader.
Tarred and gzipped contents of the directory (63 Kbytes) that contain the LaTeX code for
the paper is also available."
http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html
The Mathematical Theory of Communication
Claude E Shannon, Warren Weaver
Univ of Illinois Press, 1949
University of Illinois Press Champaign, IL, USA, 1963
http://www.magmamater.cl/MatheComm.pdf