
ISBN: 0486240614

ISBN13: 9780486240619

An Introduction to Information Theory: Symbols, Signals and Noise


Format: Paperback

Condition: Good

Price: $5.69 (List Price: $22.00)

Book Overview

"Uncommonly good...the most satisfying discussion to be found." -- Scientific American.
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary...

Customer Reviews

5 ratings

Good book for the basics of information theory

I give this book five stars because it succeeds brilliantly at what it sets out to do: introduce the field of information theory in an accessible, non-mathematical way to the completely uninitiated. Information theory is the branch of mathematics that deals with the information content of messages. The theory addresses two aspects of communication: "How can we define and measure information?" and "What is the maximum amount of information that can be sent through a communications channel?" No other book I know of can explain the concepts of information, bits, entropy, and data encoding without getting bogged down in proofs and mathematics. The book even manages to equate the concept of language with the information it inherently transmits, all in a conversational and accessible style. It rounds out its discussion with chapters on information theory from the perspectives of physics, psychology, and art. The only math necessary to understand what's going on in this book is high school algebra and the concept of logarithms. If you are an engineer or engineering student who already knows something about information theory, you probably will not find this book helpful; instead, you would do better to start with a more advanced book like "An Introduction To Information Theory" by Reza, which introduces the concepts from a more mathematical perspective.
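
(Editor's note: as a rough illustration of the reviewer's first question, "How can we define and measure information?", and not something taken from Pierce's book, here is a minimal Python sketch of Shannon's entropy formula, H = -sum(p_i * log2(p_i)), which measures the average information in bits per symbol. The example messages are made up.)

    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Average information per symbol in bits: H = -sum(p * log2(p))."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # A message built from fewer, more predictable symbols carries less information per symbol.
    print(shannon_entropy("abababab"))     # 1.0 bit/symbol: two equally likely symbols
    print(shannon_entropy("information"))  # about 2.9 bits/symbol: more varied symbols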

An Absolute Gem

Claude Shannon died last year, and it's really disgraceful that his name is not a household word in the manner of Einstein and Newton. He really WAS the Isaac Newton of communications theory, and his master's thesis on Boolean logic applied to circuits is probably the most cited ever. This is the ONLY book of which I am aware that attempts to present Shannon's results to the educated lay reader, and Pierce does a crackerjack job of it. Notwithstanding, this is not a book for the casual reader. The ideas underlying the theory are inherently subtle and mathematical, although there are numerous practical manifestations of them in nature and in human "information transmission" behavior. On the other hand, this is a work which repays all effort invested in its mastery many times over.

Worth a Careful Reading

Pierce is an accomplished scientist and engineer, and he was influential in the development of information theory and signal processing. This book has some mathematics, but it lays a solid qualitative foundation for understanding the material. It is a classic, good for computer engineers and scientists (as is his book Signals: The Science of Telecommunications). The presentation is accessible, and first-hand accounts of important discoveries motivate a real appreciation for Pierce's contributions. However, the clarity of the presentation tends to obscure just how profound and deep the thinking involved really is. During my first reading, Pierce's insights made the material seem almost obvious; later I would doubt that such straightforward approaches could be correct, and then think through the correctness of his assertions. That is why this is a great book: it focuses on the important ideas and doesn't shy away from deep topics. It is a great book for those interested in the basis of information theory; as a side note, Shannon's original papers are also quite readable.

Still the place to start

Although old, this is still the best book for learning the core ideas of this subject, especially what information "entropy" really means. I read Ash's book and followed the proofs, but I didn't really grasp the ideas until I read this one. The book is geared toward non-mathematicians, but it is not just a tour: Pierce tackles the main ideas, just not all the techniques and special cases. Perfect for: anyone in science, linguistics, or engineering. Very good for: everyone else.

Best Introduction

Though first printed in 1961 and revised in 1980, this is the best introduction to information theory there is. Very easy to read and light on math, just as an introduction should be. I expect it will be in print for a very, very long time.