Information Theory


What Does Information Theory Mean?

Information theory is a branch of mathematics that studies how information can be quantified and defines efficient, practical methods by which data can be encoded, transmitted, and interpreted. The field originated in a mid-twentieth-century paper by the mathematician Claude Shannon, which set many important precedents for digital technology, including the use of the bit as a unit of measurement.
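To illustrate what measuring information in bits means, here is a minimal sketch (in Python, not part of the original definition) of Shannon's entropy formula, which gives the average information content of a source in bits per symbol:

```python
import math

def shannon_entropy(probabilities):
    """Average information content of a source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of information per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin carries less, because its outcome is more predictable.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The less predictable a message is, the more bits are needed on average to represent it, which is the sense in which the bit serves as a unit of measurement for information.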


Techopedia Explains Information Theory

Prior to information theory, electronic communication relied mostly on analog transmission, which worked well enough over short distances but became problematic as distances increased and signals degraded. Claude Shannon was an employee of Bell Labs (the research and development arm of the Bell System) during the mid-twentieth century and worked on making electronic communication more efficient and secure during the Second World War.

Shannon’s research was first published as the 1948 paper “A Mathematical Theory of Communication” and later expanded into the book “The Mathematical Theory of Communication” (co-written with Warren Weaver). It laid the groundwork for much of modern digital technology, including the use of binary code.



Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret’s idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.