Journal Article

Information Theoretic Signal Processing and Its Applications [Bookshelf]
Document Type
Periodical
Author
Source
IEEE Control Systems, 43(2):97-109, Apr. 2023
Subject
Robotics and Control Systems
Computing and Processing
Components, Circuits, Devices and Systems
Book reviews
Information theory
Signal processing
Probability density function
Maximum likelihood estimation
Data models
Random variables
Language
English
ISSN
1066-033X (print)
1941-000X (electronic)
Abstract
The roots of information theory are almost 100 years old and include early works by Fisher [1], Hartley [2], and others. According to a history of information theory [3], motivated to understand how to draw information from experiments, Fisher [1] stated “the nature and degree of the uncertainty [must] be capable of rigorous expression.” Subsequently, he defined statistical information as the reciprocal of the variance of a statistical sample. However, it was Shannon’s work [4] that laid the mathematical foundations of information theory and revolutionized communications. Shannon developed two fundamental bounds, one on data compression and the other on transmission rate. He proved that even in the presence of noise, an arbitrarily small probability of error may be achieved as long as the transmission rate is below a quantity he defined as channel capacity.
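For reference, the quantities the abstract alludes to are usually stated as follows; these are standard textbook formulations (Fisher information with the Cramér-Rao bound, and Shannon's source and channel coding bounds), not material quoted from the reviewed book.

% Fisher information and the Cramér-Rao lower bound: the variance of an unbiased
% estimator is bounded below by the reciprocal of the Fisher information.
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\,\ln f(X;\theta)\right)^{2}\right],
\qquad
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)}

% Source coding bound: a discrete source X cannot be compressed below its entropy,
% measured in bits per symbol.
H(X) = -\sum_{x} p(x)\,\log_{2} p(x)

% Channel coding bound: reliable communication (arbitrarily small error probability)
% is possible at any rate R < C, where C is the channel capacity.
C = \max_{p(x)} I(X;Y)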