USC News


USC Viterbi Lecture Celebrates Information Theory
Sergio Verdú, left, and Andrew J. Viterbi

In 1948, an American researcher almost singlehandedly laid the foundation for computers, cell phones, compact discs, the Internet, interplanetary communications and most other aspects of what we call the Information Age.

Sixty-four years later, Sergio Verdú, who served as the 2012 Andrew J. Viterbi Distinguished Lecturer in Communications at the USC Viterbi School of Engineering on March 1, offered a strikingly clear introduction to visionary Claude Shannon’s work and its impact with a presentation titled “What Is Information Theory?”

Verdú, the Eugene Higgins Professor of Electrical Engineering at Princeton University, specializes in the field Shannon created with his 1948 paper, “A Mathematical Theory of Communication” – a field in which USC Viterbi emerged as a key international center.

The Ming Hsieh Department of Electrical Engineering, which sponsored the event, includes two faculty divisions, one of which focuses on information theory. Alexander “Sandy” Sawchuk, chair of the Electrical Engineering – Systems division, served as host. Faculty members from USC Viterbi’s Information Sciences Institute, based in Marina del Rey, were among the attendees, who included Andrew J. Viterbi, PhD ’62, the lecture’s namesake.

What they heard was a lecture remarkable for its clarity and lack of technical language. Verdú’s account contained – as he proudly noted – only one equation, and he carefully defined his terms, starting with the basic information nexus that Shannon was the first to prove mathematically describable.

Verdú illustrated the equation with a simple diagram he called “The Coat of Arms.” On one side was an information source, on the opposite side an information destination. Between them were two boxes: a transmitter on the source side and a receiver on the destination side. The transmitter produces a signal; the receiver receives it, but the received signal is not identical to the transmitted one because between the two boxes sits an input labeled “noise.”

Shannon saw that if a message was in digital form, as ones and zeros, then information transmission could be seen as an interaction of two factors. The first, bandwidth, is the number of ones and zeros that a pathway (usually electronic or photonic, though alternatives have been used) can accommodate in a given time.

The second is the amount of noise, disorder or (to use Shannon’s term) entropy in the pathway – that is, the likelihood that any given one received should actually have been a zero, and vice versa. Bandwidth is never infinite and noise is never zero, but Shannon showed that mathematics can define and predict the relationship between the two.
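That relationship can be made concrete with Shannon’s own binary entropy function. The sketch below – an illustration, not part of Verdú’s talk – computes the capacity of a binary symmetric channel, the textbook model of the pathway just described, where each transmitted bit flips with probability p:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Shannon's binary entropy H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity, in bits per transmitted bit, of a binary symmetric
    channel in which each bit flips with probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries one full bit per bit sent;
# a channel that flips half its bits carries nothing at all.
print(bsc_capacity(0.0))              # 1.0
print(bsc_capacity(0.5))              # 0.0
print(round(bsc_capacity(0.1), 3))    # 0.531
```

Even 10 percent noise, in other words, still leaves more than half the raw bandwidth usable – provided the message is coded cleverly enough, which is exactly what Shannon’s theorem guarantees is possible.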

Almost overnight, the insight and its rigorous mathematical foundation turned electronic communication from what had been a mass of empirical guesswork and ad hoc, blind-design gadgets into a game with precisely defined rules.

Finding the best solution within the rules was neither obvious nor easy – it depended on finding mathematics that could at least sketch algorithmic descriptions of complex phenomena like images and sounds – but the immense rewards of each success built on those that came before. Many members of the audience had distinguished themselves by creating such mathematical models.

If the important thing is to minimize bandwidth but still transmit a recognizable product – a picture, for example – one set of equations can perform the desired task. But if accuracy instead is critical, it can be achieved by adding redundancy (the mathematical equivalent of repeating on the phone “that’s ‘b’ as in Barney, ‘r’ as in rabbit …”) to the message, and Shannon’s theory specifies exactly how much redundancy to add to counteract any specified amount of noise.
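The redundancy idea can be sketched with the simplest error-correcting code of all: a repetition code with majority-vote decoding. This is an illustrative toy (the message and the flipped positions below are made up), not a code from the lecture:

```python
from collections import Counter

def encode_repetition(bits, n=3):
    """Add redundancy: repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Recover each bit by majority vote over its n copies."""
    return [Counter(received[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(received), n)]

message = [1, 0, 1]
sent = encode_repetition(message, n=3)   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
received = sent.copy()
received[1] = 0                          # noise flips one copy of the first bit
received[5] = 1                          # ... and one copy of the second
print(decode_repetition(received, n=3))  # [1, 0, 1] -- the message survives
```

Repeating each bit three times costs two-thirds of the bandwidth; Shannon’s point was that far more efficient codes exist, trading just enough redundancy against the channel’s actual noise level.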

Verdú offered one universally familiar measure that combines the two factors. The bars displayed on a telephone do not indicate how much signal is available. Instead they display how much communication is possible given both bandwidth and noise.

Once Shannon opened the door to these kinds of mathematical manipulations of information, new technologies emerged.

Counting down a long list, Verdú began with applications of the Viterbi algorithm, based on the idea that if a receiver avoids interpreting a message until it has a larger sample of it, it can distinguish more separate messages “and minimize the probability that the whole message is a mistake.”
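That idea – defer judgment until the whole received sequence can be weighed at once – is the heart of the Viterbi algorithm. The sketch below is a generic dynamic-programming version over a hypothetical two-state model (all probabilities invented for illustration), not Viterbi’s original convolutional-code decoder; it finds the single most probable hidden sequence behind an entire run of noisy observations:

```python
def viterbi(observations, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state sequence for a whole
    run of observations, rather than guessing symbol by symbol."""
    # best[t][s] = (probability of best path ending in s, its predecessor)
    best = [{s: (start_p[s] * emit_p[s][observations[0]], None)
             for s in states}]
    for obs in observations[1:]:
        layer = {}
        for s in states:
            prob, prev = max(
                (best[-1][p][0] * trans_p[p][s] * emit_p[s][obs], p)
                for p in states)
            layer[s] = (prob, prev)
        best.append(layer)
    # Trace the winning path back from the most probable final state.
    state = max(states, key=lambda s: best[-1][s][0])
    path = [state]
    for layer in reversed(best[1:]):
        state = layer[state][1]
        path.append(state)
    return path[::-1]

states = (0, 1)
start_p = {0: 0.5, 1: 0.5}
trans_p = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}  # hypothetical: bits tend to repeat
emit_p = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.2, 1: 0.8}}   # hypothetical 20% flip noise
print(viterbi([0, 0, 1, 0, 0], states, start_p, trans_p, emit_p))  # [0, 0, 0, 0, 0]
```

Read symbol by symbol, the middle observation says 1; weighed against the whole run, the decoder concludes it was almost certainly noise and overrules it – exactly the “larger sample” advantage Verdú described.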

A question from the audience was, “Can you say where we’d be without this?”

Verdú’s clipped answer: “Nowhere.”

Afterward, Sawchuk presented Verdú with three nondigital artifacts to take back to Princeton: a plaque, a Trojan baseball hat and a T-shirt labeling its wearer a “Trojan Dad.”
