Digital Communication
Course Notes for ECE417

Frank R. Kschischang

Department of Electrical and Computer Engineering
University of Toronto

[Cover figure: a signal-space diagram with I and Q axes; see "About the cover picture" below.]

Version 0.92
Copyright © 2015 by F. R. Kschischang
Not for distribution. These notes may not be distributed or reproduced without the express consent of the author.



Acknowledgments

I am grateful to have had many excellent students whose keen interest in digital communications has kept me on my toes, and whose insightful questions and comments helped to shape these course notes. I particularly wish to thank the readers of previous versions of these notes who took the time to send me written comments or point out errors; these include Major Bhadauria, Sikang Bian, Anton Brjozovski, David Chen, Yongfeng Chen, Francisca Chrysostom, Brad Drehmer, G. David Forney, Jr., Jenny He, Chi Kin Ho, Pauline Ho, Neil Jain, Ashish Khisti, Krishna Kishore, Eddy Lee, Kevin Lien, Darryl DeXu Lin, Po-Yi Lin, Devin Lui, Joshua Marshall, Vicky Ng, Ketan Shah, Danilo Silva, Benjamin Smith, Gokul Soundararajan, Tharshan Sundaramoorthy, Frank Tai, Srividya Thangirala, Jonathan Waisman and Thomas Wong. My thanks to you all!

Dear reader, I would be most happy to hear from you as well. Please email comments, questions, corrections, kudos and concerns to me; my email address is [email protected].

Frank R. Kschischang
University of Toronto
January 11, 2015

About the cover picture: The cover picture illustrates the signal-space diagram for a 16-quadrature-amplitude-modulation (16-QAM) digital modulation scheme at the output of an additive white Gaussian noise channel. Each of the 16 possible transmitted signals has been sent 500 times, resulting in 8,000 received signals, each represented by a small dot. Given a received signal, the job of the demodulator is to determine which of the 16 possible transmitted signals (corresponding roughly to the center of each "cloud" of received points) was sent. If the demodulator is successful, four bits will have been transferred over the channel.
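The cover scenario is easy to reproduce in a few lines. The sketch below is an illustration only: it assumes a standard 16-QAM constellation with in-phase and quadrature levels taken from {-3, -1, 1, 3}, and the noise level and random seed are arbitrary choices, not parameters from these notes. Each of the 16 points is sent 500 times through an AWGN channel, and the demodulator picks the nearest constellation point.

```python
import random

# 16-QAM constellation: I and Q levels each drawn from {-3, -1, 1, 3}
constellation = [complex(i, q) for i in (-3, -1, 1, 3)
                               for q in (-3, -1, 1, 3)]

def demodulate(r):
    """Return the constellation point nearest (in Euclidean distance) to r."""
    return min(constellation, key=lambda s: abs(r - s))

def simulate(sigma=0.3, sends_per_point=500, seed=1):
    """Send each point repeatedly over an AWGN channel; count symbol errors."""
    rng = random.Random(seed)
    errors = 0
    total = 0
    for s in constellation:
        for _ in range(sends_per_point):
            noise = complex(rng.gauss(0, sigma), rng.gauss(0, sigma))
            if demodulate(s + noise) != s:
                errors += 1
            total += 1
    return errors, total

errors, total = simulate()
print(total)  # -> 8000 received points, as on the cover
```

Plotting the received points in the I-Q plane would reproduce the 16 "clouds" of the cover picture; at this (arbitrary) noise level nearly every point is demodulated correctly.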


Contents

1 Introduction
  1.1 Digital Communication
  1.2 The Communication System Model
  1.3 Outline of the Course
  1.4 Prerequisite Tools
  1.5 References
  1.6 Problems

2 Information Theory I: Source Coding
  2.1 Sources
  2.2 Codes
  2.3 Prefix Codes and Kraft's Inequality
  2.4 Entropy and the Source Coding Theorem
  2.5 Shannon-Fano Codes
  2.6 Huffman Codes
  2.7 Approaching Entropy by Source Extension
  2.8 Lempel-Ziv Coding
  2.9 References
  2.10 Problems

3 Waveform Coding
  3.1 Sampling
  3.2 Quantization
  3.3 More Advanced Coding Methods
  3.4 References
  3.5 Problems

4 Geometric Signal Theory
  4.1 Complex Numbers
  4.2 Signals
  4.3 Vector Spaces
  4.4 Linear Independence
  4.5 Euclidean Spaces
  4.6 Norms
  4.7 Metrics
  4.8 The Projection Theorem
  4.9 Orthonormal Bases
  4.10 References
  4.11 Problems

5 Signal Detection in AWGN
  5.1 The Additive White Gaussian Noise Channel
  5.2 Signal Detection in AWGN
  5.3 Performance Analysis
  5.4 Gaussian Channel Capacity
  5.5 Problems

6 Pulse Transmission over Bandwidth-Limited Channel
  6.1 Digital Modulation
  6.2 Linear Time-Invariant Channels
  6.3 Nyquist's Criterion
  6.4 Partial Response Signaling
  6.5 Problems

7 Information Theory II: Channel Coding
  7.1 Block Codes
  7.2 Linear Block Codes
  7.3 The Binary Symmetric Channel
  7.4 Distance Measures for Codes
  7.5 Error Detection
  7.6 Error Correction
  7.7 The Hamming Bound and Perfect Codes
  7.8 The Singleton Bound and Reed-Solomon Codes
  7.9 Convolutional Codes
  7.10 References
  7.11 Problems

A Random Processes
  A.1 Definitions
  A.2 Gaussian processes
  A.3 White Gaussian processes

B Bounds on the Complementary Error Function

C Problem Solutions


CHAPTER 1

Introduction

1.1 Digital Communication

Communication is the process of transferring information from "there and then" to "here and now." Information comes in many forms. Voice signals, email, photographs, audio signals, television signals, web pages, textbooks: all of these—and many more—qualify as information sources. Communication in space, from "there" to "here," is accomplished by a transmission system: for example, a pair of voice-band modems, a cellular telephone and cellular base-station, or a laser and photo-detector located at opposite ends of a transoceanic optical fiber. Communication in time, from "then" to "now," is accomplished by a storage system: for example, a disk drive, compact disc, or audio tape.

Today, in the early part of the 21st century, we are nearing the end of a revolution—the digital communication revolution—that has seen analog communication systems largely supplanted by digital communication systems.

Analog communication systems represent waveforms in a continuous (as opposed to a discrete) manner. For example, in AM (amplitude modulation), the instantaneous amplitude of a sinusoidal carrier is varied continuously in proportion to a message signal (e.g., a voice), whereas in FM (frequency modulation), it is the instantaneous frequency of the carrier that is varied. Relics of the former analog era remain (e.g., AM/FM radio, analog television, or cellular telephones that use analog FM transmission), but even these systems are now nearing the end of their technological life, as their more efficient digital replacements become available.
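The AM/FM distinction above can be made concrete in a few lines of code. In this sketch (the carrier frequency, message frequency, modulation index, and frequency deviation are arbitrary illustrative values, not taken from these notes), the message scales the carrier's amplitude in AM, while in FM it advances the carrier's phase, since instantaneous frequency is the derivative of phase.

```python
import math

fc = 100.0  # carrier frequency in Hz (illustrative value)
fm = 5.0    # message frequency in Hz (illustrative value)

def message(t):
    """A simple sinusoidal message signal m(t)."""
    return math.cos(2 * math.pi * fm * t)

def am(t, mod_index=0.5):
    """AM: the carrier amplitude varies in proportion to the message."""
    return (1 + mod_index * message(t)) * math.cos(2 * math.pi * fc * t)

def fm_signal(t, freq_dev=20.0):
    """FM: instantaneous frequency is fc + freq_dev * m(t).

    The phase is the integral of the instantaneous frequency; for a
    cosine message this integral has the closed form used below.
    """
    phase = 2 * math.pi * fc * t + (freq_dev / fm) * math.sin(2 * math.pi * fm * t)
    return math.cos(phase)
```

Sampling `am` and `fm_signal` over a few message periods and plotting them shows the familiar pictures: an AM waveform whose envelope traces the message, and an FM waveform of constant envelope whose zero crossings bunch and spread.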

Digital communication systems use signals that are selected from a discrete (as opposed to a continuous) alphabet of possible signals. The idea is not new: smoke signals, drum signals, and light signals have long been used to effect reliable long distance communication.

Digital communication systems have significant advantages over analog systems. For example, discrete signals are efficiently relayed: intermediate stations can reliably (i.e., with high probability) reproduce the transmitted signal, even when it has been corrupted by noise. Creating a copy of a copy of a copy of an analog signal (e.g., an analog video tape) reduces signal quality at each stage, whereas discrete signals can be copied almost ad infinitum essentially without error. Analog communication systems are almost always tailored for transmission of a particular type of source: voice, music, video, etc. An analog system tailored for one type of source cannot typically be used to communicate another type. On the other hand, general-purpose digital communication systems can be designed without specifying the type of information transferred. Thus, for example, the Internet can be used to transfer text, voice, music, video, etc.

Modern digital communication systems are almost always defined in terms of the transmission of binary digits or bits, i.e., symbols that take on values from the set {0, 1}. Using messages composed entirely out of bits, communication of just about any type of information message is achieved reliably and efficiently. Reliability is typically measured by the bit error rate, the probability that a transmitted bit is received in error. Efficiency is typically measured by the bit rate (the number of bits transmitted per unit time in a transmission system, or the number of bits stored per unit area in a recording system) while achieving a particular target bit error rate (e.g., 10^-3, 10^-7, 10^-9, or 10^-15, depending on the particular system).
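To get a feel for what such reliability figures mean in practice, note that if bit errors occur independently with probability p, a link carrying R bits per second sees on average R·p errors per second, so the mean time between errors is 1/(R·p). The numbers below are hypothetical, chosen only for illustration.

```python
def mean_time_between_errors(bit_rate, bit_error_prob):
    """Average seconds between bit errors, assuming independent errors."""
    return 1.0 / (bit_rate * bit_error_prob)

# A hypothetical 10 Gb/s link operating at a bit error rate of 1e-12:
t = mean_time_between_errors(1e10, 1e-12)
print(t)  # -> 100.0 seconds, on average, between bit errors
```

The same system at a bit error rate of 10^-15 would average one error per ~28 hours, which is why such stringent targets are demanded of high-speed links.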

The designer of a communication system must make efficient use of available resources—signal power, bandwidth, signal-processing complexity, system cost—and face various impairments—noise, interference, system imperfections—in order to achieve a given system specification: communicate at a certain rate with a certain reliability. Communication theory in general, and this course specifically, is concerned with studying the tradeoffs involved.

1.2 The Communication System Model

The communication model that we will deal with in this course is shown in Fig. 1.1. This model of a point-to-point communication system includes an information source, a transmitter, a noisy channel, a receiver, and a destination. The purpose of the transmitter is to take the output of the information source, and convert it into a form suitable for transmission through the noisy channel, for subsequent detection by the receiver. The receiver examines the channel output, and attempts to infer what the transmitter sent. It passes the result of this inference to the destination. The whole purpose of such a communication system is to deliver information from the source to the destination reliably and efficiently. Once a communications medium has been chosen, of the five blocks shown in Fig. 1.1, typically only the transmitter and the receiver are under the control of the system designer.

[Figure: blocks SOURCE → TRANSMITTER → CHANNEL (with NOISE input) → RECEIVER → DESTINATION]

Figure 1.1: A point-to-point communication system.

Figure 1.1 is often expanded as shown in Fig. 1.2. Here, the transmitter now contains four blocks: the source encoder, the encryptor, the channel encoder, and the modulator, not all of which are necessarily present in every point-to-point communication system. The receiver has corresponding blocks that perform the inverse function.



[Figure: the TRANSMITTER chain SOURCE → SOURCE ENCODER → ENCRYPTOR → CHANNEL ENCODER → MODULATOR feeds the CHANNEL (with NOISE input); the RECEIVER chain DEMODULATOR → CHANNEL DECODER → DECRYPTOR → SOURCE DECODER → DESTINATION reverses it; a KEY is shared by the encryptor and decryptor]

Figure 1.2: Expanded transmitter/receiver blocks.

The purpose of the source encoder is to transform the output of the information source to an efficient digital representation. If the information source is analog, the source encoder will typically implement some type of analog-to-digital (A/D) conversion. If the information source is digital, the source encoder may simply implement some sort of translation from the alphabet of source symbols to some other alphabet (typically bits). Efficiency of a source encoder is typically measured in terms of some sort of bit rate (bits per second or bits per message), with lower bit rates indicating greater efficiency. Thus the domain of the source encoder is data compression; i.e., representation of the source with as few bits as possible, while still permitting reconstruction of the source output. We will study the efficient representation of information in Chapters 2 and 3.
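As a toy preview of the data-compression idea (the source probabilities and codewords here are invented for illustration; the systematic construction of such codes is the subject of Chapter 2): when source symbols are not equally likely, giving short codewords to frequent symbols lowers the average number of bits per symbol below what a fixed-length code needs.

```python
# A hypothetical three-symbol source with probabilities 1/2, 1/4, 1/4,
# and a prefix code assigning shorter codewords to likelier symbols.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}
code = {"a": "0", "b": "10", "c": "11"}

def source_encode(symbols):
    """Concatenate the codeword for each source symbol."""
    return "".join(code[s] for s in symbols)

avg_length = sum(probs[s] * len(code[s]) for s in probs)
print(source_encode("abca"))  # -> "010110"
print(avg_length)             # -> 1.5 bits/symbol, vs 2 for a fixed-length code
```

Because no codeword is a prefix of another, the bit stream can be decoded unambiguously; this prefix property is studied in Section 2.3.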

The purpose of the encryptor is secure communication. The transmitter and the receiver often share some sort of secret key, which permits them to encrypt and decrypt messages while preventing an eavesdropper from doing the same. Encryption is particularly important in communication networks, where transmitted packets (containing, e.g., credit card information) might easily be observed in transit. The encryptor/decryptor blocks are shown within dashed boxes in Fig. 1.2 since they will not be a major focus of this course.

The purpose of the channel encoder is to provide reliable communication, even in the presence of channel noise. By adding redundant bits to the transmitted messages, it is possible to devise error-correcting codes, i.e., coding schemes that permit the correction of symbol errors within a transmitted block. The channel encoder function is dual to the source encoder function; the latter attempts to remove redundancy, the former adds redundancy (but in controlled fashion so as to permit useful error control). We will study error control coding in Chapter 7.
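The simplest illustration of adding redundancy for error control is the classic rate-1/3 repetition code (a textbook toy example, not a scheme developed in these notes): each bit is transmitted three times, and the decoder takes a majority vote over each block of three, thereby correcting any single bit error per block.

```python
def rep_encode(bits):
    """Rate-1/3 repetition code: repeat each information bit three times."""
    return [b for b in bits for _ in range(3)]

def rep_decode(coded):
    """Majority vote over each block of three received bits."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

sent = rep_encode([1, 0, 1])  # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[4] = 1                   # the channel flips one bit in the middle block
print(rep_decode(sent))       # -> [1, 0, 1]: the single error is corrected
```

The price of this reliability is a threefold reduction in rate; Chapter 7 develops codes that buy much more error protection per redundant bit.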

The modulator maps symbols from a discrete alphabet to a discrete set of waveforms that are suitable for transmission through the given channel. For example, in a radio channel, the transmitted pulses will have most of their energy confined to some prescribed frequency band. In a voice-band modem, the transmitted signals are confined to the audio band; in a fiber-optic transmission system, the transmitted signals are light pulses. Pulse transmission and the detection of pulses in the presence of noise will be the focus of our study in Chapters 4, 5, and 6.

The channel block models the physical transmission medium between the transmitter and the receiver. Physical channels might be constructed by running wires or an optical fiber between transmitter and receiver; the channel might be wireless (as in a radio system); or it might be some magnetic recording medium (on a tape or disk). Channels inevitably have some type of noise or interference that corrupts the transmission of waveforms through them.

The receiver has blocks that correspond to the transmitter blocks. The demodulator translates from waveforms to discrete symbols, the channel decoder performs error correction, the decryptor recovers the message from its encrypted form, and the source decoder reconstructs the source signal from its digital representation.

Claude Elwood Shannon (1916–2001) was the father of information theory.

Shannon was born, raised and educated in Michigan. He graduated from the University of Michigan in 1936 with two bachelor's degrees, one in electrical engineering and one in mathematics. He then began graduate study at the Massachusetts Institute of Technology (MIT), where he worked on Vannevar Bush's differential analyzer, an analog computer. While studying the complicated circuits of the differential analyzer, Shannon saw that Boolean algebra (which he had studied in Michigan) could be of great use. His 1937 master's thesis on this subject, called "possibly the most important, and also the most famous, master's thesis of the century," established the (now standard) use of Boolean algebra in digital logic design. Shannon's 1940 Ph.D. thesis at MIT, entitled "An Algebra for Theoretical Genetics," was well ahead of its time.

Following his graduate studies, Shannon joined Bell Labs to work on fire-control systems and cryptography during World War II, where he made many contributions. In 1948 Shannon published his magnum opus, "A Mathematical Theory of Communication," in the Bell System Technical Journal. This work single-handedly founded the field of information theory by introducing concepts that establish the fundamental limits on information representation (entropy) and on information transmission (channel capacity). Another notable paper published in 1949 is "Communication Theory of Secrecy Systems," a major contribution to the development of a mathematical theory of cryptography.

Outside of his academic pursuits, Shannon was interested in juggling, unicycling, and chess. He also invented many unusual devices, including rocket-powered flying discs, a motorized pogo stick, a wearable computer to predict the result of playing roulette, and a flame-throwing trumpet for a science exhibition. One of his more humorous devices was a box called the "Ultimate Machine," based on an idea by Marvin Minsky. The box possessed just a single switch on its side, which, when flipped, caused a mechanical hand to reach from inside the box and flip the switch back. The "Ultimate Machine" was one whose only purpose was to switch itself off!

In 1956 he returned to MIT to hold an endowed chair, where he worked for another two decades. In his later life, Shannon was afflicted by Alzheimer's disease. He died in a nursing home in Massachusetts in 2001.

1.3 Outline of the Course

The outline of this course follows the expanded block diagram of Fig. 1.2.

The first section of the course will cover the source encoder and source decoder blocks. Source coding for discrete sources, particularly Huffman coding, will be covered, along with a discussion of the fundamental information-theoretic limits that govern compression systems. The basic tradeoffs in analog-to-digital conversion and waveform coding will also be treated.

The next section of the course will cover geometric signal theory and pulse detection schemes in additive white Gaussian noise. The optimum (minimum probability of error) pulse detection system for pulses transmitted over an additive white Gaussian noise channel will be derived and analyzed. The main result will be to understand the tradeoff between the signal-energy to noise-energy ratio and the maximum achievable transmission efficiency (measured in bits per symbol).

The third section of the course will cover pulse transmission over band-limited channels. The main problem considered in this section of the course is that of intersymbol interference, i.e., interference that can arise among the transmitted pulses themselves. The main result will be to understand the tradeoff between bandwidth and maximum achievable pulse transmission rates (measured in symbols per second).

Combining these two sections, we will arrive at the main result of this course, which might be stated as:

bits per second = bits per symbol × symbols per second. (1.1)

Much of communication engineering results from optimizing the tradeoffs involved in this equation.
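A concrete (hypothetical) instance of (1.1): a modem using 16-QAM carries log2(16) = 4 bits per symbol, so at a symbol rate of 2400 symbols per second it delivers 9600 bits per second. The numbers are illustrative choices, not specifications from these notes.

```python
import math

bits_per_symbol = math.log2(16)  # 16-QAM: 4 bits per symbol
symbols_per_second = 2400        # hypothetical symbol rate
bit_rate = bits_per_symbol * symbols_per_second
print(bit_rate)  # -> 9600.0 bits per second
```

Doubling either factor doubles the bit rate, but each costs something different: more bits per symbol demands more signal power for the same reliability, while more symbols per second demands more bandwidth.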

The final section of the course will be devoted to error control coding. Fundamental information-theoretic limits that govern transmission systems will be discussed. As we will see, even with an optimum detection system for pulses, some form of error control coding is needed to operate a transmission system at maximum possible transmission efficiency.

1.4 Prerequisite Tools

The study of digital communication systems will rely on many previous courses. The most important tools will be:

• System and Signal Analysis. Much of communication theory relies on knowledge of the theory of signals, in both the time domain and the frequency domain. Accordingly, the Fourier transform plays an important role, as does the Sampling Theorem.

• Probability Theory. Since noise is random, probability theory is needed to evaluate the performance of various digital communication schemes. In fact, most important performance measures in digital communication—the bit error rate and the block error rate—are probabilities. The output of a source of information is typically modelled probabilistically. Other quantities, such as signal-to-noise ratios, are defined in probabilistic terms, as the ratio of second moments of random variables.

• Linear Algebra. The "front-end" of a detector of signals in additive white Gaussian noise performs a vector space projection. The theory of vector spaces, inner products, metrics, and other aspects of linear algebra play an important role in digital communication theory.

• Discrete Mathematics. Various aspects of discrete mathematics, including graph theory, graph algorithms, and modular arithmetic, are used in digital communication theory.

Historical Notes

The block diagram of Fig. 1.1 appears as Figure 1 of Claude E. Shannon's 1948 paper "A Mathematical Theory of Communication" [1], a paper that has sometimes been referred to as the "Magna Carta of the information age." Shannon founded the field of information theory, which establishes fundamental limits on data compression and data transmission efficiency. An excellent documentary outlining the impact that Shannon's work has had on modern communication systems is available online [2].

1.5 References

[1] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379–423 and pp. 623–656, July and October, 1948. Online: http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html.

[2] UCSD-TV, Regents of the University of California, "Claude Shannon: Father of the Information Age," 29.5 minute video documentary, 2002. Available online at http://www.ucsd.tv/search-details.asp?showID=6090.



Milestones in the History of Telecommunications and Recording

1793 C. Chappe invents the first long-distance relay network based on visual semaphore.
1831 J. Henry invents the first electric telegraph.
1837 S. F. B. Morse publicly demonstrates the telegraph.
1865 J. C. Maxwell predicts the propagation of electromagnetic waves through space.
1866 Completion of first transatlantic telegraph cable.
1876 A. G. Bell and E. Gray independently invent the telephone.
1877 T. Edison patents the phonograph.
1887 H. Hertz shows that electromagnetic waves exist.
1899 V. Poulsen invents the first magnetic recordings.
1901 G. Marconi sends the first transatlantic wireless signals.
1904 J. A. Fleming invents the two-element "Fleming Valve" (vacuum tube diode).
1905 L. de Forest presents paper on "Audion" vacuum tube.
1907 First transatlantic commercial wireless service established by Marconi.
1910 T. Edison demonstrates the first talking motion picture.
1912 E. H. Armstrong invents the feedback regenerator.
1913 Armstrong patents his version of the valve generator.
1914 Transcontinental telephone line is completed.
1918 Armstrong develops the superheterodyne receiver.
1923 V. Zworykin invents cathode ray tube.
1924 Directive short wave antenna developed by H. Yagi and S. Uda.
1925 J. L. Baird transmits the first experimental TV signal.
1926 First public test of radiotelephone service from New York to London.
1927 First public demonstration of long distance television transmission.
1931 AT&T opens Teletypewriter Exchange Service (TWX).
1933 Armstrong demonstrates FM.
1934 S. J. Begun invents the magnetic tape recorder.
1936 First long distance coaxial cable for multi-channel telephony installed.
1937 The combined handset telephone introduced commercially.
1938 The first crossbar central office installation goes into service.
1938 Orson Welles' "War of the Worlds" radio play sparks pandemonium.
1944 A telephone submarine cable is laid across the English Channel.
1946 Mobile telephone service is introduced commercially in St. Louis, MO.
1947 The point contact transistor is invented by Brattain and Bardeen.
1948 Invention of the junction transistor.
1948 C. E. Shannon publishes "A Mathematical Theory of Communication."
1948 Vinyl long playing (LP) record invented, played at 33 rpm.
1949 Network television starts in U.S.
1957 The Soviet Union launches Sputnik, the first artificial satellite.
1958 J. S. Kilby and R. Noyce independently develop an integrated circuit.
1958 C. Carlson invents the photocopier.
1960 ECHO I communications satellite is launched.
1960 Laser is invented.
1962 T1 carrier is put into commercial service.
1962 First transatlantic transmission of a TV signal via the TELSTAR satellite.
1962 A.T.&T. introduces 300 bps voiceband modem, using FSK.
1966 First public demonstration of worldwide direct telephone dialling.
1966 Xerox invents the telecopier — the first successful fax machine.
1969 ARPANET begins 4-node operation.
1969 Video and audio are transmitted back from the first Moon landing.
1970 Corning Glass demonstrates optical fibers.
1971 The computer floppy disc invented.
1973 Ethernet invented at Xerox PARC.
1979 First cellular phone communication network started in Japan.
1980 Sony Walkman invented.
1983 First cellular phone network in the U.S.
1983 First ISDN (integrated services digital network) trials begin in Japan.
1988 First transatlantic fiber optic cable is completed.
1989 Fiber to the home trials begin in the U.S.
1993 First digital mobile network established in the U.S. (in Los Angeles).
1996 The cable modem is introduced.

Sources: http://www.ieee.org/web/aboutus/history center/conferences/comsoc/timelines.html
http://www.webbconsult.com/history.html



1.6 Problems

(Solutions to problems can be found at the back of these notes.)

1.1 In magnetic recording systems, efficiency is measured in terms of storage density inbits/cm2. Generalize (1.1) to this case.

1.2 An optical fiber system transmits information at 40 Gbps (= 4 × 10^10 bits/s) with a bit error probability of 10^-15. Assuming bit errors occur independently, what is the average time between successive errors?

1.3 Instead of bit error probability, packet systems often use the packet error probability as a reliability measure. Suppose that packets are N bits long and that bit errors occur independently with probability p. A packet is in error if any bit that it contains is in error.

a) Determine the packet error rate in terms of the bit error rate.

b) If p = 10^-5 and we require a packet error rate of at most 10^-2, how large can N be?

1.4 Let X be a random variable with probability density function

fX(x) = (1/√π) exp(−x²).

Show that E[X] = 0 and E[X²] = 1/2.