Digital Representation of Analog Information: How Images and Sound are Stored and Communicated (Harvard Bits)
The Sine Wave
Amplitude
Frequency and Period • Frequency in cycles/sec; Period in sec/cycle • Frequency = 1/Period; Period = 1/Frequency
Period and Wavelength • Period = time duration of one cycle • Wavelength = spatial length of one cycle • For waves traveling at a fixed speed, period and wavelength are proportional • E.g. light travels at speed c m/sec, and Wavelength = c * Period
Wavelength · Frequency = Speed: (m/cycle) · (cycles/sec) = m/sec • If the speed is fixed then wavelength and frequency vary inversely • E.g. the speed of light in vacuum and the speed of sound in air are constant • Frequency is measured in Hertz: 1 Hz = 1 cycle/sec • AC current = 60 Hz • The A above middle C = 440 Hz • Audible telephone frequencies = 400 - 3400 Hz = 0.4 - 3.4 kHz • Visible light = (4 - 7.5) * 10^14 Hz
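As a quick illustration of the wavelength-frequency relation, here is a minimal Python sketch (the speeds used are standard approximate values, not taken from the slides):

    # wavelength = speed / frequency, for a wave traveling at a fixed speed
    SPEED_OF_LIGHT = 3.0e8    # m/sec, approximate
    SPEED_OF_SOUND = 343.0    # m/sec in air at room temperature, approximate

    def wavelength(speed, frequency_hz):
        """Spatial length of one cycle, in meters."""
        return speed / frequency_hz

    print(wavelength(SPEED_OF_SOUND, 440))    # the A above middle C: ~0.78 m
    print(wavelength(SPEED_OF_LIGHT, 5e14))   # visible light: ~6e-7 m = 600 nm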
Phase
Sum of Sine Waves [plots of sin(x) and 0.2*sin(10*x), and of their sum sin(x) + 0.2*sin(10*x)]
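The curve on this slide can be reproduced in a few lines of Python; this is just a sketch of the sum sin(x) + 0.2*sin(10*x), tabulated over one period of the slow component:

    import math

    def composite(x):
        """A low-frequency sine plus a smaller, higher-frequency sine."""
        return math.sin(x) + 0.2 * math.sin(10 * x)

    xs = [2 * math.pi * k / 100 for k in range(101)]   # one period of sin(x)
    ys = [composite(x) for x in xs]
    print(min(ys), max(ys))   # the ripple keeps the sum within roughly +/-1.2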
Touch-Tone Telephone • Column tones: 1209 Hz, 1336 Hz, 1477 Hz • Row tones: 697 Hz, 770 Hz, 852 Hz, 941 Hz • Each key press sounds its row tone and its column tone together
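A rough sketch of generating the tone for the "5" key (row 770 Hz, column 1336 Hz); the 8 kHz sampling rate and 0.2 s duration are illustrative choices, not from the slides:

    import math

    SAMPLE_RATE = 8000    # samples per second (the telephone sampling rate)

    def dtmf_tone(row_hz, col_hz, seconds=0.2):
        """Sum of the two sine waves that encode one key press."""
        n = int(SAMPLE_RATE * seconds)
        return [0.5 * math.sin(2 * math.pi * row_hz * k / SAMPLE_RATE) +
                0.5 * math.sin(2 * math.pi * col_hz * k / SAMPLE_RATE)
                for k in range(n)]

    tone_for_5 = dtmf_tone(770, 1336)   # the "5" key: row 770 Hz, column 1336 Hz
    print(len(tone_for_5))              # 1600 samples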
Visible Light [spectrum figure: a blue filter passes short-wavelength, high-frequency light; a red filter passes long-wavelength, low-frequency light] A filter is something that transmits only a limited “band” of wavelengths
Signals can be Filtered [figure: a composite signal and its separated components]
Any Periodic Signal is Approximately a Sum of Sine Waves
Fourier Analysis = Decomposition of Signal into Sines • The signal is usually a sum of waves of higher and higher frequency and lower and lower amplitude • Higher-frequency components give greater accuracy • The next component of the square wave is another, smaller and faster, sine wave (see the sketch below)
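A sketch of the square-wave example: the partial sums sin(x) + sin(3x)/3 + sin(5x)/5 + … approach a square wave as more (higher-frequency, lower-amplitude) components are added. The helper below is mine, not part of the slides:

    import math

    def square_wave_partial_sum(x, n_terms):
        """Sum of the first n_terms odd harmonics of a square wave."""
        return sum(math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms))

    # More components -> the value at the top of the wave settles toward pi/4
    for n in (1, 3, 10, 100):
        print(n, square_wave_partial_sum(math.pi / 2, n))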
Sampling A signal can be reconstructed from samples taken at regular intervals as long as the intervals are short enough
Undersampling causes Aliasing If the samples are too infrequent, a lower-frequency signal may fit the sampled points and the original signal can’t be recovered
Nyquist Sampling Theorem • For the signal to be recovered accurately from the samples, the sampling rate must be more than twice the frequency of the highest-frequency component • E.g. for a wave of frequency 1 kHz, the sampling rate must be more than 2 kHz to recover the signal
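A small numerical check of the theorem (the frequencies are illustrative): a 1 kHz sine sampled at only 1.2 kHz, below the required 2 kHz, yields exactly the sample values of a 200 Hz alias, so the original cannot be recovered.

    import math

    def sample(freq_hz, rate_hz, n=8):
        """First n samples of a unit sine wave taken at the given sampling rate."""
        return [math.sin(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

    undersampled = sample(1000, 1200)   # 1 kHz signal, 1.2 kHz sampling rate
    alias = sample(200, 1200)           # a 200 Hz signal sampled the same way
    # The two sets of samples coincide (up to a sign flip), so the sampler
    # cannot tell the 1 kHz signal and its 200 Hz alias apart.
    print(all(abs(u + a) < 1e-9 for u, a in zip(undersampled, alias)))   # True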
Alias = Another Signal with Same Samples as Original
Audio Frequencies and Sampling • Telephone system designed around a 3.4 kHz maximum • Human hearing extends up to 20 kHz • Loss of high-frequency components ==> poorer-quality sound • Digital telephones sample at 8 kHz = 2 * 4 kHz • Audio CDs sample at 44.1 kHz > 2 * 20 kHz • Some PC sound cards sample at this rate • So VoIP (Voice over IP) can have higher fidelity than telephone land lines!
Quantization: How Many Bits per Sample? • n bits/sample => 2^n possible sample values • Audio CDs => 16 bits/sample * 2 channels for stereo • Digital telephones => 8 bits/sample
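A minimal sketch of uniform quantization to n bits per sample, assuming samples scaled to the range -1.0 … +1.0 (the helper name and scaling are my own choices):

    def quantize(sample, n_bits):
        """Map a sample in [-1.0, +1.0] to one of 2**n_bits integer levels."""
        levels = 2 ** n_bits
        code = round((sample + 1.0) / 2.0 * (levels - 1))
        return max(0, min(levels - 1, code))   # clamp to a valid level

    print(quantize(0.0, 8))    # mid-scale of the 256 telephone-style levels: 128
    print(quantize(0.5, 16))   # one of the 65,536 CD-style levels: 49151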
How Many Bits of Music? • Audio CD: 1 hour of music = 3600 s * 44,100 samples/s * 16 bits/sample * 2 stereo channels ≈ 5 Gb ≈ 635 MB • Bits are used to reconstruct the sine waves, not simply to adjust the volume in jagged jumps
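The slide's arithmetic, written out (using decimal megabytes):

    seconds  = 3600       # one hour of music
    rate     = 44_100     # samples per second, per channel
    bits     = 16         # bits per sample
    channels = 2          # stereo

    total_bits = seconds * rate * bits * channels
    print(total_bits)                    # 5,080,320,000 bits, about 5 Gb
    print(total_bits / 8 / 1_000_000)    # about 635 megabytes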
Compression of Music • CDs are uncompressed • When the CD standard was set, it would have been too expensive to put decompression chips into consumer electronics • Decompression requires intelligence in the processor • CDs are a dying technology, already often used only once, to move music onto a computer disk or iPod • What you can do with information depends on the representation!
Compressing Music Losslessly • For storage on computer disk, compression is possible because music samples have low entropy • Less space <==> more computing • Simple example: take advantage of the fact that successive samples usually differ by only a little • E.g. difference coding: record one value (16 bits) and then just the changes, sample to sample • E.g. 4527; +1, 0, 0, -3, +2, 0, 0, 0, +7, 0, 0, -1, … • Huffman coding this sequence ==> huge compression • Real example: FLAC = Free Lossless Audio Codec (see the sketch below)
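A minimal sketch of the difference-coding idea (the function names and sample values are mine; real FLAC uses more sophisticated prediction followed by entropy coding):

    def difference_encode(samples):
        """Record the first value, then only the change from sample to sample."""
        deltas = [b - a for a, b in zip(samples, samples[1:])]
        return samples[0], deltas

    def difference_decode(first, deltas):
        out = [first]
        for d in deltas:
            out.append(out[-1] + d)
        return out

    first, deltas = difference_encode([4527, 4528, 4528, 4528, 4525, 4527])
    print(first, deltas)                      # 4527 [1, 0, 0, -3, 2]
    print(difference_decode(first, deltas))   # the original samples back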
Lossy Compression of Music • Once you have the bits, there is lots of computing you can do on them • Principle: If the average teenager can’t hear the difference, why waste money preserving it? • Rely on psychoacoustic phenomena to compress music in a way that sounds almost perfect but isn’t • Not to be used at the studio for archival storage • A family of methods -- depending on the degree of compression, enough information may be thrown away to be subtly audible
Lossy Audio Compression Ideas • Throw away very high frequency components • Throw away any component that is soft if it is simultaneous with a loud component • Change stereo to mono (50% savings) if mostly low frequencies -- where stereo is hard to hear • MP3, RealAudio, … • These standards stipulate decoding but not encoding -- there may be several encodings of the same music that discard different information to produce different storage sizes and bit rates
Still Image and Video Encoding • GIF and JPEG for still images • JPEG better for continuous-tone color, GIF for monochrome and line drawings • JPEG exploits the fact that 24 bits of color are more than the eye can see • Eye is more sensitive to small fluctuations in intensity than small fluctuations in color • Spatial coherence: colors similar pixel to pixel • MPEG exploits temporal coherence for movies: successive frames of video are usually similar
Modulation • There is only one sine wave at a given frequency, so how does the information get carried at a particular frequency? • Modulation = Encoding information on a signal • Analog radio modulation technologies: • FM = Frequency Modulation • AM = Amplitude Modulation
Amplitude Modulation
FM = Frequency Modulation
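A rough Python sketch of the two analog schemes, modulating a carrier with a message signal m(t). The carrier and message frequencies below are illustrative values, not from the slides:

    import math

    CARRIER_HZ = 1000.0    # illustrative carrier frequency
    MESSAGE_HZ = 50.0      # illustrative audio-range message frequency

    def message(t):
        return math.sin(2 * math.pi * MESSAGE_HZ * t)

    def am(t, depth=0.5):
        """AM: the message varies the amplitude of the carrier."""
        return (1.0 + depth * message(t)) * math.sin(2 * math.pi * CARRIER_HZ * t)

    def fm(t, deviation_hz=100.0, steps=1000):
        """FM: the message shifts the carrier's instantaneous frequency; the
        phase is the running integral of that frequency (approximated here
        by a simple sum)."""
        dt = t / steps
        phase = 0.0
        for k in range(steps):
            phase += 2 * math.pi * (CARRIER_HZ + deviation_hz * message(k * dt)) * dt
        return math.sin(phase)

    print(am(0.001), fm(0.001))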
Problems with AM and FM • Power varies with amplitude • [to be precise, the relevant measure is the root mean square (rms) of the signal: the square root of the average of the squared signal; power is proportional to the square of this rms value] • As the signal fades with distance, some parts of AM signals drop out before others • But AM can transmit over longer distances, because AM frequencies bounce off the ionosphere and diffract around hills and buildings, while FM frequencies are absorbed, causing “shadows” • This difference between AM and FM is due to the frequency bands allocated to them, not the modulation technique!
Signal and Noise • Signal is the information you want to transmit • Noise is just another signal, added to and interfering with the signal you want to transmit • Some noise is random and unavoidable and comes from natural sources • Some noise is intentional and is actually someone else’s signal • A party can be “noisy” even though most of the “noise” is just conversations other than yours!
Noise and Channel Capacity • If the noise is “soft” it is easy to pick out the signal • If the noise is “loud” it introduces many errors into the received signal • In a digital communications channel the noise level affects the channel capacity • “Loud” noise can be compensated for by channel coding, at the expense of lower data rate • Recall Shannon’s Channel Coding Theorem: Error rate can be made as close to zero as desired, as long as the rate at which bits are transmitted does not exceed the channel capacity
Signal to Noise Ratio • The “loudness” of the signal and of the noise is measured by their power • The key parameter is the Signal-to-Noise Ratio: SNR = S/N, where S = signal power and N = noise power • High SNR = clearer signal = higher channel capacity
Decibels • SNR is a pure number: (signal power)/(noise power) • Typically measured by its base-ten logarithm: SNR in bels = log10(S/N) • (named after Alexander Graham Bell) • So a tenfold increase in SNR raises it by one bel • One decibel (dB) = 1/10 of a bel • So 90 dB is a ratio of 10^9 = 1 billion http://docs.info.apple.com/article.html?artnum=58299
Decibels for Sounds • The loudness of sound X is the ratio of X to the softest sound S audible to humans, measured in decibels, i.e. 10 log10(X/S) • So S is a 0-decibel sound • Normal conversation: X = 10^6 S, so X = 60 dB • Rock concert: X = 10^11 S, so X = 110 dB, or even 120 dB: 1,000,000,000,000 times the power of S! Hearing loss in a few minutes • iPods reach around 115 dB -- the mechanics of earbuds matter • A sound 1/10 as loud as the softest sound humans can hear would be -10 dB, and absolute silence would be −∞
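The decibel arithmetic on these two slides, as a short sketch:

    import math

    def ratio_to_db(power_ratio):
        """Decibels are ten times the base-10 logarithm of a power ratio."""
        return 10 * math.log10(power_ratio)

    def db_to_ratio(db):
        return 10 ** (db / 10)

    print(ratio_to_db(1e6))     # normal conversation vs. threshold of hearing: 60 dB
    print(ratio_to_db(1e11))    # rock concert: 110 dB
    print(db_to_ratio(90))      # 90 dB is a power ratio of 1,000,000,000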
Other Logarithmic Scales • Richter scale for earthquakes • Richter magnitude = log10(largest horizontal displacement caused by quake) • So a magnitude-5 quake is 10x stronger than a magnitude-4 quake, etc. • Star magnitudes • Magnitude of star X = log_2.512(S/X), where S is a fixed reference brightness • So dimmer stars have higher magnitude • The base of the logarithm, 2.512, is somewhat accidental: it makes a difference of 5 magnitudes = a factor of 100 • The visual system “feels” that magnitudes 6, 5, 4, 3, 2, 1 are getting brighter in equal jumps • Brightest star = Sirius = magnitude -1.54 • Sun = -26.8
Restoration of Digital Signals • We know that a fundamental advantage of digital representation over analog is that data can be restored • E.g. if there are only two possibilities for a signal, the problem becomes recognizing which possibility the actual signal more closely resembles • [figure: noisy received pulses and the question of whether each is a 1 or a 0]
A Communications Tradeoff: Bits per Time Unit vs. Power [figure: two-level signaling (0, 1) vs. multi-level signaling (00, 01, 10, 11, …)] • If signals had four possible levels rather than two, each time slice could carry two bits of information • If the levels were closer together, the thresholding would be harder -- same noise => more errors • If the levels were the same distance apart, more power would be needed
Bandwidth, Literally • Bandwidth = the width of a frequency range • E.g. the AM band is 530 - 1700 kHz, for a bandwidth of about 1200 kHz • Within a band, signals (e.g. radio stations) have to be kept a certain distance apart to avoid interference (could not have stations at both 1030 and 1031 kHz) • More bandwidth => more “stations,” “channels,” i.e., more channel capacity • With more bandwidth it is possible to transmit more information
Bandwidth, Figuratively, = Speed • “Broadband” = uses a large frequency range • Because broadband communication channels can carry information at a high rate (i.e. have high channel capacity), any fast channel is now called “broadband” regardless of the underlying technology • Caveat emptor when buying “broadband Internet service”: the term has no standardized meaning in terms of data rates • Check the actual “bandwidth,” i.e. data rates -- often higher in one direction than the other
Signal, Noise, Bandwidth, Channel Capacity • These four are interrelated • Stronger signal (S) => higher channel capacity C • More noise (N) => lower C • More bandwidth (B) => higher C • C = B * log2(1 + S/N) (the Shannon-Hartley Theorem) • So channel capacity increases linearly with bandwidth but only logarithmically with signal-to-noise ratio
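The Shannon-Hartley formula in code, with made-up but telephone-like numbers (about 3 kHz of bandwidth and a 30 dB signal-to-noise ratio):

    import math

    def channel_capacity(bandwidth_hz, snr):
        """Shannon-Hartley: C = B * log2(1 + S/N), in bits per second."""
        return bandwidth_hz * math.log2(1 + snr)

    # ~3 kHz of bandwidth, signal power 1000x the noise power (30 dB)
    print(channel_capacity(3000, 1000))    # roughly 30,000 bits per second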
Why Binary is Best C = B * log2(1 + S/N) • Usually noise is uncontrollable • So to increase channel capacity, the engineer must increase either bandwidth or signal power • Power is a precious resource! • Using more power in a PC or cell phone => bigger battery, shorter battery life, etc. • With only two signal levels, power usage is minimized • To achieve a fixed data rate, we can use about 1000x less power if we can get 10x more bandwidth (see the check below)!
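A quick numerical check of that last claim, using the same formula (the baseline SNR of 1023, about 30 dB, is an assumed starting point):

    import math

    def channel_capacity(bandwidth_hz, snr):
        return bandwidth_hz * math.log2(1 + snr)

    baseline = channel_capacity(3000, 1023)    # 3000 * 10 = 30,000 bits/sec
    # Ten times the bandwidth reaches the same capacity with an SNR of only 1,
    # i.e. roughly 1000x less signal power against the same noise power.
    wideband = channel_capacity(30000, 1)      # 30000 * 1 = 30,000 bits/sec
    print(baseline, wideband)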
Summary of Digital Communication (of Audio) Instead of Amplitude Modulation or Frequency Modulation of the analog signal …
Use Pulse Code Modulation • Sample and quantize the analog signal (Analog-to-Digital conversion), e.g. 5 7 2 1 2 4 7 4 3 2 5 … • Turn the quantized data, as binary numerals, into a bit stream, e.g. … 0 1 1 1 0 0 0 1 … • And send digital pulses rather than the original analog signal
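A minimal end-to-end sketch of PCM in Python: sample, quantize, and emit binary numerals as a bit string. The sampling rate, 3-bit resolution (8 levels, as in the slide's 5 7 2 1 … example), and test signal are illustrative choices:

    import math

    RATE = 8000    # samples per second
    BITS = 3       # bits per sample -> 2**3 = 8 quantization levels

    def pcm_encode(signal, seconds=0.001):
        """Sample an analog signal, quantize each sample, and emit a bit stream."""
        levels = 2 ** BITS
        out = ""
        for k in range(int(RATE * seconds)):
            x = signal(k / RATE)                                   # sample
            code = max(0, min(levels - 1,
                              round((x + 1) / 2 * (levels - 1))))  # quantize
            out += format(code, "03b")                             # 3-bit numeral
        return out

    print(pcm_encode(lambda t: math.sin(2 * math.pi * 440 * t)))   # 24 bits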
Summary of Digital Radio/TV Communication • Sampling and Quantization (Analog-to-Digital conversion, A-D) • Source coding (with enough computing, could involve fancy digital signal processing) • Modulation, Transmission, Thresholding (repeated many times, once per hop, with channel coding/decoding on each hop) • Source decoding and Regeneration (D-A conversion)