ECE 4371, Fall 2013: Introduction to Telecommunication Engineering/Telecommunication Laboratory. Zhu Han, Department of Electrical and Computer Engineering. Class 20, Nov. 11th, 2013
Outline • MIMO/space-time coding • Trellis coded modulation • BICM • Multimedia transmission
MIMO • Model: y(t) = H x(t) + w(t), where t is the time index, H is the channel matrix, and w(t) is the noise
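To make the model concrete, here is a minimal numpy sketch of one channel use, y = Hx + w. The 2x2 antenna configuration, QPSK symbols, and noise level are illustrative assumptions, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
nt, nr = 2, 2                      # transmit / receive antennas (assumed 2x2)

# QPSK symbols sent from the transmit antennas at one time index t
x = (rng.choice([1, -1], nt) + 1j * rng.choice([1, -1], nt)) / np.sqrt(2)

# Flat Rayleigh-fading channel matrix H and additive white Gaussian noise w
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
w = 0.1 * (rng.standard_normal(nr) + 1j * rng.standard_normal(nr)) / np.sqrt(2)

# Received vector at time t: y = H x + w
y = H @ x + w
print(y)
```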
Alamouti Space-Time Code • Transmitted signals are orthogonal => Simplified receiver • Redundancy in time and space => Diversity • Same diversity gain as maximum ratio combining => Smaller terminals
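A minimal sketch of Alamouti encoding over two time slots and the linear combining at a single receive antenna. The channel realization and noise level are illustrative assumptions; the code matrix and combining rules are the standard Alamouti scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two QPSK symbols to send over two time slots from two antennas
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)

# Alamouti code matrix: rows = time slots, columns = antennas
#   t1: [ s1,  s2 ]
#   t2: [-s2*, s1*]
X = np.array([[s1, s2], [-np.conj(s2), np.conj(s1)]])

# Flat channel from the two transmit antennas to one receive antenna
h = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)
n = 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

# Received samples in the two time slots
r = X @ h + n                      # r[0] at t1, r[1] at t2

# Linear (maximum-ratio-like) combining separates the two symbols
s1_hat = np.conj(h[0]) * r[0] + h[1] * np.conj(r[1])
s2_hat = np.conj(h[1]) * r[0] - h[0] * np.conj(r[1])
gain = np.sum(np.abs(h) ** 2)
print(s1_hat / gain, s2_hat / gain)   # close to s1, s2
```

Because the decoupled estimates carry the combined channel gain |h1|^2 + |h2|^2, the receiver gets the same diversity order as maximum ratio combining with a simple linear detector.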
Space-Time Code Performance • Block diagram: data in → constellation mapper (block of K symbols) → STBC (block of T symbols) → nt transmit antennas • K input symbols, T output symbols • R = K/T is the code rate • If R = 1 the STBC has full rate • If T = nt the code has minimum delay • Detector is linear!
BLAST • Bell Labs Layered Space-Time architecture • Layering: in V-BLAST each substream stays on its own antenna over time; in D-BLAST the substreams are cycled diagonally across antennas and time slots • V-BLAST implemented in 1998 by Bell Labs (40 bps/Hz) • Steps for V-BLAST detection (see the sketch below): Ordering: choosing the best channel • Nulling: using ZF or MMSE • Slicing: making a symbol decision • Canceling: subtracting the detected symbol • Iteration: going to the first step to detect the next symbol
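A compact sketch of the V-BLAST detection loop (ordering, nulling, slicing, canceling, iterating), written here with zero-forcing nulling. The 2x2 channel, QPSK alphabet, and noise level are illustrative assumptions.

```python
import numpy as np

def vblast_zf_sic(y, H, constellation):
    """V-BLAST detection sketch: ordering, ZF nulling, slicing, canceling, iterating."""
    x_hat = np.zeros(H.shape[1], dtype=complex)
    remaining = list(range(H.shape[1]))
    while remaining:
        G = np.linalg.pinv(H[:, remaining])                   # ZF nulling matrix
        k = int(np.argmin(np.sum(np.abs(G) ** 2, axis=1)))    # ordering: best post-ZF SNR
        z = G[k] @ y                                          # nulling
        s = constellation[np.argmin(np.abs(constellation - z))]  # slicing
        idx = remaining[k]
        x_hat[idx] = s
        y = y - H[:, idx] * s                                 # canceling
        remaining.pop(k)                                      # iterate on the rest
    return x_hat

# Example: 2x2 channel, QPSK alphabet (illustrative values)
rng = np.random.default_rng(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x = qpsk[rng.integers(0, 4, 2)]
y = H @ x + 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
print(vblast_zf_sic(y, H, qpsk), x)
```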
Trellis Coded Modulation • Combines encoding and modulation (using Euclidean distance only) • Allows parallel transitions in the trellis • Significant coding gain (3–4 dB) without bandwidth expansion • Same complexity (same amount of computation, same decoding time, and same amount of memory) • Great potential for fading channels • Widely used in modems
Set Partitioning • Branches diverging from the same state must have the largest distance • Branches merging into the same state must have the largest distance • Codes should be designed to maximize the length of the shortest error event path for fading channels (equivalent to maximizing diversity) • By satisfying the first two criteria, the coding gain can be increased (a small numerical illustration follows below)
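As a small illustration of set partitioning, the sketch below splits an 8-PSK constellation Ungerboeck-style and shows that the minimum intra-subset Euclidean distance grows at each partition level. The 8-PSK constellation and the every-other-point partition are the standard textbook example, assumed here.

```python
import numpy as np

# 8-PSK constellation (unit energy). At each partition level, keep every
# other point of the parent subset, which increases the minimum
# intra-subset Euclidean distance.
pts = np.exp(1j * 2 * np.pi * np.arange(8) / 8)

def min_distance(subset):
    d = [abs(a - b) for i, a in enumerate(subset) for b in subset[i + 1:]]
    return min(d) if d else float("inf")

level0 = list(pts)            # all 8 points
level1 = list(pts[::2])       # one of the two 4-point subsets
level2 = list(pts[::4])       # one of the four 2-point subsets

for name, s in [("8-PSK", level0), ("4-point subset", level1), ("2-point subset", level2)]:
    print(f"{name}: min distance = {min_distance(s):.3f}")   # 0.765 -> 1.414 -> 2.000
```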
Coding Gain • About 3 dB
Bit-Interleaved Coded Modulation • Coded bits are interleaved prior to modulation • Performance of this scheme is quite desirable • Relatively simple (from a complexity standpoint) to implement • Transmitter chain: binary encoder → bitwise interleaver → M-ary modulator → channel; receiver chain: soft demodulator → bitwise deinterleaver → soft decoder
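A toy sketch of the BICM transmitter chain: a rate-1/2 convolutional-style encoder, a random bitwise interleaver, and Gray-mapped QPSK. The generator polynomials, interleaver, and mapping are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def conv_encode(bits):
    """Rate-1/2 convolutional-style encoder with generators 1+D+D^2 and 1+D^2."""
    s1 = s2 = 0
    out = []
    for b in bits:
        out += [b ^ s1 ^ s2, b ^ s2]
        s1, s2 = b, s1
    return np.array(out)

def qpsk_map(bits):
    """Gray-mapped QPSK: two bits per symbol."""
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

info = rng.integers(0, 2, 64)          # information bits
coded = conv_encode(info)              # binary encoder
perm = rng.permutation(coded.size)     # bitwise interleaver
symbols = qpsk_map(coded[perm])        # M-ary modulator
print(symbols[:4])
```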
BICM Performance • [Plot: minimum Eb/N0 (dB) versus code rate R, comparing CM and BICM over the AWGN channel with noncoherent detection, for modulation alphabet sizes M = 2, 4, 16, 64]
Video Standard • Two camps: H.261, H.263, H.264; MPEG-1 (VCD), MPEG-2 (DVD), MPEG-4 • Spatial redundancy: JPEG • Intraframe compression • DCT compression + Huffman coding • Temporal redundancy • Interframe compression • Motion estimation
Discrete Cosine Transform (DCT) • Example 8×8 pixel block (0 = black, 255 = white):
120 108  90  75  69  73  82  89
127 115  97  81  75  79  88  95
134 122 105  89  83  87  96 103
137 125 107  92  86  90  99 106
131 119 101  86  80  83  93 100
117 105  87  72  65  69  78  85
100  88  70  55  49  53  62  69
 89  77  59  44  38  42  51  58
DCT and Huffman Coding • DCT coefficients of the example block (the energy is concentrated in a few low-frequency coefficients; the long runs of zeros compress well with run-length and Huffman coding):
700  90 100   0   0   0   0   0
 90   0   0   0   0   0   0   0
-89   0   0   0   0   0   0   0
  0   0   0   0   0   0   0   0
  0   0   0   0   0   0   0   0
  0   0   0   0   0   0   0   0
  0   0   0   0   0   0   0   0
  0   0   0   0   0   0   0   0
Using DCT in JPEG • DCT on 8x8 blocks
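A minimal sketch of the 8×8 forward and inverse DCT as used in JPEG, using scipy's type-II DCT with orthonormal scaling. The input block and the level shift by 128 are illustrative of JPEG practice, not taken from the slides.

```python
import numpy as np
from scipy.fft import dctn, idctn

# 2-D DCT of an 8x8 block, as used in JPEG. JPEG level-shifts pixels by -128
# before the forward transform; the block below is a smooth test pattern.
block = np.full((8, 8), 128.0)
block += 20 * np.cos(np.pi * np.arange(8) / 8)[None, :]   # horizontal gradient

coeffs = dctn(block - 128.0, norm="ortho")                # forward 8x8 DCT
reconstructed = idctn(coeffs, norm="ortho") + 128.0       # inverse DCT

print(np.round(coeffs[:3, :3], 1))     # energy concentrates at low frequencies
print(np.allclose(block, reconstructed))  # lossless until quantization is applied
```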
Quantization and Coding • Zonal coding: coefficients outside the zone mask are zeroed • The coefficients outside the zone may contain significant energy • Local variations are not reconstructed properly
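A sketch of a zonal mask applied to an 8×8 coefficient block. The triangular zone (u + v < 3) and the placeholder coefficients are assumed examples, not taken from the slides.

```python
import numpy as np

def zonal_mask(size=8, limit=3):
    """Keep only the low-frequency zone where u + v < limit (assumed zone shape)."""
    u, v = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
    return (u + v) < limit

coeffs = np.arange(64, dtype=float).reshape(8, 8)   # placeholder DCT coefficients
mask = zonal_mask()
zoned = np.where(mask, coeffs, 0.0)                 # coefficients outside the zone are zeroed
print(int(mask.sum()), "coefficients kept out of 64")
```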
Motion Compensation • I-frame: independently reconstructed • P-frame: forward predicted from the last I-frame or P-frame • B-frame: forward and backward predicted from the last/next I-frame or P-frame • Transmitted as: I P B B B P B B B
Motion Compensation Approach (cont.) • Motion vectors • A static background is only a special case; in general we must consider the displacement of each block • The motion vector tells the decoder exactly where in the previous image to get the data • The motion vector is zero for a static background (see the block-matching sketch below)
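A sketch of exhaustive block-matching motion estimation: find the displacement that minimizes the sum of absolute differences (SAD) between the current block and a displaced block in the reference frame. The block size, search range, and SAD criterion are common choices assumed here.

```python
import numpy as np

def block_match(ref, cur, top, left, block=16, search=7):
    """Return the motion vector (dy, dx) minimizing the SAD within the search range."""
    target = cur[top:top + block, left:left + block].astype(int)
    best, best_mv = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue
            sad = np.abs(ref[y:y + block, x:x + block].astype(int) - target).sum()
            if sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv

# For a static background the best motion vector comes out as (0, 0)
rng = np.random.default_rng(4)
frame = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(block_match(frame, frame, 16, 16))
```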
Motion estimation for different frames • [Figure: frame Y is predicted from blocks available in the earlier frame X (forward prediction) and in the later frame Z (backward prediction)]
A typical group of pictures in display order: I B B B P B B B P B B B P (frames 1–13) • The same group of pictures in coding order: I P B B B P B B B P B B B (frames 1, 5, 2, 3, 4, 9, 6, 7, 8, 13, 10, 11, 12)
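A short sketch that reorders a GOP from display order to coding order, assuming each B-frame is emitted right after the later reference frame (I or P) it depends on; it reproduces the frame numbering above.

```python
def coding_order(display):
    """Reorder frame types from display order to coding order (assumed B-frame rule)."""
    out, pending_b = [], []
    for idx, t in enumerate(display, start=1):
        if t == "B":
            pending_b.append((t, idx))       # hold B-frames until their reference arrives
        else:
            out.append((t, idx))             # emit I/P frame first
            out.extend(pending_b)            # then the B-frames that depend on it
            pending_b = []
    return out + pending_b

display = list("IBBBPBBBPBBBP")              # display order, frames 1..13
print(coding_order(display))                 # frames 1, 5, 2, 3, 4, 9, 6, 7, 8, 13, 10, 11, 12
```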
Coding of Macroblock • [Figure: spatial sampling relationship for MPEG-1, showing luminance samples and color-difference samples; a macroblock holds four luminance (Y) blocks numbered 0–3 plus one CB block (4) and one CR block (5)]
A Simplified MPEG Encoder • [Block diagram with: frame reorder, motion predictor with reference frame and motion vectors, prediction encoder, DCT, quantizer with DC prediction, variable-length coder, and transmit buffer; a rate controller sets the quantizer scale factor from the buffer fullness, and a dequantize / inverse-DCT loop reconstructs the reference frame]
MPEG Standards • MPEG stands for the Moving Picture Experts Group, an ISO/IEC working group established in 1988 to develop standards for digital audio and video formats. Several MPEG standards are in use or in development; each compression standard was designed with a specific application and bit rate in mind, although MPEG compression scales well with increased bit rates. They include: • MPEG-1 • MPEG-2 • MPEG-4 • MPEG-7 • MPEG-21 • MP3 (MPEG-1/2 Audio Layer III)
MPEG Standards • MPEG-1: Designed for up to 1.5 Mbit/s. The standard for the compression of moving pictures and audio. It was based on CD-ROM video applications and is a popular standard for video on the Internet, transmitted as .mpg files. In addition, Layer 3 of MPEG-1 audio is the most popular standard for digital audio compression, known as MP3. MPEG-1 is the compression standard for VideoCD, the most popular video distribution format throughout much of Asia. • MPEG-2: Designed for between 1.5 and 15 Mbit/s. The standard on which digital television set-top boxes and DVD compression are based. It builds on MPEG-1 but is designed for the compression and transmission of digital broadcast television. The most significant enhancement over MPEG-1 is its ability to efficiently compress interlaced video. MPEG-2 scales well to HDTV resolutions and bit rates, obviating the need for an MPEG-3. • MPEG-4: The standard for multimedia and Web compression. MPEG-4 uses object-based compression, similar in nature to the Virtual Reality Modeling Language. Individual objects within a scene are tracked separately and compressed together to create an MPEG-4 file. This results in very efficient compression that is very scalable, from low bit rates to very high ones. It also allows developers to control objects independently in a scene, and therefore to introduce interactivity. • MPEG-7: This standard, currently under development, is also called the Multimedia Content Description Interface. When released, the group hopes the standard will provide a framework for multimedia content that includes information on content manipulation, filtering, and personalization, as well as the integrity and security of the content. Contrary to the previous MPEG standards, which describe actual content, MPEG-7 will represent information about the content. • MPEG-21: Work on this standard, also called the Multimedia Framework, has just begun. MPEG-21 will attempt to describe the elements needed to build an infrastructure for the delivery and consumption of multimedia content, and how they relate to each other.
JPEG • JPEG stands for the Joint Photographic Experts Group, also an ISO/IEC working group, which develops standards for continuous-tone image coding. JPEG is a lossy compression technique for full-color or gray-scale images; it exploits the fact that the human eye does not notice small color changes. • JPEG 2000 provides an image coding system using compression techniques based on wavelet technology.
DV • DV is a high-resolution digital video format used with video cameras and camcorders. The standard uses the DCT to compress the pixel data and is a form of lossy compression. The resulting video stream is transferred from the recording device via FireWire (IEEE 1394), a high-speed serial bus capable of transferring data at up to 50 MB/s. • H.261 is an ITU standard designed for two-way communication over ISDN lines (video conferencing) and supports data rates that are multiples of 64 kbit/s. The algorithm is based on the DCT, can be implemented in hardware or software, and uses intraframe and interframe compression. H.261 supports CIF and QCIF resolutions. • H.263 is based on H.261, with enhancements that improve video quality over modems. It supports CIF, QCIF, SQCIF, 4CIF, and 16CIF resolutions. • H.264 (MPEG-4 Part 10 / AVC) is a more recent joint ITU-T/ISO standard that further improves compression efficiency over H.263.
HDTV: 4–7 Mbps • 4KTV: 25–27 Mbps