Abstract
This paper proposes a simple multiple description video coding approach based on zero padding. It relies entirely on pre- and post-processing and requires no modifications to the source codec. Redundancy is added by padding zeros in the DCT domain, which interpolates the original frame and increases the correlation between pixels. Methods based on both 1D and 2D DCT are presented. We also investigate two sub-sampling methods, interleaved and quincunx, for generating the multiple descriptions. Results are presented for both zero padding approaches using H.264; they show that the 1D approach performs much better than the 2D technique, at much lower computational complexity. For 1D zero padding, the results show that interleaved sub-sampling outperforms quincunx.
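To illustrate the interpolation property the abstract describes, here is a minimal sketch (not the authors' implementation) of 1D DCT-domain zero padding using SciPy; the function name, the upsampling factor, and the orthonormal-scaling choice are assumptions made for this example:

```python
import numpy as np
from scipy.fft import dct, idct

def dct_zero_pad_interpolate(x, factor=2):
    """Upsample a 1-D signal by appending zeros to its DCT coefficients.

    Zero padding in the DCT domain lengthens the inverse transform,
    which interpolates the signal and raises the correlation between
    neighbouring samples -- the redundancy the abstract refers to.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    coeffs = np.zeros(factor * n)
    coeffs[:n] = dct(x, norm='ortho')
    # sqrt(factor) compensates for the orthonormal IDCT of the longer length,
    # so a constant signal keeps its amplitude after interpolation
    return idct(coeffs, norm='ortho') * np.sqrt(factor)

# Interpolate an 8-sample signal to 16 samples
y = dct_zero_pad_interpolate(np.ones(8))
```

Two descriptions could then be formed from the interpolated signal by interleaved sub-sampling, e.g. the even samples `y[0::2]` and the odd samples `y[1::2]`.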
Translated title of the contribution | Multiple description video coding based on zero padding |
---|---|
Original language | English |
Title of host publication | 2004 IEEE International Symposium on Circuits and Systems (ISCAS '04) Vancouver, BC, Canada |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Pages | 205 - 208 |
Number of pages | 4 |
Volume | 2 |
ISBN (Print) | 078038251X |
DOIs | |
Publication status | Published - May 2004 |
Event | International Symposium on Circuits and Systems - Vancouver, BC, Canada. Duration: 1 May 2004 → … |
Conference
Conference | International Symposium on Circuits and Systems |
---|---|
Country/Territory | Canada |
City | Vancouver, BC |
Period | 1/05/04 → … |
Bibliographical note
Rose publication type: Conference contribution
Terms of use: Copyright © 2004 IEEE. Reprinted from International Symposium on Circuits and Systems, 2004 (ISCAS '04).
This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of the University of Bristol's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to [email protected].
By choosing to view this document, you agree to all provisions of the copyright laws protecting it.