Academic Paper

Data compression for the next-generation space telescope
Document Type
Conference
Source
Proceedings DCC'99 Data Compression Conference (Cat. No. PR00096), 1999, p. 542
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Data compression
Telescopes
Image coding
NASA
Cameras
Downlink
Pixel
Compression algorithms
Arithmetic
Telemetry
Language
ISSN
1068-0314
Abstract
Summary form only given. The next-generation space telescope (NGST) will produce about 600 GB/day, assuming we use the NASA yardstick 8k×8k NIR camera (16 bits/pixel), save and transmit 64 non-destructive read-outs per image, and keep the camera in continuous use (about 80 observations/day of 10³ s each). However, with an L2 halo orbit, the NASA NGST study estimates a downlink rate of 5.35 GB/day using X-band. Clearly, the volume of data to downlink must be reduced by at least a factor of 100. Astronomical images are noisy, which makes them difficult to compress with lossless algorithms such as Huffman, Lempel-Ziv, run-length, or arithmetic coding. However, they also have the virtue of showing similar values among adjacent pixels, which techniques such as Rice's algorithm (Rice et al., 1993) and its derivatives (White and Becker, 1998; Stiavelli and White, 1997) can exploit. We present the way in which some of these compression techniques would work with NGST images. Unfortunately, these lossless algorithms yield data volumes that still exceed the telemetry guidelines. We have also looked into the feasibility of lossy compression by scaling the original image prior to the lossless compression. Under this scheme, we find substantial data reduction with a negligible effect on data quality.
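The two ideas described above, exploiting the similarity of adjacent pixels (as Rice-style coders do) and scaling the image before lossless compression, can be sketched in a few lines. The following is an illustrative toy, not the paper's implementation: the synthetic 64×64 image, the read-noise model, the quantization step, and the use of zlib as a stand-in lossless back-end are all assumptions made for the example.

```python
import random
import zlib

random.seed(42)

# Hypothetical 64x64 16-bit "image": a smooth gradient plus uniform read noise.
W = H = 64
image = [
    min(65535, 50 * (x + y) + random.randint(0, 15))
    for y in range(H) for x in range(W)
]

def to_bytes(pixels):
    """Pack pixel values into little-endian 16-bit words (mod 2^16,
    so negative deltas wrap via two's complement)."""
    out = bytearray()
    for p in pixels:
        out += (p & 0xFFFF).to_bytes(2, "little")
    return bytes(out)

def ratio(pixels):
    """Compression ratio achieved by a generic lossless coder (zlib)."""
    raw = to_bytes(pixels)
    return len(raw) / len(zlib.compress(raw, 9))

# 1. Lossless compression of the raw pixels: the noise limits the ratio.
r_raw = ratio(image)

# 2. Delta coding: adjacent pixels are similar, so differences are small
#    and far more compressible than the raw values.
deltas = [image[0]] + [image[i] - image[i - 1] for i in range(1, len(image))]
r_delta = ratio(deltas)

# 3. Lossy scaling: integer-divide by a step near the noise level before
#    delta coding, discarding information below the noise floor.
STEP = 8  # assumed quantization step for this toy example
scaled = [p // STEP for p in image]
sdeltas = [scaled[0]] + [scaled[i] - scaled[i - 1]
                         for i in range(1, len(scaled))]
r_lossy = ratio(sdeltas)

print(r_raw, r_delta, r_lossy)
```

On this synthetic image the ratios increase at each stage: delta coding beats compressing the raw pixels, and scaling before delta coding beats both, mirroring the abstract's observation that noise defeats plain lossless coding while pixel-to-pixel similarity and sub-noise quantization recover substantial reduction.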