LeCo: Lightweight Compression via Learning Serial Correlations
Top 10% of 2024 papers
Abstract
Lightweight data compression is a key technique that allows column stores to exhibit superior performance on analytical queries. Despite comprehensive studies of dictionary-based encodings that approach Shannon's entropy, few prior works have systematically exploited the serial correlation within a column for compression. In this paper, we propose LeCo (i.e., Learned Compression), a framework that uses machine learning to automatically remove the serial redundancy in a value sequence, achieving an outstanding compression ratio and decompression performance. LeCo presents a general approach to this end, making existing algorithms such as Frame-of-Reference (FOR), Delta Encoding, and Run-Length Encoding (RLE) special cases under our framework. Our microbenchmark on three synthetic and eight real-world data sets shows that a prototype of LeCo achieves a Pareto improvement in both compression ratio and random-access speed over existing solutions. When integrating LeCo into widely used applications, we observe up to a 5.2× speed-up for an analytical query in the Arrow columnar execution engine and a 16% increase in RocksDB's throughput.
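The core idea described in the abstract — fit a model to a serially correlated column and store only the model parameters plus small residuals — can be sketched as follows. This is a minimal illustration under assumed names (`compress`, `access`), not LeCo's actual interface; the real system chooses among model classes and partitions the column, while this sketch fits a single least-squares line.

```python
def compress(values):
    """Fit a linear model to the sequence; keep exact integer residuals.

    The residuals are typically much smaller than the raw values, so they
    can be bit-packed with far fewer bits per entry (omitted here).
    """
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / var_x
             if var_x else 0.0)
    intercept = mean_y - slope * mean_x
    residuals = [y - round(slope * x + intercept) for x, y in zip(xs, values)]
    return slope, intercept, residuals

def access(model, i):
    """Random access: one model prediction plus one residual lookup."""
    slope, intercept, residuals = model
    return round(slope * i + intercept) + residuals[i]

data = [100, 103, 107, 109, 113, 116]
m = compress(data)
assert all(access(m, i) == v for i, v in enumerate(data))
```

With the slope fixed to zero this degenerates to Frame-of-Reference (residuals are offsets from a base value), which is one way to see FOR as a special case of the learned framework; decompression of a single value needs no sequential scan, matching the random-access advantage the abstract highlights.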
Related Papers
- A Randomly Accessible Lossless Compression Scheme for Time-Series Data (2020)
- The Methods of Improving the Compression Ratio of LZ77 Family Data Compression Algorithms (2002)
- Data Compression Method for Spectral Information: Two-True-Value Linear Prediction (1992)
- Iterative Data Compression with Calculated Codes (2023)