Adaptive Stream-based Entropy Coding

Citation Author(s):
Eisaku Hayakawa, Koichi Marumo
Submitted by:
Shinichi Yamagiwa
Last updated:
24 April 2020 - 2:53pm
Fast data streams arise in various applications such as multimedia, communications, and sensing devices. Data volumes keep growing and transfer speeds keep increasing. To support such fast applications, a high-performance stream-based data compression mechanism is required.
Since the 1950s, lossless data compression methods have been widely used to encode data into a smaller size and decode it back to the original. Arithmetic coding is one of the traditional methods, building on Shannon's information entropy. Huffman coding is also very well known: it assigns shorter bit patterns to more frequent data patterns and thereby encodes the original data into a smaller size. Another family of methods is based on a look-up table. LZW is the typical algorithm, widely applied in software such as ZIP compression; it saves patterns that appear in the data to a look-up table. Complete implementations of these methods are inevitably software-based, because they must split the data stream into chunks, buffer those chunks in memory, and analyze pattern frequencies in the look-up table. Applications that process extremely fast data streams demand a high-performance hardware implementation of lossless data compression; however, this buffering requirement makes such an implementation difficult.
This poster presentation proposes a novel lossless data compression algorithm, called Adaptive Stream-based Entropy coding, that fully supports stream data. It compresses a continuous data stream using a look-up table without stalling or buffering, assigning the fewest bits according to the instantaneous entropy. The mechanism is well suited to hardware implementation.
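The abstract does not detail the coding algorithm itself, so as a rough, hypothetical illustration of the general principle only (symbol-by-symbol operation over a look-up table, with short codes for currently frequent symbols and no chunk buffering), here is a move-to-front ranking sketch. This is not the authors' ASE algorithm, merely a stand-in for the stream-friendly idea.

```python
def mtf_ranks(stream: bytes) -> list[int]:
    """Move-to-front sketch: each symbol's rank stands in for its code
    length, so recently repeated symbols get the smallest values."""
    table = list(range(256))  # look-up table; hot symbols migrate to front
    ranks = []
    for symbol in stream:
        rank = table.index(symbol)  # rank 0 = cheapest code
        ranks.append(rank)
        # Move the symbol to the front so an immediate repeat costs rank 0.
        table.pop(rank)
        table.insert(0, symbol)
    return ranks
```

Each symbol is encoded the moment it arrives and the table is updated in constant, bounded work, which is the property that makes this style of coding amenable to a pipelined hardware implementation.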
