The Data Compression Conference (DCC) is an international forum for current work on data compression and related applications. Both theoretical and experimental work are of interest.
Compressing and Randomly Accessing Sequences
In this paper we consider the problem of storing sequences of symbols in a compressed format while supporting random access to the symbols without decompression. Although this is a well-studied problem when the data is textual, the sequences we look at are not textual, and we argue that traditional compression methods used in the text-algorithms community (such as compressors targeting $k$-th order empirical entropy) do not perform as well on such sequential data, and simpler methods such
Adaptive Stream-based Entropy Coding
Fast data streams are produced by a variety of applications, such as multimedia, communication, and sensor devices. The amount of data is growing and transfer speeds keep increasing. To serve such fast applications, a high-performance stream-based data compression mechanism is needed.
Entropy Coders Based on the Splitting of Lexicographic Intervals
Denoising Deep Boltzmann Machines: Compression for Deep Learning
Decompressing Lempel-Ziv compressed text
We consider the problem of decompressing the Lempel-Ziv 77 representation of a string $S$ of length $n$ using a working space as close as possible to the size $z$ of the input. The folklore solution for the problem runs in $O(n)$ time but requires random access to the whole decompressed text. Another folklore solution is to convert LZ77 into a grammar of size $O(z\log(n/z))$ and then stream $S$ in linear time. In this paper, we show that $O(n)$ time and $O(z)$ working space can be achieved for constant-size alphabets.
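The first folklore solution mentioned above can be sketched in a few lines, assuming a simple (offset, length, literal) phrase encoding of LZ77 (the exact phrase format varies between implementations, and the paper's $O(z)$-space method is not reproduced here):

```python
def lz77_decompress(phrases):
    """Decompress a list of LZ77 phrases.

    Each phrase is (offset, length, literal): copy `length` symbols
    starting `offset` positions back in the output, then append `literal`
    (None means no trailing literal). This is the folklore O(n)-time
    method; note it keeps the whole decompressed text in memory,
    i.e. it uses Theta(n) working space rather than O(z).
    """
    out = []
    for offset, length, literal in phrases:
        start = len(out) - offset
        for i in range(length):          # copy symbol by symbol, because a
            out.append(out[start + i])   # phrase may overlap its own output
        if literal is not None:
            out.append(literal)
    return "".join(out)

# Example: "abababa" from three phrases (the last one is self-referential)
phrases = [(0, 0, "a"), (1, 0, "b"), (2, 5, None)]
print(lz77_decompress(phrases))  # -> abababa
```

The symbol-by-symbol copy (rather than a slice) is what makes self-referential phrases work: the source region may extend into output produced by the copy itself.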
Compact Representation of Graphs with Small Bandwidth and Treedepth
We consider the problem of compact representation of graphs with small bandwidth as well as graphs with small treedepth. These parameters capture structural properties of graphs that come in useful in certain applications.
Intra Prediction in the Emerging VVC Video Coding
Compressed Quadratization of Higher Order Binary Optimization Problems
Recent hardware advances in quantum and quantum-inspired annealers promise substantial speedup for solving NP-hard combinatorial optimization problems compared to general-purpose computers. These special-purpose devices are built for solving hard instances of Quadratic Unconstrained Binary Optimization (QUBO) problems. In terms of the number of variables and precision, these devices are usually resource-constrained, and they work either in Ising space $\{-1,+1\}$ or in Boolean space $\{0,1\}$. Many naturally occurring problem instances are higher-order in nature.
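Quadratization means rewriting a higher-order Boolean polynomial as a quadratic one over an extended variable set, so it fits the QUBO form such hardware accepts. The paper's compressed method is not reproduced here; as a baseline illustration, the classic Rosenberg reduction replaces a product $xy$ by an auxiliary variable $w$ plus a penalty that vanishes exactly when $w = xy$. A minimal sketch in Boolean space $\{0,1\}$ (the penalty weight `M=2` is an assumption; any sufficiently large weight works):

```python
from itertools import product

def rosenberg_penalty(x, y, w):
    # Classic Rosenberg penalty: equals 0 iff w == x*y, and >= 1 otherwise.
    return x*y - 2*w*(x + y) + 3*w

def cubic(x, y, z):
    return x * y * z  # a higher-order (cubic) term to be quadratized

def quadratized(x, y, z, w, M=2):
    # Substitute x*y -> w, and add M times the penalty enforcing w == x*y.
    # The result w*z + M*penalty is quadratic in the four variables.
    return w * z + M * rosenberg_penalty(x, y, w)

# Minimizing over the auxiliary variable recovers the cubic term exactly.
for x, y, z in product((0, 1), repeat=3):
    assert min(quadratized(x, y, z, w) for w in (0, 1)) == cubic(x, y, z)
print("quadratization verified for all 8 assignments")
```

Each substitution costs one auxiliary variable, which is precisely the resource the abstract identifies as scarce on annealing hardware; reducing that overhead is the point of compressed quadratization.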
Sibling Neural Estimators: Improving Iterative Image Decoding with Gradient Communication
For lossy image compression, we develop a neural-based system which learns a nonlinear estimator for decoding from quantized representations. The system links two recurrent networks that "help" each other reconstruct the same target image patches using complementary portions of spatial context, communicating via gradient signals. This dual-agent system builds upon prior work that proposed the iterative refinement algorithm for recurrent neural network (RNN)-based decoding, which improved image reconstruction compared to standard decoding techniques.
Efficient Storage of Images onto DNA Using Vector Quantization
The archiving of digital data is becoming very challenging, as conventional electronic devices wear out over time, putting at risk any data stored on them; data migration is therefore necessary every 5-10 years. A great percentage of this stored data is "cold": it is very rarely accessed but needs to be safely stored on back-up drives for security and compliance reasons. Unfortunately, the maintenance and replacement of back-up tape drives in big data centers is very expensive, both in terms of money and energy.