Bitvectors that support rank and select queries are the workhorses of succinct data structures, implementations of which are now widespread, for example, in bioinformatics software. To date, however, most bitvector implementations are static, thus forcing more complex data structures built from them to be static too. In this paper we explore dynamic bitvectors, which, in addition to rank and select queries, also support update operations, specifically: insert, remove, and modify. We first provide several practical optimizations to the recent B-tree based bitvectors of Prezza (Proc.
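As a rough illustration of the operation set (not of Prezza's B-tree layout, which packs bits into machine words and supports each operation in logarithmic time), the sketch below shows the semantics of rank, select, and the three update operations on a plain Python list; the class and method names are hypothetical.

```python
class DynamicBitvector:
    """Illustrative dynamic bitvector; every operation here is O(n),
    whereas B-tree based implementations achieve O(log n) per operation."""

    def __init__(self, bits=()):
        self.bits = list(bits)  # plain-list storage, for illustration only

    def rank1(self, i):
        """Number of 1-bits in positions [0, i)."""
        return sum(self.bits[:i])

    def select1(self, k):
        """Position of the k-th 1-bit (1-based); -1 if it does not exist."""
        count = 0
        for pos, b in enumerate(self.bits):
            count += b
            if count == k:
                return pos
        return -1

    def insert(self, i, b):
        """Insert bit b before position i."""
        self.bits.insert(i, b)

    def remove(self, i):
        """Delete the bit at position i."""
        del self.bits[i]

    def modify(self, i, b):
        """Overwrite the bit at position i with b."""
        self.bits[i] = b


bv = DynamicBitvector([1, 0, 1, 1, 0])
assert bv.rank1(3) == 2    # two 1-bits before position 3
assert bv.select1(3) == 3  # the third 1-bit sits at position 3
bv.insert(1, 1)            # bits become 1 1 0 1 1 0
assert bv.rank1(3) == 2
```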

Resampling-based coding, i.e. down-sampling before encoding and up-sampling after decoding, has been recognized as an effective tool for compressing high-resolution videos at low bitrates. The newest video coding standard, Versatile Video Coding (VVC), supports resampling-based coding via a mechanism named Reference Picture Resampling (RPR), which allows the spatial resolution to change without inserting an intra frame. Intuitively, it is not wise to use a single resolution throughout the whole video, because frames with different content may prefer different coding resolutions.
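The following minimal sketch shows only the resampling-based coding pipeline, not VVC itself: 2x block-average down-sampling, a uniform quantizer standing in for the encoder/decoder pair, and nearest-neighbour up-sampling (real RPR uses proper interpolation filters). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def downsample2x(frame):
    """Average 2x2 blocks: the pre-encoding down-sampling step."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample2x(frame):
    """Nearest-neighbour up-sampling after decoding (real codecs use
    better interpolation filters, e.g. the RPR filters in VVC)."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

def coarse_codec(frame, step=8.0):
    """Stand-in for an encoder/decoder pair: uniform quantization."""
    return np.round(frame / step) * step

frame = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)

# Full-resolution coding.
full_res = coarse_codec(frame)

# Resampling-based coding: encode at half resolution, restore afterwards.
resampled = upsample2x(coarse_codec(downsample2x(frame)))

for name, rec in (("full-res", full_res), ("resampled", resampled)):
    mse = np.mean((frame - rec) ** 2)
    print(f"{name:10s} MSE: {mse:.1f}")
```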

We consider the problem of designing space-efficient data structures for unlabelled path graphs with n vertices that support basic navigational queries such as degree, adjacency, and neighborhood queries efficiently. We provide two solutions for this problem. Our first data structure is succinct and occupies n log n + o(n log n) bits while answering adjacency queries in O(log n) time, and neighborhood and degree queries in O(d log^2 n) time, where d is the degree of the queried vertex. Our second data structure answers all these queries faster at the expense of slightly more space.

Recently, tensor decomposition approaches have been used to compress deep convolutional neural networks (CNNs), yielding faster CNNs with fewer parameters. However, tensor-decomposition-based CNN compression has two problems: first, the network is usually decomposed layer by layer, ignoring the correlation between layers; second, training and compression are performed separately, which easily leads to locally optimal ranks. In this paper, Learning Tucker Compression (LTC) is proposed.
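For context, the sketch below applies a plain, per-layer truncated HOSVD (one common way to compute a Tucker approximation) to a hypothetical 4-D convolution weight and reports the parameter reduction. It is not the proposed LTC method; the ranks, tensor sizes, and function names are arbitrary assumptions made for illustration.

```python
import numpy as np

def mode_unfold(tensor, mode):
    """Matricize: move `mode` to the front and flatten the remaining axes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_multiply(tensor, matrix, mode):
    """n-mode product: multiply `tensor` by `matrix` along `mode`."""
    moved = np.moveaxis(tensor, mode, 0)
    shape = (matrix.shape[0],) + moved.shape[1:]
    result = (matrix @ moved.reshape(moved.shape[0], -1)).reshape(shape)
    return np.moveaxis(result, 0, mode)

def tucker_hosvd(weight, ranks):
    """Truncated HOSVD: a single-pass (non-iterated) Tucker approximation."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(mode_unfold(weight, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = weight
    for mode, u in enumerate(factors):
        core = mode_multiply(core, u.T, mode)
    return core, factors

# Hypothetical conv weight: 64 output channels, 32 input channels, 3x3 kernel.
w = np.random.default_rng(0).standard_normal((64, 32, 3, 3))
core, factors = tucker_hosvd(w, ranks=(16, 8, 3, 3))

original = w.size
compressed = core.size + sum(f.size for f in factors)
print(f"parameters: {original} -> {compressed}")
```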
