
On Stochastic Codebook Generation for Markov Sources

Citation Author(s):
Ahmed Elshafiy, Kenneth Rose
Submitted by:
Ahmed Elshafiy
Last updated:
26 February 2023 - 10:01pm
Document Type:
Video Presentation
Document Year:
2023
Presenters:
Ahmed Elshafiy
Paper Code:
236

This paper proposes an effective universal "on-the-fly" mechanism for stochastic codebook generation in lossy coding of Markov sources. Earlier work has shown that the rate-distortion bound can be asymptotically achieved by a "natural type selection" (NTS) mechanism that iteratively considers asymptotically long source strings (from an unknown distribution P) and regenerates the codebook from a distribution obtained within a maximum-likelihood estimation framework, based on observation of a set of K codewords that "d-match" (i.e., satisfy the distortion constraint for) a respective set of K independently generated source words. This result was later generalized, in a straightforward manner, to account for source memory by treating the source as a vector source, i.e., a sequence of super-symbols from a corresponding super-alphabet. While ensuring asymptotic optimality, this extension suffered from a significant practical flaw: it requires asymptotically long vectors or super-symbols, and hence an exponentially large super-alphabet, in order to approach the rate-distortion bound, even for finite-memory sources, e.g., Markov sources. Such an exponentially large super-alphabet implies that even a single NTS iteration is intractable, thus compromising the promise of NTS to approach the rate-distortion function in practice for sources with memory. This work describes a considerably more efficient and tractable mechanism that achieves asymptotically optimal performance under a prescribed memory constraint, within a practical framework tailored to Markov sources. Specifically, the algorithm asymptotically finds the optimal codebook reproduction distribution within a constrained set of distributions satisfying a prescribed Markovian property (e.g., of the same order as the source), i.e., the distribution that achieves the minimum per-letter coding rate while maintaining a specified distortion level.
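The NTS iteration described in the abstract can be sketched in a deliberately simplified toy form: a binary memoryless source with Hamming distortion, and an order-0 (memoryless) reproduction family rather than the Markov-constrained family the paper actually uses. All parameter values and function names below are illustrative assumptions, not the paper's construction:

```python
import random

def nts_iteration(p_source, q_codebook, n=12, K=50, d_max=0.3, seed=0):
    """One toy NTS iteration (simplified sketch, not the paper's algorithm).

    Draws source words i.i.d. Bernoulli(p_source); for each, generates
    codewords i.i.d. Bernoulli(q_codebook) until one "d-matches" it
    (per-letter Hamming distortion <= d_max).  The K matching codewords
    are then used to re-estimate the codebook distribution by maximum
    likelihood (here simply the empirical frequency of ones; the paper
    instead constrains the estimate to a prescribed Markov family).
    """
    rng = random.Random(seed)
    draw = lambda p: [1 if rng.random() < p else 0 for _ in range(n)]
    matches = []
    while len(matches) < K:
        x = draw(p_source)
        # search codewords from the current codebook distribution for a d-match
        while True:
            y = draw(q_codebook)
            if sum(a != b for a, b in zip(x, y)) / n <= d_max:
                matches.append(y)
                break
    # ML re-estimate of the Bernoulli parameter from the d-matching codewords
    return sum(sum(y) for y in matches) / (K * n)

# Iterate: the codebook distribution drifts toward the optimal reproduction
# distribution for this source/distortion pair (here, biased toward zeros).
q = 0.5
for _ in range(3):
    q = nts_iteration(p_source=0.1, q_codebook=q)
```

Since the toy source is heavily biased toward zeros and the distortion budget is generous, the re-estimated parameter `q` shrinks well below its 0.5 starting point across iterations, illustrating how d-matching codewords pull the codebook distribution toward the rate-minimizing reproduction distribution.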
