Graph-Based Transforms based on Graph Neural Networks for Predictive Transform Coding
- Citation Author(s):
- Submitted by:
- Debaleena Roy
- Last updated:
- 1 March 2021 - 6:40pm
- Document Type:
- Poster
- Document Year:
- 2021
- Event:
- Presenters:
- Debaleena Roy
- Paper Code:
- 188
- Categories:
This paper introduces the GBT-NN, a novel class of Graph-based Transform within the context of block-based predictive transform coding using intra-prediction. The GBT-NN is constructed by learning a mapping function to map a graph Laplacian representing the covariance matrix of the current block. Our objective in learning such a mapping function is to design a GBT that performs as well as the KLT without requiring the covariance matrix to be explicitly computed for each residual block to be transformed. To avoid signalling any additional information required to compute the inverse GBT-NN, we also introduce a coding framework that uses a template-based prediction to predict residuals at the decoder. Evaluation results on several video frames and medical images, in terms of the percentage of preserved energy and mean square error, show that the GBT-NN can outperform the DST and DCT.
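To make the underlying idea concrete, here is a minimal Python sketch of the generic graph-based transform pipeline the abstract builds on: derive an orthogonal basis from the eigendecomposition of a graph Laplacian and compare preserved energy against the DCT. It is not the authors' GBT-NN; the Laplacian comes from a hypothetical 4-connected grid graph with Gaussian-weighted edges rather than from a learned mapping, and the block and parameter choices are illustrative assumptions only.

```python
# Minimal GBT sketch (assumed grid-graph weighting, not the GBT-NN mapping).
import numpy as np
from scipy.fft import dctn


def grid_laplacian(block, sigma=10.0):
    """Combinatorial graph Laplacian for an HxW block on a 4-connected grid.

    Edge weights decay with the intensity difference between neighbouring
    pixels (a common heuristic; the exact weighting is an assumption here).
    """
    h, w = block.shape
    n = h * w
    W = np.zeros((n, n))
    for y in range(h):
        for x in range(w):
            i = y * w + x
            for dy, dx in ((0, 1), (1, 0)):  # right and down neighbours
                yy, xx = y + dy, x + dx
                if yy < h and xx < w:
                    j = yy * w + xx
                    wgt = np.exp(-((block[y, x] - block[yy, xx]) ** 2) / sigma ** 2)
                    W[i, j] = W[j, i] = wgt
    D = np.diag(W.sum(axis=1))
    return D - W


def gbt_basis(L):
    """GBT basis = eigenvectors of the Laplacian, ordered by eigenvalue."""
    _, eigvecs = np.linalg.eigh(L)
    return eigvecs  # columns are the transform basis vectors


def preserved_energy(coeffs, k):
    """Fraction of signal energy kept by the k largest-magnitude coefficients."""
    mags = np.sort(np.abs(coeffs.ravel()))[::-1]
    return (mags[:k] ** 2).sum() / (mags ** 2).sum()


rng = np.random.default_rng(0)
residual = rng.normal(scale=4.0, size=(8, 8))  # toy residual block

L = grid_laplacian(residual)
U = gbt_basis(L)
gbt_coeffs = U.T @ residual.ravel()
dct_coeffs = dctn(residual, norm="ortho")

k = 16
print(f"GBT preserved energy (top {k}): {preserved_energy(gbt_coeffs, k):.3f}")
print(f"DCT preserved energy (top {k}): {preserved_energy(dct_coeffs, k):.3f}")
```

In the paper's setting, the Laplacian is instead predicted by a neural network so that the resulting basis approximates the KLT of the residual block without the encoder having to estimate and signal a covariance matrix per block.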
Comments
Thanks for watching!