HIERARCHICAL VAE BASED SEMANTIC COMMUNICATIONS FOR POMDP TASKS

DOI:
10.60864/crxv-8t61
Citation Author(s):
Wenhui Hua
Submitted by:
Dezhao Chen
Last updated:
6 June 2024 - 10:54am
Document Type:
Poster
Document Year:
2024
Presenters:
Dezhao Chen
Paper Code:
MLSP-P31.4

The Partially Observable Markov Decision Process (POMDP) is a general framework for a wide range of control tasks, which can benefit from enabling semantic communications among different agents. Semantic communications aim to exchange compact messages that convey task-relevant information between agents. A critical problem in semantic communication is source representation learning, which is governed by a fundamental tradeoff between compactness and sufficiency. This tradeoff remains under-investigated in the context of POMDPs. In this paper, we propose HVRL - Hierarchical Variational autoencoders for Reinforcement Learning. Experiments show that our method effectively balances compactness and sufficiency, learning enough information for decision-making while mitigating the risk of over-abstraction in the observation space. The approach encodes the endogenous semantic information of the observation itself and achieves good sample efficiency and control performance.
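The poster itself does not detail the network architecture, but the idea of trading off compactness against sufficiency with a hierarchical latent model can be illustrated with a generic two-level VAE. The sketch below is a minimal, assumption-laden example (the dimensions, layer sizes, and the beta-weighted ELBO are illustrative choices, not the authors' exact design): a low-level latent z1 retains fine-grained observation detail, a high-level latent z2 abstracts it further, and the KL weight beta controls how aggressively the representation is compressed.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalVAE(nn.Module):
    """Illustrative two-level hierarchical VAE.

    z1 encodes fine-grained observation detail; z2 abstracts z1 further.
    Either latent could serve as the compact semantic message an agent sends.
    """

    def __init__(self, obs_dim=64, z1_dim=16, z2_dim=4):
        super().__init__()
        # Each encoder outputs mean and log-variance for its latent level.
        self.enc1 = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(),
                                  nn.Linear(128, 2 * z1_dim))
        self.enc2 = nn.Sequential(nn.Linear(z1_dim, 64), nn.ReLU(),
                                  nn.Linear(64, 2 * z2_dim))
        # Decoder reconstructs the observation from both latent levels.
        self.dec = nn.Sequential(nn.Linear(z1_dim + z2_dim, 128), nn.ReLU(),
                                 nn.Linear(128, obs_dim))

    @staticmethod
    def reparameterize(mu, logvar):
        # z = mu + sigma * eps, eps ~ N(0, I)
        return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    def forward(self, obs):
        mu1, logvar1 = self.enc1(obs).chunk(2, dim=-1)
        z1 = self.reparameterize(mu1, logvar1)
        mu2, logvar2 = self.enc2(z1).chunk(2, dim=-1)
        z2 = self.reparameterize(mu2, logvar2)
        recon = self.dec(torch.cat([z1, z2], dim=-1))
        return recon, (mu1, logvar1), (mu2, logvar2)


def elbo_loss(obs, recon, stats1, stats2, beta=1.0):
    # Reconstruction keeps the latents sufficient for the observation;
    # the beta-weighted KL terms push them toward compactness.
    rec = F.mse_loss(recon, obs, reduction="mean")
    kl = 0.0
    for mu, logvar in (stats1, stats2):
        kl = kl + (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)).mean()
    return rec + beta * kl


if __name__ == "__main__":
    model = HierarchicalVAE()
    obs = torch.randn(32, 64)  # a batch of flattened partial observations
    recon, s1, s2 = model(obs)
    loss = elbo_loss(obs, recon, s1, s2, beta=0.5)
    loss.backward()
    print(loss.item())
```

In an RL setting such as the one the poster targets, the latents learned this way would typically feed a downstream policy or value network; raising beta yields more compact messages at the risk of the over-abstraction the authors aim to mitigate.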
