Improving Medical Dialogue Generation with Abstract Meaning Representations

Citation Author(s):
Bohao Yang, Chen Tang, Chenghua Lin
Submitted by:
Chen Tang
Last updated:
15 April 2024 - 1:09pm
Document Type:
Presentation Slides
Document Year:
2024
Presenters:
Bohao Yang
Paper Code:
SLP-L11.4

Medical Dialogue Generation plays a critical role in telemedicine by facilitating the dissemination of medical expertise to patients. Existing studies focus on incorporating textual representations, which limits their ability to capture text semantics, for example by ignoring important medical entities. To enhance the model's understanding of textual semantics and of medical knowledge, including entities and relations, we introduce Abstract Meaning Representations (AMR) to construct graphical representations that delineate the roles of language constituents and medical entities within dialogues. In this paper, we propose a novel neural framework that models dialogues between patients and healthcare professionals using AMR graphs, incorporating both textual and graphical knowledge through a dual attention mechanism. Experimental results show that our framework outperforms strong baseline models in medical dialogue generation, demonstrating the effectiveness of AMR graphs in enhancing the representation of medical knowledge and logical relationships. Furthermore, to support future research in this domain, we provide the corresponding source code at https://github.com/Bernard-Yang/MedDiaAMR.
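
To illustrate the general idea of combining textual and AMR-graph encodings with a dual attention mechanism, below is a minimal PyTorch sketch. The module name DualAttentionFusion, the gated fusion of the two attention contexts, and all dimensions are illustrative assumptions for exposition only, not the paper's actual implementation; see the linked repository for the authors' code.

# Hypothetical sketch: dual attention over pre-computed text and AMR-graph
# encodings, followed by a learned fusion. Names and dimensions are assumed.
import torch
import torch.nn as nn

class DualAttentionFusion(nn.Module):
    """Attend over textual and graph (AMR) encodings separately, then fuse."""

    def __init__(self, d_model: int = 512, n_heads: int = 8):
        super().__init__()
        self.text_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.graph_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)  # combine the two contexts
        self.norm = nn.LayerNorm(d_model)

    def forward(self, decoder_states, text_enc, graph_enc):
        # decoder_states: (B, T, d); text_enc: (B, S_text, d); graph_enc: (B, S_graph, d)
        text_ctx, _ = self.text_attn(decoder_states, text_enc, text_enc)
        graph_ctx, _ = self.graph_attn(decoder_states, graph_enc, graph_enc)
        fused = self.fuse(torch.cat([text_ctx, graph_ctx], dim=-1))
        return self.norm(decoder_states + fused)  # residual connection

if __name__ == "__main__":
    layer = DualAttentionFusion()
    dec = torch.randn(2, 10, 512)   # decoder hidden states
    txt = torch.randn(2, 40, 512)   # encoded dialogue history tokens
    amr = torch.randn(2, 25, 512)   # encoded AMR graph nodes
    print(layer(dec, txt, amr).shape)  # torch.Size([2, 10, 512])

In this sketch the decoder queries the dialogue text and the AMR graph independently, so entity- and relation-level signals from the graph can complement the token-level context before generation.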
