AssocKD: An Association-Aware Knowledge Distillation Method for Document-Level Event Argument Extraction
Published in: Mathematics (Basel), Vol. 12, No. 18, p. 2901
Main Authors: , , ,
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-09-2024
Summary: Event argument extraction is a crucial subtask of event extraction, which aims at extracting the arguments that fill argument roles for a given event type. Most current document-level event argument extraction work extracts information for only one event at a time without considering the associations among events; this is known as document-level single-event extraction. However, the interrelationships among arguments can yield mutual gains in their extraction. Therefore, in this paper, we propose AssocKD, an Association-aware Knowledge Distillation Method for Document-level Event Argument Extraction, which enhances document-level multi-event extraction with event association knowledge. Firstly, we introduce an association-aware training task that extracts unknown arguments given privileged knowledge of relevant arguments, yielding an association-aware model that captures both intra-event and inter-event relationships. Secondly, we adopt multi-teacher knowledge distillation to transfer this event association knowledge from the association-aware teacher models to the event argument extraction student model. Our proposed method, AssocKD, explicitly models and efficiently leverages event association to enhance the extraction of multi-event arguments at the document level. We conduct experiments on the RAMS and WIKIEVENTS datasets and observe significant improvements, demonstrating the effectiveness of our method.
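The abstract does not give the paper's exact distillation objective, but the multi-teacher setup it describes can be illustrated with a generic softened-distribution loss: each teacher's logits are converted to a temperature-scaled distribution, the teacher distributions are averaged, and the student is penalized by the KL divergence from that average. The function and parameter names below are illustrative assumptions, not the authors' implementation; a minimal dependency-free sketch:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def multi_teacher_kd_loss(student_logits, teacher_logits_list, temperature=2.0):
    """KL(avg_teacher || student) with softened distributions.

    student_logits: per-class scores from the student model.
    teacher_logits_list: one logit list per association-aware teacher.
    Higher temperature flattens the distributions, exposing the
    teachers' relative preferences among argument-role labels.
    """
    student_p = softmax(student_logits, temperature)
    teacher_ps = [softmax(t, temperature) for t in teacher_logits_list]
    k = len(teacher_ps)
    # Average the teachers' softened distributions class-by-class.
    avg_teacher = [sum(p[i] for p in teacher_ps) / k
                   for i in range(len(student_logits))]
    # KL divergence of the student from the averaged teacher.
    return sum(q * math.log(q / p)
               for q, p in zip(avg_teacher, student_p) if q > 0)
```

In a full system this term would be combined with the usual supervised cross-entropy on gold argument labels; the weighting between the two is a training hyperparameter not specified in the abstract.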
ISSN: 2227-7390
DOI: 10.3390/math12182901