[Adapt] Improving Continual Relation Extraction through Prototypical Contrastive Learning

Xiujie Song xiujiesong at sjtu.edu.cn
Wed Dec 7 09:53:32 CST 2022


Hi Adapters

Continual relation extraction (CRE) aims to extract relations from continuously and iteratively arriving new data; its major challenge is the catastrophic forgetting of old tasks.

To alleviate this critical problem and enhance CRE performance, the paper proposes a novel Continual Relation Extraction framework with Contrastive Learning (CRECL), built from a classification network and a prototypical contrastive network to achieve class-incremental learning for CRE. Specifically, in the contrastive network a given instance is contrasted with the prototype of each candidate relation stored in the memory module. This contrastive learning scheme makes the data distributions of all tasks more distinguishable, further alleviating catastrophic forgetting.
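To make the prototype-contrasting step concrete, here is a minimal sketch of an InfoNCE-style prototypical contrastive loss: an instance embedding is pulled toward the prototype of its own relation and pushed away from the other candidate prototypes. The function name, the cosine-similarity choice, and the temperature value are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def prototypical_contrastive_loss(instance, prototypes, positive_idx, temperature=0.1):
    """InfoNCE-style loss (illustrative sketch, not the paper's code):
    pull the instance toward its relation's prototype, push it away
    from the other candidate relation prototypes."""
    # Cosine similarity between the instance embedding and every prototype.
    inst = instance / np.linalg.norm(instance)
    protos = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sims = protos @ inst / temperature
    # Softmax over candidate relations; the loss is the negative
    # log-probability assigned to the correct (positive) prototype.
    logits = sims - sims.max()  # subtract max for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[positive_idx])

# Toy usage: 3 candidate relation prototypes in a 4-d embedding space;
# the instance lies close to prototype 1, so the loss should be small.
rng = np.random.default_rng(0)
prototypes = rng.normal(size=(3, 4))
instance = prototypes[1] + 0.05 * rng.normal(size=4)
loss = prototypical_contrastive_loss(instance, prototypes, positive_idx=1)
```

Minimizing this loss over instances stored in memory is one way such a contrastive network can keep the distributions of old and new relations separated.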

In today's talk, I will introduce this paper, "Improving Continual Relation Extraction through Prototypical Contrastive Learning".

Hope you enjoy it!

Time: Wed 4:00 pm
Venue: SEIEE 3-414

Best Regards,
Xiujie Song


More information about the Adapt mailing list