[Adapt] [Seminar] Self-supervised pre-training targets for dialogue understanding (II)

Jia Qi (贾琪) jia_qi_0217 at 163.com
Tue Apr 6 19:35:29 CST 2021


Hi Adapters,


In this seminar, I'll continue my talk on self-supervised pre-training tasks that enhance dialogue context modeling.


First, I will give a brief introduction to ELECTRA, a pre-trained model published at ICLR 2020 that aims to improve on the masked language modeling objective used for pre-training. Then, I'll show how pre-trained language models perform on response selection when faced with an adversarial test set. Finally, I'll discuss a paper accepted at AAAI 2021 that proposes three self-supervised pre-training tasks to improve the robustness of pre-trained models in dialogue context modeling.
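For those unfamiliar with ELECTRA, its core idea is to replace masked language modeling with replaced-token detection: a small generator corrupts some input tokens, and the discriminator is trained to label every token as original or replaced. The sketch below is my own minimal illustration of how such training labels are produced (random substitutions stand in for generator samples); it is not code from any of the papers.

```python
# Minimal sketch of ELECTRA-style replaced-token detection labels.
# A random substitute stands in for the generator's sampled token;
# the discriminator's job is to predict the 0/1 label for each position.
import random


def corrupt(tokens, vocab, replace_prob=0.15, seed=0):
    """Replace ~replace_prob of the tokens with a different vocab word.

    Returns the corrupted sequence and per-token labels:
    0 = original token kept, 1 = token was replaced.
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            substitute = rng.choice([w for w in vocab if w != tok])
            corrupted.append(substitute)
            labels.append(1)  # replaced: discriminator should flag it
        else:
            corrupted.append(tok)
            labels.append(0)  # original: discriminator should pass it
    return corrupted, labels


tokens = ["the", "chef", "cooked", "the", "meal"]
vocab = ["the", "chef", "cooked", "meal", "ate", "artist"]
corrupted, labels = corrupt(tokens, vocab)
```

Because every token gets a label (not just the ~15% that are masked in standard MLM), the discriminator receives a training signal from all positions, which is the sample-efficiency argument made in the ELECTRA paper.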



Time: Wed 4:30pm

Venue: SEIEE 3-414

Best regards,
Angel

