[Adapt] [Seminar] Relational Knowledge in Pretrained Language Models

rsy0702 at 163.com
Tue Nov 17 17:00:43 CST 2020

Hi Adapters,

The NLP community has been revolutionized in the past few years by the two-stage sequential transfer learning paradigm. There is no doubt that pretrained language models (PLMs) encode a considerable amount of knowledge, but what types of knowledge PLMs learn, and how much, remains an open research question. In this talk, I will focus on relational knowledge (factual world knowledge, commonsense knowledge, etc.) within PLMs, first introducing two pioneering papers and then discussing my ongoing work on this topic.
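As a small illustration of the kind of probing the pioneering work in this area uses, relational knowledge can be queried from a masked language model by converting (subject, relation, object) triples into cloze-style statements and asking the model to fill the blank. The sketch below shows only the triple-to-cloze conversion step; the templates and relation names are illustrative assumptions, not taken from the talk or any specific dataset.

```python
# Sketch: turning relational triples into cloze-style probes for a masked LM.
# The relation templates below are illustrative assumptions.

TEMPLATES = {
    "capital_of": "The capital of [X] is [Y].",
    "born_in": "[X] was born in [Y].",
}

def to_cloze(subject: str, relation: str, mask_token: str = "[MASK]") -> str:
    """Fill the subject slot [X] and replace the object slot [Y] with the
    model's mask token; the masked LM's top prediction for the mask is then
    read off as the 'answer' it has stored for this fact."""
    template = TEMPLATES[relation]
    return template.replace("[X]", subject).replace("[Y]", mask_token)

print(to_cloze("France", "capital_of"))  # The capital of France is [MASK].
print(to_cloze("Marie Curie", "born_in"))  # Marie Curie was born in [MASK].
```

In practice, each cloze string would be fed to a masked LM (e.g. via a fill-mask interface), and the model's ranked predictions for the mask position are compared against the gold object to measure how much relational knowledge the PLM has captured.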

Time: Wed 4:00pm
Venue: SEIEE 3-414

See you there!
Best Regards,

rsy0702 at 163.com
