[Adapt] [Seminar] Relational Knowledge in Pretrained Language Models

Xusheng Luo freefish_6174 at 126.com
Tue Nov 17 17:11:07 CST 2020

Could you please share the papers? Thanks!

> On Nov 17, 2020, at 17:00, rsy0702 at 163.com wrote:
> Hi Adapters,
> The NLP community has been revolutionized in the past few years by the two-stage sequential transfer learning paradigm. There is no doubt that pretrained language models (PLMs) encode a considerable amount of knowledge, but what types of knowledge PLMs learn, and how much, remains an open research question. In this talk, I will focus on relational knowledge (factual world knowledge, commonsense knowledge, etc.) within PLMs, first introducing two pioneering papers and then discussing my ongoing work on this topic.
> Time: Wed 4:00pm
> Venue: SEIEE 3-414
> See you there!
> Best Regards,
> Roy
> rsy0702 at 163.com
> _______________________________________________
> Adapt mailing list
> Adapt at cs.sjtu.edu.cn
> http://cs.sjtu.edu.cn/mailman/listinfo/adapt
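For readers unfamiliar with the topic: relational knowledge in PLMs is commonly probed with cloze-style queries, where the model fills a blank in a template sentence (the approach popularized by the LAMA probe, which may be among the pioneering papers the talk covers; that is an assumption). Below is a minimal, self-contained sketch of the idea; the miniature corpus and the helper names `score_fill` and `probe` are illustrative stand-ins for a real PLM, not anything from the talk.

```python
# Toy sketch of cloze-style knowledge probing: rank candidate fillers
# for a [MASK] slot by a language-model score. A real probe would use
# a masked LM (e.g., BERT); here a trivial corpus-count "model" stands
# in so the example is self-contained and runnable.

# Miniature "pretraining" corpus the stand-in model has seen.
corpus = [
    "paris is the capital of france",
    "berlin is the capital of germany",
    "rome is the capital of italy",
    "paris is the capital of france",
]

def score_fill(template: str, candidate: str, corpus) -> int:
    """Score a candidate by how often the filled sentence occurs in the corpus."""
    filled = template.replace("[MASK]", candidate)
    return sum(1 for sent in corpus if sent == filled)

def probe(template: str, candidates, corpus) -> str:
    """Return the highest-scoring candidate for the blank."""
    return max(candidates, key=lambda c: score_fill(template, c, corpus))

answer = probe("paris is the capital of [MASK]",
               ["france", "germany", "italy"], corpus)
print(answer)  # france
```

With a real PLM the scoring step is replaced by the model's probability for the masked token, but the query format and ranking logic are the same.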
