[Adapt] [Seminar] Commonsense Machine Comprehension

Brandon 2275135452 at qq.com
Wed Dec 4 12:31:58 CST 2019

Hello Adapters,

Today I will talk about a paper: "Unified Language Model Pre-training for Natural Language Understanding and Generation". BERT reaches SOTA on many tasks, but it is mainly used for natural language understanding (NLU). Because of its bidirectional nature, it is difficult to apply BERT to natural language generation (NLG). In this paper, the authors propose a new pre-training method whose pre-trained model can be used directly for both NLU and NLG tasks. I think it is an alternative baseline for NLG tasks. Hope this helps if you are working on NLG.
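The core idea of the paper is that one shared Transformer can cover bidirectional (BERT-style), left-to-right (GPT-style), and sequence-to-sequence modeling just by switching the self-attention mask. Here is a minimal sketch of those three masks in numpy; the function names and the use of 0/1 integer masks are my own illustrative choices, not the paper's code.

```python
import numpy as np

def bidirectional_mask(n):
    # Bidirectional LM: every token attends to every token (BERT-style).
    return np.ones((n, n), dtype=int)

def left_to_right_mask(n):
    # Unidirectional LM: each token attends only to itself and earlier
    # tokens (lower-triangular mask, GPT-style).
    return np.tril(np.ones((n, n), dtype=int))

def seq2seq_mask(src_len, tgt_len):
    # Seq2seq LM: source tokens attend bidirectionally within the source;
    # target tokens attend to the whole source plus preceding target tokens.
    n = src_len + tgt_len
    mask = np.zeros((n, n), dtype=int)
    mask[:, :src_len] = 1  # every token can see the full source
    mask[src_len:, src_len:] = np.tril(np.ones((tgt_len, tgt_len), dtype=int))
    return mask

# Example: a 2-token source followed by a 3-token target.
m = seq2seq_mask(2, 3)
print(m)
```

During pre-training the model alternates between these masks with shared parameters, which is why the same checkpoint works for both NLU and NLG fine-tuning.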

Time: Wed 4:30pm
Venue: SEIEE 3-414

See you there!

Best Regards,
