798508656 at qq.com
Tue Oct 9 23:11:11 CST 2018
For the seminar tomorrow, I will introduce a language-understanding model that performs well on many NLP tasks, such as textual entailment, question answering, semantic similarity assessment, and document classification. Although large unlabeled text corpora are abundant, labeled data for learning these specific tasks is scarce, making it challenging for discriminatively trained models to perform adequately.
Improving Language Understanding by Generative Pre-Training
demonstrates that large gains on these tasks can be realized by generative pre-training of a language model on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each specific task.
To follow the talk more easily, I hope you have some prerequisite knowledge of attention and the Transformer block.
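If you want a quick refresher before the talk, the core of the Transformer block is scaled dot-product attention. Below is a minimal NumPy sketch (not the paper's code; the function name, shapes, and toy values are illustrative only):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarity, scaled
    # numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of value vectors

# toy example: 3 tokens with dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the value rows, with the mixing weights determined by how similar the corresponding query is to each key.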
See you tomorrow!
Time: 17:00, October 10