[Adapt] [Seminar] Deep Contextualized Word Representations
张盛瑶
sophie_zhang at sjtu.edu.cn
Tue Dec 11 23:01:14 CST 2018
Hi Adapters:
For seminar tomorrow, I will introduce a word embedding method: ELMo. It is a deep contextualized word representation in which word vectors are learned functions of the internal states of a deep bidirectional language model. It has a simple architecture, but when the representations are added to existing models they significantly improve the state of the art across several challenging NLP tasks. You can find more details in this paper: https://arxiv.org/pdf/1802.05365.pdf
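To give a flavor of the idea before the talk: the paper collapses the biLM's per-layer hidden states for a token into a single vector via a softmax-normalized weighted sum scaled by a scalar gamma, where the layer weights and gamma are learned per task. Here is a minimal NumPy sketch of just that combination step (the layer states, weights, and function name below are illustrative placeholders, not code from the paper):

```python
import numpy as np

def elmo_combine(layer_states, s, gamma):
    """Weighted collapse of biLM layer states for one token:
    ELMo_k = gamma * sum_j softmax(s)_j * h_{k,j}."""
    weights = np.exp(s) / np.exp(s).sum()  # softmax over layer weights
    return gamma * np.tensordot(weights, layer_states, axes=1)

# Toy example: 3 biLM layers, 4-dimensional hidden states for one token.
h = np.array([[1.0, 0.0, 0.0, 0.0],   # layer 0 (context-independent embedding)
              [0.0, 1.0, 0.0, 0.0],   # layer 1
              [0.0, 0.0, 1.0, 0.0]])  # layer 2
vec = elmo_combine(h, s=np.zeros(3), gamma=1.0)  # zero weights -> uniform mean
```

With the weights set to zero the softmax is uniform, so the result is simply the mean of the three layers; in practice the downstream task learns which layers to emphasize.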
See you then!
Time: 17:00 December 12
Venue: SEIEE 3-517A
Best Regards,
Sophie