[Adapt] [Seminar] HIBERT: Document Level Pre-training of Hierarchical Bidirectional Transformers for Document Summarization
hyy
hongyunyan at sjtu.edu.cn
Tue Jun 4 10:53:02 CST 2019
Dear Adapters,
This week, I will give a talk on the paper "HIBERT: Document Level Pre-training of Hierarchical Bidirectional Transformers for Document Summarization", which was accepted at ACL 2019.
In this paper, the core component of the neural extractive summarization model is a hierarchical document encoder. The authors propose a method to pre-train document-level hierarchical bidirectional Transformer encoders on unlabeled data. The resulting HIBERT model achieves strong performance on both the CNNDM and NYT50 datasets.
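To give a feel for the hierarchical encoder before the talk, here is a minimal sketch in PyTorch. It is only an illustration under my own assumptions (module names, pooling choice, and hyperparameters are mine, and positional encodings are omitted), not the authors' exact architecture: a sentence-level Transformer encodes the tokens of each sentence into a sentence vector, and a document-level Transformer then contextualizes those sentence vectors to score sentences for extraction.

import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    # Illustrative two-level encoder: sentence-level + document-level Transformers.
    def __init__(self, vocab_size=30000, d_model=512, nhead=8, nlayers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        sent_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        doc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Sentence-level encoder: contextualizes tokens within each sentence.
        self.sent_encoder = nn.TransformerEncoder(sent_layer, nlayers)
        # Document-level encoder: contextualizes sentence vectors across the document.
        self.doc_encoder = nn.TransformerEncoder(doc_layer, nlayers)
        # Per-sentence score for extractive summarization (keep / drop).
        self.classifier = nn.Linear(d_model, 1)

    def forward(self, doc_tokens):
        # doc_tokens: (num_sentences, sentence_length) token ids of one document.
        tok = self.embed(doc_tokens)                         # (S, L, d_model)
        tok = self.sent_encoder(tok)                         # token-level context
        sent_vecs = tok[:, 0, :]                             # first-token pooling per sentence
        doc_ctx = self.doc_encoder(sent_vecs.unsqueeze(0))   # (1, S, d_model)
        return self.classifier(doc_ctx).squeeze(-1)          # (1, S) sentence scores

doc = torch.randint(0, 30000, (4, 20))   # toy document: 4 sentences, 20 tokens each
print(HierarchicalEncoder()(doc).shape)  # torch.Size([1, 4])

The pre-training idea in the paper is to train this kind of encoder on unlabeled documents (by predicting masked sentences) before fine-tuning the sentence classifier for extractive summarization; the sketch above only shows the encoder shape, not the pre-training objective.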
Time: Wed 5:00pm
Venue: SEIEE 3-414
Best regards,
Sophia