[Adapt] [seminar] An Introduction to Language Model Compression

任思宇 rsy0702 at 163.com
Tue Oct 19 22:01:09 CST 2021


Hi Adapters,


Large-scale pre-trained language models have significantly advanced the field of Natural Language Processing (NLP) in recent years. However, they are also notoriously hard to train and serve in downstream applications because of their enormous size and high inference latency. Language model compression refers to the techniques used to obtain a smaller yet competitive counterpart of a large model. In this talk, I will give an introduction to this topic, covering its two most prevalent subfields: Knowledge Distillation and Model Pruning.
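
For those new to the topic, here is a minimal sketch of the soft-label knowledge distillation loss popularized by Hinton et al. (2015), written in PyTorch; the function name and temperature value are illustrative choices, not material from the talk:

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, temperature=2.0):
        # Soften both output distributions with a temperature, then train
        # the student to match the teacher via KL divergence; the T^2
        # factor keeps gradient magnitudes comparable across temperatures.
        # (Illustrative sketch, not the speaker's code.)
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        return F.kl_div(log_soft_student, soft_teacher,
                        reduction="batchmean") * temperature ** 2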


Hope you enjoy it!


Time: Wed 4:00 pm
Venue: SEIEE 3-414

Best Regards,
Roy