[Adapt] [Seminar] A Brief Introduction to Hallucination in Large Vision-Language Models
宋秀杰
xiujiesong at sjtu.edu.cn
Wed Nov 29 00:40:03 CST 2023
Hi Adapters,
Last semester, Angel introduced faithfulness in natural language generation (NLG), focusing mainly on the hallucination problem in NLG tasks. Recently, inspired by the superior language abilities of large language models (LLMs), large vision-language models (LVLMs) have been proposed, integrating powerful LLMs to improve performance on complex multimodal tasks. Despite this promising progress, LVLMs also suffer from object hallucination, and many researchers are exploring better ways to evaluate this problem. Thus, in this talk I will briefly introduce the evaluation of object hallucination in LVLMs. Specifically, I will present the paper "Evaluating Object Hallucination in Large Vision-Language Models", which was accepted at EMNLP 2023.
Hope you find this talk interesting and helpful.
Time: Wed, 10:00 am - 11:30 am
Meeting link: https://teams.microsoft.com/l/meetup-join/19%3ameeting_M2VmMTU5MzgtODUzOC00NmU4LTg0MzktNGFjNDdiMmIwYTI1%40thread.v2/0?context=%7b%22Tid%22%3a%225cdc5b43-d7be-4caa-8173-729e3b0a62d9%22%2c%22Oid%22%3a%221a8b9fa0-af57-4a1c-9390-22d1c201d622%22%7d
Best wishes,
Xiujie Song