[CVPR'24] HallusionBench: You See What You Think? Or You Think What You See? An Image-Context Reasoning Benchmark Challenging for GPT-4V(ision), LLaVA-1.5, and Other Multi-modality Models
[ACL 2024] A user-friendly evaluation framework: Eval Suite and benchmarks including UHGEval, HaluEval, and HalluQA.
Can Knowledge Editing Really Correct Hallucinations?
RefChecker provides an automatic checking pipeline and a benchmark dataset for detecting fine-grained hallucinations generated by large language models.
MLLM can see? Dynamic Correction Decoding for Hallucination Mitigation
[NeurIPS 2024] Knowledge Circuits in Pretrained Transformers
Loki: an open-source solution designed to automate fact verification.
[ACL 2024] An Easy-to-use Hallucination Detection Framework for LLMs.
Official code for 'Tackling Structural Hallucination in Image Translation with Local Diffusion' (ECCV'24 Oral)
🧙🏻Code and benchmark for our Findings of ACL 2024 paper - "TimeChara: Evaluating Point-in-Time Character Hallucination of Role-Playing Large Language Models"
sFRC: identifying fakes in medical images reconstructed using AI.
OLAPH: Improving Factuality in Biomedical Long-form Question Answering
[NAACL24] Official Implementation of Mitigating Hallucination in Abstractive Summarization with Domain-Conditional Mutual Information
Cocktail: a dynamic graph prompting technique for mitigating hallucination in LLMs.
Controlled HALlucination-Evaluation (CHALE) Question-Answering Dataset
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models. The first work to correct hallucinations in MLLMs.
OpenAI function-calling demo that retrieves customizable weather information (see the sketch after this list).
OpenAI assistant using the Code Interpreter tool.
[IJCAI 2024] FactCHD: Benchmarking Fact-Conflicting Hallucination Detection
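The function-calling demo listed above follows a standard pattern: declare a tool with a JSON schema, let the model request a call, run the function locally, and return the result so the model can answer from grounded data rather than hallucinate one. Below is a minimal sketch, assuming the `openai` Python SDK v1+ and an `OPENAI_API_KEY` in the environment; `get_weather` and the model name are illustrative placeholders, not the demo repo's actual code.

```python
# Minimal function-calling sketch (openai Python SDK v1+).
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def get_weather(city: str, unit: str = "celsius") -> str:
    # Hypothetical helper: a real demo would call a weather service here.
    return json.dumps({"city": city, "temperature": 21, "unit": unit})

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Paris, in celsius?"}]
response = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
call = response.choices[0].message.tool_calls[0]  # assumes the model chose to call the tool

# Execute the requested function, then send the result back for a grounded final answer.
messages.append(response.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": call.id,
    "content": get_weather(**json.loads(call.function.arguments)),
})
final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
print(final.choices[0].message.content)
```

Returning the tool output in a `role: "tool"` message keyed by `tool_call_id` is what lets the model ground its final answer in the function result, which is the anti-hallucination point of the demo.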