Noise-Robust Abstractive Compression in Retrieval-Augmented Language Models

Reading time: 2 minutes

📝 Original Info

  • Title: Noise-Robust Abstractive Compression in Retrieval-Augmented Language Models
  • ArXiv ID: 2512.08943
  • Date: 2025-11-19
  • Authors: Singon Kim

📝 Abstract

Abstractive compression utilizes smaller language models to condense query-relevant context, reducing computational costs in retrieval-augmented generation (RAG). However, retrieved documents often include information that is either irrelevant to answering the query or misleading due to factually incorrect content, despite having high relevance scores. Under these conditions, abstractive compressors are more likely to omit important information essential for the correct answer, especially in long contexts where attention dispersion occurs. To address this issue, we categorize retrieved documents in a more fine-grained manner and propose Abstractive Compression Robust against Noise (ACoRN), which introduces two novel training steps. First...
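
To make the idea concrete, here is a minimal sketch of query-aware abstractive compression as described above: a small language model condenses retrieved documents conditioned on the query before they reach the reader LLM. The model choice (t5-small) and prompt format are illustrative assumptions, not the paper's actual compressor or training setup.

```python
# Minimal sketch of query-aware abstractive compression in RAG.
# t5-small stands in for the trained compressor; the prompt format
# is an assumption for illustration, not the paper's.
from transformers import pipeline

compressor = pipeline("summarization", model="t5-small")

def compress_context(query: str, documents: list[str], max_tokens: int = 128) -> str:
    """Condense retrieved documents into a short, query-relevant summary."""
    # Prepend the query so the compressor conditions its summary on it.
    prompt = f"question: {query} context: {' '.join(documents)}"
    summary = compressor(prompt, max_length=max_tokens, truncation=True)
    return summary[0]["summary_text"]

docs = [
    "Paris is the capital of France.",          # answer-supporting evidence
    "The Eiffel Tower was completed in 1889.",  # query-irrelevant noise
]
print(compress_context("What is the capital of France?", docs))
```

The failure mode the paper targets shows up exactly here: with noisy or factually wrong passages in `docs`, an untrained compressor may drop the answer-supporting evidence, which is what ACoRN's fine-grained document categorization during training is meant to prevent.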

📄 Full Content

Retrieval-augmented language models (RALMs) [1,2,3,4] have strong capabilities in both academic research and industrial applications within the area of natural language processing (NLP). The retrieval-augmented generation (RAG) process in RALMs was designed to improve the performance of large language
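
For orientation, a minimal sketch of the retrieve-then-generate loop that an abstractive compressor slots into is shown below. TF-IDF retrieval is used purely for illustration (real RALMs typically use dense retrievers), and `generate_answer` is a hypothetical stand-in for the downstream reader LLM.

```python
# Minimal RAG loop: retrieve top-k passages, then pass them as context
# to a generator. A compressor such as ACoRN would condense the context
# between these two steps to cut prompt length and filter noise.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

CORPUS = [
    "Paris is the capital and most populous city of France.",
    "The Great Wall of China is thousands of kilometers long.",
    "France's capital hosts the Louvre museum.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus passages most similar to the query."""
    vectorizer = TfidfVectorizer()
    doc_vecs = vectorizer.fit_transform(CORPUS)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_vecs)[0]
    top_k = scores.argsort()[::-1][:k]
    return [CORPUS[i] for i in top_k]

def generate_answer(query: str, context: str) -> str:
    # Placeholder for the reader LLM call.
    return f"[LLM answer to {query!r} given: {context}]"

query = "What is the capital of France?"
context = " ".join(retrieve(query))
print(generate_answer(query, context))
```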

…(Content truncated for length.)

📸 Image Gallery

figure_1.png figure_2.png figure_3.png figure_4.png

Reference

This content is AI-processed based on open access ArXiv data.
