LGM: Enhancing Large Language Models with Conceptual Meta-Relations and Iterative Retrieval
📝 Original Info
- Title: LGM: Enhancing Large Language Models with Conceptual Meta-Relations and Iterative Retrieval
- ArXiv ID: 2511.03214
- Date: 2025-11-05
- Authors: Not provided (the paper does not list its authors).
📝 Abstract
Large language models (LLMs) exhibit strong semantic understanding, yet struggle when user instructions involve ambiguous or conceptually misaligned terms. We propose the Language Graph Model (LGM) to enhance conceptual clarity by extracting meta-relations (inheritance, alias, and composition) from natural language. The model further employs a reflection mechanism to validate these meta-relations. Leveraging a Concept Iterative Retrieval Algorithm, these relations and their associated descriptions are dynamically supplied to the LLM, improving its ability to interpret concepts and generate accurate responses. Unlike conventional Retrieval-Augmented Generation (RAG) approaches that rely on extended context windows, our method enables large language models to process texts of any length without truncation. Experiments on standard benchmarks demonstrate that the LGM consistently outperforms existing RAG baselines.
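The abstract describes a concept graph linked by three meta-relation types and an iterative retrieval step that gathers related concept descriptions for the LLM. The paper's actual data structures and algorithm are not given here, so the following is a minimal sketch under assumed names (`Concept`, `iterative_retrieve`, and the toy graph are all hypothetical): a breadth-first walk over meta-relation edges that collects descriptions up to a hop limit, which could then be prepended to the prompt.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# The three meta-relation types named in the abstract.
INHERITANCE, ALIAS, COMPOSITION = "inheritance", "alias", "composition"

@dataclass
class Concept:
    name: str
    description: str
    # Maps a meta-relation type to the names of related concepts.
    relations: Dict[str, List[str]] = field(default_factory=dict)

def iterative_retrieve(graph: Dict[str, Concept], seed: str, max_hops: int = 2) -> List[str]:
    """Collect descriptions of the seed concept and of concepts reachable
    via meta-relation edges, breadth-first, up to max_hops away."""
    seen = {seed}
    frontier = [seed]
    context: List[str] = []
    for _ in range(max_hops + 1):
        next_frontier: List[str] = []
        for name in frontier:
            node = graph.get(name)
            if node is None:
                continue  # term mentioned but never defined in the graph
            context.append(f"{node.name}: {node.description}")
            for targets in node.relations.values():
                for target in targets:
                    if target not in seen:
                        seen.add(target)
                        next_frontier.append(target)
        frontier = next_frontier
    return context

# Toy graph for illustration only.
graph = {
    "Tensor": Concept("Tensor", "A multi-dimensional array of values.",
                      {INHERITANCE: ["Array"], ALIAS: ["NDArray"]}),
    "Array": Concept("Array", "An ordered collection of elements."),
    "NDArray": Concept("NDArray", "Another name for Tensor."),
}

ctx = iterative_retrieve(graph, "Tensor")
```

Because the walk follows relation edges rather than filling a fixed-size window, the amount of retrieved context is bounded by the hop limit and graph connectivity, not by the source document's length, which is consistent with the abstract's claim about handling texts of any length.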
This content is AI-processed based on open access ArXiv data.