AUTHOREA
Model Distillation for Enhanced Contextual Response Prediction Using Retrieval-Augmen...
Oliver Manjot, Marcus Delgado, and 2 more

September 12, 2024
The rapid growth of conversational systems has highlighted the need for models that generate contextually accurate and relevant responses, particularly in knowledge-dependent interactions. This work introduces an approach that integrates retrieval-augmented generation (RAG) with model distillation, improving the alignment between generated responses and dynamic external knowledge sources. The method improves the factual correctness of responses and addresses the limitations of static, pre-trained models, which often produce hallucinations and rely on outdated information. Experiments show that the modified model achieves superior retrieval accuracy, response coherence, and computational efficiency, particularly in multi-turn conversations that require complex knowledge integration. The combination of retrieval mechanisms and generative processes also improves the model's scalability and adaptability, making it suitable for real-time applications across diverse query types. Finally, the reduced computational footprint achieved through distillation allows the student model to retain the teacher's accuracy while operating with lower resource requirements.
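The abstract does not give implementation details for the distillation step. As a point of reference only, the standard knowledge-distillation objective trains the student to match the teacher's temperature-softened output distribution via a KL-divergence loss scaled by T². A minimal stdlib-only sketch (all function names here are illustrative, not from the paper):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) over softened distributions, scaled by T^2
    # so gradients keep a comparable magnitude as T varies.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this soft-target loss is usually combined with an ordinary cross-entropy term on ground-truth labels; identical teacher and student logits drive the KL term to zero.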
