AUTHOREA
Dynamic Semantic Memory Retention in Large Language Models: An Exploration of Spontan...
Juri Kong and 5 more

October 31, 2024
Models designed for artificial language understanding increasingly require robust memory retention mechanisms to maintain coherence and relevance across extended interactions. Dynamic Semantic Memory Retention (DSMR) offers a breakthrough in autonomous memory management, enabling hierarchical, context-driven memory recall that operates independently of explicit user prompts. DSMR establishes a layered memory structure that supports the recall of semantically relevant information, reinforcing response coherence across both short- and long-term contexts. In quantitative and qualitative analyses, DSMR consistently achieved higher retrieval accuracy, greater memory stability, and lower latency than standard configurations, reducing error rates and improving contextual consistency across diverse scenarios. Additionally, DSMR's structured approach to memory retention and node prioritization scales well, positioning it as a foundational model for advanced memory functions within future interactive systems. Overall, the findings demonstrate DSMR's capacity to elevate language model performance by fostering a durable, autonomous memory framework suited to applications where long-term recall is critical.
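The abstract describes a layered memory of prioritized nodes recalled by semantic relevance rather than by explicit prompt. The sketch below illustrates that general idea only; the class name, the decay/reinforcement mechanics, and the bag-of-words similarity (standing in for a real embedding model) are all hypothetical assumptions, not details from the paper.

```python
from collections import Counter
import math

def _vec(text):
    # Bag-of-words vector; a stand-in for a learned semantic embedding.
    return Counter(text.lower().split())

def _cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SemanticMemory:
    """Toy prioritized memory store: every node's priority decays over time,
    and nodes that are recalled are reinforced (assumed mechanics)."""

    def __init__(self, decay=0.9):
        self.nodes = []  # each node: [vector, text, priority]
        self.decay = decay

    def store(self, text, priority=1.0):
        self.nodes.append([_vec(text), text, priority])

    def recall(self, query, k=1):
        qv = _vec(query)
        # Score = semantic similarity weighted by current node priority.
        ranked = sorted(self.nodes,
                        key=lambda n: _cosine(qv, n[0]) * n[2],
                        reverse=True)
        hits = ranked[:k]
        for n in self.nodes:
            n[2] *= self.decay   # unrecalled nodes gradually fade
        for n in hits:
            n[2] += 1.0          # recalled nodes are reinforced
        return [n[1] for n in hits]

mem = SemanticMemory()
mem.store("user prefers concise answers")
mem.store("project deadline is Friday")
print(mem.recall("when is the deadline"))  # → ['project deadline is Friday']
```

Recall here needs no user instruction to consult memory: relevance scoring alone selects which node surfaces, loosely mirroring the prompt-independent retrieval the abstract claims.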
