Smart Memory

As you add more and more memories, they can start to exceed the context window limits of various chat platforms.

But even before that, they can add irrelevant details to the chat that could confuse the AI. And on platforms like Claude, where your usage limits are based on token counts, a lot of memory text in the context window can use up your limits sooner.

Smart Memory solves this by creating high-level overviews of the memories within a bucket and enabling the AI to fetch more detailed memories if the conversation calls for it.

This might look like the following:

You have a high-level overview that says:

“User works with Python and JavaScript”

And it will also say something like:

“I also know about the specific frameworks they use”

Now, if you start talking about programming, the AI will load the detailed memories; otherwise, it will use just the overview.

This keeps conversations focused and lets you store more memories in a bucket before context windows become a concern.
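The two-tier pattern above can be sketched roughly as follows. This is a hypothetical illustration, not the actual implementation: the `MemoryBucket` class and its method names are assumptions, showing only how a short overview stays in context while detailed memories are fetched on demand.

```python
# Hypothetical sketch of the Smart Memory pattern (names are illustrative,
# not the real implementation): each bucket exposes a short overview that
# is always placed in the context window, plus a fetch call the AI can
# make only when the conversation needs the specifics.

class MemoryBucket:
    def __init__(self, overview, detailed_memories):
        self.overview = overview                    # short summary, always in context
        self.detailed_memories = detailed_memories  # full memories, loaded on demand

    def context_summary(self):
        """Text injected into every chat up front."""
        return (self.overview +
                " (More detailed memories are available on request.)")

    def fetch_details(self):
        """Called by the AI only when the topic calls for specifics."""
        return self.detailed_memories


coding = MemoryBucket(
    overview="User works with Python and JavaScript.",
    detailed_memories=[
        "Uses Django and FastAPI for Python backends.",
        "Prefers React with TypeScript on the frontend.",
    ],
)

print(coding.context_summary())  # small footprint, always present
print(coding.fetch_details())    # loaded only for programming topics
```

In a real system the fetch would be exposed to the model as a tool call, so the detailed memories cost tokens only in conversations that actually use them.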

Status: Completed
Board: 💡 Feature Request
Date: 11 months ago
Author: asad
