LLM Syntax showing after multiple messages #127

@NotNeelPatel

Description

When conversing with CityAgent, the system breaks after a while, as shown in the image below. The initial assumption is that this is caused by the context window filling up: memory gets purged from time to time, which leaves the system confused. This is an issue especially with models that have smaller context windows.

[Image attached]

One potential solution is to design a more intelligent system for purging data from the context window, but a better one would be to optimize the prompts to be more token-efficient from the start. In addition, in a multi-agent system it may not be necessary for every agent to know everything; if the prompts are written well, that should make the system more robust in general.
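As a rough illustration of the "more intelligent purging" idea, here is a minimal sketch of token-budget trimming that always preserves the system prompt and keeps the most recent messages, dropping the oldest first. This is not CityAgent's actual code; `estimate_tokens`, `trim_history`, and the word-count token proxy are all hypothetical (a real implementation would use the model's tokenizer).

```python
# Hypothetical sketch: trim conversation history to a token budget while
# keeping the system prompt and the newest messages intact.

def estimate_tokens(message: dict) -> int:
    # Crude proxy: ~1 token per word. Swap in the model's real tokenizer.
    return len(message["content"].split())

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Keep the system prompt plus as many recent messages as fit the budget."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    used = sum(estimate_tokens(m) for m in system)
    kept = []
    # Walk from newest to oldest, keeping messages while the budget allows.
    for m in reversed(rest):
        cost = estimate_tokens(m)
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))
```

Dropping whole old messages this way is blunt; a variant could summarize the evicted messages into a single compact message instead of discarding them outright, which would also help agents that do not need the full shared history.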

Metadata

Assignees: No one assigned
Labels: bug (Something isn't working)
Status: Todo
Relationships: None yet
Development: No branches or pull requests