Problem
There needs to be some way for the LLM to look up documentation for a particular tool at runtime rather than including all the information in the initial prompt. When there are lots of tools, or the tools have complex behavior with long prompts, this takes up a lot of the limited input context available in GPT-4 (currently 8k tokens).
The current approach of stuffing everything into the initial prompt has the following drawbacks:
- expensive (OpenAI charges per token)
- can confuse the LLM. If there are too many tools with long descriptions, the LLM may be less likely to home in on the section it needs to complete its current task
- decreases the number of interactions the user can have with the agent. GPT-4/etc. have a finite input length, so the longer the initial prompt, the less space remains for the user's actual conversation with the LLM
Approaches / Open Questions
- have a manual tool that the LLM can call with the name of a tool it wants more info on
- where does the more info live?
- how does the prompt generation know to add a note that more info can be looked up? (don't want to require the user to manually say that more info can be looked up)
- what about the PythonTool or similar? Want to be able to pass in local variables that the model could presumably look up the values of. This is semi-handled by the pyman tool: https://github.com/jataware/archytas/blob/master/archytas/tools.py#L134, though the prompt generated for the PythonTool doesn't update to indicate that the locals exist, or that information about them can be looked up
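One possible shape for the "manual tool" approach above, sketched as a minimal registry. All names here (register_tool, summary_prompt, lookup_tool_docs, TOOL_REGISTRY) are hypothetical illustrations, not the archytas API: the initial prompt carries only one-line summaries plus a note that full docs can be requested, and the lookup tool itself is what answers the "where does the more info live?" question (it lives on the tool objects' docstrings).

```python
# Hypothetical sketch; not the archytas API. The initial prompt lists
# one-line tool summaries, and the LLM calls lookup_tool_docs(name)
# at runtime when it needs a tool's full documentation.

TOOL_REGISTRY: dict = {}

def register_tool(fn):
    """Register a callable so its docstring can be looked up later."""
    TOOL_REGISTRY[fn.__name__] = fn
    return fn

@register_tool
def fibonacci(n: int) -> int:
    """Compute the n-th Fibonacci number.

    (Imagine long usage notes here that would otherwise
    bloat the initial prompt.)
    """
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def summary_prompt() -> str:
    """One line per tool, plus a note that more info can be looked up,
    so the user doesn't have to say so manually."""
    lines = ["Available tools (call lookup_tool_docs(name) for details):"]
    for name, fn in TOOL_REGISTRY.items():
        first_line = (fn.__doc__ or "").strip().splitlines()[0]
        lines.append(f"- {name}: {first_line}")
    return "\n".join(lines)

def lookup_tool_docs(name: str) -> str:
    """The tool the LLM calls to fetch a tool's full documentation."""
    fn = TOOL_REGISTRY.get(name)
    if fn is None:
        return f"No tool named {name!r}. Known tools: {sorted(TOOL_REGISTRY)}"
    return fn.__doc__ or f"{name} has no documentation."
```

The same registry idea could extend to the PythonTool locals case: registering each injected local with a short description would let summary_prompt mention that they exist and let lookup_tool_docs return their details on demand.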