Describe the feature or improvement you are requesting
The API docs mention that you can now set the prompt cache retention to 24 hours, instead of the standard 5-10 minutes:
https://platform.openai.com/docs/guides/prompt-caching
```json
{
    "model": "gpt-5.1",
    "input": "Your prompt goes here...",
    "prompt_cache_retention": "24h"
}
```
It would be brilliant to be able to set this in the .NET SDK too, on both the chat and responses clients.
Could it be added to the following objects:
`ResponseCreationOptions` and `ChatCompletionOptions`?
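For illustration, a hypothetical sketch of how this might look once exposed. The property name `PromptCacheRetention` and its string type are assumptions mirroring the REST field `prompt_cache_retention`, not an existing SDK surface:

```csharp
// Hypothetical usage sketch -- PromptCacheRetention does not exist in the SDK yet.
using OpenAI.Chat;

ChatClient client = new(
    model: "gpt-5.1",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

ChatCompletionOptions options = new()
{
    // Assumed property corresponding to the REST field "prompt_cache_retention".
    PromptCacheRetention = "24h",
};

ChatCompletion completion = client.CompleteChat(
    [new UserChatMessage("Your prompt goes here...")],
    options);
```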
Many thanks.
Additional context
No response