Langfuse v3: Prompts not getting linked to generations in UI when using custom prompt loader #13298
The issue is that you're converting the prompt object to a string. When using the Langfuse API directly or passing only the string content, the UI does not recognize this as a managed prompt link(1). The prompt must be fetched using the SDK's `get_prompt()` method, and to fix the linking you need to pass the actual prompt object (not the string content) to your generation. Here's how to properly link prompts with the Python SDK:

```python
from langfuse import observe, get_client

langfuse = get_client()

@observe(as_type="generation")
def nested_generation():
    prompt = langfuse.get_prompt("movie-critic")
    langfuse.update_current_generation(
        prompt=prompt,
    )
```

For your specific case, pass the prompt object itself instead of extracting just the prompt text with `.prompt`.

Additionally, if you're using Langfuse v3.109.0 through 3.139.0, note that there was a known frontend regression where prompt links weren't displayed in generation spans (though they appeared in the Observation view). This was fixed in version 3.140.0(3).
Have another question? Just tag @inkeep.
Describe your question
I’m using Langfuse v3 with a custom prompt management layer (supporting both local files and Langfuse prompts).
The issue is that prompts are not getting linked to generations in the UI, even though I am fetching prompts via langfuse_client.get_prompt().
1. Prompt configuration

```python
PROMPT_CONFIG = {
    "email_subject_parsing": {
        "local_file": "prompts/email_subject_parsing.txt",
        "lf_key": "email_subject_parsing",
        "system_messages_lf": lambda prompt: prompt,
    }
}
```
2. Fetching prompt from Langfuse

```python
def build_litellm_prompt_lf(config):
    prompt_obj = langfuse_client.get_prompt(
        config["lf_key"],
        label=label
    )
    # Using prompt object here
    system_messages_lf = config["system_messages_lf"](prompt_obj)
    # ❗ Converting to string here
    return system_messages_lf.prompt
```
3. Building prompt dictionary

```python
def get_litellm_prompt_dict():
    prompt_dict = {}
    for key, config in litellm_prompt_config.items():
        prompt_dict[key] = build_litellm_prompt_lf(config)
    return prompt_dict
```
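One way to keep the prompt object available for linking (a sketch, not the thread's accepted fix: `DummyPrompt` and `build_prompt_entry` are hypothetical names) is to store both the object and its rendered text in the dictionary, instead of only the text:

```python
# Sketch under assumptions: DummyPrompt stands in for the fetched Langfuse
# prompt object. Storing both pieces keeps the object available for
# linking at generation time while the text feeds the message content.
class DummyPrompt:
    def __init__(self, name, text):
        self.name = name
        self.prompt = text

def build_prompt_entry(prompt_obj):
    # "text"   -> goes into the system message content
    # "object" -> later passed to the generation so the UI can link it
    return {"object": prompt_obj, "text": prompt_obj.prompt}

prompt_dict = {
    "email_subject_parsing": build_prompt_entry(
        DummyPrompt("email_subject_parsing", "Parse the email subject line.")
    )
}
```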
4. Creating LLM messages

```python
litellm_prompt_dict = get_litellm_prompt_dict()

messages = [
    {
        "role": "system",
        "content": litellm_prompt_dict["email_subject_parsing"]
    }
]
```
5. LLM call (wrapped with Langfuse)

```python
def call(self, **kwargs):
    metadata = kwargs.get("metadata", {})
```
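For completeness, here is how a wrapper like `call` could forward the prompt object rather than the string. This is a hedged, runnable sketch: `record_generation` is a stand-in for `langfuse.update_current_generation(prompt=...)` and the LLM call itself is faked, so none of these names come from the real SDK.

```python
# Hedged sketch: record_generation stands in for
# langfuse.update_current_generation(prompt=...); the LLM call is faked.
recorded = {}

def record_generation(**kwargs):
    recorded.update(kwargs)

class StubPrompt:
    def __init__(self, name, text):
        self.name = name
        self.prompt = text

def call(prompt_entry, **kwargs):
    metadata = kwargs.get("metadata", {})
    messages = [{"role": "system", "content": prompt_entry["text"]}]
    # Pass the prompt *object*, not the string, so the generation is linked.
    record_generation(prompt=prompt_entry["object"],
                      input=messages, metadata=metadata)
    return "stub-response"

entry = StubPrompt("email_subject_parsing", "Parse the subject line.")
result = call({"object": entry, "text": entry.prompt}, metadata={"run": "demo"})
```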
Langfuse Cloud or Self-Hosted?
Self-Hosted
If Self-Hosted
No response
If Langfuse Cloud
No response
SDK and integration versions
No response
Pre-Submission Checklist