Conversation

@shellmayr
Member

  • Auto-enable the LiteLLM integration
  • Auto-enable the Google GenAI integration

@shellmayr shellmayr marked this pull request as ready for review January 12, 2026 12:27
@shellmayr shellmayr requested a review from a team as a code owner January 12, 2026 12:27
@shellmayr shellmayr requested a review from a team January 12, 2026 12:27

_INTEGRATION_DEACTIVATES = {
    "langchain": {"openai", "anthropic"},
    "litellm": {"openai", "anthropic"},
}

Bug: The litellm integration incorrectly deactivates the openai and anthropic integrations, which can lead to lost telemetry for direct SDK calls.
Severity: HIGH

🔍 Detailed Analysis

The litellm integration is being added to the _INTEGRATION_DEACTIVATES map, causing it to disable the openai and anthropic integrations. This is incorrect because the litellm integration uses callbacks and does not wrap the official OpenAI or Anthropic SDKs, unlike the langchain integration, so there is no risk of duplicate telemetry. This change will cause a loss of instrumentation for any direct calls a user makes to the OpenAI or Anthropic SDKs if the litellm integration is also enabled in their application, leading to missing telemetry data.

💡 Suggested Fix

Remove the "litellm": {"openai", "anthropic"} entry from the _INTEGRATION_DEACTIVATES dictionary in sentry_sdk/integrations/__init__.py. The litellm integration's architecture does not require deactivating other AI integrations.
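
The suggested fix can be illustrated with a minimal sketch of the lookup such a map drives. The `resolve_enabled` helper below is hypothetical (the SDK's real resolution happens inside its integration setup code, not through this function); the map reflects the suggested fix, with the "litellm" entry removed.

```python
# Hypothetical sketch; resolve_enabled is illustrative and not part of
# the sentry_sdk API. The map below reflects the suggested fix: no
# "litellm" entry, so enabling LiteLLM leaves the direct integrations on.
_INTEGRATION_DEACTIVATES = {
    # langchain wraps the OpenAI/Anthropic SDKs itself, so running it
    # alongside the direct integrations would double-report telemetry.
    "langchain": {"openai", "anthropic"},
}

def resolve_enabled(requested):
    """Return the integrations still enabled after applying the map."""
    deactivated = set()
    for name in requested:
        deactivated |= _INTEGRATION_DEACTIVATES.get(name, set())
    return set(requested) - deactivated
```

With this map, `resolve_enabled({"langchain", "openai"})` yields `{"langchain"}`, while `resolve_enabled({"litellm", "openai"})` leaves both enabled, preserving telemetry for direct OpenAI SDK calls.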

🤖 Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not valid.

Location: sentry_sdk/integrations/__init__.py#L171

Potential issue: The `litellm` integration is being added to the `_INTEGRATION_DEACTIVATES` map, causing it to disable the `openai` and `anthropic` integrations. This is incorrect because the `litellm` integration uses callbacks and does not wrap the official OpenAI or Anthropic SDKs, unlike the `langchain` integration, so there is no risk of duplicate telemetry. This change will cause a loss of instrumentation for any direct calls a user makes to the OpenAI or Anthropic SDKs if the `litellm` integration is also enabled in their application, leading to missing telemetry data.



_INTEGRATION_DEACTIVATES = {
    "langchain": {"openai", "anthropic"},
    "litellm": {"openai", "anthropic"},
}
Contributor

It would be nice to have tests for the new auto-deactivation.

Existing tests of this kind are in tests/test_ai_integration_deactivation.py.
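
A standalone sketch of what such a test could assert, under the assumption that the fix removes the `litellm` entry. The `enabled_after_deactivation` helper is hypothetical; the real suite in tests/test_ai_integration_deactivation.py exercises the SDK's actual setup machinery, which this only approximates locally.

```python
# Hypothetical test sketch; enabled_after_deactivation is a local stand-in
# for the SDK's integration-setup logic, not its real API.
_INTEGRATION_DEACTIVATES = {
    "langchain": {"openai", "anthropic"},
}

def enabled_after_deactivation(requested):
    # Drop any integration that another enabled integration deactivates.
    deactivated = set()
    for name in requested:
        deactivated |= _INTEGRATION_DEACTIVATES.get(name, set())
    return set(requested) - deactivated

def test_langchain_deactivates_direct_sdk_integrations():
    assert enabled_after_deactivation({"langchain", "openai"}) == {"langchain"}

def test_litellm_leaves_direct_sdk_integrations_enabled():
    # litellm uses callbacks rather than wrapping the SDKs, so the direct
    # openai integration should stay active alongside it.
    assert enabled_after_deactivation({"litellm", "openai"}) == {"litellm", "openai"}
```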

@alexander-alderman-webb
Contributor

alexander-alderman-webb commented Jan 12, 2026

@shellmayr can you investigate why the litellm tests are failing?

@shellmayr
Member Author

@alexander-alderman-webb yep - on it!
