
feat(ai-proxy): add native Anthropic Messages API protocol support#13181

Open
nic-6443 wants to merge 6 commits into apache:master from nic-6443:feat/native-anthropic-messages-api

Conversation


nic-6443 commented Apr 8, 2026

Description

Backport native Anthropic Messages API (/v1/messages) support from the enterprise gateway to open-source APISIX.

What this PR does

  1. Native Anthropic Messages protocol adapter (ai-protocols/anthropic-messages.lua)

    • Detects Anthropic requests via URI suffix (/v1/messages) + valid JSON body
    • Parses Anthropic SSE events (message_start, content_block_delta, message_delta, message_stop)
    • Extracts usage data from Anthropic response format
    • Builds deny responses in Anthropic format
    • Handles message manipulation (content prepend/append)
  2. Bidirectional Anthropic↔OpenAI converter (ai-protocols/converters/anthropic-messages-to-openai-chat.lua)

    • Converts Anthropic requests to OpenAI format (request body, system prompts, tools)
    • Converts OpenAI responses back to Anthropic format (non-streaming and SSE)
    • Handles edge cases: finish_reason as JSON null, usage:null chunks, empty SSE frames, role+content in first chunk
  3. Protocol registration

    • Registered anthropic-messages in ai-protocols/init.lua (detection order: before openai-chat)
    • Registered converter in ai-protocols/converters/init.lua
    • Added anthropic-messages capability to ai-providers/anthropic.lua
  4. Refactored prepare_request → prepare_outgoing_request

    • Removed prepare_request from all protocol modules (was called on client protocol, but stream_options injection is a target protocol concern)
    • Inlined model extraction at call site in ai-proxy/base.lua
    • Added prepare_outgoing_request(body) to openai-chat.lua for stream_options.include_usage=true injection
    • Called in ai-providers/base.lua build_request() AFTER conversion, using target protocol module
    • This ensures stream_options is correctly injected for both passthrough (OpenAI→OpenAI) and convert (Anthropic→OpenAI) scenarios
  5. Test framework enhancement (t/APISIX.pm)

    • Added TEST_ENABLE_CONTROL_API_V1 env var support to conditionally disable the /v1/ control API location block
    • This allows tests using /v1/messages URI to work without the control API intercepting requests
    • Matching the same mechanism already present in the enterprise gateway test framework
  6. Comprehensive test suite (t/plugin/ai-proxy-protocol-conversion.t)

    • 26 test cases covering non-streaming, streaming, error handling, tools, edge cases
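
As a rough illustration of the URI-based detection described in item 1, the suffix check plus JSON-body validation might be sketched like this (hypothetical helper names; is_json stands in for the adapter's real JSON-body parse, and the fallback ordering mirrors the "before openai-chat" registration):

```lua
-- Sketch only: not the plugin's actual code. Detection accepts any URI
-- that *ends* with /v1/messages (so prefixed routes also match) plus a
-- body that parses as JSON.
local function ends_with(s, suffix)
    return s:sub(-#suffix) == suffix
end

local function detect_protocol(uri, body, is_json)
    if ends_with(uri, "/v1/messages") and is_json(body) then
        return "anthropic-messages"
    end
    return "openai-chat"  -- checked after anthropic-messages
end
```

Because the check uses a suffix match rather than an exact path, a route like /anything/v1/messages would also be detected as Anthropic, which is the property the test-framework commits below rely on.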

Files changed

New files:

  • apisix/plugins/ai-protocols/anthropic-messages.lua
  • apisix/plugins/ai-protocols/converters/anthropic-messages-to-openai-chat.lua
  • t/plugin/ai-proxy-protocol-conversion.t

Modified files:

  • apisix/plugins/ai-protocols/init.lua — register anthropic-messages
  • apisix/plugins/ai-protocols/converters/init.lua — register converter
  • apisix/plugins/ai-providers/anthropic.lua — add capability
  • apisix/plugins/ai-protocols/openai-chat.lua — replace prepare_request with prepare_outgoing_request
  • apisix/plugins/ai-protocols/openai-embeddings.lua — remove prepare_request
  • apisix/plugins/ai-providers/base.lua — add prepare_outgoing_request call after conversion
  • apisix/plugins/ai-proxy/base.lua — inline model extraction, store ctx.ai_target_protocol
  • t/APISIX.pm — add TEST_ENABLE_CONTROL_API_V1 env var support

Add native support for the Anthropic Messages API (/v1/messages) to ai-proxy,
enabling clients to use Anthropic's native protocol format directly instead of
only the OpenAI-compatible endpoint.

Changes:
- New protocol adapter: anthropic-messages.lua
  Handles URI-based detection (/v1/messages), streaming/non-streaming request
  processing, native Anthropic SSE event parsing (message_start,
  content_block_delta, message_delta, message_stop), usage extraction
  (input_tokens/output_tokens), deny-response building, and message
  manipulation (prepend/append).

- New bidirectional converter: anthropic-messages-to-openai-chat.lua
  Converts Anthropic Messages requests to OpenAI Chat format (system prompts,
  messages, tool_use/tool_result, parameter mapping) and OpenAI responses back
  to Anthropic format. Includes full streaming SSE translation between the two
  protocols.

- Register anthropic-messages protocol in detection order (before openai-chat,
  since it uses URI+body matching)
- Register the new converter in the converters registry
- Add anthropic-messages capability to the Anthropic provider alongside
  existing openai-chat capability

- New test file: ai-proxy-protocol-conversion.t with 26 test cases covering
  basic conversion, error cases, streaming SSE, system prompts, tool calling,
  and various edge cases (null finish_reason, duplicate finish_reason,
  usage:null, deferred usage flush, etc.)
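
The request-side conversion described above (system prompt lifted into an OpenAI system message, top-level parameters mapped across) can be sketched roughly as follows. This is a simplified, hypothetical shape of the converter: content blocks are passed through untouched and the tool_use/tool_result mapping is omitted.

```lua
-- Hedged sketch of Anthropic Messages -> OpenAI Chat request mapping.
-- The real converter in anthropic-messages-to-openai-chat.lua also
-- handles tools, content-block arrays, and parameter edge cases.
local function anthropic_to_openai(req)
    local messages = {}

    -- Anthropic carries the system prompt as a top-level field;
    -- OpenAI expects it as the first chat message.
    if req.system then
        messages[#messages + 1] = { role = "system", content = req.system }
    end

    for _, m in ipairs(req.messages or {}) do
        messages[#messages + 1] = { role = m.role, content = m.content }
    end

    return {
        model = req.model,
        max_tokens = req.max_tokens,
        stream = req.stream,
        messages = messages,
    }
end
```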
@dosubot added labels size:XXL (This PR changes 1000+ lines, ignoring generated files) and enhancement (New feature or request) on Apr 8, 2026
@nic-6443 nic-6443 requested a review from Copilot April 8, 2026 06:23

Copilot AI left a comment


Pull request overview

This PR adds first-class support for Anthropic’s native Messages API (/v1/messages) to the ai-proxy plugin, including protocol detection, request/response conversion to/from OpenAI Chat Completions, and streaming SSE translation.

Changes:

  • Introduces an anthropic-messages protocol adapter with native SSE parsing, usage extraction, and deny-response generation.
  • Adds an anthropic-messages ↔ openai-chat converter for both non-streaming and streaming (SSE) translation, including tool-calling support.
  • Registers the new protocol/converter and expands the Anthropic provider capabilities; adds a comprehensive protocol conversion test suite.

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 4 comments.

Summary per file:

  • t/plugin/ai-proxy-protocol-conversion.t — Adds coverage for Anthropic Messages detection and streaming/non-streaming conversion behavior.
  • apisix/plugins/ai-providers/anthropic.lua — Advertises native anthropic-messages capability (/v1/messages).
  • apisix/plugins/ai-protocols/init.lua — Registers and prioritizes anthropic-messages in protocol detection.
  • apisix/plugins/ai-protocols/converters/init.lua — Registers the new Anthropic Messages ↔ OpenAI Chat converter.
  • apisix/plugins/ai-protocols/converters/anthropic-messages-to-openai-chat.lua — Implements request/response + streaming SSE conversion logic.
  • apisix/plugins/ai-protocols/anthropic-messages.lua — Implements the Anthropic Messages protocol adapter (detection + SSE parsing + deny response).
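
The adapter's native SSE parsing mentioned above might, in simplified form, split one frame into its event name and data payload. This is a sketch using Lua patterns, not the plugin's actual parser; Anthropic frames take the shape "event: <name>\ndata: <json>\n\n".

```lua
-- Hypothetical single-frame parser: returns the event name (e.g.
-- message_start, content_block_delta, message_delta, message_stop)
-- and the raw JSON payload, with surrounding whitespace trimmed.
local function parse_sse_frame(frame)
    local event = frame:match("event:%s*(%S+)")
    local data = frame:match("data:%s*(.-)%s*$")
    return event, data
end
```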


nic-6443 and others added 5 commits April 8, 2026 14:43
- Remove redundant newline separators between sse.encode() calls in
  streaming deny response (sse.encode already terminates with \n\n)
- Fix log message: 'response' -> 'request' in convert_request() warning
- Inject stream_options.include_usage=true when converting Anthropic
  streaming requests to OpenAI format, ensuring providers report usage
The prepare_request interface was doing two things:
1. Extracting the model from the request body (identical across all protocols)
2. Injecting stream_options.include_usage for OpenAI (only in openai-chat)

Problem: stream_options injection happened on the *client* protocol, but it is
a requirement of the *target* protocol (OpenAI). When a converter bridged
Anthropic→OpenAI, the openai-chat prepare_request was never called, so
stream_options was missing from converted streaming requests.

Fix:
- Remove prepare_request from all protocol modules (openai-chat,
  openai-embeddings, anthropic-messages) — the model extraction is
  now inlined at the call site in ai-proxy/base.lua
- Add prepare_outgoing_request(body) to openai-chat that injects
  stream_options when streaming
- Call prepare_outgoing_request in build_request() (ai-providers/base.lua)
  *after* protocol conversion, using the target protocol module
- Store ctx.ai_target_protocol in ai-proxy/base.lua for this purpose

This ensures stream_options.include_usage=true is injected for all scenarios:
- Passthrough: OpenAI client → OpenAI provider
- Convert: Anthropic client → OpenAI provider (via converter)
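
A minimal sketch of the prepare_outgoing_request(body) hook this commit describes, under the assumption that its only job is injecting stream_options for streaming bodies (the actual openai-chat.lua implementation may do more):

```lua
-- Hedged sketch: runs on the *target* protocol after conversion, so
-- both passthrough and converted requests get usage reporting enabled.
local function prepare_outgoing_request(body)
    if body.stream then
        body.stream_options = body.stream_options or {}
        body.stream_options.include_usage = true
    end
    return body
end
```

Because build_request() calls this after protocol conversion, an Anthropic streaming request that has just been converted to OpenAI shape passes through the same injection path as a native OpenAI request.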
Move require('apisix.plugins.ai-protocols') to module-level local to avoid
'getting the Lua global require' lint error.
…flict

The APISIX test framework maps 'location /v1/' to http_control(),
which intercepts all requests starting with /v1/ before they reach the
APISIX router. This caused 404 for test requests to /v1/messages.

Changed test URIs to /anything/v1/messages which avoids the conflict
while still matching the Anthropic protocol detection (URI suffix check).
…ocation

The APISIX test framework hardcodes 'location /v1/' for the control API,
which intercepts all /v1/* requests before APISIX routing. This breaks
tests that use /v1/messages for Anthropic protocol detection.

Add TEST_ENABLE_CONTROL_API_V1 env var support (matching gateway's
t/APISIX.pm) to conditionally disable the /v1/ location block.
Restore test file to use /v1/messages URIs (matching gateway exactly).