feat(ai-proxy): add native Anthropic Messages API protocol support #13181
Open
nic-6443 wants to merge 6 commits into apache:master from
Conversation
Add native support for the Anthropic Messages API (`/v1/messages`) to ai-proxy, enabling clients to use Anthropic's native protocol format directly instead of only the OpenAI-compatible endpoint.

Changes:

- New protocol adapter: `anthropic-messages.lua`. Handles URI-based detection (`/v1/messages`), streaming/non-streaming request processing, native Anthropic SSE event parsing (`message_start`, `content_block_delta`, `message_delta`, `message_stop`), usage extraction (`input_tokens`/`output_tokens`), deny-response building, and message manipulation (prepend/append).
- New bidirectional converter: `anthropic-messages-to-openai-chat.lua`. Converts Anthropic Messages requests to OpenAI Chat format (system prompts, messages, `tool_use`/`tool_result`, parameter mapping) and OpenAI responses back to Anthropic format. Includes full streaming SSE translation between the two protocols.
- Register the `anthropic-messages` protocol in the detection order (before `openai-chat`, since it uses URI+body matching).
- Register the new converter in the converters registry.
- Add the `anthropic-messages` capability to the Anthropic provider alongside the existing `openai-chat` capability.
- New test file: `ai-proxy-protocol-conversion.t` with 26 test cases covering basic conversion, error cases, streaming SSE, system prompts, tool calling, and various edge cases (null `finish_reason`, duplicate `finish_reason`, `usage: null`, deferred usage flush, etc.).
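The request-side mapping described above (system-prompt lifting, content-block flattening, parameter carry-over) can be sketched roughly as follows. This is an illustrative Lua sketch, not the actual converter source; the `convert_request` name and the text-only block flattening shown here are assumptions, and the real converter also maps `tool_use`/`tool_result` blocks:

```lua
-- Illustrative sketch of the request-side conversion, assuming a decoded
-- JSON body as a Lua table. Not the actual code from
-- anthropic-messages-to-openai-chat.lua.
local function convert_request(anthropic_body)
    local openai = {
        model = anthropic_body.model,
        max_tokens = anthropic_body.max_tokens,
        temperature = anthropic_body.temperature,
        stream = anthropic_body.stream,
        messages = {},
    }
    -- Anthropic carries the system prompt in a top-level "system" field;
    -- OpenAI expects it as the first message with role "system"
    if anthropic_body.system then
        openai.messages[#openai.messages + 1] =
            { role = "system", content = anthropic_body.system }
    end
    for _, msg in ipairs(anthropic_body.messages or {}) do
        if type(msg.content) == "string" then
            openai.messages[#openai.messages + 1] =
                { role = msg.role, content = msg.content }
        else
            -- content may also be an array of blocks; only text blocks are
            -- flattened here (tool_use/tool_result mapping elided)
            local parts = {}
            for _, block in ipairs(msg.content) do
                if block.type == "text" then
                    parts[#parts + 1] = block.text
                end
            end
            openai.messages[#openai.messages + 1] =
                { role = msg.role, content = table.concat(parts, "\n") }
        end
    end
    return openai
end
```

The response direction is the mirror image: OpenAI `choices[1].message` content is wrapped back into Anthropic `content` blocks and `finish_reason` is mapped to `stop_reason`.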
Pull request overview
This PR adds first-class support for Anthropic’s native Messages API (/v1/messages) to the ai-proxy plugin, including protocol detection, request/response conversion to/from OpenAI Chat Completions, and streaming SSE translation.
Changes:
- Introduces an `anthropic-messages` protocol adapter with native SSE parsing, usage extraction, and deny-response generation.
- Adds an `anthropic-messages` ↔ `openai-chat` converter for both non-streaming and streaming (SSE) translation, including tool-calling support.
- Registers the new protocol/converter and expands the Anthropic provider capabilities; adds a comprehensive protocol conversion test suite.
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| t/plugin/ai-proxy-protocol-conversion.t | Adds coverage for Anthropic Messages detection and streaming/non-streaming conversion behavior. |
| apisix/plugins/ai-providers/anthropic.lua | Advertises native anthropic-messages capability (/v1/messages). |
| apisix/plugins/ai-protocols/init.lua | Registers and prioritizes anthropic-messages in protocol detection. |
| apisix/plugins/ai-protocols/converters/init.lua | Registers the new Anthropic Messages ↔ OpenAI Chat converter. |
| apisix/plugins/ai-protocols/converters/anthropic-messages-to-openai-chat.lua | Implements request/response + streaming SSE conversion logic. |
| apisix/plugins/ai-protocols/anthropic-messages.lua | Implements the Anthropic Messages protocol adapter (detection + SSE parsing + deny response). |
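For orientation, the adapter's native SSE handling might fold the four Anthropic stream events into accumulated text and usage roughly like this. The event names are Anthropic's real stream events, but the `handle_event` helper and the `state` shape are illustrative, not the adapter's actual interface:

```lua
-- Hypothetical sketch: fold already-decoded Anthropic SSE events into a
-- running state with accumulated text and token usage.
local function handle_event(state, event_name, data)
    if event_name == "message_start" then
        -- input_tokens arrive up front on the message envelope
        state.usage.input_tokens = data.message.usage.input_tokens
    elseif event_name == "content_block_delta" then
        if data.delta and data.delta.type == "text_delta" then
            state.text = state.text .. data.delta.text
        end
    elseif event_name == "message_delta" then
        -- output_tokens are finalized near the end of the stream
        state.usage.output_tokens = data.usage and data.usage.output_tokens
    elseif event_name == "message_stop" then
        state.done = true
    end
    return state
end
```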
- Remove redundant newline separators between `sse.encode()` calls in the streaming deny response (`sse.encode` already terminates with `\n\n`)
- Fix log message: 'response' -> 'request' in the `convert_request()` warning
- Inject `stream_options.include_usage = true` when converting Anthropic streaming requests to OpenAI format, ensuring providers report usage
The `prepare_request` interface was doing two things:

1. Extracting the model from the request body (identical across all protocols)
2. Injecting `stream_options.include_usage` for OpenAI (only in `openai-chat`)

Problem: `stream_options` injection happened on the *client* protocol, but it is a requirement of the *target* protocol (OpenAI). When a converter bridged Anthropic→OpenAI, the `openai-chat` `prepare_request` was never called, so `stream_options` was missing from converted streaming requests.

Fix:

- Remove `prepare_request` from all protocol modules (`openai-chat`, `openai-embeddings`, `anthropic-messages`) — the model extraction is now inlined at the call site in `ai-proxy/base.lua`
- Add `prepare_outgoing_request(body)` to `openai-chat` that injects `stream_options` when streaming
- Call `prepare_outgoing_request` in `build_request()` (`ai-providers/base.lua`) *after* protocol conversion, using the target protocol module
- Store `ctx.ai_target_protocol` in `ai-proxy/base.lua` for this purpose

This ensures `stream_options.include_usage = true` is injected for all scenarios:

- Passthrough: OpenAI client → OpenAI provider
- Convert: Anthropic client → OpenAI provider (via converter)
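A minimal sketch of the new target-protocol hook, assuming the body is already in OpenAI Chat shape when it runs (the function name matches the commit's description; the body below is illustrative, not the actual `openai-chat.lua` source):

```lua
-- Hypothetical sketch: runs on the outgoing (target-protocol) body after
-- any converter has done its work, so it applies equally to passthrough
-- and converted requests.
local function prepare_outgoing_request(body)
    if body.stream then
        body.stream_options = body.stream_options or {}
        -- ask the provider to emit a usage chunk at the end of the stream
        body.stream_options.include_usage = true
    end
    return body
end
```

Because it is invoked after conversion via the module stored in `ctx.ai_target_protocol`, an Anthropic client request that has been converted to OpenAI Chat gets the same injection as a native OpenAI request.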
Move `require('apisix.plugins.ai-protocols')` to a module-level local to avoid the 'getting the Lua global require' lint error.
…flict

The APISIX test framework maps `location /v1/` to `http_control()`, which intercepts all requests starting with `/v1/` before they reach the APISIX router. This caused 404s for test requests to `/v1/messages`. Changed the test URIs to `/anything/v1/messages`, which avoids the conflict while still matching the Anthropic protocol detection (URI suffix check).
…ocation

The APISIX test framework hardcodes `location /v1/` for the control API, which intercepts all `/v1/*` requests before APISIX routing. This breaks tests that use `/v1/messages` for Anthropic protocol detection. Add `TEST_ENABLE_CONTROL_API_V1` env var support (matching the gateway's `t/APISIX.pm`) to conditionally disable the `/v1/` location block. Restore the test file to use `/v1/messages` URIs (matching the gateway exactly).
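The URI suffix check mentioned here is what lets both `/v1/messages` and `/anything/v1/messages` match the Anthropic protocol. A hypothetical sketch of such a detection helper (the function name is illustrative; the real adapter also requires a valid JSON body):

```lua
-- Hypothetical helper: match any URI whose path ends in /v1/messages,
-- ignoring a query string.
local function is_anthropic_messages_uri(uri)
    local path = uri:match("^[^?]*")
    return path:sub(-12) == "/v1/messages"
end
```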
Description
Backport native Anthropic Messages API (`/v1/messages`) support from the enterprise gateway to open-source APISIX.

What this PR does

Native Anthropic Messages protocol adapter (`ai-protocols/anthropic-messages.lua`)

- URI-based detection (`/v1/messages`) + valid JSON body
- Native Anthropic SSE event parsing (`message_start`, `content_block_delta`, `message_delta`, `message_stop`)

Bidirectional Anthropic↔OpenAI converter (`ai-protocols/converters/anthropic-messages-to-openai-chat.lua`)

- Handles edge cases: `finish_reason` as JSON null, `usage: null` chunks, empty SSE frames, role+content in the first chunk

Protocol registration

- Register `anthropic-messages` in `ai-protocols/init.lua` (detection order: before `openai-chat`)
- Register the converter in `ai-protocols/converters/init.lua`
- Add the `anthropic-messages` capability to `ai-providers/anthropic.lua`

Refactored `prepare_request` → `prepare_outgoing_request`

- Removed `prepare_request` from all protocol modules (it was called on the client protocol, but `stream_options` injection is a target-protocol concern)
- Model extraction is now inlined in `ai-proxy/base.lua`
- Added `prepare_outgoing_request(body)` to `openai-chat.lua` for `stream_options.include_usage = true` injection
- Called from `build_request()` in `ai-providers/base.lua` AFTER conversion, using the target protocol module
- `stream_options` is correctly injected for both passthrough (OpenAI→OpenAI) and convert (Anthropic→OpenAI) scenarios

Test framework enhancement (`t/APISIX.pm`)

- Add `TEST_ENABLE_CONTROL_API_V1` env var support to conditionally disable the `/v1/` control API location block
- Allows tests using the `/v1/messages` URI to work without the control API intercepting requests

Comprehensive test suite (`t/plugin/ai-proxy-protocol-conversion.t`)

Files changed

New files:

- `apisix/plugins/ai-protocols/anthropic-messages.lua`
- `apisix/plugins/ai-protocols/converters/anthropic-messages-to-openai-chat.lua`
- `t/plugin/ai-proxy-protocol-conversion.t`

Modified files:

- `apisix/plugins/ai-protocols/init.lua`: register anthropic-messages
- `apisix/plugins/ai-protocols/converters/init.lua`: register converter
- `apisix/plugins/ai-providers/anthropic.lua`: add capability
- `apisix/plugins/ai-protocols/openai-chat.lua`: replace `prepare_request` with `prepare_outgoing_request`
- `apisix/plugins/ai-protocols/openai-embeddings.lua`: remove `prepare_request`
- `apisix/plugins/ai-providers/base.lua`: add `prepare_outgoing_request` call after conversion
- `apisix/plugins/ai-proxy/base.lua`: inline model extraction, store `ctx.ai_target_protocol`
- `t/APISIX.pm`: add `TEST_ENABLE_CONTROL_API_V1` env var support
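As a rough illustration of the streaming direction OpenAI → Anthropic handled by the converter: each OpenAI `chat.completion.chunk` delta can be re-emitted as an Anthropic `content_block_delta` event, with `finish_reason` mapped to `stop_reason`. This is a hypothetical sketch; the helper name, the stop-reason map, and the decision to handle `message_start`/`message_stop` framing elsewhere are assumptions based on the two public wire formats:

```lua
-- Hypothetical sketch: translate one decoded OpenAI streaming chunk into an
-- Anthropic stream event name plus payload (framing events elided).
local function chunk_to_anthropic_event(chunk)
    local choice = chunk.choices and chunk.choices[1]
    if choice and choice.delta and choice.delta.content then
        return "content_block_delta", {
            type = "content_block_delta",
            index = 0,
            delta = { type = "text_delta", text = choice.delta.content },
        }
    end
    if choice and choice.finish_reason then
        -- assumed mapping between the two protocols' stop vocabularies
        local map = { stop = "end_turn", length = "max_tokens", tool_calls = "tool_use" }
        return "message_delta", {
            type = "message_delta",
            delta = { stop_reason = map[choice.finish_reason] or choice.finish_reason },
        }
    end
    return nil -- e.g. usage-only or empty chunks, handled separately
end
```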