diff --git a/CHANGELOG.md b/CHANGELOG.md index 16e6a3b..2725846 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -3,17 +3,64 @@ Inspired from [Keep a Changelog](https://keepachangelog.com/en/1.0.0/) ## [Unreleased] +### Added + +### Fixed +### Removed + +## [Released 0.9.0] ### Added +- Add Search Relevance Workbench tools for query set management (add, get, delete) ([#187](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/187)) +- Add Search Relevance Workbench tools for judgment list management (create, get, delete) ([#190](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/190)) +- Add Search Relevance Workbench tools for experiment management (create, get, delete) ([#192](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/192)) +- Add Search Relevance Workbench `_search` API tools for querying query sets, search configurations, judgments, and experiments using OpenSearch query DSL ([#193](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/193)) + +### Improved + +- Enhanced SearchIndexTool descriptions to improve LLM query construction consistency ([#194](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/194)) +- Added Bearer Authorization support when header authentication is enabled ([#189](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/189)) +- Sanitized write-disabled error message in GenericOpenSearchApiTool to avoid exposing internal configuration details ([#196](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/196)) + +## [Released 0.8.0] +### Added +- Add new toolset for the OpenSearch Agentic Memory API: `CreateAgenticMemorySessionTool`, `AddAgenticMemoriesTool`, `GetAgenticMemoryTool`, `UpdateAgenticMemoryTool`, `DeleteAgenticMemoryByIDTool`, `DeleteAgenticMemoryByQueryTool`, and `SearchAgenticMemoryTool`. 
Agentic memory tools are disabled by default and automatically enabled when `memory_container_id` is configured via the `agentic_memory` config section or the `OPENSEARCH_MEMORY_CONTAINER_ID` environment variable. The `memory_container_id` is automatically pre-filled in all tool calls so agents don't need to pass it manually. ([#138](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/138))
+- Add support for setting default values for tool arguments via config ([#138](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/138))
+- Optimize JSON output token usage by removing formatting whitespace across all tools ([#167](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/167))
+- Rename SearchIndexTool parameter `query` to `query_dsl` to avoid confusion with nested query objects ([#172](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/172))
+- Add structured JSON logging (`--log-format json`) for monitoring and metrics ([#178](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/178))
+
+### Fixed
+
+- Fix SearchIndexTool `AttributeError` after the `query` to `query_dsl` rename ([#176](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/176))
+
+### Dependencies
+
+- Bump `aiohttp` from 3.11.18 to 3.13.3 ([#175](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/175))
+- Bump `mcp` from 1.9.4 to 1.23.0 ([#180](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/180))
+- Bump `starlette` from 0.46.2 to 0.49.1 ([#181](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/181))
+- Bump `h11` from 0.14.0 to 0.16.0 ([#182](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/182))
+- Bump `requests` from 2.32.3 to 2.32.4 ([#183](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/183))
+- Bump `python-multipart` from 0.0.20 to 0.0.22 ([#173](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/173))
+- Bump `urllib3` from 2.4.0 to 2.6.3 ([#174](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/174))
+
+### Removed
+
+## [Released 0.7.0]
+### Added
+
+- Support basic auth through header in HTTP transport ([#152](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/152))
+- Add Search Relevance Workbench tools for search configuration management (add, get, delete) ([#171](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/171))
### Fixed
- Fix _fallback_perform_request using wrong url ([#157](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/157))
- Fix search index tool time format issue. ([#159](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/159))
### Removed
+- Remove `CreateAgenticMemoryContainerTool` - container creation is an infrastructure setup operation that should be done via the OpenSearch API or dashboard, not at agent runtime ([#138](https://github.com/opensearch-project/opensearch-mcp-server-py/pull/138))
## [Released 0.6.1]
### Added
diff --git a/README.md b/README.md
index a7cecc3..8516eba 100644
--- a/README.md
+++ b/README.md
@@ -61,6 +61,41 @@ The following tools are available but disabled by default. To enable them, see t
- [GetAllocationTool](https://docs.opensearch.org/latest/api-reference/cat/cat-allocation/): Gets information about shard allocation across nodes in the cluster from the /\_cat/allocation endpoint.
- [GetLongRunningTasksTool](https://docs.opensearch.org/latest/api-reference/cat/cat-tasks/): Gets information about long-running tasks in the cluster, sorted by running time in descending order.
+
+### Agentic Memory Tools (Disabled by Default)
+The following tools provide AI agents with persistent memory capabilities using the [OpenSearch Agentic Memory API](https://docs.opensearch.org/latest/ml-commons-plugin/agentic-memory/). 
These tools require OpenSearch **3.3.0 or later** and are **disabled by default**. They are automatically enabled when a `memory_container_id` is configured via the `agentic_memory` section in the config file or the `OPENSEARCH_MEMORY_CONTAINER_ID` environment variable. See [Agentic Memory Usage](USER_GUIDE.md#agentic-memory-usage) in the User Guide for setup instructions. + +**Note:** Container creation is an infrastructure setup operation that requires careful configuration of embedding models, LLM connectors, strategies, and index settings. Create your memory container using the [OpenSearch API](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/create-memory-container/) or dashboard before configuring the MCP server. + +- [CreateAgenticMemorySessionTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/create-session/): Creates a new session within a memory container. +- [AddAgenticMemoriesTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/add-memory/): Adds conversational or structured data memories to a container. +- [GetAgenticMemoryTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/get-memory/): Retrieves a specific memory by its ID and type. +- [SearchAgenticMemoryTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/search-memory/): Searches for memories using OpenSearch Query DSL. +- [UpdateAgenticMemoryTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/update-memory/): Updates an existing memory (supports specific fields based on memory type). +- [DeleteAgenticMemoryByIDTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/delete-memory/): Deletes a specific memory by its ID. 
+- [DeleteAgenticMemoryByQueryTool](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/delete-memory/): Deletes multiple memories matching query criteria.
+
+### Search Relevance Workbench Tools (Disabled by Default)
+Search Relevance Workbench tools are grouped under the `search_relevance` category. They can all be enabled at once by setting `OPENSEARCH_ENABLED_CATEGORIES=search_relevance` or by adding `enabled_categories: [search_relevance]` to the config file, or individual tools can be enabled explicitly in the config file. See the [Tool Filter](USER_GUIDE.md#tool-filter) section in the User Guide for additional information about how to filter tools.
+
+- [CreateSearchConfigurationTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/search-configurations/#creating-search-configurations): Creates a search configuration consisting of a name, a query body (a query in OpenSearch query domain-specific language), and the target index.
+- [GetSearchConfigurationTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/search-configurations/#retrieve-search-configurations): Retrieves a search configuration by ID.
+- [DeleteSearchConfigurationTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/search-configurations/#delete-a-search-configuration): Deletes a search configuration by ID.
+- [CreateQuerySetTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/query-sets/#example-request-uploading-a-query-set-manually): Creates a query set consisting of a name, a description, and a list of queries.
+- [SampleQuerySetTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/query-sets/#creating-query-sets): Samples a query set based on UBI data with different statistical sampling techniques.
+- [GetQuerySetTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/query-sets/#retrieve-query-sets): Retrieves a query set by ID. 
+- [DeleteQuerySetTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/query-sets/#delete-a-query-set): Deletes a query set by ID. +- [CreateJudgmentListTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/judgments/#importing-judgments): Creates a judgment list with judgments originating from an external process. +- [CreateLLMJudgmentListTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/judgments/#creating-ai-assisted-judgments): Creates a judgment list by using an LLM. +- [CreateUBIJudgmentListTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/judgments/#implicit-judgments): Creates a judgment list based on implicit feedback (User Behavior Insights data). +- [GetJudgmentListTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/judgments/): Retrieves a judgment list by ID. +- [DeleteJudgmentListTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/judgments/): Deletes a judgment list by ID. +- [CreateExperimentTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/experiments/): Creates a search relevance experiment. Supports PAIRWISE_COMPARISON (compares 2 search configurations), POINTWISE_EVALUATION (evaluates 1 configuration against judgment lists), and HYBRID_OPTIMIZER (optimizes 1 configuration using judgment lists). +- [GetExperimentTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/experiments/): Retrieves an experiment by ID. +- [DeleteExperimentTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/experiments/): Deletes an experiment by ID. +- [SearchQuerySetsTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/query-sets/): Searches query sets using OpenSearch query DSL. Defaults to match_all if no query body is provided. 
+- [SearchSearchConfigurationsTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/search-configurations/): Searches search configurations using OpenSearch query DSL. Defaults to match_all if no query body is provided. +- [SearchJudgmentsTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/judgments/): Searches judgments using OpenSearch query DSL. Defaults to match_all if no query body is provided. +- [SearchExperimentsTool](https://docs.opensearch.org/latest/search-plugins/search-relevance/experiments/): Searches experiments using OpenSearch query DSL. Defaults to match_all if no query body is provided. + ### Skills Tools (Enabled by Default) Advanced analysis tools for data analysis and troubleshooting. @@ -84,7 +119,7 @@ Advanced analysis tools for data analysis and troubleshooting. - `opensearch_url` (optional): The OpenSearch cluster URL to connect to - `index` (required): The name of the index to search in - - `query` (required): The search query in OpenSearch Query DSL format + - `query_dsl` (required): The search query in OpenSearch Query DSL format - `format` (optional): The format of SearchIndexTool response. options are csv and json - `size` (optional): The size of SearchIndexTool response. Default is 10, maximum is 100 (configurable). To change the maximum limit, set `max_size_limit` via CLI arguments or config file. See [Tool Customization](USER_GUIDE.md#tool-customization) for details. @@ -164,6 +199,71 @@ Advanced analysis tools for data analysis and troubleshooting. - `opensearch_url` (optional): The OpenSearch cluster URL to connect to - `limit` (optional): The maximum number of tasks to return. Default is 10. +- **CreateAgenticMemorySessionTool** + + - `memory_container_id` (auto-populated): The ID of the memory container where the session will be created. Automatically set when configured via `agentic_memory` config or `OPENSEARCH_MEMORY_CONTAINER_ID` env var. 
*(Path Parameter)*
+ - `session_id` (optional): A custom session ID. If not provided, a random ID is generated. *(Body Parameter)*
+ - `summary` (optional): A session summary or description. *(Body Parameter)*
+ - `metadata` (optional): Additional metadata for the session provided as key-value pairs. *(Body Parameter)*
+ - `namespace` (optional): Namespace information for organizing the session. *(Body Parameter)*
+
+- **AddAgenticMemoriesTool**
+
+ - `memory_container_id` (auto-populated): The ID of the memory container to add the memory to. Automatically set when configured. *(Path Parameter)*
+ - `messages` (conditional): A list of messages. Required when `payload_type` is `conversational`. *(Body Parameter)*
+ - `structured_data` (conditional): Structured data content. Required when `payload_type` is `data`. *(Body Parameter)*
+ - `binary_data` (optional): Binary data content encoded as a Base64 string for binary payloads. *(Body Parameter)*
+ - `payload_type` (required): The type of payload. Valid values are `conversational` or `data`. See [Payload types](https://docs.opensearch.org/latest/ml-commons-plugin/agentic-memory/#payload-types). *(Body Parameter)*
+ - `namespace` (optional): The [namespace](https://docs.opensearch.org/latest/ml-commons-plugin/agentic-memory/#namespaces) context for organizing memories (for example, `user_id`, `session_id`, or `agent_id`). If `session_id` is not specified in the `namespace` field and `disable_session` is `false` (the default is `true`), a new session with a new session ID is created. *(Body Parameter)*
+ - `metadata` (optional): Additional metadata for the memory (for example, `status`, `branch`, or custom fields). *(Body Parameter)*
+ - `tags` (optional): Tags for categorizing memories. *(Body Parameter)*
+ - `infer` (optional): Whether to use an LLM to extract key information (default: `false`). When `true`, the LLM extracts key information from the original text and stores it as a memory. 
See [Inference mode](https://docs.opensearch.org/latest/ml-commons-plugin/agentic-memory/#inference-mode). *(Body Parameter)*
+
+- **GetAgenticMemoryTool**
+
+ - `memory_container_id` (auto-populated): The ID of the memory container from which to retrieve the memory. Automatically set when configured. *(Path Parameter)*
+ - `type` (required): The memory type. Valid values are `sessions`, `working`, `long-term`, and `history`. *(Path Parameter)*
+ - `id` (required): The ID of the memory to retrieve. *(Path Parameter)*
+
+- **SearchAgenticMemoryTool**
+
+ - `memory_container_id` (auto-populated): The ID of the memory container. Automatically set when configured. *(Path Parameter)*
+ - `type` (required): The memory type. Valid values are `sessions`, `working`, `long-term`, and `history`. *(Path Parameter)*
+ - `query` (required): The search query using OpenSearch [query DSL](https://docs.opensearch.org/latest/query-dsl/). *(Body Parameter)*
+ - `sort` (optional): Sort specification for the search results. *(Body Parameter)*
+
+- **UpdateAgenticMemoryTool**
+
+ - `memory_container_id` (auto-populated): The ID of the memory container. Automatically set when configured. *(Path Parameter)*
+ - `type` (required): The memory type (`sessions`, `working`, or `long-term`). *(Path Parameter)*
+ - `id` (required): The ID of the memory to update. *(Path Parameter)*
+ - **Session memory request fields:**
+ - `summary` (optional): The summary of the session. *(Body Parameter)*
+ - `metadata` (optional): Additional metadata for the memory (for example, `status`, `branch`, or custom fields). *(Body Parameter)*
+ - `agents` (optional): Additional information about the agents. *(Body Parameter)*
+ - `additional_info` (optional): Additional metadata to associate with the session. *(Body Parameter)*
+ - **Working memory request fields:**
+ - `messages` (optional): Updated conversation messages (for conversation type). 
*(Body Parameter)*
+ - `structured_data` (optional): Updated structured data content (for data memory payloads). *(Body Parameter)*
+ - `binary_data` (optional): Updated binary data content (for data memory payloads). *(Body Parameter)*
+ - `tags` (optional): Updated tags for categorization. *(Body Parameter)*
+ - `metadata` (optional): Additional metadata for the memory (for example, `status`, `branch`, or custom fields). *(Body Parameter)*
+ - **Long-term memory request fields:**
+ - `memory` (optional): The updated memory content. *(Body Parameter)*
+ - `tags` (optional): Updated tags for categorization. *(Body Parameter)*
+
+- **DeleteAgenticMemoryByIDTool**
+
+ - `memory_container_id` (auto-populated): The ID of the memory container from which to delete the memory. Automatically set when configured. *(Path Parameter)*
+ - `type` (required): The type of memory to delete. Valid values are `sessions`, `working`, `long-term`, and `history`. *(Path Parameter)*
+ - `id` (required): The ID of the specific memory to delete. *(Path Parameter)*
+
+- **DeleteAgenticMemoryByQueryTool**
+
+ - `memory_container_id` (auto-populated): The ID of the memory container from which to delete memories. Automatically set when configured. *(Path Parameter)*
+ - `type` (required): The type of memory to delete. Valid values are `sessions`, `working`, `long-term`, and `history`. *(Path Parameter)*
+ - `query` (required): The OpenSearch [DSL query](https://docs.opensearch.org/latest/query-dsl/) to match memories for deletion. *(Body Parameter)*
+
- **DataDistributionTool**
- `index` (required): Target OpenSearch index name. 
diff --git a/USER_GUIDE.md b/USER_GUIDE.md
index e32abdb..f54209b 100644
--- a/USER_GUIDE.md
+++ b/USER_GUIDE.md
@@ -11,6 +11,8 @@
- [Running the Server](#running-the-server)
- [Tool Filter](#tool-filter)
- [Tool Customization](#tool-customization)
+- [Agentic Memory Usage](#agentic-memory-usage)
+- [Structured Logging](#structured-logging)
- [LangChain Integration](#langchain-integration)

## Overview
@@ -200,7 +202,7 @@ The LLM needs to have context about the available cluster names to make informed
{
"opensearch_cluster_name": "production",
"index": "users",
- "query": {
+ "query_dsl": {
"match": {
"status": "active"
}
@@ -256,6 +258,10 @@ For Basic authentication:
- `Authorization`: HTTP Basic authentication header (format: `Basic <credentials>`)
- Example: `Authorization: Basic YWRtaW46cGFzc3dvcmQ=` (where `YWRtaW46cGFzc3dvcmQ=` is base64-encoded `admin:password`)
+For Bearer authentication:
+
+- `Authorization`: HTTP Bearer authentication header (format: `Bearer <token>`)
+
**Note:** When `OPENSEARCH_HEADER_AUTH=true` (single mode) or `opensearch_header_auth: true` (multi mode), headers take priority over environment variables or cluster configuration values. If a header is not provided, the system falls back to the corresponding environment variable (single mode) or cluster configuration value (multi mode). 
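For reference, the `Authorization` header value can be produced in a couple of lines of Python. This is a sketch matching the `admin:password` example above; substitute your own credentials:

```python
import base64

# Build the value of a Basic Authorization header from "username:password".
credentials = base64.b64encode(b'admin:password').decode('ascii')
header = f'Basic {credentials}'
print(header)  # Basic YWRtaW46cGFzc3dvcmQ=
```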
#### IAM Role Authentication
@@ -422,6 +428,7 @@ python -m mcp_server_opensearch --mode multi
| `--mode` | string | `single` | Server mode: `single` or `multi` |
| `--profile` | string | `''` | AWS profile to use for OpenSearch connection |
| `--config` | string | `''` | Path to a YAML configuration file |
+| `--log-format` | string | `text` | Log output format: `text` (human-readable) or `json` (structured) |

## Environment Variables
@@ -468,6 +475,17 @@ python -m mcp_server_opensearch --mode multi
| `OPENSEARCH_DISABLED_TOOLS_REGEX` | No | `''` | Comma-separated list of regex patterns for disabled tools |
| `OPENSEARCH_SETTINGS_ALLOW_WRITE` | No | `"true"` | Enable/disable write operations (`"true"` or `"false"`) |
+
+### Agentic Memory Variables
+
+| Variable | Required | Default | Description |
+|----------|----------|---------|-------------|
+| `OPENSEARCH_MEMORY_CONTAINER_ID` | No | `''` | Memory container ID for agentic memory tools. When set, agentic memory tools are automatically enabled and `memory_container_id` is pre-filled in all tool calls. Config file `agentic_memory.memory_container_id` takes precedence over this variable. |
+
+### Logging & Monitoring Variables
+
+| Variable | Required | Default | Description |
+|----------|----------|---------|-------------|
+| `OPENSEARCH_MEMORY_MONITOR_INTERVAL` | No | `60` | Interval in seconds between memory usage snapshots (only active with `--log-format json`) |
+
*Required in single mode or when not using multi-mode config file

## Multi-Mode Cluster Configuration
@@ -679,6 +697,244 @@ When the response size limit is exceeded, you'll see an error like:
Response size exceeded limit of 10485760 bytes. Stopped reading at 15728640 bytes to prevent memory exhaustion. Consider increasing max_response_size or refining your query to return less data. 
``` +## Agentic Memory Usage + +The Agentic Memory tools allow AI agents to maintain state and long-term memory using [OpenSearch Agentic Memory](https://docs.opensearch.org/latest/ml-commons-plugin/agentic-memory/). +**Note:** These tools require OpenSearch version **3.3.0 or later**. + +### Prerequisites + +Before using the agentic memory tools, you must create a memory container in OpenSearch. Container creation is an infrastructure setup operation that requires careful configuration of embedding models, LLM connectors, strategies, and index settings. This is typically a one-time admin operation, not something agents should do at runtime. + +Create a memory container using the [OpenSearch Create Container API](https://docs.opensearch.org/latest/ml-commons-plugin/api/agentic-memory-apis/create-container/) or the OpenSearch dashboard. Note the `memory_container_id` returned — you will need it for configuration. + +### Enabling Agentic Memory Tools + +Agentic memory tools are **disabled by default**. They are automatically enabled when a `memory_container_id` is configured. There are two ways to configure it: + +#### Option 1: Config File (Recommended) + +Add an `agentic_memory` section to your YAML config file: + +```yaml +agentic_memory: + memory_container_id: "your-container-id-here" +``` + +Then start the server with the config file: +```bash +python -m mcp_server_opensearch --config path/to/config.yml +``` + +#### Option 2: Environment Variable + +Set the `OPENSEARCH_MEMORY_CONTAINER_ID` environment variable: + +```bash +export OPENSEARCH_MEMORY_CONTAINER_ID="your-container-id-here" +``` + +**Priority:** Config file setting takes precedence over the environment variable. + +When configured, the `memory_container_id` is **automatically populated** into all agentic memory tool calls — agents do not need to pass it manually with every request. + +### Workflow + +The typical workflow involves establishing a session and then reading/writing memories. + +#### 1. 
Create a Session +Start a new planning session for a summer trip. + +**Tool:** `CreateAgenticMemorySessionTool` +```json +{ + "session_id": "summer-trip-2025", + "namespace": { + "traveler_id": "adventurous_alice" + }, + "metadata": { + "vibe": "relaxing but fun", + "budget": "medium" + } +} +``` + +#### 2. Add Memories +Store the initial context. Alice shares her first ideas for the trip - quiet museum days and culinary experiences. The agent stores these conversational messages and runs inference (due to `infer: true`) to extract facts and preferences for future recommendations. + +**Tool:** `AddAgenticMemoriesTool` +```json +{ + "payload_type": "conversational", + "namespace": { + "traveler_id": "adventurous_alice", + "session_id": "summer-trip-2025" + }, + "messages": [ + { + "role": "user", + "content": [{"type": "text", "text": "I'm thinking about Italy! I absolutely love gelato. I was also thinking about spending my days walking through quiet museums and art galleries."}] + }, + { + "role": "assistant", + "content": [{"type": "text", "text": "Sounds lovely! Florence would be perfect for art galleries and authentic gelato."}] + } + ], + "infer": true +} +``` + +#### 3. Search Memories +Later, the agent needs to make a restaurant recommendation. It searches the working memory to retrieve Alice's specific food preferences (like her love for gelato) to ensure the suggestion matches her taste. + +**Tool:** `SearchAgenticMemoryTool` +```json +{ + "type": "working", + "query": { + "match": { + "text": "food preferences" + } + }, + "sort": [ + { + "created_time": { + "order": "desc" + } + } + ] +} +``` + +#### 4. Update & Delete +Alice changes her mind. + +1. We update the session summary to reflect the new goal (clubs and grappa). + +2. We delete memories related to "museums". Since Alice decided to ditch the cultural tour for partying, removing the old "museum" context ensures the agent won't mix up the vibe and suggest art exhibitions when she wants bars. 
+ +**Tool:** `UpdateAgenticMemoryTool` +```json +{ + "type": "sessions", + "id": "", + "summary": "Planning a trip to Italy focused on nightlife, clubs, and drinking grappa, while retaining the interest in gelato.", + "metadata": { + "vibe": "nightlife & party", + "budget": "medium" + } +} +``` + +**Tool:** `DeleteAgenticMemoryByQueryTool` +```json +{ + "type": "working", + "query": { + "match": { + "text": "museums art galleries" + } + } +} +``` + +> **Note:** The `memory_container_id` field is omitted from the examples above because it is automatically populated from your configuration. If you need to override it for a specific call, you can still pass it explicitly. +## Structured Logging + +The OpenSearch MCP server supports structured JSON logging for monitoring and metrics. When enabled, every log line is a JSON object with fields that can be directly targeted by metric filters in log aggregation platforms such as Amazon CloudWatch, Datadog, Splunk, Grafana Loki, or the ELK stack. + +### Enabling Structured Logging + +Use the `--log-format json` flag when starting the server: + +```bash +# Stdio server with JSON logging +python -m mcp_server_opensearch --log-format json + +# Streaming server with JSON logging +python -m mcp_server_opensearch --transport stream --log-format json + +# Capture logs to a file for analysis +python -m mcp_server_opensearch --log-format json 2>server.jsonl +``` + +The default (`--log-format text`) produces the same human-readable output as before and is fully backward-compatible. 
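Because each line is a self-contained JSON object, a captured log file is easy to post-process in code as well. As a minimal sketch, assuming logs were captured to `server.jsonl` as shown above, the following averages `duration_ms` per tool across `tool_execution` events:

```python
import json
from collections import defaultdict

def average_tool_durations(path='server.jsonl'):
    """Return {tool_name: mean duration_ms} across tool_execution events."""
    totals, counts = defaultdict(float), defaultdict(int)
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            try:
                event = json.loads(line)
            except json.JSONDecodeError:
                continue  # skip any non-JSON lines mixed into the capture
            if event.get('event_type') == 'tool_execution':
                name = event.get('tool_name', 'unknown')
                totals[name] += event.get('duration_ms', 0.0)
                counts[name] += 1
    return {name: totals[name] / counts[name] for name in totals}
```

The same approach extends to error-rate counting or latency percentiles; the field names (`event_type`, `tool_name`, `duration_ms`) are the ones listed in the event-type table below.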
+ +### Event Types + +The server emits the following structured event types: + +| Event Type | When Emitted | Key Fields | +|-----------|-------------|------------| +| `tool_execution` | Every tool invocation | `tool_name`, `status`, `duration_ms` | +| `tool_error` | Tool failure | `tool_name`, `status_code`, `exception_type`, `root_cause` | +| `opensearch_request` | Every HTTP request to OpenSearch | `http_method`, `endpoint`, `status_code`, `duration_ms` | +| `datasource_connection` | Auth/connection failure | `auth_method`, `datasource_type`, `opensearch_url` | +| `memory_snapshot` | Periodically (default: every 60s) | `memory_rss_mb`, `pid` | + +### Example Log Events + +**Successful tool execution:** +```json +{ + "timestamp": "2026-02-27T19:50:45.687Z", + "level": "INFO", + "logger": "mcp_server_opensearch.tool_executor", + "message": "Tool executed: ListIndexTool (711.81ms)", + "event_type": "tool_execution", + "tool_name": "ListIndexTool", + "status": "success", + "duration_ms": 711.81, + "tool_key": "ListIndexTool" +} +``` + +**Tool error with root cause:** +```json +{ + "timestamp": "2026-02-27T21:02:02.095Z", + "level": "ERROR", + "logger": "tools.tool_logging", + "message": "Tool error: CountTool - executing CountTool (NotFoundError)", + "event_type": "tool_error", + "tool_name": "CountTool", + "exception_type": "NotFoundError", + "status": "error", + "status_code": 404, + "root_cause": "index_not_found_exception" +} +``` + +### Filtering Log Events + +Use `jq` to filter structured events from the log file: + +```bash +# Show all structured events +cat server.jsonl | jq 'select(.event_type)' + +# Show only tool errors +cat server.jsonl | jq 'select(.event_type == "tool_error")' + +# Show failed tool executions +cat server.jsonl | jq 'select(.event_type == "tool_execution" and .status == "error")' + +# Show 403 errors +cat server.jsonl | jq 'select(.status_code == 403)' +``` + +### Memory Monitor + +When structured logging is enabled, the server 
periodically logs process memory usage as `memory_snapshot` events. The interval can be configured via the `OPENSEARCH_MEMORY_MONITOR_INTERVAL` environment variable (default: 60 seconds). + +```bash +# Set memory snapshot interval to 30 seconds +export OPENSEARCH_MEMORY_MONITOR_INTERVAL=30 +python -m mcp_server_opensearch --log-format json +``` + +For a complete RFC with detailed field references, metric filter patterns, and integration test results, see [RFC: Structured JSON Logging for Monitoring and Metrics](https://github.com/opensearch-project/opensearch-mcp-server-py/issues/177). + ## LangChain Integration The OpenSearch MCP server can be easily integrated with LangChain using the SSE server transport. diff --git a/example_config.yml b/example_config.yml index 803456e..831dbc8 100644 --- a/example_config.yml +++ b/example_config.yml @@ -1,7 +1,9 @@ version: "1.0" description: "Unified OpenSearch MCP Server Configuration" +# ============================================================================== # Cluster configurations (used in Multi Mode) +# ============================================================================== clusters: local-cluster: opensearch_url: "http://localhost:9200" @@ -34,7 +36,21 @@ clusters: opensearch_url: "https://your-opensearch-domain.us-east-2.es.amazonaws.com" opensearch_header_auth: true -Tool customization configurations (supported in both Single and Multi Mode) +# ============================================================================== +# Agentic Memory Configuration (supported in both Single and Multi Mode) +# ============================================================================== +# Configure the memory container ID for agentic memory tools. 
+# When set, agentic memory tools will be automatically enabled +# +# Priority: Config file setting > OPENSEARCH_MEMORY_CONTAINER_ID env var +# Example configuration: +# +# agentic_memory: +# memory_container_id: "your-container-id-here" + +# ============================================================================== +# Tool customization configurations (supported in both Single and Multi Mode) +# ============================================================================== tools: ListIndexTool: display_name: "My_Custom_Index_Lister" @@ -45,7 +61,7 @@ tools: display_name: "Super_Searcher" args: index: "Custom description for the 'index' argument in SearchIndexTool." - query: "Explain what kind of query JSON is expected here." + query_dsl: "Explain what kind of query DSL JSON is expected here." GetShardsTool: description: "A better description to get information about shards in OpenSearch." args: diff --git a/pyproject.toml b/pyproject.toml index e62fcd7..e96f9d9 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [project] name = "opensearch-mcp-server-py" -version = "0.7.0" +version = "0.9.0" description = "OpenSearch MCP Server" readme = "README.md" requires-python = ">=3.10" @@ -8,24 +8,31 @@ dependencies = [ "aiohttp>=3.11.18", "boto3>=1.38.3", "mcp[cli]>=1.9.4", - "opensearch-py==2.8.0", + "opensearch-py==3.1.0", "pydantic>=2.11.3", "pyyaml>=6.0.2", "requests-aws4auth>=1.3.1", "semver>=3.0.4", + "pyjwt>=2.12.0", ] license = "Apache-2.0" -license-files = ["LICENSE", "NOTICE" ] +license-files = ["LICENSE.txt", "NOTICE.txt"] [dependency-groups] dev = [ + "anthropic>=0.50.0", "pytest>=8.3.5", "pytest-asyncio>=0.26.0", "pytest-cov>=6.2.1", "ruff>=0.9.7", ] +[tool.pytest.ini_options] +markers = [ + "eval: marks tests as LLM eval tests that call the Anthropic API (run with --run-evals)", +] + [build-system] requires = ["setuptools >= 77.0.3"] build-backend = "setuptools.build_meta" diff --git a/src/mcp_server_opensearch/__init__.py 
b/src/mcp_server_opensearch/__init__.py index f800589..6083a96 100644 --- a/src/mcp_server_opensearch/__init__.py +++ b/src/mcp_server_opensearch/__init__.py @@ -74,14 +74,20 @@ def main() -> None: action='store_true', help='Enable debug logging', ) + parser.add_argument( + '--log-format', + choices=['text', 'json'], + default='text', + help='Log output format: text (default, human-readable) or json (structured)', + ) args, unknown = parser.parse_known_args() - # Configure logging with appropriate level + # Configure logging with appropriate level and format + from .logging_config import configure_logging + log_level = logging.DEBUG if args.debug else logging.INFO - logging.basicConfig( - level=log_level, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s' - ) + configure_logging(level=log_level, log_format=args.log_format) logger = logging.getLogger(__name__) logger.info('Starting MCP server...') diff --git a/src/mcp_server_opensearch/logging_config.py b/src/mcp_server_opensearch/logging_config.py new file mode 100644 index 0000000..ec1cd29 --- /dev/null +++ b/src/mcp_server_opensearch/logging_config.py @@ -0,0 +1,161 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +""" +Structured logging configuration for metric extraction. + +Provides a JSON formatter that outputs one JSON object per log line, making +log events directly targetable by metric filters. Extra fields passed via +logger.info("msg", extra={...}) are merged into the top-level JSON object. +""" + +import asyncio +import json +import logging +import os +from datetime import datetime, timezone + + +# Attributes that exist on every LogRecord by default. +# We compute this once at import time to avoid per-record overhead. +_STANDARD_LOG_RECORD_ATTRS = frozenset( + logging.LogRecord('', 0, '', 0, '', (), None).__dict__.keys() +) | {'message', 'asctime'} + + +class JsonFormatter(logging.Formatter): + """Minimal JSON formatter for structured logging. 
+ + Produces one JSON object per line on stderr. Extra fields attached + to the LogRecord (via logger.info("msg", extra={...})) are merged + into the top-level object, making them directly targetable by + metric filters (e.g., { $.event_type = "tool_execution" }). + """ + + def format(self, record: logging.LogRecord) -> str: + log_entry = { + 'timestamp': datetime.fromtimestamp(record.created, tz=timezone.utc).strftime( + '%Y-%m-%dT%H:%M:%S.%f' + )[:-3] + + 'Z', + 'level': record.levelname, + 'logger': record.name, + 'message': record.getMessage(), + } + + # Merge extra fields into the top-level object. + for key, value in record.__dict__.items(): + if key not in _STANDARD_LOG_RECORD_ATTRS and not key.startswith('_'): + log_entry[key] = value + + if record.exc_info and record.exc_info[0] is not None: + log_entry['exception'] = self.formatException(record.exc_info) + + return json.dumps(log_entry, default=str) + + +def configure_logging(level: int = logging.INFO, log_format: str = 'text') -> None: + """Configure the root logger for the MCP server. + + Args: + level: Logging level (DEBUG, INFO, etc.) + log_format: "text" for human-readable (default, backward-compatible) + or "json" for structured logging. + """ + root_logger = logging.getLogger() + root_logger.setLevel(level) + + # Remove any existing handlers to avoid duplicate output. 
+ for handler in root_logger.handlers[:]: + root_logger.removeHandler(handler) + + handler = logging.StreamHandler() # defaults to stderr + handler.setLevel(level) + + if log_format == 'json': + handler.setFormatter(JsonFormatter()) + else: + handler.setFormatter( + logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s') + ) + + root_logger.addHandler(handler) + + +_memory_logger = logging.getLogger(__name__ + '.memory') +_async_sleep = asyncio.sleep + + +def _get_rss_mb() -> float: + """Return process memory in MB (current RSS on Linux, peak RSS on macOS, -1 elsewhere).""" + # Linux / containers: read current RSS from procfs + try: + with open('/proc/self/statm') as f: + pages = int(f.read().split()[1]) # 2nd field = RSS in pages + return round(pages * os.sysconf('SC_PAGE_SIZE') / (1024 * 1024), 2) + except (OSError, IndexError, ValueError): + pass + + # macOS: peak RSS via resource (no procfs available) + try: + import resource + + return round(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / (1024 * 1024), 2) + except ImportError: + pass + + return -1.0 + + +async def memory_monitor(interval_seconds: int = 60) -> None: + """Periodically log process RSS memory usage. + + Emits a structured log event (event_type="memory_snapshot") with + memory_rss_mb and pid fields, enabling CloudWatch metric filters + for memory tracking. + + Args: + interval_seconds: Seconds between snapshots (default: 60). + """ + interval_seconds = max(interval_seconds, 1) + pid = os.getpid() + while True: + await _async_sleep(interval_seconds) + try: + rss_mb = _get_rss_mb() + _memory_logger.info( + f'Memory snapshot: {rss_mb} MB (pid={pid})', + extra={ + 'event_type': 'memory_snapshot', + 'memory_rss_mb': rss_mb, + 'pid': pid, + }, + ) + except Exception as e: + _memory_logger.warning(f'Memory monitor iteration failed: {e}') + + +def start_memory_monitor(interval_seconds: int | None = None) -> asyncio.Task: + """Start the memory monitor as a background asyncio task. 
+ + Args: + interval_seconds: Seconds between snapshots. If not provided, + reads from OPENSEARCH_MEMORY_MONITOR_INTERVAL env var (default: 60). + + Returns: + The asyncio.Task running the monitor. + """ + if interval_seconds is None: + interval_seconds = int(os.environ.get('OPENSEARCH_MEMORY_MONITOR_INTERVAL', '60')) + task = asyncio.create_task(memory_monitor(interval_seconds)) + task.add_done_callback(_handle_monitor_error) + return task + + +def _handle_monitor_error(task: asyncio.Task) -> None: + """Log any unexpected errors from the memory monitor task.""" + if task.cancelled(): + return + exc = task.exception() + if exc is not None: + _memory_logger.error(f'MCP Memory monitor crashed: {exc}') diff --git a/src/mcp_server_opensearch/stdio_server.py b/src/mcp_server_opensearch/stdio_server.py index 7e0362d..3dc692f 100644 --- a/src/mcp_server_opensearch/stdio_server.py +++ b/src/mcp_server_opensearch/stdio_server.py @@ -1,6 +1,7 @@ # Copyright OpenSearch Contributors # SPDX-License-Identifier: Apache-2.0 +import asyncio import logging from mcp.server import Server from mcp.server.stdio import stdio_server @@ -18,7 +19,7 @@ async def serve( mode: str = 'single', profile: str = '', config_file_path: str = '', - cli_tool_overrides: dict = None, + cli_tool_overrides: dict | None = None, ) -> None: # Set the global mode set_mode(mode) @@ -38,6 +39,9 @@ async def serve( # Call tool generator await generate_tools_from_openapi() + # Register agentic memory tools if memory_container_id is configured + from tools.tools import register_agentic_memory_tools + register_agentic_memory_tools(config_file_path) # Apply custom tool config (custom name and description) customized_registry = apply_custom_tool_config( TOOL_REGISTRY, config_file_path, cli_tool_overrides or {} @@ -63,23 +67,21 @@ async def list_tools() -> list[Tool]: @server.call_tool() async def call_tool(name: str, arguments: dict) -> list[TextContent]: - # Find the tool by its display name, which is what the 
client sees - found_tool_key = None - for key, tool_info in enabled_tools.items(): - if tool_info.get('display_name', key) == name: - found_tool_key = key - break + from mcp_server_opensearch.tool_executor import execute_tool - if not found_tool_key: - raise ValueError(f'Unknown or disabled tool: {name}') - - tool = enabled_tools.get(found_tool_key) - from tools.tool_params import validate_args_for_mode - - parsed = validate_args_for_mode(arguments, tool['args_model']) - return await tool['function'](parsed) + return await execute_tool(name, arguments, enabled_tools) # Start stdio-based MCP server + from mcp_server_opensearch.logging_config import start_memory_monitor + options = server.create_initialization_options() async with stdio_server() as (reader, writer): - await server.run(reader, writer, options, raise_exceptions=True) + monitor_task = start_memory_monitor() + try: + await server.run(reader, writer, options, raise_exceptions=True) + finally: + monitor_task.cancel() + try: + await monitor_task + except (asyncio.CancelledError, Exception): + pass diff --git a/src/mcp_server_opensearch/streaming_server.py b/src/mcp_server_opensearch/streaming_server.py index 8a627d6..3b02c38 100644 --- a/src/mcp_server_opensearch/streaming_server.py +++ b/src/mcp_server_opensearch/streaming_server.py @@ -1,6 +1,7 @@ # Copyright OpenSearch Contributors # SPDX-License-Identifier: Apache-2.0 +import asyncio import logging import uvicorn import contextlib @@ -26,7 +27,7 @@ async def create_mcp_server( mode: str = 'single', profile: str = '', config_file_path: str = '', - cli_tool_overrides: dict = None, + cli_tool_overrides: dict | None = None, ) -> Server: # Set the global mode set_mode(mode) @@ -46,6 +47,9 @@ async def create_mcp_server( server = Server('opensearch-mcp-server') # Call tool generator await generate_tools_from_openapi() + # Register agentic memory tools if memory_container_id is configured + from tools.tools import register_agentic_memory_tools + 
register_agentic_memory_tools(config_file_path) # Apply custom tool config (custom name and description) customized_registry = apply_custom_tool_config( TOOL_REGISTRY, config_file_path, cli_tool_overrides or {} @@ -71,21 +75,9 @@ async def list_tools() -> list[Tool]: @server.call_tool() async def call_tool(name: str, arguments: dict) -> list[TextContent]: - # Find the tool by its display name, which is what the client sees - found_tool_key = None - for key, tool_info in enabled_tools.items(): - if tool_info.get('display_name', key) == name: - found_tool_key = key - break + from mcp_server_opensearch.tool_executor import execute_tool - if not found_tool_key: - raise ValueError(f'Unknown or disabled tool: {name}') - - tool = enabled_tools.get(found_tool_key) - from tools.tool_params import validate_args_for_mode - - parsed = validate_args_for_mode(arguments, tool['args_model']) - return await tool['function'](parsed) + return await execute_tool(name, arguments, enabled_tools) return server @@ -101,7 +93,7 @@ def __init__(self, mcp_server: Server, stateless: bool = True): stateless=stateless, ) - async def handle_sse(self, request: Request) -> None: + async def handle_sse(self, request: Request) -> Response: async with self.sse.connect_sse( request.scope, request.receive, @@ -125,11 +117,19 @@ async def lifespan(self, app: Starlette) -> AsyncIterator[None]: Context manager for session manager lifecycle. Ensures proper startup and shutdown of the session manager. 
""" + from mcp_server_opensearch.logging_config import start_memory_monitor + async with self.session_manager.run(): logging.info('Application started with StreamableHTTP session manager!') + monitor_task = start_memory_monitor() try: yield finally: + monitor_task.cancel() + try: + await monitor_task + except (asyncio.CancelledError, Exception): + pass logging.info('Application shutting down...') async def handle_streamable_http(self, scope: Scope, receive: Receive, send: Send) -> None: @@ -155,7 +155,7 @@ async def serve( mode: str = 'single', profile: str = '', config_file_path: str = '', - cli_tool_overrides: dict = None, + cli_tool_overrides: dict | None = None, stateless: bool = True, ) -> None: mcp_server = await create_mcp_server(mode, profile, config_file_path, cli_tool_overrides) diff --git a/src/mcp_server_opensearch/tool_executor.py b/src/mcp_server_opensearch/tool_executor.py new file mode 100644 index 0000000..eeff979 --- /dev/null +++ b/src/mcp_server_opensearch/tool_executor.py @@ -0,0 +1,115 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +""" +Shared tool execution with structured logging for metrics. + +Extracts the duplicated call_tool() logic from stdio_server.py and +streaming_server.py into a single function that wraps tool invocation +with timing, error detection, and structured metric event logging. + +Emits a structured log event (event_type="tool_execution") for every +tool invocation, enabling metric filters for: +- Tool execution failure alarms +- Invocation counts by tool name +- Execution latency by tool name +- Error rates by error type and tool +""" + +import logging +import time + +from mcp.types import TextContent + +logger = logging.getLogger(__name__) + + +async def execute_tool( + name: str, + arguments: dict, + enabled_tools: dict, +) -> list[TextContent]: + """Execute an MCP tool with structured logging for metrics. 
+ + Resolves the tool by display name, validates arguments, executes, + and emits a structured log event with timing and status information. + + Args: + name: The display name of the tool as seen by the MCP client. + arguments: The raw argument dict from the MCP protocol. + enabled_tools: The enabled tools registry dict. + + Returns: + list[TextContent]: The tool's result (success or error text). + + Raises: + ValueError: If the tool name is unknown or disabled. + """ + start_time = time.monotonic() + status = 'success' + error_type = None + found_tool_key = None + + try: + # Resolve tool by display name + for key, tool_info in enabled_tools.items(): + if tool_info.get('display_name', key) == name: + found_tool_key = key + break + + if not found_tool_key: + status = 'error' + error_type = 'UnknownToolError' + raise ValueError(f'Unknown or disabled tool: {name}') + + tool = enabled_tools[found_tool_key] + from tools.tool_params import validate_args_for_mode + + parsed = validate_args_for_mode(arguments, tool['args_model'], tool['input_schema']) + result = await tool['function'](parsed) + + # Detect soft errors: tools catch exceptions internally and + # return errors via log_tool_error(), which sets is_error=True + # on the response dict as an explicit status indicator. + if result and len(result) > 0: + if isinstance(result[0], dict) and result[0].get('is_error'): + status = 'error' + + return result + + except ValueError: + # For unknown tool, status/error_type were already set above. + # For validation errors (from validate_args_for_mode), set them now. 
+ if status != 'error': + status = 'error' + error_type = 'ValidationError' + raise + + except Exception as e: + status = 'error' + error_type = type(e).__name__ + raise + + finally: + duration_ms = round((time.monotonic() - start_time) * 1000, 2) + log_extra: dict[str, object] = { + 'event_type': 'tool_execution', + 'tool_name': name, + 'status': status, + 'duration_ms': duration_ms, + } + if found_tool_key: + log_extra['tool_key'] = found_tool_key + if error_type: + log_extra['error_type'] = error_type + + if status == 'success': + logger.info( + f'Tool executed: {name} ({duration_ms}ms)', + extra=log_extra, + ) + else: + logger.error( + f'Tool execution failed: {name} ({duration_ms}ms)', + extra=log_extra, + ) diff --git a/src/opensearch/client.py b/src/opensearch/client.py index 8d73b97..5f4055e 100644 --- a/src/opensearch/client.py +++ b/src/opensearch/client.py @@ -35,7 +35,11 @@ # Import custom connection classes and exceptions -from .connection import BufferedAsyncHttpConnection, ResponseSizeExceededError, OpenSearchClientError, DEFAULT_MAX_RESPONSE_SIZE +from .connection import ( + BufferedAsyncHttpConnection, + OpenSearchClientError, + DEFAULT_MAX_RESPONSE_SIZE, +) class AuthenticationError(OpenSearchClientError): @@ -50,6 +54,30 @@ class ConfigurationError(OpenSearchClientError): pass +def _log_connection_event( + auth_method: str, + datasource_type: str, + opensearch_url: str, + error: str, +) -> None: + """Emit a structured error log event for failed datasource connections. + + Only logs failures because AsyncOpenSearch() construction does not + actually connect — a "success" event would be misleading. 
+ """ + logger.error( + f'Datasource connection failed: {auth_method} ({datasource_type})', + extra={ + 'event_type': 'datasource_connection', + 'auth_method': auth_method, + 'datasource_type': datasource_type, + 'status': 'error', + 'opensearch_url': opensearch_url, + 'error': error, + }, + ) + + # Public API Functions def initialize_client(args: baseToolArgs) -> AsyncOpenSearch: """Initialize and return an OpenSearch client based on the current mode. @@ -147,11 +175,7 @@ def _initialize_client_single_mode() -> AsyncOpenSearch: - OPENSEARCH_NO_AUTH - AWS_OPENSEARCH_SERVERLESS - OPENSEARCH_TIMEOUT - - When OPENSEARCH_HEADER_AUTH=true, headers are preferred: - - opensearch-url, aws-region, aws-access-key-id, aws-secret-access-key, - aws-session-token, aws-service-name - - Authorization: For Basic auth (format: Basic ) + - BEARER Returns: OpenSearch: An initialized OpenSearch client instance @@ -181,13 +205,18 @@ def _initialize_client_single_mode() -> AsyncOpenSearch: try: max_response_size = int(max_response_size_str) if max_response_size <= 0: - logger.warning(f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE value {max_response_size}, using default') + logger.warning( + f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE value {max_response_size}, using default' + ) max_response_size = None except ValueError: - logger.warning(f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE format: {max_response_size_str}, using default') + logger.warning( + f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE format: {max_response_size_str}, using default' + ) aws_access_key_id = None aws_secret_access_key = None aws_session_token = None + bearer_auth_header = None # Default to region from environment aws_region = get_aws_region_single_mode() @@ -215,6 +244,8 @@ def _initialize_client_single_mode() -> AsyncOpenSearch: if header_username and header_password: opensearch_username = header_username opensearch_password = header_password + # Pass through Bearer token if provided in headers + bearer_auth_header = 
header_auth.get('bearer_auth_header') # Validate URL after potential header override (must come from either env or headers) if not opensearch_url or not opensearch_url.strip(): @@ -246,6 +277,7 @@ def _initialize_client_single_mode() -> AsyncOpenSearch: aws_secret_access_key=aws_secret_access_key, aws_session_token=aws_session_token, max_response_size=max_response_size, + bearer_auth_header=bearer_auth_header, ) except (ConfigurationError, AuthenticationError): @@ -300,14 +332,19 @@ def _initialize_client_multi_mode(cluster_info: ClusterInfo) -> AsyncOpenSearch: try: max_response_size = int(max_response_size_str) if max_response_size <= 0: - logger.warning(f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE value {max_response_size}, using default') + logger.warning( + f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE value {max_response_size}, using default' + ) max_response_size = None except ValueError: - logger.warning(f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE format: {max_response_size_str}, using default') - + logger.warning( + f'Invalid OPENSEARCH_MAX_RESPONSE_SIZE format: {max_response_size_str}, using default' + ) + aws_access_key_id = None aws_secret_access_key = None aws_session_token = None + bearer_auth_header = None # Default to region from cluster config aws_region = get_aws_region_multi_mode(cluster_info) @@ -335,6 +372,8 @@ def _initialize_client_multi_mode(cluster_info: ClusterInfo) -> AsyncOpenSearch: if header_username and header_password: opensearch_username = header_username opensearch_password = header_password + # Pass through Bearer token if provided in headers + bearer_auth_header = header_auth.get('bearer_auth_header') # Use common client creation function return _create_opensearch_client( @@ -352,6 +391,7 @@ def _initialize_client_multi_mode(cluster_info: ClusterInfo) -> AsyncOpenSearch: aws_secret_access_key=aws_secret_access_key, aws_session_token=aws_session_token, max_response_size=max_response_size, + bearer_auth_header=bearer_auth_header, ) except 
(ConfigurationError, AuthenticationError): @@ -380,6 +420,7 @@ def _create_opensearch_client( aws_secret_access_key: Optional[str] = None, aws_session_token: Optional[str] = None, max_response_size: Optional[int] = None, + bearer_auth_header: Optional[str] = None, ) -> AsyncOpenSearch: """Common function to create OpenSearch client with authentication. @@ -401,6 +442,7 @@ def _create_opensearch_client( aws_secret_access_key: AWS secret access key from headers (optional) aws_session_token: AWS session token from headers (optional) max_response_size: Maximum response size in bytes (None means no limit) + bearer_auth_header: Authorization Bearer header value (optional) Returns: OpenSearch: An initialized OpenSearch client instance @@ -424,8 +466,9 @@ def _create_opensearch_client( except Exception as e: raise ConfigurationError(f'Invalid OpenSearch URL format: {opensearch_url}. Error: {e}') - # Determine service name + # Determine service name and datasource type service_name = OPENSEARCH_SERVERLESS_SERVICE if is_serverless_mode else OPENSEARCH_SERVICE + datasource_type = 'aoss' if is_serverless_mode else 'aos' if is_serverless_mode: logger.info('Initializing OpenSearch Serverless client with service name: aoss') @@ -437,8 +480,10 @@ def _create_opensearch_client( timeout = DEFAULT_TIMEOUT # Determine response size limit - response_size_limit = max_response_size if max_response_size is not None else DEFAULT_MAX_RESPONSE_SIZE - + response_size_limit = ( + max_response_size if max_response_size is not None else DEFAULT_MAX_RESPONSE_SIZE + ) + # Build client configuration with buffered connection client_kwargs: Dict[str, Any] = { 'hosts': [opensearch_url], @@ -450,7 +495,9 @@ def _create_opensearch_client( } if response_size_limit is not None: - logger.info(f'Configuring OpenSearch client with max_response_size={response_size_limit} bytes') + logger.info( + f'Configuring OpenSearch client with max_response_size={response_size_limit} bytes' + ) else: 
logger.info('Configuring OpenSearch client with no response size limit') @@ -469,10 +516,22 @@ try: return AsyncOpenSearch(**client_kwargs) except Exception as e: - logger.error(f'[NO AUTH] Failed to connect without authentication: {e}') + _log_connection_event('no_auth', datasource_type, opensearch_url, str(e)) raise AuthenticationError(f'Failed to connect without authentication: {e}') - # 2. Header-based AWS credentials authentication (highest priority when provided) + # 2. Header-based Authorization (Bearer token) + if bearer_auth_header: + logger.info('[HEADER AUTH] Using Authorization Bearer header') + try: + client_kwargs['headers'] = {'Authorization': bearer_auth_header} + return AsyncOpenSearch(**client_kwargs) + except Exception as e: + _log_connection_event('header_auth_bearer', datasource_type, opensearch_url, str(e)) + raise AuthenticationError( + f'Failed to authenticate with Authorization Bearer header: {e}' + ) + + # 3. Header-based AWS credentials authentication (checked after the Bearer header) if aws_access_key_id and aws_secret_access_key and aws_region: logger.info('[HEADER AUTH] Using AWS credentials from headers') try: @@ -491,10 +550,10 @@ client_kwargs['http_auth'] = aws_auth return AsyncOpenSearch(**client_kwargs) except Exception as e: - logger.error(f'[HEADER AUTH] Failed to authenticate with header credentials: {e}') + _log_connection_event('header_auth', datasource_type, opensearch_url, str(e)) raise AuthenticationError(f'Failed to authenticate with header credentials: {e}') - # 3. IAM role authentication + # 4.
IAM role authentication if iam_arn and iam_arn.strip(): logger.info(f'[IAM AUTH] Using IAM role authentication: {iam_arn}') try: @@ -518,20 +577,20 @@ def _create_opensearch_client( client_kwargs['http_auth'] = aws_auth return AsyncOpenSearch(**client_kwargs) except Exception as e: - logger.error(f'[IAM AUTH] Failed to assume IAM role {iam_arn}: {e}') + _log_connection_event('iam_auth', datasource_type, opensearch_url, str(e)) raise AuthenticationError(f'Failed to assume IAM role {iam_arn}: {e}') - # 4. Basic authentication + # 5. Basic authentication if opensearch_username and opensearch_password: logger.info(f'[BASIC AUTH] Using basic authentication for user: {opensearch_username}') try: client_kwargs['http_auth'] = (opensearch_username.strip(), opensearch_password) return AsyncOpenSearch(**client_kwargs) except Exception as e: - logger.error(f'[BASIC AUTH] Failed to connect with basic authentication: {e}') + _log_connection_event('basic_auth', datasource_type, opensearch_url, str(e)) raise AuthenticationError(f'Failed to connect with basic authentication: {e}') - # 5. AWS credentials authentication + # 6. 
AWS credentials authentication logger.info('[AWS CREDS] Attempting AWS credentials authentication') try: if not aws_region or (isinstance(aws_region, str) and not aws_region.strip()): @@ -549,7 +608,7 @@ def _create_opensearch_client( client_kwargs['http_auth'] = aws_auth return AsyncOpenSearch(**client_kwargs) except Exception as e: - logger.error(f'[AWS CREDS] Failed to authenticate with AWS credentials: {e}') + _log_connection_event('aws_creds', datasource_type, opensearch_url, str(e)) raise AuthenticationError(f'Failed to authenticate with AWS credentials: {e}') except AuthenticationError: @@ -672,6 +731,7 @@ def _get_auth_from_headers() -> Dict[str, Optional[str]]: - aws_service_name: AWS service name (es or aoss) - opensearch_username: Username from Basic auth (Authorization header) - opensearch_password: Password from Basic auth (Authorization header) + - bearer_auth_header: Authorization Bearer header value (if provided) All values are None if headers are not available or not set. 
""" result: Dict[str, Optional[str]] = { @@ -683,6 +743,7 @@ def _get_auth_from_headers() -> Dict[str, Optional[str]]: 'aws_service_name': None, 'opensearch_username': None, 'opensearch_password': None, + 'bearer_auth_header': None, } try: @@ -700,20 +761,26 @@ def _get_auth_from_headers() -> Dict[str, Optional[str]]: result['aws_session_token'] = headers.get('aws-session-token', '').strip() or None result['aws_service_name'] = headers.get('aws-service-name', '').strip() or None - # Extract Basic auth from Authorization header + # Extract auth from Authorization header auth_header = headers.get('authorization', '').strip() - if auth_header and auth_header.lower().startswith('basic '): - import base64 - # Extract the base64 encoded credentials - encoded_credentials = auth_header[6:] # Skip 'Basic ' - decoded_bytes = base64.b64decode(encoded_credentials) - decoded_credentials = decoded_bytes.decode('utf-8') - - # Split into username and password - if ':' in decoded_credentials: - username, password = decoded_credentials.split(':', 1) - result['opensearch_username'] = username - result['opensearch_password'] = password + if auth_header: + auth_header_lower = auth_header.lower() + if auth_header_lower.startswith('bearer '): + token = auth_header[7:].strip() + if token: + result['bearer_auth_header'] = f'Bearer {token}' + elif auth_header_lower.startswith('basic '): + import base64 + # Extract the base64 encoded credentials + encoded_credentials = auth_header[6:] # Skip 'Basic ' + decoded_bytes = base64.b64decode(encoded_credentials) + decoded_credentials = decoded_bytes.decode('utf-8') + + # Split into username and password + if ':' in decoded_credentials: + username, password = decoded_credentials.split(':', 1) + result['opensearch_username'] = username + result['opensearch_password'] = password except Exception as e: logger.debug(f'Could not read headers from request context: {e}') diff --git a/src/opensearch/connection.py b/src/opensearch/connection.py index 
07026fd..a7d476e 100644 --- a/src/opensearch/connection.py +++ b/src/opensearch/connection.py @@ -9,6 +9,8 @@ """ import logging +import time + from opensearchpy import AsyncHttpConnection # Configure logging @@ -29,6 +31,42 @@ class ResponseSizeExceededError(OpenSearchClientError): pass +def _log_request_event( + method: str, + endpoint: str, + status_code: int | None, + duration_ms: float, + status: str, + response_size: int | None = None, + error: str | None = None, +) -> None: + """Emit a structured log event for OpenSearch HTTP requests.""" + log_extra: dict[str, object] = { + 'event_type': 'opensearch_request', + 'http_method': method, + 'endpoint': endpoint, + 'status': status, + 'duration_ms': duration_ms, + } + if status_code is not None: + log_extra['status_code'] = status_code + if response_size is not None: + log_extra['response_size'] = response_size + if error: + log_extra['error'] = error + + if status == 'success': + logger.info( + f'OpenSearch request: {method} {endpoint} -> {status_code} ({duration_ms}ms)', + extra=log_extra, + ) + else: + logger.error( + f'OpenSearch request failed: {method} {endpoint} -> {status_code} ({duration_ms}ms)', + extra=log_extra, + ) + + class BufferedAsyncHttpConnection(AsyncHttpConnection): """ Async HTTP connection that buffers responses with size limiting. 
@@ -50,11 +88,15 @@ def __init__(self, *args, max_response_size=DEFAULT_MAX_RESPONSE_SIZE, **kwargs) super().__init__(*args, **kwargs) self.max_response_size = max_response_size if max_response_size is not None: - logger.debug(f'Initialized BufferedAsyncHttpConnection with max_response_size={max_response_size} bytes') + logger.debug( + f'Initialized BufferedAsyncHttpConnection with max_response_size={max_response_size} bytes' + ) else: logger.debug('Initialized BufferedAsyncHttpConnection with no response size limit') - async def perform_request(self, method, url, params=None, body=None, timeout=None, ignore=(), headers=None): + async def perform_request( + self, method, url, params=None, body=None, timeout=None, ignore=(), headers=None + ): """ Perform HTTP request with response size limiting. @@ -76,8 +118,10 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non Raises: ResponseSizeExceededError: If response exceeds max_response_size during streaming """ - logger.debug(f'Making size-limited request: {method} {url} (max_size={self.max_response_size})') - original_url = url; + logger.debug( + f'Making size-limited request: {method} {url} (max_size={self.max_response_size})' + ) + original_url = url try: # Import required modules import aiohttp @@ -95,11 +139,11 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non if params: query_string = urlencode(params) else: - query_string = "" + query_string = '' url = self.url_prefix + url if query_string: - url = f"{url}?{query_string}" + url = f'{url}?{query_string}' url = self.host + url timeout_obj = aiohttp.ClientTimeout( @@ -113,12 +157,10 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non if self.http_compress and body: body = self._gzip_compress(body) - req_headers["content-encoding"] = "gzip" + req_headers['content-encoding'] = 'gzip' # Handle authentication (following parent class logic) - auth = ( - self._http_auth if 
isinstance(self._http_auth, aiohttp.BasicAuth) else None - ) + auth = self._http_auth if isinstance(self._http_auth, aiohttp.BasicAuth) else None if callable(self._http_auth): req_headers = { **req_headers, @@ -137,14 +179,16 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non timeout=timeout_obj, fingerprint=self.ssl_assert_fingerprint, ) as response: - # Stream the response with optional size checking chunks = [] total_size = 0 async for chunk in response.content.iter_chunked(8192): # Only check size limit if max_response_size is set - if self.max_response_size is not None and total_size + len(chunk) > self.max_response_size: + if ( + self.max_response_size is not None + and total_size + len(chunk) > self.max_response_size + ): duration = self.loop.time() - start self.log_request_fail( method, @@ -152,16 +196,16 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non url_path, orig_body, duration, - exception=f"Response size exceeded {self.max_response_size} bytes" + exception=f'Response size exceeded {self.max_response_size} bytes', ) logger.error( f'Response size exceeded limit during streaming: ' f'{total_size + len(chunk)} > {self.max_response_size} bytes' ) raise ResponseSizeExceededError( - f"Response size exceeded limit of {self.max_response_size} bytes. " - f"Stopped reading at {total_size} bytes to prevent memory exhaustion. " - f"Consider increasing max_response_size or refining your query to return less data." + f'Response size exceeded limit of {self.max_response_size} bytes. ' + f'Stopped reading at {total_size} bytes to prevent memory exhaustion. ' + f'Consider increasing max_response_size or refining your query to return less data.' 
) chunks.append(chunk) @@ -178,10 +222,11 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non duration = self.loop.time() - start # Handle warnings (following parent class logic) - warning_headers = response.headers.getall("warning", ()) + warning_headers = response.headers.getall('warning', ()) self._raise_warnings(warning_headers) # Handle errors (following parent class logic) + duration_ms = round(duration * 1000, 2) if not (200 <= response.status < 300) and response.status not in ignore: self.log_request_fail( method, @@ -192,32 +237,51 @@ async def perform_request(self, method, url, params=None, body=None, timeout=Non status_code=response.status, response=raw_data, ) + _log_request_event( + method, + original_url, + response.status, + duration_ms, + 'error', + response_size=total_size, + ) self._raise_error(response.status, raw_data) # Log success self.log_request_success( method, str(url), url_path, orig_body, response.status, raw_data, duration ) + _log_request_event( + method, + original_url, + response.status, + duration_ms, + 'success', + response_size=total_size, + ) - if self.max_response_size is not None: - logger.debug(f'Response size check passed: {total_size} bytes (limit: {self.max_response_size})') - else: - logger.debug(f'Response received: {total_size} bytes (no size limit)') return response.status, response.headers, raw_data except ResponseSizeExceededError: raise except Exception as e: # For connection errors and other failures, fall back to parent implementation - logger.warning(f'Streaming request failed ({type(e).__name__}: {e}), falling back to parent implementation') - return await self._fallback_perform_request(method, original_url, params, body, timeout, ignore, headers) + logger.warning( + f'Streaming request failed ({type(e).__name__}: {e}), falling back to parent implementation' + ) + return await self._fallback_perform_request( + method, original_url, params, body, timeout, ignore, headers + ) - async 
def _fallback_perform_request(self, method, url, params=None, body=None, timeout=None, ignore=(), headers=None): + async def _fallback_perform_request( + self, method, url, params=None, body=None, timeout=None, ignore=(), headers=None + ): """ Fallback to parent implementation with post-download size checking. This is used when streaming is not available or fails. """ + fallback_start = time.monotonic() try: # Use parent implementation for the actual request (preserves auth) status, response_headers, response_data = await super().perform_request( @@ -238,19 +302,35 @@ async def _fallback_perform_request(self, method, url, params=None, body=None, t f'Response size exceeded limit: {data_size} > {self.max_response_size} bytes' ) raise ResponseSizeExceededError( - f"Response size exceeded limit of {self.max_response_size} bytes. " - f"Received {data_size} bytes. " - f"Consider increasing max_response_size or refining your query to return less data." + f'Response size exceeded limit of {self.max_response_size} bytes. ' + f'Received {data_size} bytes. ' + f'Consider increasing max_response_size or refining your query to return less data.' 
) - if self.max_response_size is not None: - logger.debug(f'Response size check passed: {data_size} bytes (limit: {self.max_response_size})') - else: - logger.debug(f'Response received: {data_size} bytes (no size limit)') + fallback_duration_ms = round((time.monotonic() - fallback_start) * 1000, 2) + _log_request_event( + method, + url, + status, + fallback_duration_ms, + 'success', + response_size=data_size, + ) + return status, response_headers, response_data except ResponseSizeExceededError: raise except Exception as e: - logger.error(f'Error in fallback size-limited request: {e}') - raise \ No newline at end of file + fallback_duration_ms = round((time.monotonic() - fallback_start) * 1000, 2) + raw_status = getattr(e, 'status_code', None) + exc_status_code = raw_status if isinstance(raw_status, int) else None + _log_request_event( + method, + url, + exc_status_code, + fallback_duration_ms, + 'error', + error=str(e), + ) + raise diff --git a/src/opensearch/helper.py b/src/opensearch/helper.py index ad0d655..d92c29c 100644 --- a/src/opensearch/helper.py +++ b/src/opensearch/helper.py @@ -10,6 +10,10 @@ import json from semver import Version from tools.tool_params import * +from tools.agentic_memory.params import * +from tools.utils import helper_error +from urllib.parse import quote + # Configure logging logger = logging.getLogger(__name__) @@ -55,16 +59,19 @@ async def search_index(args: SearchIndexArgs) -> json: from .client import get_opensearch_client from tools.tools import TOOL_REGISTRY + if isinstance(args.query_dsl, str): + validate_json_string(args.query_dsl) + async with get_opensearch_client(args) as client: - query = normalize_scientific_notation(args.query) - + query = normalize_scientific_notation(args.query_dsl) + # Limit size to maximum of 100 tool_info = TOOL_REGISTRY.get('SearchIndexTool', {}) max_size_limit = tool_info.get('max_size_limit', 100) # Default to 100 if not configured effective_size = min(args.size, max_size_limit) if args.size else 
10 query['size'] = effective_size - + response = await client.search(index=args.index, body=query) return response @@ -294,50 +301,318 @@ async def get_nodes_info(args: GetNodesArgs) -> json: return response +async def get_query_set(args: GetQuerySetArgs) -> json: + """Get a specific query set by ID from the Search Relevance plugin. + + Args: + args: GetQuerySetArgs containing the query_set_id + + Returns: + json: Query set details + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.get_query_sets( + query_set_id=args.query_set_id + ) + return response + + +async def create_query_set(args: CreateQuerySetArgs) -> json: + """Create a new query set with a list of queries in the Search Relevance plugin. + + Args: + args: CreateQuerySetArgs containing name, queries (JSON string), and optional description + + Returns: + json: Result of the creation operation with query set ID + """ + import json as _json + + from .client import get_opensearch_client + + queries = _json.loads(args.queries) if isinstance(args.queries, str) else args.queries + if not isinstance(queries, list): + raise ValueError( + 'queries must be a JSON array of strings or objects with queryText, e.g. 
["q1", "q2"]' + ) + + query_set_queries = [] + for q in queries: + if isinstance(q, str): + query_set_queries.append({'queryText': q}) + elif isinstance(q, dict) and 'queryText' in q: + query_set_queries.append(q) + else: + query_set_queries.append({'queryText': str(q)}) + + body = { + 'name': args.name, + 'description': args.description or f'Query set: {args.name}', + 'sampling': 'manual', + 'querySetQueries': query_set_queries, + } + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.put_query_sets(body=body) + return response + + +async def sample_query_set(args: SampleQuerySetArgs) -> json: + """Create a query set by sampling the top N most frequent queries from UBI data. + + Args: + args: SampleQuerySetArgs containing name, query_set_size, and optional description + + Returns: + json: Result of the sampling operation with the created query set ID + """ + from .client import get_opensearch_client + + body = { + 'name': args.name, + 'description': args.description or f'Query set: {args.name} ({args.sampling}, size={args.query_set_size})', + 'sampling': args.sampling, + 'querySetSize': args.query_set_size, + } + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.post_query_sets(body=body) + return response + + +async def delete_query_set(args: DeleteQuerySetArgs) -> json: + """Delete a query set by ID from the Search Relevance plugin. + + Args: + args: DeleteQuerySetArgs containing the query_set_id + + Returns: + json: Result of the deletion operation + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.delete_query_sets( + query_set_id=args.query_set_id + ) + return response + + +async def get_experiment(args: GetExperimentArgs) -> json: + """Retrieve an experiment by ID via the Search Relevance plugin. 
+ + Args: + args: GetExperimentArgs containing the experiment_id + + Returns: + json: OpenSearch response with the experiment details + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.get_experiments( + experiment_id=args.experiment_id + ) + return response + + +async def _srw_search(args, entity: str) -> json: + """Execute a _search request against a Search Relevance Workbench entity index. + + Args: + args: Tool args containing the optional query_body + entity: The SRW entity name, e.g. 'query_sets', 'search_configurations', + 'judgments', or 'experiment' + + Returns: + json: OpenSearch search response + """ + from .client import get_opensearch_client + + if args.query_body is None: + body = {'query': {'match_all': {}}} + elif isinstance(args.query_body, str): + validate_json_string(args.query_body) + body = json.loads(args.query_body) + else: + body = args.query_body + async with get_opensearch_client(args) as client: + response = await client.transport.perform_request( + method='POST', + url=f'/_plugins/_search_relevance/{entity}/_search', + body=json.dumps(body), + ) + return response + + +async def create_experiment(args: CreateExperimentArgs) -> json: + """Create an experiment via the Search Relevance plugin. + + Validates configuration counts for each experiment type and requires + judgment lists for POINTWISE_EVALUATION and HYBRID_OPTIMIZER. 
+ + Args: + args: CreateExperimentArgs containing query_set_id, search_configuration_ids, + experiment_type, size, and optional judgment_list_ids + + Returns: + json: OpenSearch response with the created experiment ID + """ + from .client import get_opensearch_client + + search_configuration_ids = ( + json.loads(args.search_configuration_ids) + if isinstance(args.search_configuration_ids, str) + else args.search_configuration_ids + ) + if not isinstance(search_configuration_ids, list): + raise ValueError('search_configuration_ids must be a JSON array of configuration ID strings') + + if args.experiment_type == 'PAIRWISE_COMPARISON' and len(search_configuration_ids) != 2: + raise ValueError('PAIRWISE_COMPARISON requires exactly 2 search configuration IDs') + if args.experiment_type in ('POINTWISE_EVALUATION', 'HYBRID_OPTIMIZER') and len(search_configuration_ids) != 1: + raise ValueError(f'{args.experiment_type} requires exactly 1 search configuration ID') + + body: dict = { + 'querySetId': args.query_set_id, + 'searchConfigurationList': search_configuration_ids, + 'size': args.size, + 'type': args.experiment_type, + } + + if args.experiment_type in ('POINTWISE_EVALUATION', 'HYBRID_OPTIMIZER'): + if not args.judgment_list_ids: + raise ValueError( + f'{args.experiment_type} requires judgment_list_ids. ' + 'Provide one or more judgment list IDs as a JSON array, ' + 'e.g. 
["judgment-id-1"] or ["judgment-id-1", "judgment-id-2"]' + ) + judgment_list_ids = ( + json.loads(args.judgment_list_ids) + if isinstance(args.judgment_list_ids, str) + else args.judgment_list_ids + ) + if not isinstance(judgment_list_ids, list) or len(judgment_list_ids) == 0: + raise ValueError('judgment_list_ids must be a non-empty JSON array of judgment list ID strings') + body['judgmentList'] = judgment_list_ids + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.put_experiments(body=body) + return response + + +async def delete_experiment(args: DeleteExperimentArgs) -> json: + """Delete an experiment by ID via the Search Relevance plugin. + + Args: + args: DeleteExperimentArgs containing the experiment_id + + Returns: + json: OpenSearch response confirming deletion + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.delete_experiments( + experiment_id=args.experiment_id + ) + return response + + +async def search_query_sets(args: SearchQuerySetsArgs) -> json: + """Search query sets using OpenSearch query DSL. + + Args: + args: SearchQuerySetsArgs containing an optional query_body + + Returns: + json: OpenSearch search response + """ + return await _srw_search(args, 'query_sets') + + +async def search_search_configurations(args: SearchSearchConfigurationsArgs) -> json: + """Search search configurations using OpenSearch query DSL. + + Args: + args: SearchSearchConfigurationsArgs containing an optional query_body + + Returns: + json: OpenSearch search response + """ + return await _srw_search(args, 'search_configurations') + + +async def search_judgments(args: SearchJudgmentsArgs) -> json: + """Search judgments using OpenSearch query DSL. 
+ + Args: + args: SearchJudgmentsArgs containing an optional query_body + + Returns: + json: OpenSearch search response + """ + return await _srw_search(args, 'judgments') + + +async def search_experiments(args: SearchExperimentsArgs) -> json: + """Search experiments using OpenSearch query DSL. + + Args: + args: SearchExperimentsArgs containing an optional query_body + + Returns: + json: OpenSearch search response + """ + return await _srw_search(args, 'experiment') + + def convert_search_results_to_csv(search_results: dict) -> str: """Convert OpenSearch search results to CSV format. - + Args: search_results: The JSON response from search_index function - + Returns: str: CSV formatted string of the search results """ if not search_results: return "No search results to convert" - + has_hits = 'hits' in search_results and search_results['hits']['hits'] has_aggregations = 'aggregations' in search_results - + # Handle aggregations-only queries if has_aggregations and not has_hits: return json.dumps(search_results['aggregations'], separators=(',', ':')) - + # Handle hits-only queries if has_hits and not has_aggregations: return _convert_hits_to_csv(search_results['hits']['hits']) - + # Handle queries with both hits and aggregations if has_hits and has_aggregations: hits_csv = _convert_hits_to_csv(search_results['hits']['hits']) aggregations_json = json.dumps(search_results['aggregations'], separators=(',', ':')) return f"SEARCH HITS:\n{hits_csv}\n\nAGGREGATIONS:\n{aggregations_json}" - + return "No search results to convert" def _convert_hits_to_csv(hits: list) -> str: """Convert search hits to CSV format. 
- + Args: hits: List of search hits - + Returns: str: CSV formatted string """ if not hits: return "No documents found in search results" - + # Extract all unique field names from all documents (flattened) all_fields = set() for hit in hits: @@ -345,15 +620,15 @@ def _convert_hits_to_csv(hits: list) -> str: _flatten_fields(hit['_source'], all_fields) # Also include metadata fields all_fields.update(['_index', '_id', '_score']) - + # Convert to sorted list for consistent column order fieldnames = sorted(list(all_fields)) - + # Create CSV in memory output = io.StringIO() writer = csv.DictWriter(output, fieldnames=fieldnames) writer.writeheader() - + # Write each document as a row for hit in hits: row = {} @@ -361,19 +636,19 @@ def _convert_hits_to_csv(hits: list) -> str: row['_index'] = hit.get('_index', '') row['_id'] = hit.get('_id', '') row['_score'] = hit.get('_score', '') - + # Add source fields (flattened) if '_source' in hit: _flatten_object(hit['_source'], row) - + writer.writerow(row) - + return output.getvalue() def _flatten_fields(obj: dict, fields: set, prefix: str = '') -> None: """Extract all field names from nested objects. - + Args: obj: Object to extract field names from fields: Set to add field names to @@ -393,7 +668,7 @@ def _flatten_fields(obj: dict, fields: set, prefix: str = '') -> None: def _flatten_object(obj: dict, row: dict, prefix: str = '') -> None: """Flatten nested objects into separate columns. - + Args: obj: Object to flatten row: Row dictionary to add flattened fields to @@ -431,6 +706,244 @@ async def get_opensearch_version(args: baseToolArgs) -> Version: logger.error(f'Error getting OpenSearch version: {e}') return None + +async def create_agentic_memory_session( + args: CreateAgenticMemorySessionArgs, +) -> Dict[str, Any]: + """Create a new agentic memory session in the specified memory container. 
+ + Args: + args: CreateAgenticMemorySessionArgs containing memory_container_id and optional session_id, summary, metadata, namespace + + Returns: + json: Response from the session creation endpoint + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories/sessions', + ] + url = '/'.join(url_parts) + + body = args.model_dump( + exclude={'memory_container_id', 'opensearch_cluster_name'}, + exclude_none=True, + ) + + try: + response = await client.transport.perform_request(method='POST', url=url, body=body) + except Exception as e: + raise helper_error('create agentic memory session', e) + + return response + + +async def add_agentic_memories(args: AddAgenticMemoriesArgs) -> Dict[str, Any]: + """Add agentic memories to the specified memory container based on the payload type. + + Args: + args: AddAgenticMemoriesArgs containing memory_container_id, payload_type, and content fields like messages or structured_data, plus optional namespace, metadata, tags, infer + + Returns: + json: Response from the add memories endpoint + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories', + ] + url = '/'.join(url_parts) + + body = args.model_dump( + exclude={'memory_container_id', 'opensearch_cluster_name'}, + exclude_none=True, + by_alias=True, + ) + + try: + response = await client.transport.perform_request(method='POST', url=url, body=body) + except Exception as e: + raise helper_error('add agentic memories', e) + + return response + + +async def get_agentic_memory(args: GetAgenticMemoryArgs) -> Dict[str, Any]: + """Retrieve a specific agentic memory by its type and ID from the memory container. 
+ + Args: + args: GetAgenticMemoryArgs containing memory_container_id, memory_type, and id + + Returns: + json: The retrieved memory information from the /_memory endpoint + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories', + quote(args.memory_type, safe=''), + quote(args.id, safe=''), + ] + url = '/'.join(url_parts) + + try: + response = await client.transport.perform_request(method='GET', url=url) + except Exception as e: + raise helper_error('get agentic memory', e) + + return response + + +async def update_agentic_memory(args: UpdateAgenticMemoryArgs) -> Dict[str, Any]: + """Update a specific agentic memory by its type and ID in the memory container. + + Args: + args: UpdateAgenticMemoryArgs containing memory_container_id, memory_type, id, and optional update fields based on type + + Returns: + json: Response from the update memory endpoint + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories', + quote(args.memory_type, safe=''), + quote(args.id, safe=''), + ] + url = '/'.join(url_parts) + + body = args.model_dump( + exclude={ + 'memory_container_id', + 'memory_type', + 'id', + 'opensearch_cluster_name', + }, + exclude_none=True, + by_alias=True, + ) + + try: + response = await client.transport.perform_request(method='PUT', url=url, body=body) + except Exception as e: + raise helper_error('update agentic memory', e) + + return response + + +async def delete_agentic_memory_by_id( + args: DeleteAgenticMemoryByIDArgs, +) -> Dict[str, Any]: + """Delete a specific agentic memory by its type and ID from the memory container. 
+ + Args: + args: DeleteAgenticMemoryByIDArgs containing memory_container_id, memory_type, and id + + Returns: + json: Response from the delete memory endpoint + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories', + quote(args.memory_type, safe=''), + quote(args.id, safe=''), + ] + url = '/'.join(url_parts) + + try: + response = await client.transport.perform_request(method='DELETE', url=url) + except Exception as e: + raise helper_error('delete agentic memory by ID', e) + + return response + + +async def delete_agentic_memory_by_query( + args: DeleteAgenticMemoryByQueryArgs, +) -> Dict[str, Any]: + """Delete agentic memories matching the provided query from the specified memory type in the container. + + Args: + args: DeleteAgenticMemoryByQueryArgs containing memory_container_id, memory_type, and query + + Returns: + json: Response from the delete memory by query endpoint + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories', + quote(args.memory_type, safe=''), + '_delete_by_query', + ] + url = '/'.join(url_parts) + + body = args.model_dump( + exclude={'memory_container_id', 'memory_type', 'opensearch_cluster_name'}, + exclude_none=True, + ) + + try: + response = await client.transport.perform_request(method='POST', url=url, body=body) + except Exception as e: + raise helper_error('delete agentic memory by query', e) + + return response + + +async def search_agentic_memory(args: SearchAgenticMemoryArgs) -> Dict[str, Any]: + """Search for agentic memories of a specific type within the memory container using OpenSearch query DSL. 
+ + Args: + args: SearchAgenticMemoryArgs containing memory_container_id, memory_type, query, and optional sort + + Returns: + json: Search memories results + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + url_parts = [ + '/_plugins/_ml/memory_containers', + quote(args.memory_container_id, safe=''), + 'memories', + quote(args.memory_type, safe=''), + '_search', + ] + url = '/'.join(url_parts) + + body = args.model_dump( + exclude={'memory_container_id', 'memory_type', 'opensearch_cluster_name'}, + exclude_none=True, + ) + + try: + response = await client.transport.perform_request(method='GET', url=url, body=body) + except Exception as e: + raise helper_error('search agentic memory', e) + + return response + + def plain_float(value): """Convert a float to a non-scientific notation number. @@ -496,6 +1009,222 @@ def _convert_value(v): return v +async def create_search_configuration(args: CreateSearchConfigurationArgs) -> json: + """Create a search configuration via the Search Relevance plugin. + + Args: + args: CreateSearchConfigurationArgs containing name, index, and query + + Returns: + json: OpenSearch response with the created configuration ID + """ + from .client import get_opensearch_client + + validate_json_string(args.query) + + async with get_opensearch_client(args) as client: + body = { + 'name': args.name, + 'index': args.index, + 'query': args.query, # must remain a JSON string, not a dict + } + response = await client.plugins.search_relevance.put_search_configurations(body=body) + return response + + +async def get_search_configuration(args: GetSearchConfigurationArgs) -> json: + """Retrieve a search configuration by ID via the Search Relevance plugin. 
+ + Args: + args: GetSearchConfigurationArgs containing the search_configuration_id + + Returns: + json: OpenSearch response with the search configuration details + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.get_search_configurations( + search_configuration_id=args.search_configuration_id + ) + return response + + +async def delete_search_configuration(args: DeleteSearchConfigurationArgs) -> json: + """Delete a search configuration by ID via the Search Relevance plugin. + + Args: + args: DeleteSearchConfigurationArgs containing the search_configuration_id + + Returns: + json: OpenSearch response confirming deletion + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.delete_search_configurations( + search_configuration_id=args.search_configuration_id + ) + return response + + +async def get_judgment_list(args: GetJudgmentListArgs) -> json: + """Retrieve a judgment by ID via the Search Relevance plugin. + + Args: + args: GetJudgmentListArgs containing the judgment_id + + Returns: + json: OpenSearch response with the judgment details + """ + from .client import get_opensearch_client + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.get_judgments( + judgment_id=args.judgment_id + ) + return response + + +async def create_judgment_list(args: CreateJudgmentListArgs) -> json: + """Create a judgment list with manual relevance ratings via the Search Relevance plugin. 
+ + Args: + args: CreateJudgmentListArgs containing name, judgment_ratings (JSON string), and optional description + + Returns: + json: OpenSearch response with the created judgment ID + """ + from .client import get_opensearch_client + + judgment_ratings = ( + json.loads(args.judgment_ratings) + if isinstance(args.judgment_ratings, str) + else args.judgment_ratings + ) + if not isinstance(judgment_ratings, list): + raise ValueError( + 'judgment_ratings must be a JSON array of query-ratings objects, ' + 'e.g. [{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}]}]' + ) + + body = { + 'name': args.name, + 'type': 'IMPORT_JUDGMENT', + 'judgmentRatings': judgment_ratings, + } + if args.description: + body['description'] = args.description + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.put_judgments(body=body) + return response + + +async def create_ubi_judgment_list(args: CreateUBIJudgmentListArgs) -> json: + """Create a judgment list by mining relevance signals from UBI click data. + + Args: + args: CreateUBIJudgmentListArgs containing name, click_model, max_rank, and optional date range + + Returns: + json: OpenSearch response with the created judgment ID and processing status + """ + from .client import get_opensearch_client + + body = { + 'name': args.name, + 'type': 'UBI_JUDGMENT', + 'clickModel': args.click_model, + 'maxRank': args.max_rank, + } + if args.start_date: + body['startDate'] = args.start_date + if args.end_date: + body['endDate'] = args.end_date + + async with get_opensearch_client(args) as client: + response = await client.plugins.search_relevance.put_judgments(body=body) + return response + + +async def create_llm_judgment_list(args: CreateLLMJudgmentListArgs) -> json: + """Create a judgment list using an LLM model via the Search Relevance plugin. 
+
+    For each query in the query set, the top k documents are retrieved using the
+    specified search configuration and rated by the LLM model.
+
+    Args:
+        args: CreateLLMJudgmentListArgs containing name, query_set_id, search_configuration_id,
+              model_id, size, and optional context_fields
+
+    Returns:
+        json: OpenSearch response with the created judgment ID and processing status
+    """
+    from .client import get_opensearch_client
+
+    context_fields = (
+        json.loads(args.context_fields)
+        if isinstance(args.context_fields, str)
+        else args.context_fields
+    )
+    if context_fields is not None and not isinstance(context_fields, list):
+        raise ValueError(
+            'context_fields must be a JSON array of field name strings, '
+            'e.g. ["title", "description"]'
+        )
+
+    body = {
+        'name': args.name,
+        'type': 'LLM_JUDGMENT',
+        'querySetId': args.query_set_id,
+        'searchConfigurationList': [args.search_configuration_id],
+        'modelId': args.model_id,
+        'size': args.size,
+    }
+    if context_fields:
+        body['contextFields'] = context_fields
+
+    async with get_opensearch_client(args) as client:
+        response = await client.plugins.search_relevance.put_judgments(body=body)
+        return response
+
+
+async def delete_judgment_list(args: DeleteJudgmentListArgs) -> json:
+    """Delete a judgment by ID via the Search Relevance plugin.
+
+    Args:
+        args: DeleteJudgmentListArgs containing the judgment_id
+
+    Returns:
+        json: OpenSearch response confirming deletion
+    """
+    from .client import get_opensearch_client
+
+    async with get_opensearch_client(args) as client:
+        response = await client.plugins.search_relevance.delete_judgments(
+            judgment_id=args.judgment_id
+        )
+        return response
+
+
+def validate_json_string(value: str) -> None:
+    """Validate that a string is valid JSON, raising ValueError with a concise message if not.
+
+    Intended to be called early (before any API call) so the LLM receives a small,
+    precise error rather than a verbose OpenSearch response.
+
+    Args:
+        value: The string to validate as JSON.
+
+    Raises:
+        ValueError: If the string is not valid JSON. 
The message includes the parse
+    error description, line, and column so the problem is immediately obvious.
+    """
+    try:
+        json.loads(value)
+    except json.JSONDecodeError as e:
+        raise ValueError(
+            f"query is not valid JSON: {e.msg} (line {e.lineno}, col {e.colno})"
+        ) from e
+
+
 def normalize_scientific_notation(body):
     """Normalize scientific-notation floats in a request body.
diff --git a/src/tools/agentic_memory/__init__.py b/src/tools/agentic_memory/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/src/tools/agentic_memory/actions.py b/src/tools/agentic_memory/actions.py
new file mode 100644
index 0000000..6881e9b
--- /dev/null
+++ b/src/tools/agentic_memory/actions.py
@@ -0,0 +1,246 @@
+import json
+from .params import (
+    AddAgenticMemoriesArgs,
+    CreateAgenticMemorySessionArgs,
+    DeleteAgenticMemoryByIDArgs,
+    DeleteAgenticMemoryByQueryArgs,
+    GetAgenticMemoryArgs,
+    SearchAgenticMemoryArgs,
+    UpdateAgenticMemoryArgs,
+)
+from opensearch.helper import (
+    add_agentic_memories,
+    create_agentic_memory_session,
+    delete_agentic_memory_by_id,
+    delete_agentic_memory_by_query,
+    get_agentic_memory,
+    search_agentic_memory,
+    update_agentic_memory,
+)
+from tools.exceptions import HelperOperationError
+
+
+async def create_agentic_memory_session_tool(
+    args: CreateAgenticMemorySessionArgs,
+) -> list[dict]:
+    """Tool to create a new session in an agentic memory container.
+
+    Args:
+        args: CreateAgenticMemorySessionArgs containing the memory_container_id and optional session details like session_id, summary, metadata, or namespace.
+
+    Returns:
+        list[dict]: A confirmation message with the new session ID in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('CreateAgenticMemorySessionTool', args)
+        result = await create_agentic_memory_session(args)
+
+        session_id = result.get('session_id')
+        message = (
+            f'Successfully created session. ID: {session_id}. Response: {json.dumps(result)}'
+            if session_id
+            else f'Session created, but no ID was returned. Response: {json.dumps(result)}'
+        )
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [{'type': 'text', 'text': f'Error creating session: {str(error_to_report)}'}]
+
+
+async def add_agentic_memories_tool(args: AddAgenticMemoriesArgs) -> list[dict]:
+    """Tool to add memories to an agentic memory container.
+
+    Args:
+        args: AddAgenticMemoriesArgs containing the memory_container_id, payload_type, and content (either messages or structured_data).
+
+    Returns:
+        list[dict]: A confirmation message, often including the new working_memory_id or session_id, in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('AddAgenticMemoriesTool', args)
+        result = await add_agentic_memories(args)
+
+        session_id = result.get('session_id')
+        memory_id = result.get('working_memory_id')
+
+        message = 'Successfully added memory.'
+        if memory_id:
+            message += f' Working Memory ID: {memory_id}.'
+        if session_id:
+            message += f' Session ID: {session_id}.'
+        message += f' Response: {json.dumps(result)}'
+
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [{'type': 'text', 'text': f'Error adding memory: {str(error_to_report)}'}]
+
+
+async def get_agentic_memory_tool(args: GetAgenticMemoryArgs) -> list[dict]:
+    """Tool to retrieve a specific agentic memory by its type and ID.
+
+    Args:
+        args: GetAgenticMemoryArgs containing the memory_container_id, memory_type, and the specific memory id.
+
+    Returns:
+        list[dict]: The retrieved memory object as a JSON string within a confirmation message, in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('GetAgenticMemoryTool', args)
+        result = await get_agentic_memory(args)
+
+        message = f'Successfully retrieved memory {args.id} ({args.memory_type.value}): {json.dumps(result)}'
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [{'type': 'text', 'text': f'Error getting memory: {str(error_to_report)}'}]
+
+
+async def update_agentic_memory_tool(args: UpdateAgenticMemoryArgs) -> list[dict]:
+    """Tool to update a specific agentic memory (session, working, or long-term) by its ID.
+
+    Args:
+        args: UpdateAgenticMemoryArgs containing the memory_container_id, memory_type, id, and the fields to be updated.
+
+    Returns:
+        list[dict]: A confirmation message of the update operation in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('UpdateAgenticMemoryTool', args)
+        result = await update_agentic_memory(args)
+
+        memory_id = result.get('_id', args.id)
+        message = f'Successfully updated memory {memory_id} ({args.memory_type.value}). Response: {json.dumps(result)}'
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [{'type': 'text', 'text': f'Error updating memory: {str(error_to_report)}'}]
+
+
+async def delete_agentic_memory_by_id_tool(
+    args: DeleteAgenticMemoryByIDArgs,
+) -> list[dict]:
+    """Tool to delete a specific agentic memory by its type and ID.
+
+    Args:
+        args: DeleteAgenticMemoryByIDArgs containing the memory_container_id, memory_type, and the id of the memory to delete.
+
+    Returns:
+        list[dict]: A confirmation message of the deletion in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('DeleteAgenticMemoryByIDTool', args)
+        result = await delete_agentic_memory_by_id(args)
+
+        memory_id = result.get('_id', args.id)
+        message = f'Successfully deleted memory {memory_id} ({args.memory_type.value}). Response: {json.dumps(result)}'
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [{'type': 'text', 'text': f'Error deleting memory: {str(error_to_report)}'}]
+
+
+async def delete_agentic_memory_by_query_tool(
+    args: DeleteAgenticMemoryByQueryArgs,
+) -> list[dict]:
+    """Tool to delete agentic memories matching an OpenSearch query DSL.
+
+    Args:
+        args: DeleteAgenticMemoryByQueryArgs containing the memory_container_id, memory_type, and the query.
+
+    Returns:
+        list[dict]: A summary of the delete-by-query operation, including counts, in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('DeleteAgenticMemoryByQueryTool', args)
+        result = await delete_agentic_memory_by_query(args)
+
+        deleted_count = result.get('deleted', 0)
+        failures = result.get('failures', [])
+
+        message = (
+            f'Delete by query for {args.memory_type.value} completed with {len(failures)} failures. '
+            f'Deleted: {deleted_count}. Response: {json.dumps(result)}'
+            if failures
+            else f'Successfully deleted memories by query for {args.memory_type.value}. '
+            f'Deleted: {deleted_count}. Response: {json.dumps(result)}'
+        )
+
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [
+            {
+                'type': 'text',
+                'text': f'Error deleting memories by query: {str(error_to_report)}',
+            }
+        ]
+
+
+async def search_agentic_memory_tool(args: SearchAgenticMemoryArgs) -> list[dict]:
+    """Tool to search for agentic memories using an OpenSearch query DSL.
+
+    Args:
+        args: SearchAgenticMemoryArgs containing the memory_container_id, memory_type, query, and optional sort parameters.
+
+    Returns:
+        list[dict]: The search results from OpenSearch as a JSON string within a summary message, in MCP format.
+    """
+    try:
+        from tools.tools import check_tool_compatibility
+
+        await check_tool_compatibility('SearchAgenticMemoryTool', args)
+        result = await search_agentic_memory(args)
+
+        hits = result.get('hits', {}).get('hits', [])
+        count = len(hits)
+        total = result.get('hits', {}).get('total', {}).get('value', count)
+
+        message = (
+            f'Search results for {args.memory_type.value}: No memories found. Response: {json.dumps(result)}'
+            if total == 0
+            else f'Search results for {args.memory_type.value}: Found {total} memories, returning {count}. '
+            f'Response: {json.dumps(result)}'
+        )
+
+        return [{'type': 'text', 'text': message}]
+    except Exception as e:
+        error_to_report = e
+        if isinstance(e, HelperOperationError):
+            error_to_report = e.original
+        return [{'type': 'text', 'text': f'Error searching memory: {str(error_to_report)}'}]
+
+
+__all__ = [
+    'create_agentic_memory_session_tool',
+    'add_agentic_memories_tool',
+    'get_agentic_memory_tool',
+    'update_agentic_memory_tool',
+    'delete_agentic_memory_by_id_tool',
+    'delete_agentic_memory_by_query_tool',
+    'search_agentic_memory_tool',
+]
diff --git a/src/tools/agentic_memory/params.py b/src/tools/agentic_memory/params.py
new file mode 100644
index 0000000..c210be3
--- /dev/null
+++ b/src/tools/agentic_memory/params.py
@@ -0,0 +1,638 @@
+from enum import Enum
+from pydantic import BaseModel, Field, model_validator
+from pydantic_core import PydanticCustomError
+from tools.tool_params import baseToolArgs
+from typing import Any, Dict, List, Literal, Optional, Set
+
+
+class EmbeddingModelType(str, Enum):
+    """Specifies the type of embedding model used."""
+
+    text_embedding = 'TEXT_EMBEDDING'
+    sparse_encoding = 'SPARSE_ENCODING'
+
+
+class StrategyType(str, Enum):
+    """Specifies the type of agentic memory processing strategy."""
+
+    semantic = 'SEMANTIC'
+    user_preference = 'USER_PREFERENCE'
+    summary = 'SUMMARY'
+
+
+class MemoryType(str, Enum):
+    """Specifies the different types of agentic memory."""
+
+    sessions = 'sessions'
+    working = 'working'
+    long_term = 'long-term'
+    history = 'history'
+
+
+class PayloadType(str, Enum):
+    """Specifies the type of payload being added to agentic memory."""
+
+    conversational = 'conversational'
+    data = 'data'
+
+
+ERR_FIELD_NOT_ALLOWED = 'field_not_allowed'
+ERR_MISSING_WORKING_FIELD = 'missing_working_field'
+ERR_MISSING_LONG_TERM_FIELD = 'missing_long_term_field'
+ERR_MESSAGES_REQUIRED = 'messages_required'
+ERR_FIELD_PROHIBITED = 'field_prohibited'
+ERR_STRUCTURED_DATA_REQUIRED = 'structured_data_required'
+ERR_MISSING_CONTENT_FIELD = 'missing_content_field'
+ERR_EMBEDDING_DIMENSION_REQUIRED = 'embedding_dimension_required'
+
+
+class MessageContentItem(BaseModel):
+    """Schema for the content part of a message.
+    Used for strong typing in 'messages' fields.
+    """
+
+    text: str = Field(..., description='The text content of the message.')
+    content_type: str = Field(
+        ..., description="The type of the content (e.g., 'text').", alias='type'
+    )
+
+
+class MessageItem(BaseModel):
+    """Schema for a single message in 'messages' field.
+    Used for strong typing.
+    """
+
+    role: Optional[str] = Field(
+        None, description="The role of the entity (e.g., 'user', 'assistant')."
+    )
+    content: List[MessageContentItem] = Field(
+        ..., description='A list of content items for this message.'
+    )
+
+
+class BaseAgenticMemoryContainerArgs(baseToolArgs):
+    """Base arguments for tools operating on an existing Agentic Memory Container."""
+
+    memory_container_id: str = Field(..., description='The ID of the memory container.')
+
+
+class CreateAgenticMemorySessionArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for creating a new session in an agentic memory container."""
+
+    session_id: Optional[str] = Field(
+        default=None,
+        description='A custom session ID. If provided, this ID is used for the session. If not provided, a random ID is generated.',
+    )
+    summary: Optional[str] = Field(default=None, description='A session summary or description.')
+    metadata: Optional[Dict[str, Any]] = Field(
+        default=None,
+        description='Additional metadata for the session provided as key-value pairs.',
+    )
+    namespace: Optional[Dict[str, str]] = Field(
+        default=None, description='Namespace information for organizing the session.'
+    )
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    'memory_container_id': 'SdjmmpgBOh0h20Y9kWuN',
+                    # Optional: Client-provided ID. If omitted, OpenSearch auto-generates one; must be unique within the memory container
+                    'session_id': 'abc123',
+                    # Optional key-value pairs for session context
+                    'metadata': {'key1': 'value1'},
+                },
+                {
+                    'memory_container_id': 'SdjmmpgBOh0h20Y9kWuN',
+                    # Human-readable description of the session
+                    'summary': 'This is a test session',
+                    'metadata': {'key1': 'value1'},
+                    # Isolates session to specific user - matches strategy namespace from container
+                    'namespace': {'user_id': 'bob'},
+                },
+                {
+                    'memory_container_id': 'SdjmmpgBOh0h20Y9kWuN',
+                    'summary': 'Session for user onboarding',
+                    # Multi-dimensional namespacing supported
+                    'namespace': {
+                        'user_id': 'alice',
+                        'agent_id': 'onboarding_bot',
+                    },
+                    # Used for filtering and organization
+                    'metadata': {
+                        'priority': 'high',
+                        'category': 'onboarding',
+                    },
+                },
+            ]
+        }
+
+
+class AddAgenticMemoriesArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for adding memories to the agentic memory container."""
+
+    # --- Payload Fields ---
+    messages: Optional[List[MessageItem]] = Field(
+        default=None, description='A list of messages for a conversational payload...'
+    )
+    structured_data: Optional[Dict[str, Any]] = Field(
+        default=None,
+        description='Structured data content for data memory. Required when payload_type is data.',
+    )
+    binary_data: Optional[str] = Field(
+        default=None,
+        description='Binary data content encoded as a Base64 string for binary payloads.',
+    )
+    payload_type: PayloadType = Field(
+        ..., description='The type of payload. Valid values are conversational or data.'
+    )
+
+    # --- Optional Fields ---
+    namespace: Optional[Dict[str, str]] = Field(
+        default=None, description='The namespace context for organizing memories...'
+    )
+    metadata: Optional[Dict[str, Any]] = Field(
+        default=None, description='Additional metadata for the memory...'
+    )
+    tags: Optional[Dict[str, Any]] = Field(
+        default=None, description='Tags for categorizing and organizing memories.'
+    )
+    infer: Optional[bool] = Field(
+        default=False,
+        description='Whether to use a large language model (LLM) to extract key information...',
+    )
+
+    @model_validator(mode='after')
+    def validate_payload_requirements(self) -> 'AddAgenticMemoriesArgs':
+        """Validate that the correct fields are provided based on payload_type."""
+        # Getting fields that were actually set
+        set_fields = self.model_fields_set
+
+        if self.payload_type == PayloadType.conversational:
+            if 'messages' not in set_fields:
+                raise PydanticCustomError(
+                    ERR_MESSAGES_REQUIRED,
+                    "'messages' field is required when payload_type is 'conversational'",
+                )
+            if 'structured_data' in set_fields:
+                raise PydanticCustomError(
+                    ERR_FIELD_PROHIBITED,
+                    "'structured_data' should not be provided when payload_type is 'conversational'",
+                    {'field_name': 'structured_data'},
+                )
+
+        elif self.payload_type == PayloadType.data:
+            if 'structured_data' not in set_fields:
+                raise PydanticCustomError(
+                    ERR_STRUCTURED_DATA_REQUIRED,
+                    "'structured_data' field is required when payload_type is 'data'",
+                )
+            if 'messages' in set_fields:
+                raise PydanticCustomError(
+                    ERR_FIELD_PROHIBITED,
+                    "'messages' should not be provided when payload_type is 'data'",
+                    {'field_name': 'messages'},
+                )
+
+        # Validate that at least one content field is provided
+        content_fields = {'messages', 'structured_data', 'binary_data'}
+        if not any(field in set_fields for field in content_fields):
+            raise PydanticCustomError(
+                ERR_MISSING_CONTENT_FIELD,
+                'At least one content field (messages, structured_data, or binary_data) must be provided',
+            )
+
+        return self
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    'memory_container_id': 'SdjmmpgBOh0h20Y9kWuN',
+                    # Conversational exchange between user and assistant
+                    'messages': [
+                        {
+                            # Standard chat roles: 'user', 'assistant'
+                            'role': 'user',
+                            'content': [
+                                {
+                                    'text': "I'm Bob, I really like swimming.",
+                                    'type': 'text',
+                                }
+                            ],
+                        },
+                        {
+                            'role': 'assistant',
+                            'content': [
+                                {
+                                    'text': 'Cool, nice. Hope you enjoy your life.',
+                                    'type': 'text',
+                                }
+                            ],
+                        },
+                    ],
+                    # Must match namespace from container strategies
+                    'namespace': {'user_id': 'bob'},
+                    'metadata': {
+                        # Custom workflow state tracking
+                        'status': 'checkpoint',
+                        # Supports branching conversations for exploration
+                        'branch': {
+                            # Branch identifier
+                            'branch_name': 'high',
+                            # Parent conversation point
+                            'root_event_id': '228nadfs879mtgk',
+                        },
+                    },
+                    # Enables filtering and categorization
+                    'tags': {'topic': 'personal info'},
+                    # Enables AI processing (summarization, semantic extraction, etc.)
+                    'infer': True,
+                    # Determines how AI strategies are applied
+                    'payload_type': 'conversational',
+                },
+                {
+                    'memory_container_id': 'SdjmmpgBOh0h20Y9kWuN',
+                    # Alternative to messages - for non-conversational data
+                    'structured_data': {
+                        'time_range': {'start': '2025-09-11', 'end': '2025-09-15'}
+                    },
+                    'namespace': {'agent_id': 'testAgent1'},
+                    # Flexible schema
+                    'metadata': {'status': 'checkpoint', 'anyobject': 'abc'},
+                    'tags': {'topic': 'agent_state'},
+                    # Skips AI processing - stores raw data only
+                    'infer': False,
+                    # Bypasses conversational AI pipelines
+                    'payload_type': 'data',
+                },
+            ]
+        }
+
+
+class GetAgenticMemoryArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for retrieving a specific agentic memory by its type and ID."""
+
+    memory_type: MemoryType = Field(
+        ...,
+        alias='type',
+        description='The memory type. Valid values are sessions, working, long-term, and history.',
+    )
+    id: str = Field(..., description='The ID of the memory to retrieve.')
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Active conversation data, agent state, and temporary context used during ongoing interactions
+                    'type': 'working',
+                    'id': 'XyEuiJkBeh2gPPwzjYWM',
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Processed knowledge and facts extracted from conversations over time via LLM inference
+                    'type': 'long-term',
+                    'id': 'DcxjTpkBvwXRq366C1Zz',
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Manages conversation sessions and their metadata (start time, participants, state)
+                    'type': 'sessions',
+                    'id': 'CcxjTpkBvwXRq366A1aE',
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Audit trail of all memory operations (add/update/delete) across the container
+                    'type': 'history',
+                    # Specific history record ID tracking memory evolution
+                    'id': 'eMxnTpkBvwXRq366hmAU',
+                },
+            ]
+        }
+
+
+class UpdateAgenticMemoryArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for updating a specific agentic memory by its type and ID."""
+
+    # --- Constants for Validation ---
+    _SESSION_ONLY_FIELDS: Set[str] = {'summary', 'agents', 'additional_info'}
+    _WORKING_ONLY_FIELDS: Set[str] = {'messages', 'structured_data', 'binary_data'}
+    _LONG_TERM_ONLY_FIELDS: Set[str] = {'memory'}
+    _UPDATABLE_WORKING_FIELDS: Set[str] = {
+        'messages',
+        'structured_data',
+        'binary_data',
+        'tags',
+        'metadata',
+    }
+    _UPDATABLE_LONG_TERM_FIELDS: Set[str] = {'memory', 'tags', 'metadata'}
+
+    # --- Required Path Fields ---
+    memory_type: Literal[MemoryType.sessions, MemoryType.working, MemoryType.long_term] = Field(
+        ...,
+        alias='type',
+        description='The memory type. Valid values are sessions, working, and long-term. Note that history memory cannot be updated.',
+    )
+    id: str = Field(..., description='The ID of the memory to update.')
+
+    # --- Session memory fields ---
+    summary: Optional[str] = Field(default=None, description='The summary of the session.')
+    metadata: Optional[Dict[str, Any]] = Field(
+        default=None,
+        description='Additional metadata for the memory (for example, status, branch, or custom fields).',
+    )
+    agents: Optional[Dict[str, Any]] = Field(
+        default=None, description='Additional information about the agents.'
+    )
+    additional_info: Optional[Dict[str, Any]] = Field(
+        default=None, description='Additional metadata to associate with the session.'
+    )
+
+    # --- Working memory fields ---
+    messages: Optional[List[MessageItem]] = Field(
+        default=None,
+        description='Updated conversation messages (for conversation type).',
+    )
+    structured_data: Optional[Dict[str, Any]] = Field(
+        default=None,
+        description='Updated structured data content (for data memory payloads).',
+    )
+    binary_data: Optional[Dict[str, Any]] = Field(
+        default=None,
+        description='Updated binary data content (for data memory payloads).',
+    )
+    tags: Optional[Dict[str, Any]] = Field(
+        default=None, description='Updated tags for categorization.'
+    )
+
+    # --- Long-term memory fields ---
+    memory: Optional[str] = Field(default=None, description='The updated memory content.')
+
+    @model_validator(mode='after')
+    def validate_memory_type_fields(self) -> 'UpdateAgenticMemoryArgs':
+        """Validate that fields match the specified memory_type and minimum requirements.
+
+        Ensures that:
+        1. Fields exclusive to one memory type (e.g., 'messages' for 'working') are not
+           provided when updating another type (e.g., 'sessions').
+        2. 'working' and 'long-term' updates provide at least one
+           updatable field.
+        """
+        set_fields = self.model_fields_set
+
+        def _raise_not_allowed_error(field_name: str, memory_type: str):
+            raise PydanticCustomError(
+                ERR_FIELD_NOT_ALLOWED,
+                "Field '{field_name}' should not be provided when updating {memory_type} memory",
+                {'field_name': field_name, 'memory_type': memory_type},
+            )
+
+        if self.memory_type == MemoryType.sessions:
+            disallowed_fields = self._WORKING_ONLY_FIELDS | self._LONG_TERM_ONLY_FIELDS
+            for field in disallowed_fields:
+                if field in set_fields:
+                    _raise_not_allowed_error(field, MemoryType.sessions)
+
+        elif self.memory_type == MemoryType.working:
+            disallowed_fields = self._SESSION_ONLY_FIELDS | self._LONG_TERM_ONLY_FIELDS
+            for field in disallowed_fields:
+                if field in set_fields:
+                    _raise_not_allowed_error(field, MemoryType.working)
+
+            if not any(field in set_fields for field in self._UPDATABLE_WORKING_FIELDS):
+                raise PydanticCustomError(
+                    ERR_MISSING_WORKING_FIELD,
+                    'At least one field ({fields}) must be provided for updating working memory',
+                    {'fields': ', '.join(self._UPDATABLE_WORKING_FIELDS)},
+                )
+
+        elif self.memory_type == MemoryType.long_term:
+            disallowed_fields = self._SESSION_ONLY_FIELDS | self._WORKING_ONLY_FIELDS
+            for field in disallowed_fields:
+                if field in set_fields:
+                    _raise_not_allowed_error(field, MemoryType.long_term)
+
+            if not any(field in set_fields for field in self._UPDATABLE_LONG_TERM_FIELDS):
+                raise PydanticCustomError(
+                    ERR_MISSING_LONG_TERM_FIELD,
+                    'At least one field ({fields}) must be provided for updating long-term memory',
+                    {'fields': ', '.join(self._UPDATABLE_LONG_TERM_FIELDS)},
+                )
+
+        return self
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'sessions',
+                    'id': 'N2CDipkB2Mtr6INFFcX8',
+                    'additional_info': {
+                        # Flexible object for storing any session-specific metadata
+                        'key1': 'value1',
+                        # Timestamp of the last activity in the session (ISO 8601 format)
+                        'last_activity': '2025-09-15T17:30:00Z',
+                    },
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'working',
+                    'id': 'XyEuiJkBeh2gPPwzjYWM',
+                    # Key-value pairs for categorizing and filtering working memories
+                    'tags': {'topic': 'updated_topic', 'priority': 'high'},
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'long-term',
+                    'id': 'DcxjTpkBvwXRq366C1Zz',
+                    # Actual memory content for long-term storage
+                    'memory': "User's name is Bob Smith",
+                    # Tags help in organizing and retrieving long-term memories
+                    'tags': {'topic': 'personal info', 'updated': 'true'},
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'working',
+                    'id': 'another_working_memory_id',
+                    # Array of conversation messages (typically used for conversational memory)
+                    'messages': [
+                        {
+                            # Role of the message sender (e.g., 'user', 'assistant')
+                            'role': 'user',
+                            'content': [
+                                # Content supports multiple types and structures
+                                {'text': 'Updated user message', 'type': 'text'}
+                            ],
+                        }
+                    ],
+                    # Custom key-value pairs for storing operational state or other context
+                    'metadata': {'status': 'updated'},
+                },
+            ]
+        }
+
+
+class DeleteAgenticMemoryByIDArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for deleting a specific agentic memory by its type and ID."""
+
+    memory_type: MemoryType = Field(
+        ...,
+        alias='type',
+        description='The type of memory to delete. Valid values are sessions, working, long-term, and history.',
+    )
+    id: str = Field(..., description='The ID of the specific memory to delete.')
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    # The unique identifier for the memory container from which the memory will be deleted
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Specifies the type of memory to delete. Valid values are 'sessions', 'working', 'long-term', and 'history'
+                    'type': 'working',
+                    # The unique identifier of the specific 'working' memory to be deleted
+                    'id': 'XyEuiJkBeh2gPPwzjYWM',
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Use to delete a long-term memory, which typically stores factual information
+                    'type': 'long-term',
+                    'id': 'DcxjTpkBvwXRq366C1Zz',
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Use to delete a session memory, which tracks conversation sessions
+                    'type': 'sessions',
+                    'id': 'CcxjTpkBvwXRq366A1aE',
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Use to delete a history memory, which maintains an audit trail of memory operations
+                    'type': 'history',
+                    'id': 'eMxnTpkBvwXRq366hmAU',
+                },
+            ]
+        }
+
+
+class DeleteAgenticMemoryByQueryArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for deleting agentic memories by query."""
+
+    memory_type: MemoryType = Field(
+        ...,
+        alias='type',
+        description='The type of memory to delete. Valid values are sessions, working, long-term, and history.',
+    )
+    query: Dict[str, Any] = Field(
+        ...,
+        description='The query to match the memories you want to delete. This should be a valid OpenSearch query DSL object.',
+    )
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    # The unique identifier for the memory container from which memories will be deleted
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # The type of memory to delete. Valid values are 'sessions', 'working', 'long-term', and 'history'
+                    'type': 'working',
+                    # Uses OpenSearch Query DSL to match all 'working' memories where the 'owner_id' field is "admin"
+                    'query': {'match': {'owner_id': 'admin'}},
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'long-term',
+                    # Deletes 'long-term' memories created before 2025-09-01; useful for data retention policies
+                    'query': {'range': {'created_time': {'lt': '2025-09-01'}}},
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'sessions',
+                    # Deletes 'sessions' memories for a specific user; 'term' query finds exact matches in the 'namespace.user_id' field
+                    'query': {'term': {'namespace.user_id': 'inactive_user'}},
+                },
+            ]
+        }
+
+
+class SearchAgenticMemoryArgs(BaseAgenticMemoryContainerArgs):
+    """Arguments for searching memories of a specific type within an agentic memory container."""
+
+    memory_type: MemoryType = Field(
+        ...,
+        alias='type',
+        description='The memory type. Valid values are sessions, working, long-term, and history.',
+    )
+    query: Dict[str, Any] = Field(..., description='The search query using OpenSearch query DSL.')
+    sort: Optional[List[Dict[str, Any]]] = Field(
+        default=None, description='Sort specification for the search results.'
+    )
+
+    class Config:
+        json_schema_extra = {
+            'examples': [
+                {
+                    # The unique identifier for the memory container to search within
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # Specifies the type of memory to search (e.g., sessions, long-term, working, history)
+                    'type': 'sessions',
+                    # OpenSearch Query DSL: matches all documents in the specified memory type
+                    'query': {'match_all': {}},
+                    # Sorts results by creation time, newest first
+                    'sort': [{'created_time': {'order': 'desc'}}],
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'long-term',
+                    'query': {
+                        # Term query finds exact matches in the 'namespace.user_id' field for user isolation
+                        'bool': {'must': [{'term': {'namespace.user_id': 'bob'}}]}
+                    },
+                    'sort': [{'created_time': {'order': 'desc'}}],
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    # 'history' type stores past interactions; typically searched with match_all to review chronologically
+                    'type': 'history',
+                    'query': {'match_all': {}},
+                    'sort': [{'created_time': {'order': 'desc'}}],
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'working',
+                    'query': {
+                        'bool': {
+                            # Finds memories for a specific user
+                            'must': [{'term': {'namespace.user_id': 'bob'}}],
+                            'must_not': [
+                                # Excludes memories that have a 'parent_memory_id' tag
+                                {'exists': {'field': 'tags.parent_memory_id'}}
+                            ],
+                        }
+                    },
+                    'sort': [{'created_time': {'order': 'desc'}}],
+                },
+                {
+                    'memory_container_id': 'HudqiJkB1SltqOcZusVU',
+                    'type': 'working',
+                    # Finds memories associated with a specific session
+                    'query': {'term': {'namespace.session_id': '123'}},
+                    'sort': [{'created_time': {'order': 'desc'}}],
+                },
+            ]
+        }
+
+
+__all__ = [
+    'MemoryType',
+    'PayloadType',
+    'CreateAgenticMemorySessionArgs',
+    'AddAgenticMemoriesArgs',
+    'GetAgenticMemoryArgs',
+    'UpdateAgenticMemoryArgs',
+    'DeleteAgenticMemoryByIDArgs',
+    'DeleteAgenticMemoryByQueryArgs',
+    'SearchAgenticMemoryArgs',
+]
diff --git a/src/tools/config.py b/src/tools/config.py
index 4325c05..4c0589f 100644
--- a/src/tools/config.py
+++ b/src/tools/config.py
@@ -3,6 +3,7 @@
 import copy
 import logging
+import os
 import re
 import yaml
 from typing import Dict, Any
@@ -236,6 +237,46 @@ def _apply_validated_configs(
             tool_info[field_name] = field_value
+
+
+def _apply_memory_container_defaults(
+    custom_registry: Dict[str, Any], container_id: str
+) -> None:
+    """
+    Set memory_container_id as a default in the input_schema of all agentic memory tools.
+
+    This modifies the JSON schema so that:
+    1. MCP clients see the default value (and the field is no longer required)
+    2. validate_args_for_mode can inject it at runtime when agents omit it
+
+    :param custom_registry: The registry to modify
+    :param container_id: The memory container ID to set as default
+    """
+    agentic_memory_tool_names = [
+        'CreateAgenticMemorySessionTool',
+        'AddAgenticMemoriesTool',
+        'GetAgenticMemoryTool',
+        'UpdateAgenticMemoryTool',
+        'DeleteAgenticMemoryByIDTool',
+        'DeleteAgenticMemoryByQueryTool',
+        'SearchAgenticMemoryTool',
+    ]
+
+    for tool_name in agentic_memory_tool_names:
+        if tool_name not in custom_registry:
+            continue
+
+        tool_info = custom_registry[tool_name]
+        base_schema = tool_info.get('input_schema') or {}
+        input_schema = copy.deepcopy(base_schema)
+        properties = input_schema.get('properties') or {}
+
+        if 'memory_container_id' in properties:
+            properties['memory_container_id']['default'] = container_id
+        if 'required' in input_schema and 'memory_container_id' in input_schema['required']:
+            input_schema['required'].remove('memory_container_id')
+
+        tool_info['input_schema'] = input_schema
+
+
 def apply_custom_tool_config(
     tool_registry: Dict[str, Any],
     config_file_path: str,
@@ -248,6 +289,9 @@ def apply_custom_tool_config(
     1. Config file settings (if config file is provided, CLI is completely ignored)
     2. CLI argument settings (only used if no config file is provided)
 
+    Additionally, if memory_container_id is configured (via config file or environment variable),
+    it will be automatically set as a default value for all agentic memory tools.
+
     :param tool_registry: The original tool registry
     :param config_file_path: Path to the YAML configuration file
     :param cli_tool_overrides: Dictionary of tool overrides from command line
@@ -255,6 +299,11 @@ def apply_custom_tool_config(
     """
     custom_registry = copy.deepcopy(tool_registry)
 
+    # Apply memory_container_id defaults to agentic memory tools
+    container_id = get_memory_container_id_from_config(config_file_path)
+    if container_id:
+        _apply_memory_container_defaults(custom_registry, container_id)
+
     # Load configuration from file
     config_from_file = {}
     if config_file_path:
@@ -284,3 +333,49 @@ def apply_custom_tool_config(
         default_tool_registry.update(custom_registry)
 
     return custom_registry
+
+
+def get_memory_container_id_from_config(config_file_path: str = '') -> str:
+    """Get memory container ID from config file or environment variable.
+
+    Priority order:
+    1. Config file (if provided and contains agentic_memory.memory_container_id)
+    2. Environment variable OPENSEARCH_MEMORY_CONTAINER_ID
+
+    :param config_file_path: Path to the YAML configuration file
+    :return: Memory container ID or empty string if not found
+    """
+    container_id = ''
+
+    if config_file_path:
+        try:
+            with open(config_file_path, 'r') as f:
+                config = yaml.safe_load(f)
+                if config:
+                    agentic_memory_config = config.get('agentic_memory', {})
+                    if isinstance(agentic_memory_config, dict):
+                        container_id = agentic_memory_config.get('memory_container_id', '')
+                        if container_id:
+                            logging.info(f'Using memory_container_id from config file: {container_id}')
+                            return container_id
+        except Exception as e:
+            logging.debug(f'Could not load memory_container_id from config file {config_file_path}: {e}')
+
+    container_id = os.getenv('OPENSEARCH_MEMORY_CONTAINER_ID', '')
+    if container_id:
+        logging.info(f'Using memory_container_id from environment variable: {container_id}')
+
+    return container_id
+
+
+def should_enable_agentic_memory_tools(config_file_path: str = '') -> bool:
+    """Check if agentic memory tools should be enabled.
+
+    Memory tools are enabled only if memory_container_id is configured either
+    in the config file or as an environment variable.
+
+    :param config_file_path: Path to the YAML configuration file
+    :return: True if agentic memory tools should be enabled, False otherwise
+    """
+    container_id = get_memory_container_id_from_config(config_file_path)
+    return bool(container_id)
diff --git a/src/tools/exceptions.py b/src/tools/exceptions.py
new file mode 100644
index 0000000..6061ee8
--- /dev/null
+++ b/src/tools/exceptions.py
@@ -0,0 +1,28 @@
+class OpenSearchMCPError(Exception):
+    """Base exception for all errors in the OpenSearch MCP project."""
+
+    pass
+
+
+class HelperOperationError(OpenSearchMCPError):
+    """Error raised when a helper function fails to perform its operation.
+    Enriches the exception with runtime context from the helper call.
+    """
+
+    def __init__(self, *, message: str, func_name: str, action: str, original: Exception):
+        """Initialize the HelperOperationError.
+
+        Args:
+            message: A high-level message for the exception (rarely used directly).
+            func_name: The name of the function where the error occurred.
+            action: A description of the action that failed (e.g., "get agentic memory").
+            original: The original exception that was caught and wrapped.
+        """
+        super().__init__(message)
+        self.func_name = func_name
+        self.action = action
+        self.original = original
+
+    def __str__(self):
+        """Return a human-readable string representation of the error."""
+        return f'{self.func_name} failed to {self.action}: {self.original}'
diff --git a/src/tools/generic_api_tool.py b/src/tools/generic_api_tool.py
index 91f34f7..e4c37cf 100644
--- a/src/tools/generic_api_tool.py
+++ b/src/tools/generic_api_tool.py
@@ -7,6 +7,7 @@
 from typing import Any, Dict, Optional
 from urllib.parse import urlencode
 
+from .tool_logging import log_tool_error
 from .tool_params import baseToolArgs
 from pydantic import BaseModel, Field
 
@@ -87,12 +88,11 @@ async def generic_opensearch_api_tool(args: GenericOpenSearchApiArgs) -> list[di
     valid_methods = ['GET', 'POST', 'PUT', 'DELETE', 'HEAD', 'PATCH']
     method = args.method.upper()
     if method not in valid_methods:
-        return [
-            {
-                'type': 'text',
-                'text': f'Error: Invalid HTTP method "{args.method}". Valid methods are: {", ".join(valid_methods)}',
-            }
-        ]
+        return log_tool_error(
+            'GenericOpenSearchApiTool',
+            ValueError(f'Invalid HTTP method "{args.method}". Valid methods are: {", ".join(valid_methods)}'),
+            'validating request',
+        )
 
     # Check if write operations are allowed using the global setting
     # Import here to avoid circular import (tool_filter -> tools -> generic_api_tool -> tool_filter)
@@ -102,16 +102,19 @@ async def generic_opensearch_api_tool(args: GenericOpenSearchApiArgs) -> list[di
     write_methods = ['POST', 'PUT', 'DELETE', 'PATCH']
     if method in write_methods and not allow_write:
-        return [
-            {
-                'type': 'text',
-                'text': f'Error: Write operations are disabled. Method "{method}" is not allowed. Enable write operations by setting OPENSEARCH_SETTINGS_ALLOW_WRITE=true or configuring allow_write: true in your config file.',
-            }
-        ]
+        return log_tool_error(
+            'GenericOpenSearchApiTool',
+            PermissionError(f'Write operations are disabled. Method "{method}" is not allowed.'),
+            'validating request',
+        )
 
     # Validate path
     if not args.path.startswith('/'):
-        return [{'type': 'text', 'text': 'Error: API path must start with "/"'}]
+        return log_tool_error(
+            'GenericOpenSearchApiTool',
+            ValueError('API path must start with "/"'),
+            'validating request',
+        )
 
     # Initialize OpenSearch client with context manager for proper cleanup
     from opensearch.client import get_opensearch_client
@@ -158,6 +161,8 @@ async def generic_opensearch_api_tool(args: GenericOpenSearchApiArgs) -> list[di
         return [{'type': 'text', 'text': f'{message}:\n{formatted_response}'}]
     except Exception as e:
-        error_message = f'Error calling OpenSearch API ({args.method} {args.path}): {str(e)}'
-        logger.error(error_message)
-        return [{'type': 'text', 'text': error_message}]
+        return log_tool_error(
+            'GenericOpenSearchApiTool', e,
+            f'calling OpenSearch API ({args.method} {args.path})',
+            method=args.method, path=args.path,
+        )
diff --git a/src/tools/skills_tools.py b/src/tools/skills_tools.py
index 8cc55d6..105711e 100644
--- a/src/tools/skills_tools.py
+++ b/src/tools/skills_tools.py
@@ -4,6 +4,7 @@
 import json
 import logging
 from typing import Dict, Any
 
+from .tool_logging import log_tool_error
 from .tool_params import baseToolArgs
 from pydantic import Field
 from opensearch.client import get_opensearch_client
@@ -45,7 +46,7 @@ async def call_opensearch_tool(tool_name: str, parameters: Dict[str, Any], args:
         return [{'type': 'text', 'text': f'{tool_name} result:\n{formatted_result}'}]
     except Exception as e:
-        return [{'type': 'text', 'text': f'Error executing {tool_name}: {str(e)}'}]
+        return log_tool_error(tool_name, e, f'executing {tool_name}')
 
 async def data_distribution_tool(args: DataDistributionToolArgs) -> list[dict]:
     params = {
diff --git a/src/tools/tool_filter.py b/src/tools/tool_filter.py
index 533b3fa..fc89040 100644
--- a/src/tools/tool_filter.py
+++ b/src/tools/tool_filter.py
@@ -184,6 +184,39 @@ def process_tool_filter(
     # Add core_tools as a built-in category using display name
     category_to_tools['core_tools'] = core_tools_display_name
 
+    # Initialize search_relevance tool names
+    search_relevance_tools = [
+        'CreateSearchConfigurationTool',
+        'GetSearchConfigurationTool',
+        'DeleteSearchConfigurationTool',
+        'GetQuerySetTool',
+        'CreateQuerySetTool',
+        'SampleQuerySetTool',
+        'DeleteQuerySetTool',
+        'GetJudgmentListTool',
+        'CreateJudgmentListTool',
+        'CreateUBIJudgmentListTool',
+        'CreateLLMJudgmentListTool',
+        'DeleteJudgmentListTool',
+        'GetExperimentTool',
+        'CreateExperimentTool',
+        'DeleteExperimentTool',
+        'SearchQuerySetsTool',
+        'SearchSearchConfigurationsTool',
+        'SearchJudgmentsTool',
+        'SearchExperimentsTool',
+    ]
+
+    # Build search_relevance tools list using display names
+    search_relevance_display_names = []
+    for tool_name in search_relevance_tools:
+        if tool_name in tool_registry:
+            tool_display_name = tool_registry[tool_name].get('display_name', tool_name)
+            search_relevance_display_names.append(tool_display_name)
+
+    # Add search_relevance as a built-in category (not enabled by default)
+    category_to_tools['search_relevance'] = search_relevance_display_names
+
     # Process YAML config
file if provided config = load_yaml_config(filter_path) if config: diff --git a/src/tools/tool_generator.py b/src/tools/tool_generator.py index f8688f7..ec9bfe8 100644 --- a/src/tools/tool_generator.py +++ b/src/tools/tool_generator.py @@ -6,6 +6,7 @@ import yaml import ssl import os +from .tool_logging import log_tool_error from .tool_params import baseToolArgs from .tools import TOOL_REGISTRY, check_tool_compatibility from mcp.types import TextContent @@ -224,10 +225,9 @@ def generate_tool_from_group(base_name: str, endpoints: List[Dict]) -> Dict[str, # Create the tool function that will execute the OpenSearch API async def tool_func(params: BaseModel) -> list[TextContent]: + tool_name = f'{base_name.replace("_", "")}Tool' try: from opensearch.client import get_opensearch_client - - tool_name = f'{base_name.replace("_", "")}Tool' params_dict = params.model_dump() if hasattr(params, 'model_dump') else {} try: @@ -243,11 +243,7 @@ async def tool_func(params: BaseModel) -> list[TextContent]: args = baseToolArgs(**base_args) except Exception as e: - return [ - TextContent( - type='text', text=f'Error initializing OpenSearch client: {str(e)}' - ) - ] + return log_tool_error(tool_name, e, 'initializing OpenSearch client') # Use context manager to ensure proper client cleanup async with get_opensearch_client(args) as request_client: @@ -280,7 +276,7 @@ async def tool_func(params: BaseModel) -> list[TextContent]: ] except Exception as e: - return [TextContent(type='text', text=f'Error: {str(e)}')] + return log_tool_error(tool_name, e, f'executing {tool_name}') # Create input schema with required fields input_schema = {'type': 'object', 'title': f'{base_name}Args', 'properties': all_parameters} diff --git a/src/tools/tool_logging.py b/src/tools/tool_logging.py new file mode 100644 index 0000000..326bf3d --- /dev/null +++ b/src/tools/tool_logging.py @@ -0,0 +1,91 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +""" +Structured error logging 
for tool functions. + +Provides a helper that extracts structured data from exceptions +(status codes, error types, root causes) before they are stringified, +and emits a structured log event for metric extraction. +""" + +import json +import logging + +logger = logging.getLogger(__name__) + + +def log_tool_error( + tool_name: str, + exception: Exception, + operation: str = '', + **context: object, +) -> list[dict]: + """Log a structured tool error and return the MCP error response. + + Extracts status_code, exception type, and root cause from opensearchpy + exceptions before they are lost to stringification. + + Args: + tool_name: The registry key of the tool (e.g. 'SearchIndexTool'). + exception: The caught exception object. + operation: Human-readable description of what failed + (e.g. 'searching index', 'getting mapping'). + **context: Additional context fields to include in the log + (e.g. index='my-index', method='GET'). + + Returns: + list[dict]: MCP-format error response + [{'type': 'text', 'text': 'Error ...'}]. + """ + error_text = f'Error {operation}: {exception}' if operation else f'Error: {exception}' + exception_type = type(exception).__name__ + + # Extract status_code from opensearchpy TransportError and subclasses. + # ConnectionError sets status_code = "N/A" (a string), so only keep ints. + raw_status = getattr(exception, 'status_code', None) + status_code = raw_status if isinstance(raw_status, int) else None + + # Extract root cause from opensearchpy error info. + # exception.info is a dict when opensearch-py parses the JSON response, + # but can be a raw JSON string when the request goes through the fallback path. + # The async library stores the response body in exception.error (2nd arg) + # rather than exception.info (3rd arg), so fall back to that. 
+ error_info = getattr(exception, 'info', None) + if error_info is None: + error_info = getattr(exception, 'error', None) + if isinstance(error_info, str): + try: + error_info = json.loads(error_info) + except (json.JSONDecodeError, TypeError): + error_info = None + root_cause = None + if isinstance(error_info, dict): + error_detail = error_info.get('error', {}) + if isinstance(error_detail, dict): + causes = error_detail.get('root_cause', []) + if causes and isinstance(causes, list) and len(causes) > 0: + root_cause = causes[0].get('type') + + log_extra: dict[str, object] = { + 'event_type': 'tool_error', + 'tool_name': tool_name, + 'exception_type': exception_type, + 'status': 'error', + } + if status_code is not None: + log_extra['status_code'] = status_code + if root_cause: + log_extra['root_cause'] = root_cause + + # Merge caller-provided context (index, query, method, etc.) + for key, value in context.items(): + if value is not None: + log_extra[key] = value + + logger.error( + f'Tool error: {tool_name} - {operation} ({exception_type})', + extra=log_extra, + ) + + return [{'type': 'text', 'text': error_text, 'is_error': True}] diff --git a/src/tools/tool_params.py b/src/tools/tool_params.py index f41b808..67c1781 100644 --- a/src/tools/tool_params.py +++ b/src/tools/tool_params.py @@ -2,19 +2,25 @@ # SPDX-License-Identifier: Apache-2.0 from pydantic import BaseModel, Field -from typing import Any, Optional, Type, TypeVar, Dict +from typing import Any, Dict, Literal, Optional, Type, TypeVar from mcp_server_opensearch.global_state import get_mode + T = TypeVar('T', bound=BaseModel) -def validate_args_for_mode(args_dict: Dict[str, Any], args_model_class: Type[T]) -> T: +def validate_args_for_mode( + args_dict: Dict[str, Any], + args_model_class: Type[T], + input_schema: Optional[Dict[str, Any]] = None, +) -> T: """ Validation middleware that handles mode-specific validation. 
Args: args_dict: Dictionary of arguments provided by the user args_model_class: The Pydantic model class to validate against + input_schema: Optional tool input schema containing default values Returns: Validated instance of args_model_class @@ -22,9 +28,20 @@ def validate_args_for_mode(args_dict: Dict[str, Any], args_model_class: Type[T]) # Get the current mode from global state mode = get_mode() + args_dict = args_dict.copy() # Don't modify the original + + # Inject defaults that were set in the tool's input_schema by the server + # at startup (e.g. memory_container_id from agentic_memory config / env var). + # Only non-None defaults are injected to avoid interfering with + # model_fields_set tracking used by cross-field validators. + if input_schema: + properties = input_schema.get('properties', {}) + for field_name, field_schema in properties.items(): + if 'default' in field_schema and field_schema['default'] is not None and field_name not in args_dict: + args_dict[field_name] = field_schema['default'] + if mode == 'single': # In single mode, add default values for base fields - args_dict = args_dict.copy() # Don't modify the original args_dict.setdefault('opensearch_cluster_name', '') try: @@ -79,7 +96,7 @@ class GetIndexMappingArgs(baseToolArgs): class SearchIndexArgs(baseToolArgs): index: str = Field(description='The name of the index to search in') - query: Any = Field(description='The search query in OpenSearch query DSL format. For date/time fields, if a range query is applied to a date/time field, it MUST include a "format" parameter. Most commonly "format": "strict_date_optional_time||epoch_millis" (supports ISO-8601 strings and epoch millis), e.g. {"range": {"timestamp": {"gte": "2025-12-29T17:15:12Z", "lte": "2025-12-30T08:15:12Z", "format": "strict_date_optional_time||epoch_millis"}}}; if date values use non-ISO or custom string formats (e.g. "2025/12/29 17:15:12"), the format MUST be explicitly adjusted to match the provided values. 
For text fields, use "match" queries for analyzed search (e.g. {"match": {"message": "error"}}). For exact matches on text fields with keyword subfields, use the ".keyword" suffix with a "term" query (e.g. {"term": {"status.keyword": "ERROR"}}).') + query_dsl: Any = Field(description='The search query in OpenSearch query DSL format. For keyword-type fields (mapping shows "type": "keyword"), use field name DIRECTLY - do NOT add .keyword suffix. For text-type fields with .keyword subfields, use the .keyword suffix for exact matches. For date/time range queries, MUST include "format" parameter (commonly "format": "strict_date_optional_time||epoch_millis"), e.g. {"range": {"timestamp": {"gte": "2025-12-29T17:15:12Z", "lte": "2025-12-30T08:15:12Z", "format": "strict_date_optional_time||epoch_millis"}}}; if using non-ISO formats, adjust "format" accordingly.') format: str = Field(default='json', description='Output format: "json" or "csv"') size: int = Field(default=10, description='Number of search results to return. The maximum allowed value is 100, unless overridden by configuration.') @@ -254,3 +271,376 @@ class Config: {'node_id': '_local', 'metric': 'jvm,os'}, ] } + + +class CreateSearchConfigurationArgs(baseToolArgs): + """Arguments for the CreateSearchConfigurationTool.""" + + name: str = Field(description='Name of the search configuration') + index: str = Field(description='The index to search') + query: str = Field( + description='The search query in OpenSearch DSL format, provided as a JSON string. ' + 'Use %SearchText% as a placeholder for the search term, e.g. 
' + '\'{"query":{"match":{"title":"%SearchText%"}}}\'' + ) + + class Config: + json_schema_extra = { + 'examples': [ + { + 'name': 'my-config', + 'index': 'my-index', + 'query': '{"query":{"match":{"title":"%SearchText%"}}}', + } + ] + } + + +class GetSearchConfigurationArgs(baseToolArgs): + """Arguments for the GetSearchConfigurationTool.""" + + search_configuration_id: str = Field(description='ID of the search configuration to retrieve') + + class Config: + json_schema_extra = {'examples': [{'search_configuration_id': 'abc123'}]} + + +class DeleteSearchConfigurationArgs(baseToolArgs): + """Arguments for the DeleteSearchConfigurationTool.""" + + search_configuration_id: str = Field(description='ID of the search configuration to delete') + + class Config: + json_schema_extra = {'examples': [{'search_configuration_id': 'abc123'}]} + + +class GetQuerySetArgs(baseToolArgs): + """Arguments for the GetQuerySetTool.""" + + query_set_id: str = Field(description='ID of the query set to retrieve') + + class Config: + json_schema_extra = { + 'examples': [ + {'query_set_id': 'my-query-set-id'}, + ] + } + + +class CreateQuerySetArgs(baseToolArgs): + """Arguments for the CreateQuerySetTool.""" + + name: str = Field(description='Name of the query set') + queries: str = Field( + description='JSON array of queries, e.g. 
["query1", "query2"] or [{"queryText": "query1"}]' + ) + description: str = Field(default='', description='Optional description of the query set') + + class Config: + json_schema_extra = { + 'examples': [ + { + 'name': 'my-query-set', + 'queries': '["laptop", "wireless headphones", "4k monitor"]', + 'description': 'Sample product search queries', + }, + ] + } + + +class SampleQuerySetArgs(baseToolArgs): + """Arguments for the SampleQuerySetTool.""" + + name: str = Field(description='Name of the query set') + query_set_size: int = Field( + default=20, description='Number of top queries to sample (default: 20)', ge=1 + ) + sampling: Literal['topn', 'random', 'pptss', 'all'] = Field( + default='topn', + description=( + 'Sampling method: "topn" (most frequent N queries), ' + '"random" (random sample), ' + '"pptss" (probability-proportional-to-size sampling), ' + '"all" (all queries)' + ), + ) + description: str = Field(default='', description='Optional description of the query set') + + class Config: + json_schema_extra = { + 'examples': [ + {'name': 'top-queries', 'query_set_size': 20}, + { + 'name': 'top-50-queries', + 'query_set_size': 50, + 'description': 'Top 50 most frequent user queries', + }, + { + 'name': 'random-queries', + 'query_set_size': 30, + 'sampling': 'random', + 'description': 'Random sample of 30 queries', + }, + ] + } + + +class DeleteQuerySetArgs(baseToolArgs): + """Arguments for the DeleteQuerySetTool.""" + + query_set_id: str = Field(description='ID of the query set to delete') + + class Config: + json_schema_extra = { + 'examples': [ + {'query_set_id': 'my-query-set-id'}, + ] + } + + +class GetJudgmentListArgs(baseToolArgs): + """Arguments for the GetJudgmentListTool.""" + + judgment_id: str = Field(description='ID of the judgment list to retrieve') + + class Config: + json_schema_extra = {'examples': [{'judgment_id': 'abc123'}]} + + +class CreateJudgmentListArgs(baseToolArgs): + """Arguments for the CreateJudgmentListTool.""" + + name: str = 
Field(description='Name of the judgment list') + judgment_ratings: str = Field( + description='JSON array of query-ratings objects. Each object must have ' + '"query" (string) and "ratings" (array of {"docId": string, "rating": number}). ' + 'Example: [{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}, {"docId": "doc2", "rating": 1}]}]' + ) + description: str = Field(default='', description='Optional description of the judgment list') + + class Config: + json_schema_extra = { + 'examples': [ + { + 'name': 'my-judgments', + 'judgment_ratings': '[{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}, {"docId": "doc2", "rating": 1}]}]', + }, + { + 'name': 'product-judgments', + 'description': 'Manual relevance judgments for product search', + 'judgment_ratings': '[{"query": "wireless headphones", "ratings": [{"docId": "prod-001", "rating": 3}, {"docId": "prod-002", "rating": 2}]}, {"query": "4k monitor", "ratings": [{"docId": "prod-010", "rating": 3}]}]', + }, + ] + } + + +class CreateUBIJudgmentListArgs(baseToolArgs): + """Arguments for the CreateUBIJudgmentListTool.""" + + name: str = Field(description='Name of the judgment list') + click_model: str = Field( + description='Click model used to derive relevance from UBI click data. 
' + 'Common value: "coec" (Clicks Over Expected Clicks)' + ) + max_rank: int = Field( + default=20, + description='Maximum rank position to consider when computing click signals (default: 20)', + ge=1, + ) + start_date: Optional[str] = Field( + default=None, + description='Start date for UBI event filtering in ISO format (YYYY-MM-DD)', + ) + end_date: Optional[str] = Field( + default=None, + description='End date for UBI event filtering in ISO format (YYYY-MM-DD)', + ) + + class Config: + json_schema_extra = { + 'examples': [ + {'name': 'ubi-judgments', 'click_model': 'coec'}, + { + 'name': 'ubi-judgments-q1', + 'click_model': 'coec', + 'max_rank': 20, + 'start_date': '2024-01-01', + 'end_date': '2024-03-31', + }, + ] + } + + +class CreateLLMJudgmentListArgs(baseToolArgs): + """Arguments for the CreateLLMJudgmentListTool.""" + + name: str = Field(description='Name of the judgment list') + query_set_id: str = Field(description='ID of the query set to use for generating judgments') + search_configuration_id: str = Field( + description='ID of the search configuration that defines how documents are retrieved' + ) + model_id: str = Field( + description='ID of the ML Commons model connector used to generate LLM relevance ratings' + ) + size: int = Field( + default=5, + description='Number of top documents to retrieve per query for rating (default: 5)', + ge=1, + ) + context_fields: str = Field( + default='[]', + description='JSON array of document field names to include as context for the LLM, e.g. ["title", "description"]. 
Defaults to all fields.', + ) + + class Config: + json_schema_extra = { + 'examples': [ + { + 'name': 'llm-judgments', + 'query_set_id': '5f0115ad-94b9-403a-912f-3e762870ccf6', + 'search_configuration_id': '2f90d4fd-bd5e-450f-95bb-eabe4a740bd1', + 'model_id': 'N8AE1osB0jLkkocYjz7D', + 'size': 5, + }, + { + 'name': 'llm-judgments-with-context', + 'query_set_id': '5f0115ad-94b9-403a-912f-3e762870ccf6', + 'search_configuration_id': '2f90d4fd-bd5e-450f-95bb-eabe4a740bd1', + 'model_id': 'N8AE1osB0jLkkocYjz7D', + 'size': 10, + 'context_fields': '["title", "description"]', + }, + ] + } + + +class DeleteJudgmentListArgs(baseToolArgs): + """Arguments for the DeleteJudgmentListTool.""" + + judgment_id: str = Field(description='ID of the judgment list to delete') + + class Config: + json_schema_extra = {'examples': [{'judgment_id': 'abc123'}]} + + +class GetExperimentArgs(baseToolArgs): + """Arguments for the GetExperimentTool.""" + + experiment_id: str = Field(description='ID of the experiment to retrieve') + + class Config: + json_schema_extra = {'examples': [{'experiment_id': 'abc123'}]} + + +class CreateExperimentArgs(baseToolArgs): + """Arguments for the CreateExperimentTool.""" + + query_set_id: str = Field(description='ID of the query set to use for the experiment') + search_configuration_ids: str = Field( + description='JSON array of search configuration IDs. ' + 'PAIRWISE_COMPARISON requires exactly 2, ' + 'POINTWISE_EVALUATION and HYBRID_OPTIMIZER require exactly 1. 
' + 'Example: ["config-id-1", "config-id-2"]' + ) + experiment_type: Literal['PAIRWISE_COMPARISON', 'POINTWISE_EVALUATION', 'HYBRID_OPTIMIZER'] = Field( + description=( + 'Type of experiment: ' + '"PAIRWISE_COMPARISON" (compares 2 search configurations, no judgment lists required), ' + '"POINTWISE_EVALUATION" (evaluates 1 configuration against judgment lists), ' + '"HYBRID_OPTIMIZER" (optimizes 1 configuration using judgment lists)' + ) + ) + size: int = Field( + default=10, + description='Number of results to retrieve per query (default: 10)', + ge=1, + ) + judgment_list_ids: Optional[str] = Field( + default=None, + description='JSON array of judgment list IDs. Required for POINTWISE_EVALUATION and HYBRID_OPTIMIZER. ' + 'Example: ["judgment-id-1"] or ["judgment-id-1", "judgment-id-2"]', + ) + + class Config: + json_schema_extra = { + 'examples': [ + { + 'query_set_id': 'qs-123', + 'search_configuration_ids': '["config-1", "config-2"]', + 'experiment_type': 'PAIRWISE_COMPARISON', + 'size': 10, + }, + { + 'query_set_id': 'qs-123', + 'search_configuration_ids': '["config-1"]', + 'experiment_type': 'POINTWISE_EVALUATION', + 'judgment_list_ids': '["judgment-1"]', + }, + ] + } + + +class DeleteExperimentArgs(baseToolArgs): + """Arguments for the DeleteExperimentTool.""" + + experiment_id: str = Field(description='ID of the experiment to delete') + + class Config: + json_schema_extra = {'examples': [{'experiment_id': 'abc123'}]} + + +_SRW_SEARCH_QUERY_BODY_DESCRIPTION = ( + 'OpenSearch query DSL body to filter, sort, and paginate results. ' + 'Defaults to {"query": {"match_all": {}}} if not provided. 
' + 'Example: {"query": {"match": {"name": "my-config"}}, "size": 20}' +) + +_SRW_SEARCH_QUERY_BODY_EXAMPLES = [ + {}, + {'query_body': {'query': {'match_all': {}}, 'size': 20}}, + {'query_body': {'query': {'match': {'name': 'my-name'}}}}, +] + + +class SearchQuerySetsArgs(baseToolArgs): + """Arguments for the SearchQuerySetsTool.""" + + query_body: Optional[Any] = Field( + default=None, description=_SRW_SEARCH_QUERY_BODY_DESCRIPTION + ) + + class Config: + json_schema_extra = {'examples': _SRW_SEARCH_QUERY_BODY_EXAMPLES} + + +class SearchSearchConfigurationsArgs(baseToolArgs): + """Arguments for the SearchSearchConfigurationsTool.""" + + query_body: Optional[Any] = Field( + default=None, description=_SRW_SEARCH_QUERY_BODY_DESCRIPTION + ) + + class Config: + json_schema_extra = {'examples': _SRW_SEARCH_QUERY_BODY_EXAMPLES} + + +class SearchJudgmentsArgs(baseToolArgs): + """Arguments for the SearchJudgmentsTool.""" + + query_body: Optional[Any] = Field( + default=None, description=_SRW_SEARCH_QUERY_BODY_DESCRIPTION + ) + + class Config: + json_schema_extra = {'examples': _SRW_SEARCH_QUERY_BODY_EXAMPLES} + + +class SearchExperimentsArgs(baseToolArgs): + """Arguments for the SearchExperimentsTool.""" + + query_body: Optional[Any] = Field( + default=None, description=_SRW_SEARCH_QUERY_BODY_DESCRIPTION + ) + + class Config: + json_schema_extra = {'examples': _SRW_SEARCH_QUERY_BODY_EXAMPLES} diff --git a/src/tools/tools.py b/src/tools/tools.py index d422157..09a7476 100644 --- a/src/tools/tools.py +++ b/src/tools/tools.py @@ -3,42 +3,83 @@ import json from .tool_params import ( + CatNodesArgs, + CreateJudgmentListArgs, + CreateLLMJudgmentListArgs, + CreateSearchConfigurationArgs, + CreateUBIJudgmentListArgs, + DeleteJudgmentListArgs, + DeleteSearchConfigurationArgs, GetAllocationArgs, GetClusterStateArgs, GetIndexInfoArgs, GetIndexMappingArgs, GetIndexStatsArgs, + GetJudgmentListArgs, GetLongRunningTasksArgs, - CatNodesArgs, GetNodesArgs, GetNodesHotThreadsArgs, 
GetQueryInsightsArgs, + GetSearchConfigurationArgs, GetSegmentsArgs, GetShardsArgs, ListIndicesArgs, SearchIndexArgs, + GetQuerySetArgs, + CreateQuerySetArgs, + SampleQuerySetArgs, + DeleteQuerySetArgs, + GetExperimentArgs, + CreateExperimentArgs, + DeleteExperimentArgs, + SearchQuerySetsArgs, + SearchSearchConfigurationsArgs, + SearchJudgmentsArgs, + SearchExperimentsArgs, baseToolArgs, ) +from .tool_logging import log_tool_error from .utils import is_tool_compatible from opensearch.helper import ( convert_search_results_to_csv, + create_judgment_list, + create_llm_judgment_list, + create_search_configuration, + create_ubi_judgment_list, + delete_judgment_list, + delete_search_configuration, get_allocation, get_cluster_state, get_index, get_index_info, get_index_mapping, get_index_stats, + get_judgment_list, get_long_running_tasks, get_nodes, - get_nodes_info, get_nodes_hot_threads, + get_nodes_info, get_opensearch_version, get_query_insights, + get_search_configuration, get_segments, get_shards, list_indices, search_index, + get_query_set, + create_query_set, + sample_query_set, + delete_query_set, + get_experiment, + create_experiment, + delete_experiment, + search_query_sets, + search_search_configurations, + search_judgments, + search_experiments, ) +from tools.agentic_memory.actions import * +from tools.agentic_memory.params import * from .skills_tools import SKILLS_TOOLS_REGISTRY @@ -93,7 +134,7 @@ async def list_indices_tool(args: ListIndicesArgs) -> list[dict]: formatted_names = json.dumps(index_names, separators=(',', ':')) return [{'type': 'text', 'text': f'Indices:\n{formatted_names}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error listing indices: {str(e)}'}] + return log_tool_error('ListIndexTool', e, 'listing indices', index=getattr(args, 'index', None)) async def get_index_mapping_tool(args: GetIndexMappingArgs) -> list[dict]: @@ -104,14 +145,14 @@ async def get_index_mapping_tool(args: GetIndexMappingArgs) -> list[dict]: return 
[{'type': 'text', 'text': f'Mapping for {args.index}:\n{formatted_mapping}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting mapping: {str(e)}'}] + return log_tool_error('IndexMappingTool', e, 'getting mapping', index=args.index) async def search_index_tool(args: SearchIndexArgs) -> list[dict]: try: await check_tool_compatibility('SearchIndexTool', args) result = await search_index(args) - + if args.format.lower() == 'csv': csv_result = convert_search_results_to_csv(result) return [ @@ -129,7 +170,7 @@ async def search_index_tool(args: SearchIndexArgs) -> list[dict]: } ] except Exception as e: - return [{'type': 'text', 'text': f'Error searching index: {str(e)}'}] + return log_tool_error('SearchIndexTool', e, 'searching index', index=args.index) async def get_shards_tool(args: GetShardsArgs) -> list[dict]: @@ -138,7 +179,7 @@ async def get_shards_tool(args: GetShardsArgs) -> list[dict]: result = await get_shards(args) if isinstance(result, dict) and 'error' in result: - return [{'type': 'text', 'text': f'Error getting shards: {result["error"]}'}] + return log_tool_error('GetShardsTool', Exception(result['error']), 'getting shards', index=getattr(args, 'index', None)) formatted_text = 'index | shard | prirep | state | docs | store | ip | node\n' # Format each shard row @@ -154,7 +195,7 @@ async def get_shards_tool(args: GetShardsArgs) -> list[dict]: return [{'type': 'text', 'text': formatted_text}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting shards information: {str(e)}'}] + return log_tool_error('GetShardsTool', e, 'getting shards information', index=getattr(args, 'index', None)) async def get_cluster_state_tool(args: GetClusterStateArgs) -> list[dict]: @@ -182,7 +223,7 @@ async def get_cluster_state_tool(args: GetClusterStateArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_result}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting cluster state: 
{str(e)}'}] + return log_tool_error('GetClusterStateTool', e, 'getting cluster state') async def get_segments_tool(args: GetSegmentsArgs) -> list[dict]: @@ -199,7 +240,7 @@ async def get_segments_tool(args: GetSegmentsArgs) -> list[dict]: result = await get_segments(args) if isinstance(result, dict) and 'error' in result: - return [{'type': 'text', 'text': f'Error getting segments: {result["error"]}'}] + return log_tool_error('GetSegmentsTool', Exception(result['error']), 'getting segments', index=getattr(args, 'index', None)) # Create a formatted table for better readability formatted_text = 'index | shard | prirep | segment | generation | docs.count | docs.deleted | size | memory.bookkeeping | memory.vectors | memory.docvalues | memory.terms | version\n' @@ -229,7 +270,7 @@ async def get_segments_tool(args: GetSegmentsArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_text}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting segment information: {str(e)}'}] + return log_tool_error('GetSegmentsTool', e, 'getting segment information', index=getattr(args, 'index', None)) async def cat_nodes_tool(args: CatNodesArgs) -> list[dict]: @@ -246,7 +287,7 @@ async def cat_nodes_tool(args: CatNodesArgs) -> list[dict]: result = await get_nodes(args) if isinstance(result, dict) and 'error' in result: - return [{'type': 'text', 'text': f'Error getting nodes: {result["error"]}'}] + return log_tool_error('CatNodesTool', Exception(result['error']), 'getting node information') # If no nodes found if not result: @@ -272,7 +313,7 @@ async def cat_nodes_tool(args: CatNodesArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_text}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting node information: {str(e)}'}] + return log_tool_error('CatNodesTool', e, 'getting node information') async def get_index_info_tool(args: GetIndexInfoArgs) -> list[dict]: @@ -296,7 +337,7 @@ async def 
get_index_info_tool(args: GetIndexInfoArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_result}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting index information: {str(e)}'}] + return log_tool_error('GetIndexInfoTool', e, 'getting index information', index=args.index) async def get_index_stats_tool(args: GetIndexStatsArgs) -> list[dict]: @@ -322,7 +363,7 @@ async def get_index_stats_tool(args: GetIndexStatsArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_result}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting index statistics: {str(e)}'}] + return log_tool_error('GetIndexStatsTool', e, 'getting index statistics', index=args.index) async def get_query_insights_tool(args: GetQueryInsightsArgs) -> list[dict]: @@ -346,7 +387,7 @@ async def get_query_insights_tool(args: GetQueryInsightsArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_result}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting query insights: {str(e)}'}] + return log_tool_error('GetQueryInsightsTool', e, 'getting query insights') async def get_nodes_hot_threads_tool(args: GetNodesHotThreadsArgs) -> list[dict]: @@ -368,7 +409,7 @@ async def get_nodes_hot_threads_tool(args: GetNodesHotThreadsArgs) -> list[dict] # The hot_threads API returns text, not JSON, so we don't need to format it return [{'type': 'text', 'text': f'{message}:\n{result}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting hot threads information: {str(e)}'}] + return log_tool_error('GetNodesHotThreadsTool', e, 'getting hot threads information') async def get_allocation_tool(args: GetAllocationArgs) -> list[dict]: @@ -385,12 +426,7 @@ async def get_allocation_tool(args: GetAllocationArgs) -> list[dict]: result = await get_allocation(args) if isinstance(result, dict) and 'error' in result: - return [ - { - 'type': 'text', - 'text': f'Error 
getting allocation information: {result["error"]}', - } - ] + return log_tool_error('GetAllocationTool', Exception(result['error']), 'getting allocation information') # If no allocation information found if not result: @@ -414,7 +450,7 @@ async def get_allocation_tool(args: GetAllocationArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_text}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting allocation information: {str(e)}'}] + return log_tool_error('GetAllocationTool', e, 'getting allocation information') async def get_nodes_tool(args: GetNodesArgs) -> list[dict]: @@ -431,9 +467,7 @@ async def get_nodes_tool(args: GetNodesArgs) -> list[dict]: result = await get_nodes_info(args) if isinstance(result, dict) and 'error' in result: - return [ - {'type': 'text', 'text': f'Error getting nodes information: {result["error"]}'} - ] + return log_tool_error('GetNodesTool', Exception(result['error']), 'getting nodes information') # Format the response for better readability formatted_result = json.dumps(result, separators=(',', ':')) @@ -450,7 +484,7 @@ async def get_nodes_tool(args: GetNodesArgs) -> list[dict]: return [{'type': 'text', 'text': f'{message}:\n{formatted_result}'}] except Exception as e: - return [{'type': 'text', 'text': f'Error getting nodes information: {str(e)}'}] + return log_tool_error('GetNodesTool', e, 'getting nodes information') async def get_long_running_tasks_tool(args: GetLongRunningTasksArgs) -> list[dict]: @@ -467,9 +501,7 @@ async def get_long_running_tasks_tool(args: GetLongRunningTasksArgs) -> list[dic result = await get_long_running_tasks(args) if isinstance(result, dict) and 'error' in result: - return [ - {'type': 'text', 'text': f'Error getting long-running tasks: {result["error"]}'} - ] + return log_tool_error('GetLongRunningTasksTool', Exception(result['error']), 'getting long-running tasks') # If no tasks found if not result: @@ -493,14 +525,436 @@ async def 
get_long_running_tasks_tool(args: GetLongRunningTasksArgs) -> list[dic return [{'type': 'text', 'text': f'{message}:\n{formatted_text}'}] except Exception as e: + return log_tool_error('GetLongRunningTasksTool', e, 'getting long-running tasks information') + + +async def create_search_configuration_tool(args: CreateSearchConfigurationArgs) -> list[dict]: + """Tool to create a search configuration via the Search Relevance plugin. + + Args: + args: CreateSearchConfigurationArgs + + Returns: + list[dict]: Created configuration details in MCP format + """ + try: + await check_tool_compatibility('CreateSearchConfigurationTool', args) + result = await create_search_configuration(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Search configuration created:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('CreateSearchConfigurationTool', e, 'creating search configuration') + + +async def get_search_configuration_tool(args: GetSearchConfigurationArgs) -> list[dict]: + """Tool to retrieve a search configuration by ID. + + Args: + args: GetSearchConfigurationArgs + + Returns: + list[dict]: Search configuration details in MCP format + """ + try: + await check_tool_compatibility('GetSearchConfigurationTool', args) + result = await get_search_configuration(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [ + { + 'type': 'text', + 'text': f'Search configuration {args.search_configuration_id}:\n{formatted_result}', + } + ] + except Exception as e: + return log_tool_error('GetSearchConfigurationTool', e, 'retrieving search configuration') + + +async def delete_search_configuration_tool(args: DeleteSearchConfigurationArgs) -> list[dict]: + """Tool to delete a search configuration by ID. 
+ + Args: + args: DeleteSearchConfigurationArgs + + Returns: + list[dict]: Deletion result in MCP format + """ + try: + await check_tool_compatibility('DeleteSearchConfigurationTool', args) + result = await delete_search_configuration(args) + formatted_result = json.dumps(result, separators=(',', ':')) return [ - {'type': 'text', 'text': f'Error getting long-running tasks information: {str(e)}'} + { + 'type': 'text', + 'text': f'Search configuration {args.search_configuration_id} deleted:\n{formatted_result}', + } ] + except Exception as e: + return log_tool_error('DeleteSearchConfigurationTool', e, 'deleting search configuration') + + +async def get_query_set_tool(args: GetQuerySetArgs) -> list[dict]: + """Tool to retrieve a specific query set by ID from the Search Relevance plugin. + + Args: + args: GetQuerySetArgs containing the query_set_id + + Returns: + list[dict]: Query set details in MCP format + """ + try: + await check_tool_compatibility('GetQuerySetTool', args) + result = await get_query_set(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Query set {args.query_set_id}:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('GetQuerySetTool', e, 'retrieving query set') + + +async def create_query_set_tool(args: CreateQuerySetArgs) -> list[dict]: + """Tool to create a new query set with a list of queries. 
+ + Args: + args: CreateQuerySetArgs containing name, queries (JSON string), and optional description + + Returns: + list[dict]: Result of the creation operation in MCP format + """ + try: + await check_tool_compatibility('CreateQuerySetTool', args) + result = await create_query_set(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Query set created:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('CreateQuerySetTool', e, 'creating query set') + + +async def sample_query_set_tool(args: SampleQuerySetArgs) -> list[dict]: + """Tool to create a query set by sampling top queries from user behavior data (UBI). + + Args: + args: SampleQuerySetArgs containing name, query_set_size, and optional description + + Returns: + list[dict]: Result of the sampling operation in MCP format + """ + try: + await check_tool_compatibility('SampleQuerySetTool', args) + result = await sample_query_set(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Query set sampled:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('SampleQuerySetTool', e, 'sampling query set') + + +async def delete_query_set_tool(args: DeleteQuerySetArgs) -> list[dict]: + """Tool to delete a query set by ID. 
+ + Args: + args: DeleteQuerySetArgs containing the query_set_id + + Returns: + list[dict]: Result of the deletion operation in MCP format + """ + try: + await check_tool_compatibility('DeleteQuerySetTool', args) + result = await delete_query_set(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Query set {args.query_set_id} deleted:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('DeleteQuerySetTool', e, 'deleting query set') + + +async def get_judgment_list_tool(args: GetJudgmentListArgs) -> list[dict]: + """Tool to retrieve a specific judgment list by ID from the Search Relevance plugin. + + Args: + args: GetJudgmentListArgs containing the judgment_id + + Returns: + list[dict]: Judgment list details in MCP format + """ + try: + await check_tool_compatibility('GetJudgmentListTool', args) + result = await get_judgment_list(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Judgment list {args.judgment_id}:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('GetJudgmentListTool', e, 'retrieving judgment list') + + +async def create_judgment_list_tool(args: CreateJudgmentListArgs) -> list[dict]: + """Tool to create a judgment list with manual relevance ratings. 
+ + Args: + args: CreateJudgmentListArgs containing name, judgment_ratings (JSON string), and optional description + + Returns: + list[dict]: Result of the creation operation in MCP format + """ + try: + await check_tool_compatibility('CreateJudgmentListTool', args) + result = await create_judgment_list(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Judgment list created:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('CreateJudgmentListTool', e, 'creating judgment list') + + +async def create_ubi_judgment_list_tool(args: CreateUBIJudgmentListArgs) -> list[dict]: + """Tool to create a judgment list by mining relevance signals from UBI click data. + + Args: + args: CreateUBIJudgmentListArgs containing name, click_model, max_rank, and optional date range + + Returns: + list[dict]: Result of the creation operation in MCP format + """ + try: + await check_tool_compatibility('CreateUBIJudgmentListTool', args) + result = await create_ubi_judgment_list(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'UBI judgment list created:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('CreateUBIJudgmentListTool', e, 'creating UBI judgment list') + + +async def delete_judgment_list_tool(args: DeleteJudgmentListArgs) -> list[dict]: + """Tool to delete a judgment list by ID from the Search Relevance plugin. 
+ + Args: + args: DeleteJudgmentListArgs containing the judgment_id + + Returns: + list[dict]: Result of the deletion operation in MCP format + """ + try: + await check_tool_compatibility('DeleteJudgmentListTool', args) + result = await delete_judgment_list(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Judgment list {args.judgment_id} deleted:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('DeleteJudgmentListTool', e, 'deleting judgment list') + + +async def create_llm_judgment_list_tool(args: CreateLLMJudgmentListArgs) -> list[dict]: + """Tool to create a judgment list using an LLM model via the Search Relevance plugin. + + For each query in the query set, the top k documents are retrieved using the + specified search configuration and rated by the LLM model. + + Args: + args: CreateLLMJudgmentListArgs containing name, query_set_id, search_configuration_id, + model_id, size, and optional context_fields + + Returns: + list[dict]: Result of the creation operation in MCP format + """ + try: + await check_tool_compatibility('CreateLLMJudgmentListTool', args) + result = await create_llm_judgment_list(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'LLM judgment list created:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('CreateLLMJudgmentListTool', e, 'creating LLM judgment list') + + +async def get_experiment_tool(args: GetExperimentArgs) -> list[dict]: + """Tool to retrieve a specific experiment by ID from the Search Relevance plugin. 
+ + Args: + args: GetExperimentArgs containing the experiment_id + + Returns: + list[dict]: Experiment details in MCP format + """ + try: + await check_tool_compatibility('GetExperimentTool', args) + result = await get_experiment(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Experiment {args.experiment_id}:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('GetExperimentTool', e, 'retrieving experiment') + + +async def create_experiment_tool(args: CreateExperimentArgs) -> list[dict]: + """Tool to create a search relevance experiment via the Search Relevance plugin. + + Args: + args: CreateExperimentArgs containing query_set_id, search_configuration_ids, + experiment_type, size, and optional judgment_list_ids + + Returns: + list[dict]: Result of the creation operation in MCP format + """ + try: + await check_tool_compatibility('CreateExperimentTool', args) + result = await create_experiment(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Experiment created:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('CreateExperimentTool', e, 'creating experiment') + + +async def delete_experiment_tool(args: DeleteExperimentArgs) -> list[dict]: + """Tool to delete an experiment by ID from the Search Relevance plugin. 
+ + Args: + args: DeleteExperimentArgs containing the experiment_id + + Returns: + list[dict]: Result of the deletion operation in MCP format + """ + try: + await check_tool_compatibility('DeleteExperimentTool', args) + result = await delete_experiment(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Experiment {args.experiment_id} deleted:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('DeleteExperimentTool', e, 'deleting experiment') + + +async def search_query_sets_tool(args: SearchQuerySetsArgs) -> list[dict]: + """Tool to search query sets using OpenSearch query DSL. + + Args: + args: SearchQuerySetsArgs containing an optional query_body + + Returns: + list[dict]: Search results in MCP format + """ + try: + await check_tool_compatibility('SearchQuerySetsTool', args) + result = await search_query_sets(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Query set search results:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('SearchQuerySetsTool', e, 'searching query sets') + + +async def search_search_configurations_tool(args: SearchSearchConfigurationsArgs) -> list[dict]: + """Tool to search search configurations using OpenSearch query DSL. 
+ + Args: + args: SearchSearchConfigurationsArgs containing an optional query_body + + Returns: + list[dict]: Search results in MCP format + """ + try: + await check_tool_compatibility('SearchSearchConfigurationsTool', args) + result = await search_search_configurations(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Search configuration search results:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('SearchSearchConfigurationsTool', e, 'searching search configurations') + + +async def search_judgments_tool(args: SearchJudgmentsArgs) -> list[dict]: + """Tool to search judgments using OpenSearch query DSL. + + Args: + args: SearchJudgmentsArgs containing an optional query_body + + Returns: + list[dict]: Search results in MCP format + """ + try: + await check_tool_compatibility('SearchJudgmentsTool', args) + result = await search_judgments(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Judgment search results:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('SearchJudgmentsTool', e, 'searching judgments') + + +async def search_experiments_tool(args: SearchExperimentsArgs) -> list[dict]: + """Tool to search experiments using OpenSearch query DSL. 
+ + Args: + args: SearchExperimentsArgs containing an optional query_body + + Returns: + list[dict]: Search results in MCP format + """ + try: + await check_tool_compatibility('SearchExperimentsTool', args) + result = await search_experiments(args) + formatted_result = json.dumps(result, separators=(',', ':')) + return [{'type': 'text', 'text': f'Experiment search results:\n{formatted_result}'}] + except Exception as e: + return log_tool_error('SearchExperimentsTool', e, 'searching experiments') + from .generic_api_tool import GenericOpenSearchApiArgs, generic_opensearch_api_tool +# Agentic memory tools registry (conditionally registered based on configuration) +AGENTIC_MEMORY_TOOLS = { + 'CreateAgenticMemorySessionTool': { + 'display_name': 'CreateAgenticMemorySessionTool', + 'description': 'Create a new session in a memory container.', + 'input_schema': CreateAgenticMemorySessionArgs.model_json_schema(), + 'function': create_agentic_memory_session_tool, + 'args_model': CreateAgenticMemorySessionArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'POST', + }, + 'AddAgenticMemoriesTool': { + 'display_name': 'AddAgenticMemoriesTool', + 'description': 'Add an agentic memory to a memory container.', + 'input_schema': AddAgenticMemoriesArgs.model_json_schema(), + 'function': add_agentic_memories_tool, + 'args_model': AddAgenticMemoriesArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'POST', + }, + 'GetAgenticMemoryTool': { + 'display_name': 'GetAgenticMemoryTool', + 'description': 'Retrieve a specific memory by its type and ID.', + 'input_schema': GetAgenticMemoryArgs.model_json_schema(), + 'function': get_agentic_memory_tool, + 'args_model': GetAgenticMemoryArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'GET', + }, + 'UpdateAgenticMemoryTool': { + 'display_name': 'UpdateAgenticMemoryTool', + 'description': 'Update a specific 
memory by its type and ID.', + 'input_schema': UpdateAgenticMemoryArgs.model_json_schema(), + 'function': update_agentic_memory_tool, + 'args_model': UpdateAgenticMemoryArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'PUT', + }, + 'DeleteAgenticMemoryByIDTool': { + 'display_name': 'DeleteAgenticMemoryByIDTool', + 'description': 'Deletes a specific agentic memory by its type and ID.', + 'input_schema': DeleteAgenticMemoryByIDArgs.model_json_schema(), + 'function': delete_agentic_memory_by_id_tool, + 'args_model': DeleteAgenticMemoryByIDArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'DELETE', + }, + 'DeleteAgenticMemoryByQueryTool': { + 'display_name': 'DeleteAgenticMemoryByQueryTool', + 'description': 'Deletes agentic memories matching a query.', + 'input_schema': DeleteAgenticMemoryByQueryArgs.model_json_schema(), + 'function': delete_agentic_memory_by_query_tool, + 'args_model': DeleteAgenticMemoryByQueryArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'POST', + }, + 'SearchAgenticMemoryTool': { + 'display_name': 'SearchAgenticMemoryTool', + 'description': 'Search for memories of a specific type within a memory container.', + 'input_schema': SearchAgenticMemoryArgs.model_json_schema(), + 'function': search_agentic_memory_tool, + 'args_model': SearchAgenticMemoryArgs, + 'min_version': '3.3.0', # Agentic memory APIs require OpenSearch 3.3+ + 'http_methods': 'GET', + }, +} + + # Registry of available OpenSearch tools with their metadata TOOL_REGISTRY = { **SKILLS_TOOLS_REGISTRY, @@ -523,7 +977,7 @@ async def get_long_running_tasks_tool(args: GetLongRunningTasksArgs) -> list[dic }, 'SearchIndexTool': { 'display_name': 'SearchIndexTool', - 'description': 'Searches an index using a query written in query domain-specific language (DSL) in OpenSearch', + 'description': 'Searches an index using a query written in query 
domain-specific language (DSL) in OpenSearch. PREREQUISITE: You need to know the mappings of the index before constructing queries.', 'input_schema': SearchIndexArgs.model_json_schema(), 'function': search_index_tool, 'args_model': SearchIndexArgs, @@ -627,6 +1081,127 @@ async def get_long_running_tasks_tool(args: GetLongRunningTasksArgs) -> list[dic 'min_version': '1.0.0', 'http_methods': 'GET', }, + 'GetQuerySetTool': { + 'display_name': 'GetQuerySetTool', + 'description': 'Retrieves a specific query set by ID from the OpenSearch Search Relevance plugin. Query sets are collections of search queries used for relevance testing and evaluation.', + 'input_schema': GetQuerySetArgs.model_json_schema(), + 'function': get_query_set_tool, + 'args_model': GetQuerySetArgs, + 'min_version': '3.1.0', + 'http_methods': 'GET', + }, + 'CreateQuerySetTool': { + 'display_name': 'CreateQuerySetTool', + 'description': 'Creates a new query set in the OpenSearch Search Relevance plugin by providing a list of queries. 
Query sets are used for relevance testing and evaluation.', + 'input_schema': CreateQuerySetArgs.model_json_schema(), + 'function': create_query_set_tool, + 'args_model': CreateQuerySetArgs, + 'min_version': '3.1.0', + 'http_methods': 'PUT', + }, + 'SampleQuerySetTool': { + 'display_name': 'SampleQuerySetTool', + 'description': 'Creates a query set by sampling the top N most frequent queries from user behavior data (UBI indices) in the OpenSearch Search Relevance plugin.', + 'input_schema': SampleQuerySetArgs.model_json_schema(), + 'function': sample_query_set_tool, + 'args_model': SampleQuerySetArgs, + 'min_version': '3.1.0', + 'http_methods': 'POST', + }, + 'DeleteQuerySetTool': { + 'display_name': 'DeleteQuerySetTool', + 'description': 'Deletes a query set by ID from the OpenSearch Search Relevance plugin.', + 'input_schema': DeleteQuerySetArgs.model_json_schema(), + 'function': delete_query_set_tool, + 'args_model': DeleteQuerySetArgs, + 'min_version': '3.1.0', + 'http_methods': 'DELETE', + }, + 'GetExperimentTool': { + 'display_name': 'GetExperimentTool', + 'description': 'Retrieves a search relevance experiment by ID from the OpenSearch Search Relevance plugin.', + 'input_schema': GetExperimentArgs.model_json_schema(), + 'function': get_experiment_tool, + 'args_model': GetExperimentArgs, + 'min_version': '3.1.0', + 'http_methods': 'GET', + }, + 'CreateExperimentTool': { + 'display_name': 'CreateExperimentTool', + 'description': ( + 'Creates a search relevance experiment using the OpenSearch Search Relevance plugin. ' + 'Supports three experiment types: ' + 'PAIRWISE_COMPARISON (compares 2 search configurations head-to-head), ' + 'POINTWISE_EVALUATION (evaluates 1 configuration against judgment lists), ' + 'HYBRID_OPTIMIZER (optimizes 1 configuration using judgment lists).' 
+ ), + 'input_schema': CreateExperimentArgs.model_json_schema(), + 'function': create_experiment_tool, + 'args_model': CreateExperimentArgs, + 'min_version': '3.1.0', + 'http_methods': 'PUT', + }, + 'DeleteExperimentTool': { + 'display_name': 'DeleteExperimentTool', + 'description': 'Deletes a search relevance experiment by ID from the OpenSearch Search Relevance plugin.', + 'input_schema': DeleteExperimentArgs.model_json_schema(), + 'function': delete_experiment_tool, + 'args_model': DeleteExperimentArgs, + 'min_version': '3.1.0', + 'http_methods': 'DELETE', + }, + 'SearchQuerySetsTool': { + 'display_name': 'SearchQuerySetsTool', + 'description': ( + 'Searches query sets in the OpenSearch Search Relevance plugin using OpenSearch query DSL. ' + 'Accepts a full query DSL body to filter, sort, and paginate results. ' + 'Returns all query sets when called without a query body.' + ), + 'input_schema': SearchQuerySetsArgs.model_json_schema(), + 'function': search_query_sets_tool, + 'args_model': SearchQuerySetsArgs, + 'min_version': '3.5.0', + 'http_methods': 'GET, POST', + }, + 'SearchSearchConfigurationsTool': { + 'display_name': 'SearchSearchConfigurationsTool', + 'description': ( + 'Searches search configurations in the OpenSearch Search Relevance plugin using OpenSearch query DSL. ' + 'Accepts a full query DSL body to filter, sort, and paginate results. ' + 'Returns all search configurations when called without a query body.' + ), + 'input_schema': SearchSearchConfigurationsArgs.model_json_schema(), + 'function': search_search_configurations_tool, + 'args_model': SearchSearchConfigurationsArgs, + 'min_version': '3.5.0', + 'http_methods': 'GET, POST', + }, + 'SearchJudgmentsTool': { + 'display_name': 'SearchJudgmentsTool', + 'description': ( + 'Searches judgments in the OpenSearch Search Relevance plugin using OpenSearch query DSL. ' + 'Accepts a full query DSL body to filter, sort, and paginate results. ' + 'Returns all judgments when called without a query body.' 
+ ), + 'input_schema': SearchJudgmentsArgs.model_json_schema(), + 'function': search_judgments_tool, + 'args_model': SearchJudgmentsArgs, + 'min_version': '3.5.0', + 'http_methods': 'GET, POST', + }, + 'SearchExperimentsTool': { + 'display_name': 'SearchExperimentsTool', + 'description': ( + 'Searches experiments in the OpenSearch Search Relevance plugin using OpenSearch query DSL. ' + 'Accepts a full query DSL body to filter, sort, and paginate results. ' + 'Returns all experiments when called without a query body.' + ), + 'input_schema': SearchExperimentsArgs.model_json_schema(), + 'function': search_experiments_tool, + 'args_model': SearchExperimentsArgs, + 'min_version': '3.5.0', + 'http_methods': 'GET, POST', + }, 'GenericOpenSearchApiTool': { 'display_name': 'GenericOpenSearchApiTool', 'description': "A flexible tool for calling any OpenSearch API endpoint. Supports all HTTP methods with custom paths, query parameters, request bodies, and headers. Use this when you need to access OpenSearch APIs that don't have dedicated tools, or when you need more control over the request. Leverages your knowledge of OpenSearch API documentation to construct appropriate requests.", @@ -636,4 +1211,99 @@ async def get_long_running_tasks_tool(args: GetLongRunningTasksArgs) -> list[dic 'min_version': '1.0.0', 'http_methods': 'GET, POST, PUT, DELETE, HEAD, PATCH', }, + 'CreateSearchConfigurationTool': { + 'display_name': 'CreateSearchConfigurationTool', + 'description': 'Creates a new search configuration in OpenSearch using the Search Relevance plugin. 
' + 'The query must be an OpenSearch DSL JSON string with %SearchText% as the search placeholder.', + 'input_schema': CreateSearchConfigurationArgs.model_json_schema(), + 'function': create_search_configuration_tool, + 'args_model': CreateSearchConfigurationArgs, + 'min_version': '3.1.0', + 'http_methods': 'PUT', + }, + 'GetSearchConfigurationTool': { + 'display_name': 'GetSearchConfigurationTool', + 'description': 'Retrieves a specific search configuration by ID from OpenSearch using the Search Relevance plugin.', + 'input_schema': GetSearchConfigurationArgs.model_json_schema(), + 'function': get_search_configuration_tool, + 'args_model': GetSearchConfigurationArgs, + 'min_version': '3.1.0', + 'http_methods': 'GET', + }, + 'DeleteSearchConfigurationTool': { + 'display_name': 'DeleteSearchConfigurationTool', + 'description': 'Deletes a search configuration by ID from OpenSearch using the Search Relevance plugin.', + 'input_schema': DeleteSearchConfigurationArgs.model_json_schema(), + 'function': delete_search_configuration_tool, + 'args_model': DeleteSearchConfigurationArgs, + 'min_version': '3.1.0', + 'http_methods': 'DELETE', + }, + 'GetJudgmentListTool': { + 'display_name': 'GetJudgmentListTool', + 'description': 'Retrieves a specific judgment list by ID from OpenSearch using the Search Relevance plugin.', + 'input_schema': GetJudgmentListArgs.model_json_schema(), + 'function': get_judgment_list_tool, + 'args_model': GetJudgmentListArgs, + 'min_version': '3.1.0', + 'http_methods': 'GET', + }, + 'CreateJudgmentListTool': { + 'display_name': 'CreateJudgmentListTool', + 'description': 'Creates a judgment list with manual relevance ratings in OpenSearch using the Search Relevance plugin. 
' + 'Accepts a JSON array of query-ratings objects with docId and numeric rating (0–3) per document.', + 'input_schema': CreateJudgmentListArgs.model_json_schema(), + 'function': create_judgment_list_tool, + 'args_model': CreateJudgmentListArgs, + 'min_version': '3.1.0', + 'http_methods': 'PUT', + }, + 'CreateUBIJudgmentListTool': { + 'display_name': 'CreateUBIJudgmentListTool', + 'description': 'Creates a judgment list by mining relevance signals from User Behavior Insights (UBI) click data ' + 'stored in OpenSearch. Requires UBI indices to be populated.', + 'input_schema': CreateUBIJudgmentListArgs.model_json_schema(), + 'function': create_ubi_judgment_list_tool, + 'args_model': CreateUBIJudgmentListArgs, + 'min_version': '3.1.0', + 'http_methods': 'PUT', + }, + 'DeleteJudgmentListTool': { + 'display_name': 'DeleteJudgmentListTool', + 'description': 'Deletes a judgment list by ID from OpenSearch using the Search Relevance plugin.', + 'input_schema': DeleteJudgmentListArgs.model_json_schema(), + 'function': delete_judgment_list_tool, + 'args_model': DeleteJudgmentListArgs, + 'min_version': '3.1.0', + 'http_methods': 'DELETE', + }, + 'CreateLLMJudgmentListTool': { + 'display_name': 'CreateLLMJudgmentListTool', + 'description': 'Creates a judgment list using an LLM model configured in OpenSearch ML Commons. ' + 'For each query in the specified query set, the top k documents are retrieved via the search ' + 'configuration and rated by the LLM for relevance.', + 'input_schema': CreateLLMJudgmentListArgs.model_json_schema(), + 'function': create_llm_judgment_list_tool, + 'args_model': CreateLLMJudgmentListArgs, + 'min_version': '3.1.0', + 'http_methods': 'PUT', + }, } + + +def register_agentic_memory_tools(config_file_path: str = '') -> None: + """Register agentic memory tools if memory_container_id is configured. + + This function adds agentic memory tools to the TOOL_REGISTRY only if + memory_container_id is set via config file or environment variable. 
+ If configured, it also sets the default value for memory_container_id + in all agentic memory tools. + + :param config_file_path: Path to the YAML configuration file + """ + from tools.config import should_enable_agentic_memory_tools + + if not should_enable_agentic_memory_tools(config_file_path): + return + + TOOL_REGISTRY.update(AGENTIC_MEMORY_TOOLS) diff --git a/src/tools/utils.py b/src/tools/utils.py index 4a68bc5..8086b45 100644 --- a/src/tools/utils.py +++ b/src/tools/utils.py @@ -3,7 +3,12 @@ import logging import yaml +import inspect from semver import Version +from tools.exceptions import HelperOperationError + +# Configure logging +logger = logging.getLogger(__name__) def is_tool_compatible(current_version: Version | None, tool_info: dict = {}): @@ -63,3 +68,32 @@ def validate_tools(tool_list, display_lookup, source_name): else: logging.warning(f"Ignoring invalid tool from '{source_name}': '{tool}'") return valid_tools + + + +def helper_error( + action: str, exc: Exception, func_name: str | None = None +) -> HelperOperationError: + """Create a HelperOperationError with context about the failed operation. 
+ + Args: + action: Description of the action that failed (e.g., 'search agentic memory') + exc: The original exception that occurred + func_name: Name of the function where the error occurred; + if None, automatically detected from the direct caller + + Returns: + HelperOperationError: Exception enriched with context about the failure + """ + if func_name is None: + try: + stack = inspect.stack() + caller_frame = stack[1] # 0 = current function, 1 = direct caller + func_name = caller_frame.function + except Exception: + func_name = "unknown_function" + + logger.error(f"[{func_name}] Failed to {action}: {exc}") + + return HelperOperationError( + message=f"Failed to {action}", func_name=func_name, action=action, original=exc + ) diff --git a/tests/conftest.py b/tests/conftest.py new file mode 100644 index 0000000..12b4174 --- /dev/null +++ b/tests/conftest.py @@ -0,0 +1,27 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +import os +import pytest + + +def pytest_addoption(parser): + parser.addoption( + '--run-evals', + action='store_true', + default=False, + help='Run LLM eval tests that call the Anthropic API (requires ANTHROPIC_API_KEY)', + ) + + +def pytest_collection_modifyitems(config, items): + if not config.getoption('--run-evals'): + skip = pytest.mark.skip(reason='LLM eval tests are skipped by default; pass --run-evals to run them') + for item in items: + if item.get_closest_marker('eval'): + item.add_marker(skip) + elif not os.environ.get('ANTHROPIC_API_KEY'): + skip = pytest.mark.skip(reason='ANTHROPIC_API_KEY environment variable is not set') + for item in items: + if item.get_closest_marker('eval'): + item.add_marker(skip) diff --git a/tests/mcp_server_opensearch/test_logging_config.py b/tests/mcp_server_opensearch/test_logging_config.py new file mode 100644 index 0000000..b05b8ae --- /dev/null +++ b/tests/mcp_server_opensearch/test_logging_config.py @@ -0,0 +1,308 @@ +# Copyright 
OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +import asyncio +import io +import json +import logging +from unittest.mock import patch + +import pytest + +from mcp_server_opensearch.logging_config import ( + JsonFormatter, + _get_rss_mb, + _handle_monitor_error, + configure_logging, + memory_monitor, + start_memory_monitor, +) + + +class TestJsonFormatter: + def setup_method(self): + self.formatter = JsonFormatter() + + def test_basic_format_produces_valid_json(self): + record = logging.LogRecord( + name='test.logger', + level=logging.INFO, + pathname='test.py', + lineno=1, + msg='Hello %s', + args=('world',), + exc_info=None, + ) + output = self.formatter.format(record) + parsed = json.loads(output) + + assert parsed['level'] == 'INFO' + assert parsed['logger'] == 'test.logger' + assert parsed['message'] == 'Hello world' + assert 'timestamp' in parsed + + def test_extra_fields_merged_into_top_level(self): + record = logging.LogRecord( + name='test.logger', + level=logging.ERROR, + pathname='test.py', + lineno=1, + msg='tool failed', + args=(), + exc_info=None, + ) + record.event_type = 'tool_execution' + record.tool_name = 'SearchIndexTool' + record.duration_ms = 42.5 + + output = self.formatter.format(record) + parsed = json.loads(output) + + assert parsed['event_type'] == 'tool_execution' + assert parsed['tool_name'] == 'SearchIndexTool' + assert parsed['duration_ms'] == 42.5 + + def test_exception_info_included(self): + try: + raise ValueError('test error') + except ValueError: + import sys + + exc_info = sys.exc_info() + + record = logging.LogRecord( + name='test.logger', + level=logging.ERROR, + pathname='test.py', + lineno=1, + msg='error occurred', + args=(), + exc_info=exc_info, + ) + output = self.formatter.format(record) + parsed = json.loads(output) + + assert 'exception' in parsed + assert 'ValueError' in parsed['exception'] + + def test_timestamp_format_is_iso8601(self): + record = logging.LogRecord( + name='test', + 
level=logging.INFO, + pathname='', + lineno=0, + msg='test', + args=(), + exc_info=None, + ) + output = self.formatter.format(record) + parsed = json.loads(output) + # Should end with Z and have T separator + assert parsed['timestamp'].endswith('Z') + assert 'T' in parsed['timestamp'] + + +class TestConfigureLogging: + def teardown_method(self): + # Reset root logger after each test + root = logging.getLogger() + for handler in root.handlers[:]: + root.removeHandler(handler) + + def test_json_format_uses_json_formatter(self): + configure_logging(level=logging.INFO, log_format='json') + root = logging.getLogger() + assert any(isinstance(h.formatter, JsonFormatter) for h in root.handlers) + + def test_text_format_uses_standard_formatter(self): + configure_logging(level=logging.INFO, log_format='text') + root = logging.getLogger() + assert not any(isinstance(h.formatter, JsonFormatter) for h in root.handlers) + assert len(root.handlers) > 0 + + def test_removes_existing_handlers(self): + root = logging.getLogger() + initial_count = len(root.handlers) + # Add extra handlers + root.addHandler(logging.StreamHandler()) + root.addHandler(logging.StreamHandler()) + assert len(root.handlers) == initial_count + 2 + + configure_logging(level=logging.INFO, log_format='text') + # Should have exactly 1 handler (ours), regardless of what was there before + assert len(root.handlers) == 1 + + def test_sets_log_level(self): + configure_logging(level=logging.DEBUG, log_format='text') + root = logging.getLogger() + assert root.level == logging.DEBUG + + +class TestGetRssMb: + def test_returns_float(self): + """_get_rss_mb returns a float on any platform.""" + result = _get_rss_mb() + assert isinstance(result, float) + + def test_returns_positive_value(self): + """_get_rss_mb returns a positive value (or -1 on unsupported platforms).""" + result = _get_rss_mb() + assert result > 0 or result == -1.0 + + def test_linux_procfs_path(self): + """_get_rss_mb computes correct MB from 
/proc/self/statm on Linux.""" + # statm format: size resident shared text lib data dt (in pages) + fake_statm = io.StringIO('50000 12345 3000 100 0 8000 0') + page_size = 4096 # common Linux page size + expected_mb = round(12345 * page_size / (1024 * 1024), 2) + + with patch('builtins.open', return_value=fake_statm): + with patch('os.sysconf', create=True, return_value=page_size): + assert _get_rss_mb() == expected_mb + + def test_fallback_when_procfs_and_resource_unavailable(self): + """_get_rss_mb returns -1.0 when neither procfs nor resource is available.""" + with patch('builtins.open', side_effect=OSError('no procfs')): + with patch.dict('sys.modules', {'resource': None}): + assert _get_rss_mb() == -1.0 + + +_PATCH_SLEEP = 'mcp_server_opensearch.logging_config._async_sleep' +_PATCH_GET_RSS = 'mcp_server_opensearch.logging_config._get_rss_mb' + + +class TestMemoryMonitor: + async def _run_one_iteration(self, caplog, **monitor_kwargs): + """Helper: run memory_monitor for one iteration then cancel.""" + call_count = 0 + + async def _fast_sleep(seconds): + nonlocal call_count + call_count += 1 + if call_count > 1: + await asyncio.sleep(10) + return + + with patch(_PATCH_SLEEP, _fast_sleep): + with caplog.at_level( + logging.INFO, logger='mcp_server_opensearch.logging_config.memory' + ): + task = asyncio.create_task(memory_monitor(**monitor_kwargs)) + await asyncio.sleep(0.05) + task.cancel() + with pytest.raises(asyncio.CancelledError): + await task + + @pytest.mark.asyncio + async def test_emits_memory_snapshot_event(self, caplog): + """memory_monitor logs an event_type=memory_snapshot entry.""" + await self._run_one_iteration(caplog, interval_seconds=1) + + assert any('Memory snapshot' in r.message for r in caplog.records) + snapshot_record = next(r for r in caplog.records if 'Memory snapshot' in r.message) + assert snapshot_record.event_type == 'memory_snapshot' + assert isinstance(snapshot_record.memory_rss_mb, float) + assert isinstance(snapshot_record.pid, 
int) + + @pytest.mark.asyncio + async def test_start_memory_monitor_returns_task(self): + """start_memory_monitor returns a running asyncio.Task.""" + task = start_memory_monitor(interval_seconds=60) + try: + assert isinstance(task, asyncio.Task) + assert not task.done() + finally: + task.cancel() + try: + await task + except asyncio.CancelledError: + pass + + @pytest.mark.asyncio + async def test_start_memory_monitor_reads_env_var(self): + """start_memory_monitor reads interval from OPENSEARCH_MEMORY_MONITOR_INTERVAL env var.""" + sleep_values = [] + + async def capture_sleep(seconds): + sleep_values.append(seconds) + raise asyncio.CancelledError + + with patch(_PATCH_SLEEP, side_effect=capture_sleep): + with patch.dict('os.environ', {'OPENSEARCH_MEMORY_MONITOR_INTERVAL': '30'}): + task = start_memory_monitor() + try: + await task + except asyncio.CancelledError: + pass + + assert sleep_values == [30] + + @pytest.mark.asyncio + async def test_fallback_when_no_rss_source(self, caplog): + """memory_monitor logs rss_mb=-1.0 when no RSS source is available.""" + with patch(_PATCH_GET_RSS, return_value=-1.0): + await self._run_one_iteration(caplog, interval_seconds=1) + + snapshot_record = next(r for r in caplog.records if 'Memory snapshot' in r.message) + assert snapshot_record.memory_rss_mb == -1.0 + + @pytest.mark.asyncio + async def test_interval_clamped_to_minimum_of_1(self): + """interval_seconds=0 is clamped to 1 to prevent CPU spin.""" + sleep_values = [] + + async def _capture_sleep(seconds): + sleep_values.append(seconds) + await asyncio.sleep(10) + + with patch(_PATCH_SLEEP, _capture_sleep): + task = asyncio.create_task(memory_monitor(interval_seconds=0)) + await asyncio.sleep(0.05) + task.cancel() + try: + await task + except asyncio.CancelledError: + pass + + assert sleep_values[0] >= 1 + + @pytest.mark.asyncio + async def test_loop_survives_iteration_error(self, caplog): + """An exception in one iteration is logged and the loop continues.""" + call_count 
= 0 + + async def _fast_sleep(seconds): + nonlocal call_count + call_count += 1 + if call_count > 2: + await asyncio.sleep(10) + return + + with patch(_PATCH_GET_RSS, side_effect=RuntimeError('boom')): + with patch(_PATCH_SLEEP, _fast_sleep): + with caplog.at_level( + logging.WARNING, + logger='mcp_server_opensearch.logging_config.memory', + ): + task = asyncio.create_task(memory_monitor(interval_seconds=1)) + await asyncio.sleep(0.05) + task.cancel() + with pytest.raises(asyncio.CancelledError): + await task + + assert any('Memory monitor iteration failed' in r.message for r in caplog.records) + + @pytest.mark.asyncio + async def test_handle_monitor_error_logs_crash(self, caplog): + """_handle_monitor_error logs when the task raises an unexpected exception.""" + + async def _failing_coro(): + raise RuntimeError('unexpected crash') + + with caplog.at_level(logging.ERROR, logger='mcp_server_opensearch.logging_config.memory'): + task = asyncio.create_task(_failing_coro()) + with pytest.raises(RuntimeError): + await task + _handle_monitor_error(task) + + assert any('Memory monitor crashed' in r.message for r in caplog.records) diff --git a/tests/mcp_server_opensearch/test_tool_executor.py b/tests/mcp_server_opensearch/test_tool_executor.py new file mode 100644 index 0000000..f88c85b --- /dev/null +++ b/tests/mcp_server_opensearch/test_tool_executor.py @@ -0,0 +1,144 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +import logging +import pytest +from unittest.mock import AsyncMock, Mock, patch + +from mcp_server_opensearch.tool_executor import execute_tool + + +def make_enabled_tools(tool_key='TestTool', display_name=None, return_value=None): + """Helper to create a mock enabled_tools dict.""" + if return_value is None: + return_value = [{'type': 'text', 'text': 'Success'}] + return { + tool_key: { + 'display_name': display_name or tool_key, + 'description': 'A test tool', + 'input_schema': {'type': 'object', 'properties': {}}, + 
'args_model': Mock(), + 'function': AsyncMock(return_value=return_value), + } + } + + +class TestExecuteTool: + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_successful_execution_logs_success(self, mock_validate, caplog): + mock_validate.return_value = Mock() + enabled_tools = make_enabled_tools() + + with caplog.at_level(logging.INFO): + result = await execute_tool('TestTool', {}, enabled_tools) + + assert result == [{'type': 'text', 'text': 'Success'}] + # Check structured log was emitted + assert any('Tool executed: TestTool' in r.message for r in caplog.records) + # Check extra fields + success_records = [r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_execution'] + assert len(success_records) == 1 + assert success_records[0].status == 'success' + assert hasattr(success_records[0], 'duration_ms') + + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_soft_error_detected_via_is_error_flag(self, mock_validate, caplog): + mock_validate.return_value = Mock() + enabled_tools = make_enabled_tools( + return_value=[{'type': 'text', 'text': 'Error searching index: connection refused', 'is_error': True}] + ) + + with caplog.at_level(logging.ERROR): + result = await execute_tool('TestTool', {}, enabled_tools) + + assert result[0]['is_error'] is True + error_records = [r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_execution'] + assert len(error_records) == 1 + assert error_records[0].status == 'error' + + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_text_starting_with_error_without_flag_is_success(self, mock_validate, caplog): + """Text that happens to start with 'Error' but lacks is_error flag should be success.""" + mock_validate.return_value = Mock() + enabled_tools = make_enabled_tools( + return_value=[{'type': 'text', 'text': 'Error codes explained: 404 means not found'}] + 
) + + with caplog.at_level(logging.INFO): + result = await execute_tool('TestTool', {}, enabled_tools) + + records = [r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_execution'] + assert len(records) == 1 + assert records[0].status == 'success' + + @pytest.mark.asyncio + async def test_unknown_tool_raises_value_error(self, caplog): + with caplog.at_level(logging.ERROR): + with pytest.raises(ValueError, match='Unknown or disabled tool'): + await execute_tool('NonExistentTool', {}, {}) + + error_records = [r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_execution'] + assert len(error_records) == 1 + assert error_records[0].status == 'error' + assert error_records[0].error_type == 'UnknownToolError' + + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_exception_in_tool_propagates_and_logs(self, mock_validate, caplog): + mock_validate.return_value = Mock() + enabled_tools = make_enabled_tools() + enabled_tools['TestTool']['function'] = AsyncMock(side_effect=RuntimeError('boom')) + + with caplog.at_level(logging.ERROR): + with pytest.raises(RuntimeError, match='boom'): + await execute_tool('TestTool', {}, enabled_tools) + + error_records = [r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_execution'] + assert len(error_records) == 1 + assert error_records[0].status == 'error' + assert error_records[0].error_type == 'RuntimeError' + + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_validation_error_logs_error_status(self, mock_validate, caplog): + """Validation failures (missing required field) should log status='error'.""" + mock_validate.side_effect = ValueError("Missing required field: 'query_dsl'") + enabled_tools = make_enabled_tools() + + with caplog.at_level(logging.ERROR): + with pytest.raises(ValueError, match='Missing required field'): + await execute_tool('TestTool', {}, 
enabled_tools) + + error_records = [r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_execution'] + assert len(error_records) == 1 + assert error_records[0].status == 'error' + assert error_records[0].error_type == 'ValidationError' + + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_duration_ms_is_populated(self, mock_validate, caplog): + mock_validate.return_value = Mock() + enabled_tools = make_enabled_tools() + + with caplog.at_level(logging.INFO): + await execute_tool('TestTool', {}, enabled_tools) + + records = [r for r in caplog.records if hasattr(r, 'duration_ms')] + assert len(records) == 1 + assert records[0].duration_ms >= 0 + + @pytest.mark.asyncio + @patch('tools.tool_params.validate_args_for_mode') + async def test_tool_key_logged(self, mock_validate, caplog): + mock_validate.return_value = Mock() + enabled_tools = make_enabled_tools(tool_key='SearchIndexTool', display_name='SearchIndexTool') + + with caplog.at_level(logging.INFO): + await execute_tool('SearchIndexTool', {}, enabled_tools) + + records = [r for r in caplog.records if hasattr(r, 'tool_key')] + assert len(records) == 1 + assert records[0].tool_key == 'SearchIndexTool' diff --git a/tests/opensearch/test_client.py b/tests/opensearch/test_client.py index cb84415..71805aa 100644 --- a/tests/opensearch/test_client.py +++ b/tests/opensearch/test_client.py @@ -691,6 +691,64 @@ def test_basic_auth_falls_back_to_env_when_no_header( call_kwargs = mock_opensearch.call_args[1] assert call_kwargs['http_auth'] == ('env-user', 'env-password') + +class TestHeaderBasedBearerAuth: + """Tests for Bearer authentication via Authorization header.""" + + def setup_method(self): + """Setup before each test method.""" + # Clear environment variables + for key in [ + 'OPENSEARCH_USERNAME', + 'OPENSEARCH_PASSWORD', + 'AWS_REGION', + 'OPENSEARCH_URL', + 'OPENSEARCH_NO_AUTH', + 'OPENSEARCH_HEADER_AUTH', + ]: + if key in os.environ: + del 
os.environ[key] + + # Set global mode for tests + from mcp_server_opensearch.global_state import set_mode + + set_mode('single') + + @patch('opensearch.client.request_ctx') + @patch('opensearch.client.AsyncOpenSearch') + def test_bearer_auth_from_authorization_header( + self, mock_opensearch, mock_request_ctx + ): + """Test Bearer auth passthrough from Authorization header.""" + from starlette.requests import Request + + # Set required environment variables + os.environ['OPENSEARCH_URL'] = 'https://test-opensearch-domain.com' + os.environ['OPENSEARCH_HEADER_AUTH'] = 'true' + + # Create mock request with Authorization Bearer header + bearer_token = 'test-bearer-token' + mock_request = Mock(spec=Request) + mock_request.headers = {'authorization': f'Bearer {bearer_token}'} + + # Mock request context + mock_context = Mock() + mock_context.request = mock_request + mock_request_ctx.get.return_value = mock_context + + # Mock OpenSearch client + mock_client = Mock() + mock_opensearch.return_value = mock_client + + # Execute + client = initialize_client(baseToolArgs(opensearch_cluster_name='')) + + # Assert + assert client == mock_client + call_kwargs = mock_opensearch.call_args[1] + assert call_kwargs['headers'] == {'Authorization': f'Bearer {bearer_token}'} + assert 'http_auth' not in call_kwargs + @patch('opensearch.client.request_ctx') @patch('opensearch.client.AsyncOpenSearch') @patch('opensearch.client.get_aws_region_single_mode') diff --git a/tests/opensearch/test_helper.py b/tests/opensearch/test_helper.py index da8ce69..7ae3481 100644 --- a/tests/opensearch/test_helper.py +++ b/tests/opensearch/test_helper.py @@ -110,13 +110,13 @@ async def test_search_index(self, mock_get_client): # Execute result = await self.search_index( - SearchIndexArgs(index='test-index', query=test_query, opensearch_cluster_name='') + SearchIndexArgs(index='test-index', query_dsl=test_query, opensearch_cluster_name='') ) # Assert assert result == mock_response 
mock_get_client.assert_called_once_with( - SearchIndexArgs(index='test-index', query=test_query, opensearch_cluster_name='') + SearchIndexArgs(index='test-index', query_dsl=test_query, opensearch_cluster_name='') ) # The search_index function adds size to the query body (default 10, max 100) expected_body = {'query': {'match_all': {}}, 'size': 10} @@ -210,7 +210,7 @@ async def test_search_index_error(self, mock_get_client): with pytest.raises(Exception) as exc_info: await self.search_index( SearchIndexArgs( - index='test-index', query={'invalid': 'query'}, opensearch_cluster_name='' + index='test-index', query_dsl={'invalid': 'query'}, opensearch_cluster_name='' ) ) assert str(exc_info.value) == 'Invalid query' @@ -539,3 +539,197 @@ def test_normalize_scientific_notation(self): result = normalize_scientific_notation(query_dsl) assert "1732693003000" in json.dumps(result) assert "173.5" in json.dumps(result) + + +class TestValidateJsonString: + def setup_method(self): + from opensearch.helper import validate_json_string + + self.validate = validate_json_string + + # --- valid inputs (should not raise) --- + + def test_valid_object(self): + self.validate('{"query": {"match_all": {}}}') + + def test_valid_empty_object(self): + self.validate('{}') + + def test_valid_array(self): + self.validate('[1, 2, 3]') + + def test_valid_nested_object(self): + self.validate('{"a": {"b": {"c": 42}}}') + + def test_valid_with_whitespace(self): + self.validate(' { "key" : "value" } ') + + def test_valid_with_newlines(self): + self.validate('{\n "query": {\n "match_all": {}\n }\n}') + + def test_valid_types(self): + # booleans, null, numbers + self.validate('{"flag": true, "missing": null, "count": 99}') + + def test_valid_search_config_query(self): + self.validate('{"query":{"match":{"title":"%SearchText%"}}}') + + # --- invalid inputs (should raise ValueError) --- + + def test_invalid_trailing_comma(self): + with pytest.raises(ValueError) as exc_info: + self.validate('{"query": 
{"match_all": {}},}') + assert 'query is not valid JSON' in str(exc_info.value) + + def test_invalid_single_quotes(self): + with pytest.raises(ValueError) as exc_info: + self.validate("{'key': 'value'}") + assert 'query is not valid JSON' in str(exc_info.value) + + def test_invalid_unquoted_key(self): + with pytest.raises(ValueError) as exc_info: + self.validate('{key: "value"}') + assert 'query is not valid JSON' in str(exc_info.value) + + def test_invalid_unclosed_brace(self): + with pytest.raises(ValueError) as exc_info: + self.validate('{"query": {"match_all": {}') + assert 'query is not valid JSON' in str(exc_info.value) + + def test_invalid_empty_string(self): + with pytest.raises(ValueError) as exc_info: + self.validate('') + assert 'query is not valid JSON' in str(exc_info.value) + + def test_invalid_plain_text(self): + with pytest.raises(ValueError) as exc_info: + self.validate('not json at all') + assert 'query is not valid JSON' in str(exc_info.value) + + def test_invalid_bad_escape(self): + with pytest.raises(ValueError) as exc_info: + self.validate('{"key": "bad\\escape"}') + assert 'query is not valid JSON' in str(exc_info.value) + + def test_error_message_includes_location(self): + """Error message should contain line and column so the problem is easy to pinpoint.""" + with pytest.raises(ValueError) as exc_info: + self.validate('{"a": 1,\n"b": 2,\n"c": }') + msg = str(exc_info.value) + assert 'line' in msg + assert 'col' in msg + + def test_error_message_format(self): + """Invalid input raises ValueError, which callers can catch generically (json.JSONDecodeError is a ValueError subclass).""" + with pytest.raises(ValueError): + self.validate('{bad}') + + def test_cause_is_json_decode_error(self): + """The ValueError should chain the original JSONDecodeError.""" + import json as _json + + with pytest.raises(ValueError) as exc_info: + self.validate('{bad}') + assert isinstance(exc_info.value.__cause__, _json.JSONDecodeError) + + +class TestSearchConfigurationHelpers: + def setup_method(self): + """Setup 
that runs before each test method.""" + from opensearch.helper import ( + create_search_configuration, + delete_search_configuration, + get_search_configuration, + ) + + self.create_search_configuration = create_search_configuration + self.get_search_configuration = get_search_configuration + self.delete_search_configuration = delete_search_configuration + + @pytest.mark.asyncio + @patch('opensearch.client.get_opensearch_client') + async def test_create_search_configuration(self, mock_get_client): + """Test create_search_configuration calls put_search_configurations with correct body.""" + from tools.tool_params import CreateSearchConfigurationArgs + + mock_response = {'_id': 'cfg-1', 'result': 'created'} + mock_client = AsyncMock() + mock_client.plugins = AsyncMock() + mock_client.plugins.search_relevance = AsyncMock() + mock_client.plugins.search_relevance.put_search_configurations = AsyncMock( + return_value=mock_response + ) + + mock_get_client.return_value.__aenter__ = AsyncMock(return_value=mock_client) + mock_get_client.return_value.__aexit__ = AsyncMock(return_value=None) + + args = CreateSearchConfigurationArgs( + name='my-config', + index='my-index', + query='{"query":{"match":{"title":"%SearchText%"}}}', + opensearch_cluster_name='', + ) + result = await self.create_search_configuration(args) + + assert result == mock_response + mock_client.plugins.search_relevance.put_search_configurations.assert_called_once_with( + body={ + 'name': 'my-config', + 'index': 'my-index', + 'query': '{"query":{"match":{"title":"%SearchText%"}}}', + } + ) + + @pytest.mark.asyncio + @patch('opensearch.client.get_opensearch_client') + async def test_get_search_configuration(self, mock_get_client): + """Test get_search_configuration calls get_search_configurations with correct ID.""" + from tools.tool_params import GetSearchConfigurationArgs + + mock_response = {'_id': 'cfg-1', '_source': {'name': 'my-config', 'index': 'my-index'}} + mock_client = AsyncMock() + 
mock_client.plugins = AsyncMock() + mock_client.plugins.search_relevance = AsyncMock() + mock_client.plugins.search_relevance.get_search_configurations = AsyncMock( + return_value=mock_response + ) + + mock_get_client.return_value.__aenter__ = AsyncMock(return_value=mock_client) + mock_get_client.return_value.__aexit__ = AsyncMock(return_value=None) + + args = GetSearchConfigurationArgs( + search_configuration_id='cfg-1', opensearch_cluster_name='' + ) + result = await self.get_search_configuration(args) + + assert result == mock_response + mock_client.plugins.search_relevance.get_search_configurations.assert_called_once_with( + search_configuration_id='cfg-1' + ) + + @pytest.mark.asyncio + @patch('opensearch.client.get_opensearch_client') + async def test_delete_search_configuration(self, mock_get_client): + """Test delete_search_configuration calls delete_search_configurations with correct ID.""" + from tools.tool_params import DeleteSearchConfigurationArgs + + mock_response = {'result': 'deleted'} + mock_client = AsyncMock() + mock_client.plugins = AsyncMock() + mock_client.plugins.search_relevance = AsyncMock() + mock_client.plugins.search_relevance.delete_search_configurations = AsyncMock( + return_value=mock_response + ) + + mock_get_client.return_value.__aenter__ = AsyncMock(return_value=mock_client) + mock_get_client.return_value.__aexit__ = AsyncMock(return_value=None) + + args = DeleteSearchConfigurationArgs( + search_configuration_id='cfg-1', opensearch_cluster_name='' + ) + result = await self.delete_search_configuration(args) + + assert result == mock_response + mock_client.plugins.search_relevance.delete_search_configurations.assert_called_once_with( + search_configuration_id='cfg-1' + ) diff --git a/tests/tools/agentic_memory_data.py b/tests/tools/agentic_memory_data.py new file mode 100644 index 0000000..a862188 --- /dev/null +++ b/tests/tools/agentic_memory_data.py @@ -0,0 +1,533 @@ +import pytest +from tools.agentic_memory.params import 
MemoryType, PayloadType + + +# --- Agentic Memory: parametrized test input cases for Pydantic model validation --- + +UPDATE_HAPPY_PATH_CASES = [ + pytest.param( + MemoryType.sessions, + 'N2CDipkB2Mtr6INFFcX8', + { + 'additional_info': { + 'key1': 'value1', + 'last_activity': '2025-09-15T17:30:00Z', + } + }, + {'result': 'updated', '_id': 'N2CDipkB2Mtr6INFFcX8', '_version': 2}, + ), + pytest.param( + MemoryType.working, + 'XyEuiJkBeh2gPPwzjYWM', + {'tags': {'topic': 'updated_topic', 'priority': 'high'}}, + {'result': 'updated', '_id': 'XyEuiJkBeh2gPPwzjYWM', '_version': 3}, + ), + pytest.param( + MemoryType.long_term, + 'DcxjTpkBvwXRq366C1Zz', + { + 'memory': "User's name is Bob Smith", + 'tags': {'topic': 'personal info', 'updated': 'true'}, + }, + {'result': 'updated', '_id': 'DcxjTpkBvwXRq366C1Zz', '_version': 2}, + ), + pytest.param( + MemoryType.working, + 'another_working_memory_id', + { + 'messages': [ + { + 'role': 'user', + 'content': [{'text': 'Updated user message', 'type': 'text'}], + } + ], + 'metadata': {'status': 'updated'}, + }, + {'result': 'updated', '_id': 'another_working_memory_id', '_version': 2}, + ), + pytest.param( + MemoryType.sessions, + 'session_id_123', + { + 'summary': 'Updated session summary', + 'metadata': {'status': 'active', 'branch': 'main'}, + 'agents': { + 'primary_agent': 'assistant', + 'secondary_agents': ['tool1', 'tool2'], + }, + }, + {'result': 'updated', '_id': 'session_id_123', '_version': 2}, + ), + pytest.param( + MemoryType.working, + 'working_struct_id', + { + 'structured_data': {'updated_state': {'status': 'completed', 'progress': 100}}, + 'tags': {'type': 'state_update'}, + }, + {'result': 'updated', '_id': 'working_struct_id', '_version': 2}, + ), + pytest.param( + MemoryType.long_term, + 'long_term_minimal', + {'memory': 'Updated memory content only'}, + {'result': 'updated', '_id': 'long_term_minimal', '_version': 2}, + ), +] + +ADD_MEMORIES_HAPPY_PATH_CASES = [ + pytest.param( + { + 'messages': [ + { + 'role': 
'user', + 'content': [{'text': "I'm Bob, I really like swimming.", 'type': 'text'}], + }, + { + 'role': 'assistant', + 'content': [ + { + 'text': 'Cool, nice. Hope you enjoy your life.', + 'type': 'text', + } + ], + }, + ], + 'namespace': {'user_id': 'bob'}, + 'metadata': { + 'status': 'checkpoint', + 'branch': {'branch_name': 'high', 'root_event_id': '228nadfs879mtgk'}, + }, + 'tags': {'topic': 'personal info'}, + 'infer': True, + 'payload_type': PayloadType.conversational, + }, + { + 'session_id': 'XSEuiJkBeh2gPPwzjYVh', + 'working_memory_id': 'XyEuiJkBeh2gPPwzjYWM', + }, + id='conversational_with_infer', # human-readable case ID shown in pytest output + ), + pytest.param( + { + 'structured_data': {'time_range': {'start': '2025-09-11', 'end': '2025-09-15'}}, + 'namespace': {'agent_id': 'testAgent1'}, + 'metadata': {'status': 'checkpoint', 'anyobject': 'abc'}, + 'tags': {'topic': 'agent_state'}, + 'infer': False, + 'payload_type': PayloadType.data, + }, + {'working_memory_id': 'Z8xeTpkBvwXRq366l0iA'}, + id='data_payload', + ), + pytest.param( + { + 'structured_data': { + 'tool_invocations': [ + { + 'tool_name': 'ListIndexTool', + 'tool_input': {'filter': '*,-.plugins*'}, + 'tool_output': '...', + } + ] + }, + 'namespace': { + 'user_id': 'bob', + 'agent_id': 'testAgent1', + 'session_id': '123', + }, + 'metadata': {'status': 'checkpoint'}, + 'tags': {'topic': 'personal info'}, + 'infer': False, + 'payload_type': PayloadType.data, + }, + {'working_memory_id': 'Z8xeTpkBvwXRq366l0iA'}, + id='data_payload_tool_invocation', + ), + pytest.param( + { + 'messages': [{'content': [{'text': 'Hello, world!', 'type': 'text'}]}], + 'payload_type': PayloadType.conversational, + }, + {'session_id': 'minimal_session', 'working_memory_id': 'minimal_memory'}, + id='minimal_conversational', + ), +] + +CREATE_SESSION_HAPPY_PATH_CASES = [ + pytest.param( + {'session_id': 'abc123', 'metadata': {'key1': 'value1'}}, + {'session_id': 'abc123', 'status': 
'created'}, + id='with_custom_id', + ), + pytest.param( + { + 'summary': 'This is a test session', + 'metadata': {'key1': 'value1'}, + 'namespace': {'user_id': 'bob'}, + }, + {'session_id': 'jTYm35kBt8CyICnjxJl9', 'status': 'created'}, + id='with_autogenerated_id', + ), + pytest.param( + { + 'session_id': 'custom_session_123', + 'summary': 'Session for user onboarding', + 'namespace': {'user_id': 'alice', 'agent_id': 'onboarding_bot'}, + 'metadata': {'priority': 'high', 'category': 'onboarding'}, + }, + {'session_id': 'custom_session_123', 'status': 'created'}, + id='with_all_fields', + ), + pytest.param( + {}, # Empty payload + {'session_id': 'auto_generated_id_123', 'status': 'created'}, + id='minimal', + ), +] + +GET_MEMORY_HAPPY_PATH_CASES = [ + pytest.param( + MemoryType.working, + 'XyEuiJkBeh2gPPwzjYWM', + {'_id': 'XyEuiJkBeh2gPPwzjYWM', 'namespace': {'user': 'test'}, 'messages': []}, + id='get_working_memory', + ), + pytest.param( + MemoryType.long_term, + 'DcxjTpkBvwXRq366C1Zz', + { + '_id': 'DcxjTpkBvwXRq366C1Zz', + 'namespace': {'user': 'test'}, + 'memory': 'some data', + }, + id='get_long_term_memory', + ), + pytest.param( + MemoryType.sessions, + 'CcxjTpkBvwXRq366A1aE', + {'_id': 'CcxjTpkBvwXRq366A1aE', 'summary': 'A session'}, + id='get_session_memory', + ), + pytest.param( + MemoryType.history, + 'eMxnTpkBvwXRq366hmAU', + {'_id': 'eMxnTpkBvwXRq366hmAU', 'trace_id': 'trace-123'}, + id='get_history_memory', + ), +] + + +SEARCH_SESSIONS_RESPONSE = { + 'took': 5, + 'hits': { + 'hits': [ + { + '_id': 'CcxjTpkBvwXRq366A1aE', + '_source': {'namespace': {'user_id': 'bob'}}, + } + ] + }, +} +SEARCH_LONG_TERM_RESPONSE = { + 'took': 3, + 'hits': {'hits': [{'_id': 'abc123'}, {'_id': 'def456'}]}, +} +SEARCH_WORKING_COMPLEX_RESPONSE = { + 'took': 4, + 'hits': {'hits': [{'_id': 'working_mem_123'}]}, +} +SEARCH_HISTORY_RESPONSE = {'took': 2, 'hits': {'total': {'value': 0}, 'hits': []}} +SEARCH_NO_SORT_RESPONSE = { + 'took': 1, + 'hits': {'hits': [{'_id': 'no_sort_id', 
'_score': 1.0}]}, +} +SEARCH_SESSION_ID_RESPONSE = { + 'took': 3, + 'hits': { + 'hits': [ + { + '_id': 'session_trace_123', + '_source': {'namespace': {'session_id': '123'}}, + } + ] + }, +} + +SEARCH_MEMORY_HAPPY_PATH_CASES = [ + pytest.param( + MemoryType.sessions, + {'query': {'match_all': {}}, 'sort': [{'created_time': {'order': 'desc'}}]}, + SEARCH_SESSIONS_RESPONSE, + id='search_sessions_match_all', + ), + pytest.param( + MemoryType.long_term, + { + 'query': {'bool': {'must': [{'term': {'namespace.user_id': 'bob'}}]}}, + 'sort': [{'created_time': {'order': 'desc'}}], + }, + SEARCH_LONG_TERM_RESPONSE, + id='search_long_term_with_namespace', + ), + pytest.param( + MemoryType.working, + { + 'query': { + 'bool': { + 'must': [{'term': {'namespace.user_id': 'bob'}}], + 'must_not': [{'exists': {'field': 'tags.parent_memory_id'}}], + } + }, + 'sort': [{'created_time': {'order': 'desc'}}], + }, + SEARCH_WORKING_COMPLEX_RESPONSE, + id='search_working_complex_query', + ), + pytest.param( + MemoryType.history, + {'query': {'match_all': {}}, 'sort': [{'created_time': {'order': 'desc'}}]}, + SEARCH_HISTORY_RESPONSE, + id='search_history_no_results', + ), + pytest.param( + MemoryType.sessions, + {'query': {'match_all': {}}}, # Without 'sort' + SEARCH_NO_SORT_RESPONSE, + id='search_without_sort', + ), + pytest.param( + MemoryType.working, + { + 'query': {'term': {'namespace.session_id': '123'}}, + 'sort': [{'created_time': {'order': 'desc'}}], + }, + SEARCH_SESSION_ID_RESPONSE, + id='search_working_by_session_id', + ), +] + + +DELETE_MEMORY_ID_HAPPY_PATH_CASES = [ + pytest.param( + MemoryType.working, + 'XyEuiJkBeh2gPPwzjYWM', + {'result': 'deleted', '_id': 'XyEuiJkBeh2gPPwzjYWM', '_version': 2}, + id='delete_working_memory', + ), + pytest.param( + MemoryType.long_term, + 'DcxjTpkBvwXRq366C1Zz', + {'result': 'deleted', '_id': 'DcxjTpkBvwXRq366C1Zz', '_version': 1}, + id='delete_long_term_memory', + ), + pytest.param( + MemoryType.sessions, + 'CcxjTpkBvwXRq366A1aE', + 
{'result': 'deleted', '_id': 'CcxjTpkBvwXRq366A1aE', '_version': 3}, + id='delete_session_memory', + ), + pytest.param( + MemoryType.history, + 'eMxnTpkBvwXRq366hmAU', + {'result': 'deleted', '_id': 'eMxnTpkBvwXRq366hmAU', '_version': 1}, + id='delete_history_memory', + ), +] + +DELETE_QUERY_WORKING_RESP = {'deleted': 6, 'total': 6, 'failures': [], 'took': 159} +DELETE_QUERY_LONG_TERM_RESP = {'deleted': 10, 'total': 10, 'failures': [], 'took': 85} +DELETE_QUERY_SESSIONS_RESP = {'deleted': 3, 'total': 3, 'failures': [], 'took': 42} +DELETE_QUERY_COMPLEX_RESP = {'deleted': 5, 'total': 5, 'failures': [], 'took': 120} +DELETE_QUERY_NO_RESULTS_RESP = {'deleted': 0, 'total': 0, 'failures': [], 'took': 15} + +DELETE_MEMORY_QUERY_HAPPY_PATH_CASES = [ + pytest.param( + MemoryType.working, + {'query': {'match': {'owner_id': 'admin'}}}, + DELETE_QUERY_WORKING_RESP, + id='delete_working_by_match', + ), + pytest.param( + MemoryType.long_term, + {'query': {'range': {'created_time': {'lt': '2025-09-01'}}}}, + DELETE_QUERY_LONG_TERM_RESP, + id='delete_long_term_by_range', + ), + pytest.param( + MemoryType.sessions, + {'query': {'term': {'namespace.user_id': 'inactive_user'}}}, + DELETE_QUERY_SESSIONS_RESP, + id='delete_sessions_by_term', + ), + pytest.param( + MemoryType.working, + { + 'query': { + 'bool': { + 'must': [{'term': {'namespace.agent_id': 'test_agent'}}], + 'must_not': [{'exists': {'field': 'tags.important'}}], + } + } + }, + DELETE_QUERY_COMPLEX_RESP, + id='delete_working_complex_bool', + ), + pytest.param( + MemoryType.history, + {'query': {'term': {'namespace.user_id': 'non_existent_user'}}}, + DELETE_QUERY_NO_RESULTS_RESP, + id='delete_history_no_results', + ), +] + +BASIC_CONFIG_PAYLOAD = { + 'name': 'agentic memory test', + 'description': 'Store conversations with semantic search and summarization', + 'configuration': { + 'embedding_model_type': 'TEXT_EMBEDDING', + 'embedding_model_id': 'embedding-model-123', + 'embedding_dimension': 1024, + 'llm_id': 
'llm-model-456', + 'strategies': [{'type': 'SEMANTIC', 'namespace': ['user_id']}], + }, +} +BASIC_CONFIG_EXPECTED_BODY = { + 'name': 'agentic memory test', + 'description': 'Store conversations with semantic search and summarization', + 'configuration': { + 'embedding_model_type': 'TEXT_EMBEDDING', + 'embedding_model_id': 'embedding-model-123', + 'embedding_dimension': 1024, + 'llm_id': 'llm-model-456', + 'use_system_index': True, + 'disable_history': False, + 'disable_session': True, + 'strategies': [{'type': 'SEMANTIC', 'namespace': ['user_id'], 'enabled': True}], + }, +} + +ADVANCED_CONFIG_PAYLOAD = { + 'name': 'advanced memory container', + 'description': 'Advanced memory container with multiple strategies', + 'configuration': { + 'embedding_model_type': 'TEXT_EMBEDDING', + 'embedding_model_id': 'embedding-model-789', + 'embedding_dimension': 1024, + 'llm_id': 'llm-model-456', + 'index_prefix': 'my_custom_prefix', + 'use_system_index': False, + 'disable_history': True, + 'disable_session': False, + 'max_infer_size': 50, + 'strategies': [ + { + 'type': 'SEMANTIC', + 'namespace': ['agent_id'], + 'configuration': { + 'llm_result_path': '$.output.message.content[0].text', + 'system_prompt': 'Extract semantic information from user conversations', + 'llm_id': 'strategy-llm-id', + }, + 'enabled': True, + }, + { + 'type': 'USER_PREFERENCE', + 'namespace': ['agent_id'], + 'configuration': {'llm_result_path': '$.choices[0].message.content'}, + }, + ], + 'parameters': {'llm_result_path': '$.output.message.content[0].text'}, + 'index_settings': { + 'session_index': {'index': {'number_of_shards': '2', 'number_of_replicas': '2'}}, + 'short_term_memory_index': { + 'index': {'number_of_shards': '3', 'number_of_replicas': '1'} + }, + }, + }, +} +ADVANCED_CONFIG_EXPECTED_BODY = { + 'name': 'advanced memory container', + 'description': 'Advanced memory container with multiple strategies', + 'configuration': { + 'embedding_model_type': 'TEXT_EMBEDDING', + 'embedding_model_id': 
'embedding-model-789', + 'embedding_dimension': 1024, + 'llm_id': 'llm-model-456', + 'index_prefix': 'my_custom_prefix', + 'use_system_index': False, + 'disable_history': True, + 'disable_session': False, + 'max_infer_size': 50, + 'strategies': [ + { + 'type': 'SEMANTIC', + 'namespace': ['agent_id'], + 'configuration': { + 'llm_result_path': '$.output.message.content[0].text', + 'system_prompt': 'Extract semantic information from user conversations', + 'llm_id': 'strategy-llm-id', + }, + 'enabled': True, + }, + { + 'type': 'USER_PREFERENCE', + 'namespace': ['agent_id'], + 'configuration': {'llm_result_path': '$.choices[0].message.content'}, + 'enabled': True, + }, + ], + 'parameters': {'llm_result_path': '$.output.message.content[0].text'}, + 'index_settings': { + 'session_index': {'index': {'number_of_shards': '2', 'number_of_replicas': '2'}}, + 'short_term_memory_index': { + 'index': {'number_of_shards': '3', 'number_of_replicas': '1'} + }, + }, + }, +} + +MINIMAL_CONFIG_PAYLOAD = { + 'name': 'minimal container', + 'configuration': { + 'embedding_model_type': 'SPARSE_ENCODING', + 'embedding_model_id': 'sparse-model-123', + }, +} +MINIMAL_CONFIG_EXPECTED_BODY = { + 'name': 'minimal container', + 'configuration': { + 'embedding_model_type': 'SPARSE_ENCODING', + 'embedding_model_id': 'sparse-model-123', + 'use_system_index': True, + 'disable_history': False, + 'disable_session': True, + }, +} + +__all__ = [ + 'UPDATE_HAPPY_PATH_CASES', + 'ADD_MEMORIES_HAPPY_PATH_CASES', + 'CREATE_SESSION_HAPPY_PATH_CASES', + 'GET_MEMORY_HAPPY_PATH_CASES', + 'SEARCH_SESSIONS_RESPONSE', + 'SEARCH_LONG_TERM_RESPONSE', + 'SEARCH_WORKING_COMPLEX_RESPONSE', + 'SEARCH_HISTORY_RESPONSE', + 'SEARCH_NO_SORT_RESPONSE', + 'SEARCH_SESSION_ID_RESPONSE', + 'SEARCH_MEMORY_HAPPY_PATH_CASES', + 'DELETE_MEMORY_ID_HAPPY_PATH_CASES', + 'DELETE_QUERY_WORKING_RESP', + 'DELETE_QUERY_LONG_TERM_RESP', + 'DELETE_QUERY_SESSIONS_RESP', + 'DELETE_QUERY_COMPLEX_RESP', + 'DELETE_QUERY_NO_RESULTS_RESP', + 
'DELETE_MEMORY_QUERY_HAPPY_PATH_CASES', + 'BASIC_CONFIG_PAYLOAD', + 'BASIC_CONFIG_EXPECTED_BODY', + 'ADVANCED_CONFIG_PAYLOAD', + 'ADVANCED_CONFIG_EXPECTED_BODY', + 'MINIMAL_CONFIG_PAYLOAD', + 'MINIMAL_CONFIG_EXPECTED_BODY', +] diff --git a/tests/tools/test_agentic_memory_tools.py b/tests/tools/test_agentic_memory_tools.py new file mode 100644 index 0000000..d29d87a --- /dev/null +++ b/tests/tools/test_agentic_memory_tools.py @@ -0,0 +1,770 @@ +import agentic_memory_data +import json +import pytest +from pydantic import ValidationError +from tools.agentic_memory.params import ( + ERR_FIELD_NOT_ALLOWED, + ERR_FIELD_PROHIBITED, + ERR_MESSAGES_REQUIRED, + ERR_MISSING_LONG_TERM_FIELD, + ERR_MISSING_WORKING_FIELD, + ERR_STRUCTURED_DATA_REQUIRED, + MemoryType, + PayloadType, +) +from typing import Any, Dict +from unittest.mock import AsyncMock, Mock, patch + + +class TestAgenticMemoryTools: + def setup_method(self): + """Setup specific for Agentic Memory tests (requires OpenSearch 3.3.0+).""" + # Create a properly configured mock client + self.mock_client = Mock() + + # Configure mock client methods to return proper data structures + # These will be overridden in individual tests as needed + # Use AsyncMock for async methods + self.mock_client.cat.indices = AsyncMock(return_value=[]) + self.mock_client.indices.get_mapping = AsyncMock(return_value={}) + self.mock_client.indices.get = AsyncMock(return_value={}) + self.mock_client.search = AsyncMock(return_value={}) + self.mock_client.cat.shards = AsyncMock(return_value=[]) + self.mock_client.cat.segments = AsyncMock(return_value=[]) + self.mock_client.cat.nodes = AsyncMock(return_value=[]) + self.mock_client.cat.allocation = AsyncMock(return_value=[]) + self.mock_client.cluster.state = AsyncMock(return_value={}) + self.mock_client.indices.stats = AsyncMock(return_value={}) + self.mock_client.transport.perform_request = AsyncMock(return_value={}) + self.mock_client.info = AsyncMock(return_value={'version': {'number': 
'3.3.0'}}) + + # Patch initialize_client to always return our mock client + self.init_client_patcher = patch( + 'opensearch.client.initialize_client', return_value=self.mock_client + ) + self.init_client_patcher.start() + + # Clear any existing imports to ensure fresh imports + import sys + + modules_to_clear = [ + 'tools.tools', + ] + for module in modules_to_clear: + if module in sys.modules: + del sys.modules[module] + + # Set environment variable for agentic memory tools registration + import os + os.environ['OPENSEARCH_MEMORY_CONTAINER_ID'] = 'test-container-id' + + # Import after patching to ensure fresh imports + from tools.agentic_memory.actions import ( + add_agentic_memories_tool, + create_agentic_memory_session_tool, + delete_agentic_memory_by_query_tool, + delete_agentic_memory_by_id_tool, + get_agentic_memory_tool, + search_agentic_memory_tool, + update_agentic_memory_tool, + ) + from tools.agentic_memory.params import ( + AddAgenticMemoriesArgs, + CreateAgenticMemorySessionArgs, + DeleteAgenticMemoryByIDArgs, + DeleteAgenticMemoryByQueryArgs, + GetAgenticMemoryArgs, + SearchAgenticMemoryArgs, + UpdateAgenticMemoryArgs, + ) + from tools.tools import ( + TOOL_REGISTRY, + register_agentic_memory_tools, + ) + + # Register agentic memory tools for tests + register_agentic_memory_tools() + + self.CreateAgenticMemorySessionArgs = CreateAgenticMemorySessionArgs + self.AddAgenticMemoriesArgs = AddAgenticMemoriesArgs + self.GetAgenticMemoryArgs = GetAgenticMemoryArgs + self.UpdateAgenticMemoryArgs = UpdateAgenticMemoryArgs + self.DeleteAgenticMemoryByIDArgs = DeleteAgenticMemoryByIDArgs + self.DeleteAgenticMemoryByQueryArgs = DeleteAgenticMemoryByQueryArgs + self.SearchAgenticMemoryArgs = SearchAgenticMemoryArgs + self.TOOL_REGISTRY = TOOL_REGISTRY + self._create_agentic_memory_session_tool = create_agentic_memory_session_tool + self._add_agentic_memories_tool = add_agentic_memories_tool + self._get_agentic_memory_tool = get_agentic_memory_tool + 
self._update_agentic_memory_tool = update_agentic_memory_tool + self._delete_agentic_memory_by_id_tool = delete_agentic_memory_by_id_tool + self._delete_agentic_memory_by_query_tool = delete_agentic_memory_by_query_tool + self._search_agentic_memory_tool = search_agentic_memory_tool + + def teardown_method(self): + """Cleanup after each test method.""" + self.init_client_patcher.stop() + + # Clean up environment variable + import os + if 'OPENSEARCH_MEMORY_CONTAINER_ID' in os.environ: + del os.environ['OPENSEARCH_MEMORY_CONTAINER_ID'] + + @pytest.fixture + def memory_container_id(self): + """Fixture for a common memory container ID.""" + return 'HudqiJkB1SltqOcZusVU' + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'payload, mock_response', agentic_memory_data.CREATE_SESSION_HAPPY_PATH_CASES + ) + async def test_create_agentic_memory_session_happy_paths( + self, memory_container_id, payload, mock_response + ): + """Test successful create_agentic_memory_session for various payloads.""" + # Setup + self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.CreateAgenticMemorySessionArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + **payload, + ) + result = await self._create_agentic_memory_session_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Successfully created session' in result[0]['text'] + # Verify that the ID from the response is in the output + assert mock_response['session_id'] in result[0]['text'] + + # Request body verification + self.mock_client.transport.perform_request.assert_called_once_with( + method='POST', + url=f'/_plugins/_ml/memory_containers/{memory_container_id}/memories/sessions', + body=payload, + ) + + @pytest.mark.asyncio + async def test_create_agentic_memory_session_error(self): + """Test create_agentic_memory_session exception handling.""" + # Setup + self.mock_client.transport.perform_request.side_effect = 
Exception( + 'Memory container not found' + ) + payload: Dict[str, Any] = {'session_id': 'abc123'} + container_id = 'non_existent_container' + + # Execute + args = self.CreateAgenticMemorySessionArgs( + opensearch_cluster_name='', memory_container_id=container_id, **payload + ) + result = await self._create_agentic_memory_session_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating session: Memory container not found' in result[0]['text'] + self.mock_client.transport.perform_request.assert_called_once_with( + method='POST', + url=f'/_plugins/_ml/memory_containers/{container_id}/memories/sessions', + body=payload, + ) + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'memory_type, memory_id, mock_response', + agentic_memory_data.GET_MEMORY_HAPPY_PATH_CASES, + ) + async def test_get_agentic_memory_happy_paths( + self, memory_container_id, memory_type, memory_id, mock_response + ): + """Test successful get_agentic_memory for all memory types.""" + # Setup + self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.GetAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=memory_type, + id=memory_id, + ) + result = await self._get_agentic_memory_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + # Check that the response is in the body (as a JSON string) + assert mock_response['_id'] in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/{memory_id}' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='GET', url=expected_url + ) + + @pytest.mark.asyncio + async def test_get_agentic_memory_api_error(self, memory_container_id): + """Test get_agentic_memory exception handling for API errors.""" + # Setup + self.mock_client.transport.perform_request.side_effect = 
Exception('Memory not found') + + # Execute + args = self.GetAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=MemoryType.working, + id='non_existent_id', + ) + result = await self._get_agentic_memory_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error getting memory: Memory not found' in result[0]['text'] + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'payload, mock_response', + agentic_memory_data.ADD_MEMORIES_HAPPY_PATH_CASES, + ) + async def test_add_agentic_memories_happy_paths( + self, memory_container_id, payload, mock_response + ): + """Test successful add_agentic_memories for various payloads.""" + # Setup + self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.AddAgenticMemoriesArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + **payload, + ) + result = await self._add_agentic_memories_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Successfully added memory' in result[0]['text'] + + # Verify that the ID from the response is in the output + if 'working_memory_id' in mock_response: + assert mock_response['working_memory_id'] in result[0]['text'] + if 'session_id' in mock_response: + assert mock_response['session_id'] in result[0]['text'] + + # Request body validation + # Pydantic adds default values (e.g., infer=False) that we need to take into account. 
+ expected_body = payload.copy() + if 'infer' not in expected_body: + expected_body['infer'] = False # Pydantic adds default=False + + self.mock_client.transport.perform_request.assert_called_once_with( + method='POST', + url=f'/_plugins/_ml/memory_containers/{memory_container_id}/memories', + body=expected_body, + ) + + @pytest.mark.asyncio + async def test_add_agentic_memories_api_error(self, memory_container_id): + """Test add_agentic_memories exception handling from the API.""" + # Setup + self.mock_client.transport.perform_request.side_effect = Exception('Container not found') + + payload = { + 'messages': [{'content': [{'text': 'Hello!', 'type': 'text'}]}], + 'payload_type': PayloadType.conversational, + } + + # Execute + args = self.AddAgenticMemoriesArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + **payload, + ) + result = await self._add_agentic_memories_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error adding memory: Container not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_add_agentic_memories_validation_error_missing_messages(self): + """Test validation error when messages are missing for conversational payload.""" + with pytest.raises(ValidationError) as exc_info: + self.AddAgenticMemoriesArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + payload_type=PayloadType.conversational, + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_MESSAGES_REQUIRED + + @pytest.mark.asyncio + async def test_add_agentic_memories_validation_error_missing_structured_data(self): + """Test validation error when structured_data is missing for data payload.""" + with pytest.raises(ValidationError) as exc_info: + self.AddAgenticMemoriesArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + payload_type=PayloadType.data, + ) + + errors = exc_info.value.errors() + assert len(errors) > 0 # 
There may be more errors + assert any(e['type'] == ERR_STRUCTURED_DATA_REQUIRED for e in errors) + + @pytest.mark.asyncio + async def test_add_agentic_memories_validation_error_conflicting_fields(self): + """Test validation error when both messages and structured_data are provided.""" + with pytest.raises(ValidationError) as exc_info: + self.AddAgenticMemoriesArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + messages=[{'content': [{'text': 'Hello!', 'type': 'text'}]}], # type: ignore + structured_data={'key': 'value'}, + payload_type=PayloadType.conversational, + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_FIELD_PROHIBITED + ctx = errors[0].get('ctx') + assert ctx is not None + assert ctx['field_name'] == 'structured_data' + + @pytest.mark.asyncio + async def test_add_agentic_memories_validation_error_invalid_messages_structure( + self, + ): + """Test validation error (from sub-model) when messages have invalid structure.""" + with pytest.raises(ValidationError) as exc_info: + self.AddAgenticMemoriesArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + messages=[ + { + 'role': 'user' + # Missing 'content' + } + ], # type: ignore + payload_type=PayloadType.conversational, + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == 'missing' + assert errors[0]['loc'] == ('messages', 0, 'content') + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'memory_type, search_body, mock_response', + agentic_memory_data.SEARCH_MEMORY_HAPPY_PATH_CASES, + ) + async def test_search_agentic_memory_happy_paths( + self, memory_container_id, memory_type, search_body, mock_response + ): + """Test successful search_agentic_memory for various types and queries.""" + # Setup + self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.SearchAgenticMemoryArgs( + opensearch_cluster_name='', + 
memory_container_id=memory_container_id, + type=memory_type, + **search_body, + ) + result = await self._search_agentic_memory_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert f'Search results for {memory_type.value}' in result[0]['text'] + + # Verify that the response is in the body (as a JSON string) + assert json.dumps(mock_response) in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/_search' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='GET', url=expected_url, body=search_body + ) + + @pytest.mark.asyncio + async def test_search_agentic_memory_api_error(self): + """Test search_agentic_memory exception handling for API errors.""" + # Setup + self.mock_client.transport.perform_request.side_effect = Exception('Container not found') + container_id = 'non_existent_container' + search_body: Dict[str, Any] = {'query': {'match_all': {}}} + + # Execute + args = self.SearchAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id=container_id, + type=MemoryType.sessions, + **search_body, + ) + result = await self._search_agentic_memory_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error searching memory: Container not found' in result[0]['text'] + self.mock_client.transport.perform_request.assert_called_once_with( + method='GET', + url=f'/_plugins/_ml/memory_containers/{container_id}/memories/sessions/_search', + body=search_body, + ) + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'memory_type, memory_id, mock_response', + agentic_memory_data.DELETE_MEMORY_ID_HAPPY_PATH_CASES, + ) + async def test_delete_agentic_memory_by_id_happy_paths( + self, memory_container_id, memory_type, memory_id, mock_response + ): + """Test successful delete_agentic_memory_by_id for all memory types.""" + # Setup + 
self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.DeleteAgenticMemoryByIDArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=memory_type, + id=memory_id, + ) + result = await self._delete_agentic_memory_by_id_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Successfully deleted memory' in result[0]['text'] + assert memory_id in result[0]['text'] + assert memory_type.value in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/{memory_id}' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='DELETE', url=expected_url + ) + + @pytest.mark.asyncio + async def test_delete_agentic_memory_by_id_error(self, memory_container_id): + """Test delete_agentic_memory_by_id exception handling.""" + # Setup + self.mock_client.transport.perform_request.side_effect = Exception('Memory not found') + memory_type = MemoryType.working + memory_id = 'non_existent_id' + + # Execute + args = self.DeleteAgenticMemoryByIDArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=memory_type, + id=memory_id, + ) + result = await self._delete_agentic_memory_by_id_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error deleting memory: Memory not found' in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/{memory_id}' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='DELETE', url=expected_url + ) + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'memory_type, query_body, mock_response', + agentic_memory_data.DELETE_MEMORY_QUERY_HAPPY_PATH_CASES, + ) + async def test_delete_agentic_memory_by_query_happy_paths( + self, memory_container_id, memory_type, query_body, 
mock_response + ): + """Test successful delete_agentic_memory_by_query for various types and queries.""" + # Setup + self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.DeleteAgenticMemoryByQueryArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=memory_type, + **query_body, + ) + result = await self._delete_agentic_memory_by_query_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Successfully deleted memories by query' in result[0]['text'] + assert f'Deleted: {mock_response["deleted"]}' in result[0]['text'] + assert memory_type.value in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/_delete_by_query' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='POST', url=expected_url, body=query_body + ) + + @pytest.mark.asyncio + async def test_delete_agentic_memory_by_query_error(self, memory_container_id): + """Test delete_agentic_memory_by_query exception handling.""" + # Setup + self.mock_client.transport.perform_request.side_effect = Exception( + 'Query validation failed' + ) + memory_type = MemoryType.working + + query_body: Dict[str, Any] = {'query': {'invalid_query': {'field': 'value'}}} + + # Execute + args = self.DeleteAgenticMemoryByQueryArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=memory_type, + **query_body, + ) + result = await self._delete_agentic_memory_by_query_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error deleting memories by query: Query validation failed' in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/_delete_by_query' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='POST', url=expected_url, 
body=query_body + ) + + @pytest.mark.asyncio + @pytest.mark.parametrize( + 'memory_type, memory_id, update_body, mock_response', + agentic_memory_data.UPDATE_HAPPY_PATH_CASES, + ) + async def test_update_agentic_memory_happy_paths( + self, memory_container_id, memory_type, memory_id, update_body, mock_response + ): + """Test successful update_agentic_memory for various types and fields.""" + # Setup + self.mock_client.transport.perform_request.return_value = mock_response + + # Execute + args = self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=memory_type, + id=memory_id, + **update_body, + ) + result = await self._update_agentic_memory_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Successfully updated memory' in result[0]['text'] + assert memory_id in result[0]['text'] + + expected_url = ( + f'/_plugins/_ml/memory_containers/{memory_container_id}/' + f'memories/{memory_type.value}/{memory_id}' + ) + self.mock_client.transport.perform_request.assert_called_once_with( + method='PUT', url=expected_url, body=update_body + ) + + @pytest.mark.asyncio + async def test_update_agentic_memory_api_error(self, memory_container_id): + """Test update_agentic_memory exception handling from the API.""" + # Setup + self.mock_client.transport.perform_request.side_effect = Exception('Memory not found') + + # Execute + args = self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id=memory_container_id, + type=MemoryType.working, + id='non_existent_id', + tags={'topic': 'test'}, + ) + result = await self._update_agentic_memory_tool(args) + + # Assert + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error updating memory: Memory not found' in result[0]['text'] + self.mock_client.transport.perform_request.assert_called_once_with( + method='PUT', + 
url=f'/_plugins/_ml/memory_containers/{memory_container_id}/memories/working/non_existent_id', + body={'tags': {'topic': 'test'}}, + ) + + @pytest.mark.asyncio + async def test_update_agentic_memory_validation_error_session_with_working_fields( + self, + ): + """Test validation error when session has working memory fields.""" + with pytest.raises(ValidationError) as exc_info: + self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + type=MemoryType.sessions, + id='session_id', + messages=[{'role': 'user', 'content': [{'text': 'test', 'type': 'text'}]}], # type: ignore + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_FIELD_NOT_ALLOWED + + ctx = errors[0].get('ctx') + assert ctx is not None + assert ctx['field_name'] == 'messages' + assert ctx['memory_type'] == MemoryType.sessions.value + + @pytest.mark.asyncio + async def test_update_agentic_memory_validation_error_working_with_session_fields( + self, + ): + """Test validation error when working memory has session fields.""" + with pytest.raises(ValidationError) as exc_info: + self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + type=MemoryType.working, + id='working_id', + summary='This should not be here', # Session field + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_FIELD_NOT_ALLOWED + + ctx = errors[0].get('ctx') + assert ctx is not None + assert ctx['field_name'] == 'summary' + assert ctx['memory_type'] == MemoryType.working.value + + @pytest.mark.asyncio + async def test_update_agentic_memory_validation_error_long_term_with_working_fields( + self, + ): + """Test validation error when long-term memory has working fields.""" + with pytest.raises(ValidationError) as exc_info: + self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + type=MemoryType.long_term, + id='long_term_id', + 
structured_data={'key': 'value'}, # Working field + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_FIELD_NOT_ALLOWED + + ctx = errors[0].get('ctx') + assert ctx is not None + assert ctx['field_name'] == 'structured_data' + assert ctx['memory_type'] == MemoryType.long_term.value + + @pytest.mark.asyncio + async def test_update_agentic_memory_validation_error_working_no_fields(self): + """Test validation error when working memory has no updatable fields.""" + with pytest.raises(ValidationError) as exc_info: + self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + type=MemoryType.working, + id='working_id', + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_MISSING_WORKING_FIELD + + @pytest.mark.asyncio + async def test_update_agentic_memory_validation_error_long_term_no_fields(self): + """Test validation error when long-term memory has no updatable fields.""" + with pytest.raises(ValidationError) as exc_info: + self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + type=MemoryType.long_term, + id='long_term_id', + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == ERR_MISSING_LONG_TERM_FIELD + + @pytest.mark.asyncio + async def test_update_agentic_memory_validation_error_working_invalid_messages( + self, + ): + """Test validation error (from sub-model) when working memory messages have invalid structure.""" + with pytest.raises(ValidationError) as exc_info: + self.UpdateAgenticMemoryArgs( + opensearch_cluster_name='', + memory_container_id='id_123', + type=MemoryType.working, + id='working_id', + messages=[ + { + 'role': 'user' + # Missing 'content' field + } + ], # type: ignore + ) + + errors = exc_info.value.errors() + assert len(errors) == 1 + assert errors[0]['type'] == 'missing' + assert errors[0]['loc'] == ('messages', 0, 'content') + + def 
test_tool_registry(self): + """Test TOOL_REGISTRY structure.""" + expected_tools = [ + 'CreateAgenticMemorySessionTool', + 'AddAgenticMemoriesTool', + 'GetAgenticMemoryTool', + 'UpdateAgenticMemoryTool', + 'DeleteAgenticMemoryByIDTool', + 'DeleteAgenticMemoryByQueryTool', + 'SearchAgenticMemoryTool', + ] + + for tool in expected_tools: + assert tool in self.TOOL_REGISTRY + assert 'description' in self.TOOL_REGISTRY[tool] + assert 'input_schema' in self.TOOL_REGISTRY[tool] + assert 'function' in self.TOOL_REGISTRY[tool] + assert 'args_model' in self.TOOL_REGISTRY[tool] diff --git a/tests/tools/test_config.py b/tests/tools/test_config.py index 1ed735a..8e9dd8d 100644 --- a/tests/tools/test_config.py +++ b/tests/tools/test_config.py @@ -203,7 +203,7 @@ def test_parse_cli_to_nested_config_top_level_and_args(): 'tool.ListIndexTool.customTag': 'analytics', # should be ignored (non-standard) 'tool.ListIndexTool.args.index.required': 'true', 'tool.ListIndexTool.args.index.default': 'my-index', - 'tool.SearchIndexTool.args.query.default': '{"match_all": {}}', + 'tool.SearchIndexTool.args.query_dsl.default': '{"match_all": {}}', 'invalid.no_prefix': 'ignored', 'tool.Bad': 'ignored', # missing fieldPath } @@ -217,7 +217,7 @@ def test_parse_cli_to_nested_config_top_level_and_args(): assert nested['ListIndexTool']['args']['index']['default'] == 'my-index' assert 'SearchIndexTool' in nested - assert nested['SearchIndexTool']['args']['query']['default'] == {'match_all': {}} + assert nested['SearchIndexTool']['args']['query_dsl']['default'] == {'match_all': {}} # invalid keys should not create entries assert 'Bad' not in nested diff --git a/tests/tools/test_experiment_tools.py b/tests/tools/test_experiment_tools.py new file mode 100644 index 0000000..6af2f0a --- /dev/null +++ b/tests/tools/test_experiment_tools.py @@ -0,0 +1,320 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +import pytest +from unittest.mock import Mock, patch, AsyncMock + + 
+class TestExperimentTools: + def setup_method(self): + """Setup that runs before each test method.""" + self.mock_client = Mock() + self.mock_client.info = AsyncMock(return_value={'version': {'number': '3.1.0'}}) + + self.mock_client.plugins = Mock() + self.mock_client.plugins.search_relevance = Mock() + self.mock_client.plugins.search_relevance.get_experiments = AsyncMock(return_value={}) + self.mock_client.plugins.search_relevance.put_experiments = AsyncMock(return_value={}) + self.mock_client.plugins.search_relevance.delete_experiments = AsyncMock(return_value={}) + + self.init_client_patcher = patch( + 'opensearch.client.initialize_client', return_value=self.mock_client + ) + self.init_client_patcher.start() + + import sys + for module in ['tools.tools']: + if module in sys.modules: + del sys.modules[module] + + from tools.tools import ( + GetExperimentArgs, + CreateExperimentArgs, + DeleteExperimentArgs, + get_experiment_tool, + create_experiment_tool, + delete_experiment_tool, + ) + + self.GetExperimentArgs = GetExperimentArgs + self.CreateExperimentArgs = CreateExperimentArgs + self.DeleteExperimentArgs = DeleteExperimentArgs + self._get_experiment_tool = get_experiment_tool + self._create_experiment_tool = create_experiment_tool + self._delete_experiment_tool = delete_experiment_tool + + def teardown_method(self): + """Cleanup after each test method.""" + self.init_client_patcher.stop() + + @pytest.mark.asyncio + async def test_get_experiment_tool_success(self): + """Test successful retrieval of an experiment by ID.""" + experiment_id = 'exp-abc123' + mock_response = { + '_id': experiment_id, + '_source': { + 'type': 'PAIRWISE_COMPARISON', + 'status': 'COMPLETED', + 'querySetId': 'qs-1', + }, + } + self.mock_client.plugins.search_relevance.get_experiments.return_value = mock_response + + result = await self._get_experiment_tool( + self.GetExperimentArgs(opensearch_cluster_name='', experiment_id=experiment_id) + ) + + assert len(result) == 1 + assert 
result[0]['type'] == 'text' + assert experiment_id in result[0]['text'] + assert 'PAIRWISE_COMPARISON' in result[0]['text'] + self.mock_client.plugins.search_relevance.get_experiments.assert_called_once_with( + experiment_id=experiment_id + ) + + @pytest.mark.asyncio + async def test_get_experiment_tool_error(self): + """Test error handling when retrieving an experiment fails.""" + self.mock_client.plugins.search_relevance.get_experiments.side_effect = Exception( + 'Experiment not found' + ) + + result = await self._get_experiment_tool( + self.GetExperimentArgs(opensearch_cluster_name='', experiment_id='missing-id') + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error retrieving experiment' in result[0]['text'] + assert 'Experiment not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_create_experiment_tool_pairwise(self): + """Test creating a PAIRWISE_COMPARISON experiment with 2 configs.""" + mock_response = {'_id': 'exp-new', 'result': 'created'} + self.mock_client.plugins.search_relevance.put_experiments.return_value = mock_response + + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1", "config-2"]', + experiment_type='PAIRWISE_COMPARISON', + size=10, + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Experiment created' in result[0]['text'] + assert 'exp-new' in result[0]['text'] + + call_kwargs = self.mock_client.plugins.search_relevance.put_experiments.call_args + body = call_kwargs.kwargs['body'] + assert body['querySetId'] == 'qs-1' + assert body['searchConfigurationList'] == ['config-1', 'config-2'] + assert body['type'] == 'PAIRWISE_COMPARISON' + assert body['size'] == 10 + assert 'judgmentList' not in body + + @pytest.mark.asyncio + async def test_create_experiment_tool_pointwise(self): + """Test creating a POINTWISE_EVALUATION experiment with 1 config and 
judgment lists.""" + mock_response = {'_id': 'exp-new', 'result': 'created'} + self.mock_client.plugins.search_relevance.put_experiments.return_value = mock_response + + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1"]', + experiment_type='POINTWISE_EVALUATION', + judgment_list_ids='["judgment-1", "judgment-2"]', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Experiment created' in result[0]['text'] + + call_kwargs = self.mock_client.plugins.search_relevance.put_experiments.call_args + body = call_kwargs.kwargs['body'] + assert body['type'] == 'POINTWISE_EVALUATION' + assert body['searchConfigurationList'] == ['config-1'] + assert body['judgmentList'] == ['judgment-1', 'judgment-2'] + + @pytest.mark.asyncio + async def test_create_experiment_tool_hybrid_optimizer(self): + """Test creating a HYBRID_OPTIMIZER experiment.""" + mock_response = {'_id': 'exp-hybrid', 'result': 'created'} + self.mock_client.plugins.search_relevance.put_experiments.return_value = mock_response + + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1"]', + experiment_type='HYBRID_OPTIMIZER', + judgment_list_ids='["judgment-1"]', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Experiment created' in result[0]['text'] + + call_kwargs = self.mock_client.plugins.search_relevance.put_experiments.call_args + body = call_kwargs.kwargs['body'] + assert body['type'] == 'HYBRID_OPTIMIZER' + assert body['judgmentList'] == ['judgment-1'] + + @pytest.mark.asyncio + async def test_create_experiment_pairwise_wrong_config_count(self): + """Test that PAIRWISE_COMPARISON with != 2 configs returns an error.""" + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + 
opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1"]', + experiment_type='PAIRWISE_COMPARISON', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating experiment' in result[0]['text'] + assert 'PAIRWISE_COMPARISON requires exactly 2' in result[0]['text'] + self.mock_client.plugins.search_relevance.put_experiments.assert_not_called() + + @pytest.mark.asyncio + async def test_create_experiment_pointwise_wrong_config_count(self): + """Test that POINTWISE_EVALUATION with != 1 config returns an error.""" + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1", "config-2"]', + experiment_type='POINTWISE_EVALUATION', + judgment_list_ids='["judgment-1"]', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating experiment' in result[0]['text'] + assert 'POINTWISE_EVALUATION requires exactly 1' in result[0]['text'] + self.mock_client.plugins.search_relevance.put_experiments.assert_not_called() + + @pytest.mark.asyncio + async def test_create_experiment_pointwise_missing_judgment_lists(self): + """Test that POINTWISE_EVALUATION without judgment_list_ids returns an error.""" + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1"]', + experiment_type='POINTWISE_EVALUATION', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating experiment' in result[0]['text'] + assert 'judgment_list_ids' in result[0]['text'] + self.mock_client.plugins.search_relevance.put_experiments.assert_not_called() + + @pytest.mark.asyncio + async def test_create_experiment_invalid_search_config_json(self): + """Test that invalid JSON for search_configuration_ids returns an error.""" + result = await 
self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='not-valid-json', + experiment_type='PAIRWISE_COMPARISON', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating experiment' in result[0]['text'] + self.mock_client.plugins.search_relevance.put_experiments.assert_not_called() + + @pytest.mark.asyncio + async def test_create_experiment_tool_error(self): + """Test error handling when creating an experiment fails.""" + self.mock_client.plugins.search_relevance.put_experiments.side_effect = Exception( + 'Server error' + ) + + result = await self._create_experiment_tool( + self.CreateExperimentArgs( + opensearch_cluster_name='', + query_set_id='qs-1', + search_configuration_ids='["config-1", "config-2"]', + experiment_type='PAIRWISE_COMPARISON', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating experiment' in result[0]['text'] + assert 'Server error' in result[0]['text'] + + @pytest.mark.asyncio + async def test_delete_experiment_tool_success(self): + """Test successful deletion of an experiment by ID.""" + experiment_id = 'exp-abc123' + mock_response = {'_id': experiment_id, 'result': 'deleted'} + self.mock_client.plugins.search_relevance.delete_experiments.return_value = mock_response + + result = await self._delete_experiment_tool( + self.DeleteExperimentArgs(opensearch_cluster_name='', experiment_id=experiment_id) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert experiment_id in result[0]['text'] + assert 'deleted' in result[0]['text'] + self.mock_client.plugins.search_relevance.delete_experiments.assert_called_once_with( + experiment_id=experiment_id + ) + + @pytest.mark.asyncio + async def test_delete_experiment_tool_error(self): + """Test error handling when deleting an experiment fails.""" + 
self.mock_client.plugins.search_relevance.delete_experiments.side_effect = Exception( + 'Experiment not found' + ) + + result = await self._delete_experiment_tool( + self.DeleteExperimentArgs(opensearch_cluster_name='', experiment_id='missing-id') + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error deleting experiment' in result[0]['text'] + assert 'Experiment not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_experiment_tools_registered_in_registry(self): + """Test that all experiment tools are registered in the TOOL_REGISTRY.""" + import sys + for module in ['tools.tools']: + if module in sys.modules: + del sys.modules[module] + + from tools.tools import TOOL_REGISTRY + + assert 'GetExperimentTool' in TOOL_REGISTRY + assert 'CreateExperimentTool' in TOOL_REGISTRY + assert 'DeleteExperimentTool' in TOOL_REGISTRY + + for tool_name in ['GetExperimentTool', 'CreateExperimentTool', 'DeleteExperimentTool']: + tool = TOOL_REGISTRY[tool_name] + assert 'description' in tool + assert 'input_schema' in tool + assert 'function' in tool + assert 'args_model' in tool + assert tool.get('min_version') == '3.1.0' diff --git a/tests/tools/test_generic_tool.py b/tests/tools/test_generic_tool.py index 58aae94..3f8b2d5 100644 --- a/tests/tools/test_generic_tool.py +++ b/tests/tools/test_generic_tool.py @@ -101,6 +101,39 @@ async def test_generic_tool(): print() +@pytest.mark.asyncio +async def test_write_disabled_message_does_not_leak_config(): + """Test that the write-disabled error message does not expose config settings.""" + from tools.tool_filter import set_allow_write_setting + + original_allow_write = os.environ.get('OPENSEARCH_SETTINGS_ALLOW_WRITE', 'true') + os.environ['OPENSEARCH_SETTINGS_ALLOW_WRITE'] = 'false' + set_allow_write_setting(False) + + try: + for method in ['PUT', 'POST', 'DELETE', 'PATCH']: + args = GenericOpenSearchApiArgs( + opensearch_cluster_name='', + path='/test_index/_doc/1', + method=method, + ) + 
result = await generic_opensearch_api_tool(args) + error_text = result[0]['text'] + + assert 'Write operations are disabled' in error_text, ( + f'Expected write-disabled message for {method}' + ) + assert 'OPENSEARCH_SETTINGS_ALLOW_WRITE' not in error_text, ( + f'Error message for {method} should not expose env var name' + ) + assert 'allow_write' not in error_text, ( + f'Error message for {method} should not expose config setting name' + ) + finally: + os.environ['OPENSEARCH_SETTINGS_ALLOW_WRITE'] = original_allow_write + set_allow_write_setting(None) + + if __name__ == '__main__': print('Testing GenericOpenSearchApiTool...') print('Note: This test requires a running OpenSearch instance and proper configuration.') diff --git a/tests/tools/test_judgment_tool_evals.py b/tests/tools/test_judgment_tool_evals.py new file mode 100644 index 0000000..e1b81f7 --- /dev/null +++ b/tests/tools/test_judgment_tool_evals.py @@ -0,0 +1,93 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +import sys +import pytest +import anthropic +from unittest.mock import patch, Mock, AsyncMock + + +JUDGMENT_CREATE_TOOLS = [ + 'CreateJudgmentListTool', + 'CreateUBIJudgmentListTool', + 'CreateLLMJudgmentListTool', +] + + +@pytest.mark.eval +class TestJudgmentToolRouting: + @classmethod + def setup_class(cls): + mock_client = Mock() + mock_client.info = AsyncMock(return_value={'version': {'number': '3.1.0'}}) + + cls.patcher = patch('opensearch.client.initialize_client', return_value=mock_client) + cls.patcher.start() + + for module in ['tools.tools']: + if module in sys.modules: + del sys.modules[module] + + from tools.tools import TOOL_REGISTRY + + cls.tool_definitions = [ + { + 'name': name, + 'description': TOOL_REGISTRY[name]['description'], + 'input_schema': TOOL_REGISTRY[name]['input_schema'], + } + for name in JUDGMENT_CREATE_TOOLS + ] + + @classmethod + def teardown_class(cls): + cls.patcher.stop() + + def ask_agent(self, scenario: str) -> 
anthropic.types.ToolUseBlock: + client = anthropic.Anthropic() + response = client.messages.create( + model='claude-haiku-4-5-20251001', + max_tokens=512, + tools=self.tool_definitions, + tool_choice={'type': 'any'}, + messages=[{'role': 'user', 'content': scenario}], + ) + tool_uses = [b for b in response.content if b.type == 'tool_use'] + assert len(tool_uses) == 1, f'Expected 1 tool call, got {len(tool_uses)}' + return tool_uses[0] + + def test_selects_import_tool_for_manual_ratings(self): + """When a user has manually annotated ratings, the agent should use CreateJudgmentListTool.""" + tool_use = self.ask_agent( + "I have manually graded search results and want to store them as a judgment list " + "called 'manual-judgments'. Here are my ratings: " + '[{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}, {"docId": "doc2", "rating": 1}]}]' + ) + + assert tool_use.name == 'CreateJudgmentListTool' + assert tool_use.input.get('name') == 'manual-judgments' + + def test_selects_ubi_tool_for_click_data(self): + """When a user wants judgments derived from UBI click data, the agent should use CreateUBIJudgmentListTool.""" + tool_use = self.ask_agent( + 'My search application has been logging user behavior through User Behavior Insights. ' + "Use that click data to create a judgment list called 'ubi-judgments' " + 'using the COEC click model.' + ) + + assert tool_use.name == 'CreateUBIJudgmentListTool' + assert tool_use.input.get('name') == 'ubi-judgments' + assert 'coec' in tool_use.input.get('click_model', '').lower() + + def test_selects_llm_tool_for_ai_generated_ratings(self): + """When a user wants an LLM to rate documents, the agent should use CreateLLMJudgmentListTool.""" + tool_use = self.ask_agent( + "Use ML model 'model-abc' to automatically evaluate search results for relevance. " + "The query set ID is 'qs-123' and the search configuration ID is 'sc-456'. " + "Create a judgment list called 'ai-judgments'."
+ ) + + assert tool_use.name == 'CreateLLMJudgmentListTool' + assert tool_use.input.get('model_id') == 'model-abc' + assert tool_use.input.get('query_set_id') == 'qs-123' + assert tool_use.input.get('search_configuration_id') == 'sc-456' diff --git a/tests/tools/test_judgment_tools.py b/tests/tools/test_judgment_tools.py new file mode 100644 index 0000000..47e0d0c --- /dev/null +++ b/tests/tools/test_judgment_tools.py @@ -0,0 +1,400 @@ +# Copyright OpenSearch Contributors +# SPDX-License-Identifier: Apache-2.0 + +import pytest +from unittest.mock import Mock, patch, AsyncMock + + +class TestJudgmentTools: + def setup_method(self): + """Setup that runs before each test method.""" + self.mock_client = Mock() + self.mock_client.info = AsyncMock(return_value={'version': {'number': '3.1.0'}}) + + self.mock_client.plugins = Mock() + self.mock_client.plugins.search_relevance = Mock() + self.mock_client.plugins.search_relevance.get_judgments = AsyncMock(return_value={}) + self.mock_client.plugins.search_relevance.put_judgments = AsyncMock(return_value={}) + self.mock_client.plugins.search_relevance.delete_judgments = AsyncMock(return_value={}) + + self.init_client_patcher = patch( + 'opensearch.client.initialize_client', return_value=self.mock_client + ) + self.init_client_patcher.start() + + import sys + for module in ['tools.tools']: + if module in sys.modules: + del sys.modules[module] + + from tools.tools import ( + GetJudgmentListArgs, + CreateJudgmentListArgs, + CreateLLMJudgmentListArgs, + CreateUBIJudgmentListArgs, + DeleteJudgmentListArgs, + get_judgment_list_tool, + create_judgment_list_tool, + create_llm_judgment_list_tool, + create_ubi_judgment_list_tool, + delete_judgment_list_tool, + ) + + self.GetJudgmentListArgs = GetJudgmentListArgs + self.CreateJudgmentListArgs = CreateJudgmentListArgs + self.CreateLLMJudgmentListArgs = CreateLLMJudgmentListArgs + self.CreateUBIJudgmentListArgs = CreateUBIJudgmentListArgs + self.DeleteJudgmentListArgs = 
DeleteJudgmentListArgs + self._get_judgment_list_tool = get_judgment_list_tool + self._create_judgment_list_tool = create_judgment_list_tool + self._create_llm_judgment_list_tool = create_llm_judgment_list_tool + self._create_ubi_judgment_list_tool = create_ubi_judgment_list_tool + self._delete_judgment_list_tool = delete_judgment_list_tool + + def teardown_method(self): + """Cleanup after each test method.""" + self.init_client_patcher.stop() + + @pytest.mark.asyncio + async def test_get_judgment_list_tool_success(self): + """Test successful retrieval of a judgment by ID.""" + judgment_id = 'abc123' + mock_response = { + '_id': judgment_id, + '_source': { + 'name': 'my-judgments', + 'type': 'IMPORT_JUDGMENT', + 'judgmentRatings': [{'query': 'laptop', 'ratings': [{'docId': 'doc1', 'rating': 3}]}], + }, + } + self.mock_client.plugins.search_relevance.get_judgments.return_value = mock_response + + result = await self._get_judgment_list_tool( + self.GetJudgmentListArgs(opensearch_cluster_name='', judgment_id=judgment_id) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert judgment_id in result[0]['text'] + assert 'my-judgments' in result[0]['text'] + self.mock_client.plugins.search_relevance.get_judgments.assert_called_once_with( + judgment_id=judgment_id + ) + + @pytest.mark.asyncio + async def test_get_judgment_list_tool_error(self): + """Test error handling when retrieving a judgment fails.""" + self.mock_client.plugins.search_relevance.get_judgments.side_effect = Exception( + 'Not found' + ) + + result = await self._get_judgment_list_tool( + self.GetJudgmentListArgs(opensearch_cluster_name='', judgment_id='missing-id') + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error retrieving judgment' in result[0]['text'] + assert 'Not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_create_judgment_list_tool_success(self): + """Test creating a judgment list with manual relevance ratings.""" + 
mock_response = {'_id': 'new-id', 'result': 'created'} + self.mock_client.plugins.search_relevance.put_judgments.return_value = mock_response + + result = await self._create_judgment_list_tool( + self.CreateJudgmentListArgs( + opensearch_cluster_name='', + name='my-judgments', + judgment_ratings='[{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}]}]', + description='Test judgments', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Judgment created' in result[0]['text'] + assert 'new-id' in result[0]['text'] + + call_kwargs = self.mock_client.plugins.search_relevance.put_judgments.call_args + body = call_kwargs.kwargs['body'] + assert body['name'] == 'my-judgments' + assert body['type'] == 'IMPORT_JUDGMENT' + assert body['description'] == 'Test judgments' + assert body['judgmentRatings'] == [ + {'query': 'laptop', 'ratings': [{'docId': 'doc1', 'rating': 3}]} + ] + + @pytest.mark.asyncio + async def test_create_judgment_list_tool_no_description(self): + """Test that description is omitted from body when not provided.""" + self.mock_client.plugins.search_relevance.put_judgments.return_value = {'_id': 'id1'} + + await self._create_judgment_list_tool( + self.CreateJudgmentListArgs( + opensearch_cluster_name='', + name='my-judgments', + judgment_ratings='[{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}]}]', + ) + ) + + call_kwargs = self.mock_client.plugins.search_relevance.put_judgments.call_args + body = call_kwargs.kwargs['body'] + assert 'description' not in body + + @pytest.mark.asyncio + async def test_create_judgment_list_tool_invalid_json(self): + """Test that invalid judgment_ratings JSON returns an error.""" + result = await self._create_judgment_list_tool( + self.CreateJudgmentListArgs( + opensearch_cluster_name='', + name='my-judgments', + judgment_ratings='not-valid-json', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating judgment' in 
result[0]['text'] + + @pytest.mark.asyncio + async def test_create_judgment_list_tool_error(self): + """Test error handling when creating a judgment fails.""" + self.mock_client.plugins.search_relevance.put_judgments.side_effect = Exception( + 'Index not found' + ) + + result = await self._create_judgment_list_tool( + self.CreateJudgmentListArgs( + opensearch_cluster_name='', + name='my-judgments', + judgment_ratings='[{"query": "laptop", "ratings": [{"docId": "doc1", "rating": 3}]}]', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating judgment' in result[0]['text'] + assert 'Index not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_create_ubi_judgment_list_tool_success(self): + """Test successful creation of a UBI judgment.""" + mock_response = {'_id': 'ubi-id', 'result': 'created', 'status': 'PROCESSING'} + self.mock_client.plugins.search_relevance.put_judgments.return_value = mock_response + + result = await self._create_ubi_judgment_list_tool( + self.CreateUBIJudgmentListArgs( + opensearch_cluster_name='', + name='ubi-judgments', + click_model='coec', + max_rank=20, + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'UBI judgment created' in result[0]['text'] + assert 'ubi-id' in result[0]['text'] + + call_kwargs = self.mock_client.plugins.search_relevance.put_judgments.call_args + body = call_kwargs.kwargs['body'] + assert body['name'] == 'ubi-judgments' + assert body['type'] == 'UBI_JUDGMENT' + assert body['clickModel'] == 'coec' + assert body['maxRank'] == 20 + assert 'startDate' not in body + assert 'endDate' not in body + + @pytest.mark.asyncio + async def test_create_ubi_judgment_list_tool_with_date_range(self): + """Test UBI judgment creation with optional start and end dates.""" + self.mock_client.plugins.search_relevance.put_judgments.return_value = {'_id': 'id1'} + + await self._create_ubi_judgment_list_tool( + self.CreateUBIJudgmentListArgs( + 
opensearch_cluster_name='', + name='ubi-judgments-q1', + click_model='coec', + max_rank=10, + start_date='2024-01-01', + end_date='2024-03-31', + ) + ) + + call_kwargs = self.mock_client.plugins.search_relevance.put_judgments.call_args + body = call_kwargs.kwargs['body'] + assert body['startDate'] == '2024-01-01' + assert body['endDate'] == '2024-03-31' + + @pytest.mark.asyncio + async def test_create_ubi_judgment_list_tool_error(self): + """Test error handling when creating a UBI judgment fails.""" + self.mock_client.plugins.search_relevance.put_judgments.side_effect = Exception( + 'UBI index not found' + ) + + result = await self._create_ubi_judgment_list_tool( + self.CreateUBIJudgmentListArgs( + opensearch_cluster_name='', + name='ubi-judgments', + click_model='coec', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating UBI judgment' in result[0]['text'] + assert 'UBI index not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_delete_judgment_list_tool_success(self): + """Test successful deletion of a judgment by ID.""" + judgment_id = 'abc123' + mock_response = {'_id': judgment_id, 'result': 'deleted'} + self.mock_client.plugins.search_relevance.delete_judgments.return_value = mock_response + + result = await self._delete_judgment_list_tool( + self.DeleteJudgmentListArgs(opensearch_cluster_name='', judgment_id=judgment_id) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert judgment_id in result[0]['text'] + assert 'deleted' in result[0]['text'] + self.mock_client.plugins.search_relevance.delete_judgments.assert_called_once_with( + judgment_id=judgment_id + ) + + @pytest.mark.asyncio + async def test_delete_judgment_list_tool_error(self): + """Test error handling when deleting a judgment fails.""" + self.mock_client.plugins.search_relevance.delete_judgments.side_effect = Exception( + 'Judgment not found' + ) + + result = await self._delete_judgment_list_tool( + 
self.DeleteJudgmentListArgs(opensearch_cluster_name='', judgment_id='missing-id') + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error deleting judgment' in result[0]['text'] + assert 'Judgment not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_create_llm_judgment_list_tool_success(self): + """Test successful creation of an LLM judgment list.""" + mock_response = {'_id': 'llm-id', 'result': 'created', 'status': 'PROCESSING'} + self.mock_client.plugins.search_relevance.put_judgments.return_value = mock_response + + result = await self._create_llm_judgment_list_tool( + self.CreateLLMJudgmentListArgs( + opensearch_cluster_name='', + name='llm-judgments', + query_set_id='qs-123', + search_configuration_id='sc-456', + model_id='model-789', + size=5, + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'LLM judgment list created' in result[0]['text'] + assert 'llm-id' in result[0]['text'] + + call_kwargs = self.mock_client.plugins.search_relevance.put_judgments.call_args + body = call_kwargs.kwargs['body'] + assert body['name'] == 'llm-judgments' + assert body['type'] == 'LLM_JUDGMENT' + assert body['querySetId'] == 'qs-123' + assert body['searchConfigurationList'] == ['sc-456'] + assert body['modelId'] == 'model-789' + assert body['size'] == 5 + assert body['contextFields'] == [] + + @pytest.mark.asyncio + async def test_create_llm_judgment_list_tool_with_context_fields(self): + """Test LLM judgment creation with explicit context fields.""" + self.mock_client.plugins.search_relevance.put_judgments.return_value = {'_id': 'id1'} + + await self._create_llm_judgment_list_tool( + self.CreateLLMJudgmentListArgs( + opensearch_cluster_name='', + name='llm-judgments', + query_set_id='qs-123', + search_configuration_id='sc-456', + model_id='model-789', + context_fields='["title", "description"]', + ) + ) + + call_kwargs = self.mock_client.plugins.search_relevance.put_judgments.call_args + body = 
call_kwargs.kwargs['body'] + assert body['contextFields'] == ['title', 'description'] + + @pytest.mark.asyncio + async def test_create_llm_judgment_list_tool_invalid_context_fields(self): + """Test that invalid context_fields JSON returns an error.""" + result = await self._create_llm_judgment_list_tool( + self.CreateLLMJudgmentListArgs( + opensearch_cluster_name='', + name='llm-judgments', + query_set_id='qs-123', + search_configuration_id='sc-456', + model_id='model-789', + context_fields='not-valid-json', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating LLM judgment list' in result[0]['text'] + + @pytest.mark.asyncio + async def test_create_llm_judgment_list_tool_error(self): + """Test error handling when creating an LLM judgment list fails.""" + self.mock_client.plugins.search_relevance.put_judgments.side_effect = Exception( + 'Model not found' + ) + + result = await self._create_llm_judgment_list_tool( + self.CreateLLMJudgmentListArgs( + opensearch_cluster_name='', + name='llm-judgments', + query_set_id='qs-123', + search_configuration_id='sc-456', + model_id='invalid-model', + ) + ) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error creating LLM judgment list' in result[0]['text'] + assert 'Model not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_judgment_tools_registered_in_registry(self): + """Test that all judgment tools are registered in the TOOL_REGISTRY.""" + import sys + for module in ['tools.tools']: + if module in sys.modules: + del sys.modules[module] + + from tools.tools import TOOL_REGISTRY + + assert 'GetJudgmentListTool' in TOOL_REGISTRY + assert 'CreateJudgmentListTool' in TOOL_REGISTRY + assert 'CreateUBIJudgmentListTool' in TOOL_REGISTRY + assert 'CreateLLMJudgmentListTool' in TOOL_REGISTRY + assert 'DeleteJudgmentListTool' in TOOL_REGISTRY + + for tool_name in ['GetJudgmentListTool', 'CreateJudgmentListTool', 'CreateUBIJudgmentListTool', 
'CreateLLMJudgmentListTool', 'DeleteJudgmentListTool']:
+            tool = TOOL_REGISTRY[tool_name]
+            assert 'description' in tool
+            assert 'input_schema' in tool
+            assert 'function' in tool
+            assert 'args_model' in tool
+            assert tool.get('min_version') == '3.1.0'
diff --git a/tests/tools/test_query_set_tools.py b/tests/tools/test_query_set_tools.py
new file mode 100644
index 0000000..d068600
--- /dev/null
+++ b/tests/tools/test_query_set_tools.py
@@ -0,0 +1,327 @@
+# Copyright OpenSearch Contributors
+# SPDX-License-Identifier: Apache-2.0
+
+import pytest
+from unittest.mock import Mock, patch, AsyncMock
+
+
+class TestQuerySetTools:
+    def setup_method(self):
+        """Setup that runs before each test method."""
+        self.mock_client = Mock()
+        self.mock_client.info = AsyncMock(return_value={'version': {'number': '3.1.0'}})
+
+        self.mock_client.plugins = Mock()
+        self.mock_client.plugins.search_relevance = Mock()
+        self.mock_client.plugins.search_relevance.get_query_sets = AsyncMock(return_value={})
+        self.mock_client.plugins.search_relevance.put_query_sets = AsyncMock(return_value={})
+        self.mock_client.plugins.search_relevance.post_query_sets = AsyncMock(return_value={})
+        self.mock_client.plugins.search_relevance.delete_query_sets = AsyncMock(return_value={})
+
+        self.init_client_patcher = patch(
+            'opensearch.client.initialize_client', return_value=self.mock_client
+        )
+        self.init_client_patcher.start()
+
+        import sys
+        for module in ['tools.tools']:
+            if module in sys.modules:
+                del sys.modules[module]
+
+        from tools.tools import (
+            GetQuerySetArgs,
+            CreateQuerySetArgs,
+            SampleQuerySetArgs,
+            DeleteQuerySetArgs,
+            get_query_set_tool,
+            create_query_set_tool,
+            sample_query_set_tool,
+            delete_query_set_tool,
+        )
+
+        self.GetQuerySetArgs = GetQuerySetArgs
+        self.CreateQuerySetArgs = CreateQuerySetArgs
+        self.SampleQuerySetArgs = SampleQuerySetArgs
+        self.DeleteQuerySetArgs = DeleteQuerySetArgs
+        self._get_query_set_tool = get_query_set_tool
+        self._create_query_set_tool = create_query_set_tool
+        self._sample_query_set_tool = sample_query_set_tool
+        self._delete_query_set_tool = delete_query_set_tool
+
+    def teardown_method(self):
+        """Cleanup after each test method."""
+        self.init_client_patcher.stop()
+
+    @pytest.mark.asyncio
+    async def test_get_query_set_tool_success(self):
+        """Test successful retrieval of a query set by ID."""
+        query_set_id = 'abc123'
+        mock_response = {
+            '_id': query_set_id,
+            '_source': {
+                'name': 'my-query-set',
+                'description': 'Test queries',
+                'querySetQueries': [{'queryText': 'laptop'}],
+            },
+        }
+        self.mock_client.plugins.search_relevance.get_query_sets.return_value = mock_response
+
+        result = await self._get_query_set_tool(
+            self.GetQuerySetArgs(opensearch_cluster_name='', query_set_id=query_set_id)
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert query_set_id in result[0]['text']
+        assert 'my-query-set' in result[0]['text']
+        self.mock_client.plugins.search_relevance.get_query_sets.assert_called_once_with(
+            query_set_id=query_set_id
+        )
+
+    @pytest.mark.asyncio
+    async def test_get_query_set_tool_error(self):
+        """Test error handling when retrieving a query set fails."""
+        self.mock_client.plugins.search_relevance.get_query_sets.side_effect = Exception(
+            'Not found'
+        )
+
+        result = await self._get_query_set_tool(
+            self.GetQuerySetArgs(opensearch_cluster_name='', query_set_id='missing-id')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Error retrieving query set' in result[0]['text']
+        assert 'Not found' in result[0]['text']
+
+    @pytest.mark.asyncio
+    async def test_create_query_set_tool_string_queries(self):
+        """Test creating a query set with a list of plain string queries."""
+        mock_response = {'_id': 'new-id', 'result': 'created'}
+        self.mock_client.plugins.search_relevance.put_query_sets.return_value = mock_response
+
+        result = await self._create_query_set_tool(
+            self.CreateQuerySetArgs(
+                opensearch_cluster_name='',
+                name='my-set',
+                queries='["laptop", "headphones"]',
+                description='Test set',
+            )
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Query set created' in result[0]['text']
+        assert 'new-id' in result[0]['text']
+
+        call_kwargs = self.mock_client.plugins.search_relevance.put_query_sets.call_args
+        body = call_kwargs.kwargs['body']
+        assert body['name'] == 'my-set'
+        assert body['sampling'] == 'manual'
+        assert body['querySetQueries'] == [
+            {'queryText': 'laptop'},
+            {'queryText': 'headphones'},
+        ]
+
+    @pytest.mark.asyncio
+    async def test_create_query_set_tool_dict_queries(self):
+        """Test creating a query set with queries already in queryText dict format."""
+        mock_response = {'_id': 'new-id', 'result': 'created'}
+        self.mock_client.plugins.search_relevance.put_query_sets.return_value = mock_response
+
+        result = await self._create_query_set_tool(
+            self.CreateQuerySetArgs(
+                opensearch_cluster_name='',
+                name='my-set',
+                queries='[{"queryText": "laptop"}, {"queryText": "monitor"}]',
+            )
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Query set created' in result[0]['text']
+
+        call_kwargs = self.mock_client.plugins.search_relevance.put_query_sets.call_args
+        body = call_kwargs.kwargs['body']
+        assert body['querySetQueries'] == [
+            {'queryText': 'laptop'},
+            {'queryText': 'monitor'},
+        ]
+
+    @pytest.mark.asyncio
+    async def test_create_query_set_tool_default_description(self):
+        """Test that description defaults to 'Query set: <name>' when not provided."""
+        self.mock_client.plugins.search_relevance.put_query_sets.return_value = {'_id': 'id1'}
+
+        await self._create_query_set_tool(
+            self.CreateQuerySetArgs(
+                opensearch_cluster_name='',
+                name='my-set',
+                queries='["query1"]',
+            )
+        )
+
+        call_kwargs = self.mock_client.plugins.search_relevance.put_query_sets.call_args
+        body = call_kwargs.kwargs['body']
+        assert body['description'] == 'Query set: my-set'
+
+    @pytest.mark.asyncio
+    async def test_create_query_set_tool_invalid_queries(self):
+        """Test that invalid queries JSON returns an error."""
+        result = await self._create_query_set_tool(
+            self.CreateQuerySetArgs(
+                opensearch_cluster_name='',
+                name='my-set',
+                queries='not-valid-json',
+            )
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Error creating query set' in result[0]['text']
+
+    @pytest.mark.asyncio
+    async def test_sample_query_set_tool_success(self):
+        """Test successful sampling of a query set from UBI data."""
+        mock_response = {'_id': 'sampled-id', 'result': 'created'}
+        self.mock_client.plugins.search_relevance.post_query_sets.return_value = mock_response
+
+        result = await self._sample_query_set_tool(
+            self.SampleQuerySetArgs(
+                opensearch_cluster_name='',
+                name='top-queries',
+                query_set_size=20,
+            )
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Query set sampled' in result[0]['text']
+        assert 'sampled-id' in result[0]['text']
+
+        call_kwargs = self.mock_client.plugins.search_relevance.post_query_sets.call_args
+        body = call_kwargs.kwargs['body']
+        assert body['name'] == 'top-queries'
+        assert body['sampling'] == 'topn'
+        assert body['querySetSize'] == 20
+
+    @pytest.mark.asyncio
+    async def test_sample_query_set_tool_custom_sampling(self):
+        """Test sampling a query set with a non-default sampling method."""
+        self.mock_client.plugins.search_relevance.post_query_sets.return_value = {
+            '_id': 'random-id',
+            'result': 'created',
+        }
+
+        result = await self._sample_query_set_tool(
+            self.SampleQuerySetArgs(
+                opensearch_cluster_name='',
+                name='random-queries',
+                query_set_size=30,
+                sampling='random',
+            )
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Query set sampled' in result[0]['text']
+
+        call_kwargs = self.mock_client.plugins.search_relevance.post_query_sets.call_args
+        body = call_kwargs.kwargs['body']
+        assert body['sampling'] == 'random'
+        assert body['querySetSize'] == 30
+
+    @pytest.mark.asyncio
+    async def test_sample_query_set_tool_default_description(self):
+        """Test that description defaults to a generated string when not provided."""
+        self.mock_client.plugins.search_relevance.post_query_sets.return_value = {'_id': 'id1'}
+
+        await self._sample_query_set_tool(
+            self.SampleQuerySetArgs(
+                opensearch_cluster_name='',
+                name='top-queries',
+                query_set_size=50,
+            )
+        )
+
+        call_kwargs = self.mock_client.plugins.search_relevance.post_query_sets.call_args
+        body = call_kwargs.kwargs['body']
+        assert body['description'] == 'Query set: top-queries (topn, size=50)'
+
+    @pytest.mark.asyncio
+    async def test_sample_query_set_tool_error(self):
+        """Test error handling when sampling a query set fails."""
+        self.mock_client.plugins.search_relevance.post_query_sets.side_effect = Exception(
+            'UBI index not found'
+        )
+
+        result = await self._sample_query_set_tool(
+            self.SampleQuerySetArgs(
+                opensearch_cluster_name='',
+                name='top-queries',
+                query_set_size=20,
+            )
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Error sampling query set' in result[0]['text']
+        assert 'UBI index not found' in result[0]['text']
+
+    @pytest.mark.asyncio
+    async def test_delete_query_set_tool_success(self):
+        """Test successful deletion of a query set by ID."""
+        query_set_id = 'abc123'
+        mock_response = {'_id': query_set_id, 'result': 'deleted'}
+        self.mock_client.plugins.search_relevance.delete_query_sets.return_value = mock_response
+
+        result = await self._delete_query_set_tool(
+            self.DeleteQuerySetArgs(opensearch_cluster_name='', query_set_id=query_set_id)
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert query_set_id in result[0]['text']
+        assert 'deleted' in result[0]['text']
+        self.mock_client.plugins.search_relevance.delete_query_sets.assert_called_once_with(
+            query_set_id=query_set_id
+        )
+
+    @pytest.mark.asyncio
+    async def test_delete_query_set_tool_error(self):
+        """Test error handling when deleting a query set fails."""
+        self.mock_client.plugins.search_relevance.delete_query_sets.side_effect = Exception(
+            'Query set not found'
+        )
+
+        result = await self._delete_query_set_tool(
+            self.DeleteQuerySetArgs(opensearch_cluster_name='', query_set_id='missing-id')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Error deleting query set' in result[0]['text']
+        assert 'Query set not found' in result[0]['text']
+
+    @pytest.mark.asyncio
+    async def test_query_set_tools_registered_in_registry(self):
+        """Test that all query set tools are registered in the TOOL_REGISTRY."""
+        import sys
+        for module in ['tools.tools']:
+            if module in sys.modules:
+                del sys.modules[module]
+
+        from tools.tools import TOOL_REGISTRY
+
+        assert 'GetQuerySetTool' in TOOL_REGISTRY
+        assert 'CreateQuerySetTool' in TOOL_REGISTRY
+        assert 'SampleQuerySetTool' in TOOL_REGISTRY
+        assert 'DeleteQuerySetTool' in TOOL_REGISTRY
+
+        for tool_name in ['GetQuerySetTool', 'CreateQuerySetTool', 'SampleQuerySetTool', 'DeleteQuerySetTool']:
+            tool = TOOL_REGISTRY[tool_name]
+            assert 'description' in tool
+            assert 'input_schema' in tool
+            assert 'function' in tool
+            assert 'args_model' in tool
+            assert tool.get('min_version') == '3.1.0'
diff --git a/tests/tools/test_srw_search_tools.py b/tests/tools/test_srw_search_tools.py
new file mode 100644
index 0000000..e946807
--- /dev/null
+++ b/tests/tools/test_srw_search_tools.py
@@ -0,0 +1,275 @@
+# Copyright OpenSearch Contributors
+# SPDX-License-Identifier: Apache-2.0
+
+import json
+import pytest
+from unittest.mock import Mock, patch, AsyncMock
+
+
+class TestSRWSearchTools:
+    def setup_method(self):
+        """Setup that runs before each test method."""
+        self.mock_client = Mock()
+        self.mock_client.info = AsyncMock(return_value={'version': {'number': '3.5.0'}})
+        self.mock_client.transport = Mock()
+        self.mock_client.transport.perform_request = AsyncMock(return_value={})
+
+        self.init_client_patcher = patch(
+            'opensearch.client.initialize_client', return_value=self.mock_client
+        )
+        self.init_client_patcher.start()
+
+        import sys
+        for module in ['tools.tools']:
+            if module in sys.modules:
+                del sys.modules[module]
+
+        from tools.tools import (
+            SearchQuerySetsArgs,
+            SearchSearchConfigurationsArgs,
+            SearchJudgmentsArgs,
+            SearchExperimentsArgs,
+            search_query_sets_tool,
+            search_search_configurations_tool,
+            search_judgments_tool,
+            search_experiments_tool,
+        )
+
+        self.SearchQuerySetsArgs = SearchQuerySetsArgs
+        self.SearchSearchConfigurationsArgs = SearchSearchConfigurationsArgs
+        self.SearchJudgmentsArgs = SearchJudgmentsArgs
+        self.SearchExperimentsArgs = SearchExperimentsArgs
+        self._search_query_sets_tool = search_query_sets_tool
+        self._search_search_configurations_tool = search_search_configurations_tool
+        self._search_judgments_tool = search_judgments_tool
+        self._search_experiments_tool = search_experiments_tool
+
+    def teardown_method(self):
+        """Cleanup after each test method."""
+        self.init_client_patcher.stop()
+
+    # --- SearchQuerySetsTool ---
+
+    @pytest.mark.asyncio
+    async def test_search_query_sets_default_query(self):
+        """Test that SearchQuerySetsTool uses match_all when no query body is provided."""
+        mock_response = {'hits': {'total': {'value': 2}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        result = await self._search_query_sets_tool(
+            self.SearchQuerySetsArgs(opensearch_cluster_name='')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Query set search results' in result[0]['text']
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        assert call_kwargs.kwargs['method'] == 'POST'
+        assert call_kwargs.kwargs['url'] == '/_plugins/_search_relevance/query_sets/_search'
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == {'query': {'match_all': {}}}
+
+    @pytest.mark.asyncio
+    async def test_search_query_sets_custom_query(self):
+        """Test SearchQuerySetsTool with a custom query DSL body."""
+        mock_response = {'hits': {'total': {'value': 1}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        query_body = {'query': {'match': {'name': 'my-set'}}, 'size': 5}
+        result = await self._search_query_sets_tool(
+            self.SearchQuerySetsArgs(opensearch_cluster_name='', query_body=query_body)
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Query set search results' in result[0]['text']
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == query_body
+
+    @pytest.mark.asyncio
+    async def test_search_query_sets_error(self):
+        """Test error handling for SearchQuerySetsTool."""
+        self.mock_client.transport.perform_request.side_effect = Exception('Connection refused')
+
+        result = await self._search_query_sets_tool(
+            self.SearchQuerySetsArgs(opensearch_cluster_name='')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Error searching query sets' in result[0]['text']
+        assert 'Connection refused' in result[0]['text']
+
+    # --- SearchSearchConfigurationsTool ---
+
+    @pytest.mark.asyncio
+    async def test_search_search_configurations_default_query(self):
+        """Test that SearchSearchConfigurationsTool uses match_all when no query body is provided."""
+        mock_response = {'hits': {'total': {'value': 3}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        result = await self._search_search_configurations_tool(
+            self.SearchSearchConfigurationsArgs(opensearch_cluster_name='')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Search configuration search results' in result[0]['text']
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        assert call_kwargs.kwargs['url'] == '/_plugins/_search_relevance/search_configurations/_search'
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == {'query': {'match_all': {}}}
+
+    @pytest.mark.asyncio
+    async def test_search_search_configurations_custom_query(self):
+        """Test SearchSearchConfigurationsTool with a custom query DSL body."""
+        mock_response = {'hits': {'total': {'value': 1}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        query_body = {'query': {'match': {'name': 'bm25'}}, 'size': 10}
+        result = await self._search_search_configurations_tool(
+            self.SearchSearchConfigurationsArgs(opensearch_cluster_name='', query_body=query_body)
+        )
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == query_body
+
+    @pytest.mark.asyncio
+    async def test_search_search_configurations_error(self):
+        """Test error handling for SearchSearchConfigurationsTool."""
+        self.mock_client.transport.perform_request.side_effect = Exception('Not found')
+
+        result = await self._search_search_configurations_tool(
+            self.SearchSearchConfigurationsArgs(opensearch_cluster_name='')
+        )
+
+        assert 'Error searching search configurations' in result[0]['text']
+        assert 'Not found' in result[0]['text']
+
+    # --- SearchJudgmentsTool ---
+
+    @pytest.mark.asyncio
+    async def test_search_judgments_default_query(self):
+        """Test that SearchJudgmentsTool uses match_all when no query body is provided."""
+        mock_response = {'hits': {'total': {'value': 5}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        result = await self._search_judgments_tool(
+            self.SearchJudgmentsArgs(opensearch_cluster_name='')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Judgment search results' in result[0]['text']
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        assert call_kwargs.kwargs['url'] == '/_plugins/_search_relevance/judgments/_search'
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == {'query': {'match_all': {}}}
+
+    @pytest.mark.asyncio
+    async def test_search_judgments_custom_query(self):
+        """Test SearchJudgmentsTool with a custom query DSL body."""
+        mock_response = {'hits': {'total': {'value': 1}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        query_body = {'query': {'match': {'name': 'my-judgments'}}}
+        result = await self._search_judgments_tool(
+            self.SearchJudgmentsArgs(opensearch_cluster_name='', query_body=query_body)
+        )
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == query_body
+
+    @pytest.mark.asyncio
+    async def test_search_judgments_error(self):
+        """Test error handling for SearchJudgmentsTool."""
+        self.mock_client.transport.perform_request.side_effect = Exception('Timeout')
+
+        result = await self._search_judgments_tool(
+            self.SearchJudgmentsArgs(opensearch_cluster_name='')
+        )
+
+        assert 'Error searching judgments' in result[0]['text']
+        assert 'Timeout' in result[0]['text']
+
+    # --- SearchExperimentsTool ---
+
+    @pytest.mark.asyncio
+    async def test_search_experiments_default_query(self):
+        """Test that SearchExperimentsTool uses match_all when no query body is provided."""
+        mock_response = {'hits': {'total': {'value': 2}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        result = await self._search_experiments_tool(
+            self.SearchExperimentsArgs(opensearch_cluster_name='')
+        )
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Experiment search results' in result[0]['text']
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        assert call_kwargs.kwargs['url'] == '/_plugins/_search_relevance/experiment/_search'
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == {'query': {'match_all': {}}}
+
+    @pytest.mark.asyncio
+    async def test_search_experiments_custom_query(self):
+        """Test SearchExperimentsTool with a custom query DSL body."""
+        mock_response = {'hits': {'total': {'value': 1}, 'hits': []}}
+        self.mock_client.transport.perform_request.return_value = mock_response
+
+        query_body = {'query': {'term': {'type.keyword': 'PAIRWISE_COMPARISON'}}, 'size': 10}
+        result = await self._search_experiments_tool(
+            self.SearchExperimentsArgs(opensearch_cluster_name='', query_body=query_body)
+        )
+
+        call_kwargs = self.mock_client.transport.perform_request.call_args
+        body = json.loads(call_kwargs.kwargs['body'])
+        assert body == query_body
+
+    @pytest.mark.asyncio
+    async def test_search_experiments_error(self):
+        """Test error handling for SearchExperimentsTool."""
+        self.mock_client.transport.perform_request.side_effect = Exception('Server error')
+
+        result = await self._search_experiments_tool(
+            self.SearchExperimentsArgs(opensearch_cluster_name='')
+        )
+
+        assert 'Error searching experiments' in result[0]['text']
+        assert 'Server error' in result[0]['text']
+
+    # --- Registry ---
+
+    @pytest.mark.asyncio
+    async def test_srw_search_tools_registered_in_registry(self):
+        """Test that all SRW search tools are registered in the TOOL_REGISTRY."""
+        import sys
+        for module in ['tools.tools']:
+            if module in sys.modules:
+                del sys.modules[module]
+
+        from tools.tools import TOOL_REGISTRY
+
+        for tool_name in [
+            'SearchQuerySetsTool',
+            'SearchSearchConfigurationsTool',
+            'SearchJudgmentsTool',
+            'SearchExperimentsTool',
+        ]:
+            assert tool_name in TOOL_REGISTRY
+            tool = TOOL_REGISTRY[tool_name]
+            assert 'description' in tool
+            assert 'input_schema' in tool
+            assert 'function' in tool
+            assert 'args_model' in tool
+            assert tool.get('min_version') == '3.5.0'
+            assert 'GET' in tool.get('http_methods', '')
diff --git a/tests/tools/test_tool_filters.py b/tests/tools/test_tool_filters.py
index f0422ca..74ea4be 100644
--- a/tests/tools/test_tool_filters.py
+++ b/tests/tools/test_tool_filters.py
@@ -418,6 +418,135 @@ def test_disable_core_tools(self):
         assert 'ClusterHealthTool' not in self.tool_registry
         assert 'ExplainTool' not in self.tool_registry
 
+    def test_search_relevance_category_is_not_enabled_by_default(self):
+        """search_relevance tools are not enabled unless the category is explicitly enabled."""
+        registry = {
+            'ListIndexTool': {'display_name': 'ListIndexTool', 'http_methods': 'GET'},
+            'CreateSearchConfigurationTool': {
+                'display_name': 'CreateSearchConfigurationTool',
+                'http_methods': 'PUT',
+            },
+            'GetSearchConfigurationTool': {
+                'display_name': 'GetSearchConfigurationTool',
+                'http_methods': 'GET',
+            },
+            'DeleteSearchConfigurationTool': {
+                'display_name': 'DeleteSearchConfigurationTool',
+                'http_methods': 'DELETE',
+            },
+            'GetQuerySetTool': {'display_name': 'GetQuerySetTool', 'http_methods': 'GET'},
+            'CreateQuerySetTool': {'display_name': 'CreateQuerySetTool', 'http_methods': 'PUT'},
+            'SampleQuerySetTool': {'display_name': 'SampleQuerySetTool', 'http_methods': 'POST'},
+            'DeleteQuerySetTool': {'display_name': 'DeleteQuerySetTool', 'http_methods': 'DELETE'},
+            'GetJudgmentListTool': {'display_name': 'GetJudgmentListTool', 'http_methods': 'GET'},
+            'CreateJudgmentListTool': {'display_name': 'CreateJudgmentListTool', 'http_methods': 'PUT'},
+            'CreateUBIJudgmentListTool': {
+                'display_name': 'CreateUBIJudgmentListTool',
+                'http_methods': 'PUT',
+            },
+            'CreateLLMJudgmentListTool': {
+                'display_name': 'CreateLLMJudgmentListTool',
+                'http_methods': 'PUT',
+            },
+            'DeleteJudgmentListTool': {'display_name': 'DeleteJudgmentListTool', 'http_methods': 'DELETE'},
+            'GetExperimentTool': {'display_name': 'GetExperimentTool', 'http_methods': 'GET'},
+            'CreateExperimentTool': {'display_name': 'CreateExperimentTool', 'http_methods': 'PUT'},
+            'DeleteExperimentTool': {'display_name': 'DeleteExperimentTool', 'http_methods': 'DELETE'},
+            'SearchQuerySetsTool': {'display_name': 'SearchQuerySetsTool', 'http_methods': 'GET, POST'},
+            'SearchSearchConfigurationsTool': {'display_name': 'SearchSearchConfigurationsTool', 'http_methods': 'GET, POST'},
+            'SearchJudgmentsTool': {'display_name': 'SearchJudgmentsTool', 'http_methods': 'GET, POST'},
+            'SearchExperimentsTool': {'display_name': 'SearchExperimentsTool', 'http_methods': 'GET, POST'},
+        }
+        process_tool_filter(tool_registry=registry, allow_write=True)
+
+        # core_tools are enabled by default, search_relevance tools are not
+        assert 'ListIndexTool' in registry
+        assert 'CreateSearchConfigurationTool' not in registry
+        assert 'GetSearchConfigurationTool' not in registry
+        assert 'DeleteSearchConfigurationTool' not in registry
+        assert 'GetQuerySetTool' not in registry
+        assert 'CreateQuerySetTool' not in registry
+        assert 'SampleQuerySetTool' not in registry
+        assert 'DeleteQuerySetTool' not in registry
+        assert 'GetJudgmentListTool' not in registry
+        assert 'CreateJudgmentListTool' not in registry
+        assert 'CreateUBIJudgmentListTool' not in registry
+        assert 'CreateLLMJudgmentListTool' not in registry
+        assert 'DeleteJudgmentListTool' not in registry
+        assert 'GetExperimentTool' not in registry
+        assert 'CreateExperimentTool' not in registry
+        assert 'DeleteExperimentTool' not in registry
+        assert 'SearchQuerySetsTool' not in registry
+        assert 'SearchSearchConfigurationsTool' not in registry
+        assert 'SearchJudgmentsTool' not in registry
+        assert 'SearchExperimentsTool' not in registry
+
+    def test_search_relevance_category_can_be_enabled(self):
+        """search_relevance tools are exposed when the category is explicitly enabled."""
+        registry = {
+            'ListIndexTool': {'display_name': 'ListIndexTool', 'http_methods': 'GET'},
+            'CreateSearchConfigurationTool': {
+                'display_name': 'CreateSearchConfigurationTool',
+                'http_methods': 'PUT',
+            },
+            'GetSearchConfigurationTool': {
+                'display_name': 'GetSearchConfigurationTool',
+                'http_methods': 'GET',
+            },
+            'DeleteSearchConfigurationTool': {
+                'display_name': 'DeleteSearchConfigurationTool',
+                'http_methods': 'DELETE',
+            },
+            'GetQuerySetTool': {'display_name': 'GetQuerySetTool', 'http_methods': 'GET'},
+            'CreateQuerySetTool': {'display_name': 'CreateQuerySetTool', 'http_methods': 'PUT'},
+            'SampleQuerySetTool': {'display_name': 'SampleQuerySetTool', 'http_methods': 'POST'},
+            'DeleteQuerySetTool': {'display_name': 'DeleteQuerySetTool', 'http_methods': 'DELETE'},
+            'GetJudgmentListTool': {'display_name': 'GetJudgmentListTool', 'http_methods': 'GET'},
+            'CreateJudgmentListTool': {'display_name': 'CreateJudgmentListTool', 'http_methods': 'PUT'},
+            'CreateUBIJudgmentListTool': {
+                'display_name': 'CreateUBIJudgmentListTool',
+                'http_methods': 'PUT',
+            },
+            'CreateLLMJudgmentListTool': {
+                'display_name': 'CreateLLMJudgmentListTool',
+                'http_methods': 'PUT',
+            },
+            'DeleteJudgmentListTool': {'display_name': 'DeleteJudgmentListTool', 'http_methods': 'DELETE'},
+            'GetExperimentTool': {'display_name': 'GetExperimentTool', 'http_methods': 'GET'},
+            'CreateExperimentTool': {'display_name': 'CreateExperimentTool', 'http_methods': 'PUT'},
+            'DeleteExperimentTool': {'display_name': 'DeleteExperimentTool', 'http_methods': 'DELETE'},
+            'SearchQuerySetsTool': {'display_name': 'SearchQuerySetsTool', 'http_methods': 'GET, POST'},
+            'SearchSearchConfigurationsTool': {'display_name': 'SearchSearchConfigurationsTool', 'http_methods': 'GET, POST'},
+            'SearchJudgmentsTool': {'display_name': 'SearchJudgmentsTool', 'http_methods': 'GET, POST'},
+            'SearchExperimentsTool': {'display_name': 'SearchExperimentsTool', 'http_methods': 'GET, POST'},
+        }
+        process_tool_filter(
+            tool_registry=registry,
+            enabled_categories='core_tools,search_relevance',
+            allow_write=True,
+        )
+
+        assert 'ListIndexTool' in registry
+        assert 'CreateSearchConfigurationTool' in registry
+        assert 'GetSearchConfigurationTool' in registry
+        assert 'DeleteSearchConfigurationTool' in registry
+        assert 'GetQuerySetTool' in registry
+        assert 'CreateQuerySetTool' in registry
+        assert 'SampleQuerySetTool' in registry
+        assert 'DeleteQuerySetTool' in registry
+        assert 'GetJudgmentListTool' in registry
+        assert 'CreateJudgmentListTool' in registry
+        assert 'CreateUBIJudgmentListTool' in registry
+        assert 'CreateLLMJudgmentListTool' in registry
+        assert 'DeleteJudgmentListTool' in registry
+        assert 'GetExperimentTool' in registry
+        assert 'CreateExperimentTool' in registry
+        assert 'DeleteExperimentTool' in registry
+        assert 'SearchQuerySetsTool' in registry
+        assert 'SearchSearchConfigurationsTool' in registry
+        assert 'SearchJudgmentsTool' in registry
+        assert 'SearchExperimentsTool' in registry
+
 
 class TestAllowWriteSettings:
     """Test cases for the allow_write setting functionality."""
diff --git a/tests/tools/test_tool_logging.py b/tests/tools/test_tool_logging.py
new file mode 100644
index 0000000..db7560f
--- /dev/null
+++ b/tests/tools/test_tool_logging.py
@@ -0,0 +1,147 @@
+# Copyright OpenSearch Contributors
+# SPDX-License-Identifier: Apache-2.0
+
+import logging
+import pytest
+
+from tools.tool_logging import log_tool_error
+
+
+class TestLogToolError:
+    def test_returns_mcp_error_format(self):
+        exc = Exception('something broke')
+        result = log_tool_error('TestTool', exc, 'doing stuff')
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert result[0]['text'] == 'Error doing stuff: something broke'
+        assert result[0]['is_error'] is True
+
+    def test_returns_error_without_operation(self):
+        exc = Exception('fail')
+        result = log_tool_error('TestTool', exc)
+
+        assert result[0]['text'] == 'Error: fail'
+
+    def test_emits_structured_log(self, caplog):
+        exc = Exception('test error')
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('SearchIndexTool', exc, 'searching index')
+
+        records = [
+            r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_error'
+        ]
+        assert len(records) == 1
+
+        record = records[0]
+        assert record.tool_name == 'SearchIndexTool'
+        assert record.exception_type == 'Exception'
+        assert record.status == 'error'
+
+    def test_extracts_status_code_from_transport_error(self, caplog):
+        # Simulate an opensearchpy TransportError, which has a status_code attribute
+        exc = Exception('index_not_found_exception')
+        exc.status_code = 404
+
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('SearchIndexTool', exc, 'searching index')
+
+        records = [r for r in caplog.records if hasattr(r, 'status_code')]
+        assert len(records) == 1
+        assert records[0].status_code == 404
+
+    def test_extracts_root_cause_from_info(self, caplog):
+        exc = Exception('search error')
+        exc.status_code = 400
+        exc.info = {
+            'error': {
+                'root_cause': [{'type': 'query_shard_exception', 'reason': 'bad query'}],
+                'type': 'search_phase_execution_exception',
+            }
+        }
+
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('SearchIndexTool', exc, 'searching index')
+
+        records = [r for r in caplog.records if hasattr(r, 'root_cause')]
+        assert len(records) == 1
+        assert records[0].root_cause == 'query_shard_exception'
+
+    def test_extracts_root_cause_from_string_info(self, caplog):
+        """When exception.info is a JSON string (fallback path), root_cause should still be extracted."""
+        exc = Exception('search error')
+        exc.status_code = 400
+        exc.info = '{"error":{"root_cause":[{"type":"parsing_exception","reason":"unknown query"}],"type":"search_phase_execution_exception"},"status":400}'
+
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('SearchIndexTool', exc, 'searching index')
+
+        records = [r for r in caplog.records if hasattr(r, 'root_cause')]
+        assert len(records) == 1
+        assert records[0].root_cause == 'parsing_exception'
+
+    def test_extracts_root_cause_from_error_attr(self, caplog):
+        """Async opensearchpy stores response body in exception.error, not exception.info."""
+        exc = Exception('not found')
+        exc.status_code = 404
+        exc.info = None
+        exc.error = '{"error":{"root_cause":[{"type":"index_not_found_exception","reason":"no such index"}],"type":"index_not_found_exception"},"status":404}'
+
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('CountTool', exc, 'executing CountTool')
+
+        records = [r for r in caplog.records if hasattr(r, 'root_cause')]
+        assert len(records) == 1
+        assert records[0].root_cause == 'index_not_found_exception'
+
+    def test_string_info_not_json_is_ignored(self, caplog):
+        """When exception.info is a non-JSON string, root_cause should be absent."""
+        exc = Exception('error')
+        exc.status_code = 400
+        exc.info = 'plain text error body'
+
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('SearchIndexTool', exc, 'searching index')
+
+        records = [
+            r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_error'
+        ]
+        assert len(records) == 1
+        assert not hasattr(records[0], 'root_cause')
+
+    def test_context_kwargs_included_in_log(self, caplog):
+        exc = Exception('error')
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('SearchIndexTool', exc, 'searching', index='my-index', method='GET')
+
+        records = [
+            r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_error'
+        ]
+        assert len(records) == 1
+        assert records[0].index == 'my-index'
+        assert records[0].method == 'GET'
+
+    def test_none_context_values_excluded(self, caplog):
+        exc = Exception('error')
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('TestTool', exc, 'doing stuff', index=None, other='value')
+
+        records = [
+            r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_error'
+        ]
+        assert len(records) == 1
+        assert not hasattr(records[0], 'index')
+        assert records[0].other == 'value'
+
+    def test_no_status_code_attribute(self, caplog):
+        """When exception has no status_code, the field should not appear in the log."""
+        exc = ValueError('plain error')
+        with caplog.at_level(logging.ERROR):
+            log_tool_error('TestTool', exc, 'doing stuff')
+
+        records = [
+            r for r in caplog.records if hasattr(r, 'event_type') and r.event_type == 'tool_error'
+        ]
+        assert len(records) == 1
+        assert not hasattr(records[0], 'status_code')
+        assert records[0].exception_type == 'ValueError'
diff --git a/tests/tools/test_tools.py b/tests/tools/test_tools.py
index 61b4bf9..a588845 100644
--- a/tests/tools/test_tools.py
+++ b/tests/tools/test_tools.py
@@ -3,7 +3,7 @@
 import json
 import pytest
-from unittest.mock import Mock, patch, AsyncMock
+from unittest.mock import AsyncMock, Mock, patch
 
 
 class TestTools:
@@ -27,6 +27,17 @@ def setup_method(self):
         self.mock_client.indices.stats = AsyncMock(return_value={})
         self.mock_client.transport.perform_request = AsyncMock(return_value={})
         self.mock_client.info = AsyncMock(return_value={'version': {'number': '2.19.0'}})
+        self.mock_client.plugins = Mock()
+        self.mock_client.plugins.search_relevance = Mock()
+        self.mock_client.plugins.search_relevance.put_search_configurations = AsyncMock(
+            return_value={}
+        )
+        self.mock_client.plugins.search_relevance.get_search_configurations = AsyncMock(
+            return_value={}
+        )
+        self.mock_client.plugins.search_relevance.delete_search_configurations = AsyncMock(
+            return_value={}
+        )
 
         # Patch initialize_client to always return our mock client
         self.init_client_patcher = patch(
@@ -45,23 +56,33 @@ def setup_method(self):
             del sys.modules[module]
 
         # Import after patching to ensure fresh imports
+
         from tools.tools import (
             TOOL_REGISTRY,
-            GetIndexMappingArgs,
-            GetShardsArgs,
-            ListIndicesArgs,
-            SearchIndexArgs,
-            GetClusterStateArgs,
-            GetSegmentsArgs,
             CatNodesArgs,
-            GetNodesArgs,
+            GetAllocationArgs,
+            GetClusterStateArgs,
             GetIndexInfoArgs,
+            GetIndexMappingArgs,
             GetIndexStatsArgs,
-            GetQueryInsightsArgs,
-            GetNodesHotThreadsArgs,
-            GetAllocationArgs,
             GetLongRunningTasksArgs,
+            GetNodesArgs,
+            GetNodesHotThreadsArgs,
+            GetQueryInsightsArgs,
+            GetSegmentsArgs,
+            GetShardsArgs,
+            ListIndicesArgs,
+            SearchIndexArgs,
+            CreateSearchConfigurationArgs,
+            GetSearchConfigurationArgs,
+            DeleteSearchConfigurationArgs,
             get_index_mapping_tool,
+            get_index_stats_tool,
+            get_long_running_tasks_tool,
+            get_nodes_hot_threads_tool,
+            get_nodes_tool,
+            get_query_insights_tool,
+            get_segments_tool,
             get_shards_tool,
             list_indices_tool,
             search_index_tool,
@@ -75,6 +96,9 @@ def setup_method(self):
             get_nodes_hot_threads_tool,
             get_allocation_tool,
             get_long_running_tasks_tool,
+            create_search_configuration_tool,
+            get_search_configuration_tool,
+            delete_search_configuration_tool,
         )
 
         self.ListIndicesArgs = ListIndicesArgs
@@ -106,6 +130,12 @@ def setup_method(self):
         self._get_nodes_hot_threads_tool = get_nodes_hot_threads_tool
         self._get_allocation_tool = get_allocation_tool
         self._get_long_running_tasks_tool = get_long_running_tasks_tool
+        self.CreateSearchConfigurationArgs = CreateSearchConfigurationArgs
+        self.GetSearchConfigurationArgs = GetSearchConfigurationArgs
+        self.DeleteSearchConfigurationArgs = DeleteSearchConfigurationArgs
+        self._create_search_configuration_tool = create_search_configuration_tool
+        self._get_search_configuration_tool = get_search_configuration_tool
+        self._delete_search_configuration_tool = delete_search_configuration_tool
 
     def teardown_method(self):
         """Cleanup after each test method."""
@@ -298,7 +328,7 @@ async def test_search_index_tool(self):
         self.mock_client.search.return_value = mock_results
         # Execute
         args = self.SearchIndexArgs(
-            index='test-index', query={'match_all': {}}, opensearch_cluster_name=''
+            index='test-index', query_dsl={'match_all': {}}, opensearch_cluster_name=''
         )
         result = await self._search_index_tool(args)
         # Assert
@@ -318,7 +348,7 @@ async def test_search_index_tool_error(self):
         self.mock_client.search.side_effect = Exception('Test error')
         # Execute
         args = self.SearchIndexArgs(
-            index='test-index', query={'match_all': {}}, opensearch_cluster_name=''
+            index='test-index', query_dsl={'match_all': {}}, opensearch_cluster_name=''
         )
         result = await self._search_index_tool(args)
         # Assert
@@ -1158,6 +1188,133 @@ async def test_get_nodes_tool_error(self):
             method='GET', url='/_nodes'
         )
 
+    @pytest.mark.asyncio
+    async def test_create_search_configuration_tool(self):
+        """Test successful create_search_configuration_tool call."""
+        self.mock_client.info.return_value = {'version': {'number': '3.1.0'}}
+        mock_response = {'_id': 'cfg-1', 'result': 'created'}
+        self.mock_client.plugins.search_relevance.put_search_configurations.return_value = (
+            mock_response
+        )
+
+        args = self.CreateSearchConfigurationArgs(
+            name='my-config',
+            index='my-index',
+            query='{"query":{"match":{"title":"%SearchText%"}}}',
+            opensearch_cluster_name='',
+        )
+        result = await self._create_search_configuration_tool(args)
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Search configuration created' in result[0]['text']
+        assert '"_id":"cfg-1"' in result[0]['text']
+        self.mock_client.plugins.search_relevance.put_search_configurations.assert_called_once_with(
+            body={
+                'name': 'my-config',
+                'index': 'my-index',
+                'query': '{"query":{"match":{"title":"%SearchText%"}}}',
+            }
+        )
+
+    @pytest.mark.asyncio
+    async def test_create_search_configuration_tool_error(self):
+        """Test create_search_configuration_tool exception handling."""
+        self.mock_client.info.return_value = {'version': {'number': '3.1.0'}}
+        self.mock_client.plugins.search_relevance.put_search_configurations.side_effect = (
+            Exception('Test error')
+        )
+
+        args = self.CreateSearchConfigurationArgs(
+            name='my-config',
+            index='my-index',
+            query='{"query":{"match_all":{}}}',
+            opensearch_cluster_name='',
+        )
+        result = await self._create_search_configuration_tool(args)
+
+        assert len(result) == 1
+        assert result[0]['type'] == 'text'
+        assert 'Error creating search configuration: Test error' in result[0]['text']
+
+    @pytest.mark.asyncio
+    async def test_get_search_configuration_tool(self):
+        """Test successful get_search_configuration_tool call."""
+        self.mock_client.info.return_value = {'version': {'number': '3.1.0'}}
+        mock_response = {'_id': 'cfg-1', '_source': {'name': 'my-config', 'index': 'my-index'}}
+        self.mock_client.plugins.search_relevance.get_search_configurations.return_value = 
( + mock_response + ) + + args = self.GetSearchConfigurationArgs( + search_configuration_id='cfg-1', opensearch_cluster_name='' + ) + result = await self._get_search_configuration_tool(args) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Search configuration cfg-1' in result[0]['text'] + assert '"_id":"cfg-1"' in result[0]['text'] + self.mock_client.plugins.search_relevance.get_search_configurations.assert_called_once_with( + search_configuration_id='cfg-1' + ) + + @pytest.mark.asyncio + async def test_get_search_configuration_tool_error(self): + """Test get_search_configuration_tool exception handling.""" + self.mock_client.info.return_value = {'version': {'number': '3.1.0'}} + self.mock_client.plugins.search_relevance.get_search_configurations.side_effect = ( + Exception('Not found') + ) + + args = self.GetSearchConfigurationArgs( + search_configuration_id='cfg-1', opensearch_cluster_name='' + ) + result = await self._get_search_configuration_tool(args) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error retrieving search configuration: Not found' in result[0]['text'] + + @pytest.mark.asyncio + async def test_delete_search_configuration_tool(self): + """Test delete_search_configuration_tool successful.""" + self.mock_client.info.return_value = {'version': {'number': '3.1.0'}} + mock_response = {'result': 'deleted'} + self.mock_client.plugins.search_relevance.delete_search_configurations.return_value = ( + mock_response + ) + + args = self.DeleteSearchConfigurationArgs( + search_configuration_id='cfg-1', opensearch_cluster_name='' + ) + result = await self._delete_search_configuration_tool(args) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Search configuration cfg-1 deleted' in result[0]['text'] + assert '"result":"deleted"' in result[0]['text'] + self.mock_client.plugins.search_relevance.delete_search_configurations.assert_called_once_with( + search_configuration_id='cfg-1' + ) 
+ + @pytest.mark.asyncio + async def test_delete_search_configuration_tool_error(self): + """Test delete_search_configuration_tool exception handling.""" + self.mock_client.info.return_value = {'version': {'number': '3.1.0'}} + self.mock_client.plugins.search_relevance.delete_search_configurations.side_effect = ( + Exception('Test error') + ) + + args = self.DeleteSearchConfigurationArgs( + search_configuration_id='cfg-1', opensearch_cluster_name='' + ) + result = await self._delete_search_configuration_tool(args) + + assert len(result) == 1 + assert result[0]['type'] == 'text' + assert 'Error deleting search configuration: Test error' in result[0]['text'] + def test_tool_registry(self): """Test TOOL_REGISTRY structure.""" expected_tools = [ @@ -1175,6 +1332,9 @@ def test_tool_registry(self): 'GetNodesHotThreadsTool', 'GetAllocationTool', 'GetLongRunningTasksTool', + 'CreateSearchConfigurationTool', + 'GetSearchConfigurationTool', + 'DeleteSearchConfigurationTool', ] for tool in expected_tools: @@ -1198,7 +1358,7 @@ def test_input_models(self): assert self.GetIndexMappingArgs(index='test', opensearch_cluster_name='').index == 'test' assert ( self.SearchIndexArgs( - index='test', query={'match': {}}, opensearch_cluster_name='' + index='test', query_dsl={'match': {}}, opensearch_cluster_name='' ).index == 'test' ) diff --git a/uv.lock b/uv.lock index 3137a7f..f695fe3 100644 --- a/uv.lock +++ b/uv.lock @@ -13,7 +13,7 @@ wheels = [ [[package]] name = "aiohttp" -version = "3.11.18" +version = "3.13.3" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiohappyeyeballs" }, @@ -25,84 +25,132 @@ dependencies = [ { name = "propcache" }, { name = "yarl" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/63/e7/fa1a8c00e2c54b05dc8cb5d1439f627f7c267874e3f7bb047146116020f9/aiohttp-3.11.18.tar.gz", hash = "sha256:ae856e1138612b7e412db63b7708735cff4d38d0399f6a5435d3dac2669f558a", size = 7678653, upload_time = "2025-04-21T09:43:09.191Z" } 
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/c7/c3/e5f64af7e97a02f547020e6ff861595766bb5ecb37c7492fac9fe3c14f6c/aiohttp-3.11.18-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:96264854fedbea933a9ca4b7e0c745728f01380691687b7365d18d9e977179c4", size = 711703, upload_time = "2025-04-21T09:40:25.487Z" },
-    { url = "https://files.pythonhosted.org/packages/5f/2f/53c26e96efa5fd01ebcfe1fefdfb7811f482bb21f4fa103d85eca4dcf888/aiohttp-3.11.18-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9602044ff047043430452bc3a2089743fa85da829e6fc9ee0025351d66c332b6", size = 471348, upload_time = "2025-04-21T09:40:27.569Z" },
-    { url = "https://files.pythonhosted.org/packages/80/47/dcc248464c9b101532ee7d254a46f6ed2c1fd3f4f0f794cf1f2358c0d45b/aiohttp-3.11.18-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5691dc38750fcb96a33ceef89642f139aa315c8a193bbd42a0c33476fd4a1609", size = 457611, upload_time = "2025-04-21T09:40:28.978Z" },
-    { url = "https://files.pythonhosted.org/packages/4c/ca/67d816ef075e8ac834b5f1f6b18e8db7d170f7aebaf76f1be462ea10cab0/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:554c918ec43f8480b47a5ca758e10e793bd7410b83701676a4782672d670da55", size = 1591976, upload_time = "2025-04-21T09:40:30.804Z" },
-    { url = "https://files.pythonhosted.org/packages/46/00/0c120287aa51c744438d99e9aae9f8c55ca5b9911c42706966c91c9d68d6/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a4076a2b3ba5b004b8cffca6afe18a3b2c5c9ef679b4d1e9859cf76295f8d4f", size = 1632819, upload_time = "2025-04-21T09:40:32.731Z" },
-    { url = "https://files.pythonhosted.org/packages/54/a3/3923c9040cd4927dfee1aa017513701e35adcfc35d10729909688ecaa465/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:767a97e6900edd11c762be96d82d13a1d7c4fc4b329f054e88b57cdc21fded94", size = 1666567, upload_time = "2025-04-21T09:40:34.901Z" },
-    { url = "https://files.pythonhosted.org/packages/e0/ab/40dacb15c0c58f7f17686ea67bc186e9f207341691bdb777d1d5ff4671d5/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f0ddc9337a0fb0e727785ad4f41163cc314376e82b31846d3835673786420ef1", size = 1594959, upload_time = "2025-04-21T09:40:36.714Z" },
-    { url = "https://files.pythonhosted.org/packages/0d/98/d40c2b7c4a5483f9a16ef0adffce279ced3cc44522e84b6ba9e906be5168/aiohttp-3.11.18-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f414f37b244f2a97e79b98d48c5ff0789a0b4b4609b17d64fa81771ad780e415", size = 1538516, upload_time = "2025-04-21T09:40:38.263Z" },
-    { url = "https://files.pythonhosted.org/packages/cf/10/e0bf3a03524faac45a710daa034e6f1878b24a1fef9c968ac8eb786ae657/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fdb239f47328581e2ec7744ab5911f97afb10752332a6dd3d98e14e429e1a9e7", size = 1529037, upload_time = "2025-04-21T09:40:40.349Z" },
-    { url = "https://files.pythonhosted.org/packages/ad/d6/5ff5282e00e4eb59c857844984cbc5628f933e2320792e19f93aff518f52/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:f2c50bad73ed629cc326cc0f75aed8ecfb013f88c5af116f33df556ed47143eb", size = 1546813, upload_time = "2025-04-21T09:40:42.106Z" },
-    { url = "https://files.pythonhosted.org/packages/de/96/f1014f84101f9b9ad2d8acf3cc501426475f7f0cc62308ae5253e2fac9a7/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a8d8f20c39d3fa84d1c28cdb97f3111387e48209e224408e75f29c6f8e0861d", size = 1523852, upload_time = "2025-04-21T09:40:44.164Z" },
-    { url = "https://files.pythonhosted.org/packages/a5/86/ec772c6838dd6bae3229065af671891496ac1834b252f305cee8152584b2/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:106032eaf9e62fd6bc6578c8b9e6dc4f5ed9a5c1c7fb2231010a1b4304393421", size = 1603766, upload_time = "2025-04-21T09:40:46.203Z" },
-    { url = "https://files.pythonhosted.org/packages/84/38/31f85459c9402d409c1499284fc37a96f69afadce3cfac6a1b5ab048cbf1/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:b491e42183e8fcc9901d8dcd8ae644ff785590f1727f76ca86e731c61bfe6643", size = 1620647, upload_time = "2025-04-21T09:40:48.168Z" },
-    { url = "https://files.pythonhosted.org/packages/31/2f/54aba0040764dd3d362fb37bd6aae9b3034fcae0b27f51b8a34864e48209/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ad8c745ff9460a16b710e58e06a9dec11ebc0d8f4dd82091cefb579844d69868", size = 1559260, upload_time = "2025-04-21T09:40:50.219Z" },
-    { url = "https://files.pythonhosted.org/packages/ca/d2/a05c7dd9e1b6948c1c5d04f1a8bcfd7e131923fa809bb87477d5c76f1517/aiohttp-3.11.18-cp310-cp310-win32.whl", hash = "sha256:8e57da93e24303a883146510a434f0faf2f1e7e659f3041abc4e3fb3f6702a9f", size = 418051, upload_time = "2025-04-21T09:40:52.272Z" },
-    { url = "https://files.pythonhosted.org/packages/39/e2/796a6179e8abe267dfc84614a50291560a989d28acacbc5dab3bcd4cbec4/aiohttp-3.11.18-cp310-cp310-win_amd64.whl", hash = "sha256:cc93a4121d87d9f12739fc8fab0a95f78444e571ed63e40bfc78cd5abe700ac9", size = 442908, upload_time = "2025-04-21T09:40:54.345Z" },
-    { url = "https://files.pythonhosted.org/packages/2f/10/fd9ee4f9e042818c3c2390054c08ccd34556a3cb209d83285616434cf93e/aiohttp-3.11.18-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:427fdc56ccb6901ff8088544bde47084845ea81591deb16f957897f0f0ba1be9", size = 712088, upload_time = "2025-04-21T09:40:55.776Z" },
-    { url = "https://files.pythonhosted.org/packages/22/eb/6a77f055ca56f7aae2cd2a5607a3c9e7b9554f1497a069dcfcb52bfc9540/aiohttp-3.11.18-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c828b6d23b984255b85b9b04a5b963a74278b7356a7de84fda5e3b76866597b", size = 471450, upload_time = "2025-04-21T09:40:57.301Z" },
-    { url = "https://files.pythonhosted.org/packages/78/dc/5f3c0d27c91abf0bb5d103e9c9b0ff059f60cf6031a5f06f456c90731f42/aiohttp-3.11.18-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5c2eaa145bb36b33af1ff2860820ba0589e165be4ab63a49aebfd0981c173b66", size = 457836, upload_time = "2025-04-21T09:40:59.322Z" },
-    { url = "https://files.pythonhosted.org/packages/49/7b/55b65af9ef48b9b811c91ff8b5b9de9650c71147f10523e278d297750bc8/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d518ce32179f7e2096bf4e3e8438cf445f05fedd597f252de9f54c728574756", size = 1690978, upload_time = "2025-04-21T09:41:00.795Z" },
-    { url = "https://files.pythonhosted.org/packages/a2/5a/3f8938c4f68ae400152b42742653477fc625d6bfe02e764f3521321c8442/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0700055a6e05c2f4711011a44364020d7a10fbbcd02fbf3e30e8f7e7fddc8717", size = 1745307, upload_time = "2025-04-21T09:41:02.89Z" },
-    { url = "https://files.pythonhosted.org/packages/b4/42/89b694a293333ef6f771c62da022163bcf44fb03d4824372d88e3dc12530/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8bd1cde83e4684324e6ee19adfc25fd649d04078179890be7b29f76b501de8e4", size = 1780692, upload_time = "2025-04-21T09:41:04.461Z" },
-    { url = "https://files.pythonhosted.org/packages/e2/ce/1a75384e01dd1bf546898b6062b1b5f7a59b6692ef802e4dd6db64fed264/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73b8870fe1c9a201b8c0d12c94fe781b918664766728783241a79e0468427e4f", size = 1676934, upload_time = "2025-04-21T09:41:06.728Z" },
-    { url = "https://files.pythonhosted.org/packages/a5/31/442483276e6c368ab5169797d9873b5875213cbcf7e74b95ad1c5003098a/aiohttp-3.11.18-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25557982dd36b9e32c0a3357f30804e80790ec2c4d20ac6bcc598533e04c6361", size = 1621190, upload_time = "2025-04-21T09:41:08.293Z" },
-    { url = "https://files.pythonhosted.org/packages/7b/83/90274bf12c079457966008a58831a99675265b6a34b505243e004b408934/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7e889c9df381a2433802991288a61e5a19ceb4f61bd14f5c9fa165655dcb1fd1", size = 1658947, upload_time = "2025-04-21T09:41:11.054Z" },
-    { url = "https://files.pythonhosted.org/packages/91/c1/da9cee47a0350b78fdc93670ebe7ad74103011d7778ab4c382ca4883098d/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:9ea345fda05bae217b6cce2acf3682ce3b13d0d16dd47d0de7080e5e21362421", size = 1654443, upload_time = "2025-04-21T09:41:13.213Z" },
-    { url = "https://files.pythonhosted.org/packages/c9/f2/73cbe18dc25d624f79a09448adfc4972f82ed6088759ddcf783cd201956c/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9f26545b9940c4b46f0a9388fd04ee3ad7064c4017b5a334dd450f616396590e", size = 1644169, upload_time = "2025-04-21T09:41:14.827Z" },
-    { url = "https://files.pythonhosted.org/packages/5b/32/970b0a196c4dccb1b0cfa5b4dc3b20f63d76f1c608f41001a84b2fd23c3d/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:3a621d85e85dccabd700294494d7179ed1590b6d07a35709bb9bd608c7f5dd1d", size = 1728532, upload_time = "2025-04-21T09:41:17.168Z" },
-    { url = "https://files.pythonhosted.org/packages/0b/50/b1dc810a41918d2ea9574e74125eb053063bc5e14aba2d98966f7d734da0/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9c23fd8d08eb9c2af3faeedc8c56e134acdaf36e2117ee059d7defa655130e5f", size = 1750310, upload_time = "2025-04-21T09:41:19.353Z" },
-    { url = "https://files.pythonhosted.org/packages/95/24/39271f5990b35ff32179cc95537e92499d3791ae82af7dcf562be785cd15/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d9e6b0e519067caa4fd7fb72e3e8002d16a68e84e62e7291092a5433763dc0dd", size = 1691580, upload_time = "2025-04-21T09:41:21.868Z" },
-    { url = "https://files.pythonhosted.org/packages/6b/78/75d0353feb77f041460564f12fe58e456436bbc00cbbf5d676dbf0038cc2/aiohttp-3.11.18-cp311-cp311-win32.whl", hash = "sha256:122f3e739f6607e5e4c6a2f8562a6f476192a682a52bda8b4c6d4254e1138f4d", size = 417565, upload_time = "2025-04-21T09:41:24.78Z" },
-    { url = "https://files.pythonhosted.org/packages/ed/97/b912dcb654634a813f8518de359364dfc45976f822116e725dc80a688eee/aiohttp-3.11.18-cp311-cp311-win_amd64.whl", hash = "sha256:e6f3c0a3a1e73e88af384b2e8a0b9f4fb73245afd47589df2afcab6b638fa0e6", size = 443652, upload_time = "2025-04-21T09:41:26.48Z" },
-    { url = "https://files.pythonhosted.org/packages/b5/d2/5bc436f42bf4745c55f33e1e6a2d69e77075d3e768e3d1a34f96ee5298aa/aiohttp-3.11.18-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:63d71eceb9cad35d47d71f78edac41fcd01ff10cacaa64e473d1aec13fa02df2", size = 706671, upload_time = "2025-04-21T09:41:28.021Z" },
-    { url = "https://files.pythonhosted.org/packages/fe/d0/2dbabecc4e078c0474abb40536bbde717fb2e39962f41c5fc7a216b18ea7/aiohttp-3.11.18-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d1929da615840969929e8878d7951b31afe0bac883d84418f92e5755d7b49508", size = 466169, upload_time = "2025-04-21T09:41:29.783Z" },
-    { url = "https://files.pythonhosted.org/packages/70/84/19edcf0b22933932faa6e0be0d933a27bd173da02dc125b7354dff4d8da4/aiohttp-3.11.18-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7d0aebeb2392f19b184e3fdd9e651b0e39cd0f195cdb93328bd124a1d455cd0e", size = 457554, upload_time = "2025-04-21T09:41:31.327Z" },
-    { url = "https://files.pythonhosted.org/packages/32/d0/e8d1f034ae5624a0f21e4fb3feff79342ce631f3a4d26bd3e58b31ef033b/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3849ead845e8444f7331c284132ab314b4dac43bfae1e3cf350906d4fff4620f", size = 1690154, upload_time = "2025-04-21T09:41:33.541Z" },
-    { url = "https://files.pythonhosted.org/packages/16/de/2f9dbe2ac6f38f8495562077131888e0d2897e3798a0ff3adda766b04a34/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5e8452ad6b2863709f8b3d615955aa0807bc093c34b8e25b3b52097fe421cb7f", size = 1733402, upload_time = "2025-04-21T09:41:35.634Z" },
-    { url = "https://files.pythonhosted.org/packages/e0/04/bd2870e1e9aef990d14b6df2a695f17807baf5c85a4c187a492bda569571/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b8d2b42073611c860a37f718b3d61ae8b4c2b124b2e776e2c10619d920350ec", size = 1783958, upload_time = "2025-04-21T09:41:37.456Z" },
-    { url = "https://files.pythonhosted.org/packages/23/06/4203ffa2beb5bedb07f0da0f79b7d9039d1c33f522e0d1a2d5b6218e6f2e/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40fbf91f6a0ac317c0a07eb328a1384941872f6761f2e6f7208b63c4cc0a7ff6", size = 1695288, upload_time = "2025-04-21T09:41:39.756Z" },
-    { url = "https://files.pythonhosted.org/packages/30/b2/e2285dda065d9f29ab4b23d8bcc81eb881db512afb38a3f5247b191be36c/aiohttp-3.11.18-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ff5625413fec55216da5eaa011cf6b0a2ed67a565914a212a51aa3755b0009", size = 1618871, upload_time = "2025-04-21T09:41:41.972Z" },
-    { url = "https://files.pythonhosted.org/packages/57/e0/88f2987885d4b646de2036f7296ebea9268fdbf27476da551c1a7c158bc0/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7f33a92a2fde08e8c6b0c61815521324fc1612f397abf96eed86b8e31618fdb4", size = 1646262, upload_time = "2025-04-21T09:41:44.192Z" },
-    { url = "https://files.pythonhosted.org/packages/e0/19/4d2da508b4c587e7472a032290b2981f7caeca82b4354e19ab3df2f51d56/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:11d5391946605f445ddafda5eab11caf310f90cdda1fd99865564e3164f5cff9", size = 1677431, upload_time = "2025-04-21T09:41:46.049Z" },
-    { url = "https://files.pythonhosted.org/packages/eb/ae/047473ea50150a41440f3265f53db1738870b5a1e5406ece561ca61a3bf4/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:3cc314245deb311364884e44242e00c18b5896e4fe6d5f942e7ad7e4cb640adb", size = 1637430, upload_time = "2025-04-21T09:41:47.973Z" },
-    { url = "https://files.pythonhosted.org/packages/11/32/c6d1e3748077ce7ee13745fae33e5cb1dac3e3b8f8787bf738a93c94a7d2/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0f421843b0f70740772228b9e8093289924359d306530bcd3926f39acbe1adda", size = 1703342, upload_time = "2025-04-21T09:41:50.323Z" },
-    { url = "https://files.pythonhosted.org/packages/c5/1d/a3b57bfdbe285f0d45572d6d8f534fd58761da3e9cbc3098372565005606/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e220e7562467dc8d589e31c1acd13438d82c03d7f385c9cd41a3f6d1d15807c1", size = 1740600, upload_time = "2025-04-21T09:41:52.111Z" },
-    { url = "https://files.pythonhosted.org/packages/a5/71/f9cd2fed33fa2b7ce4d412fb7876547abb821d5b5520787d159d0748321d/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ab2ef72f8605046115bc9aa8e9d14fd49086d405855f40b79ed9e5c1f9f4faea", size = 1695131, upload_time = "2025-04-21T09:41:53.94Z" },
-    { url = "https://files.pythonhosted.org/packages/97/97/d1248cd6d02b9de6aa514793d0dcb20099f0ec47ae71a933290116c070c5/aiohttp-3.11.18-cp312-cp312-win32.whl", hash = "sha256:12a62691eb5aac58d65200c7ae94d73e8a65c331c3a86a2e9670927e94339ee8", size = 412442, upload_time = "2025-04-21T09:41:55.689Z" },
-    { url = "https://files.pythonhosted.org/packages/33/9a/e34e65506e06427b111e19218a99abf627638a9703f4b8bcc3e3021277ed/aiohttp-3.11.18-cp312-cp312-win_amd64.whl", hash = "sha256:364329f319c499128fd5cd2d1c31c44f234c58f9b96cc57f743d16ec4f3238c8", size = 439444, upload_time = "2025-04-21T09:41:57.977Z" },
-    { url = "https://files.pythonhosted.org/packages/0a/18/be8b5dd6b9cf1b2172301dbed28e8e5e878ee687c21947a6c81d6ceaa15d/aiohttp-3.11.18-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:474215ec618974054cf5dc465497ae9708543cbfc312c65212325d4212525811", size = 699833, upload_time = "2025-04-21T09:42:00.298Z" },
-    { url = "https://files.pythonhosted.org/packages/0d/84/ecdc68e293110e6f6f6d7b57786a77555a85f70edd2b180fb1fafaff361a/aiohttp-3.11.18-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6ced70adf03920d4e67c373fd692123e34d3ac81dfa1c27e45904a628567d804", size = 462774, upload_time = "2025-04-21T09:42:02.015Z" },
-    { url = "https://files.pythonhosted.org/packages/d7/85/f07718cca55884dad83cc2433746384d267ee970e91f0dcc75c6d5544079/aiohttp-3.11.18-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2d9f6c0152f8d71361905aaf9ed979259537981f47ad099c8b3d81e0319814bd", size = 454429, upload_time = "2025-04-21T09:42:03.728Z" },
-    { url = "https://files.pythonhosted.org/packages/82/02/7f669c3d4d39810db8842c4e572ce4fe3b3a9b82945fdd64affea4c6947e/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a35197013ed929c0aed5c9096de1fc5a9d336914d73ab3f9df14741668c0616c", size = 1670283, upload_time = "2025-04-21T09:42:06.053Z" },
-    { url = "https://files.pythonhosted.org/packages/ec/79/b82a12f67009b377b6c07a26bdd1b81dab7409fc2902d669dbfa79e5ac02/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:540b8a1f3a424f1af63e0af2d2853a759242a1769f9f1ab053996a392bd70118", size = 1717231, upload_time = "2025-04-21T09:42:07.953Z" },
-    { url = "https://files.pythonhosted.org/packages/a6/38/d5a1f28c3904a840642b9a12c286ff41fc66dfa28b87e204b1f242dbd5e6/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f9e6710ebebfce2ba21cee6d91e7452d1125100f41b906fb5af3da8c78b764c1", size = 1769621, upload_time = "2025-04-21T09:42:09.855Z" },
-    { url = "https://files.pythonhosted.org/packages/53/2d/deb3749ba293e716b5714dda06e257f123c5b8679072346b1eb28b766a0b/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8af2ef3b4b652ff109f98087242e2ab974b2b2b496304063585e3d78de0b000", size = 1678667, upload_time = "2025-04-21T09:42:11.741Z" },
-    { url = "https://files.pythonhosted.org/packages/b8/a8/04b6e11683a54e104b984bd19a9790eb1ae5f50968b601bb202d0406f0ff/aiohttp-3.11.18-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:28c3f975e5ae3dbcbe95b7e3dcd30e51da561a0a0f2cfbcdea30fc1308d72137", size = 1601592, upload_time = "2025-04-21T09:42:14.137Z" },
-    { url = "https://files.pythonhosted.org/packages/5e/9d/c33305ae8370b789423623f0e073d09ac775cd9c831ac0f11338b81c16e0/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c28875e316c7b4c3e745172d882d8a5c835b11018e33432d281211af35794a93", size = 1621679, upload_time = "2025-04-21T09:42:16.056Z" },
-    { url = "https://files.pythonhosted.org/packages/56/45/8e9a27fff0538173d47ba60362823358f7a5f1653c6c30c613469f94150e/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:13cd38515568ae230e1ef6919e2e33da5d0f46862943fcda74e7e915096815f3", size = 1656878, upload_time = "2025-04-21T09:42:18.368Z" },
-    { url = "https://files.pythonhosted.org/packages/84/5b/8c5378f10d7a5a46b10cb9161a3aac3eeae6dba54ec0f627fc4ddc4f2e72/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0e2a92101efb9f4c2942252c69c63ddb26d20f46f540c239ccfa5af865197bb8", size = 1620509, upload_time = "2025-04-21T09:42:20.141Z" },
-    { url = "https://files.pythonhosted.org/packages/9e/2f/99dee7bd91c62c5ff0aa3c55f4ae7e1bc99c6affef780d7777c60c5b3735/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e6d3e32b8753c8d45ac550b11a1090dd66d110d4ef805ffe60fa61495360b3b2", size = 1680263, upload_time = "2025-04-21T09:42:21.993Z" },
-    { url = "https://files.pythonhosted.org/packages/03/0a/378745e4ff88acb83e2d5c884a4fe993a6e9f04600a4560ce0e9b19936e3/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:ea4cf2488156e0f281f93cc2fd365025efcba3e2d217cbe3df2840f8c73db261", size = 1715014, upload_time = "2025-04-21T09:42:23.87Z" },
-    { url = "https://files.pythonhosted.org/packages/f6/0b/b5524b3bb4b01e91bc4323aad0c2fcaebdf2f1b4d2eb22743948ba364958/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d4df95ad522c53f2b9ebc07f12ccd2cb15550941e11a5bbc5ddca2ca56316d7", size = 1666614, upload_time = "2025-04-21T09:42:25.764Z" },
-    { url = "https://files.pythonhosted.org/packages/c7/b7/3d7b036d5a4ed5a4c704e0754afe2eef24a824dfab08e6efbffb0f6dd36a/aiohttp-3.11.18-cp313-cp313-win32.whl", hash = "sha256:cdd1bbaf1e61f0d94aced116d6e95fe25942f7a5f42382195fd9501089db5d78", size = 411358, upload_time = "2025-04-21T09:42:27.558Z" },
-    { url = "https://files.pythonhosted.org/packages/1e/3c/143831b32cd23b5263a995b2a1794e10aa42f8a895aae5074c20fda36c07/aiohttp-3.11.18-cp313-cp313-win_amd64.whl", hash = "sha256:bdd619c27e44382cf642223f11cfd4d795161362a5a1fc1fa3940397bc89db01", size = 437658, upload_time = "2025-04-21T09:42:29.209Z" },
+sdist = { url = "https://files.pythonhosted.org/packages/50/42/32cf8e7704ceb4481406eb87161349abb46a57fee3f008ba9cb610968646/aiohttp-3.13.3.tar.gz", hash = "sha256:a949eee43d3782f2daae4f4a2819b2cb9b0c5d3b7f7a927067cc84dafdbb9f88", size = 7844556, upload_time = "2026-01-03T17:33:05.204Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/36/d6/5aec9313ee6ea9c7cde8b891b69f4ff4001416867104580670a31daeba5b/aiohttp-3.13.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d5a372fd5afd301b3a89582817fdcdb6c34124787c70dbcc616f259013e7eef7", size = 738950, upload_time = "2026-01-03T17:29:13.002Z" },
+    { url = "https://files.pythonhosted.org/packages/68/03/8fa90a7e6d11ff20a18837a8e2b5dd23db01aabc475aa9271c8ad33299f5/aiohttp-3.13.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:147e422fd1223005c22b4fe080f5d93ced44460f5f9c105406b753612b587821", size = 496099, upload_time = "2026-01-03T17:29:15.268Z" },
+    { url = "https://files.pythonhosted.org/packages/d2/23/b81f744d402510a8366b74eb420fc0cc1170d0c43daca12d10814df85f10/aiohttp-3.13.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:859bd3f2156e81dd01432f5849fc73e2243d4a487c4fd26609b1299534ee1845", size = 491072, upload_time = "2026-01-03T17:29:16.922Z" },
+    { url = "https://files.pythonhosted.org/packages/d5/e1/56d1d1c0dd334cd203dd97706ce004c1aa24b34a813b0b8daf3383039706/aiohttp-3.13.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dca68018bf48c251ba17c72ed479f4dafe9dbd5a73707ad8d28a38d11f3d42af", size = 1671588, upload_time = "2026-01-03T17:29:18.539Z" },
+    { url = "https://files.pythonhosted.org/packages/5f/34/8d7f962604f4bc2b4e39eb1220dac7d4e4cba91fb9ba0474b4ecd67db165/aiohttp-3.13.3-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fee0c6bc7db1de362252affec009707a17478a00ec69f797d23ca256e36d5940", size = 1640334, upload_time = "2026-01-03T17:29:21.028Z" },
+    { url = "https://files.pythonhosted.org/packages/94/1d/fcccf2c668d87337ddeef9881537baee13c58d8f01f12ba8a24215f2b804/aiohttp-3.13.3-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c048058117fd649334d81b4b526e94bde3ccaddb20463a815ced6ecbb7d11160", size = 1722656, upload_time = "2026-01-03T17:29:22.531Z" },
+    { url = "https://files.pythonhosted.org/packages/aa/98/c6f3b081c4c606bc1e5f2ec102e87d6411c73a9ef3616fea6f2d5c98c062/aiohttp-3.13.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:215a685b6fbbfcf71dfe96e3eba7a6f58f10da1dfdf4889c7dd856abe430dca7", size = 1817625, upload_time = "2026-01-03T17:29:24.276Z" },
+    { url = "https://files.pythonhosted.org/packages/2c/c0/cfcc3d2e11b477f86e1af2863f3858c8850d751ce8dc39c4058a072c9e54/aiohttp-3.13.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2c184bb1fe2cbd2cefba613e9db29a5ab559323f994b6737e370d3da0ac455", size = 1672604, upload_time = "2026-01-03T17:29:26.099Z" },
+    { url = "https://files.pythonhosted.org/packages/1e/77/6b4ffcbcac4c6a5d041343a756f34a6dd26174ae07f977a64fe028dda5b0/aiohttp-3.13.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:75ca857eba4e20ce9f546cd59c7007b33906a4cd48f2ff6ccf1ccfc3b646f279", size = 1554370, upload_time = "2026-01-03T17:29:28.121Z" },
+    { url = "https://files.pythonhosted.org/packages/f2/f0/e3ddfa93f17d689dbe014ba048f18e0c9f9b456033b70e94349a2e9048be/aiohttp-3.13.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:81e97251d9298386c2b7dbeb490d3d1badbdc69107fb8c9299dd04eb39bddc0e", size = 1642023, upload_time = "2026-01-03T17:29:30.002Z" },
+    { url = "https://files.pythonhosted.org/packages/eb/45/c14019c9ec60a8e243d06d601b33dcc4fd92379424bde3021725859d7f99/aiohttp-3.13.3-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:c0e2d366af265797506f0283487223146af57815b388623f0357ef7eac9b209d", size = 1649680, upload_time = "2026-01-03T17:29:31.782Z" },
+    { url = "https://files.pythonhosted.org/packages/9c/fd/09c9451dae5aa5c5ed756df95ff9ef549d45d4be663bafd1e4954fd836f0/aiohttp-3.13.3-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4e239d501f73d6db1522599e14b9b321a7e3b1de66ce33d53a765d975e9f4808", size = 1692407, upload_time = "2026-01-03T17:29:33.392Z" },
+    { url = "https://files.pythonhosted.org/packages/a6/81/938bc2ec33c10efd6637ccb3d22f9f3160d08e8f3aa2587a2c2d5ab578eb/aiohttp-3.13.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:0db318f7a6f065d84cb1e02662c526294450b314a02bd9e2a8e67f0d8564ce40", size = 1543047, upload_time = "2026-01-03T17:29:34.855Z" },
+    { url = "https://files.pythonhosted.org/packages/f7/23/80488ee21c8d567c83045e412e1d9b7077d27171591a4eb7822586e8c06a/aiohttp-3.13.3-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:bfc1cc2fe31a6026a8a88e4ecfb98d7f6b1fec150cfd708adbfd1d2f42257c29", size = 1715264, upload_time = "2026-01-03T17:29:36.389Z" },
+    { url = "https://files.pythonhosted.org/packages/e2/83/259a8da6683182768200b368120ab3deff5370bed93880fb9a3a86299f34/aiohttp-3.13.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:af71fff7bac6bb7508956696dce8f6eec2bbb045eceb40343944b1ae62b5ef11", size = 1657275, upload_time = "2026-01-03T17:29:38.162Z" },
+    { url = "https://files.pythonhosted.org/packages/3f/4f/2c41f800a0b560785c10fb316216ac058c105f9be50bdc6a285de88db625/aiohttp-3.13.3-cp310-cp310-win32.whl", hash = "sha256:37da61e244d1749798c151421602884db5270faf479cf0ef03af0ff68954c9dd", size = 434053, upload_time = "2026-01-03T17:29:40.074Z" },
+    { url = "https://files.pythonhosted.org/packages/80/df/29cd63c7ecfdb65ccc12f7d808cac4fa2a19544660c06c61a4a48462de0c/aiohttp-3.13.3-cp310-cp310-win_amd64.whl", hash = "sha256:7e63f210bc1b57ef699035f2b4b6d9ce096b5914414a49b0997c839b2bd2223c", size = 456687, upload_time = "2026-01-03T17:29:41.819Z" },
+    { url = "https://files.pythonhosted.org/packages/f1/4c/a164164834f03924d9a29dc3acd9e7ee58f95857e0b467f6d04298594ebb/aiohttp-3.13.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5b6073099fb654e0a068ae678b10feff95c5cae95bbfcbfa7af669d361a8aa6b", size = 746051, upload_time = "2026-01-03T17:29:43.287Z" },
+    { url = "https://files.pythonhosted.org/packages/82/71/d5c31390d18d4f58115037c432b7e0348c60f6f53b727cad33172144a112/aiohttp-3.13.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cb93e166e6c28716c8c6aeb5f99dfb6d5ccf482d29fe9bf9a794110e6d0ab64", size = 499234, upload_time = "2026-01-03T17:29:44.822Z" },
+    { url = "https://files.pythonhosted.org/packages/0e/c9/741f8ac91e14b1d2e7100690425a5b2b919a87a5075406582991fb7de920/aiohttp-3.13.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:28e027cf2f6b641693a09f631759b4d9ce9165099d2b5d92af9bd4e197690eea", size = 494979, upload_time = "2026-01-03T17:29:46.405Z" },
+    { url = "https://files.pythonhosted.org/packages/75/b5/31d4d2e802dfd59f74ed47eba48869c1c21552c586d5e81a9d0d5c2ad640/aiohttp-3.13.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3b61b7169ababd7802f9568ed96142616a9118dd2be0d1866e920e77ec8fa92a", size = 1748297, upload_time = "2026-01-03T17:29:48.083Z" },
+    { url = "https://files.pythonhosted.org/packages/1a/3e/eefad0ad42959f226bb79664826883f2687d602a9ae2941a18e0484a74d3/aiohttp-3.13.3-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:80dd4c21b0f6237676449c6baaa1039abae86b91636b6c91a7f8e61c87f89540", size = 1707172, upload_time = "2026-01-03T17:29:49.648Z" },
+    { url = "https://files.pythonhosted.org/packages/c5/3a/54a64299fac2891c346cdcf2aa6803f994a2e4beeaf2e5a09dcc54acc842/aiohttp-3.13.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:65d2ccb7eabee90ce0503c17716fc77226be026dcc3e65cce859a30db715025b", size = 1805405, upload_time = "2026-01-03T17:29:51.244Z" },
+    { url = "https://files.pythonhosted.org/packages/6c/70/ddc1b7169cf64075e864f64595a14b147a895a868394a48f6a8031979038/aiohttp-3.13.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5b179331a481cb5529fca8b432d8d3c7001cb217513c94cd72d668d1248688a3", size = 1899449, upload_time = "2026-01-03T17:29:53.938Z" },
+    { url = "https://files.pythonhosted.org/packages/a1/7e/6815aab7d3a56610891c76ef79095677b8b5be6646aaf00f69b221765021/aiohttp-3.13.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d4c940f02f49483b18b079d1c27ab948721852b281f8b015c058100e9421dd1", size = 1748444, upload_time = "2026-01-03T17:29:55.484Z" },
+    { url = "https://files.pythonhosted.org/packages/6b/f2/073b145c4100da5511f457dc0f7558e99b2987cf72600d42b559db856fbc/aiohttp-3.13.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f9444f105664c4ce47a2a7171a2418bce5b7bae45fb610f4e2c36045d85911d3", size = 1606038, upload_time = "2026-01-03T17:29:57.179Z" },
+    { url = "https://files.pythonhosted.org/packages/0a/c1/778d011920cae03ae01424ec202c513dc69243cf2db303965615b81deeea/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:694976222c711d1d00ba131904beb60534f93966562f64440d0c9d41b8cdb440", size = 1724156, upload_time = "2026-01-03T17:29:58.914Z" },
+    { url = "https://files.pythonhosted.org/packages/0e/cb/3419eabf4ec1e9ec6f242c32b689248365a1cf621891f6f0386632525494/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f33ed1a2bf1997a36661874b017f5c4b760f41266341af36febaf271d179f6d7", size = 1722340, upload_time = "2026-01-03T17:30:01.962Z" },
+    { url = "https://files.pythonhosted.org/packages/7a/e5/76cf77bdbc435bf233c1f114edad39ed4177ccbfab7c329482b179cff4f4/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e636b3c5f61da31a92bf0d91da83e58fdfa96f178ba682f11d24f31944cdd28c", size = 1783041, upload_time = "2026-01-03T17:30:03.609Z" },
+    { url = "https://files.pythonhosted.org/packages/9d/d4/dd1ca234c794fd29c057ce8c0566b8ef7fd6a51069de5f06fa84b9a1971c/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:5d2d94f1f5fcbe40838ac51a6ab5704a6f9ea42e72ceda48de5e6b898521da51", size = 1596024, upload_time = "2026-01-03T17:30:05.132Z" },
+    { url = "https://files.pythonhosted.org/packages/55/58/4345b5f26661a6180afa686c473620c30a66afdf120ed3dd545bbc809e85/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2be0e9ccf23e8a94f6f0650ce06042cefc6ac703d0d7ab6c7a917289f2539ad4", size = 
1804590, upload_time = "2026-01-03T17:30:07.135Z" }, + { url = "https://files.pythonhosted.org/packages/7b/06/05950619af6c2df7e0a431d889ba2813c9f0129cec76f663e547a5ad56f2/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9af5e68ee47d6534d36791bbe9b646d2a7c7deb6fc24d7943628edfbb3581f29", size = 1740355, upload_time = "2026-01-03T17:30:09.083Z" }, + { url = "https://files.pythonhosted.org/packages/3e/80/958f16de79ba0422d7c1e284b2abd0c84bc03394fbe631d0a39ffa10e1eb/aiohttp-3.13.3-cp311-cp311-win32.whl", hash = "sha256:a2212ad43c0833a873d0fb3c63fa1bacedd4cf6af2fee62bf4b739ceec3ab239", size = 433701, upload_time = "2026-01-03T17:30:10.869Z" }, + { url = "https://files.pythonhosted.org/packages/dc/f2/27cdf04c9851712d6c1b99df6821a6623c3c9e55956d4b1e318c337b5a48/aiohttp-3.13.3-cp311-cp311-win_amd64.whl", hash = "sha256:642f752c3eb117b105acbd87e2c143de710987e09860d674e068c4c2c441034f", size = 457678, upload_time = "2026-01-03T17:30:12.719Z" }, + { url = "https://files.pythonhosted.org/packages/a0/be/4fc11f202955a69e0db803a12a062b8379c970c7c84f4882b6da17337cc1/aiohttp-3.13.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b903a4dfee7d347e2d87697d0713be59e0b87925be030c9178c5faa58ea58d5c", size = 739732, upload_time = "2026-01-03T17:30:14.23Z" }, + { url = "https://files.pythonhosted.org/packages/97/2c/621d5b851f94fa0bb7430d6089b3aa970a9d9b75196bc93bb624b0db237a/aiohttp-3.13.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:a45530014d7a1e09f4a55f4f43097ba0fd155089372e105e4bff4ca76cb1b168", size = 494293, upload_time = "2026-01-03T17:30:15.96Z" }, + { url = "https://files.pythonhosted.org/packages/5d/43/4be01406b78e1be8320bb8316dc9c42dbab553d281c40364e0f862d5661c/aiohttp-3.13.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:27234ef6d85c914f9efeb77ff616dbf4ad2380be0cda40b4db086ffc7ddd1b7d", size = 493533, upload_time = "2026-01-03T17:30:17.431Z" }, + { url = 
"https://files.pythonhosted.org/packages/8d/a8/5a35dc56a06a2c90d4742cbf35294396907027f80eea696637945a106f25/aiohttp-3.13.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d32764c6c9aafb7fb55366a224756387cd50bfa720f32b88e0e6fa45b27dcf29", size = 1737839, upload_time = "2026-01-03T17:30:19.422Z" }, + { url = "https://files.pythonhosted.org/packages/bf/62/4b9eeb331da56530bf2e198a297e5303e1c1ebdceeb00fe9b568a65c5a0c/aiohttp-3.13.3-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b1a6102b4d3ebc07dad44fbf07b45bb600300f15b552ddf1851b5390202ea2e3", size = 1703932, upload_time = "2026-01-03T17:30:21.756Z" }, + { url = "https://files.pythonhosted.org/packages/7c/f6/af16887b5d419e6a367095994c0b1332d154f647e7dc2bd50e61876e8e3d/aiohttp-3.13.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c014c7ea7fb775dd015b2d3137378b7be0249a448a1612268b5a90c2d81de04d", size = 1771906, upload_time = "2026-01-03T17:30:23.932Z" }, + { url = "https://files.pythonhosted.org/packages/ce/83/397c634b1bcc24292fa1e0c7822800f9f6569e32934bdeef09dae7992dfb/aiohttp-3.13.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2b8d8ddba8f95ba17582226f80e2de99c7a7948e66490ef8d947e272a93e9463", size = 1871020, upload_time = "2026-01-03T17:30:26Z" }, + { url = "https://files.pythonhosted.org/packages/86/f6/a62cbbf13f0ac80a70f71b1672feba90fdb21fd7abd8dbf25c0105fb6fa3/aiohttp-3.13.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9ae8dd55c8e6c4257eae3a20fd2c8f41edaea5992ed67156642493b8daf3cecc", size = 1755181, upload_time = "2026-01-03T17:30:27.554Z" }, + { url = "https://files.pythonhosted.org/packages/0a/87/20a35ad487efdd3fba93d5843efdfaa62d2f1479eaafa7453398a44faf13/aiohttp-3.13.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:01ad2529d4b5035578f5081606a465f3b814c542882804e2e8cda61adf5c71bf", size = 1561794, upload_time = "2026-01-03T17:30:29.254Z" }, + { url = "https://files.pythonhosted.org/packages/de/95/8fd69a66682012f6716e1bc09ef8a1a2a91922c5725cb904689f112309c4/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bb4f7475e359992b580559e008c598091c45b5088f28614e855e42d39c2f1033", size = 1697900, upload_time = "2026-01-03T17:30:31.033Z" }, + { url = "https://files.pythonhosted.org/packages/e5/66/7b94b3b5ba70e955ff597672dad1691333080e37f50280178967aff68657/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:c19b90316ad3b24c69cd78d5c9b4f3aa4497643685901185b65166293d36a00f", size = 1728239, upload_time = "2026-01-03T17:30:32.703Z" }, + { url = "https://files.pythonhosted.org/packages/47/71/6f72f77f9f7d74719692ab65a2a0252584bf8d5f301e2ecb4c0da734530a/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:96d604498a7c782cb15a51c406acaea70d8c027ee6b90c569baa6e7b93073679", size = 1740527, upload_time = "2026-01-03T17:30:34.695Z" }, + { url = "https://files.pythonhosted.org/packages/fa/b4/75ec16cbbd5c01bdaf4a05b19e103e78d7ce1ef7c80867eb0ace42ff4488/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:084911a532763e9d3dd95adf78a78f4096cd5f58cdc18e6fdbc1b58417a45423", size = 1554489, upload_time = "2026-01-03T17:30:36.864Z" }, + { url = "https://files.pythonhosted.org/packages/52/8f/bc518c0eea29f8406dcf7ed1f96c9b48e3bc3995a96159b3fc11f9e08321/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7a4a94eb787e606d0a09404b9c38c113d3b099d508021faa615d70a0131907ce", size = 1767852, upload_time = "2026-01-03T17:30:39.433Z" }, + { url = "https://files.pythonhosted.org/packages/9d/f2/a07a75173124f31f11ea6f863dc44e6f09afe2bca45dd4e64979490deab1/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:87797e645d9d8e222e04160ee32aa06bc5c163e8499f24db719e7852ec23093a", size = 1722379, upload_time = 
"2026-01-03T17:30:41.081Z" }, + { url = "https://files.pythonhosted.org/packages/3c/4a/1a3fee7c21350cac78e5c5cef711bac1b94feca07399f3d406972e2d8fcd/aiohttp-3.13.3-cp312-cp312-win32.whl", hash = "sha256:b04be762396457bef43f3597c991e192ee7da460a4953d7e647ee4b1c28e7046", size = 428253, upload_time = "2026-01-03T17:30:42.644Z" }, + { url = "https://files.pythonhosted.org/packages/d9/b7/76175c7cb4eb73d91ad63c34e29fc4f77c9386bba4a65b53ba8e05ee3c39/aiohttp-3.13.3-cp312-cp312-win_amd64.whl", hash = "sha256:e3531d63d3bdfa7e3ac5e9b27b2dd7ec9df3206a98e0b3445fa906f233264c57", size = 455407, upload_time = "2026-01-03T17:30:44.195Z" }, + { url = "https://files.pythonhosted.org/packages/97/8a/12ca489246ca1faaf5432844adbfce7ff2cc4997733e0af120869345643a/aiohttp-3.13.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:5dff64413671b0d3e7d5918ea490bdccb97a4ad29b3f311ed423200b2203e01c", size = 734190, upload_time = "2026-01-03T17:30:45.832Z" }, + { url = "https://files.pythonhosted.org/packages/32/08/de43984c74ed1fca5c014808963cc83cb00d7bb06af228f132d33862ca76/aiohttp-3.13.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:87b9aab6d6ed88235aa2970294f496ff1a1f9adcd724d800e9b952395a80ffd9", size = 491783, upload_time = "2026-01-03T17:30:47.466Z" }, + { url = "https://files.pythonhosted.org/packages/17/f8/8dd2cf6112a5a76f81f81a5130c57ca829d101ad583ce57f889179accdda/aiohttp-3.13.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:425c126c0dc43861e22cb1c14ba4c8e45d09516d0a3ae0a3f7494b79f5f233a3", size = 490704, upload_time = "2026-01-03T17:30:49.373Z" }, + { url = "https://files.pythonhosted.org/packages/6d/40/a46b03ca03936f832bc7eaa47cfbb1ad012ba1be4790122ee4f4f8cba074/aiohttp-3.13.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7f9120f7093c2a32d9647abcaf21e6ad275b4fbec5b55969f978b1a97c7c86bf", size = 1720652, upload_time = "2026-01-03T17:30:50.974Z" }, + { url = 
"https://files.pythonhosted.org/packages/f7/7e/917fe18e3607af92657e4285498f500dca797ff8c918bd7d90b05abf6c2a/aiohttp-3.13.3-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:697753042d57f4bf7122cab985bf15d0cef23c770864580f5af4f52023a56bd6", size = 1692014, upload_time = "2026-01-03T17:30:52.729Z" }, + { url = "https://files.pythonhosted.org/packages/71/b6/cefa4cbc00d315d68973b671cf105b21a609c12b82d52e5d0c9ae61d2a09/aiohttp-3.13.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6de499a1a44e7de70735d0b39f67c8f25eb3d91eb3103be99ca0fa882cdd987d", size = 1759777, upload_time = "2026-01-03T17:30:54.537Z" }, + { url = "https://files.pythonhosted.org/packages/fb/e3/e06ee07b45e59e6d81498b591fc589629be1553abb2a82ce33efe2a7b068/aiohttp-3.13.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:37239e9f9a7ea9ac5bf6b92b0260b01f8a22281996da609206a84df860bc1261", size = 1861276, upload_time = "2026-01-03T17:30:56.512Z" }, + { url = "https://files.pythonhosted.org/packages/7c/24/75d274228acf35ceeb2850b8ce04de9dd7355ff7a0b49d607ee60c29c518/aiohttp-3.13.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f76c1e3fe7d7c8afad7ed193f89a292e1999608170dcc9751a7462a87dfd5bc0", size = 1743131, upload_time = "2026-01-03T17:30:58.256Z" }, + { url = "https://files.pythonhosted.org/packages/04/98/3d21dde21889b17ca2eea54fdcff21b27b93f45b7bb94ca029c31ab59dc3/aiohttp-3.13.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fc290605db2a917f6e81b0e1e0796469871f5af381ce15c604a3c5c7e51cb730", size = 1556863, upload_time = "2026-01-03T17:31:00.445Z" }, + { url = "https://files.pythonhosted.org/packages/9e/84/da0c3ab1192eaf64782b03971ab4055b475d0db07b17eff925e8c93b3aa5/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = 
"sha256:4021b51936308aeea0367b8f006dc999ca02bc118a0cc78c303f50a2ff6afb91", size = 1682793, upload_time = "2026-01-03T17:31:03.024Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0f/5802ada182f575afa02cbd0ec5180d7e13a402afb7c2c03a9aa5e5d49060/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:49a03727c1bba9a97d3e93c9f93ca03a57300f484b6e935463099841261195d3", size = 1716676, upload_time = "2026-01-03T17:31:04.842Z" }, + { url = "https://files.pythonhosted.org/packages/3f/8c/714d53bd8b5a4560667f7bbbb06b20c2382f9c7847d198370ec6526af39c/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3d9908a48eb7416dc1f4524e69f1d32e5d90e3981e4e37eb0aa1cd18f9cfa2a4", size = 1733217, upload_time = "2026-01-03T17:31:06.868Z" }, + { url = "https://files.pythonhosted.org/packages/7d/79/e2176f46d2e963facea939f5be2d26368ce543622be6f00a12844d3c991f/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2712039939ec963c237286113c68dbad80a82a4281543f3abf766d9d73228998", size = 1552303, upload_time = "2026-01-03T17:31:08.958Z" }, + { url = "https://files.pythonhosted.org/packages/ab/6a/28ed4dea1759916090587d1fe57087b03e6c784a642b85ef48217b0277ae/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:7bfdc049127717581866fa4708791220970ce291c23e28ccf3922c700740fdc0", size = 1763673, upload_time = "2026-01-03T17:31:10.676Z" }, + { url = "https://files.pythonhosted.org/packages/e8/35/4a3daeb8b9fab49240d21c04d50732313295e4bd813a465d840236dd0ce1/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8057c98e0c8472d8846b9c79f56766bcc57e3e8ac7bfd510482332366c56c591", size = 1721120, upload_time = "2026-01-03T17:31:12.575Z" }, + { url = "https://files.pythonhosted.org/packages/bc/9f/d643bb3c5fb99547323e635e251c609fbbc660d983144cfebec529e09264/aiohttp-3.13.3-cp313-cp313-win32.whl", hash = "sha256:1449ceddcdbcf2e0446957863af03ebaaa03f94c090f945411b61269e2cb5daf", size = 427383, upload_time = 
"2026-01-03T17:31:14.382Z" }, + { url = "https://files.pythonhosted.org/packages/4e/f1/ab0395f8a79933577cdd996dd2f9aa6014af9535f65dddcf88204682fe62/aiohttp-3.13.3-cp313-cp313-win_amd64.whl", hash = "sha256:693781c45a4033d31d4187d2436f5ac701e7bbfe5df40d917736108c1cc7436e", size = 453899, upload_time = "2026-01-03T17:31:15.958Z" }, + { url = "https://files.pythonhosted.org/packages/99/36/5b6514a9f5d66f4e2597e40dea2e3db271e023eb7a5d22defe96ba560996/aiohttp-3.13.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:ea37047c6b367fd4bd632bff8077449b8fa034b69e812a18e0132a00fae6e808", size = 737238, upload_time = "2026-01-03T17:31:17.909Z" }, + { url = "https://files.pythonhosted.org/packages/f7/49/459327f0d5bcd8c6c9ca69e60fdeebc3622861e696490d8674a6d0cb90a6/aiohttp-3.13.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6fc0e2337d1a4c3e6acafda6a78a39d4c14caea625124817420abceed36e2415", size = 492292, upload_time = "2026-01-03T17:31:19.919Z" }, + { url = "https://files.pythonhosted.org/packages/e8/0b/b97660c5fd05d3495b4eb27f2d0ef18dc1dc4eff7511a9bf371397ff0264/aiohttp-3.13.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c685f2d80bb67ca8c3837823ad76196b3694b0159d232206d1e461d3d434666f", size = 493021, upload_time = "2026-01-03T17:31:21.636Z" }, + { url = "https://files.pythonhosted.org/packages/54/d4/438efabdf74e30aeceb890c3290bbaa449780583b1270b00661126b8aae4/aiohttp-3.13.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:48e377758516d262bde50c2584fc6c578af272559c409eecbdd2bae1601184d6", size = 1717263, upload_time = "2026-01-03T17:31:23.296Z" }, + { url = "https://files.pythonhosted.org/packages/71/f2/7bddc7fd612367d1459c5bcf598a9e8f7092d6580d98de0e057eb42697ad/aiohttp-3.13.3-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:34749271508078b261c4abb1767d42b8d0c0cc9449c73a4df494777dc55f0687", size = 1669107, upload_time = "2026-01-03T17:31:25.334Z" }, + { url = 
"https://files.pythonhosted.org/packages/00/5a/1aeaecca40e22560f97610a329e0e5efef5e0b5afdf9f857f0d93839ab2e/aiohttp-3.13.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:82611aeec80eb144416956ec85b6ca45a64d76429c1ed46ae1b5f86c6e0c9a26", size = 1760196, upload_time = "2026-01-03T17:31:27.394Z" }, + { url = "https://files.pythonhosted.org/packages/f8/f8/0ff6992bea7bd560fc510ea1c815f87eedd745fe035589c71ce05612a19a/aiohttp-3.13.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2fff83cfc93f18f215896e3a190e8e5cb413ce01553901aca925176e7568963a", size = 1843591, upload_time = "2026-01-03T17:31:29.238Z" }, + { url = "https://files.pythonhosted.org/packages/e3/d1/e30e537a15f53485b61f5be525f2157da719819e8377298502aebac45536/aiohttp-3.13.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bbe7d4cecacb439e2e2a8a1a7b935c25b812af7a5fd26503a66dadf428e79ec1", size = 1720277, upload_time = "2026-01-03T17:31:31.053Z" }, + { url = "https://files.pythonhosted.org/packages/84/45/23f4c451d8192f553d38d838831ebbc156907ea6e05557f39563101b7717/aiohttp-3.13.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b928f30fe49574253644b1ca44b1b8adbd903aa0da4b9054a6c20fc7f4092a25", size = 1548575, upload_time = "2026-01-03T17:31:32.87Z" }, + { url = "https://files.pythonhosted.org/packages/6a/ed/0a42b127a43712eda7807e7892c083eadfaf8429ca8fb619662a530a3aab/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7b5e8fe4de30df199155baaf64f2fcd604f4c678ed20910db8e2c66dc4b11603", size = 1679455, upload_time = "2026-01-03T17:31:34.76Z" }, + { url = "https://files.pythonhosted.org/packages/2e/b5/c05f0c2b4b4fe2c9d55e73b6d3ed4fd6c9dc2684b1d81cbdf77e7fad9adb/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:8542f41a62bcc58fc7f11cf7c90e0ec324ce44950003feb70640fc2a9092c32a", size = 1687417, upload_time = 
"2026-01-03T17:31:36.699Z" }, + { url = "https://files.pythonhosted.org/packages/c9/6b/915bc5dad66aef602b9e459b5a973529304d4e89ca86999d9d75d80cbd0b/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5e1d8c8b8f1d91cd08d8f4a3c2b067bfca6ec043d3ff36de0f3a715feeedf926", size = 1729968, upload_time = "2026-01-03T17:31:38.622Z" }, + { url = "https://files.pythonhosted.org/packages/11/3b/e84581290a9520024a08640b63d07673057aec5ca548177a82026187ba73/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:90455115e5da1c3c51ab619ac57f877da8fd6d73c05aacd125c5ae9819582aba", size = 1545690, upload_time = "2026-01-03T17:31:40.57Z" }, + { url = "https://files.pythonhosted.org/packages/f5/04/0c3655a566c43fd647c81b895dfe361b9f9ad6d58c19309d45cff52d6c3b/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:042e9e0bcb5fba81886c8b4fbb9a09d6b8a00245fd8d88e4d989c1f96c74164c", size = 1746390, upload_time = "2026-01-03T17:31:42.857Z" }, + { url = "https://files.pythonhosted.org/packages/1f/53/71165b26978f719c3419381514c9690bd5980e764a09440a10bb816ea4ab/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2eb752b102b12a76ca02dff751a801f028b4ffbbc478840b473597fc91a9ed43", size = 1702188, upload_time = "2026-01-03T17:31:44.984Z" }, + { url = "https://files.pythonhosted.org/packages/29/a7/cbe6c9e8e136314fa1980da388a59d2f35f35395948a08b6747baebb6aa6/aiohttp-3.13.3-cp314-cp314-win32.whl", hash = "sha256:b556c85915d8efaed322bf1bdae9486aa0f3f764195a0fb6ee962e5c71ef5ce1", size = 433126, upload_time = "2026-01-03T17:31:47.463Z" }, + { url = "https://files.pythonhosted.org/packages/de/56/982704adea7d3b16614fc5936014e9af85c0e34b58f9046655817f04306e/aiohttp-3.13.3-cp314-cp314-win_amd64.whl", hash = "sha256:9bf9f7a65e7aa20dd764151fb3d616c81088f91f8df39c3893a536e279b4b984", size = 459128, upload_time = "2026-01-03T17:31:49.2Z" }, + { url = 
"https://files.pythonhosted.org/packages/6c/2a/3c79b638a9c3d4658d345339d22070241ea341ed4e07b5ac60fb0f418003/aiohttp-3.13.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:05861afbbec40650d8a07ea324367cb93e9e8cc7762e04dd4405df99fa65159c", size = 769512, upload_time = "2026-01-03T17:31:51.134Z" }, + { url = "https://files.pythonhosted.org/packages/29/b9/3e5014d46c0ab0db8707e0ac2711ed28c4da0218c358a4e7c17bae0d8722/aiohttp-3.13.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2fc82186fadc4a8316768d61f3722c230e2c1dcab4200d52d2ebdf2482e47592", size = 506444, upload_time = "2026-01-03T17:31:52.85Z" }, + { url = "https://files.pythonhosted.org/packages/90/03/c1d4ef9a054e151cd7839cdc497f2638f00b93cbe8043983986630d7a80c/aiohttp-3.13.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0add0900ff220d1d5c5ebbf99ed88b0c1bbf87aa7e4262300ed1376a6b13414f", size = 510798, upload_time = "2026-01-03T17:31:54.91Z" }, + { url = "https://files.pythonhosted.org/packages/ea/76/8c1e5abbfe8e127c893fe7ead569148a4d5a799f7cf958d8c09f3eedf097/aiohttp-3.13.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:568f416a4072fbfae453dcf9a99194bbb8bdeab718e08ee13dfa2ba0e4bebf29", size = 1868835, upload_time = "2026-01-03T17:31:56.733Z" }, + { url = "https://files.pythonhosted.org/packages/8e/ac/984c5a6f74c363b01ff97adc96a3976d9c98940b8969a1881575b279ac5d/aiohttp-3.13.3-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:add1da70de90a2569c5e15249ff76a631ccacfe198375eead4aadf3b8dc849dc", size = 1720486, upload_time = "2026-01-03T17:31:58.65Z" }, + { url = "https://files.pythonhosted.org/packages/b2/9a/b7039c5f099c4eb632138728828b33428585031a1e658d693d41d07d89d1/aiohttp-3.13.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:10b47b7ba335d2e9b1239fa571131a87e2d8ec96b333e68b2a305e7a98b0bae2", size = 1847951, upload_time = 
"2026-01-03T17:32:00.989Z" }, + { url = "https://files.pythonhosted.org/packages/3c/02/3bec2b9a1ba3c19ff89a43a19324202b8eb187ca1e928d8bdac9bbdddebd/aiohttp-3.13.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3dd4dce1c718e38081c8f35f323209d4c1df7d4db4bab1b5c88a6b4d12b74587", size = 1941001, upload_time = "2026-01-03T17:32:03.122Z" }, + { url = "https://files.pythonhosted.org/packages/37/df/d879401cedeef27ac4717f6426c8c36c3091c6e9f08a9178cc87549c537f/aiohttp-3.13.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34bac00a67a812570d4a460447e1e9e06fae622946955f939051e7cc895cfab8", size = 1797246, upload_time = "2026-01-03T17:32:05.255Z" }, + { url = "https://files.pythonhosted.org/packages/8d/15/be122de1f67e6953add23335c8ece6d314ab67c8bebb3f181063010795a7/aiohttp-3.13.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a19884d2ee70b06d9204b2727a7b9f983d0c684c650254679e716b0b77920632", size = 1627131, upload_time = "2026-01-03T17:32:07.607Z" }, + { url = "https://files.pythonhosted.org/packages/12/12/70eedcac9134cfa3219ab7af31ea56bc877395b1ac30d65b1bc4b27d0438/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ca7f2bb6ba8348a3614c7918cc4bb73268c5ac2a207576b7afea19d3d9f64", size = 1795196, upload_time = "2026-01-03T17:32:09.59Z" }, + { url = "https://files.pythonhosted.org/packages/32/11/b30e1b1cd1f3054af86ebe60df96989c6a414dd87e27ad16950eee420bea/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:b0d95340658b9d2f11d9697f59b3814a9d3bb4b7a7c20b131df4bcef464037c0", size = 1782841, upload_time = "2026-01-03T17:32:11.445Z" }, + { url = "https://files.pythonhosted.org/packages/88/0d/d98a9367b38912384a17e287850f5695c528cff0f14f791ce8ee2e4f7796/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:a1e53262fd202e4b40b70c3aff944a8155059beedc8a89bba9dc1f9ef06a1b56", size = 1795193, upload_time 
= "2026-01-03T17:32:13.705Z" }, + { url = "https://files.pythonhosted.org/packages/43/a5/a2dfd1f5ff5581632c7f6a30e1744deda03808974f94f6534241ef60c751/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:d60ac9663f44168038586cab2157e122e46bdef09e9368b37f2d82d354c23f72", size = 1621979, upload_time = "2026-01-03T17:32:15.965Z" }, + { url = "https://files.pythonhosted.org/packages/fa/f0/12973c382ae7c1cccbc4417e129c5bf54c374dfb85af70893646e1f0e749/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:90751b8eed69435bac9ff4e3d2f6b3af1f57e37ecb0fbeee59c0174c9e2d41df", size = 1822193, upload_time = "2026-01-03T17:32:18.219Z" }, + { url = "https://files.pythonhosted.org/packages/3c/5f/24155e30ba7f8c96918af1350eb0663e2430aad9e001c0489d89cd708ab1/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fc353029f176fd2b3ec6cfc71be166aba1936fe5d73dd1992ce289ca6647a9aa", size = 1769801, upload_time = "2026-01-03T17:32:20.25Z" }, + { url = "https://files.pythonhosted.org/packages/eb/f8/7314031ff5c10e6ece114da79b338ec17eeff3a079e53151f7e9f43c4723/aiohttp-3.13.3-cp314-cp314t-win32.whl", hash = "sha256:2e41b18a58da1e474a057b3d35248d8320029f61d70a37629535b16a0c8f3767", size = 466523, upload_time = "2026-01-03T17:32:22.215Z" }, + { url = "https://files.pythonhosted.org/packages/b4/63/278a98c715ae467624eafe375542d8ba9b4383a016df8fdefe0ae28382a7/aiohttp-3.13.3-cp314-cp314t-win_amd64.whl", hash = "sha256:44531a36aa2264a1860089ffd4dce7baf875ee5a6079d5fb42e261c704ef7344", size = 499694, upload_time = "2026-01-03T17:32:24.546Z" }, ] [[package]] name = "aiosignal" -version = "1.3.2" +version = "1.4.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "frozenlist" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = 
"sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007, upload_time = "2025-07-03T22:54:43.528Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload_time = "2025-07-03T22:54:42.156Z" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ba/b5/6d55e80f6d8a08ce22b982eafa278d823b541c925f11ee774b0b9c43473d/aiosignal-1.3.2.tar.gz", hash = "sha256:a8c255c66fafb1e499c9351d0bf32ff2d8a0321595ebac3b93713656d2436f54", size = 19424, upload_time = "2024-12-13T17:10:40.86Z" } + +[[package]] +name = "annotated-doc" +version = "0.0.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/57/ba/046ceea27344560984e26a590f90bc7f4a75b06701f653222458922b558c/annotated_doc-0.0.4.tar.gz", hash = "sha256:fbcda96e87e9c92ad167c2e53839e57503ecfda18804ea28102353485033faa4", size = 7288, upload_time = "2025-11-10T22:07:42.062Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ec/6a/bc7e17a3e87a2985d3e8f4da4cd0f481060eb78fb08596c42be62c90a4d9/aiosignal-1.3.2-py2.py3-none-any.whl", hash = "sha256:45cde58e409a301715980c2b01d0c28bdde3770d8290b5eb2173759d9acb31a5", size = 7597, upload_time = "2024-12-13T17:10:38.469Z" }, + { url = "https://files.pythonhosted.org/packages/1e/d3/26bf1008eb3d2daa8ef4cacc7f3bfdc11818d111f7e2d0201bc6e3b49d45/annotated_doc-0.0.4-py3-none-any.whl", hash = "sha256:571ac1dc6991c450b25a9c2d84a3705e2ae7a53467b5d111c24fa8baabbed320", size = 5303, upload_time = "2025-11-10T22:07:40.673Z" }, ] [[package]] @@ -114,6 +162,25 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = 
"sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload_time = "2024-05-20T21:33:24.1Z" }, ] +[[package]] +name = "anthropic" +version = "0.84.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "distro" }, + { name = "docstring-parser" }, + { name = "httpx" }, + { name = "jiter" }, + { name = "pydantic" }, + { name = "sniffio" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/04/ea/0869d6df9ef83dcf393aeefc12dd81677d091c6ffc86f783e51cf44062f2/anthropic-0.84.0.tar.gz", hash = "sha256:72f5f90e5aebe62dca316cb013629cfa24996b0f5a4593b8c3d712bc03c43c37", size = 539457, upload_time = "2026-02-25T05:22:38.54Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/64/ca/218fa25002a332c0aa149ba18ffc0543175998b1f65de63f6d106689a345/anthropic-0.84.0-py3-none-any.whl", hash = "sha256:861c4c50f91ca45f942e091d83b60530ad6d4f98733bfe648065364da05d29e7", size = 455156, upload_time = "2026-02-25T05:22:40.468Z" }, +] + [[package]] name = "anyio" version = "4.9.0" @@ -184,6 +251,88 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/38/fc/bce832fd4fd99766c04d1ee0eead6b0ec6486fb100ae5e74c1d91292b982/certifi-2025.1.31-py3-none-any.whl", hash = "sha256:ca78db4565a652026a4db2bcdf68f2fb589ea80d0be70e03929ed730746b84fe", size = 166393, upload_time = "2025-01-31T02:16:45.015Z" }, ] +[[package]] +name = "cffi" +version = "2.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pycparser", marker = "implementation_name != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload_time = "2025-09-08T23:24:04.541Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/93/d7/516d984057745a6cd96575eea814fe1edd6646ee6efd552fb7b0921dec83/cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44", size = 184283, upload_time = "2025-09-08T23:22:08.01Z" }, + { url = "https://files.pythonhosted.org/packages/9e/84/ad6a0b408daa859246f57c03efd28e5dd1b33c21737c2db84cae8c237aa5/cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49", size = 180504, upload_time = "2025-09-08T23:22:10.637Z" }, + { url = "https://files.pythonhosted.org/packages/50/bd/b1a6362b80628111e6653c961f987faa55262b4002fcec42308cad1db680/cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c", size = 208811, upload_time = "2025-09-08T23:22:12.267Z" }, + { url = "https://files.pythonhosted.org/packages/4f/27/6933a8b2562d7bd1fb595074cf99cc81fc3789f6a6c05cdabb46284a3188/cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb", size = 216402, upload_time = "2025-09-08T23:22:13.455Z" }, + { url = "https://files.pythonhosted.org/packages/05/eb/b86f2a2645b62adcfff53b0dd97e8dfafb5c8aa864bd0d9a2c2049a0d551/cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0", size = 203217, upload_time = "2025-09-08T23:22:14.596Z" }, + { url = "https://files.pythonhosted.org/packages/9f/e0/6cbe77a53acf5acc7c08cc186c9928864bd7c005f9efd0d126884858a5fe/cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4", size = 203079, upload_time = "2025-09-08T23:22:15.769Z" }, + { url = 
"https://files.pythonhosted.org/packages/98/29/9b366e70e243eb3d14a5cb488dfd3a0b6b2f1fb001a203f653b93ccfac88/cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453", size = 216475, upload_time = "2025-09-08T23:22:17.427Z" }, + { url = "https://files.pythonhosted.org/packages/21/7a/13b24e70d2f90a322f2900c5d8e1f14fa7e2a6b3332b7309ba7b2ba51a5a/cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495", size = 218829, upload_time = "2025-09-08T23:22:19.069Z" }, + { url = "https://files.pythonhosted.org/packages/60/99/c9dc110974c59cc981b1f5b66e1d8af8af764e00f0293266824d9c4254bc/cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5", size = 211211, upload_time = "2025-09-08T23:22:20.588Z" }, + { url = "https://files.pythonhosted.org/packages/49/72/ff2d12dbf21aca1b32a40ed792ee6b40f6dc3a9cf1644bd7ef6e95e0ac5e/cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb", size = 218036, upload_time = "2025-09-08T23:22:22.143Z" }, + { url = "https://files.pythonhosted.org/packages/e2/cc/027d7fb82e58c48ea717149b03bcadcbdc293553edb283af792bd4bcbb3f/cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a", size = 172184, upload_time = "2025-09-08T23:22:23.328Z" }, + { url = "https://files.pythonhosted.org/packages/33/fa/072dd15ae27fbb4e06b437eb6e944e75b068deb09e2a2826039e49ee2045/cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739", size = 182790, upload_time = "2025-09-08T23:22:24.752Z" }, + { url = 
"https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload_time = "2025-09-08T23:22:26.456Z" }, + { url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload_time = "2025-09-08T23:22:28.197Z" }, + { url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload_time = "2025-09-08T23:22:29.475Z" }, + { url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload_time = "2025-09-08T23:22:31.063Z" }, + { url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload_time = "2025-09-08T23:22:32.507Z" }, + { url = "https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload_time = "2025-09-08T23:22:34.132Z" }, + { url = 
"https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload_time = "2025-09-08T23:22:35.443Z" }, + { url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload_time = "2025-09-08T23:22:36.805Z" }, + { url = "https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload_time = "2025-09-08T23:22:38.436Z" }, + { url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size = 217078, upload_time = "2025-09-08T23:22:39.776Z" }, + { url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload_time = "2025-09-08T23:22:40.95Z" }, + { url = "https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload_time = "2025-09-08T23:22:42.463Z" }, + { url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash 
= "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload_time = "2025-09-08T23:22:43.623Z" }, + { url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload_time = "2025-09-08T23:22:44.795Z" }, + { url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload_time = "2025-09-08T23:22:45.938Z" }, + { url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload_time = "2025-09-08T23:22:47.349Z" }, + { url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload_time = "2025-09-08T23:22:48.677Z" }, + { url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload_time = "2025-09-08T23:22:50.06Z" }, + { url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = 
"sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload_time = "2025-09-08T23:22:51.364Z" }, + { url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload_time = "2025-09-08T23:22:52.902Z" }, + { url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload_time = "2025-09-08T23:22:54.518Z" }, + { url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload_time = "2025-09-08T23:22:55.867Z" }, + { url = "https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload_time = "2025-09-08T23:22:57.188Z" }, + { url = "https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload_time = "2025-09-08T23:22:58.351Z" }, + { url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload_time = "2025-09-08T23:22:59.668Z" }, + { url = 
"https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload_time = "2025-09-08T23:23:00.879Z" }, + { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload_time = "2025-09-08T23:23:02.231Z" }, + { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload_time = "2025-09-08T23:23:03.472Z" }, + { url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload_time = "2025-09-08T23:23:04.792Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload_time = "2025-09-08T23:23:06.127Z" }, + { url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload_time = "2025-09-08T23:23:07.753Z" }, + { url = 
"https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload_time = "2025-09-08T23:23:09.648Z" }, + { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload_time = "2025-09-08T23:23:10.928Z" }, + { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload_time = "2025-09-08T23:23:12.42Z" }, + { url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload_time = "2025-09-08T23:23:14.32Z" }, + { url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload_time = "2025-09-08T23:23:15.535Z" }, + { url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload_time = "2025-09-08T23:23:16.761Z" }, + { url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash 
= "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload_time = "2025-09-08T23:23:18.087Z" }, + { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload_time = "2025-09-08T23:23:19.622Z" }, + { url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload_time = "2025-09-08T23:23:20.853Z" }, + { url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload_time = "2025-09-08T23:23:22.08Z" }, + { url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload_time = "2025-09-08T23:23:23.314Z" }, + { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload_time = "2025-09-08T23:23:24.541Z" }, + { url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = 
"sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload_time = "2025-09-08T23:23:26.143Z" }, + { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload_time = "2025-09-08T23:23:27.873Z" }, + { url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload_time = "2025-09-08T23:23:44.61Z" }, + { url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload_time = "2025-09-08T23:23:45.848Z" }, + { url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload_time = "2025-09-08T23:23:47.105Z" }, + { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload_time = "2025-09-08T23:23:29.347Z" }, + { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload_time = "2025-09-08T23:23:30.63Z" }, + { url = 
"https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload_time = "2025-09-08T23:23:31.91Z" }, + { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload_time = "2025-09-08T23:23:33.214Z" }, + { url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload_time = "2025-09-08T23:23:34.495Z" }, + { url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload_time = "2025-09-08T23:23:36.096Z" }, + { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload_time = "2025-09-08T23:23:37.328Z" }, + { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload_time = "2025-09-08T23:23:38.945Z" }, + { url = 
"https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload_time = "2025-09-08T23:23:40.423Z" }, + { url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload_time = "2025-09-08T23:23:41.742Z" }, + { url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload_time = "2025-09-08T23:23:43.004Z" }, +] + [[package]] name = "charset-normalizer" version = "3.4.1" @@ -247,14 +396,14 @@ wheels = [ [[package]] name = "click" -version = "8.1.8" +version = "8.3.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "colorama", marker = "sys_platform == 'win32'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593, upload_time = "2024-12-21T18:38:44.339Z" } +sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload_time = "2025-11-15T20:45:42.706Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188, 
upload_time = "2024-12-21T18:38:41.666Z" }, + { url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload_time = "2025-11-15T20:45:41.139Z" }, ] [[package]] @@ -335,6 +484,84 @@ toml = [ { name = "tomli", marker = "python_full_version <= '3.11'" }, ] +[[package]] +name = "cryptography" +version = "46.0.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/60/04/ee2a9e8542e4fa2773b81771ff8349ff19cdd56b7258a0cc442639052edb/cryptography-46.0.5.tar.gz", hash = "sha256:abace499247268e3757271b2f1e244b36b06f8515cf27c4d49468fc9eb16e93d", size = 750064, upload_time = "2026-02-10T19:18:38.255Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/81/b0bb27f2ba931a65409c6b8a8b358a7f03c0e46eceacddff55f7c84b1f3b/cryptography-46.0.5-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:351695ada9ea9618b3500b490ad54c739860883df6c1f555e088eaf25b1bbaad", size = 7176289, upload_time = "2026-02-10T19:17:08.274Z" }, + { url = "https://files.pythonhosted.org/packages/ff/9e/6b4397a3e3d15123de3b1806ef342522393d50736c13b20ec4c9ea6693a6/cryptography-46.0.5-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c18ff11e86df2e28854939acde2d003f7984f721eba450b56a200ad90eeb0e6b", size = 4275637, upload_time = "2026-02-10T19:17:10.53Z" }, + { url = "https://files.pythonhosted.org/packages/63/e7/471ab61099a3920b0c77852ea3f0ea611c9702f651600397ac567848b897/cryptography-46.0.5-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4d7e3d356b8cd4ea5aff04f129d5f66ebdc7b6f8eae802b93739ed520c47c79b", size = 4424742, upload_time = 
"2026-02-10T19:17:12.388Z" }, + { url = "https://files.pythonhosted.org/packages/37/53/a18500f270342d66bf7e4d9f091114e31e5ee9e7375a5aba2e85a91e0044/cryptography-46.0.5-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:50bfb6925eff619c9c023b967d5b77a54e04256c4281b0e21336a130cd7fc263", size = 4277528, upload_time = "2026-02-10T19:17:13.853Z" }, + { url = "https://files.pythonhosted.org/packages/22/29/c2e812ebc38c57b40e7c583895e73c8c5adb4d1e4a0cc4c5a4fdab2b1acc/cryptography-46.0.5-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:803812e111e75d1aa73690d2facc295eaefd4439be1023fefc4995eaea2af90d", size = 4947993, upload_time = "2026-02-10T19:17:15.618Z" }, + { url = "https://files.pythonhosted.org/packages/6b/e7/237155ae19a9023de7e30ec64e5d99a9431a567407ac21170a046d22a5a3/cryptography-46.0.5-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ee190460e2fbe447175cda91b88b84ae8322a104fc27766ad09428754a618ed", size = 4456855, upload_time = "2026-02-10T19:17:17.221Z" }, + { url = "https://files.pythonhosted.org/packages/2d/87/fc628a7ad85b81206738abbd213b07702bcbdada1dd43f72236ef3cffbb5/cryptography-46.0.5-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:f145bba11b878005c496e93e257c1e88f154d278d2638e6450d17e0f31e558d2", size = 3984635, upload_time = "2026-02-10T19:17:18.792Z" }, + { url = "https://files.pythonhosted.org/packages/84/29/65b55622bde135aedf4565dc509d99b560ee4095e56989e815f8fd2aa910/cryptography-46.0.5-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:e9251e3be159d1020c4030bd2e5f84d6a43fe54b6c19c12f51cde9542a2817b2", size = 4277038, upload_time = "2026-02-10T19:17:20.256Z" }, + { url = "https://files.pythonhosted.org/packages/bc/36/45e76c68d7311432741faf1fbf7fac8a196a0a735ca21f504c75d37e2558/cryptography-46.0.5-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:47fb8a66058b80e509c47118ef8a75d14c455e81ac369050f20ba0d23e77fee0", size = 4912181, upload_time = "2026-02-10T19:17:21.825Z" }, + { url = 
"https://files.pythonhosted.org/packages/6d/1a/c1ba8fead184d6e3d5afcf03d569acac5ad063f3ac9fb7258af158f7e378/cryptography-46.0.5-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4c3341037c136030cb46e4b1e17b7418ea4cbd9dd207e4a6f3b2b24e0d4ac731", size = 4456482, upload_time = "2026-02-10T19:17:25.133Z" }, + { url = "https://files.pythonhosted.org/packages/f9/e5/3fb22e37f66827ced3b902cf895e6a6bc1d095b5b26be26bd13c441fdf19/cryptography-46.0.5-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:890bcb4abd5a2d3f852196437129eb3667d62630333aacc13dfd470fad3aaa82", size = 4405497, upload_time = "2026-02-10T19:17:26.66Z" }, + { url = "https://files.pythonhosted.org/packages/1a/df/9d58bb32b1121a8a2f27383fabae4d63080c7ca60b9b5c88be742be04ee7/cryptography-46.0.5-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80a8d7bfdf38f87ca30a5391c0c9ce4ed2926918e017c29ddf643d0ed2778ea1", size = 4667819, upload_time = "2026-02-10T19:17:28.569Z" }, + { url = "https://files.pythonhosted.org/packages/ea/ed/325d2a490c5e94038cdb0117da9397ece1f11201f425c4e9c57fe5b9f08b/cryptography-46.0.5-cp311-abi3-win32.whl", hash = "sha256:60ee7e19e95104d4c03871d7d7dfb3d22ef8a9b9c6778c94e1c8fcc8365afd48", size = 3028230, upload_time = "2026-02-10T19:17:30.518Z" }, + { url = "https://files.pythonhosted.org/packages/e9/5a/ac0f49e48063ab4255d9e3b79f5def51697fce1a95ea1370f03dc9db76f6/cryptography-46.0.5-cp311-abi3-win_amd64.whl", hash = "sha256:38946c54b16c885c72c4f59846be9743d699eee2b69b6988e0a00a01f46a61a4", size = 3480909, upload_time = "2026-02-10T19:17:32.083Z" }, + { url = "https://files.pythonhosted.org/packages/00/13/3d278bfa7a15a96b9dc22db5a12ad1e48a9eb3d40e1827ef66a5df75d0d0/cryptography-46.0.5-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:94a76daa32eb78d61339aff7952ea819b1734b46f73646a07decb40e5b3448e2", size = 7119287, upload_time = "2026-02-10T19:17:33.801Z" }, + { url = 
"https://files.pythonhosted.org/packages/67/c8/581a6702e14f0898a0848105cbefd20c058099e2c2d22ef4e476dfec75d7/cryptography-46.0.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:5be7bf2fb40769e05739dd0046e7b26f9d4670badc7b032d6ce4db64dddc0678", size = 4265728, upload_time = "2026-02-10T19:17:35.569Z" }, + { url = "https://files.pythonhosted.org/packages/dd/4a/ba1a65ce8fc65435e5a849558379896c957870dd64fecea97b1ad5f46a37/cryptography-46.0.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe346b143ff9685e40192a4960938545c699054ba11d4f9029f94751e3f71d87", size = 4408287, upload_time = "2026-02-10T19:17:36.938Z" }, + { url = "https://files.pythonhosted.org/packages/f8/67/8ffdbf7b65ed1ac224d1c2df3943553766914a8ca718747ee3871da6107e/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:c69fd885df7d089548a42d5ec05be26050ebcd2283d89b3d30676eb32ff87dee", size = 4270291, upload_time = "2026-02-10T19:17:38.748Z" }, + { url = "https://files.pythonhosted.org/packages/f8/e5/f52377ee93bc2f2bba55a41a886fd208c15276ffbd2569f2ddc89d50e2c5/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:8293f3dea7fc929ef7240796ba231413afa7b68ce38fd21da2995549f5961981", size = 4927539, upload_time = "2026-02-10T19:17:40.241Z" }, + { url = "https://files.pythonhosted.org/packages/3b/02/cfe39181b02419bbbbcf3abdd16c1c5c8541f03ca8bda240debc467d5a12/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:1abfdb89b41c3be0365328a410baa9df3ff8a9110fb75e7b52e66803ddabc9a9", size = 4442199, upload_time = "2026-02-10T19:17:41.789Z" }, + { url = "https://files.pythonhosted.org/packages/c0/96/2fcaeb4873e536cf71421a388a6c11b5bc846e986b2b069c79363dc1648e/cryptography-46.0.5-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:d66e421495fdb797610a08f43b05269e0a5ea7f5e652a89bfd5a7d3c1dee3648", size = 3960131, upload_time = "2026-02-10T19:17:43.379Z" }, + { url = 
"https://files.pythonhosted.org/packages/d8/d2/b27631f401ddd644e94c5cf33c9a4069f72011821cf3dc7309546b0642a0/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:4e817a8920bfbcff8940ecfd60f23d01836408242b30f1a708d93198393a80b4", size = 4270072, upload_time = "2026-02-10T19:17:45.481Z" }, + { url = "https://files.pythonhosted.org/packages/f4/a7/60d32b0370dae0b4ebe55ffa10e8599a2a59935b5ece1b9f06edb73abdeb/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:68f68d13f2e1cb95163fa3b4db4bf9a159a418f5f6e7242564fc75fcae667fd0", size = 4892170, upload_time = "2026-02-10T19:17:46.997Z" }, + { url = "https://files.pythonhosted.org/packages/d2/b9/cf73ddf8ef1164330eb0b199a589103c363afa0cf794218c24d524a58eab/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a3d1fae9863299076f05cb8a778c467578262fae09f9dc0ee9b12eb4268ce663", size = 4441741, upload_time = "2026-02-10T19:17:48.661Z" }, + { url = "https://files.pythonhosted.org/packages/5f/eb/eee00b28c84c726fe8fa0158c65afe312d9c3b78d9d01daf700f1f6e37ff/cryptography-46.0.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:c4143987a42a2397f2fc3b4d7e3a7d313fbe684f67ff443999e803dd75a76826", size = 4396728, upload_time = "2026-02-10T19:17:50.058Z" }, + { url = "https://files.pythonhosted.org/packages/65/f4/6bc1a9ed5aef7145045114b75b77c2a8261b4d38717bd8dea111a63c3442/cryptography-46.0.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:7d731d4b107030987fd61a7f8ab512b25b53cef8f233a97379ede116f30eb67d", size = 4652001, upload_time = "2026-02-10T19:17:51.54Z" }, + { url = "https://files.pythonhosted.org/packages/86/ef/5d00ef966ddd71ac2e6951d278884a84a40ffbd88948ef0e294b214ae9e4/cryptography-46.0.5-cp314-cp314t-win32.whl", hash = "sha256:c3bcce8521d785d510b2aad26ae2c966092b7daa8f45dd8f44734a104dc0bc1a", size = 3003637, upload_time = "2026-02-10T19:17:52.997Z" }, + { url = 
"https://files.pythonhosted.org/packages/b7/57/f3f4160123da6d098db78350fdfd9705057aad21de7388eacb2401dceab9/cryptography-46.0.5-cp314-cp314t-win_amd64.whl", hash = "sha256:4d8ae8659ab18c65ced284993c2265910f6c9e650189d4e3f68445ef82a810e4", size = 3469487, upload_time = "2026-02-10T19:17:54.549Z" }, + { url = "https://files.pythonhosted.org/packages/e2/fa/a66aa722105ad6a458bebd64086ca2b72cdd361fed31763d20390f6f1389/cryptography-46.0.5-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:4108d4c09fbbf2789d0c926eb4152ae1760d5a2d97612b92d508d96c861e4d31", size = 7170514, upload_time = "2026-02-10T19:17:56.267Z" }, + { url = "https://files.pythonhosted.org/packages/0f/04/c85bdeab78c8bc77b701bf0d9bdcf514c044e18a46dcff330df5448631b0/cryptography-46.0.5-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1f30a86d2757199cb2d56e48cce14deddf1f9c95f1ef1b64ee91ea43fe2e18", size = 4275349, upload_time = "2026-02-10T19:17:58.419Z" }, + { url = "https://files.pythonhosted.org/packages/5c/32/9b87132a2f91ee7f5223b091dc963055503e9b442c98fc0b8a5ca765fab0/cryptography-46.0.5-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:039917b0dc418bb9f6edce8a906572d69e74bd330b0b3fea4f79dab7f8ddd235", size = 4420667, upload_time = "2026-02-10T19:18:00.619Z" }, + { url = "https://files.pythonhosted.org/packages/a1/a6/a7cb7010bec4b7c5692ca6f024150371b295ee1c108bdc1c400e4c44562b/cryptography-46.0.5-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:ba2a27ff02f48193fc4daeadf8ad2590516fa3d0adeeb34336b96f7fa64c1e3a", size = 4276980, upload_time = "2026-02-10T19:18:02.379Z" }, + { url = "https://files.pythonhosted.org/packages/8e/7c/c4f45e0eeff9b91e3f12dbd0e165fcf2a38847288fcfd889deea99fb7b6d/cryptography-46.0.5-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:61aa400dce22cb001a98014f647dc21cda08f7915ceb95df0c9eaf84b4b6af76", size = 4939143, upload_time = "2026-02-10T19:18:03.964Z" }, + { url = 
"https://files.pythonhosted.org/packages/37/19/e1b8f964a834eddb44fa1b9a9976f4e414cbb7aa62809b6760c8803d22d1/cryptography-46.0.5-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ce58ba46e1bc2aac4f7d9290223cead56743fa6ab94a5d53292ffaac6a91614", size = 4453674, upload_time = "2026-02-10T19:18:05.588Z" }, + { url = "https://files.pythonhosted.org/packages/db/ed/db15d3956f65264ca204625597c410d420e26530c4e2943e05a0d2f24d51/cryptography-46.0.5-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:420d0e909050490d04359e7fdb5ed7e667ca5c3c402b809ae2563d7e66a92229", size = 3978801, upload_time = "2026-02-10T19:18:07.167Z" }, + { url = "https://files.pythonhosted.org/packages/41/e2/df40a31d82df0a70a0daf69791f91dbb70e47644c58581d654879b382d11/cryptography-46.0.5-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:582f5fcd2afa31622f317f80426a027f30dc792e9c80ffee87b993200ea115f1", size = 4276755, upload_time = "2026-02-10T19:18:09.813Z" }, + { url = "https://files.pythonhosted.org/packages/33/45/726809d1176959f4a896b86907b98ff4391a8aa29c0aaaf9450a8a10630e/cryptography-46.0.5-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:bfd56bb4b37ed4f330b82402f6f435845a5f5648edf1ad497da51a8452d5d62d", size = 4901539, upload_time = "2026-02-10T19:18:11.263Z" }, + { url = "https://files.pythonhosted.org/packages/99/0f/a3076874e9c88ecb2ecc31382f6e7c21b428ede6f55aafa1aa272613e3cd/cryptography-46.0.5-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:a3d507bb6a513ca96ba84443226af944b0f7f47dcc9a399d110cd6146481d24c", size = 4452794, upload_time = "2026-02-10T19:18:12.914Z" }, + { url = "https://files.pythonhosted.org/packages/02/ef/ffeb542d3683d24194a38f66ca17c0a4b8bf10631feef44a7ef64e631b1a/cryptography-46.0.5-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9f16fbdf4da055efb21c22d81b89f155f02ba420558db21288b3d0035bafd5f4", size = 4404160, upload_time = "2026-02-10T19:18:14.375Z" }, + { url = 
"https://files.pythonhosted.org/packages/96/93/682d2b43c1d5f1406ed048f377c0fc9fc8f7b0447a478d5c65ab3d3a66eb/cryptography-46.0.5-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:ced80795227d70549a411a4ab66e8ce307899fad2220ce5ab2f296e687eacde9", size = 4667123, upload_time = "2026-02-10T19:18:15.886Z" }, + { url = "https://files.pythonhosted.org/packages/45/2d/9c5f2926cb5300a8eefc3f4f0b3f3df39db7f7ce40c8365444c49363cbda/cryptography-46.0.5-cp38-abi3-win32.whl", hash = "sha256:02f547fce831f5096c9a567fd41bc12ca8f11df260959ecc7c3202555cc47a72", size = 3010220, upload_time = "2026-02-10T19:18:17.361Z" }, + { url = "https://files.pythonhosted.org/packages/48/ef/0c2f4a8e31018a986949d34a01115dd057bf536905dca38897bacd21fac3/cryptography-46.0.5-cp38-abi3-win_amd64.whl", hash = "sha256:556e106ee01aa13484ce9b0239bca667be5004efb0aabbed28d353df86445595", size = 3467050, upload_time = "2026-02-10T19:18:18.899Z" }, + { url = "https://files.pythonhosted.org/packages/eb/dd/2d9fdb07cebdf3d51179730afb7d5e576153c6744c3ff8fded23030c204e/cryptography-46.0.5-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:3b4995dc971c9fb83c25aa44cf45f02ba86f71ee600d81091c2f0cbae116b06c", size = 3476964, upload_time = "2026-02-10T19:18:20.687Z" }, + { url = "https://files.pythonhosted.org/packages/e9/6f/6cc6cc9955caa6eaf83660b0da2b077c7fe8ff9950a3c5e45d605038d439/cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:bc84e875994c3b445871ea7181d424588171efec3e185dced958dad9e001950a", size = 4218321, upload_time = "2026-02-10T19:18:22.349Z" }, + { url = "https://files.pythonhosted.org/packages/3e/5d/c4da701939eeee699566a6c1367427ab91a8b7088cc2328c09dbee940415/cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:2ae6971afd6246710480e3f15824ed3029a60fc16991db250034efd0b9fb4356", size = 4381786, upload_time = "2026-02-10T19:18:24.529Z" }, + { url = 
"https://files.pythonhosted.org/packages/ac/97/a538654732974a94ff96c1db621fa464f455c02d4bb7d2652f4edc21d600/cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:d861ee9e76ace6cf36a6a89b959ec08e7bc2493ee39d07ffe5acb23ef46d27da", size = 4217990, upload_time = "2026-02-10T19:18:25.957Z" }, + { url = "https://files.pythonhosted.org/packages/ae/11/7e500d2dd3ba891197b9efd2da5454b74336d64a7cc419aa7327ab74e5f6/cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:2b7a67c9cd56372f3249b39699f2ad479f6991e62ea15800973b956f4b73e257", size = 4381252, upload_time = "2026-02-10T19:18:27.496Z" }, + { url = "https://files.pythonhosted.org/packages/bc/58/6b3d24e6b9bc474a2dcdee65dfd1f008867015408a271562e4b690561a4d/cryptography-46.0.5-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:8456928655f856c6e1533ff59d5be76578a7157224dbd9ce6872f25055ab9ab7", size = 3407605, upload_time = "2026-02-10T19:18:29.233Z" }, +] + +[[package]] +name = "distro" +version = "1.9.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fc/f8/98eea607f65de6527f8a2e8885fc8015d3e6f5775df186e443e0964a11c3/distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed", size = 60722, upload_time = "2023-12-24T09:54:32.31Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload_time = "2023-12-24T09:54:30.421Z" }, +] + +[[package]] +name = "docstring-parser" +version = "0.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = 
"sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload_time = "2025-07-21T07:35:01.868Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload_time = "2025-07-21T07:35:00.684Z" }, +] + [[package]] name = "events" version = "0.5" @@ -446,26 +673,87 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/71/3e/b04a0adda73bd52b390d730071c0d577073d3d26740ee1bad25c3ad0f37b/frozenlist-1.6.0-py3-none-any.whl", hash = "sha256:535eec9987adb04701266b92745d6cdcef2e77669299359c3009c3404dd5d191", size = 12404, upload_time = "2025-04-17T22:38:51.668Z" }, ] +[[package]] +name = "grpcio" +version = "1.78.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/1f/de/de568532d9907552700f80dcec38219d8d298ad9e71f5e0a095abaf2761e/grpcio-1.78.1.tar.gz", hash = "sha256:27c625532d33ace45d57e775edf1982e183ff8641c72e4e91ef7ba667a149d72", size = 12835760, upload_time = "2026-02-20T01:16:10.869Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/44/30/0534b643dafd54824769d6260b89c71d518e4ef8b5ad16b84d1ae9272978/grpcio-1.78.1-cp310-cp310-linux_armv7l.whl", hash = "sha256:4393bef64cf26dc07cd6f18eaa5170ae4eebaafd4418e7e3a59ca9526a6fa30b", size = 5947661, upload_time = "2026-02-20T01:12:34.922Z" }, + { url = "https://files.pythonhosted.org/packages/4a/f8/f678566655ab822da0f713789555e7eddca7ef93da99f480c63de3aa94b4/grpcio-1.78.1-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:917047c19cd120b40aab9a4b8a22e9ce3562f4a1343c0d62b3cd2d5199da3d67", size = 11819948, upload_time = "2026-02-20T01:12:39.709Z" }, + { url = 
"https://files.pythonhosted.org/packages/ff/0b/a4b4210d946055f4e5a8430f2802202ae8f831b4b00d36d55055c5cf4b6a/grpcio-1.78.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ff7de398bb3528d44d17e6913a7cfe639e3b15c65595a71155322df16978c5e1", size = 6519850, upload_time = "2026-02-20T01:12:42.715Z" }, + { url = "https://files.pythonhosted.org/packages/ea/d9/a1e657a73000a71fa75ec7140ff3a8dc32eb3427560620e477c6a2735527/grpcio-1.78.1-cp310-cp310-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:15f6e636d1152667ddb4022b37534c161c8477274edb26a0b65b215dd0a81e97", size = 7198654, upload_time = "2026-02-20T01:12:46.164Z" }, + { url = "https://files.pythonhosted.org/packages/aa/28/a61c5bdf53c1638e657bb5eebb93c789837820e1fdb965145f05eccc2994/grpcio-1.78.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:27b5cb669603efb7883a882275db88b6b5d6b6c9f0267d5846ba8699b7ace338", size = 6727238, upload_time = "2026-02-20T01:12:48.472Z" }, + { url = "https://files.pythonhosted.org/packages/9d/3e/aa143d0687801986a29d85788c96089449f36651cd4e2a493737ae0c5be9/grpcio-1.78.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:86edb3966778fa05bfdb333688fde5dc9079f9e2a9aa6a5c42e9564b7656ba04", size = 7300960, upload_time = "2026-02-20T01:12:51.139Z" }, + { url = "https://files.pythonhosted.org/packages/30/d3/53e0f26b46417f28d14b5951fc6a1eff79c08c8a339e967c0a19ec7cf9e9/grpcio-1.78.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:849cc62eb989bc3be5629d4f3acef79be0d0ff15622201ed251a86d17fef6494", size = 8285274, upload_time = "2026-02-20T01:12:53.315Z" }, + { url = "https://files.pythonhosted.org/packages/29/d0/e0e9fd477ce86c07ed1ed1d5c34790f050b6d58bfde77b02b36e23f8b235/grpcio-1.78.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:9a00992d6fafe19d648b9ccb4952200c50d8e36d0cce8cf026c56ed3fdc28465", size = 7726620, upload_time = "2026-02-20T01:12:56.498Z" }, + { url = 
"https://files.pythonhosted.org/packages/5e/b5/e138a9f7810d196081b2e047c378ca12358c5906d79c42ddec41bb43d528/grpcio-1.78.1-cp310-cp310-win32.whl", hash = "sha256:f8759a1347f3b4f03d9a9d4ce8f9f31ad5e5d0144ba06ccfb1ffaeb0ba4c1e20", size = 4076778, upload_time = "2026-02-20T01:12:59.098Z" }, + { url = "https://files.pythonhosted.org/packages/4e/95/9b02316b85731df0943a635ca6d02f155f673c4f17e60be0c4892a6eb051/grpcio-1.78.1-cp310-cp310-win_amd64.whl", hash = "sha256:e840405a3f1249509892be2399f668c59b9d492068a2cf326d661a8c79e5e747", size = 4798925, upload_time = "2026-02-20T01:13:03.186Z" }, + { url = "https://files.pythonhosted.org/packages/bf/1e/ad774af3b2c84f49c6d8c4a7bea4c40f02268ea8380630c28777edda463b/grpcio-1.78.1-cp311-cp311-linux_armv7l.whl", hash = "sha256:3a8aa79bc6e004394c0abefd4b034c14affda7b66480085d87f5fbadf43b593b", size = 5951132, upload_time = "2026-02-20T01:13:05.942Z" }, + { url = "https://files.pythonhosted.org/packages/48/9d/ad3c284bedd88c545e20675d98ae904114d8517a71b0efc0901e9166628f/grpcio-1.78.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:8e1fcb419da5811deb47b7749b8049f7c62b993ba17822e3c7231e3e0ba65b79", size = 11831052, upload_time = "2026-02-20T01:13:09.604Z" }, + { url = "https://files.pythonhosted.org/packages/6d/08/20d12865e47242d03c3ade9bb2127f5b4aded964f373284cfb357d47c5ac/grpcio-1.78.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b071dccac245c32cd6b1dd96b722283b855881ca0bf1c685cf843185f5d5d51e", size = 6524749, upload_time = "2026-02-20T01:13:21.692Z" }, + { url = "https://files.pythonhosted.org/packages/c6/53/a8b72f52b253ec0cfdf88a13e9236a9d717c332b8aa5f0ba9e4699e94b55/grpcio-1.78.1-cp311-cp311-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:d6fb962947e4fe321eeef3be1ba5ba49d32dea9233c825fcbade8e858c14aaf4", size = 7198995, upload_time = "2026-02-20T01:13:24.275Z" }, + { url = 
"https://files.pythonhosted.org/packages/13/3c/ac769c8ded1bcb26bb119fb472d3374b481b3cf059a0875db9fc77139c17/grpcio-1.78.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:a6afd191551fd72e632367dfb083e33cd185bf9ead565f2476bba8ab864ae496", size = 6730770, upload_time = "2026-02-20T01:13:26.522Z" }, + { url = "https://files.pythonhosted.org/packages/dc/c3/2275ef4cc5b942314321f77d66179be4097ff484e82ca34bf7baa5b1ddbc/grpcio-1.78.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b2acd83186305c0802dbc4d81ed0ec2f3e8658d7fde97cfba2f78d7372f05b89", size = 7305036, upload_time = "2026-02-20T01:13:30.923Z" }, + { url = "https://files.pythonhosted.org/packages/91/cb/3c2aa99e12cbbfc72c2ed8aa328e6041709d607d668860380e6cd00ba17d/grpcio-1.78.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5380268ab8513445740f1f77bd966d13043d07e2793487e61fd5b5d0935071eb", size = 8288641, upload_time = "2026-02-20T01:13:39.42Z" }, + { url = "https://files.pythonhosted.org/packages/0d/b2/21b89f492260ac645775d9973752ca873acfd0609d6998e9d3065a21ea2f/grpcio-1.78.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:389b77484959bdaad6a2b7dda44d7d1228381dd669a03f5660392aa0e9385b22", size = 7730967, upload_time = "2026-02-20T01:13:41.697Z" }, + { url = "https://files.pythonhosted.org/packages/24/03/6b89eddf87fdffb8fa9d37375d44d3a798f4b8116ac363a5f7ca84caa327/grpcio-1.78.1-cp311-cp311-win32.whl", hash = "sha256:9dee66d142f4a8cca36b5b98a38f006419138c3c89e72071747f8fca415a6d8f", size = 4076680, upload_time = "2026-02-20T01:13:43.781Z" }, + { url = "https://files.pythonhosted.org/packages/a7/a8/204460b1bc1dff9862e98f56a2d14be3c4171f929f8eaf8c4517174b4270/grpcio-1.78.1-cp311-cp311-win_amd64.whl", hash = "sha256:43b930cf4f9c4a2262bb3e5d5bc40df426a72538b4f98e46f158b7eb112d2d70", size = 4801074, upload_time = "2026-02-20T01:13:46.315Z" }, + { url = 
"https://files.pythonhosted.org/packages/ab/ed/d2eb9d27fded1a76b2a80eb9aa8b12101da7e41ce2bac0ad3651e88a14ae/grpcio-1.78.1-cp312-cp312-linux_armv7l.whl", hash = "sha256:41e4605c923e0e9a84a2718e4948a53a530172bfaf1a6d1ded16ef9c5849fca2", size = 5913389, upload_time = "2026-02-20T01:13:49.005Z" }, + { url = "https://files.pythonhosted.org/packages/69/1b/40034e9ab010eeb3fa41ec61d8398c6dbf7062f3872c866b8f72700e2522/grpcio-1.78.1-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:39da1680d260c0c619c3b5fa2dc47480ca24d5704c7a548098bca7de7f5dd17f", size = 11811839, upload_time = "2026-02-20T01:13:51.839Z" }, + { url = "https://files.pythonhosted.org/packages/b4/69/fe16ef2979ea62b8aceb3a3f1e7a8bbb8b717ae2a44b5899d5d426073273/grpcio-1.78.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b5d5881d72a09b8336a8f874784a8eeffacde44a7bc1a148bce5a0243a265ef0", size = 6475805, upload_time = "2026-02-20T01:13:55.423Z" }, + { url = "https://files.pythonhosted.org/packages/5b/1e/069e0a9062167db18446917d7c00ae2e91029f96078a072bedc30aaaa8c3/grpcio-1.78.1-cp312-cp312-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:888ceb7821acd925b1c90f0cdceaed1386e69cfe25e496e0771f6c35a156132f", size = 7169955, upload_time = "2026-02-20T01:13:59.553Z" }, + { url = "https://files.pythonhosted.org/packages/38/fc/44a57e2bb4a755e309ee4e9ed2b85c9af93450b6d3118de7e69410ee05fa/grpcio-1.78.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8942bdfc143b467c264b048862090c4ba9a0223c52ae28c9ae97754361372e42", size = 6690767, upload_time = "2026-02-20T01:14:02.31Z" }, + { url = "https://files.pythonhosted.org/packages/b8/87/21e16345d4c75046d453916166bc72a3309a382c8e97381ec4b8c1a54729/grpcio-1.78.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:716a544969660ed609164aff27b2effd3ff84e54ac81aa4ce77b1607ca917d22", size = 7266846, upload_time = "2026-02-20T01:14:12.974Z" }, + { url = 
"https://files.pythonhosted.org/packages/11/df/d6261983f9ca9ef4d69893765007a9a3211b91d9faf85a2591063df381c7/grpcio-1.78.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4d50329b081c223d444751076bb5b389d4f06c2b32d51b31a1e98172e6cecfb9", size = 8253522, upload_time = "2026-02-20T01:14:17.407Z" }, + { url = "https://files.pythonhosted.org/packages/de/7c/4f96a0ff113c5d853a27084d7590cd53fdb05169b596ea9f5f27f17e021e/grpcio-1.78.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7e836778c13ff70edada16567e8da0c431e8818eaae85b80d11c1ba5782eccbb", size = 7698070, upload_time = "2026-02-20T01:14:20.032Z" }, + { url = "https://files.pythonhosted.org/packages/17/3c/7b55c0b5af88fbeb3d0c13e25492d3ace41ac9dbd0f5f8f6c0fb613b6706/grpcio-1.78.1-cp312-cp312-win32.whl", hash = "sha256:07eb016ea7444a22bef465cce045512756956433f54450aeaa0b443b8563b9ca", size = 4066474, upload_time = "2026-02-20T01:14:22.602Z" }, + { url = "https://files.pythonhosted.org/packages/5d/17/388c12d298901b0acf10b612b650692bfed60e541672b1d8965acbf2d722/grpcio-1.78.1-cp312-cp312-win_amd64.whl", hash = "sha256:02b82dcd2fa580f5e82b4cf62ecde1b3c7cc9ba27b946421200706a6e5acaf85", size = 4797537, upload_time = "2026-02-20T01:14:25.444Z" }, + { url = "https://files.pythonhosted.org/packages/df/72/754754639cfd16ad04619e1435a518124b2d858e5752225376f9285d4c51/grpcio-1.78.1-cp313-cp313-linux_armv7l.whl", hash = "sha256:2b7ad2981550ce999e25ce3f10c8863f718a352a2fd655068d29ea3fd37b4907", size = 5919437, upload_time = "2026-02-20T01:14:29.403Z" }, + { url = "https://files.pythonhosted.org/packages/5c/84/6267d1266f8bc335d3a8b7ccf981be7de41e3ed8bd3a49e57e588212b437/grpcio-1.78.1-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:409bfe22220889b9906739910a0ee4c197a967c21b8dd14b4b06dd477f8819ce", size = 11803701, upload_time = "2026-02-20T01:14:32.624Z" }, + { url = 
"https://files.pythonhosted.org/packages/f3/56/c9098e8b920a54261cd605bbb040de0cde1ca4406102db0aa2c0b11d1fb4/grpcio-1.78.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:34b6cb16f4b67eeb5206250dc5b4d5e8e3db939535e58efc330e4c61341554bd", size = 6479416, upload_time = "2026-02-20T01:14:35.926Z" }, + { url = "https://files.pythonhosted.org/packages/86/cf/5d52024371ee62658b7ed72480200524087528844ec1b65265bbcd31c974/grpcio-1.78.1-cp313-cp313-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:39d21fd30d38a5afb93f0e2e71e2ec2bd894605fb75d41d5a40060c2f98f8d11", size = 7174087, upload_time = "2026-02-20T01:14:39.98Z" }, + { url = "https://files.pythonhosted.org/packages/31/e6/5e59551afad4279e27335a6d60813b8aa3ae7b14fb62cea1d329a459c118/grpcio-1.78.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:09fbd4bcaadb6d8604ed1504b0bdf7ac18e48467e83a9d930a70a7fefa27e862", size = 6692881, upload_time = "2026-02-20T01:14:42.466Z" }, + { url = "https://files.pythonhosted.org/packages/db/8f/940062de2d14013c02f51b079eb717964d67d46f5d44f22038975c9d9576/grpcio-1.78.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:db681513a1bdd879c0b24a5a6a70398da5eaaba0e077a306410dc6008426847a", size = 7269092, upload_time = "2026-02-20T01:14:45.826Z" }, + { url = "https://files.pythonhosted.org/packages/09/87/9db657a4b5f3b15560ec591db950bc75a1a2f9e07832578d7e2b23d1a7bd/grpcio-1.78.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:f81816faa426da461e9a597a178832a351d6f1078102590a4b32c77d251b71eb", size = 8252037, upload_time = "2026-02-20T01:14:48.57Z" }, + { url = "https://files.pythonhosted.org/packages/e2/37/b980e0265479ec65e26b6e300a39ceac33ecb3f762c2861d4bac990317cf/grpcio-1.78.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffbb760df1cd49e0989f9826b2fd48930700db6846ac171eaff404f3cfbe5c28", size = 7695243, upload_time = "2026-02-20T01:14:51.376Z" }, + { url = 
"https://files.pythonhosted.org/packages/98/46/5fc42c100ab702fa1ea41a75c890c563c3f96432b4a287d5a6369654f323/grpcio-1.78.1-cp313-cp313-win32.whl", hash = "sha256:1a56bf3ee99af5cf32d469de91bf5de79bdac2e18082b495fc1063ea33f4f2d0", size = 4065329, upload_time = "2026-02-20T01:14:53.952Z" }, + { url = "https://files.pythonhosted.org/packages/b0/da/806d60bb6611dfc16cf463d982bd92bd8b6bd5f87dfac66b0a44dfe20995/grpcio-1.78.1-cp313-cp313-win_amd64.whl", hash = "sha256:8991c2add0d8505178ff6c3ae54bd9386279e712be82fa3733c54067aae9eda1", size = 4797637, upload_time = "2026-02-20T01:14:57.276Z" }, + { url = "https://files.pythonhosted.org/packages/96/3a/2d2ec4d2ce2eb9d6a2b862630a0d9d4ff4239ecf1474ecff21442a78612a/grpcio-1.78.1-cp314-cp314-linux_armv7l.whl", hash = "sha256:d101fe49b1e0fb4a7aa36ed0c3821a0f67a5956ef572745452d2cd790d723a3f", size = 5920256, upload_time = "2026-02-20T01:15:00.23Z" }, + { url = "https://files.pythonhosted.org/packages/9c/92/dccb7d087a1220ed358753945230c1ddeeed13684b954cb09db6758f1271/grpcio-1.78.1-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:5ce1855e8cfc217cdf6bcfe0cf046d7cf81ddcc3e6894d6cfd075f87a2d8f460", size = 11813749, upload_time = "2026-02-20T01:15:03.312Z" }, + { url = "https://files.pythonhosted.org/packages/ef/47/c20e87f87986da9998f30f14776ce27e61f02482a3a030ffe265089342c6/grpcio-1.78.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd26048d066b51f39fe9206e2bcc2cea869a5e5b2d13c8d523f4179193047ebd", size = 6488739, upload_time = "2026-02-20T01:15:14.349Z" }, + { url = "https://files.pythonhosted.org/packages/a6/c2/088bd96e255133d7d87c3eed0d598350d16cde1041bdbe2bb065967aaf91/grpcio-1.78.1-cp314-cp314-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:4b8d7fda614cf2af0f73bbb042f3b7fee2ecd4aea69ec98dbd903590a1083529", size = 7173096, upload_time = "2026-02-20T01:15:17.687Z" }, + { url = 
"https://files.pythonhosted.org/packages/60/ce/168db121073a03355ce3552b3b1f790b5ded62deffd7d98c5f642b9d3d81/grpcio-1.78.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:656a5bd142caeb8b1efe1fe0b4434ecc7781f44c97cfc7927f6608627cf178c0", size = 6693861, upload_time = "2026-02-20T01:15:20.911Z" }, + { url = "https://files.pythonhosted.org/packages/ae/d0/90b30ec2d9425215dd56922d85a90babbe6ee7e8256ba77d866b9c0d3aba/grpcio-1.78.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:99550e344482e3c21950c034f74668fccf8a546d50c1ecb4f717543bbdc071ba", size = 7278083, upload_time = "2026-02-20T01:15:23.698Z" }, + { url = "https://files.pythonhosted.org/packages/c1/fb/73f9ba0b082bcd385d46205095fd9c917754685885b28fce3741e9f54529/grpcio-1.78.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:8f27683ca68359bd3f0eb4925824d71e538f84338b3ae337ead2ae43977d7541", size = 8252546, upload_time = "2026-02-20T01:15:26.517Z" }, + { url = "https://files.pythonhosted.org/packages/85/c5/6a89ea3cb5db6c3d9ed029b0396c49f64328c0cf5d2630ffeed25711920a/grpcio-1.78.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:a40515b69ac50792f9b8ead260f194ba2bb3285375b6c40c7ff938f14c3df17d", size = 7696289, upload_time = "2026-02-20T01:15:29.718Z" }, + { url = "https://files.pythonhosted.org/packages/3d/05/63a7495048499ef437b4933d32e59b7f737bd5368ad6fb2479e2bd83bf2c/grpcio-1.78.1-cp314-cp314-win32.whl", hash = "sha256:2c473b54ef1618f4fb85e82ff4994de18143b74efc088b91b5a935a3a45042ba", size = 4142186, upload_time = "2026-02-20T01:15:32.786Z" }, + { url = "https://files.pythonhosted.org/packages/1c/ce/adfe7e5f701d503be7778291757452e3fab6b19acf51917c79f5d1cf7f8a/grpcio-1.78.1-cp314-cp314-win_amd64.whl", hash = "sha256:e2a6b33d1050dce2c6f563c5caf7f7cbeebf7fba8cde37ffe3803d50526900d1", size = 4932000, upload_time = "2026-02-20T01:15:36.127Z" }, +] + [[package]] name = "h11" -version = "0.14.0" +version = "0.16.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418, upload_time = "2022-09-25T15:40:01.519Z" } +sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload_time = "2025-04-24T03:35:25.427Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259, upload_time = "2022-09-25T15:39:59.68Z" }, + { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload_time = "2025-04-24T03:35:24.344Z" }, ] [[package]] name = "httpcore" -version = "1.0.8" +version = "1.0.9" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "certifi" }, { name = "h11" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/9f/45/ad3e1b4d448f22c0cff4f5692f5ed0666658578e358b8d58a19846048059/httpcore-1.0.8.tar.gz", hash = "sha256:86e94505ed24ea06514883fd44d2bc02d90e77e7979c8eb71b90f41d364a1bad", size = 85385, upload_time = "2025-04-11T14:42:46.661Z" } +sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload_time = "2025-04-24T22:06:22.219Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/18/8d/f052b1e336bb2c1fc7ed1aaed898aa570c0b61a09707b108979d9fc6e308/httpcore-1.0.8-py3-none-any.whl", hash = "sha256:5254cf149bcb5f75e9d1b2b9f729ea4a4b883d1ad7379fc632b727cec23674be", size = 78732, upload_time = "2025-04-11T14:42:44.896Z" }, + { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload_time = "2025-04-24T22:06:20.566Z" }, ] [[package]] @@ -510,6 +798,103 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload_time = "2025-03-19T20:10:01.071Z" }, ] +[[package]] +name = "jiter" +version = "0.13.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/0d/5e/4ec91646aee381d01cdb9974e30882c9cd3b8c5d1079d6b5ff4af522439a/jiter-0.13.0.tar.gz", hash = "sha256:f2839f9c2c7e2dffc1bc5929a510e14ce0a946be9365fd1219e7ef342dae14f4", size = 164847, upload_time = "2026-02-02T12:37:56.441Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d0/5a/41da76c5ea07bec1b0472b6b2fdb1b651074d504b19374d7e130e0cdfb25/jiter-0.13.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2ffc63785fd6c7977defe49b9824ae6ce2b2e2b77ce539bdaf006c26da06342e", size = 311164, upload_time = "2026-02-02T12:35:17.688Z" }, + { url = "https://files.pythonhosted.org/packages/40/cb/4a1bf994a3e869f0d39d10e11efb471b76d0ad70ecbfb591427a46c880c2/jiter-0.13.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4a638816427006c1e3f0013eb66d391d7a3acda99a7b0cf091eff4497ccea33a", size = 320296, upload_time = "2026-02-02T12:35:19.828Z" }, + { url = 
"https://files.pythonhosted.org/packages/09/82/acd71ca9b50ecebadc3979c541cd717cce2fe2bc86236f4fa597565d8f1a/jiter-0.13.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19928b5d1ce0ff8c1ee1b9bdef3b5bfc19e8304f1b904e436caf30bc15dc6cf5", size = 352742, upload_time = "2026-02-02T12:35:21.258Z" }, + { url = "https://files.pythonhosted.org/packages/71/03/d1fc996f3aecfd42eb70922edecfb6dd26421c874503e241153ad41df94f/jiter-0.13.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:309549b778b949d731a2f0e1594a3f805716be704a73bf3ad9a807eed5eb5721", size = 363145, upload_time = "2026-02-02T12:35:24.653Z" }, + { url = "https://files.pythonhosted.org/packages/f1/61/a30492366378cc7a93088858f8991acd7d959759fe6138c12a4644e58e81/jiter-0.13.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bcdabaea26cb04e25df3103ce47f97466627999260290349a88c8136ecae0060", size = 487683, upload_time = "2026-02-02T12:35:26.162Z" }, + { url = "https://files.pythonhosted.org/packages/20/4e/4223cffa9dbbbc96ed821c5aeb6bca510848c72c02086d1ed3f1da3d58a7/jiter-0.13.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a3a377af27b236abbf665a69b2bdd680e3b5a0bd2af825cd3b81245279a7606c", size = 373579, upload_time = "2026-02-02T12:35:27.582Z" }, + { url = "https://files.pythonhosted.org/packages/fe/c9/b0489a01329ab07a83812d9ebcffe7820a38163c6d9e7da644f926ff877c/jiter-0.13.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fe49d3ff6db74321f144dff9addd4a5874d3105ac5ba7c5b77fac099cfae31ae", size = 362904, upload_time = "2026-02-02T12:35:28.925Z" }, + { url = "https://files.pythonhosted.org/packages/05/af/53e561352a44afcba9a9bc67ee1d320b05a370aed8df54eafe714c4e454d/jiter-0.13.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2113c17c9a67071b0f820733c0893ed1d467b5fcf4414068169e5c2cabddb1e2", size = 392380, upload_time = "2026-02-02T12:35:30.385Z" }, + { url = 
"https://files.pythonhosted.org/packages/76/2a/dd805c3afb8ed5b326c5ae49e725d1b1255b9754b1b77dbecdc621b20773/jiter-0.13.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ab1185ca5c8b9491b55ebf6c1e8866b8f68258612899693e24a92c5fdb9455d5", size = 517939, upload_time = "2026-02-02T12:35:31.865Z" }, + { url = "https://files.pythonhosted.org/packages/20/2a/7b67d76f55b8fe14c937e7640389612f05f9a4145fc28ae128aaa5e62257/jiter-0.13.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:9621ca242547edc16400981ca3231e0c91c0c4c1ab8573a596cd9bb3575d5c2b", size = 551696, upload_time = "2026-02-02T12:35:33.306Z" }, + { url = "https://files.pythonhosted.org/packages/85/9c/57cdd64dac8f4c6ab8f994fe0eb04dc9fd1db102856a4458fcf8a99dfa62/jiter-0.13.0-cp310-cp310-win32.whl", hash = "sha256:a7637d92b1c9d7a771e8c56f445c7f84396d48f2e756e5978840ecba2fac0894", size = 204592, upload_time = "2026-02-02T12:35:34.58Z" }, + { url = "https://files.pythonhosted.org/packages/a7/38/f4f3ea5788b8a5bae7510a678cdc747eda0c45ffe534f9878ff37e7cf3b3/jiter-0.13.0-cp310-cp310-win_amd64.whl", hash = "sha256:c1b609e5cbd2f52bb74fb721515745b407df26d7b800458bd97cb3b972c29e7d", size = 206016, upload_time = "2026-02-02T12:35:36.435Z" }, + { url = "https://files.pythonhosted.org/packages/71/29/499f8c9eaa8a16751b1c0e45e6f5f1761d180da873d417996cc7bddc8eef/jiter-0.13.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ea026e70a9a28ebbdddcbcf0f1323128a8db66898a06eaad3a4e62d2f554d096", size = 311157, upload_time = "2026-02-02T12:35:37.758Z" }, + { url = "https://files.pythonhosted.org/packages/50/f6/566364c777d2ab450b92100bea11333c64c38d32caf8dc378b48e5b20c46/jiter-0.13.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:66aa3e663840152d18cc8ff1e4faad3dd181373491b9cfdc6004b92198d67911", size = 319729, upload_time = "2026-02-02T12:35:39.246Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/dd/560f13ec5e4f116d8ad2658781646cca91b617ae3b8758d4a5076b278f70/jiter-0.13.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c3524798e70655ff19aec58c7d05adb1f074fecff62da857ea9be2b908b6d701", size = 354766, upload_time = "2026-02-02T12:35:40.662Z" }, + { url = "https://files.pythonhosted.org/packages/7c/0d/061faffcfe94608cbc28a0d42a77a74222bdf5055ccdbe5fd2292b94f510/jiter-0.13.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ec7e287d7fbd02cb6e22f9a00dd9c9cd504c40a61f2c61e7e1f9690a82726b4c", size = 362587, upload_time = "2026-02-02T12:35:42.025Z" }, + { url = "https://files.pythonhosted.org/packages/92/c9/c66a7864982fd38a9773ec6e932e0398d1262677b8c60faecd02ffb67bf3/jiter-0.13.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:47455245307e4debf2ce6c6e65a717550a0244231240dcf3b8f7d64e4c2f22f4", size = 487537, upload_time = "2026-02-02T12:35:43.459Z" }, + { url = "https://files.pythonhosted.org/packages/6c/86/84eb4352cd3668f16d1a88929b5888a3fe0418ea8c1dfc2ad4e7bf6e069a/jiter-0.13.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ee9da221dca6e0429c2704c1b3655fe7b025204a71d4d9b73390c759d776d165", size = 373717, upload_time = "2026-02-02T12:35:44.928Z" }, + { url = "https://files.pythonhosted.org/packages/6e/09/9fe4c159358176f82d4390407a03f506a8659ed13ca3ac93a843402acecf/jiter-0.13.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:24ab43126d5e05f3d53a36a8e11eb2f23304c6c1117844aaaf9a0aa5e40b5018", size = 362683, upload_time = "2026-02-02T12:35:46.636Z" }, + { url = "https://files.pythonhosted.org/packages/c9/5e/85f3ab9caca0c1d0897937d378b4a515cae9e119730563572361ea0c48ae/jiter-0.13.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9da38b4fedde4fb528c740c2564628fbab737166a0e73d6d46cb4bb5463ff411", size = 392345, upload_time = "2026-02-02T12:35:48.088Z" }, + { url = 
"https://files.pythonhosted.org/packages/12/4c/05b8629ad546191939e6f0c2f17e29f542a398f4a52fb987bc70b6d1eb8b/jiter-0.13.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0b34c519e17658ed88d5047999a93547f8889f3c1824120c26ad6be5f27b6cf5", size = 517775, upload_time = "2026-02-02T12:35:49.482Z" }, + { url = "https://files.pythonhosted.org/packages/4d/88/367ea2eb6bc582c7052e4baf5ddf57ebe5ab924a88e0e09830dfb585c02d/jiter-0.13.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d2a6394e6af690d462310a86b53c47ad75ac8c21dc79f120714ea449979cb1d3", size = 551325, upload_time = "2026-02-02T12:35:51.104Z" }, + { url = "https://files.pythonhosted.org/packages/f3/12/fa377ffb94a2f28c41afaed093e0d70cfe512035d5ecb0cad0ae4792d35e/jiter-0.13.0-cp311-cp311-win32.whl", hash = "sha256:0f0c065695f616a27c920a56ad0d4fc46415ef8b806bf8fc1cacf25002bd24e1", size = 204709, upload_time = "2026-02-02T12:35:52.467Z" }, + { url = "https://files.pythonhosted.org/packages/cb/16/8e8203ce92f844dfcd3d9d6a5a7322c77077248dbb12da52d23193a839cd/jiter-0.13.0-cp311-cp311-win_amd64.whl", hash = "sha256:0733312953b909688ae3c2d58d043aa040f9f1a6a75693defed7bc2cc4bf2654", size = 204560, upload_time = "2026-02-02T12:35:53.925Z" }, + { url = "https://files.pythonhosted.org/packages/44/26/97cc40663deb17b9e13c3a5cf29251788c271b18ee4d262c8f94798b8336/jiter-0.13.0-cp311-cp311-win_arm64.whl", hash = "sha256:5d9b34ad56761b3bf0fbe8f7e55468704107608512350962d3317ffd7a4382d5", size = 189608, upload_time = "2026-02-02T12:35:55.304Z" }, + { url = "https://files.pythonhosted.org/packages/2e/30/7687e4f87086829955013ca12a9233523349767f69653ebc27036313def9/jiter-0.13.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:0a2bd69fc1d902e89925fc34d1da51b2128019423d7b339a45d9e99c894e0663", size = 307958, upload_time = "2026-02-02T12:35:57.165Z" }, + { url = "https://files.pythonhosted.org/packages/c3/27/e57f9a783246ed95481e6749cc5002a8a767a73177a83c63ea71f0528b90/jiter-0.13.0-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:f917a04240ef31898182f76a332f508f2cc4b57d2b4d7ad2dbfebbfe167eb505", size = 318597, upload_time = "2026-02-02T12:35:58.591Z" }, + { url = "https://files.pythonhosted.org/packages/cf/52/e5719a60ac5d4d7c5995461a94ad5ef962a37c8bf5b088390e6fad59b2ff/jiter-0.13.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c1e2b199f446d3e82246b4fd9236d7cb502dc2222b18698ba0d986d2fecc6152", size = 348821, upload_time = "2026-02-02T12:36:00.093Z" }, + { url = "https://files.pythonhosted.org/packages/61/db/c1efc32b8ba4c740ab3fc2d037d8753f67685f475e26b9d6536a4322bcdd/jiter-0.13.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:04670992b576fa65bd056dbac0c39fe8bd67681c380cb2b48efa885711d9d726", size = 364163, upload_time = "2026-02-02T12:36:01.937Z" }, + { url = "https://files.pythonhosted.org/packages/55/8a/fb75556236047c8806995671a18e4a0ad646ed255276f51a20f32dceaeec/jiter-0.13.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5a1aff1fbdb803a376d4d22a8f63f8e7ccbce0b4890c26cc7af9e501ab339ef0", size = 483709, upload_time = "2026-02-02T12:36:03.41Z" }, + { url = "https://files.pythonhosted.org/packages/7e/16/43512e6ee863875693a8e6f6d532e19d650779d6ba9a81593ae40a9088ff/jiter-0.13.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b3fb8c2053acaef8580809ac1d1f7481a0a0bdc012fd7f5d8b18fb696a5a089", size = 370480, upload_time = "2026-02-02T12:36:04.791Z" }, + { url = "https://files.pythonhosted.org/packages/f8/4c/09b93e30e984a187bc8aaa3510e1ec8dcbdcd71ca05d2f56aac0492453aa/jiter-0.13.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bdaba7d87e66f26a2c45d8cbadcbfc4bf7884182317907baf39cfe9775bb4d93", size = 360735, upload_time = "2026-02-02T12:36:06.994Z" }, + { url = "https://files.pythonhosted.org/packages/1a/1b/46c5e349019874ec5dfa508c14c37e29864ea108d376ae26d90bee238cd7/jiter-0.13.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:7b88d649135aca526da172e48083da915ec086b54e8e73a425ba50999468cc08", size = 391814, upload_time = "2026-02-02T12:36:08.368Z" }, + { url = "https://files.pythonhosted.org/packages/15/9e/26184760e85baee7162ad37b7912797d2077718476bf91517641c92b3639/jiter-0.13.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:e404ea551d35438013c64b4f357b0474c7abf9f781c06d44fcaf7a14c69ff9e2", size = 513990, upload_time = "2026-02-02T12:36:09.993Z" }, + { url = "https://files.pythonhosted.org/packages/e9/34/2c9355247d6debad57a0a15e76ab1566ab799388042743656e566b3b7de1/jiter-0.13.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:1f4748aad1b4a93c8bdd70f604d0f748cdc0e8744c5547798acfa52f10e79228", size = 548021, upload_time = "2026-02-02T12:36:11.376Z" }, + { url = "https://files.pythonhosted.org/packages/ac/4a/9f2c23255d04a834398b9c2e0e665382116911dc4d06b795710503cdad25/jiter-0.13.0-cp312-cp312-win32.whl", hash = "sha256:0bf670e3b1445fc4d31612199f1744f67f889ee1bbae703c4b54dc097e5dd394", size = 203024, upload_time = "2026-02-02T12:36:12.682Z" }, + { url = "https://files.pythonhosted.org/packages/09/ee/f0ae675a957ae5a8f160be3e87acea6b11dc7b89f6b7ab057e77b2d2b13a/jiter-0.13.0-cp312-cp312-win_amd64.whl", hash = "sha256:15db60e121e11fe186c0b15236bd5d18381b9ddacdcf4e659feb96fc6c969c92", size = 205424, upload_time = "2026-02-02T12:36:13.93Z" }, + { url = "https://files.pythonhosted.org/packages/1b/02/ae611edf913d3cbf02c97cdb90374af2082c48d7190d74c1111dde08bcdd/jiter-0.13.0-cp312-cp312-win_arm64.whl", hash = "sha256:41f92313d17989102f3cb5dd533a02787cdb99454d494344b0361355da52fcb9", size = 186818, upload_time = "2026-02-02T12:36:15.308Z" }, + { url = "https://files.pythonhosted.org/packages/91/9c/7ee5a6ff4b9991e1a45263bfc46731634c4a2bde27dfda6c8251df2d958c/jiter-0.13.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1f8a55b848cbabf97d861495cd65f1e5c590246fabca8b48e1747c4dfc8f85bf", size = 306897, upload_time = "2026-02-02T12:36:16.748Z" }, + { url = 
"https://files.pythonhosted.org/packages/7c/02/be5b870d1d2be5dd6a91bdfb90f248fbb7dcbd21338f092c6b89817c3dbf/jiter-0.13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f556aa591c00f2c45eb1b89f68f52441a016034d18b65da60e2d2875bbbf344a", size = 317507, upload_time = "2026-02-02T12:36:18.351Z" }, + { url = "https://files.pythonhosted.org/packages/da/92/b25d2ec333615f5f284f3a4024f7ce68cfa0604c322c6808b2344c7f5d2b/jiter-0.13.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f7e1d61da332ec412350463891923f960c3073cf1aae93b538f0bb4c8cd46efb", size = 350560, upload_time = "2026-02-02T12:36:19.746Z" }, + { url = "https://files.pythonhosted.org/packages/be/ec/74dcb99fef0aca9fbe56b303bf79f6bd839010cb18ad41000bf6cc71eec0/jiter-0.13.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3097d665a27bc96fd9bbf7f86178037db139f319f785e4757ce7ccbf390db6c2", size = 363232, upload_time = "2026-02-02T12:36:21.243Z" }, + { url = "https://files.pythonhosted.org/packages/1b/37/f17375e0bb2f6a812d4dd92d7616e41917f740f3e71343627da9db2824ce/jiter-0.13.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9d01ecc3a8cbdb6f25a37bd500510550b64ddf9f7d64a107d92f3ccb25035d0f", size = 483727, upload_time = "2026-02-02T12:36:22.688Z" }, + { url = "https://files.pythonhosted.org/packages/77/d2/a71160a5ae1a1e66c1395b37ef77da67513b0adba73b993a27fbe47eb048/jiter-0.13.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ed9bbc30f5d60a3bdf63ae76beb3f9db280d7f195dfcfa61af792d6ce912d159", size = 370799, upload_time = "2026-02-02T12:36:24.106Z" }, + { url = "https://files.pythonhosted.org/packages/01/99/ed5e478ff0eb4e8aa5fd998f9d69603c9fd3f32de3bd16c2b1194f68361c/jiter-0.13.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98fbafb6e88256f4454de33c1f40203d09fc33ed19162a68b3b257b29ca7f663", size = 359120, upload_time = "2026-02-02T12:36:25.519Z" }, + { url = 
"https://files.pythonhosted.org/packages/16/be/7ffd08203277a813f732ba897352797fa9493faf8dc7995b31f3d9cb9488/jiter-0.13.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5467696f6b827f1116556cb0db620440380434591e93ecee7fd14d1a491b6daa", size = 390664, upload_time = "2026-02-02T12:36:26.866Z" }, + { url = "https://files.pythonhosted.org/packages/d1/84/e0787856196d6d346264d6dcccb01f741e5f0bd014c1d9a2ebe149caf4f3/jiter-0.13.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:2d08c9475d48b92892583df9da592a0e2ac49bcd41fae1fec4f39ba6cf107820", size = 513543, upload_time = "2026-02-02T12:36:28.217Z" }, + { url = "https://files.pythonhosted.org/packages/65/50/ecbd258181c4313cf79bca6c88fb63207d04d5bf5e4f65174114d072aa55/jiter-0.13.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:aed40e099404721d7fcaf5b89bd3b4568a4666358bcac7b6b15c09fb6252ab68", size = 547262, upload_time = "2026-02-02T12:36:29.678Z" }, + { url = "https://files.pythonhosted.org/packages/27/da/68f38d12e7111d2016cd198161b36e1f042bd115c169255bcb7ec823a3bf/jiter-0.13.0-cp313-cp313-win32.whl", hash = "sha256:36ebfbcffafb146d0e6ffb3e74d51e03d9c35ce7c625c8066cdbfc7b953bdc72", size = 200630, upload_time = "2026-02-02T12:36:31.808Z" }, + { url = "https://files.pythonhosted.org/packages/25/65/3bd1a972c9a08ecd22eb3b08a95d1941ebe6938aea620c246cf426ae09c2/jiter-0.13.0-cp313-cp313-win_amd64.whl", hash = "sha256:8d76029f077379374cf0dbc78dbe45b38dec4a2eb78b08b5194ce836b2517afc", size = 202602, upload_time = "2026-02-02T12:36:33.679Z" }, + { url = "https://files.pythonhosted.org/packages/15/fe/13bd3678a311aa67686bb303654792c48206a112068f8b0b21426eb6851e/jiter-0.13.0-cp313-cp313-win_arm64.whl", hash = "sha256:bb7613e1a427cfcb6ea4544f9ac566b93d5bf67e0d48c787eca673ff9c9dff2b", size = 185939, upload_time = "2026-02-02T12:36:35.065Z" }, + { url = 
"https://files.pythonhosted.org/packages/49/19/a929ec002ad3228bc97ca01dbb14f7632fffdc84a95ec92ceaf4145688ae/jiter-0.13.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:fa476ab5dd49f3bf3a168e05f89358c75a17608dbabb080ef65f96b27c19ab10", size = 316616, upload_time = "2026-02-02T12:36:36.579Z" }, + { url = "https://files.pythonhosted.org/packages/52/56/d19a9a194afa37c1728831e5fb81b7722c3de18a3109e8f282bfc23e587a/jiter-0.13.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ade8cb6ff5632a62b7dbd4757d8c5573f7a2e9ae285d6b5b841707d8363205ef", size = 346850, upload_time = "2026-02-02T12:36:38.058Z" }, + { url = "https://files.pythonhosted.org/packages/36/4a/94e831c6bf287754a8a019cb966ed39ff8be6ab78cadecf08df3bb02d505/jiter-0.13.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9950290340acc1adaded363edd94baebcee7dabdfa8bee4790794cd5cfad2af6", size = 358551, upload_time = "2026-02-02T12:36:39.417Z" }, + { url = "https://files.pythonhosted.org/packages/a2/ec/a4c72c822695fa80e55d2b4142b73f0012035d9fcf90eccc56bc060db37c/jiter-0.13.0-cp313-cp313t-win_amd64.whl", hash = "sha256:2b4972c6df33731aac0742b64fd0d18e0a69bc7d6e03108ce7d40c85fd9e3e6d", size = 201950, upload_time = "2026-02-02T12:36:40.791Z" }, + { url = "https://files.pythonhosted.org/packages/b6/00/393553ec27b824fbc29047e9c7cd4a3951d7fbe4a76743f17e44034fa4e4/jiter-0.13.0-cp313-cp313t-win_arm64.whl", hash = "sha256:701a1e77d1e593c1b435315ff625fd071f0998c5f02792038a5ca98899261b7d", size = 185852, upload_time = "2026-02-02T12:36:42.077Z" }, + { url = "https://files.pythonhosted.org/packages/6e/f5/f1997e987211f6f9bd71b8083047b316208b4aca0b529bb5f8c96c89ef3e/jiter-0.13.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:cc5223ab19fe25e2f0bf2643204ad7318896fe3729bf12fde41b77bfc4fafff0", size = 308804, upload_time = "2026-02-02T12:36:43.496Z" }, + { url = 
"https://files.pythonhosted.org/packages/cd/8f/5482a7677731fd44881f0204981ce2d7175db271f82cba2085dd2212e095/jiter-0.13.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:9776ebe51713acf438fd9b4405fcd86893ae5d03487546dae7f34993217f8a91", size = 318787, upload_time = "2026-02-02T12:36:45.071Z" }, + { url = "https://files.pythonhosted.org/packages/f3/b9/7257ac59778f1cd025b26a23c5520a36a424f7f1b068f2442a5b499b7464/jiter-0.13.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:879e768938e7b49b5e90b7e3fecc0dbec01b8cb89595861fb39a8967c5220d09", size = 353880, upload_time = "2026-02-02T12:36:47.365Z" }, + { url = "https://files.pythonhosted.org/packages/c3/87/719eec4a3f0841dad99e3d3604ee4cba36af4419a76f3cb0b8e2e691ad67/jiter-0.13.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:682161a67adea11e3aae9038c06c8b4a9a71023228767477d683f69903ebc607", size = 366702, upload_time = "2026-02-02T12:36:48.871Z" }, + { url = "https://files.pythonhosted.org/packages/d2/65/415f0a75cf6921e43365a1bc227c565cb949caca8b7532776e430cbaa530/jiter-0.13.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a13b68cd1cd8cc9de8f244ebae18ccb3e4067ad205220ef324c39181e23bbf66", size = 486319, upload_time = "2026-02-02T12:36:53.006Z" }, + { url = "https://files.pythonhosted.org/packages/54/a2/9e12b48e82c6bbc6081fd81abf915e1443add1b13d8fc586e1d90bb02bb8/jiter-0.13.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:87ce0f14c6c08892b610686ae8be350bf368467b6acd5085a5b65441e2bf36d2", size = 372289, upload_time = "2026-02-02T12:36:54.593Z" }, + { url = "https://files.pythonhosted.org/packages/4e/c1/e4693f107a1789a239c759a432e9afc592366f04e901470c2af89cfd28e1/jiter-0.13.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c365005b05505a90d1c47856420980d0237adf82f70c4aff7aebd3c1cc143ad", size = 360165, upload_time = "2026-02-02T12:36:56.112Z" }, + { url = 
"https://files.pythonhosted.org/packages/17/08/91b9ea976c1c758240614bd88442681a87672eebc3d9a6dde476874e706b/jiter-0.13.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1317fdffd16f5873e46ce27d0e0f7f4f90f0cdf1d86bf6abeaea9f63ca2c401d", size = 389634, upload_time = "2026-02-02T12:36:57.495Z" }, + { url = "https://files.pythonhosted.org/packages/18/23/58325ef99390d6d40427ed6005bf1ad54f2577866594bcf13ce55675f87d/jiter-0.13.0-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:c05b450d37ba0c9e21c77fef1f205f56bcee2330bddca68d344baebfc55ae0df", size = 514933, upload_time = "2026-02-02T12:36:58.909Z" }, + { url = "https://files.pythonhosted.org/packages/5b/25/69f1120c7c395fd276c3996bb8adefa9c6b84c12bb7111e5c6ccdcd8526d/jiter-0.13.0-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:775e10de3849d0631a97c603f996f518159272db00fdda0a780f81752255ee9d", size = 548842, upload_time = "2026-02-02T12:37:00.433Z" }, + { url = "https://files.pythonhosted.org/packages/18/05/981c9669d86850c5fbb0d9e62bba144787f9fba84546ba43d624ee27ef29/jiter-0.13.0-cp314-cp314-win32.whl", hash = "sha256:632bf7c1d28421c00dd8bbb8a3bac5663e1f57d5cd5ed962bce3c73bf62608e6", size = 202108, upload_time = "2026-02-02T12:37:01.718Z" }, + { url = "https://files.pythonhosted.org/packages/8d/96/cdcf54dd0b0341db7d25413229888a346c7130bd20820530905fdb65727b/jiter-0.13.0-cp314-cp314-win_amd64.whl", hash = "sha256:f22ef501c3f87ede88f23f9b11e608581c14f04db59b6a801f354397ae13739f", size = 204027, upload_time = "2026-02-02T12:37:03.075Z" }, + { url = "https://files.pythonhosted.org/packages/fb/f9/724bcaaab7a3cd727031fe4f6995cb86c4bd344909177c186699c8dec51a/jiter-0.13.0-cp314-cp314-win_arm64.whl", hash = "sha256:07b75fe09a4ee8e0c606200622e571e44943f47254f95e2436c8bdcaceb36d7d", size = 187199, upload_time = "2026-02-02T12:37:04.414Z" }, + { url = 
"https://files.pythonhosted.org/packages/62/92/1661d8b9fd6a3d7a2d89831db26fe3c1509a287d83ad7838831c7b7a5c7e/jiter-0.13.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:964538479359059a35fb400e769295d4b315ae61e4105396d355a12f7fef09f0", size = 318423, upload_time = "2026-02-02T12:37:05.806Z" }, + { url = "https://files.pythonhosted.org/packages/4f/3b/f77d342a54d4ebcd128e520fc58ec2f5b30a423b0fd26acdfc0c6fef8e26/jiter-0.13.0-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e104da1db1c0991b3eaed391ccd650ae8d947eab1480c733e5a3fb28d4313e40", size = 351438, upload_time = "2026-02-02T12:37:07.189Z" }, + { url = "https://files.pythonhosted.org/packages/76/b3/ba9a69f0e4209bd3331470c723c2f5509e6f0482e416b612431a5061ed71/jiter-0.13.0-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0e3a5f0cde8ff433b8e88e41aa40131455420fb3649a3c7abdda6145f8cb7202", size = 364774, upload_time = "2026-02-02T12:37:08.579Z" }, + { url = "https://files.pythonhosted.org/packages/b3/16/6cdb31fa342932602458dbb631bfbd47f601e03d2e4950740e0b2100b570/jiter-0.13.0-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:57aab48f40be1db920a582b30b116fe2435d184f77f0e4226f546794cedd9cf0", size = 487238, upload_time = "2026-02-02T12:37:10.066Z" }, + { url = "https://files.pythonhosted.org/packages/ed/b1/956cc7abaca8d95c13aa8d6c9b3f3797241c246cd6e792934cc4c8b250d2/jiter-0.13.0-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7772115877c53f62beeb8fd853cab692dbc04374ef623b30f997959a4c0e7e95", size = 372892, upload_time = "2026-02-02T12:37:11.656Z" }, + { url = "https://files.pythonhosted.org/packages/26/c4/97ecde8b1e74f67b8598c57c6fccf6df86ea7861ed29da84629cdbba76c4/jiter-0.13.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1211427574b17b633cfceba5040de8081e5abf114f7a7602f73d2e16f9fdaa59", size = 360309, upload_time = "2026-02-02T12:37:13.244Z" }, + { url = 
"https://files.pythonhosted.org/packages/4b/d7/eabe3cf46715854ccc80be2cd78dd4c36aedeb30751dbf85a1d08c14373c/jiter-0.13.0-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:7beae3a3d3b5212d3a55d2961db3c292e02e302feb43fce6a3f7a31b90ea6dfe", size = 389607, upload_time = "2026-02-02T12:37:14.881Z" }, + { url = "https://files.pythonhosted.org/packages/df/2d/03963fc0804e6109b82decfb9974eb92df3797fe7222428cae12f8ccaa0c/jiter-0.13.0-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:e5562a0f0e90a6223b704163ea28e831bd3a9faa3512a711f031611e6b06c939", size = 514986, upload_time = "2026-02-02T12:37:16.326Z" }, + { url = "https://files.pythonhosted.org/packages/f6/6c/8c83b45eb3eb1c1e18d841fe30b4b5bc5619d781267ca9bc03e005d8fd0a/jiter-0.13.0-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:6c26a424569a59140fb51160a56df13f438a2b0967365e987889186d5fc2f6f9", size = 548756, upload_time = "2026-02-02T12:37:17.736Z" }, + { url = "https://files.pythonhosted.org/packages/47/66/eea81dfff765ed66c68fd2ed8c96245109e13c896c2a5015c7839c92367e/jiter-0.13.0-cp314-cp314t-win32.whl", hash = "sha256:24dc96eca9f84da4131cdf87a95e6ce36765c3b156fc9ae33280873b1c32d5f6", size = 201196, upload_time = "2026-02-02T12:37:19.101Z" }, + { url = "https://files.pythonhosted.org/packages/ff/32/4ac9c7a76402f8f00d00842a7f6b83b284d0cf7c1e9d4227bc95aa6d17fa/jiter-0.13.0-cp314-cp314t-win_amd64.whl", hash = "sha256:0a8d76c7524087272c8ae913f5d9d608bd839154b62c4322ef65723d2e5bb0b8", size = 204215, upload_time = "2026-02-02T12:37:20.495Z" }, + { url = "https://files.pythonhosted.org/packages/f9/8e/7def204fea9f9be8b3c21a6f2dd6c020cf56c7d5ff753e0e23ed7f9ea57e/jiter-0.13.0-cp314-cp314t-win_arm64.whl", hash = "sha256:2c26cf47e2cad140fa23b6d58d435a7c0161f5c514284802f25e87fddfe11024", size = 187152, upload_time = "2026-02-02T12:37:22.124Z" }, + { url = 
"https://files.pythonhosted.org/packages/79/b3/3c29819a27178d0e461a8571fb63c6ae38be6dc36b78b3ec2876bbd6a910/jiter-0.13.0-graalpy311-graalpy242_311_native-macosx_10_12_x86_64.whl", hash = "sha256:b1cbfa133241d0e6bdab48dcdc2604e8ba81512f6bbd68ec3e8e1357dd3c316c", size = 307016, upload_time = "2026-02-02T12:37:42.755Z" }, + { url = "https://files.pythonhosted.org/packages/eb/ae/60993e4b07b1ac5ebe46da7aa99fdbb802eb986c38d26e3883ac0125c4e0/jiter-0.13.0-graalpy311-graalpy242_311_native-macosx_11_0_arm64.whl", hash = "sha256:db367d8be9fad6e8ebbac4a7578b7af562e506211036cba2c06c3b998603c3d2", size = 305024, upload_time = "2026-02-02T12:37:44.774Z" }, + { url = "https://files.pythonhosted.org/packages/77/fa/2227e590e9cf98803db2811f172b2d6460a21539ab73006f251c66f44b14/jiter-0.13.0-graalpy311-graalpy242_311_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45f6f8efb2f3b0603092401dc2df79fa89ccbc027aaba4174d2d4133ed661434", size = 339337, upload_time = "2026-02-02T12:37:46.668Z" }, + { url = "https://files.pythonhosted.org/packages/2d/92/015173281f7eb96c0ef580c997da8ef50870d4f7f4c9e03c845a1d62ae04/jiter-0.13.0-graalpy311-graalpy242_311_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:597245258e6ad085d064780abfb23a284d418d3e61c57362d9449c6c7317ee2d", size = 346395, upload_time = "2026-02-02T12:37:48.09Z" }, + { url = "https://files.pythonhosted.org/packages/80/60/e50fa45dd7e2eae049f0ce964663849e897300433921198aef94b6ffa23a/jiter-0.13.0-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:3d744a6061afba08dd7ae375dcde870cffb14429b7477e10f67e9e6d68772a0a", size = 305169, upload_time = "2026-02-02T12:37:50.376Z" }, + { url = "https://files.pythonhosted.org/packages/d2/73/a009f41c5eed71c49bec53036c4b33555afcdee70682a18c6f66e396c039/jiter-0.13.0-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:ff732bd0a0e778f43d5009840f20b935e79087b4dc65bd36f1cd0f9b04b8ff7f", size = 303808, upload_time = 
"2026-02-02T12:37:52.092Z" }, + { url = "https://files.pythonhosted.org/packages/c4/10/528b439290763bff3d939268085d03382471b442f212dca4ff5f12802d43/jiter-0.13.0-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ab44b178f7981fcaea7e0a5df20e773c663d06ffda0198f1a524e91b2fde7e59", size = 337384, upload_time = "2026-02-02T12:37:53.582Z" }, + { url = "https://files.pythonhosted.org/packages/67/8a/a342b2f0251f3dac4ca17618265d93bf244a2a4d089126e81e4c1056ac50/jiter-0.13.0-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7bb00b6d26db67a05fe3e12c76edc75f32077fb51deed13822dc648fa373bc19", size = 343768, upload_time = "2026-02-02T12:37:55.055Z" }, +] + [[package]] name = "jmespath" version = "1.0.1" @@ -519,6 +904,33 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload_time = "2022-06-17T18:00:10.251Z" }, ] +[[package]] +name = "jsonschema" +version = "4.26.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "jsonschema-specifications" }, + { name = "referencing" }, + { name = "rpds-py" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b3/fc/e067678238fa451312d4c62bf6e6cf5ec56375422aee02f9cb5f909b3047/jsonschema-4.26.0.tar.gz", hash = "sha256:0c26707e2efad8aa1bfc5b7ce170f3fccc2e4918ff85989ba9ffa9facb2be326", size = 366583, upload_time = "2026-01-07T13:41:07.246Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/69/90/f63fb5873511e014207a475e2bb4e8b2e570d655b00ac19a9a0ca0a385ee/jsonschema-4.26.0-py3-none-any.whl", hash = "sha256:d489f15263b8d200f8387e64b4c3a75f06629559fb73deb8fdfb525f2dab50ce", size = 90630, upload_time = "2026-01-07T13:41:05.306Z" }, +] + +[[package]] +name = 
"jsonschema-specifications" +version = "2025.9.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "referencing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/19/74/a633ee74eb36c44aa6d1095e7cc5569bebf04342ee146178e2d36600708b/jsonschema_specifications-2025.9.1.tar.gz", hash = "sha256:b540987f239e745613c7a9176f3edb72b832a4ac465cf02712288397832b5e8d", size = 32855, upload_time = "2025-09-08T01:34:59.186Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/41/45/1a4ed80516f02155c51f51e8cedb3c1902296743db0bbc66608a0db2814f/jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe", size = 18437, upload_time = "2025-09-08T01:34:57.871Z" }, +] + [[package]] name = "markdown-it-py" version = "3.0.0" @@ -533,22 +945,27 @@ wheels = [ [[package]] name = "mcp" -version = "1.9.4" +version = "1.23.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, { name = "httpx" }, { name = "httpx-sse" }, + { name = "jsonschema" }, { name = "pydantic" }, { name = "pydantic-settings" }, + { name = "pyjwt", extra = ["crypto"] }, { name = "python-multipart" }, + { name = "pywin32", marker = "sys_platform == 'win32'" }, { name = "sse-starlette" }, { name = "starlette" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/06/f2/dc2450e566eeccf92d89a00c3e813234ad58e2ba1e31d11467a09ac4f3b9/mcp-1.9.4.tar.gz", hash = "sha256:cfb0bcd1a9535b42edaef89947b9e18a8feb49362e1cc059d6e7fc636f2cb09f", size = 333294, upload_time = "2025-06-12T08:20:30.158Z" } +sdist = { url = "https://files.pythonhosted.org/packages/25/1a/9c8a5362e3448d585081d6c7aa95898a64e0ac59d3e26169ae6c3ca5feaf/mcp-1.23.0.tar.gz", hash = "sha256:84e0c29316d0a8cf0affd196fd000487ac512aa3f771b63b2ea864e22961772b", size = 596506, 
upload_time = "2025-12-02T13:40:02.558Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/97/fc/80e655c955137393c443842ffcc4feccab5b12fa7cb8de9ced90f90e6998/mcp-1.9.4-py3-none-any.whl", hash = "sha256:7fcf36b62936adb8e63f89346bccca1268eeca9bf6dfb562ee10b1dfbda9dac0", size = 130232, upload_time = "2025-06-12T08:20:28.551Z" }, + { url = "https://files.pythonhosted.org/packages/7b/b2/28739ce409f98159c0121eab56e69ad71546c4f34ac8b42e58c03f57dccc/mcp-1.23.0-py3-none-any.whl", hash = "sha256:5a645cf111ed329f4619f2629a3f15d9aabd7adc2ea09d600d31467b51ecb64f", size = 231427, upload_time = "2025-12-02T13:40:00.738Z" }, ] [package.optional-dependencies] @@ -665,7 +1082,7 @@ wheels = [ [[package]] name = "opensearch-mcp-server-py" -version = "0.7.0" +version = "0.9.0" source = { editable = "." } dependencies = [ { name = "aiohttp" }, @@ -673,6 +1090,7 @@ dependencies = [ { name = "mcp", extra = ["cli"] }, { name = "opensearch-py" }, { name = "pydantic" }, + { name = "pyjwt" }, { name = "pyyaml" }, { name = "requests-aws4auth" }, { name = "semver" }, @@ -680,6 +1098,7 @@ dependencies = [ [package.dev-dependencies] dev = [ + { name = "anthropic" }, { name = "pytest" }, { name = "pytest-asyncio" }, { name = "pytest-cov" }, @@ -691,8 +1110,9 @@ requires-dist = [ { name = "aiohttp", specifier = ">=3.11.18" }, { name = "boto3", specifier = ">=1.38.3" }, { name = "mcp", extras = ["cli"], specifier = ">=1.9.4" }, - { name = "opensearch-py", specifier = "==2.8.0" }, + { name = "opensearch-py", specifier = "==3.1.0" }, { name = "pydantic", specifier = ">=2.11.3" }, + { name = "pyjwt", specifier = ">=2.12.0" }, { name = "pyyaml", specifier = ">=6.0.2" }, { name = "requests-aws4auth", specifier = ">=1.3.1" }, { name = "semver", specifier = ">=3.0.4" }, @@ -700,26 +1120,40 @@ requires-dist = [ [package.metadata.requires-dev] dev = [ + { name = "anthropic", specifier = ">=0.50.0" }, { name = "pytest", specifier = ">=8.3.5" }, { name = "pytest-asyncio", specifier = ">=0.26.0" 
}, { name = "pytest-cov", specifier = ">=6.2.1" }, { name = "ruff", specifier = ">=0.9.7" }, ] +[[package]] +name = "opensearch-protobufs" +version = "0.19.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "grpcio" }, + { name = "protobuf" }, +] +wheels = [ + { url = "https://files.pythonhosted.org/packages/16/e2/8a09dbdbfe51e30dfecb625a0f5c524a53bfa4b1fba168f73ac85621dba2/opensearch_protobufs-0.19.0-py3-none-any.whl", hash = "sha256:5137c9c2323cc7debb694754b820ca4cfb5fc8eb180c41ff125698c3ee11bfc2", size = 39778, upload_time = "2025-09-29T20:05:52.379Z" }, +] + [[package]] name = "opensearch-py" -version = "2.8.0" +version = "3.1.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "certifi" }, { name = "events" }, + { name = "opensearch-protobufs" }, { name = "python-dateutil" }, { name = "requests" }, { name = "urllib3" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/7c/e4/192c97ca676c81f69e138a22e10fb03f64e14a55633cb2acffb41bf6d061/opensearch_py-2.8.0.tar.gz", hash = "sha256:6598df0bc7a003294edd0ba88a331e0793acbb8c910c43edf398791e3b2eccda", size = 237923, upload_time = "2024-11-29T21:06:02.952Z" } +sdist = { url = "https://files.pythonhosted.org/packages/65/9f/d4969f7e8fa221bfebf254cc3056e7c743ce36ac9874e06110474f7c947d/opensearch_py-3.1.0.tar.gz", hash = "sha256:883573af13175ff102b61c80b77934a9e937bdcc40cda2b92051ad53336bc055", size = 258616, upload_time = "2025-11-20T16:37:36.777Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/23/35/a957c6fb88ff6874996be688448b889475cf0ea978446cd5a30e764e0561/opensearch_py-2.8.0-py3-none-any.whl", hash = "sha256:52c60fdb5d4dcf6cce3ee746c13b194529b0161e0f41268b98ab8f1624abe2fa", size = 353492, upload_time = "2024-11-29T21:05:56.075Z" }, + { url = "https://files.pythonhosted.org/packages/08/a1/293c8ad81768ad625283d960685bde07c6302abf20a685e693b48ab6eb91/opensearch_py-3.1.0-py3-none-any.whl", hash = 
"sha256:e5af83d0454323e6ea9ddee8c0dcc185c0181054592d23cb701da46271a3b65b", size = 385729, upload_time = "2025-11-20T16:37:34.941Z" }, ] [[package]] @@ -829,6 +1263,30 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/b8/d3/c3cb8f1d6ae3b37f83e1de806713a9b3642c5895f0215a62e1a4bd6e5e34/propcache-0.3.1-py3-none-any.whl", hash = "sha256:9a8ecf38de50a7f518c21568c80f985e776397b902f1ce0b01f799aba1608b40", size = 12376, upload_time = "2025-03-26T03:06:10.5Z" }, ] +[[package]] +name = "protobuf" +version = "6.33.5" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ba/25/7c72c307aafc96fa87062aa6291d9f7c94836e43214d43722e86037aac02/protobuf-6.33.5.tar.gz", hash = "sha256:6ddcac2a081f8b7b9642c09406bc6a4290128fce5f471cddd165960bb9119e5c", size = 444465, upload_time = "2026-01-29T21:51:33.494Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b1/79/af92d0a8369732b027e6d6084251dd8e782c685c72da161bd4a2e00fbabb/protobuf-6.33.5-cp310-abi3-win32.whl", hash = "sha256:d71b040839446bac0f4d162e758bea99c8251161dae9d0983a3b88dee345153b", size = 425769, upload_time = "2026-01-29T21:51:21.751Z" }, + { url = "https://files.pythonhosted.org/packages/55/75/bb9bc917d10e9ee13dee8607eb9ab963b7cf8be607c46e7862c748aa2af7/protobuf-6.33.5-cp310-abi3-win_amd64.whl", hash = "sha256:3093804752167bcab3998bec9f1048baae6e29505adaf1afd14a37bddede533c", size = 437118, upload_time = "2026-01-29T21:51:24.022Z" }, + { url = "https://files.pythonhosted.org/packages/a2/6b/e48dfc1191bc5b52950246275bf4089773e91cb5ba3592621723cdddca62/protobuf-6.33.5-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:a5cb85982d95d906df1e2210e58f8e4f1e3cdc088e52c921a041f9c9a0386de5", size = 427766, upload_time = "2026-01-29T21:51:25.413Z" }, + { url = "https://files.pythonhosted.org/packages/4e/b1/c79468184310de09d75095ed1314b839eb2f72df71097db9d1404a1b2717/protobuf-6.33.5-cp39-abi3-manylinux2014_aarch64.whl", hash = 
"sha256:9b71e0281f36f179d00cbcb119cb19dec4d14a81393e5ea220f64b286173e190", size = 324638, upload_time = "2026-01-29T21:51:26.423Z" }, + { url = "https://files.pythonhosted.org/packages/c5/f5/65d838092fd01c44d16037953fd4c2cc851e783de9b8f02b27ec4ffd906f/protobuf-6.33.5-cp39-abi3-manylinux2014_s390x.whl", hash = "sha256:8afa18e1d6d20af15b417e728e9f60f3aa108ee76f23c3b2c07a2c3b546d3afd", size = 339411, upload_time = "2026-01-29T21:51:27.446Z" }, + { url = "https://files.pythonhosted.org/packages/9b/53/a9443aa3ca9ba8724fdfa02dd1887c1bcd8e89556b715cfbacca6b63dbec/protobuf-6.33.5-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:cbf16ba3350fb7b889fca858fb215967792dc125b35c7976ca4818bee3521cf0", size = 323465, upload_time = "2026-01-29T21:51:28.925Z" }, + { url = "https://files.pythonhosted.org/packages/57/bf/2086963c69bdac3d7cff1cc7ff79b8ce5ea0bec6797a017e1be338a46248/protobuf-6.33.5-py3-none-any.whl", hash = "sha256:69915a973dd0f60f31a08b8318b73eab2bd6a392c79184b3612226b0a3f8ec02", size = 170687, upload_time = "2026-01-29T21:51:32.557Z" }, +] + +[[package]] +name = "pycparser" +version = "3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload_time = "2026-01-21T14:26:51.89Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload_time = "2026-01-21T14:26:50.693Z" }, +] + [[package]] name = "pydantic" version = "2.11.3" @@ -954,6 +1412,23 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = 
"sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293, upload_time = "2025-01-06T17:26:25.553Z" }, ] +[[package]] +name = "pyjwt" +version = "2.12.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c2/27/a3b6e5bf6ff856d2509292e95c8f57f0df7017cf5394921fc4e4ef40308a/pyjwt-2.12.1.tar.gz", hash = "sha256:c74a7a2adf861c04d002db713dd85f84beb242228e671280bf709d765b03672b", size = 102564, upload_time = "2026-03-13T19:27:37.25Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/7a/8dd906bd22e79e47397a61742927f6747fe93242ef86645ee9092e610244/pyjwt-2.12.1-py3-none-any.whl", hash = "sha256:28ca37c070cad8ba8cd9790cd940535d40274d22f80ab87f3ac6a713e6e8454c", size = 29726, upload_time = "2026-03-13T19:27:35.677Z" }, +] + +[package.optional-dependencies] +crypto = [ + { name = "cryptography" }, +] + [[package]] name = "pytest" version = "8.3.5" @@ -1020,11 +1495,33 @@ wheels = [ [[package]] name = "python-multipart" -version = "0.0.20" +version = "0.0.22" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/01/979e98d542a70714b0cb2b6728ed0b7c46792b695e3eaec3e20711271ca3/python_multipart-0.0.22.tar.gz", hash = "sha256:7340bef99a7e0032613f56dc36027b959fd3b30a787ed62d310e951f7c3a3a58", size = 37612, upload_time = "2026-01-25T10:15:56.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1b/d0/397f9626e711ff749a95d96b7af99b9c566a9bb5129b8e4c10fc4d100304/python_multipart-0.0.22-py3-none-any.whl", hash = "sha256:2b2cd894c83d21bf49d702499531c7bafd057d730c201782048f7945d82de155", size = 24579, upload_time = "2026-01-25T10:15:54.811Z" }, +] + +[[package]] +name = "pywin32" +version = "311" source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload_time = "2024-12-16T19:45:46.972Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload_time = "2024-12-16T19:45:44.423Z" }, + { url = "https://files.pythonhosted.org/packages/7b/40/44efbb0dfbd33aca6a6483191dae0716070ed99e2ecb0c53683f400a0b4f/pywin32-311-cp310-cp310-win32.whl", hash = "sha256:d03ff496d2a0cd4a5893504789d4a15399133fe82517455e78bad62efbb7f0a3", size = 8760432, upload_time = "2025-07-14T20:13:05.9Z" }, + { url = "https://files.pythonhosted.org/packages/5e/bf/360243b1e953bd254a82f12653974be395ba880e7ec23e3731d9f73921cc/pywin32-311-cp310-cp310-win_amd64.whl", hash = "sha256:797c2772017851984b97180b0bebe4b620bb86328e8a884bb626156295a63b3b", size = 9590103, upload_time = "2025-07-14T20:13:07.698Z" }, + { url = "https://files.pythonhosted.org/packages/57/38/d290720e6f138086fb3d5ffe0b6caa019a791dd57866940c82e4eeaf2012/pywin32-311-cp310-cp310-win_arm64.whl", hash = "sha256:0502d1facf1fed4839a9a51ccbcc63d952cf318f78ffc00a7e78528ac27d7a2b", size = 8778557, upload_time = "2025-07-14T20:13:11.11Z" }, + { url = "https://files.pythonhosted.org/packages/7c/af/449a6a91e5d6db51420875c54f6aff7c97a86a3b13a0b4f1a5c13b988de3/pywin32-311-cp311-cp311-win32.whl", hash = "sha256:184eb5e436dea364dcd3d2316d577d625c0351bf237c4e9a5fabbcfa5a58b151", size = 8697031, upload_time = "2025-07-14T20:13:13.266Z" }, + { url = "https://files.pythonhosted.org/packages/51/8f/9bb81dd5bb77d22243d33c8397f09377056d5c687aa6d4042bea7fbf8364/pywin32-311-cp311-cp311-win_amd64.whl", hash = 
"sha256:3ce80b34b22b17ccbd937a6e78e7225d80c52f5ab9940fe0506a1a16f3dab503", size = 9508308, upload_time = "2025-07-14T20:13:15.147Z" }, + { url = "https://files.pythonhosted.org/packages/44/7b/9c2ab54f74a138c491aba1b1cd0795ba61f144c711daea84a88b63dc0f6c/pywin32-311-cp311-cp311-win_arm64.whl", hash = "sha256:a733f1388e1a842abb67ffa8e7aad0e70ac519e09b0f6a784e65a136ec7cefd2", size = 8703930, upload_time = "2025-07-14T20:13:16.945Z" }, + { url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = "sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload_time = "2025-07-14T20:13:20.765Z" }, + { url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload_time = "2025-07-14T20:13:22.543Z" }, + { url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload_time = "2025-07-14T20:13:24.682Z" }, + { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload_time = "2025-07-14T20:13:26.471Z" }, + { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload_time = "2025-07-14T20:13:28.243Z" }, + { url = 
"https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload_time = "2025-07-14T20:13:30.348Z" }, + { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload_time = "2025-07-14T20:13:32.449Z" }, + { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload_time = "2025-07-14T20:13:34.312Z" }, + { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload_time = "2025-07-14T20:13:36.379Z" }, ] [[package]] @@ -1071,9 +1568,23 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload_time = "2024-08-06T20:33:04.33Z" }, ] +[[package]] +name = "referencing" +version = "0.37.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "rpds-py" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/22/f5/df4e9027acead3ecc63e50fe1e36aca1523e1719559c499951bb4b53188f/referencing-0.37.0.tar.gz", hash = 
"sha256:44aefc3142c5b842538163acb373e24cce6632bd54bdb01b21ad5863489f50d8", size = 78036, upload_time = "2025-10-13T15:30:48.871Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/58/ca301544e1fa93ed4f80d724bf5b194f6e4b945841c5bfd555878eea9fcb/referencing-0.37.0-py3-none-any.whl", hash = "sha256:381329a9f99628c9069361716891d34ad94af76e461dcb0335825aecc7692231", size = 26766, upload_time = "2025-10-13T15:30:47.625Z" }, +] + [[package]] name = "requests" -version = "2.32.3" +version = "2.32.4" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "certifi" }, @@ -1081,9 +1592,9 @@ dependencies = [ { name = "idna" }, { name = "urllib3" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218, upload_time = "2024-05-29T15:37:49.536Z" } +sdist = { url = "https://files.pythonhosted.org/packages/e1/0a/929373653770d8a0d7ea76c37de6e41f11eb07559b103b1c02cafb3f7cf8/requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422", size = 135258, upload_time = "2025-06-09T16:43:07.34Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928, upload_time = "2024-05-29T15:37:47.027Z" }, + { url = "https://files.pythonhosted.org/packages/7c/e4/56027c4a6b4ae70ca9de302488c5ca95ad4a39e190093d6c1a8ace08341b/requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c", size = 64847, upload_time = "2025-06-09T16:43:05.728Z" }, ] [[package]] @@ -1112,6 +1623,128 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/0d/9b/63f4c7ebc259242c89b3acafdb37b41d1185c07ff0011164674e9076b491/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0", size = 243229, upload_time = "2025-03-30T14:15:12.283Z" }, ] +[[package]] +name = "rpds-py" +version = "0.30.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/20/af/3f2f423103f1113b36230496629986e0ef7e199d2aa8392452b484b38ced/rpds_py-0.30.0.tar.gz", hash = "sha256:dd8ff7cf90014af0c0f787eea34794ebf6415242ee1d6fa91eaba725cc441e84", size = 69469, upload_time = "2025-11-30T20:24:38.837Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/06/0c/0c411a0ec64ccb6d104dcabe0e713e05e153a9a2c3c2bd2b32ce412166fe/rpds_py-0.30.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:679ae98e00c0e8d68a7fda324e16b90fd5260945b45d3b824c892cec9eea3288", size = 370490, upload_time = "2025-11-30T20:21:33.256Z" }, + { url = "https://files.pythonhosted.org/packages/19/6a/4ba3d0fb7297ebae71171822554abe48d7cab29c28b8f9f2c04b79988c05/rpds_py-0.30.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4cc2206b76b4f576934f0ed374b10d7ca5f457858b157ca52064bdfc26b9fc00", size = 359751, upload_time = "2025-11-30T20:21:34.591Z" }, + { url = "https://files.pythonhosted.org/packages/cd/7c/e4933565ef7f7a0818985d87c15d9d273f1a649afa6a52ea35ad011195ea/rpds_py-0.30.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:389a2d49eded1896c3d48b0136ead37c48e221b391c052fba3f4055c367f60a6", size = 389696, upload_time = "2025-11-30T20:21:36.122Z" }, + { url = "https://files.pythonhosted.org/packages/5e/01/6271a2511ad0815f00f7ed4390cf2567bec1d4b1da39e2c27a41e6e3b4de/rpds_py-0.30.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:32c8528634e1bf7121f3de08fa85b138f4e0dc47657866630611b03967f041d7", size = 403136, upload_time = "2025-11-30T20:21:37.728Z" }, + { url = 
"https://files.pythonhosted.org/packages/55/64/c857eb7cd7541e9b4eee9d49c196e833128a55b89a9850a9c9ac33ccf897/rpds_py-0.30.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f207f69853edd6f6700b86efb84999651baf3789e78a466431df1331608e5324", size = 524699, upload_time = "2025-11-30T20:21:38.92Z" }, + { url = "https://files.pythonhosted.org/packages/9c/ed/94816543404078af9ab26159c44f9e98e20fe47e2126d5d32c9d9948d10a/rpds_py-0.30.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:67b02ec25ba7a9e8fa74c63b6ca44cf5707f2fbfadae3ee8e7494297d56aa9df", size = 412022, upload_time = "2025-11-30T20:21:40.407Z" }, + { url = "https://files.pythonhosted.org/packages/61/b5/707f6cf0066a6412aacc11d17920ea2e19e5b2f04081c64526eb35b5c6e7/rpds_py-0.30.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c0e95f6819a19965ff420f65578bacb0b00f251fefe2c8b23347c37174271f3", size = 390522, upload_time = "2025-11-30T20:21:42.17Z" }, + { url = "https://files.pythonhosted.org/packages/13/4e/57a85fda37a229ff4226f8cbcf09f2a455d1ed20e802ce5b2b4a7f5ed053/rpds_py-0.30.0-cp310-cp310-manylinux_2_31_riscv64.whl", hash = "sha256:a452763cc5198f2f98898eb98f7569649fe5da666c2dc6b5ddb10fde5a574221", size = 404579, upload_time = "2025-11-30T20:21:43.769Z" }, + { url = "https://files.pythonhosted.org/packages/f9/da/c9339293513ec680a721e0e16bf2bac3db6e5d7e922488de471308349bba/rpds_py-0.30.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e0b65193a413ccc930671c55153a03ee57cecb49e6227204b04fae512eb657a7", size = 421305, upload_time = "2025-11-30T20:21:44.994Z" }, + { url = "https://files.pythonhosted.org/packages/f9/be/522cb84751114f4ad9d822ff5a1aa3c98006341895d5f084779b99596e5c/rpds_py-0.30.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:858738e9c32147f78b3ac24dc0edb6610000e56dc0f700fd5f651d0a0f0eb9ff", size = 572503, upload_time = "2025-11-30T20:21:46.91Z" }, + { url = 
"https://files.pythonhosted.org/packages/a2/9b/de879f7e7ceddc973ea6e4629e9b380213a6938a249e94b0cdbcc325bb66/rpds_py-0.30.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:da279aa314f00acbb803da1e76fa18666778e8a8f83484fba94526da5de2cba7", size = 598322, upload_time = "2025-11-30T20:21:48.709Z" }, + { url = "https://files.pythonhosted.org/packages/48/ac/f01fc22efec3f37d8a914fc1b2fb9bcafd56a299edbe96406f3053edea5a/rpds_py-0.30.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:7c64d38fb49b6cdeda16ab49e35fe0da2e1e9b34bc38bd78386530f218b37139", size = 560792, upload_time = "2025-11-30T20:21:50.024Z" }, + { url = "https://files.pythonhosted.org/packages/e2/da/4e2b19d0f131f35b6146425f846563d0ce036763e38913d917187307a671/rpds_py-0.30.0-cp310-cp310-win32.whl", hash = "sha256:6de2a32a1665b93233cde140ff8b3467bdb9e2af2b91079f0333a0974d12d464", size = 221901, upload_time = "2025-11-30T20:21:51.32Z" }, + { url = "https://files.pythonhosted.org/packages/96/cb/156d7a5cf4f78a7cc571465d8aec7a3c447c94f6749c5123f08438bcf7bc/rpds_py-0.30.0-cp310-cp310-win_amd64.whl", hash = "sha256:1726859cd0de969f88dc8673bdd954185b9104e05806be64bcd87badbe313169", size = 235823, upload_time = "2025-11-30T20:21:52.505Z" }, + { url = "https://files.pythonhosted.org/packages/4d/6e/f964e88b3d2abee2a82c1ac8366da848fce1c6d834dc2132c3fda3970290/rpds_py-0.30.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a2bffea6a4ca9f01b3f8e548302470306689684e61602aa3d141e34da06cf425", size = 370157, upload_time = "2025-11-30T20:21:53.789Z" }, + { url = "https://files.pythonhosted.org/packages/94/ba/24e5ebb7c1c82e74c4e4f33b2112a5573ddc703915b13a073737b59b86e0/rpds_py-0.30.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dc4f992dfe1e2bc3ebc7444f6c7051b4bc13cd8e33e43511e8ffd13bf407010d", size = 359676, upload_time = "2025-11-30T20:21:55.475Z" }, + { url = 
"https://files.pythonhosted.org/packages/84/86/04dbba1b087227747d64d80c3b74df946b986c57af0a9f0c98726d4d7a3b/rpds_py-0.30.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:422c3cb9856d80b09d30d2eb255d0754b23e090034e1deb4083f8004bd0761e4", size = 389938, upload_time = "2025-11-30T20:21:57.079Z" }, + { url = "https://files.pythonhosted.org/packages/42/bb/1463f0b1722b7f45431bdd468301991d1328b16cffe0b1c2918eba2c4eee/rpds_py-0.30.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:07ae8a593e1c3c6b82ca3292efbe73c30b61332fd612e05abee07c79359f292f", size = 402932, upload_time = "2025-11-30T20:21:58.47Z" }, + { url = "https://files.pythonhosted.org/packages/99/ee/2520700a5c1f2d76631f948b0736cdf9b0acb25abd0ca8e889b5c62ac2e3/rpds_py-0.30.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:12f90dd7557b6bd57f40abe7747e81e0c0b119bef015ea7726e69fe550e394a4", size = 525830, upload_time = "2025-11-30T20:21:59.699Z" }, + { url = "https://files.pythonhosted.org/packages/e0/ad/bd0331f740f5705cc555a5e17fdf334671262160270962e69a2bdef3bf76/rpds_py-0.30.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:99b47d6ad9a6da00bec6aabe5a6279ecd3c06a329d4aa4771034a21e335c3a97", size = 412033, upload_time = "2025-11-30T20:22:00.991Z" }, + { url = "https://files.pythonhosted.org/packages/f8/1e/372195d326549bb51f0ba0f2ecb9874579906b97e08880e7a65c3bef1a99/rpds_py-0.30.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:33f559f3104504506a44bb666b93a33f5d33133765b0c216a5bf2f1e1503af89", size = 390828, upload_time = "2025-11-30T20:22:02.723Z" }, + { url = "https://files.pythonhosted.org/packages/ab/2b/d88bb33294e3e0c76bc8f351a3721212713629ffca1700fa94979cb3eae8/rpds_py-0.30.0-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:946fe926af6e44f3697abbc305ea168c2c31d3e3ef1058cf68f379bf0335a78d", size = 404683, upload_time = "2025-11-30T20:22:04.367Z" }, + { url = 
"https://files.pythonhosted.org/packages/50/32/c759a8d42bcb5289c1fac697cd92f6fe01a018dd937e62ae77e0e7f15702/rpds_py-0.30.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:495aeca4b93d465efde585977365187149e75383ad2684f81519f504f5c13038", size = 421583, upload_time = "2025-11-30T20:22:05.814Z" }, + { url = "https://files.pythonhosted.org/packages/2b/81/e729761dbd55ddf5d84ec4ff1f47857f4374b0f19bdabfcf929164da3e24/rpds_py-0.30.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9a0ca5da0386dee0655b4ccdf46119df60e0f10da268d04fe7cc87886872ba7", size = 572496, upload_time = "2025-11-30T20:22:07.713Z" }, + { url = "https://files.pythonhosted.org/packages/14/f6/69066a924c3557c9c30baa6ec3a0aa07526305684c6f86c696b08860726c/rpds_py-0.30.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8d6d1cc13664ec13c1b84241204ff3b12f9bb82464b8ad6e7a5d3486975c2eed", size = 598669, upload_time = "2025-11-30T20:22:09.312Z" }, + { url = "https://files.pythonhosted.org/packages/5f/48/905896b1eb8a05630d20333d1d8ffd162394127b74ce0b0784ae04498d32/rpds_py-0.30.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3896fa1be39912cf0757753826bc8bdc8ca331a28a7c4ae46b7a21280b06bb85", size = 561011, upload_time = "2025-11-30T20:22:11.309Z" }, + { url = "https://files.pythonhosted.org/packages/22/16/cd3027c7e279d22e5eb431dd3c0fbc677bed58797fe7581e148f3f68818b/rpds_py-0.30.0-cp311-cp311-win32.whl", hash = "sha256:55f66022632205940f1827effeff17c4fa7ae1953d2b74a8581baaefb7d16f8c", size = 221406, upload_time = "2025-11-30T20:22:13.101Z" }, + { url = "https://files.pythonhosted.org/packages/fa/5b/e7b7aa136f28462b344e652ee010d4de26ee9fd16f1bfd5811f5153ccf89/rpds_py-0.30.0-cp311-cp311-win_amd64.whl", hash = "sha256:a51033ff701fca756439d641c0ad09a41d9242fa69121c7d8769604a0a629825", size = 236024, upload_time = "2025-11-30T20:22:14.853Z" }, + { url = 
"https://files.pythonhosted.org/packages/14/a6/364bba985e4c13658edb156640608f2c9e1d3ea3c81b27aa9d889fff0e31/rpds_py-0.30.0-cp311-cp311-win_arm64.whl", hash = "sha256:47b0ef6231c58f506ef0b74d44e330405caa8428e770fec25329ed2cb971a229", size = 229069, upload_time = "2025-11-30T20:22:16.577Z" }, + { url = "https://files.pythonhosted.org/packages/03/e7/98a2f4ac921d82f33e03f3835f5bf3a4a40aa1bfdc57975e74a97b2b4bdd/rpds_py-0.30.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a161f20d9a43006833cd7068375a94d035714d73a172b681d8881820600abfad", size = 375086, upload_time = "2025-11-30T20:22:17.93Z" }, + { url = "https://files.pythonhosted.org/packages/4d/a1/bca7fd3d452b272e13335db8d6b0b3ecde0f90ad6f16f3328c6fb150c889/rpds_py-0.30.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6abc8880d9d036ecaafe709079969f56e876fcf107f7a8e9920ba6d5a3878d05", size = 359053, upload_time = "2025-11-30T20:22:19.297Z" }, + { url = "https://files.pythonhosted.org/packages/65/1c/ae157e83a6357eceff62ba7e52113e3ec4834a84cfe07fa4b0757a7d105f/rpds_py-0.30.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca28829ae5f5d569bb62a79512c842a03a12576375d5ece7d2cadf8abe96ec28", size = 390763, upload_time = "2025-11-30T20:22:21.661Z" }, + { url = "https://files.pythonhosted.org/packages/d4/36/eb2eb8515e2ad24c0bd43c3ee9cd74c33f7ca6430755ccdb240fd3144c44/rpds_py-0.30.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a1010ed9524c73b94d15919ca4d41d8780980e1765babf85f9a2f90d247153dd", size = 408951, upload_time = "2025-11-30T20:22:23.408Z" }, + { url = "https://files.pythonhosted.org/packages/d6/65/ad8dc1784a331fabbd740ef6f71ce2198c7ed0890dab595adb9ea2d775a1/rpds_py-0.30.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f8d1736cfb49381ba528cd5baa46f82fdc65c06e843dab24dd70b63d09121b3f", size = 514622, upload_time = "2025-11-30T20:22:25.16Z" }, + { url = 
"https://files.pythonhosted.org/packages/63/8e/0cfa7ae158e15e143fe03993b5bcd743a59f541f5952e1546b1ac1b5fd45/rpds_py-0.30.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d948b135c4693daff7bc2dcfc4ec57237a29bd37e60c2fabf5aff2bbacf3e2f1", size = 414492, upload_time = "2025-11-30T20:22:26.505Z" }, + { url = "https://files.pythonhosted.org/packages/60/1b/6f8f29f3f995c7ffdde46a626ddccd7c63aefc0efae881dc13b6e5d5bb16/rpds_py-0.30.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47f236970bccb2233267d89173d3ad2703cd36a0e2a6e92d0560d333871a3d23", size = 394080, upload_time = "2025-11-30T20:22:27.934Z" }, + { url = "https://files.pythonhosted.org/packages/6d/d5/a266341051a7a3ca2f4b750a3aa4abc986378431fc2da508c5034d081b70/rpds_py-0.30.0-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:2e6ecb5a5bcacf59c3f912155044479af1d0b6681280048b338b28e364aca1f6", size = 408680, upload_time = "2025-11-30T20:22:29.341Z" }, + { url = "https://files.pythonhosted.org/packages/10/3b/71b725851df9ab7a7a4e33cf36d241933da66040d195a84781f49c50490c/rpds_py-0.30.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a8fa71a2e078c527c3e9dc9fc5a98c9db40bcc8a92b4e8858e36d329f8684b51", size = 423589, upload_time = "2025-11-30T20:22:31.469Z" }, + { url = "https://files.pythonhosted.org/packages/00/2b/e59e58c544dc9bd8bd8384ecdb8ea91f6727f0e37a7131baeff8d6f51661/rpds_py-0.30.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:73c67f2db7bc334e518d097c6d1e6fed021bbc9b7d678d6cc433478365d1d5f5", size = 573289, upload_time = "2025-11-30T20:22:32.997Z" }, + { url = "https://files.pythonhosted.org/packages/da/3e/a18e6f5b460893172a7d6a680e86d3b6bc87a54c1f0b03446a3c8c7b588f/rpds_py-0.30.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5ba103fb455be00f3b1c2076c9d4264bfcb037c976167a6047ed82f23153f02e", size = 599737, upload_time = "2025-11-30T20:22:34.419Z" }, + { url = 
"https://files.pythonhosted.org/packages/5c/e2/714694e4b87b85a18e2c243614974413c60aa107fd815b8cbc42b873d1d7/rpds_py-0.30.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7cee9c752c0364588353e627da8a7e808a66873672bcb5f52890c33fd965b394", size = 563120, upload_time = "2025-11-30T20:22:35.903Z" }, + { url = "https://files.pythonhosted.org/packages/6f/ab/d5d5e3bcedb0a77f4f613706b750e50a5a3ba1c15ccd3665ecc636c968fd/rpds_py-0.30.0-cp312-cp312-win32.whl", hash = "sha256:1ab5b83dbcf55acc8b08fc62b796ef672c457b17dbd7820a11d6c52c06839bdf", size = 223782, upload_time = "2025-11-30T20:22:37.271Z" }, + { url = "https://files.pythonhosted.org/packages/39/3b/f786af9957306fdc38a74cef405b7b93180f481fb48453a114bb6465744a/rpds_py-0.30.0-cp312-cp312-win_amd64.whl", hash = "sha256:a090322ca841abd453d43456ac34db46e8b05fd9b3b4ac0c78bcde8b089f959b", size = 240463, upload_time = "2025-11-30T20:22:39.021Z" }, + { url = "https://files.pythonhosted.org/packages/f3/d2/b91dc748126c1559042cfe41990deb92c4ee3e2b415f6b5234969ffaf0cc/rpds_py-0.30.0-cp312-cp312-win_arm64.whl", hash = "sha256:669b1805bd639dd2989b281be2cfd951c6121b65e729d9b843e9639ef1fd555e", size = 230868, upload_time = "2025-11-30T20:22:40.493Z" }, + { url = "https://files.pythonhosted.org/packages/ed/dc/d61221eb88ff410de3c49143407f6f3147acf2538c86f2ab7ce65ae7d5f9/rpds_py-0.30.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f83424d738204d9770830d35290ff3273fbb02b41f919870479fab14b9d303b2", size = 374887, upload_time = "2025-11-30T20:22:41.812Z" }, + { url = "https://files.pythonhosted.org/packages/fd/32/55fb50ae104061dbc564ef15cc43c013dc4a9f4527a1f4d99baddf56fe5f/rpds_py-0.30.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e7536cd91353c5273434b4e003cbda89034d67e7710eab8761fd918ec6c69cf8", size = 358904, upload_time = "2025-11-30T20:22:43.479Z" }, + { url = 
"https://files.pythonhosted.org/packages/58/70/faed8186300e3b9bdd138d0273109784eea2396c68458ed580f885dfe7ad/rpds_py-0.30.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2771c6c15973347f50fece41fc447c054b7ac2ae0502388ce3b6738cd366e3d4", size = 389945, upload_time = "2025-11-30T20:22:44.819Z" }, + { url = "https://files.pythonhosted.org/packages/bd/a8/073cac3ed2c6387df38f71296d002ab43496a96b92c823e76f46b8af0543/rpds_py-0.30.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0a59119fc6e3f460315fe9d08149f8102aa322299deaa5cab5b40092345c2136", size = 407783, upload_time = "2025-11-30T20:22:46.103Z" }, + { url = "https://files.pythonhosted.org/packages/77/57/5999eb8c58671f1c11eba084115e77a8899d6e694d2a18f69f0ba471ec8b/rpds_py-0.30.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:76fec018282b4ead0364022e3c54b60bf368b9d926877957a8624b58419169b7", size = 515021, upload_time = "2025-11-30T20:22:47.458Z" }, + { url = "https://files.pythonhosted.org/packages/e0/af/5ab4833eadc36c0a8ed2bc5c0de0493c04f6c06de223170bd0798ff98ced/rpds_py-0.30.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:692bef75a5525db97318e8cd061542b5a79812d711ea03dbc1f6f8dbb0c5f0d2", size = 414589, upload_time = "2025-11-30T20:22:48.872Z" }, + { url = "https://files.pythonhosted.org/packages/b7/de/f7192e12b21b9e9a68a6d0f249b4af3fdcdff8418be0767a627564afa1f1/rpds_py-0.30.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9027da1ce107104c50c81383cae773ef5c24d296dd11c99e2629dbd7967a20c6", size = 394025, upload_time = "2025-11-30T20:22:50.196Z" }, + { url = "https://files.pythonhosted.org/packages/91/c4/fc70cd0249496493500e7cc2de87504f5aa6509de1e88623431fec76d4b6/rpds_py-0.30.0-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:9cf69cdda1f5968a30a359aba2f7f9aa648a9ce4b580d6826437f2b291cfc86e", size = 408895, upload_time = "2025-11-30T20:22:51.87Z" }, + { url = 
"https://files.pythonhosted.org/packages/58/95/d9275b05ab96556fefff73a385813eb66032e4c99f411d0795372d9abcea/rpds_py-0.30.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a4796a717bf12b9da9d3ad002519a86063dcac8988b030e405704ef7d74d2d9d", size = 422799, upload_time = "2025-11-30T20:22:53.341Z" }, + { url = "https://files.pythonhosted.org/packages/06/c1/3088fc04b6624eb12a57eb814f0d4997a44b0d208d6cace713033ff1a6ba/rpds_py-0.30.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d4c2aa7c50ad4728a094ebd5eb46c452e9cb7edbfdb18f9e1221f597a73e1e7", size = 572731, upload_time = "2025-11-30T20:22:54.778Z" }, + { url = "https://files.pythonhosted.org/packages/d8/42/c612a833183b39774e8ac8fecae81263a68b9583ee343db33ab571a7ce55/rpds_py-0.30.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ba81a9203d07805435eb06f536d95a266c21e5b2dfbf6517748ca40c98d19e31", size = 599027, upload_time = "2025-11-30T20:22:56.212Z" }, + { url = "https://files.pythonhosted.org/packages/5f/60/525a50f45b01d70005403ae0e25f43c0384369ad24ffe46e8d9068b50086/rpds_py-0.30.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:945dccface01af02675628334f7cf49c2af4c1c904748efc5cf7bbdf0b579f95", size = 563020, upload_time = "2025-11-30T20:22:58.2Z" }, + { url = "https://files.pythonhosted.org/packages/0b/5d/47c4655e9bcd5ca907148535c10e7d489044243cc9941c16ed7cd53be91d/rpds_py-0.30.0-cp313-cp313-win32.whl", hash = "sha256:b40fb160a2db369a194cb27943582b38f79fc4887291417685f3ad693c5a1d5d", size = 223139, upload_time = "2025-11-30T20:23:00.209Z" }, + { url = "https://files.pythonhosted.org/packages/f2/e1/485132437d20aa4d3e1d8b3fb5a5e65aa8139f1e097080c2a8443201742c/rpds_py-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:806f36b1b605e2d6a72716f321f20036b9489d29c51c91f4dd29a3e3afb73b15", size = 240224, upload_time = "2025-11-30T20:23:02.008Z" }, + { url = 
"https://files.pythonhosted.org/packages/24/95/ffd128ed1146a153d928617b0ef673960130be0009c77d8fbf0abe306713/rpds_py-0.30.0-cp313-cp313-win_arm64.whl", hash = "sha256:d96c2086587c7c30d44f31f42eae4eac89b60dabbac18c7669be3700f13c3ce1", size = 230645, upload_time = "2025-11-30T20:23:03.43Z" }, + { url = "https://files.pythonhosted.org/packages/ff/1b/b10de890a0def2a319a2626334a7f0ae388215eb60914dbac8a3bae54435/rpds_py-0.30.0-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:eb0b93f2e5c2189ee831ee43f156ed34e2a89a78a66b98cadad955972548be5a", size = 364443, upload_time = "2025-11-30T20:23:04.878Z" }, + { url = "https://files.pythonhosted.org/packages/0d/bf/27e39f5971dc4f305a4fb9c672ca06f290f7c4e261c568f3dea16a410d47/rpds_py-0.30.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:922e10f31f303c7c920da8981051ff6d8c1a56207dbdf330d9047f6d30b70e5e", size = 353375, upload_time = "2025-11-30T20:23:06.342Z" }, + { url = "https://files.pythonhosted.org/packages/40/58/442ada3bba6e8e6615fc00483135c14a7538d2ffac30e2d933ccf6852232/rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cdc62c8286ba9bf7f47befdcea13ea0e26bf294bda99758fd90535cbaf408000", size = 383850, upload_time = "2025-11-30T20:23:07.825Z" }, + { url = "https://files.pythonhosted.org/packages/14/14/f59b0127409a33c6ef6f5c1ebd5ad8e32d7861c9c7adfa9a624fc3889f6c/rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:47f9a91efc418b54fb8190a6b4aa7813a23fb79c51f4bb84e418f5476c38b8db", size = 392812, upload_time = "2025-11-30T20:23:09.228Z" }, + { url = "https://files.pythonhosted.org/packages/b3/66/e0be3e162ac299b3a22527e8913767d869e6cc75c46bd844aa43fb81ab62/rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1f3587eb9b17f3789ad50824084fa6f81921bbf9a795826570bda82cb3ed91f2", size = 517841, upload_time = "2025-11-30T20:23:11.186Z" }, + { url = 
"https://files.pythonhosted.org/packages/3d/55/fa3b9cf31d0c963ecf1ba777f7cf4b2a2c976795ac430d24a1f43d25a6ba/rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:39c02563fc592411c2c61d26b6c5fe1e51eaa44a75aa2c8735ca88b0d9599daa", size = 408149, upload_time = "2025-11-30T20:23:12.864Z" }, + { url = "https://files.pythonhosted.org/packages/60/ca/780cf3b1a32b18c0f05c441958d3758f02544f1d613abf9488cd78876378/rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51a1234d8febafdfd33a42d97da7a43f5dcb120c1060e352a3fbc0c6d36e2083", size = 383843, upload_time = "2025-11-30T20:23:14.638Z" }, + { url = "https://files.pythonhosted.org/packages/82/86/d5f2e04f2aa6247c613da0c1dd87fcd08fa17107e858193566048a1e2f0a/rpds_py-0.30.0-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:eb2c4071ab598733724c08221091e8d80e89064cd472819285a9ab0f24bcedb9", size = 396507, upload_time = "2025-11-30T20:23:16.105Z" }, + { url = "https://files.pythonhosted.org/packages/4b/9a/453255d2f769fe44e07ea9785c8347edaf867f7026872e76c1ad9f7bed92/rpds_py-0.30.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6bdfdb946967d816e6adf9a3d8201bfad269c67efe6cefd7093ef959683c8de0", size = 414949, upload_time = "2025-11-30T20:23:17.539Z" }, + { url = "https://files.pythonhosted.org/packages/a3/31/622a86cdc0c45d6df0e9ccb6becdba5074735e7033c20e401a6d9d0e2ca0/rpds_py-0.30.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:c77afbd5f5250bf27bf516c7c4a016813eb2d3e116139aed0096940c5982da94", size = 565790, upload_time = "2025-11-30T20:23:19.029Z" }, + { url = "https://files.pythonhosted.org/packages/1c/5d/15bbf0fb4a3f58a3b1c67855ec1efcc4ceaef4e86644665fff03e1b66d8d/rpds_py-0.30.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:61046904275472a76c8c90c9ccee9013d70a6d0f73eecefd38c1ae7c39045a08", size = 590217, upload_time = "2025-11-30T20:23:20.885Z" }, + { url = 
"https://files.pythonhosted.org/packages/6d/61/21b8c41f68e60c8cc3b2e25644f0e3681926020f11d06ab0b78e3c6bbff1/rpds_py-0.30.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4c5f36a861bc4b7da6516dbdf302c55313afa09b81931e8280361a4f6c9a2d27", size = 555806, upload_time = "2025-11-30T20:23:22.488Z" }, + { url = "https://files.pythonhosted.org/packages/f9/39/7e067bb06c31de48de3eb200f9fc7c58982a4d3db44b07e73963e10d3be9/rpds_py-0.30.0-cp313-cp313t-win32.whl", hash = "sha256:3d4a69de7a3e50ffc214ae16d79d8fbb0922972da0356dcf4d0fdca2878559c6", size = 211341, upload_time = "2025-11-30T20:23:24.449Z" }, + { url = "https://files.pythonhosted.org/packages/0a/4d/222ef0b46443cf4cf46764d9c630f3fe4abaa7245be9417e56e9f52b8f65/rpds_py-0.30.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f14fc5df50a716f7ece6a80b6c78bb35ea2ca47c499e422aa4463455dd96d56d", size = 225768, upload_time = "2025-11-30T20:23:25.908Z" }, + { url = "https://files.pythonhosted.org/packages/86/81/dad16382ebbd3d0e0328776d8fd7ca94220e4fa0798d1dc5e7da48cb3201/rpds_py-0.30.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:68f19c879420aa08f61203801423f6cd5ac5f0ac4ac82a2368a9fcd6a9a075e0", size = 362099, upload_time = "2025-11-30T20:23:27.316Z" }, + { url = "https://files.pythonhosted.org/packages/2b/60/19f7884db5d5603edf3c6bce35408f45ad3e97e10007df0e17dd57af18f8/rpds_py-0.30.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ec7c4490c672c1a0389d319b3a9cfcd098dcdc4783991553c332a15acf7249be", size = 353192, upload_time = "2025-11-30T20:23:29.151Z" }, + { url = "https://files.pythonhosted.org/packages/bf/c4/76eb0e1e72d1a9c4703c69607cec123c29028bff28ce41588792417098ac/rpds_py-0.30.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f251c812357a3fed308d684a5079ddfb9d933860fc6de89f2b7ab00da481e65f", size = 384080, upload_time = "2025-11-30T20:23:30.785Z" }, + { url = 
"https://files.pythonhosted.org/packages/72/87/87ea665e92f3298d1b26d78814721dc39ed8d2c74b86e83348d6b48a6f31/rpds_py-0.30.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac98b175585ecf4c0348fd7b29c3864bda53b805c773cbf7bfdaffc8070c976f", size = 394841, upload_time = "2025-11-30T20:23:32.209Z" }, + { url = "https://files.pythonhosted.org/packages/77/ad/7783a89ca0587c15dcbf139b4a8364a872a25f861bdb88ed99f9b0dec985/rpds_py-0.30.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3e62880792319dbeb7eb866547f2e35973289e7d5696c6e295476448f5b63c87", size = 516670, upload_time = "2025-11-30T20:23:33.742Z" }, + { url = "https://files.pythonhosted.org/packages/5b/3c/2882bdac942bd2172f3da574eab16f309ae10a3925644e969536553cb4ee/rpds_py-0.30.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4e7fc54e0900ab35d041b0601431b0a0eb495f0851a0639b6ef90f7741b39a18", size = 408005, upload_time = "2025-11-30T20:23:35.253Z" }, + { url = "https://files.pythonhosted.org/packages/ce/81/9a91c0111ce1758c92516a3e44776920b579d9a7c09b2b06b642d4de3f0f/rpds_py-0.30.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47e77dc9822d3ad616c3d5759ea5631a75e5809d5a28707744ef79d7a1bcfcad", size = 382112, upload_time = "2025-11-30T20:23:36.842Z" }, + { url = "https://files.pythonhosted.org/packages/cf/8e/1da49d4a107027e5fbc64daeab96a0706361a2918da10cb41769244b805d/rpds_py-0.30.0-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:b4dc1a6ff022ff85ecafef7979a2c6eb423430e05f1165d6688234e62ba99a07", size = 399049, upload_time = "2025-11-30T20:23:38.343Z" }, + { url = "https://files.pythonhosted.org/packages/df/5a/7ee239b1aa48a127570ec03becbb29c9d5a9eb092febbd1699d567cae859/rpds_py-0.30.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4559c972db3a360808309e06a74628b95eaccbf961c335c8fe0d590cf587456f", size = 415661, upload_time = "2025-11-30T20:23:40.263Z" }, + { url = 
"https://files.pythonhosted.org/packages/70/ea/caa143cf6b772f823bc7929a45da1fa83569ee49b11d18d0ada7f5ee6fd6/rpds_py-0.30.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0ed177ed9bded28f8deb6ab40c183cd1192aa0de40c12f38be4d59cd33cb5c65", size = 565606, upload_time = "2025-11-30T20:23:42.186Z" }, + { url = "https://files.pythonhosted.org/packages/64/91/ac20ba2d69303f961ad8cf55bf7dbdb4763f627291ba3d0d7d67333cced9/rpds_py-0.30.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:ad1fa8db769b76ea911cb4e10f049d80bf518c104f15b3edb2371cc65375c46f", size = 591126, upload_time = "2025-11-30T20:23:44.086Z" }, + { url = "https://files.pythonhosted.org/packages/21/20/7ff5f3c8b00c8a95f75985128c26ba44503fb35b8e0259d812766ea966c7/rpds_py-0.30.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:46e83c697b1f1c72b50e5ee5adb4353eef7406fb3f2043d64c33f20ad1c2fc53", size = 553371, upload_time = "2025-11-30T20:23:46.004Z" }, + { url = "https://files.pythonhosted.org/packages/72/c7/81dadd7b27c8ee391c132a6b192111ca58d866577ce2d9b0ca157552cce0/rpds_py-0.30.0-cp314-cp314-win32.whl", hash = "sha256:ee454b2a007d57363c2dfd5b6ca4a5d7e2c518938f8ed3b706e37e5d470801ed", size = 215298, upload_time = "2025-11-30T20:23:47.696Z" }, + { url = "https://files.pythonhosted.org/packages/3e/d2/1aaac33287e8cfb07aab2e6b8ac1deca62f6f65411344f1433c55e6f3eb8/rpds_py-0.30.0-cp314-cp314-win_amd64.whl", hash = "sha256:95f0802447ac2d10bcc69f6dc28fe95fdf17940367b21d34e34c737870758950", size = 228604, upload_time = "2025-11-30T20:23:49.501Z" }, + { url = "https://files.pythonhosted.org/packages/e8/95/ab005315818cc519ad074cb7784dae60d939163108bd2b394e60dc7b5461/rpds_py-0.30.0-cp314-cp314-win_arm64.whl", hash = "sha256:613aa4771c99f03346e54c3f038e4cc574ac09a3ddfb0e8878487335e96dead6", size = 222391, upload_time = "2025-11-30T20:23:50.96Z" }, + { url = 
"https://files.pythonhosted.org/packages/9e/68/154fe0194d83b973cdedcdcc88947a2752411165930182ae41d983dcefa6/rpds_py-0.30.0-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:7e6ecfcb62edfd632e56983964e6884851786443739dbfe3582947e87274f7cb", size = 364868, upload_time = "2025-11-30T20:23:52.494Z" }, + { url = "https://files.pythonhosted.org/packages/83/69/8bbc8b07ec854d92a8b75668c24d2abcb1719ebf890f5604c61c9369a16f/rpds_py-0.30.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:a1d0bc22a7cdc173fedebb73ef81e07faef93692b8c1ad3733b67e31e1b6e1b8", size = 353747, upload_time = "2025-11-30T20:23:54.036Z" }, + { url = "https://files.pythonhosted.org/packages/ab/00/ba2e50183dbd9abcce9497fa5149c62b4ff3e22d338a30d690f9af970561/rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0d08f00679177226c4cb8c5265012eea897c8ca3b93f429e546600c971bcbae7", size = 383795, upload_time = "2025-11-30T20:23:55.556Z" }, + { url = "https://files.pythonhosted.org/packages/05/6f/86f0272b84926bcb0e4c972262f54223e8ecc556b3224d281e6598fc9268/rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5965af57d5848192c13534f90f9dd16464f3c37aaf166cc1da1cae1fd5a34898", size = 393330, upload_time = "2025-11-30T20:23:57.033Z" }, + { url = "https://files.pythonhosted.org/packages/cb/e9/0e02bb2e6dc63d212641da45df2b0bf29699d01715913e0d0f017ee29438/rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a4e86e34e9ab6b667c27f3211ca48f73dba7cd3d90f8d5b11be56e5dbc3fb4e", size = 518194, upload_time = "2025-11-30T20:23:58.637Z" }, + { url = "https://files.pythonhosted.org/packages/ee/ca/be7bca14cf21513bdf9c0606aba17d1f389ea2b6987035eb4f62bd923f25/rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5d3e6b26f2c785d65cc25ef1e5267ccbe1b069c5c21b8cc724efee290554419", size = 408340, upload_time = "2025-11-30T20:24:00.2Z" }, + { url = 
"https://files.pythonhosted.org/packages/c2/c7/736e00ebf39ed81d75544c0da6ef7b0998f8201b369acf842f9a90dc8fce/rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:626a7433c34566535b6e56a1b39a7b17ba961e97ce3b80ec62e6f1312c025551", size = 383765, upload_time = "2025-11-30T20:24:01.759Z" }, + { url = "https://files.pythonhosted.org/packages/4a/3f/da50dfde9956aaf365c4adc9533b100008ed31aea635f2b8d7b627e25b49/rpds_py-0.30.0-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:acd7eb3f4471577b9b5a41baf02a978e8bdeb08b4b355273994f8b87032000a8", size = 396834, upload_time = "2025-11-30T20:24:03.687Z" }, + { url = "https://files.pythonhosted.org/packages/4e/00/34bcc2565b6020eab2623349efbdec810676ad571995911f1abdae62a3a0/rpds_py-0.30.0-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fe5fa731a1fa8a0a56b0977413f8cacac1768dad38d16b3a296712709476fbd5", size = 415470, upload_time = "2025-11-30T20:24:05.232Z" }, + { url = "https://files.pythonhosted.org/packages/8c/28/882e72b5b3e6f718d5453bd4d0d9cf8df36fddeb4ddbbab17869d5868616/rpds_py-0.30.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:74a3243a411126362712ee1524dfc90c650a503502f135d54d1b352bd01f2404", size = 565630, upload_time = "2025-11-30T20:24:06.878Z" }, + { url = "https://files.pythonhosted.org/packages/3b/97/04a65539c17692de5b85c6e293520fd01317fd878ea1995f0367d4532fb1/rpds_py-0.30.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:3e8eeb0544f2eb0d2581774be4c3410356eba189529a6b3e36bbbf9696175856", size = 591148, upload_time = "2025-11-30T20:24:08.445Z" }, + { url = "https://files.pythonhosted.org/packages/85/70/92482ccffb96f5441aab93e26c4d66489eb599efdcf96fad90c14bbfb976/rpds_py-0.30.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:dbd936cde57abfee19ab3213cf9c26be06d60750e60a8e4dd85d1ab12c8b1f40", size = 556030, upload_time = "2025-11-30T20:24:10.956Z" }, + { url = 
"https://files.pythonhosted.org/packages/20/53/7c7e784abfa500a2b6b583b147ee4bb5a2b3747a9166bab52fec4b5b5e7d/rpds_py-0.30.0-cp314-cp314t-win32.whl", hash = "sha256:dc824125c72246d924f7f796b4f63c1e9dc810c7d9e2355864b3c3a73d59ade0", size = 211570, upload_time = "2025-11-30T20:24:12.735Z" }, + { url = "https://files.pythonhosted.org/packages/d0/02/fa464cdfbe6b26e0600b62c528b72d8608f5cc49f96b8d6e38c95d60c676/rpds_py-0.30.0-cp314-cp314t-win_amd64.whl", hash = "sha256:27f4b0e92de5bfbc6f86e43959e6edd1425c33b5e69aab0984a72047f2bcf1e3", size = 226532, upload_time = "2025-11-30T20:24:14.634Z" }, + { url = "https://files.pythonhosted.org/packages/69/71/3f34339ee70521864411f8b6992e7ab13ac30d8e4e3309e07c7361767d91/rpds_py-0.30.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:c2262bdba0ad4fc6fb5545660673925c2d2a5d9e2e0fb603aad545427be0fc58", size = 372292, upload_time = "2025-11-30T20:24:16.537Z" }, + { url = "https://files.pythonhosted.org/packages/57/09/f183df9b8f2d66720d2ef71075c59f7e1b336bec7ee4c48f0a2b06857653/rpds_py-0.30.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:ee6af14263f25eedc3bb918a3c04245106a42dfd4f5c2285ea6f997b1fc3f89a", size = 362128, upload_time = "2025-11-30T20:24:18.086Z" }, + { url = "https://files.pythonhosted.org/packages/7a/68/5c2594e937253457342e078f0cc1ded3dd7b2ad59afdbf2d354869110a02/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3adbb8179ce342d235c31ab8ec511e66c73faa27a47e076ccc92421add53e2bb", size = 391542, upload_time = "2025-11-30T20:24:20.092Z" }, + { url = "https://files.pythonhosted.org/packages/49/5c/31ef1afd70b4b4fbdb2800249f34c57c64beb687495b10aec0365f53dfc4/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:250fa00e9543ac9b97ac258bd37367ff5256666122c2d0f2bc97577c60a1818c", size = 404004, upload_time = "2025-11-30T20:24:22.231Z" }, + { url = 
"https://files.pythonhosted.org/packages/e3/63/0cfbea38d05756f3440ce6534d51a491d26176ac045e2707adc99bb6e60a/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9854cf4f488b3d57b9aaeb105f06d78e5529d3145b1e4a41750167e8c213c6d3", size = 527063, upload_time = "2025-11-30T20:24:24.302Z" }, + { url = "https://files.pythonhosted.org/packages/42/e6/01e1f72a2456678b0f618fc9a1a13f882061690893c192fcad9f2926553a/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:993914b8e560023bc0a8bf742c5f303551992dcb85e247b1e5c7f4a7d145bda5", size = 413099, upload_time = "2025-11-30T20:24:25.916Z" }, + { url = "https://files.pythonhosted.org/packages/b8/25/8df56677f209003dcbb180765520c544525e3ef21ea72279c98b9aa7c7fb/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58edca431fb9b29950807e301826586e5bbf24163677732429770a697ffe6738", size = 392177, upload_time = "2025-11-30T20:24:27.834Z" }, + { url = "https://files.pythonhosted.org/packages/4a/b4/0a771378c5f16f8115f796d1f437950158679bcd2a7c68cf251cfb00ed5b/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:dea5b552272a944763b34394d04577cf0f9bd013207bc32323b5a89a53cf9c2f", size = 406015, upload_time = "2025-11-30T20:24:29.457Z" }, + { url = "https://files.pythonhosted.org/packages/36/d8/456dbba0af75049dc6f63ff295a2f92766b9d521fa00de67a2bd6427d57a/rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ba3af48635eb83d03f6c9735dfb21785303e73d22ad03d489e88adae6eab8877", size = 423736, upload_time = "2025-11-30T20:24:31.22Z" }, + { url = "https://files.pythonhosted.org/packages/13/64/b4d76f227d5c45a7e0b796c674fd81b0a6c4fbd48dc29271857d8219571c/rpds_py-0.30.0-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:dff13836529b921e22f15cb099751209a60009731a68519630a24d61f0b1b30a", size = 573981, upload_time = "2025-11-30T20:24:32.934Z" }, + { url 
= "https://files.pythonhosted.org/packages/20/91/092bacadeda3edf92bf743cc96a7be133e13a39cdbfd7b5082e7ab638406/rpds_py-0.30.0-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:1b151685b23929ab7beec71080a8889d4d6d9fa9a983d213f07121205d48e2c4", size = 599782, upload_time = "2025-11-30T20:24:35.169Z" }, + { url = "https://files.pythonhosted.org/packages/d1/b7/b95708304cd49b7b6f82fdd039f1748b66ec2b21d6a45180910802f1abf1/rpds_py-0.30.0-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:ac37f9f516c51e5753f27dfdef11a88330f04de2d564be3991384b2f3535d02e", size = 562191, upload_time = "2025-11-30T20:24:36.853Z" }, +] + [[package]] name = "ruff" version = "0.12.0" @@ -1200,14 +1833,15 @@ wheels = [ [[package]] name = "starlette" -version = "0.46.2" +version = "0.49.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ce/20/08dfcd9c983f6a6f4a1000d934b9e6d626cff8d2eeb77a89a68eef20a2b7/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5", size = 2580846, upload_time = "2025-04-13T13:56:17.942Z" } +sdist = { url = "https://files.pythonhosted.org/packages/1b/3f/507c21db33b66fb027a332f2cb3abbbe924cc3a79ced12f01ed8645955c9/starlette-0.49.1.tar.gz", hash = "sha256:481a43b71e24ed8c43b11ea02f5353d77840e01480881b8cb5a26b8cae64a8cb", size = 2654703, upload_time = "2025-10-28T17:34:10.928Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/8b/0c/9d30a4ebeb6db2b25a841afbb80f6ef9a854fc3b41be131d249a977b4959/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35", size = 72037, upload_time = "2025-04-13T13:56:16.21Z" }, + { url = "https://files.pythonhosted.org/packages/51/da/545b75d420bb23b5d494b0517757b351963e974e79933f01e05c929f20a6/starlette-0.49.1-py3-none-any.whl", hash = 
"sha256:d92ce9f07e4a3caa3ac13a79523bd18e3bc0042bb8ff2d759a8e7dd0e1859875", size = 74175, upload_time = "2025-10-28T17:34:09.13Z" }, ] [[package]] @@ -1251,17 +1885,17 @@ wheels = [ [[package]] name = "typer" -version = "0.15.2" +version = "0.24.1" source = { registry = "https://pypi.org/simple" } dependencies = [ + { name = "annotated-doc" }, { name = "click" }, { name = "rich" }, { name = "shellingham" }, - { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/8b/6f/3991f0f1c7fcb2df31aef28e0594d8d54b05393a0e4e34c65e475c2a5d41/typer-0.15.2.tar.gz", hash = "sha256:ab2fab47533a813c49fe1f16b1a370fd5819099c00b119e0633df65f22144ba5", size = 100711, upload_time = "2025-02-27T19:17:34.807Z" } +sdist = { url = "https://files.pythonhosted.org/packages/f5/24/cb09efec5cc954f7f9b930bf8279447d24618bb6758d4f6adf2574c41780/typer-0.24.1.tar.gz", hash = "sha256:e39b4732d65fbdcde189ae76cf7cd48aeae72919dea1fdfc16593be016256b45", size = 118613, upload_time = "2026-02-21T16:54:40.609Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/7f/fc/5b29fea8cee020515ca82cc68e3b8e1e34bb19a3535ad854cac9257b414c/typer-0.15.2-py3-none-any.whl", hash = "sha256:46a499c6107d645a9c13f7ee46c5d5096cae6f5fc57dd11eccbbb9ae3e44ddfc", size = 45061, upload_time = "2025-02-27T19:17:32.111Z" }, + { url = "https://files.pythonhosted.org/packages/4a/91/48db081e7a63bb37284f9fbcefda7c44c277b18b0e13fbc36ea2335b71e6/typer-0.24.1-py3-none-any.whl", hash = "sha256:112c1f0ce578bfb4cab9ffdabc68f031416ebcc216536611ba21f04e9aa84c9e", size = 56085, upload_time = "2026-02-21T16:54:41.616Z" }, ] [[package]] @@ -1275,23 +1909,23 @@ wheels = [ [[package]] name = "typing-inspection" -version = "0.4.0" +version = "0.4.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/82/5c/e6082df02e215b846b4b8c0b887a64d7d08ffaba30605502639d44c06b82/typing_inspection-0.4.0.tar.gz", 
hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122", size = 76222, upload_time = "2025-02-25T17:27:59.638Z" } +sdist = { url = "https://files.pythonhosted.org/packages/55/e3/70399cb7dd41c10ac53367ae42139cf4b1ca5f36bb3dc6c9d33acdb43655/typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464", size = 75949, upload_time = "2025-10-01T02:14:41.687Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/31/08/aa4fdfb71f7de5176385bd9e90852eaf6b5d622735020ad600f2bab54385/typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f", size = 14125, upload_time = "2025-02-25T17:27:57.754Z" }, + { url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload_time = "2025-10-01T02:14:40.154Z" }, ] [[package]] name = "urllib3" -version = "2.4.0" +version = "2.6.3" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/8a/78/16493d9c386d8e60e442a35feac5e00f0913c0f4b7c217c11e8ec2ff53e0/urllib3-2.4.0.tar.gz", hash = "sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466", size = 390672, upload_time = "2025-04-10T15:23:39.232Z" } +sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload_time = "2026-01-07T16:24:43.925Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/6b/11/cc635220681e93a0183390e26485430ca2c7b5f9d33b15c74c2861cb8091/urllib3-2.4.0-py3-none-any.whl", hash = "sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813", size = 
128680, upload_time = "2025-04-10T15:23:37.377Z" }, + { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload_time = "2026-01-07T16:24:42.685Z" }, ] [[package]]