Elixir Interface / Adapter for Google Gemini LLM, for both AI Studio and Vertex AI
Updated Mar 5, 2026 - Elixir
An Elixir SDK for Claude Code - provides programmatic access to Claude Code CLI with streaming message processing
OpenAI Codex SDK written in Elixir
Agent Session Manager - A comprehensive Elixir library for managing AI agent sessions, state persistence, conversation context, and multi-agent orchestration workflows
Ollixir provides a first-class Elixir client with feature parity to the official ollama-python library. Ollixir runs large language models locally or on your infrastructure via Ollama.
Protocol-based AI adapter foundation for Elixir - unified abstractions for gemini_ex, claude_agent_sdk, codex_sdk with automatic fallback, capability detection, and telemetry
Elixir client SDK for the Jules API - orchestrate AI coding sessions
Native Elixir SDK for the Notion API — comprehensive, idiomatic client for Notion workspaces, databases, pages, blocks, users, comments, and search. Built on OTP with supervised HTTP, automatic rate limiting, pagination helpers, and robust error handling for BEAM applications.
🛠️ Enhance your Elixir development with Claude Code plugins for smoother coding, formatting, and efficient project management.
Native Elixir SDK for the GitHub REST API — comprehensive, idiomatic client for repositories, issues, pull requests, actions, organizations, users, apps, and more. Built on OTP with supervised HTTP, OAuth and token onboarding, pagination helpers, rate-limit awareness, and robust error handling for BEAM applications.
Full-featured Elixir client for the Model Context Protocol (MCP) with multi-transport support, resources, prompts, tools, and telemetry.
vLLM - High-throughput, memory-efficient LLM inference engine with PagedAttention, continuous batching, CUDA/HIP optimization, quantization (GPTQ/AWQ/INT4/INT8/FP8), tensor/pipeline parallelism, OpenAI-compatible API, multi-GPU/TPU/Neuron support, prefix caching, and multi-LoRA capabilities
Foundational Elixir runtime library for deterministic CLI subprocess orchestration, normalized event and payload semantics, provider profile contracts, registry plumbing, and shared support modules for the nshkr AI SDK runtime stack rearchitecture.
Shared LLM Actions for NSAI runtimes. Wraps PortfolioCore adapters with Jido.Action semantics and CrucibleIR.Backend input/output to centralize provider access.
An Elixir SDK for the Gemini CLI — Build AI-powered applications with Google Gemini via a robust, idiomatic wrapper around the Gemini CLI. Features streaming, structured output, session management, model selection, and OTP supervision tree integration for production-grade Gemini-powered Elixir apps.
🤖 Enable local large language models with Ollixir, the Elixir client mirroring the ollama-python library for seamless chat, generation, and model management.
Elixir SDK for the Amp CLI — provides a comprehensive client library for interacting with Amp's AI-powered coding agent, including thread management, tool orchestration, streaming responses, and programmatic access to Amp's full feature set from Elixir/OTP applications
Prompt and parsing utilities for Crucible and NSAI. Provides templating, schema validation, structured output parsing, and tool-call helpers for consistent LLM IO.