Customize and Extend Claude Code with ccproxy: Route to OpenAI, Gemini, Qwen, OpenRouter, and Ollama. Gain full control of your Claude Max/Pro Subscription with your own router.
Updated Oct 8, 2025 · Python
XAI
Setup scripts for using TensorBlock Forge with Claude Code - access any AI model through Claude's interface
Experimental application that integrates Spring AI and CodeGate
An AI chat proxy with universal tool access, protocol conversion, load balancing, key isolation, prompt enhancement, a centralized MCP hub, and built-in WebSearch & WebFetch; more than an AI assistant for chat, translation, mind maps, flowcharts, and search.
Lightweight AI inference gateway - local model registry & parameter transformer (Python SDK) - with optional Envoy proxy processor and FastAPI registry server deployment options.
A lightweight proxy for LLM API calls with guardrails, metrics, and monitoring. A vibe coding experiment.
AI Proxy Server - A high-performance, secure unified API gateway for multiple LLM providers (OpenAI, Gemini, Groq, OpenRouter, Cloudflare) with intelligent routing, rate limiting, and streaming support. Features modular architecture, enhanced security, and optimized performance.
Hybrid AI routing: LOCAL Ollama + CLOUD GitHub Copilot
PoC to test the feasibility of a simple AI proxy with monitoring
🚀 Enterprise AI Proxy: Claude Code SDK + LiteLLM integration with AWS masking, Redis persistence, TDD architecture. Complete headless mode support for production deployment.
Low-cost relay APIs: why relay APIs are cheaper, how AI relay platforms keep prices low, and recommendations for low-cost ChatGPT, OpenAI, Gemini, and Claude relay APIs
🛠️ Proxy between Claude Code, LiteLLM, and other LLM providers, enabling streaming, tool use, and multi-turn chats effortlessly.
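The projects above share one core mechanism: map an incoming model name to a provider endpoint and forward the request there. A minimal sketch of that routing step is below; the provider table and URLs are illustrative assumptions, not taken from any specific repository listed here.

```python
# Prefix-based provider routing, the common pattern behind these AI proxies.
# The PROVIDERS table is a hypothetical example configuration.
PROVIDERS = {
    "gpt": "https://api.openai.com/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta",
    "qwen": "https://dashscope.aliyuncs.com/compatible-mode/v1",
    "ollama": "http://localhost:11434/v1",
}

def pick_base_url(model: str,
                  default: str = "https://api.anthropic.com/v1") -> str:
    """Return the provider base URL whose prefix matches the model name,
    falling back to the default (Anthropic) endpoint otherwise."""
    for prefix, base_url in PROVIDERS.items():
        if model.lower().startswith(prefix):
            return base_url
    return default
```

A real proxy layers more on top of this lookup: translating between the Anthropic Messages and OpenAI Chat Completions request/response schemas, attaching per-provider API keys, and streaming tokens back to the client.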