Works With Your Stack
OpenAI-compatible API. Drop-in replacement for any OpenAI SDK. Works with LangChain, LlamaIndex, Vercel AI SDK, and more.
OpenAI Compatible
Drop-in replacement for the OpenAI API. Change one line of code, keep your entire codebase. Full compatibility with chat completions, embeddings, and more.
Python & Node.js SDKs
Works with the official OpenAI Python and Node.js packages out of the box. No custom SDK needed, no wrapper libraries.
Framework Support
Native compatibility with LangChain, LlamaIndex, Vercel AI SDK, and CrewAI. Plug Requesty into your existing AI pipelines instantly.
400+ Models, One API
Access Anthropic, OpenAI, Google, DeepSeek, Meta, Mistral, and more through a single endpoint. Switch models without changing your code.
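Because every provider sits behind the same endpoint, switching models is just a different model string. A minimal sketch of that idea (the helper function and the model IDs other than the Claude one shown elsewhere on this page are illustrative):

```python
def chat_payload(model: str, prompt: str) -> dict:
    # The OpenAI-compatible chat payload is identical for every provider;
    # only the model identifier changes.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same request shape, different providers (model IDs are illustrative)
payloads = [
    chat_payload(m, "Hello!")
    for m in [
        "anthropic/claude-sonnet-4-20250514",
        "openai/gpt-4o",
        "deepseek/deepseek-chat",
    ]
]
```

Each payload differs only in its `model` field, which is what makes provider switching a one-string change.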
# Before
client = OpenAI(api_key="sk-...")

# After
client = OpenAI(
    api_key="req-...",
    base_url="https://router.requesty.ai/v1"
)

One Line to 400+ Models
Use the OpenAI Python package with Requesty's base URL
from openai import OpenAI

client = OpenAI(
    base_url="https://router.requesty.ai/v1",
    api_key="your-requesty-key"
)

# Access any model from any provider
response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}]
)
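Because the router speaks the OpenAI wire format, any HTTP client works, not just the official SDKs. A sketch using only the Python standard library (the request is built but not sent; the API key is a placeholder):

```python
import json
import urllib.request

payload = {
    "model": "anthropic/claude-sonnet-4-20250514",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# POST to the OpenAI-compatible chat completions endpoint
req = urllib.request.Request(
    "https://router.requesty.ai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer your-requesty-key",  # placeholder key
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; the response JSON follows
# the OpenAI chat-completions schema (choices[0].message.content).
```

This is the same request the official SDKs construct under the hood, which is why a single `base_url` change is all the migration requires.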