Requesty
Plug & Play

Works With Your Stack

OpenAI-compatible API. Drop-in replacement for any OpenAI SDK. Works with LangChain, LlamaIndex, Vercel AI SDK, and more.

View documentation
Integrations

OpenAI Compatible

Drop-in replacement for the OpenAI API. Change one line of code, keep your entire codebase. Full compatibility with chat completions, embeddings, and more.

Python & Node.js SDKs

Works with the official OpenAI Python and Node.js packages out of the box. No custom SDK needed, no wrapper libraries.

Framework Support

Native compatibility with LangChain, LlamaIndex, Vercel AI SDK, and CrewAI. Plug Requesty into your existing AI pipelines instantly.

400+ Models, One API

Access models from Anthropic, OpenAI, Google, DeepSeek, Meta, Mistral, and more through a single endpoint. Switch models without changing code.
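Because every model sits behind the same chat-completions interface, switching providers is just a different model string. A minimal illustration of the request payloads (model IDs follow Requesty's `provider/model` convention; the exact IDs are examples):

```python
def chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same request shape for every provider -- only the model string changes.
claude = chat_payload("anthropic/claude-sonnet-4-20250514", "Hello!")
gpt = chat_payload("openai/gpt-4o", "Hello!")
```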

API Compatibility
100%
Before
client = OpenAI(
  api_key="sk-..."
)
After (one line change)
client = OpenAI(
  api_key="req-...",
  base_url="https://router.requesty.ai/v1"
)
Chat Completions
Embeddings
Streaming
Function Calling
Vision
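Because the endpoint speaks the OpenAI wire format, even a raw HTTP client works. A sketch using only Python's standard library (the request is built but not sent; a real call needs a valid key):

```python
import json
import urllib.request

# An OpenAI-compatible chat-completions request, aimed at Requesty's router.
body = json.dumps({
    "model": "anthropic/claude-sonnet-4-20250514",
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode()

req = urllib.request.Request(
    "https://router.requesty.ai/v1/chat/completions",
    data=body,
    headers={
        "Authorization": "Bearer req-your-requesty-key",  # placeholder key
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send it; omitted here.
```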

One Line to 400+ Models

Use the OpenAI Python package with Requesty's base URL

main.py
Python
from openai import OpenAI

client = OpenAI(
    base_url="https://router.requesty.ai/v1",
    api_key="your-requesty-key"
)

# Access any model from any provider
response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}]
)
OpenAI Compatible
Streaming Support
Function Calling
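With `stream=True`, responses arrive as OpenAI-style server-sent events. A sketch of assembling text from that chunk format with the standard library (the sample lines are illustrative, not captured output):

```python
import json

# Illustrative SSE lines in the OpenAI streaming chunk format.
sse_lines = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo!"}}]}',
    "data: [DONE]",
]

def collect_text(lines):
    """Concatenate delta content from OpenAI-style streaming chunks."""
    parts = []
    for line in lines:
        payload = line.removeprefix("data: ")
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

print(collect_text(sse_lines))  # Hello!
```

The official SDK handles this parsing for you; iterating the response object yields the same per-chunk deltas.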

Start integrating with Requesty

$6 in free credits. No credit card required.

View pricing