We're excited to announce that Grok 3, xAI's next-generation AI model with powerful reasoning capabilities, is now available for seamless integration via the Requesty Router. Whether you're using Cline, Roo Code, OpenWebUI, or other open-source tools, connecting Grok 3 to your development workflow has never been easier. In this post, we'll show you how to get started, share some best practices, and invite you to join our community on Discord to stay ahead of the curve.
What is Grok 3?
Grok 3 is xAI's latest and most capable large language model. Building on massive pretraining and advanced reinforcement learning for reasoning, Grok 3 excels at:
- **Mathematics & Problem Solving:** Competition-level math performance and robust logical reasoning.
- **Coding & Debugging:** Generates well-structured code across popular languages, plus it can systematically debug multi-file projects.
- **World Knowledge & QA:** Handles graduate-level knowledge tasks, general domain questions, and complicated research queries.
- **Instruction-Following & Creativity:** Delivers concise, user-aligned responses and can generate creative content ranging from stories to design ideas.
Grok 3 ships with two variants:
- **Grok 3 Beta** (fast, broad domain coverage, top-tier performance)
- **Grok 3 mini** (cost-efficient, optimized for shorter queries and smaller contexts)
Additionally, both models have a (Think) variant for deeper, multi-step reasoning, perfect for complex tasks.
Why Integrate via Requesty Router?
The Requesty Router provides a single, OpenAI-compatible API endpoint that connects you to over 50 different large language models, including Grok 3. By placing Grok 3 behind a unified interface, you can:
- **Reduce Complexity:** No separate signups or key management for each model; one key unlocks them all.
- **Effortlessly Switch Models:** Dynamically route requests to different LLMs based on cost, speed, or capability, without rewriting your code.
- **Monitor Usage & Costs:** A centralized dashboard shows consumption across all models, helping you optimize budgets.
- **Automatic Fallbacks & Retries:** If one model or provider is down, your requests can automatically fail over to an alternative, keeping your workflows humming.
- **Community & Support:** Receive hands-on help, examples, and best practices from the Requesty community and support team.
Setup Overview
At a high level, here's how you'll get Grok 3 working in various open-source tools:
1. Sign up for a Requesty Router account (or use your existing one).
2. Obtain your unified Router API Key (from app.requesty.ai/router).
3. Configure your chosen tool (Cline, Roo Code, OpenWebUI, etc.) to point at:
   - Base URL: https://router.requesty.ai/v1
   - Model: xai/grok-3:beta or xai/grok-3-mini:beta (for deep reasoning, xai/grok-3:beta-think or xai/grok-3-mini:beta-think)
   - Auth Header: Authorization: Bearer <YOUR_ROUTER_API_KEY> (OpenAI-compatible header)
4. Send requests as if you're calling an OpenAI-style completion or chat endpoint.
5. Enjoy the benefits of advanced reasoning and world-class language generation.
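Concretely, those steps translate into a standard OpenAI-style chat completion call. Here's a minimal Python sketch using only the standard library; the endpoint and model IDs come from the setup above, and the API key is a placeholder you'd replace with your own:

```python
import json

# Values from the setup steps above.
BASE_URL = "https://router.requesty.ai/v1"
API_KEY = "<YOUR_ROUTER_API_KEY>"  # placeholder; use your real Router key

def build_chat_request(model: str, user_message: str) -> tuple[str, dict, bytes]:
    """Assemble an OpenAI-compatible chat completion request
    (URL, headers, JSON body) for the Requesty Router."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode("utf-8")
    return url, headers, body

url, headers, body = build_chat_request("xai/grok-3:beta", "Explain quicksort briefly.")
# To actually send the request with a real key:
#   import urllib.request
#   req = urllib.request.Request(url, data=body, headers=headers)
#   print(json.load(urllib.request.urlopen(req)))
```

Because the endpoint is OpenAI-compatible, any OpenAI SDK or client that lets you override the base URL will work the same way.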
Using Grok 3 with Cline
Requesty Routing for Cline
Many of our users pair the Requesty Router with the Cline coding agent to quickly switch between model providers. It's straightforward:
1. Select "Requesty" from the API Provider dropdown inside Cline's settings.
2. Add your API Key: you can create or retrieve it on the Router Page in our platform.
3. Paste your Model ID: you'll find Grok 3 or other models in the Model List.
We've created dedicated models for Cline. If you want to use those, the format is slightly different from the standard model names. You can find more details in the Dedicated Models documentation on the Requesty platform.
Example: If you set an alias "coding" for Grok 3 in the Requesty dashboard, you can reference it in Cline as alias/coding.
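An alias like alias/coding drops in anywhere a full model ID would go, so one simple pattern for zero-code switching (an illustrative convention, not a Requesty requirement) is to read the model ID from an environment variable:

```python
import os

def active_model(default: str = "alias/coding") -> str:
    """Resolve the model ID from the ROUTER_MODEL environment
    variable (hypothetical name), falling back to an alias.
    Swapping between an alias and a full ID like
    'xai/grok-3:beta' then needs no code change."""
    return os.environ.get("ROUTER_MODEL", default)
```

Setting `ROUTER_MODEL=xai/grok-3-mini:beta` in your shell would retarget every request without touching the code.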
Quick-Start Example
1. Open Cline (either in VS Code after installing the extension, or via CLI).
2. Select Requesty as the provider and input your Router API Key.
3. Choose a model (e.g., xai/grok-3:beta or use a dedicated alias).
4. Ask a coding question or request code generation. Cline routes the call to Grok 3 automatically.
Using Grok 3 with Roo Code
Roo Code also supports direct integration with the Requesty Router. The process is similar:
1. Open Roo Code and look for the provider settings.
2. Pick "Requesty" as your API provider.
3. Enter your Requesty Router API Key.
4. Paste the Grok 3 model ID (xai/grok-3:beta), or use an alias like alias/coding.
Roo Code will now direct your coding prompts and completions through Grok 3. Switching to another model in Roo Code is just as easy: change the model ID or alias in the settings.
Using Grok 3 with OpenWebUI
OpenWebUI (OWUI) is a popular browser-based UI for local or remote LLM endpoints. It supports OpenAI-compatible calls out of the box:
1. Launch OpenWebUI and go to the Providers/Settings section.
2. Select "OpenAI-compatible" as your provider type.
3. Input the base endpoint: https://router.requesty.ai/v1.
4. Paste your API key in the "Bearer Token" or "API Key" field.
5. Set the model name: xai/grok-3:beta (or xai/grok-3-mini:beta, or a Think variant).
6. Save & refresh. You can now interact with Grok 3 in your browser and watch the real-time responses.
VS Code Extension & Instant Model Switching
Another handy option is the Requesty VS Code extension, which lets you switch LLMs on the fly right inside your editor:
1. **Get Your API Key:** Go to app.requesty.ai/router to generate or copy your key.
2. **Install the Extension:** In VS Code, search "Requesty" in the Extensions panel, then click Install.
3. **Add Your Key:** Click the Requesty icon in the sidebar and paste your key when prompted.
4. **Create an Alias:** For example, set Grok 3 as your "coding" model.
5. **Use the Alias:** In Cline, Roo Code, or other tools, set the model_id to alias/coding.
Now you can star (⭐) your favorite models in the Requesty dashboard for quick reference, switch them anytime, and keep all your usage tracking centralized.
Power Tips: Making the Most of Grok 3
- **Use "Think" When You Need Depth:** For routine tasks, standard Grok 3 is quick and efficient. For complex problem-solving or multi-step reasoning, use xai/grok-3:beta-think for a rich chain of thought.
- **Leverage Context Windows:** Grok 3 can handle up to 1M tokens in the Beta. Provide all relevant details or conversation history for the best results.
- **Combine with Other Models:** The Requesty Router allows fallback or "split" strategies: send simple queries to a cheaper model and heavy tasks to Grok 3.
- **Monitor Usage:** The Requesty dashboard shows token usage, costs, and performance data so you can avoid surprises.
- **Experiment with System Prompts:** If your tool supports "system" or "role" messages, use them to define style or domain constraints. Grok 3 respects these for improved alignment.
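The "split" idea above can also live on the client side. A sketch of what that routing logic might look like (the length threshold and fallback order are illustrative assumptions, not Requesty defaults):

```python
def pick_model(prompt: str, needs_reasoning: bool = False) -> str:
    """Send short, simple prompts to the cheaper mini model and
    heavier or reasoning-oriented work to full Grok 3."""
    if needs_reasoning:
        return "xai/grok-3:beta-think"  # Think variant for multi-step reasoning
    if len(prompt) < 500:              # crude cost heuristic, tune to taste
        return "xai/grok-3-mini:beta"
    return "xai/grok-3:beta"

def fallback_chain(primary: str) -> list[str]:
    """Order of models to try if the primary is unavailable.
    The router can also handle fallbacks for you automatically."""
    return [primary, "xai/grok-3-mini:beta"]
```

In practice you'd pass the chosen ID into the `model` field of each request; the router's built-in fallbacks make the manual chain optional.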
Join Our Discord
Ready to explore Grok 3's capabilities, share your results, or get help with advanced integrations? Join our Requesty Discord community! Our team and a vibrant group of developers are there to:
- Answer questions about setting up or tuning your Grok 3 environment.
- Offer support for advanced multi-model strategies and usage.
- Showcase projects using Grok 3 with open-source tooling.
- Discuss the future of advanced reasoning LLMs.
We'd love to see what you build!
Conclusion
Grok 3 is ushering in a new era of advanced AI reasoning, coupling robust knowledge with deeply effective chain-of-thought. By tapping into it via the Requesty Router, you gain a single, streamlined interface for all your open-source coding tools, whether it's Cline, Roo Code, OpenWebUI, or something else entirely.
Take a few minutes to set up your integration, and you'll be on your way to powering your development workflows or research projects with next-level intelligence. If you have any questions, or just want to share your successes, be sure to join our Discord and connect with like-minded developers.
Get started today and let Grok 3 supercharge your open-source AI toolkit!