Laravel AI SDK + OpenRouter: Build Production-Ready AI Features in Laravel 13 (Complete Guide)

Quick Answer

The Laravel AI SDK shipped stable with Laravel 13 on March 17, 2026. It's a first-party, provider-agnostic package for building AI features inside any Laravel app.

To use it with OpenRouter, install the package, then configure OpenRouter as a custom base URL provider in config/ai.php. You get access to 100+ models through one API key — with agents, streaming, queues, and testable fakes all built in.

composer require laravel/ai — then point it at OpenRouter and you're building agents in minutes.

Before Laravel 13, adding AI to a Laravel app meant picking a provider, installing their SDK, writing a custom service class to wrap it, and hoping that abstraction held when you needed to swap models mid-sprint. Most teams ended up with a fat service that knew too much, a leaky wrapper that only worked for one provider, or raw Http::post() calls buried in controllers.

I went through this myself — I integrated OpenRouter into a production SaaS using the manual approach documented in my earlier guide. It worked. But it required building and maintaining your own abstraction layer.

Laravel 13 changes this completely. The Laravel AI SDK is now first-party, production-stable, and ships with the framework itself. This guide covers everything from setup to agents, streaming, provider failover with OpenRouter, and how to properly test AI features.

1. What the Laravel AI SDK actually is

Taylor Otwell described the motivation during the Laracon EU 2026 announcement: "It felt like we needed a first-party opinion on interacting with AI providers — just like we have opinions on sending email or queuing jobs."

That framing is exactly right. The SDK isn't a thin wrapper around one API. It's a full architectural layer — the same way Eloquent is a full layer on top of SQL, not just a query shortcut.

  • Text & Agents (generate & reason): tool-calling agents with instructions, memory, structured output, and multi-step reasoning
  • Images & Audio (create & transcribe): generate images from prompts, synthesize audio, and transcribe voice to text
  • Embeddings & Search (semantic search): generate embeddings and run vector similarity queries directly in Eloquent
  • Provider-agnostic (one config switch): switch between OpenAI, Anthropic, Gemini, and OpenRouter with zero application code changes

The SDK also handles retry logic, error normalization, and queue integration automatically. You don't write that yourself anymore.
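For context, this is roughly the boilerplate you no longer have to hand-roll. A minimal sketch in plain PHP of manual retry with exponential backoff; the `callWithRetries` helper and its backoff schedule are illustrative, not the SDK's actual internals:

```php
<?php

// Hypothetical sketch of manual retry logic with exponential backoff,
// the kind of wrapper code the SDK now handles for you.
function callWithRetries(callable $request, int $maxAttempts = 3, int $baseDelayMs = 200): mixed
{
    $attempt = 0;

    while (true) {
        $attempt++;

        try {
            return $request();
        } catch (RuntimeException $e) {
            if ($attempt >= $maxAttempts) {
                throw $e; // exhausted all attempts, surface the error
            }

            // Exponential backoff: 200ms, 400ms, 800ms, ...
            usleep($baseDelayMs * (2 ** ($attempt - 1)) * 1000);
        }
    }
}

// Demo: fails twice, succeeds on the third attempt.
$calls = 0;
$result = callWithRetries(function () use (&$calls) {
    $calls++;
    if ($calls < 3) {
        throw new RuntimeException('Transient API error');
    }
    return 'ok';
}, maxAttempts: 3, baseDelayMs: 1);

echo "{$result} after {$calls} attempts\n"; // ok after 3 attempts
```

Multiply that by every provider quirk and error shape, and the appeal of a first-party abstraction becomes obvious.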

2. Where OpenRouter fits in

OpenRouter is a unified API gateway for 100+ AI models. You send requests to one endpoint with one API key, and OpenRouter routes them to the underlying provider — GPT-4o, Claude, Gemini, Llama, Mistral, and many more.

The Laravel AI SDK supports OpenRouter natively via a custom base URL. This combination gives you the best of both worlds: Laravel's first-party agent system with OpenRouter's model flexibility and cost advantages.

In practice this means:

  • Use free OpenRouter models during local development — zero API cost
  • Use GPT-4o Mini in production for general tasks at a fraction of GPT-4o cost
  • Upgrade to Claude or GPT-4o for specific agents that need it — one config line
  • Automatic failover if a provider goes down — without touching application code
Tip: OpenRouter uses the same OpenAI-compatible API format, so the Laravel AI SDK integrates with it seamlessly via the url override in config/ai.php. No special driver required.

3. Installation

```bash
# Install the Laravel AI SDK
composer require laravel/ai

# Publish config and migration files
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"

# Run migrations (creates agent_conversations and agent_conversation_messages tables)
php artisan migrate
```
Requirements: The Laravel AI SDK requires Laravel 13 and PHP 8.3 or higher. If you're still on Laravel 12, see my previous guide for the manual OpenRouter approach, or upgrade first — it's a 10-minute process with zero breaking changes.

4. Configuring OpenRouter as your provider

Open config/ai.php and add OpenRouter as a named provider. The key insight is that OpenRouter is OpenAI-compatible — so you use the openai driver and override the base URL.

```php
// config/ai.php
'default' => env('AI_PROVIDER', 'openrouter'),

'providers' => [
    'openrouter' => [
        'driver' => 'openai',
        'key' => env('OPENROUTER_API_KEY'),
        'url' => 'https://openrouter.ai/api/v1',
    ],

    // Keep OpenAI as a fallback if needed
    'openai' => [
        'driver' => 'openai',
        'key' => env('OPENAI_API_KEY'),
    ],

    // Or use Anthropic directly
    'anthropic' => [
        'driver' => 'anthropic',
        'key' => env('ANTHROPIC_API_KEY'),
    ],
],
```
```ini
# .env
AI_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-v1-your-key-here

# Optional fallbacks
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
```
Getting an OpenRouter key: Sign up at openrouter.ai — it's free to start. You only pay for usage above the free tier. During local development, models like meta-llama/llama-3.1-8b-instruct:free are completely free.

5. Creating your first Agent

Agents are the core abstraction in the Laravel AI SDK. Each agent is a dedicated PHP class that holds the instructions, context, tools, and output schema for a specific AI task. Think of it as a specialized assistant you configure once and use throughout your app.

```bash
# Generate an agent class
php artisan make:agent SupportAgent
```
```php
<?php

namespace App\Ai\Agents;

use Laravel\Ai\Agent;
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Attributes\Temperature;

#[Provider('openrouter')]
#[Model('openai/gpt-4o-mini')]
#[Temperature(0.4)]
class SupportAgent extends Agent
{
    public function instructions(): string
    {
        return 'You are a helpful customer support assistant for an e-commerce platform. '
            . 'Answer questions about orders, shipping, and returns clearly and concisely. '
            . 'If you cannot help, politely direct the user to contact support. '
            . 'Always respond in the same language the user writes in.';
    }
}
```
```php
// In your controller
use App\Ai\Agents\SupportAgent;
use Illuminate\Http\JsonResponse;
use Illuminate\Http\Request;

public function chat(Request $request): JsonResponse
{
    $request->validate(['message' => 'required|string|max:1000']);

    $response = SupportAgent::make()
        ->prompt($request->message);

    return response()->json([
        'reply' => (string) $response,
    ]);
}
```

That's it. No HTTP client setup, no JSON parsing, no error handling boilerplate. The SDK manages all of that behind the scenes.

6. Structured output — getting JSON back reliably

One of the most common AI integration problems is parsing unstructured text responses. The SDK solves this with a JSON schema definition on the agent. The model is constrained to return exactly the structure you define.

```bash
# Generate an agent with structured output
php artisan make:agent ProductDescriptionAgent --structured
```
```php
<?php

namespace App\Ai\Agents;

use Laravel\Ai\Agent;
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;
use Illuminate\Contracts\JsonSchema\JsonSchema;

#[Provider('openrouter')]
#[Model('openai/gpt-4o-mini')]
class ProductDescriptionAgent extends Agent
{
    public function instructions(): string
    {
        return 'You are an e-commerce copywriter. Generate a product title, '
            . 'a short description (max 80 words), and 3 bullet point features.';
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'title' => $schema->string()->required(),
            'description' => $schema->string()->required(),
            'features' => $schema->array($schema->string())->required(),
        ];
    }
}
```
```php
// Usage — response is already parsed into a typed object
$result = ProductDescriptionAgent::make()
    ->prompt("Write a product description for: {$product->name}. Category: {$product->category}.");

$title = $result->title;             // string
$description = $result->description; // string
$features = $result->features;       // array of strings

$product->update([
    'ai_title' => $title,
    'ai_description' => $description,
    'ai_features' => $features,
]);
```
Tip: Structured output is the right approach for any AI feature where you need to store or display specific fields. It eliminates regex parsing and JSON decoding errors — the SDK guarantees the response matches your schema or throws a typed exception you can catch.
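For contrast, here is a minimal sketch of the manual validation you'd otherwise write around `json_decode`. The `parseProductJson` helper and its key list are illustrative, not part of any library:

```php
<?php

// Manual structured-output handling: decode, then verify every expected
// key exists. This is the boilerplate a schema-constrained agent removes.
function parseProductJson(string $raw): array
{
    $data = json_decode($raw, true);

    if (!is_array($data)) {
        throw new InvalidArgumentException('Model did not return valid JSON');
    }

    foreach (['title', 'description', 'features'] as $key) {
        if (!array_key_exists($key, $data)) {
            throw new InvalidArgumentException("Missing key: {$key}");
        }
    }

    return $data;
}

$raw = '{"title":"Desk Lamp","description":"Warm LED light.","features":["Dimmable","USB-C"]}';
$parsed = parseProductJson($raw);

echo $parsed['title'] . "\n"; // Desk Lamp
```

And even this sketch only catches missing keys, not wrong types or hallucinated extra prose around the JSON, which is why schema-constrained generation is the better default.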

7. Streaming responses to the frontend

Streaming makes AI features feel instant even when the model is still generating. Instead of waiting for the full response before showing anything, the frontend renders tokens as they arrive.

```php
<?php

namespace App\Http\Controllers;

use App\Ai\Agents\SupportAgent;
use Illuminate\Http\Request;

class ChatController extends Controller
{
    public function stream(Request $request)
    {
        $request->validate(['message' => 'required|string|max:1000']);

        return response()->stream(function () use ($request) {
            $stream = SupportAgent::make()
                ->stream($request->message);

            foreach ($stream as $chunk) {
                echo "data: " . json_encode(['text' => $chunk]) . "\n\n";
                ob_flush();
                flush();
            }

            echo "data: [DONE]\n\n";
            ob_flush();
            flush();
        }, 200, [
            'Content-Type' => 'text/event-stream',
            'Cache-Control' => 'no-cache',
            'X-Accel-Buffering' => 'no',
        ]);
    }
}
```
```javascript
// Frontend JavaScript — consuming the stream
const source = new EventSource('/chat/stream?message=' + encodeURIComponent(input));

let output = '';

source.onmessage = (event) => {
    if (event.data === '[DONE]') {
        source.close();
        return;
    }

    const chunk = JSON.parse(event.data);
    output += chunk.text;
    document.getElementById('response').textContent = output;
};
```

8. Queuing heavy AI jobs

Not every AI task needs to be synchronous. Generating product descriptions for a whole catalogue, summarizing uploaded documents, or processing batches overnight — these belong in the queue. The SDK integrates with Laravel's queue system natively.

```bash
php artisan make:job GenerateProductDescriptions
```
```php
<?php

namespace App\Jobs;

use App\Ai\Agents\ProductDescriptionAgent;
use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class GenerateProductDescriptions implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public int $tries = 3;
    public int $timeout = 120;

    public function __construct(
        public readonly int $productId
    ) {}

    public function handle(): void
    {
        $product = Product::findOrFail($this->productId);

        $result = ProductDescriptionAgent::make()
            ->prompt("Product: {$product->name}. Category: {$product->category}.");

        $product->update([
            'ai_title' => $result->title,
            'ai_description' => $result->description,
            'ai_features' => $result->features,
            'ai_generated_at' => now(),
        ]);
    }

    public function failed(\Throwable $exception): void
    {
        \Log::error("AI generation failed for product {$this->productId}: {$exception->getMessage()}");
    }
}
```
```php
// Dispatch for a single product
GenerateProductDescriptions::dispatch($product->id);

// Dispatch for all products missing AI content
Product::whereNull('ai_generated_at')
    ->each(fn ($product) => GenerateProductDescriptions::dispatch($product->id)->onQueue('ai'));
```
Important: AI API calls can be slow — often 5–30 seconds for complex prompts. Always set a $timeout on jobs that call AI agents. The default Laravel job timeout is 60 seconds, which is often too short for structured output or multi-step agents. Set it to 120 or higher depending on your use case.

9. Provider failover with OpenRouter

This is where the combination of the Laravel AI SDK and OpenRouter becomes particularly powerful for production. You can configure multiple providers and the SDK will automatically fall back if one fails.

```php
// config/ai.php — configure failover providers
'providers' => [
    'openrouter' => [
        'driver' => 'openai',
        'key' => env('OPENROUTER_API_KEY'),
        'url' => 'https://openrouter.ai/api/v1',
    ],

    'anthropic' => [
        'driver' => 'anthropic',
        'key' => env('ANTHROPIC_API_KEY'),
    ],
],
```
```php
// In your agent — list providers in priority order
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;

#[Provider(['openrouter', 'anthropic'])] // tries openrouter first, anthropic if it fails
#[Model('openai/gpt-4o-mini')]
class SupportAgent extends Agent
{
    // ...
}
```

If OpenRouter is unavailable or rate-limited, the SDK automatically retries with Anthropic — no error surfaced to the user, no code change required. For production SaaS where uptime matters, this is a meaningful safety net.
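Conceptually, failover is just "try each provider in priority order and return the first success." A self-contained sketch of that idea in plain PHP; the `firstSuccessful` helper and the fake providers are illustrative, not the SDK's internals:

```php
<?php

// Hypothetical sketch of provider failover: walk the providers in
// priority order and return the first successful response.
function firstSuccessful(array $providers, string $prompt): string
{
    $lastError = null;

    foreach ($providers as $name => $provider) {
        try {
            return $provider($prompt);
        } catch (RuntimeException $e) {
            $lastError = $e; // remember the failure, try the next provider
        }
    }

    // Every provider failed: surface the last error.
    throw $lastError ?? new RuntimeException('No providers configured');
}

// Demo with fake providers: the first one is "down", the second answers.
$providers = [
    'openrouter' => fn (string $p) => throw new RuntimeException('503 Service Unavailable'),
    'anthropic'  => fn (string $p) => "Answer to: {$p}",
];

echo firstSuccessful($providers, 'Where is my order?') . "\n";
// Answer to: Where is my order?
```

The SDK performs this loop for you whenever the `#[Provider]` attribute receives an array, so the fallback path never leaks into your controllers.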

10. Provider and model comparison

| Use case | Recommended model (via OpenRouter) | Why |
| --- | --- | --- |
| Local development | meta-llama/llama-3.1-8b-instruct:free | Free tier, good enough for dev, zero cost |
| General production tasks | openai/gpt-4o-mini | Fast, cheap, handles most use cases well |
| Complex reasoning / agents | openai/gpt-4o | Best tool use and structured output reliability |
| Long documents / analysis | anthropic/claude-3-5-sonnet | 200K context window, excellent instruction following |
| Embeddings | openai/text-embedding-3-small | Cheapest per token, works well for semantic search |

11. Testing AI features properly

This is the part most tutorials skip, but it's essential for production code. The Laravel AI SDK ships with built-in fakes so you can write real tests without hitting any API.

```php
<?php

namespace Tests\Feature;

use App\Ai\Agents\ProductDescriptionAgent;
use App\Ai\Agents\SupportAgent;
use App\Jobs\GenerateProductDescriptions;
use App\Models\Product;
use Laravel\Ai\Testing\AgentFake;
use Tests\TestCase;

class ChatControllerTest extends TestCase
{
    public function test_chat_returns_agent_response(): void
    {
        // Intercept all agent calls — no real API requests
        AgentFake::make(SupportAgent::class)
            ->withResponse('I can help you track your order.');

        $response = $this->postJson('/chat', [
            'message' => 'Where is my order?',
        ]);

        $response->assertOk()
            ->assertJsonPath('reply', 'I can help you track your order.');

        // Assert the agent was actually called with the right prompt
        AgentFake::assertPrompted(SupportAgent::class, 'Where is my order?');
    }

    public function test_structured_output_is_saved_to_product(): void
    {
        $product = Product::factory()->create();

        AgentFake::make(ProductDescriptionAgent::class)
            ->withStructuredResponse([
                'title' => 'Premium Wireless Headphones',
                'description' => 'Crystal clear audio with 40-hour battery life.',
                'features' => ['40hr battery', 'Noise cancelling', 'Foldable design'],
            ]);

        GenerateProductDescriptions::dispatchSync($product->id);

        $this->assertDatabaseHas('products', [
            'id' => $product->id,
            'ai_title' => 'Premium Wireless Headphones',
        ]);
    }
}
```
Why this matters: Without fakes, your CI pipeline makes real API calls on every test run — adding cost, latency, and flakiness. The fake intercepts everything at the agent layer, so your tests run fast and reliably with no API keys required in the CI environment.

12. Semantic search with embeddings (bonus)

Laravel 13 adds native vector query support to the query builder. Combined with the AI SDK's embedding generation, you can build semantic search without any external vector database — just PostgreSQL with the pgvector extension.

```php
// Generate and store an embedding when a product is created
use Laravel\Ai\Embeddings;

$embedding = Embeddings::of($product->name . ' ' . $product->description)
    ->generate()
    ->vector();

$product->update(['embedding' => $embedding]);
```
```php
// Search semantically — finds conceptually related products, not just keyword matches
$results = DB::table('products')
    ->whereVectorSimilarTo('embedding', $request->input('query'))
    ->limit(10)
    ->get();
```
Note: Vector search requires PostgreSQL with the pgvector extension enabled. It does not work with MySQL. If you're on MySQL, you'll need a separate vector store like Pinecone or Qdrant for this feature specifically — but everything else in this guide works with any database.
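Under the hood, vector similarity is just a distance metric over embedding arrays. A self-contained sketch of cosine similarity in plain PHP, useful if you ever need to compare a handful of embeddings in memory rather than in the database (the function name and toy vectors are illustrative):

```php
<?php

// Cosine similarity between two embedding vectors: 1.0 means identical
// direction, 0.0 means orthogonal (unrelated). pgvector computes the
// same kind of metric, just inside the database and at scale.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

// Toy 3-dimensional "embeddings"; real ones have hundreds of dimensions.
$headphones = [0.9, 0.1, 0.0];
$earbuds    = [0.8, 0.2, 0.1];
$lawnmower  = [0.0, 0.1, 0.9];

printf("headphones vs earbuds:   %.3f\n", cosineSimilarity($headphones, $earbuds));
printf("headphones vs lawnmower: %.3f\n", cosineSimilarity($headphones, $lawnmower));
// The headphones/earbuds score is far higher than headphones/lawnmower:
// semantically related items point in similar directions.
```

This is also why semantic search finds "earbuds" for a "headphones" query even with zero keyword overlap: proximity in embedding space stands in for proximity in meaning.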

13. The old way vs the new way — a quick comparison

If you read my previous OpenRouter integration guide, here's how the two approaches compare now:

| | Manual approach (Laravel 12) | AI SDK (Laravel 13) |
| --- | --- | --- |
| Setup | Install 3rd-party package, write service class | composer require laravel/ai |
| Provider switch | Rewrite service class | Change one config value, zero code change |
| Structured output | Parse JSON manually, handle failures | Schema definition, typed response object |
| Testing | Mock HTTP client manually | AgentFake::make() built in |
| Retry logic | Write it yourself | Handled automatically |
| Failover | Not available | Multi-provider array on the agent |

14. Conclusion

The Laravel AI SDK removes all the reasons to delay adding AI features to your application. There's no abstraction layer to write, no provider-specific SDK gymnastics, and no gap between "it works locally" and "it's testable in CI." It's the same pattern as email or queues — configure the driver, use the facade, write tests with the fake.

OpenRouter on top of it gives you model flexibility and cost control without any additional complexity. Switch models by changing a string. Access free models in development. Fall back automatically when a provider has issues.

If you're on Laravel 12, the manual approach still works — but upgrading to Laravel 13 is worth it for this alone. The upgrade is zero breaking changes, and this guide is waiting for you on the other side.

Need Help Adding AI to Your Laravel App?

I've integrated OpenRouter and LLM features into production SaaS platforms — from chatbots and content generators to automated e-commerce workflows. Happy to help you design the right AI architecture for your product.

Based in Bangladesh · Remote worldwide · Fast turnaround

About the Author

Kamruzzaman Polash — Software Engineer specialising in Laravel, REST APIs, and scalable backend systems. 10+ projects delivered for clients worldwide.