Before Laravel 13, adding AI to a Laravel app meant picking a provider, installing their SDK, writing a custom service class to wrap it, and hoping that abstraction held when you needed to swap models mid-sprint. Most teams ended up with a fat service that knew too much, a leaky wrapper that only worked for one provider, or raw Http::post() calls buried in controllers.
I went through this myself — I integrated OpenRouter into a production SaaS using the manual approach documented in my earlier guide. It worked. But it required building and maintaining your own abstraction layer.
Laravel 13 changes this completely. The Laravel AI SDK is now first-party, production-stable, and ships with the framework itself. This guide covers everything from setup to agents, streaming, provider failover with OpenRouter, and how to properly test AI features.
1. What the Laravel AI SDK actually is
Taylor Otwell described the motivation during the Laracon EU 2026 announcement: "It felt like we needed a first-party opinion on interacting with AI providers — just like we have opinions on sending email or queuing jobs."
That framing is exactly right. The SDK isn't a thin wrapper around one API. It's a full architectural layer — the same way Eloquent is a full layer on top of SQL, not just a query shortcut.
- Text & Agents (generate & reason): tool-calling agents with instructions, memory, structured output, and multi-step reasoning
- Images & Audio (create & transcribe): generate images from prompts, synthesize audio, and transcribe voice to text
- Embeddings & Search (semantic search): generate embeddings and run vector similarity queries directly in Eloquent
- Provider-agnostic (one config switch): switch between OpenAI, Anthropic, Gemini, and OpenRouter with zero application code changes
The SDK also handles retry logic, error normalization, and queue integration automatically. You don't write that yourself anymore.
2. Where OpenRouter fits in
OpenRouter is a unified API gateway for 100+ AI models. You send requests to one endpoint with one API key, and OpenRouter routes them to the underlying provider — GPT-4o, Claude, Gemini, Llama, Mistral, and many more.
The Laravel AI SDK supports OpenRouter natively via a custom base URL. This combination gives you the best of both worlds: Laravel's first-party agent system with OpenRouter's model flexibility and cost advantages.
In practice this means:
- Use free OpenRouter models during local development — zero API cost
- Use GPT-4o Mini in production for general tasks at a fraction of GPT-4o's cost
- Upgrade to Claude or GPT-4o for specific agents that need it — one config line
- Automatic failover if a provider goes down — without touching application code
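In practice, the per-environment model choice can live in .env, mirroring the environment block shown later in this guide. Note that AI_MODEL is an illustrative variable name, not a documented SDK key; you'd read it from config wherever your agents resolve their model:

```
# .env in local development: a free OpenRouter model, zero API cost
AI_MODEL=meta-llama/llama-3.1-8b-instruct:free

# .env in production: cheap general-purpose default
AI_MODEL=openai/gpt-4o-mini
```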
Tip: OpenRouter uses the same OpenAI-compatible API format, so the Laravel AI SDK integrates with it seamlessly via the url override in config/ai.php. No special driver required.
3. Installation
composer require laravel/ai
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"
php artisan migrate
Requirements: The Laravel AI SDK requires Laravel 13 and PHP 8.3 or higher. If you're still on Laravel 12, see my previous guide for the manual OpenRouter approach, or upgrade first — it's a 10-minute process with zero breaking changes.
4. Configuring OpenRouter as your provider
Open config/ai.php and add OpenRouter as a named provider. The key insight is that OpenRouter is OpenAI-compatible — so you use the openai driver and override the base URL.
'default' => env('AI_PROVIDER', 'openrouter'),
'providers' => [
'openrouter' => [
'driver' => 'openai',
'key' => env('OPENROUTER_API_KEY'),
'url' => 'https://openrouter.ai/api/v1',
],
'openai' => [
'driver' => 'openai',
'key' => env('OPENAI_API_KEY'),
],
'anthropic' => [
'driver' => 'anthropic',
'key' => env('ANTHROPIC_API_KEY'),
],
],
AI_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-v1-your-key-here
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
Getting an OpenRouter key: Sign up at openrouter.ai — it's free to start. You only pay for usage above the free tier. During local development, models like meta-llama/llama-3.1-8b-instruct:free are completely free.
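Before wiring up the SDK, it's worth sanity-checking the key with a direct request to OpenRouter's OpenAI-compatible endpoint. This sketch uses Laravel's plain Http client (run it in php artisan tinker); it bypasses the AI SDK entirely and only confirms the key and model are valid:

```php
use Illuminate\Support\Facades\Http;

// Direct call to OpenRouter's OpenAI-compatible chat endpoint.
$response = Http::withToken(env('OPENROUTER_API_KEY'))
    ->post('https://openrouter.ai/api/v1/chat/completions', [
        'model' => 'meta-llama/llama-3.1-8b-instruct:free',
        'messages' => [
            ['role' => 'user', 'content' => 'Reply with the single word: pong'],
        ],
    ]);

// A valid key returns the model's reply; a bad key returns a 401.
dump($response->status(), $response->json('choices.0.message.content'));
```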
5. Creating your first Agent
Agents are the core abstraction in the Laravel AI SDK. Each agent is a dedicated PHP class that holds the instructions, context, tools, and output schema for a specific AI task. Think of it as a specialized assistant you configure once and use throughout your app.
php artisan make:agent SupportAgent
<?php
namespace App\Ai\Agents;
use Laravel\Ai\Agent;
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Attributes\Temperature;
#[Provider('openrouter')]
#[Model('openai/gpt-4o-mini')]
#[Temperature(0.4)]
class SupportAgent extends Agent
{
public function instructions(): string
{
return 'You are a helpful customer support assistant for an e-commerce platform. '
. 'Answer questions about orders, shipping, and returns clearly and concisely. '
. 'If you cannot help, politely direct the user to contact support. '
. 'Always respond in the same language the user writes in.';
}
}
use App\Ai\Agents\SupportAgent;
use Illuminate\Http\JsonResponse;
use Illuminate\Http\Request;
public function chat(Request $request): JsonResponse
{
$request->validate(['message' => 'required|string|max:1000']);
$response = SupportAgent::make()
->prompt($request->message);
return response()->json([
'reply' => (string) $response,
]);
}
That's it. No HTTP client setup, no JSON parsing, no error handling boilerplate. The SDK manages all of that behind the scenes.
6. Structured output — getting JSON back reliably
One of the most common AI integration problems is parsing unstructured text responses. The SDK solves this with a JSON schema definition on the agent. The model is constrained to return exactly the structure you define.
php artisan make:agent ProductDescriptionAgent --structured
<?php
namespace App\Ai\Agents;
use Laravel\Ai\Agent;
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;
use Illuminate\Contracts\JsonSchema\JsonSchema;
#[Provider('openrouter')]
#[Model('openai/gpt-4o-mini')]
class ProductDescriptionAgent extends Agent
{
public function instructions(): string
{
return 'You are an e-commerce copywriter. Generate a product title,
a short description (max 80 words), and 3 bullet point features.';
}
public function schema(JsonSchema $schema): array
{
return [
'title' => $schema->string()->required(),
'description' => $schema->string()->required(),
'features' => $schema->array(
$schema->string()
)->required(),
];
}
}
$result = ProductDescriptionAgent::make()
->prompt("Write a product description for: {$product->name}. Category: {$product->category}.");
$title = $result->title;
$description = $result->description;
$features = $result->features;
$product->update([
'ai_title' => $title,
'ai_description' => $description,
'ai_features' => $features,
]);
Tip: Structured output is the right approach for any AI feature where you need to store or display specific fields. It eliminates regex parsing and JSON decoding errors — the SDK guarantees the response matches your schema or throws a typed exception you can catch.
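If you want to handle that failure path explicitly, the catch looks something like the sketch below. The exception class name here is an assumption for illustration; check the SDK's exceptions namespace for the actual type it throws on a schema mismatch:

```php
use App\Ai\Agents\ProductDescriptionAgent;
use Laravel\Ai\Exceptions\SchemaValidationException; // assumed class name

try {
    $result = ProductDescriptionAgent::make()
        ->prompt("Write a product description for: {$product->name}.");
} catch (SchemaValidationException $e) {
    // The model returned output that doesn't match the schema.
    // Report it and fall back to a manual workflow.
    report($e);
    return;
}
```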
7. Streaming responses to the frontend
Streaming makes AI features feel instant even when the model is still generating. Instead of waiting for the full response before showing anything, the frontend renders tokens as they arrive.
<?php
namespace App\Http\Controllers;
use App\Ai\Agents\SupportAgent;
use Illuminate\Http\Request;
class ChatController extends Controller
{
public function stream(Request $request)
{
$request->validate(['message' => 'required|string|max:1000']);
return response()->stream(function () use ($request) {
$stream = SupportAgent::make()
->stream($request->message);
foreach ($stream as $chunk) {
echo "data: " . json_encode(['text' => $chunk]) . "\n\n";
// ob_flush() raises a notice when no output buffer is active.
if (ob_get_level() > 0) {
ob_flush();
}
flush();
}
echo "data: [DONE]\n\n";
if (ob_get_level() > 0) {
ob_flush();
}
flush();
}, 200, [
'Content-Type' => 'text/event-stream',
'Cache-Control' => 'no-cache',
'X-Accel-Buffering' => 'no',
]);
}
}
const source = new EventSource('/chat/stream?message=' + encodeURIComponent(input));
let output = '';
source.onmessage = (event) => {
if (event.data === '[DONE]') {
source.close();
return;
}
const chunk = JSON.parse(event.data);
output += chunk.text;
document.getElementById('response').textContent = output;
};
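One detail that's easy to miss: EventSource can only issue GET requests, which is why the JavaScript above passes the message as a query parameter. The streaming endpoint therefore has to be registered as a GET route (the paths here match the earlier snippets):

```php
use App\Http\Controllers\ChatController;
use Illuminate\Support\Facades\Route;

// routes/web.php
Route::post('/chat', [ChatController::class, 'chat']);         // JSON endpoint
Route::get('/chat/stream', [ChatController::class, 'stream']); // SSE endpoint (GET only)
```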
8. Queuing heavy AI jobs
Not every AI task needs to be synchronous. Generating product descriptions for a whole catalogue, summarizing uploaded documents, or processing batches overnight — these belong in the queue. The SDK integrates with Laravel's queue system natively.
php artisan make:job GenerateProductDescriptions
<?php
namespace App\Jobs;
use App\Ai\Agents\ProductDescriptionAgent;
use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
class GenerateProductDescriptions implements ShouldQueue
{
use Dispatchable, InteractsWithQueue, Queueable;
public int $tries = 3;
public int $timeout = 120;
public function __construct(
public readonly int $productId
) {}
public function handle(): void
{
$product = Product::findOrFail($this->productId);
$result = ProductDescriptionAgent::make()
->prompt("Product: {$product->name}. Category: {$product->category}.");
$product->update([
'ai_title' => $result->title,
'ai_description' => $result->description,
'ai_features' => $result->features,
'ai_generated_at' => now(),
]);
}
public function failed(\Throwable $exception): void
{
\Log::error("AI generation failed for product {$this->productId}: {$exception->getMessage()}");
}
}
GenerateProductDescriptions::dispatch($product->id);
Product::whereNull('ai_generated_at')
->each(fn ($product) => GenerateProductDescriptions::dispatch($product->id)->onQueue('ai'));
Important: AI API calls can be slow — often 5–30 seconds for complex prompts. Always set a $timeout on jobs that call AI agents. The default Laravel job timeout is 60 seconds, which is often too short for structured output or multi-step agents. Set it to 120 or higher depending on your use case.
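Alongside $tries and $timeout, Laravel's standard backoff() hook (a core queue feature, not SDK-specific) spaces retries out so a rate-limited provider isn't hit again immediately:

```php
// Inside the job class: wait longer between successive retries.
public function backoff(): array
{
    return [15, 60]; // seconds before the 2nd and 3rd attempts ($tries = 3)
}
```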
9. Provider failover with OpenRouter
This is where the combination of the Laravel AI SDK and OpenRouter becomes particularly powerful for production. You can configure multiple providers and the SDK will automatically fall back if one fails.
'providers' => [
'openrouter' => [
'driver' => 'openai',
'key' => env('OPENROUTER_API_KEY'),
'url' => 'https://openrouter.ai/api/v1',
],
'anthropic' => [
'driver' => 'anthropic',
'key' => env('ANTHROPIC_API_KEY'),
],
],
use Laravel\Ai\Attributes\Provider;
#[Provider(['openrouter', 'anthropic'])]
#[Model('openai/gpt-4o-mini')]
class SupportAgent extends Agent
{
    // ... instructions() as before
}
If OpenRouter is unavailable or rate-limited, the SDK automatically retries with Anthropic — no error surfaced to the user, no code change required. For production SaaS where uptime matters, this is a meaningful safety net.
10. Provider and model comparison
| Use case | Recommended model (via OpenRouter) | Why |
| --- | --- | --- |
| Local development | meta-llama/llama-3.1-8b-instruct:free | Free tier, good enough for dev; zero cost |
| General production tasks | openai/gpt-4o-mini | Fast, cheap, handles most use cases well |
| Complex reasoning / agents | openai/gpt-4o | Best tool use and structured output reliability |
| Long documents / analysis | anthropic/claude-3-5-sonnet | 200K context window, excellent instruction following |
| Embeddings | openai/text-embedding-3-small | Cheapest per token, works well for semantic search |
11. Testing AI features properly
This is the part most tutorials skip, but it's essential for production code. The Laravel AI SDK ships with built-in fakes so you can write real tests without hitting any API.
<?php
namespace Tests\Feature;
use App\Ai\Agents\ProductDescriptionAgent;
use App\Ai\Agents\SupportAgent;
use App\Jobs\GenerateProductDescriptions;
use App\Models\Product;
use Laravel\Ai\Testing\AgentFake;
use Tests\TestCase;
class ChatControllerTest extends TestCase
{
public function test_chat_returns_agent_response(): void
{
AgentFake::make(SupportAgent::class)
->withResponse('I can help you track your order.');
$response = $this->postJson('/chat', [
'message' => 'Where is my order?',
]);
$response->assertOk()
->assertJsonPath('reply', 'I can help you track your order.');
AgentFake::assertPrompted(SupportAgent::class, 'Where is my order?');
}
public function test_structured_output_is_saved_to_product(): void
{
$product = Product::factory()->create();
AgentFake::make(ProductDescriptionAgent::class)
->withStructuredResponse([
'title' => 'Premium Wireless Headphones',
'description' => 'Crystal clear audio with 40-hour battery life.',
'features' => ['40hr battery', 'Noise cancelling', 'Foldable design'],
]);
GenerateProductDescriptions::dispatchSync($product->id);
$this->assertDatabaseHas('products', [
'id' => $product->id,
'ai_title' => 'Premium Wireless Headphones',
]);
}
}
Why this matters: Without fakes, your CI pipeline makes real API calls on every test run — adding cost, latency, and flakiness. The fake intercepts everything at the agent layer, so your tests run fast and reliably with no API keys required in the CI environment.
12. Semantic search with embeddings (bonus)
Laravel 13 adds native vector query support to the query builder. Combined with the AI SDK's embedding generation, you can build semantic search without any external vector database — just PostgreSQL with the pgvector extension.
use Illuminate\Support\Facades\DB;
use Laravel\Ai\Embeddings;

$embedding = Embeddings::of($product->name . ' ' . $product->description)
    ->generate()
    ->vector();

$product->update(['embedding' => $embedding]);

// Embed the search query the same way, then compare vectors.
$queryVector = Embeddings::of($request->input('query'))
    ->generate()
    ->vector();

$results = DB::table('products')
    ->whereVectorSimilarTo('embedding', $queryVector)
    ->limit(10)
    ->get();
Note: Vector search requires PostgreSQL with the pgvector extension enabled. It does not work with MySQL. If you're on MySQL, you'll need a separate vector store like Pinecone or Qdrant for this feature specifically — but everything else in this guide works with any database.
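For completeness, a migration for the pgvector setup might look like the sketch below. The vector() column type is available in recent Laravel releases for PostgreSQL; the dimension count is an assumption that must match your embedding model (1536 for openai/text-embedding-3-small):

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        // Enable the pgvector extension (PostgreSQL only).
        DB::statement('CREATE EXTENSION IF NOT EXISTS vector');

        Schema::table('products', function (Blueprint $table) {
            // Dimensions must match the embedding model's output size.
            $table->vector('embedding', dimensions: 1536)->nullable();
        });
    }
};
```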
13. The old way vs the new way — a quick comparison
If you read my previous OpenRouter integration guide, here's how the two approaches compare now:
| | Manual approach (Laravel 12) | AI SDK (Laravel 13) |
| --- | --- | --- |
| Setup | Install a 3rd-party package, write a service class | composer require laravel/ai |
| Provider switch | Rewrite the service class | Change one config value; zero code change |
| Structured output | Parse JSON manually, handle failures | Schema definition, typed response object |
| Testing | Mock the HTTP client manually | AgentFake::make() built in |
| Retry logic | Write it yourself | Handled automatically |
| Failover | Not available | Multi-provider array on the agent |
14. Conclusion
The Laravel AI SDK removes all the reasons to delay adding AI features to your application. There's no abstraction layer to write, no provider-specific SDK gymnastics, and no gap between "it works locally" and "it's testable in CI." It's the same pattern as email or queues — configure the driver, use the facade, write tests with the fake.
OpenRouter on top of it gives you model flexibility and cost control without any additional complexity. Switch models by changing a string. Access free models in development. Fall back automatically when a provider has issues.
If you're on Laravel 12, the manual approach still works — but upgrading to Laravel 13 is worth it for this alone. The upgrade is zero breaking changes, and this guide is waiting for you on the other side.