We’ve talked before about the fundamentals of resilience, mastering streams, and handling retries. Those are the essential survival skills for any developer integrating with third-party APIs. But survival isn’t the end goal. The goal is to thrive.
As applications scale, we move from simple request-response problems to complex architectural challenges.
- How do you stop a slow API from crippling your user’s experience?
- How do you build a single, scalable service that must talk to hundreds of different API endpoints with different credentials?
- How do you create a perfect, non-intrusive audit log of every single byte that leaves or enters your application?
- How do you upload a 5GB file to a backup service without your script crashing from memory exhaustion?
These aren’t “http-client” problems; they’re application design problems. And symfony/http-client, when combined with the power of the full Symfony ecosystem, provides elegant, robust solutions.
Today, we’re leaving the basics behind. We’re going to architect four production-ready, non-trivial patterns using Symfony 7.x and PHP 8.x. These patterns solve real-world enterprise challenges, and I guarantee they’ll give you a new appreciation for the tools you have.
Let’s get to work.
Our Toolkit
We’ll start with a standard Symfony application. The packages we use will be specific to each pattern.
# Our core component
composer require symfony/http-client
All code will use attributes, constructor property promotion, and strict typing as per modern PHP and Symfony standards.
The “Fire and Forget” — Decoupling with Messenger
Your user signs up. You need to send their data to a third-party CRM, a newsletter service, and a new-user-welcome-email API. The email API is fast, the newsletter is slow (1–2 seconds), and the CRM is… unreliable.
If you make these three API calls sequentially in your controller, the user will be staring at a loading spinner for 3–5 seconds. This is an unacceptable user experience. The user’s registration succeeded; they shouldn’t be punished for our slow, non-critical background tasks.
The Solution: Decouple the work. We’ll use symfony/messenger to dispatch a “fire and forget” message. The controller’s job is just to request the work. A separate worker process will handle the actual HTTP calls in the background.
composer require symfony/messenger symfony/doctrine-messenger
(We’re using the Doctrine transport for simplicity. In production, you’d use RabbitMQ, SQS, Redis, etc.)
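Switching later is mostly a matter of changing the transport DSN in your environment. A few illustrative values (hosts, credentials, and queue names below are placeholders, not real endpoints):
# .env.local
# Doctrine (what we use here): stores messages in a database table
MESSENGER_TRANSPORT_DSN=doctrine://default
# RabbitMQ (requires symfony/amqp-messenger and the amqp extension)
# MESSENGER_TRANSPORT_DSN=amqp://guest:guest@localhost:5672/%2f/messages
# Redis (requires symfony/redis-messenger)
# MESSENGER_TRANSPORT_DSN=redis://localhost:6379/messages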
The Message (The Data)
First, we create a simple DTO (Data Transfer Object) to represent the work to be done.
// src/Message/AddNewUserToCrm.php
namespace App\Message;
final readonly class AddNewUserToCrm
{
public function __construct(
public int $userId,
public string $email,
) {
}
}
The Handler (The Worker)
This is where HttpClientInterface lives. This service will be triggered by the message bus, not by a controller.
// src/MessageHandler/AddNewUserToCrmHandler.php
namespace App\MessageHandler;
use App\Message\AddNewUserToCrm;
use Psr\Log\LoggerInterface;
use Symfony\Component\Messenger\Attribute\AsMessageHandler;
use Symfony\Contracts\HttpClient\HttpClientInterface;
#[AsMessageHandler]
final readonly class AddNewUserToCrmHandler
{
public function __construct(
// We configure a specific client for our CRM API
private HttpClientInterface $crmApiClient,
private LoggerInterface $logger,
) {
}
public function __invoke(AddNewUserToCrm $message): void
{
$this->logger->info(
'Processing new user for CRM',
['user' => $message->userId]
);
try {
$response = $this->crmApiClient->request('POST', '/api/v2/contacts', [
'json' => [
'email' => $message->email,
'user_id' => $message->userId,
'source' => 'app_registration',
],
]);
// We only care whether it succeeded or failed.
// getStatusCode() never throws, so check it explicitly: a 4xx/5xx
// from the CRM must fail the handler so Messenger can retry it.
$statusCode = $response->getStatusCode();
$this->logger->info('CRM API response', ['status' => $statusCode]);
if ($statusCode >= 400) {
throw new \RuntimeException(sprintf('CRM API returned HTTP %d', $statusCode));
}
} catch (\Throwable $e) {
$this->logger->error(
'Failed to send user to CRM',
['error' => $e->getMessage(), 'user' => $message->userId]
);
// The messenger component will handle retries based on your config
throw $e;
}
}
}
The Controller (The Dispatcher)
The controller becomes blissfully simple. Its only job is to create the user and dispatch the message. It does not wait for the HTTP call.
// src/Controller/RegistrationController.php
namespace App\Controller;
use App\Entity\User;
use App\Message\AddNewUserToCrm;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Messenger\MessageBusInterface;
use Symfony\Component\Routing\Attribute\Route;
class RegistrationController extends AbstractController
{
#[Route('/register', name: 'api_register', methods: ['POST'])]
public function register(
Request $request,
EntityManagerInterface $em,
MessageBusInterface $bus
): JsonResponse {
// … create and save the $user …
$user = new User();
$user->setEmail($request->getPayload()->get('email'));
// … (set password, etc)
$em->persist($user);
$em->flush();
// This is the "Fire and Forget" part.
// This call is synchronous, but it's just adding a row
// to the 'messenger_messages' table. It's lightning fast.
$bus->dispatch(new AddNewUserToCrm(
userId: $user->getId(),
email: $user->getEmail()
));
// Return a response to the user *immediately*.
return $this->json(
['status' => 'User created!'],
JsonResponse::HTTP_CREATED
);
}
}
Configuration
We need to tell Messenger to handle our message asynchronously, and define the scoped client our handler expects.
# config/packages/messenger.yaml
framework:
    messenger:
        # We use the Doctrine transport (creates a 'messenger_messages' table)
        transports:
            async: '%env(MESSENGER_TRANSPORT_DSN)%'
        routing:
            # Route our message to the 'async' transport
            'App\Message\AddNewUserToCrm': async
# config/packages/framework.yaml
framework:
    http_client:
        scoped_clients:
            # Autowired into any `HttpClientInterface $crmApiClient` argument,
            # such as our handler's constructor
            crm.api.client:
                base_uri: 'https://api.my-crm.com'
                headers:
                    'Authorization': 'Bearer %env(CRM_API_KEY)%'
(Run php bin/console doctrine:schema:update --force to create the messenger_messages table.)
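Because the handler re-throws on failure, Messenger's retry strategy decides what happens next. Here is a sketch, expanding the shorthand transport definition above (the numbers are illustrative; tune them to how unreliable your CRM really is):
# config/packages/messenger.yaml
framework:
    messenger:
        transports:
            async:
                dsn: '%env(MESSENGER_TRANSPORT_DSN)%'
                retry_strategy:
                    max_retries: 3
                    delay: 2000      # ms before the first retry
                    multiplier: 2    # then 4s, 8s, ...
                    max_delay: 60000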
Verification
- Open two terminals.
- In Terminal 1, run the worker: php bin/console messenger:consume async -vv
- In Terminal 2, send the request: curl -X POST http://127.0.0.1:8000/register -H 'Content-Type: application/json' -d '{"email":"test@example.com"}'
- Observe:
- Terminal 2 (your curl command) will get an instant {"status":"User created!"} response.
- Terminal 1 (your worker) will then spring to life, showing the logs: "Processing new user for CRM..." and "CRM API response".
You have successfully decoupled your application logic from a slow third-party API.
The “Multi-Tenant” Factory — Dynamic Client Configuration
You’re building a SaaS platform that integrates with a service like Shopify, BigCommerce, or a custom-domain API. Each of your tenants (customers) has a different base_uri (e.g., my-shop-1.shopify.com, my-shop-2.shopify.com) and different API credentials.
You cannot define 5,000 clients in http_client.yaml. You need a way to create scoped clients on the fly.
The Solution: Create a Client Factory service. This service uses the withOptions() method, which is the real power of symfony/http-client. This method returns a new, immutable, scoped client instance without modifying the original.
The Factory Service
This service is the heart of the pattern. It’s shockingly simple.
// src/Service/TenantApiClientFactory.php
namespace App\Service;
use App\Entity\Tenant; // Your tenant entity
use Symfony\Contracts\HttpClient\HttpClientInterface;
final readonly class TenantApiClientFactory
{
public function __construct(
// Inject the default client. This is just a template.
private HttpClientInterface $defaultClient,
) {
}
/**
* Creates a new, immutable client scoped to a specific tenant.
*/
public function createClientForTenant(Tenant $tenant): HttpClientInterface
{
// withOptions() is the magic. It creates a *new* client
// with these options merged on top of the default ones.
return $this->defaultClient->withOptions([
'base_uri' => $tenant->getApiBaseUri(), // e.g., 'https://my-shop.shopify.com'
'headers' => [
// e.g., 'X-Shopify-Access-Token'
$tenant->getApiAuthHeaderName() => $tenant->getApiAuthToken(),
'Accept' => 'application/json',
],
// You can also set tenant-specific timeouts, etc.
'timeout' => 10,
]);
}
}
The Consumer Service
Now, any service that needs to do tenant-specific work (like a ProductSyncer) doesn’t inject a client. It injects the factory.
// src/Service/ProductSyncer.php
namespace App\Service;
use App\Entity\Tenant;
use App\Repository\TenantRepository;
use Psr\Log\LoggerInterface;
final readonly class ProductSyncer
{
public function __construct(
private TenantApiClientFactory $clientFactory,
private TenantRepository $tenantRepository,
private LoggerInterface $logger,
) {
}
/**
* Syncs products for *all* active tenants.
*/
public function syncAllTenants(): void
{
$tenants = $this->tenantRepository->findActiveTenants();
foreach ($tenants as $tenant) {
$this->logger->info('Syncing tenant', ['id' => $tenant->getId()]);
// 1. Create a client just for this tenant
$client = $this->clientFactory->createClientForTenant($tenant);
try {
// 2. Make the call. The base_uri and auth are
// automatically handled by our scoped client.
$response = $client->request('GET', '/admin/api/2024-04/products.json');
$products = $response->toArray();
// ... (do work with the products) ...
$this->logger->info('Sync complete', ['count' => count($products)]);
} catch (\Throwable $e) {
$this->logger->error('Sync failed', [
'tenant' => $tenant->getId(),
'error' => $e->getMessage()
]);
}
}
}
}
Configuration
The YAML is minimal. We just define the default client, which our factory will use as a base.
# config/packages/http_client.yaml
framework:
    http_client:
        # These are the default options.
        # Our factory's withOptions() will override them.
        default_options:
            timeout: 5.0
            headers:
                'User-Agent': 'My-SaaS-Platform/1.0'

services:
    # The factory itself is auto-wired.
    # It will receive the default '@http_client' service.
    App\Service\TenantApiClientFactory:
        arguments:
            $defaultClient: '@http_client'
Verification
- Create a test command, and run php bin/console debug:container ProductSyncer to ensure everything is wired correctly.
- Set up two mock Tenant objects in a test or command.
- Point their base_uri to https://httpbin.org (httpbin's /anything endpoint is a public echo API).
- Point one tenant's auth to ['X-Api-Key' => 'TENANT_A'] and the other to ['X-Api-Key' => 'TENANT_B'].
- Call $client->request('GET', '/anything') for both.
- The JSON response from httpbin contains a headers key. Assert that the X-Api-Key header echoed back exactly matches the one you set for each specific tenant (a command sketch follows below).
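Here is a minimal sketch of such a check as a console command. It assumes your Tenant entity exposes setters mirroring the getters used by the factory (setApiBaseUri() and friends); adapt it to however your tenants are actually built.
// src/Command/VerifyTenantClientsCommand.php (sketch)
namespace App\Command;
use App\Entity\Tenant;
use App\Service\TenantApiClientFactory;
use Symfony\Component\Console\Attribute\AsCommand;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
#[AsCommand(name: 'app:verify-tenant-clients')]
final class VerifyTenantClientsCommand extends Command
{
    public function __construct(private readonly TenantApiClientFactory $clientFactory)
    {
        parent::__construct();
    }
    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        foreach (['TENANT_A', 'TENANT_B'] as $key) {
            // Hypothetical setters - swap in however your Tenant is built
            $tenant = new Tenant();
            $tenant->setApiBaseUri('https://httpbin.org');
            $tenant->setApiAuthHeaderName('X-Api-Key');
            $tenant->setApiAuthToken($key);
            $client = $this->clientFactory->createClientForTenant($tenant);
            $echoed = $client->request('GET', '/anything')->toArray();
            // httpbin echoes our request headers back to us
            $output->writeln(sprintf('%s -> %s', $key, $echoed['headers']['X-Api-Key'] ?? 'MISSING'));
        }
        return Command::SUCCESS;
    }
}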
You can now serve thousands of tenants from a single, clean, and maintainable codebase.
The “Global Observer” — Deep Logging with a Client Decorator
Your application is making hundreds of API calls from dozens of different services. A customer reports an error. You need to know:
- Exactly what was sent (URL, method, headers, body) from your server.
- Exactly what was received (status code, headers, body).
- Exactly how long it took.
The Symfony Profiler is great for dev, but it’s not available in prod. You need a robust, production-safe audit trail.
The Solution: decoration. Every service in our app depends on HttpClientInterface, so if we decorate the framework's http_client service, every outgoing request and response flows through our wrapper and gets logged globally, without any service needing to know it's happening. Decoration is the component's intended extension point, and the on_progress request option lets us observe the response without ever blocking on it.
[Service A] -> HttpClientInterface (our HttpTraceClient decorator)
-> logs the request, starts a stopwatch
-> delegates to the real http_client
-> Network
-> on_progress fires once the status code is known
-> logs the response, records the duration
-> [Service A] gets the (lazy) Response as usual
composer require symfony/stopwatch
(We also use psr/log, which is already installed with the framework.)
The Client Decorator
This one class will do all the work.
// src/HttpClient/HttpTraceClient.php
namespace App\HttpClient;
use Psr\Log\LoggerInterface;
use Symfony\Component\DependencyInjection\Attribute\AsDecorator;
use Symfony\Component\DependencyInjection\Attribute\AutowireDecorated;
use Symfony\Component\Stopwatch\Stopwatch;
use Symfony\Contracts\HttpClient\HttpClientInterface;
use Symfony\Contracts\HttpClient\ResponseInterface;
use Symfony\Contracts\HttpClient\ResponseStreamInterface;
#[AsDecorator(decorates: 'http_client')]
final class HttpTraceClient implements HttpClientInterface
{
    public function __construct(
        #[AutowireDecorated]
        private readonly HttpClientInterface $inner,
        private readonly LoggerInterface $httpClientLogger,
        private readonly Stopwatch $stopwatch,
    ) {
    }
    public function request(string $method, string $url, array $options = []): ResponseInterface
    {
        // One stopwatch event per request (requests may run concurrently)
        $watchName = 'http_client.' . bin2hex(random_bytes(4));
        $this->stopwatch->start($watchName);
        $this->httpClientLogger->info(
            sprintf('HTTP Request Sent: %s %s', $method, $url),
            [
                'http_method' => $method,
                'url' => $url,
                // Be careful logging headers/body in prod!
                // 'headers' => $options['headers'] ?? [],
            ]
        );
        $logger = $this->httpClientLogger;
        $stopwatch = $this->stopwatch;
        $callerOnProgress = $options['on_progress'] ?? null;
        $logged = false;
        // on_progress is invoked by the client during the transfer. We log
        // exactly once, as soon as a status code is known, so we never force
        // or block on the response ourselves. The duration is therefore
        // "time to response headers", not the full download.
        $options['on_progress'] = static function (int $dlNow, int $dlSize, array $info) use (&$logged, $logger, $stopwatch, $watchName, $method, $url, $callerOnProgress): void {
            if (!$logged && ($info['http_code'] ?? 0) > 0) {
                $logged = true;
                $duration = $stopwatch->isStarted($watchName)
                    ? $stopwatch->stop($watchName)->getDuration()
                    : 0;
                $logger->info(
                    sprintf('HTTP Response Received: %d %s', $info['http_code'], $url),
                    [
                        'http_method' => $method,
                        'url' => $url,
                        'status_code' => $info['http_code'],
                        'duration_ms' => $duration,
                        // 'response_headers' => $info['response_headers'] ?? [],
                    ]
                );
            }
            // Preserve any on_progress callback the caller registered
            if (null !== $callerOnProgress) {
                $callerOnProgress($dlNow, $dlSize, $info);
            }
        };
        return $this->inner->request($method, $url, $options);
    }
    public function stream(ResponseInterface|iterable $responses, ?float $timeout = null): ResponseStreamInterface
    {
        return $this->inner->stream($responses, $timeout);
    }
    public function withOptions(array $options): static
    {
        // Keep tracing in place for scoped copies created via withOptions()
        return new self($this->inner->withOptions($options), $this->httpClientLogger, $this->stopwatch);
    }
}
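If you do decide to log headers, redact the sensitive ones first. A small helper along these lines (a sketch, assuming headers are passed as an associative array; the list of sensitive names is yours to extend) keeps tokens out of your log files:
// Inside HttpTraceClient (sketch)
private const SENSITIVE_HEADERS = ['authorization', 'x-api-key', 'cookie'];
private function redactHeaders(array $headers): array
{
    $redacted = [];
    foreach ($headers as $name => $value) {
        // Replace secret values, keep everything else as-is
        $redacted[$name] = in_array(strtolower((string) $name), self::SENSITIVE_HEADERS, true)
            ? '***redacted***'
            : $value;
    }
    return $redacted;
}
Then pass $this->redactHeaders($options['headers'] ?? []) into the log context instead of the raw array.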
Configuration
The #[AsDecorator] attribute already tells Symfony to wrap the default http_client service. The only thing left is to inject our dedicated http_client logger channel.
# config/services.yaml
services:
    _defaults:
        autowire: true
        autoconfigure: true
    App\HttpClient\HttpTraceClient:
        arguments:
            # Inject the 'http_client' channel logger
            $httpClientLogger: '@monolog.logger.http_client'
# config/packages/monolog.yaml
monolog:
    channels: ['http_client'] # Define a new channel
    handlers:
        http_client:
            type: rotating_file
            path: '%kernel.logs_dir%/http_client.log'
            level: info
            channels: ['http_client'] # Only log 'http_client' messages
            max_files: 10
Verification
- Clear your cache: php bin/console cache:clear
- Make any HTTP request from any service in your application (e.g., the ProductSyncer from Pattern 2).
- Tail your new log file: tail -f var/log/http_client.log
- You will see detailed, structured log entries for both the request and the response, complete with the duration_ms.
You now have a complete, production-safe audit trail for all external HTTP communication, which is invaluable for debugging and compliance.
The “Data Stream” — Memory-Efficient Large File Uploads
A background job needs to generate a 2GB backup (e.g., a .sql.gz dump or a large CSV export) and upload it to an S3 bucket or another file storage API.
The naive approach is file_get_contents():
$body = file_get_contents('large-backup.sql.gz'); // 2GB
// BOOM! PHP Fatal error: Allowed memory size of ... exhausted
$client->request('PUT', '...', ['body' => $body]);
This loads the entire 2GB file into a single PHP string, blowing straight past your memory limit.
The Solution: symfony/http-client can stream uploads. The body option can accept a resource handle or an iterable (like a generator). This lets PHP read the file (or generate the data) chunk-by-chunk and send it over the network without ever loading the whole thing into memory.
The Service (Method A: Uploading an Existing File)
This is the simplest, most common use case: streaming a large file from disk.
// src/Service/BackupUploader.php
namespace App\Service;
use Psr\Log\LoggerInterface;
use Symfony\Component\HttpClient\Exception\TransportException;
use Symfony\Contracts\HttpClient\HttpClientInterface;
final readonly class BackupUploader
{
public function __construct(
private HttpClientInterface $storageApiClient,
private LoggerInterface $logger,
) {
}
public function uploadBackup(string $filePath): bool
{
if (!is_readable($filePath)) {
$this->logger->error('File not readable', ['path' => $filePath]);
return false;
}
// 1. Open a *resource handle* to the file.
// This does NOT load it into memory.
$fileHandle = fopen($filePath, 'r');
if ($fileHandle === false) {
$this->logger->error('Failed to open file handle', ['path' => $filePath]);
return false;
}
$this->logger->info('Starting backup upload', ['path' => $filePath]);
try {
// 2. Pass the resource handle as the body.
// HttpClient will stream from it.
$response = $this->storageApiClient->request(
'PUT',
'/my-backups/' . basename($filePath),
[
// This is the key.
'body' => $fileHandle,
'headers' => [
// Some APIs require this for large files
'Content-Type: application/octet-stream',
]
]
);
// getStatusCode() waits for the *entire* upload
$statusCode = $response->getStatusCode();
$this->logger->info('Upload complete', ['status' => $statusCode]);
return $statusCode === 200 || $statusCode === 201;
} catch (\Throwable $e) {
$this->logger->error('Upload failed', ['error' => $e->getMessage()]);
return false;
} finally {
// 3. ALWAYS close the file handle.
if (is_resource($fileHandle)) {
fclose($fileHandle);
}
}
}
}
The Service (Method B: Uploading Generated Data)
What if the data isn’t a file? What if it’s a massive CSV report you’re generating from 10 million database rows? We can use a Generator.
// src/Service/ReportUploader.php
namespace App\Service;
use App\Repository\ProductRepository; // Has 10M rows
use Symfony\Contracts\HttpClient\HttpClientInterface;
final readonly class ReportUploader
{
public function __construct(
private HttpClientInterface $storageApiClient,
private ProductRepository $productRepository,
) {
}
public function uploadProductReport(): void
{
$response = $this->storageApiClient->request(
'POST',
'/reports/product-export.csv',
[
// Pass the generator *directly* as the body.
'body' => $this->generateProductCsv(),
'headers' => ['Content-Type: text/csv']
]
);
// The client is lazy: the upload only completes once the response is
// used. Checking the status code waits for the upload to finish (and
// avoids the destructor throwing later on an unchecked error).
$response->getStatusCode();
}
/**
* This generator yields data chunk-by-chunk.
* At no point is the full report in memory.
*/
private function generateProductCsv(): \Generator
{
// 1. Yield the header row
yield "ID,SKU,Name,Price\n";
// 2. Stream results from the database (Doctrine can do this)
foreach ($this->productRepository->streamAllProducts() as $product) {
// 3. Yield one line at a time
yield sprintf(
"%d,%s,%s,%d\n",
$product->getId(),
$product->getSku(),
$product->getName(),
$product->getPrice()
);
// In a real app, you'd also clear the EntityManager periodically
// to save even more memory (see the repository sketch below).
}
}
}
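The streamAllProducts() method referenced above is the other half of the trick. Here is a sketch of how it could look using Doctrine's toIterable(), clearing the EntityManager in batches so hydrated entities don't accumulate (the entity and field names are assumptions; adapt to your schema):
// src/Repository/ProductRepository.php (sketch)
namespace App\Repository;
use App\Entity\Product;
use Doctrine\Bundle\DoctrineBundle\Repository\ServiceEntityRepository;
use Doctrine\Persistence\ManagerRegistry;
class ProductRepository extends ServiceEntityRepository
{
    public function __construct(ManagerRegistry $registry)
    {
        parent::__construct($registry, Product::class);
    }
    /**
     * Yields Product entities one by one instead of loading 10M rows at once.
     *
     * @return iterable<Product>
     */
    public function streamAllProducts(): iterable
    {
        $query = $this->createQueryBuilder('p')->getQuery();
        $i = 0;
        foreach ($query->toIterable() as $product) {
            yield $product;
            // Clear hydrated entities in batches so the identity map stays small
            if ((++$i % 1000) === 0) {
                $this->getEntityManager()->clear();
            }
        }
    }
}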
Verification
- Create a large dummy file in your project root: dd if=/dev/zero of=100mb-dummy-file.bin bs=1M count=100
- Create a test command.
- Inside the command, record memory_get_peak_usage(true).
- Test 1 (Bad): Use file_get_contents('100mb-dummy-file.bin') and pass it as the body. Record the peak memory after this line. It will be > 100MB.
- Test 2 (Good): Use fopen('100mb-dummy-file.bin', 'r') and pass the handle as the body. Record the peak memory. It will barely move from the baseline, because the file is never loaded into memory.
Your command’s output will prove the memory efficiency of the streaming pattern.
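Here is one way to structure that comparison: a small command you run twice, once per mode, so the two peak-memory readings don't contaminate each other. The upload URL is a stand-in for your storage API; point it wherever you like.
// src/Command/StreamingMemoryDemoCommand.php (sketch)
namespace App\Command;
use Symfony\Component\Console\Attribute\AsCommand;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Contracts\HttpClient\HttpClientInterface;
#[AsCommand(name: 'app:streaming-memory-demo')]
final class StreamingMemoryDemoCommand extends Command
{
    public function __construct(private readonly HttpClientInterface $client)
    {
        parent::__construct();
    }
    protected function configure(): void
    {
        $this->addArgument('mode', InputArgument::REQUIRED, '"string" or "stream"');
    }
    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        $file = '100mb-dummy-file.bin';
        $url = 'http://localhost:8080/upload'; // hypothetical upload target
        if ('string' === $input->getArgument('mode')) {
            // Bad: the whole file becomes one PHP string
            $body = file_get_contents($file);
        } else {
            // Good: a resource handle is streamed chunk-by-chunk
            $body = fopen($file, 'r');
        }
        $response = $this->client->request('PUT', $url, ['body' => $body]);
        $response->getStatusCode(); // wait for the upload to finish
        if (is_resource($body)) {
            fclose($body);
        }
        $output->writeln(sprintf('Peak memory: %.1f MB', memory_get_peak_usage(true) / 1024 / 1024));
        return Command::SUCCESS;
    }
}
Run php bin/console app:streaming-memory-demo string, then app:streaming-memory-demo stream, and compare the two numbers.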
Conclusion
symfony/http-client is one of the most powerful and well-designed components in the ecosystem. It’s not just a wrapper around cURL; it’s a fully-featured architecture component.
Today we’ve built four enterprise-ready solutions that go far beyond simple API calls:
- Messenger Decoupling: We made our application feel instant to the user, delegating slow API calls to a background worker for a vastly improved UX.
- Dynamic Factories: We built a scalable, multi-tenant solution that can create thousands of unique clients from a single, clean factory service.
- Decorator-Based Auditing: We created a zero-effort, global logging system that gives us a complete audit trail of all external communication.
- Streaming Uploads: We learned to upload gigabytes of data with a near-zero memory footprint, making our background jobs robust and reliable.
These patterns are the difference between an application that works and an application that scales.
But these are just a few of the possibilities. The true power of Symfony lies in how these components connect.
What about you? What advanced symfony/http-client patterns have you built? What’s your go-to recipe for a complex integration? Share your own variants and hard-won lessons in the comments below — let’s make this a space where we can all learn.
If you enjoy these deep dives into practical, enterprise-level Symfony architecture, be sure to follow me here.
I have many more patterns and guides planned, and your subscription is the best way to make sure you don’t miss the next one.
Go build something amazing.