28 Jan 2026

Optimizing Big Data Imports in Symfony

When processing millions of records in Symfony console commands, performance becomes critical. Here are three practical optimization techniques that can dramatically improve your import speeds.

1. Disable SQL Logging

$this->entityManager->getConnection()->getConfiguration()->setSQLLogger(null);

Doctrine's SQL logger keeps every executed query in memory; it is what feeds the profiler in the dev environment. Over millions of records that log grows without bound and becomes significant memory overhead. Disabling it is a simple one-liner. Note that SQLLogger is deprecated as of Doctrine DBAL 3.2 in favor of logging middleware; on newer DBAL versions, simply leave the logging middleware unconfigured for long-running imports.
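In a console command, the logger is typically disabled once at the top of execute(), before the import loop. A minimal sketch, assuming the entity manager was injected via the constructor and eliding the actual import loop:

```php
// Sketch of a Symfony console command's execute() method; assumes
// $this->entityManager (Doctrine\ORM\EntityManagerInterface) was
// injected via the constructor.
protected function execute(InputInterface $input, OutputInterface $output): int
{
    // Detach the SQL logger so query objects are no longer accumulated
    // in memory for the rest of this long-running process.
    $this->entityManager->getConnection()->getConfiguration()->setSQLLogger(null);

    // ... import loop goes here ...

    return Command::SUCCESS;
}
```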

2. Batch Flushing

if ($itemsCreated > 0 && $itemsCreated % 1000 === 0) {
    $this->entityManager->flush(); // write the whole batch in one transaction
    $this->entityManager->clear(); // detach managed entities to free memory
}

Instead of flushing after every record, batch your flushes. Flushing every 1000 records writes each batch to the database in a single transaction, and the clear() call that follows detaches the managed entities so Doctrine's unit of work does not grow without bound. Don't forget a final flush() after the loop to write the last, partially filled batch.
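Put together, the loop might look like this sketch (Item and its setExternalId() setter are hypothetical names; $rows stands for whatever your import source yields):

```php
$itemsCreated = 0;

foreach ($rows as $row) {
    $item = new Item();                        // hypothetical entity
    $item->setExternalId($row['external_id']);
    $this->entityManager->persist($item);
    $itemsCreated++;

    if ($itemsCreated % 1000 === 0) {
        $this->entityManager->flush();         // write the whole batch at once
        $this->entityManager->clear();         // detach entities, free memory
    }
}

// Write the final, partially filled batch.
$this->entityManager->flush();
$this->entityManager->clear();
```

One caveat: clear() detaches every managed entity, so any entity reference held across iterations (a related Category, say) must be re-acquired after each clear, for example via getReference().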

3. Pre-load Entities

Avoid queries inside loops!

// Load every existing external ID in one query, bypassing the ORM
$existingIds = $this->connection->fetchFirstColumn(
    "SELECT external_id FROM items"
);

// Flip into a hash map so membership checks are O(1)
$existingMap = array_flip($existingIds);

Pre-collect all necessary data before processing. This replaces an expensive query per record with a single query up front plus a constant-time isset() check per record, eliminating the N+1 query problem entirely.
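Inside the loop, the map then replaces a per-record SELECT with an array lookup. A sketch under the same assumptions as above (hypothetical Item entity, $rows as the import source):

```php
foreach ($rows as $row) {
    // O(1) membership check instead of one SELECT per record
    if (isset($existingMap[$row['external_id']])) {
        continue; // already imported, skip
    }

    $item = new Item();                        // hypothetical entity
    $item->setExternalId($row['external_id']);
    $this->entityManager->persist($item);
    // ... batch flush/clear as in section 2 ...
}
```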

Summary

  • Disable SQL logging to reduce memory overhead
  • Batch your flushes (every 1000 records)
  • Pre-load entities to prevent N+1 queries