Backend Systems · Advanced

Bulk Insert 10,000 Rows Without Killing Your Database

AddRange + SaveChanges chokes on large datasets. Learn EF Core 7+ ExecuteUpdateAsync/ExecuteDeleteAsync, EFCore.BulkExtensions, smart batching, and when raw SQL is the right answer — with real benchmark numbers.

Learnixo · April 14, 2026 · 5 min read
.NET · C# · EF Core · Entity Framework · Bulk Operations · Performance

Why AddRange + SaveChanges is Slow

EF Core's change tracker was built for safety, not throughput. When you call AddRange with 10,000 entities and then SaveChangesAsync, EF:

  1. Tracks every entity in memory
  2. Generates one INSERT statement per entity
  3. Executes each statement in sequence (or small batches of ~42 by default)
  4. Reads back generated keys for each row

For 10,000 rows that is roughly 240 round-trips to the database. On a local connection it takes seconds. Over a network it takes minutes.

C#
// Slow — do NOT do this for large datasets
var products = GenerateProducts(10_000);
_db.Products.AddRange(products);
await _db.SaveChangesAsync(ct); // ~8-12 seconds for 10k rows
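
If you have to stay on the SaveChanges path, you can at least raise the statements-per-round-trip limit. A minimal sketch, assuming the SQL Server provider and a typical AppDbContext registration (the names here are illustrative, not part of the example above):

C#
// Sketch: MaxBatchSize packs more INSERTs into each round-trip.
// SaveChanges still tracks every entity and reads back generated keys,
// so expect a modest win — not bulk-copy speed.
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(connectionString, sql => sql.MaxBatchSize(1_000)));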

EF Core 7+ — ExecuteUpdateAsync and ExecuteDeleteAsync

For bulk updates and deletes you do not need any third-party library. EF Core 7 added ExecuteUpdateAsync and ExecuteDeleteAsync, which translate directly to a single UPDATE or DELETE statement.

Bulk Delete

C#
// Deletes all expired sessions in ONE SQL DELETE
int deleted = await _db.Sessions
    .Where(s => s.ExpiresAt < DateTime.UtcNow)
    .ExecuteDeleteAsync(ct);

// Generated SQL (simplified):
// DELETE FROM Sessions WHERE ExpiresAt < '2026-04-14 00:00:00'

No entity loading, no change tracking, no round-trips per row.

Bulk Update

C#
// Marks all overdue invoices as such in ONE SQL UPDATE
int updated = await _db.Invoices
    .Where(i => i.DueDate < DateTime.UtcNow && i.Status == InvoiceStatus.Pending)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(i => i.Status, InvoiceStatus.Overdue)
        .SetProperty(i => i.UpdatedAt, DateTime.UtcNow), ct);

ExecuteUpdateAsync and ExecuteDeleteAsync bypass the change tracker entirely — interceptors, SaveChanges hooks, and audit logs do NOT fire for these operations. Keep that in mind.
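
If you depend on an audit trail, one option is to write the audit row yourself and commit it in the same transaction as the bulk update. A minimal sketch, assuming a hypothetical AuditLog entity and AuditLogs DbSet on the same context:

C#
// Sketch only: ExecuteUpdateAsync skips interceptors, so the audit entry
// is written explicitly. AuditLog/AuditLogs are illustrative names.
await using var tx = await _db.Database.BeginTransactionAsync(ct);

int updated = await _db.Invoices
    .Where(i => i.DueDate < DateTime.UtcNow && i.Status == InvoiceStatus.Pending)
    .ExecuteUpdateAsync(s => s.SetProperty(i => i.Status, InvoiceStatus.Overdue), ct);

_db.AuditLogs.Add(new AuditLog
{
    Action = $"Bulk-marked {updated} invoices overdue",
    OccurredAt = DateTime.UtcNow
});
await _db.SaveChangesAsync(ct); // audit row goes through the normal pipeline

await tx.CommitAsync(ct);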


EFCore.BulkExtensions — Bulk Insert and Upsert

For bulk inserts, use EFCore.BulkExtensions. It uses SqlBulkCopy under the hood (SQL Server) or equivalent for PostgreSQL/MySQL.

Bash
dotnet add package EFCore.BulkExtensions

Bulk Insert

C#
var products = GenerateProducts(10_000);

await _db.BulkInsertAsync(products, cancellationToken: ct);
// Executes a single SqlBulkCopy operation
// ~150ms for 10k rows vs ~10,000ms with AddRange
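
BulkInsertAsync also takes a BulkConfig when you need to tune the copy or read generated keys back onto your entities. A short sketch using the library's BatchSize and SetOutputIdentity options:

C#
var config = new BulkConfig
{
    BatchSize = 5_000,        // rows per internal SqlBulkCopy batch
    SetOutputIdentity = true  // copy database-generated Ids back onto the entities
};

await _db.BulkInsertAsync(products, config, cancellationToken: ct);

SetOutputIdentity does extra work to map the keys back, so enable it only when you actually need them.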

Bulk Insert or Update (Upsert)

C#
var options = new BulkConfig
{
    UpdateByProperties = [nameof(Product.Sku)], // match on SKU, not PK
    PropertiesToExclude = [nameof(Product.CreatedAt)] // don't overwrite CreatedAt on update
};

await _db.BulkInsertOrUpdateAsync(products, options, cancellationToken: ct);

Bulk Delete

C#
// Deletes entities matching the list by PK
await _db.BulkDeleteAsync(productsToDelete, cancellationToken: ct);

Batching Large Datasets

Even BulkInsert should be batched if you are inserting millions of rows. A single batch of 1M rows holds everything in memory and can cause timeouts.

C#
const int BatchSize = 5_000;

var allProducts = await LoadProductsFromFile(); // 500,000 products

foreach (var batch in allProducts.Chunk(BatchSize))
{
    await _db.BulkInsertAsync(batch.ToList(), cancellationToken: ct);
    // Optional: reset context to free memory
}

Chunk() is a built-in LINQ method (.NET 6+) that splits a sequence into arrays of at most BatchSize elements.
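
The "reset context" comment above matters on very long imports: a fresh DbContext per batch keeps memory flat no matter how much other tracked work happens in between. A sketch, assuming an injected IDbContextFactory<AppDbContext> (here _dbFactory — an illustrative name):

C#
const int BatchSize = 5_000;

foreach (var batch in allProducts.Chunk(BatchSize))
{
    // A new context (and connection) per batch, disposed at the end of each loop
    await using var db = await _dbFactory.CreateDbContextAsync(ct);
    await db.BulkInsertAsync(batch.ToList(), cancellationToken: ct);
}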


Tracking vs No-Tracking for Bulk Reads

When reading large datasets for processing (not for updates), always use AsNoTracking(). The change tracker adds memory overhead proportional to the number of entities.

C#
// Slow for large reads — change tracker stores all 50k entities
var tracked = await _db.Products.ToListAsync(ct);

// Fast — no change tracker overhead
var untracked = await _db.Products
    .AsNoTracking()
    .ToListAsync(ct);

For even larger datasets, stream with AsAsyncEnumerable() instead of loading everything into memory:

C#
await foreach (var product in _db.Products
    .AsNoTracking()
    .AsAsyncEnumerable()
    .WithCancellation(ct))
{
    await ProcessProductAsync(product);
}

This reads one row at a time from the open data reader — memory stays flat regardless of result set size.
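
Streaming combines naturally with bulk writes for read-transform-write jobs: stream the source, buffer a batch, flush it with BulkInsertAsync. A sketch with illustrative Orders/ArchivedOrder types, a cutoff date, and a second context (_archiveDb) for the write side so the read stream's connection stays free:

C#
var buffer = new List<ArchivedOrder>(capacity: 5_000);

await foreach (var order in _db.Orders
    .AsNoTracking()
    .Where(o => o.CreatedAt < cutoff)
    .AsAsyncEnumerable()
    .WithCancellation(ct))
{
    buffer.Add(new ArchivedOrder { Id = order.Id, Total = order.Total });

    if (buffer.Count == 5_000)
    {
        // Write on a separate context so the open data reader is undisturbed
        await _archiveDb.BulkInsertAsync(buffer, cancellationToken: ct);
        buffer.Clear();
    }
}

if (buffer.Count > 0)
    await _archiveDb.BulkInsertAsync(buffer, cancellationToken: ct);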


Benchmark Numbers (SQL Server, local)

| Operation | 10,000 rows | Notes |
|---|---|---|
| AddRange + SaveChanges | ~9,500ms | Default batch size ~42 |
| BulkInsert (EFCore.BulkExtensions) | ~180ms | SqlBulkCopy |
| ExecuteDeleteAsync | ~25ms | Single DELETE statement |
| ExecuteUpdateAsync | ~30ms | Single UPDATE statement |
| Raw SqlBulkCopy | ~120ms | Baseline; slightly faster than BulkExtensions |

Numbers vary by hardware, network, and row width. The ratios hold.


When to Drop Down to Raw SQL

EF Core is not the right tool for everything. Use raw SQL when:

  • You need database-specific features (MERGE, COPY, table hints)
  • Your operation is complex enough that EF's translation is worse than what you'd write
  • You are doing initial data seeding with hundreds of thousands of rows

C#
// Raw SQL for seeding reference data
await _db.Database.ExecuteSqlRawAsync(@"
    INSERT INTO Countries (Code, Name)
    SELECT Code, Name FROM OPENJSON(@json)
    WITH (Code NVARCHAR(2), Name NVARCHAR(100))",
    new SqlParameter("@json", JsonSerializer.Serialize(countries)));

For very high-throughput pipelines, also consider bypassing EF entirely for the write path and using SqlBulkCopy directly:

C#
using var bulkCopy = new SqlBulkCopy(connectionString);
bulkCopy.DestinationTableName = "Products";
bulkCopy.BatchSize = 5_000;
bulkCopy.BulkCopyTimeout = 120;

var table = ToDataTable(products); // map to DataTable
await bulkCopy.WriteToServerAsync(table, ct);
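
ToDataTable above is a placeholder. A minimal hand-rolled version might look like this, assuming Product exposes Sku, Name, and Price and the column names match the destination table (otherwise add ColumnMappings on the SqlBulkCopy):

C#
using System.Data;

static DataTable ToDataTable(IEnumerable<Product> products)
{
    var table = new DataTable();
    table.Columns.Add("Sku", typeof(string));
    table.Columns.Add("Name", typeof(string));
    table.Columns.Add("Price", typeof(decimal));

    foreach (var p in products)
        table.Rows.Add(p.Sku, p.Name, p.Price);

    return table;
}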

Decision Tree

Need to insert large dataset?
  └─ Yes → Use BulkInsertAsync (EFCore.BulkExtensions)
         → For 500k+ rows, batch with Chunk()

Need to update many rows matching a condition?
  └─ Yes → Use ExecuteUpdateAsync (EF Core 7+)

Need to delete many rows matching a condition?
  └─ Yes → Use ExecuteDeleteAsync (EF Core 7+)

Reading large dataset without updates?
  └─ Yes → Use AsNoTracking()
         → Streaming? Use AsAsyncEnumerable()

Nothing above fits?
  └─ Use raw SQL or SqlBulkCopy directly

The default SaveChanges path is fine for typical CRUD operations on a handful of entities. The moment you are processing lists measured in thousands, switch to the appropriate bulk tool.

