.NET & C# Development · Lesson 66 of 92
Stream Large Datasets With IAsyncEnumerable
The Problem With Task<List<T>>
// This loads every row into memory before returning anything
public async Task<List<ReportRow>> GetReportAsync()
{
return await db.ReportRows.ToListAsync(); // 100k rows in RAM
}If that query returns 100,000 rows, you're allocating all of them before the first byte reaches the caller. For large exports, batch jobs, or streaming APIs, this kills memory.
IAsyncEnumerable<T>
IAsyncEnumerable<T> lets you produce items one at a time with yield return inside an async method. The caller receives items as they are produced — no intermediate list.
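The mechanics are easiest to see in isolation, before any database is involved. A minimal self-contained sketch (StreamBasics and GenerateAsync are illustrative names, not from any library):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class StreamBasics
{
    // An async iterator: `async` + `yield return` in one method.
    // Each item reaches the consumer as soon as it is produced.
    public static async IAsyncEnumerable<int> GenerateAsync(int count)
    {
        for (var i = 1; i <= count; i++)
        {
            await Task.Delay(10); // stand-in for real async work (I/O, a query, ...)
            yield return i;       // hands this item to the caller immediately
        }
    }

    public static async Task RunAsync()
    {
        await foreach (var n in GenerateAsync(3))
            Console.WriteLine($"received {n}"); // prints as each item arrives
    }
}
```

Each `received n` line prints roughly 10 ms apart, because the consumer pulls the next item only after processing the current one.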
public async IAsyncEnumerable<ReportRow> GetReportStreamAsync(
[EnumeratorCancellation] CancellationToken ct = default)
{
await foreach (var row in db.ReportRows.AsAsyncEnumerable().WithCancellation(ct))
{
// You can transform here — no need to buffer
yield return new ReportRow
{
Id = row.Id,
Value = row.RawValue * 1.2m,
Label = row.Category.ToUpperInvariant()
};
}
}

[EnumeratorCancellation] wires up the token so WithCancellation(ct) on the consumer side propagates correctly.
Consuming with await foreach
await foreach (var row in service.GetReportStreamAsync(cancellationToken))
{
await writer.WriteLineAsync(row.ToString());
}

Items are processed as they arrive. Memory usage stays roughly constant regardless of dataset size.
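A useful consequence: enumeration is pull-based, so the consumer controls how much work the producer does. Breaking out of await foreach disposes the enumerator and production stops at that point, even for an unbounded stream. A sketch (EarlyExit is a made-up name):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class EarlyExit
{
    public static int Produced; // how many items the producer actually yielded

    // An infinite stream: safe, because production is driven by the consumer.
    public static async IAsyncEnumerable<int> NumbersAsync()
    {
        for (var i = 0; ; i++)
        {
            await Task.Yield();
            Produced++;
            yield return i; // suspends here until the consumer asks for more
        }
    }

    public static async Task<int> TakeThreeAsync()
    {
        var seen = 0;
        await foreach (var n in NumbersAsync())
        {
            seen++;
            if (n == 2) break; // breaking disposes the enumerator; production stops here
        }
        return seen;
    }
}
```

After TakeThreeAsync completes, Produced is exactly 3: the infinite loop never ran ahead of the consumer.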
Streaming From EF Core
EF Core's AsAsyncEnumerable() opens a streaming cursor rather than loading the whole result set:
public async IAsyncEnumerable<ProductDto> StreamProductsAsync(
string category,
[EnumeratorCancellation] CancellationToken ct = default)
{
var query = db.Products
.Where(p => p.Category == category)
.OrderBy(p => p.Id)
.Select(p => new ProductDto(p.Id, p.Name, p.Price));
await foreach (var product in query.AsAsyncEnumerable().WithCancellation(ct))
{
yield return product;
}
}

Do not use Skip/Take in a loop for pagination when you just need all rows — that generates N queries. Use AsAsyncEnumerable() for a single forward pass.
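When the sink prefers batches (bulk inserts into another store, batched API calls), you can keep the single forward pass and still group items. A sketch of a small buffering extension (ChunkAsync is a hypothetical helper, built only on the BCL; it buffers at most `size` items at a time):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

public static class AsyncChunkExtensions
{
    // Groups an async stream into fixed-size batches while keeping the
    // single forward pass: at most `size` items are buffered at any moment.
    public static async IAsyncEnumerable<IReadOnlyList<T>> ChunkAsync<T>(
        this IAsyncEnumerable<T> source,
        int size,
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        var batch = new List<T>(size);
        await foreach (var item in source.WithCancellation(ct))
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;
                batch = new List<T>(size); // fresh list; the consumer may hold the old one
            }
        }
        if (batch.Count > 0)
            yield return batch; // trailing partial batch
    }
}
```

Usage against the stream above would look like `await foreach (var batch in StreamProductsAsync("books", ct).ChunkAsync(500, ct)) { /* bulk write */ }` — one query, constant memory, batched writes.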
Streaming HTTP Responses (NDJSON)
ASP.NET Core supports returning IAsyncEnumerable<T> directly from controller actions. From .NET 6 onward, System.Text.Json serialises each item as it arrives and flushes the response incrementally instead of buffering the whole sequence.
[HttpGet("export")]
public IAsyncEnumerable<ReportRowDto> ExportAsync(CancellationToken ct) =>
    reportService.GetReportStreamAsync(ct);

Response headers will include Transfer-Encoding: chunked. Note, though, that the built-in formatter streams the body as a single JSON array — it is not NDJSON. For newline-delimited output, where the client reads each line as a complete JSON object:

{"id":1,"label":"A","value":12.5}
{"id":2,"label":"B","value":9.0}
...

write to the response directly with a manual loop, which also gives you control over the content type and flush behaviour:
[HttpGet("export-manual")]
public async Task ExportManualAsync(CancellationToken ct)
{
Response.ContentType = "application/x-ndjson";
await foreach (var row in reportService.GetReportStreamAsync(ct))
{
var json = JsonSerializer.Serialize(row);
await Response.WriteAsync(json + "\n", ct);
await Response.Body.FlushAsync(ct);
}
}

Cancellation in Async Streams
Cancellation needs to be threaded through correctly — it does not just propagate by magic.
public async IAsyncEnumerable<SensorReading> PollSensorsAsync(
IEnumerable<int> sensorIds,
[EnumeratorCancellation] CancellationToken ct = default)
{
foreach (var id in sensorIds)
{
ct.ThrowIfCancellationRequested();
var reading = await sensorClient.ReadAsync(id, ct);
yield return reading;
await Task.Delay(100, ct); // throttle between reads
}
}

When the consumer cancels (e.g., the request is aborted), ct.ThrowIfCancellationRequested() ends the stream with an OperationCanceledException instead of finishing all sensors.
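From the consumer's side, the same wiring looks like this. A self-contained sketch (TickAsync and ConsumeAsync are illustrative names) showing WithCancellation feeding the token into the producer and cancellation surfacing as OperationCanceledException:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

public static class CancelDemo
{
    public static async IAsyncEnumerable<int> TickAsync(
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        for (var i = 0; ; i++)
        {
            await Task.Delay(20, ct); // the token interrupts the stream mid-await
            yield return i;
        }
    }

    public static async Task<int> ConsumeAsync()
    {
        using var cts = new CancellationTokenSource();
        var seen = 0;
        try
        {
            // WithCancellation routes cts.Token into TickAsync's
            // [EnumeratorCancellation] parameter.
            await foreach (var tick in TickAsync().WithCancellation(cts.Token))
            {
                seen++;
                if (seen == 3) cts.Cancel(); // consumer decides it has enough
            }
        }
        catch (OperationCanceledException)
        {
            // The producer's next await observed the token and threw;
            // this is the normal exit path for a cancelled stream.
        }
        return seen; // 3
    }
}
```

The catch is on the consumer side because the exception propagates out of MoveNextAsync, i.e., out of the await foreach itself.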
Comparing with Task<List<T>>
// 100k rows — Task<List<T>>
var sw = Stopwatch.StartNew();
var list = await db.Orders.ToListAsync();
Console.WriteLine($"Loaded {list.Count} items in {sw.ElapsedMilliseconds}ms, {GC.GetTotalMemory(false) / 1_000_000}MB");
// Output: Loaded 100000 items in 1840ms, 412MB
// 100k rows — IAsyncEnumerable<T>
sw.Restart();
var count = 0;
await foreach (var order in db.Orders.AsAsyncEnumerable())
{
count++;
_ = order.Id; // simulate processing
}
Console.WriteLine($"Streamed {count} items in {sw.ElapsedMilliseconds}ms, {GC.GetTotalMemory(false) / 1_000_000}MB");
// Output: Streamed 100000 items in 2100ms, 18MB

Processing is slightly slower (streaming has overhead per item), but memory drops from ~400 MB to ~18 MB. For batch jobs running on constrained infrastructure, that difference is the job passing or OOM-crashing.
Practical Rules
- Use IAsyncEnumerable<T> when the result could be large and the consumer can process items one at a time (exports, reports, batch pipelines).
- Always accept an [EnumeratorCancellation] CancellationToken in the producer signature.
- Do not put yield return inside a try/catch — the compiler rejects a yield return in a try block that has a catch clause (error CS1626). Do the awaited work inside the try/catch, catching only the specific exceptions you can handle, and move the yield outside it.
- Avoid mixing await foreach with EF Core change tracking on a context that is also performing writes — the streaming cursor holds the connection open. Use AsNoTracking() for read-only streams.
- Don't return IAsyncEnumerable<T> across HTTP if the client cannot consume streaming responses — buffer with ToListAsync() at the boundary instead.
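The try/catch rule in practice: capture the awaited result inside the try, then yield outside it. A sketch with a hypothetical line reader (SafeLines is a made-up name; the failure-handling policy here, treating an IOException as end of stream, is one choice among several):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

public static class SafeLines
{
    // Streams lines from a file, treating an I/O failure mid-read as end of stream.
    public static async IAsyncEnumerable<string> ReadLinesSafeAsync(string path)
    {
        using var reader = new StreamReader(path); // using (try/finally) is fine with yield
        while (true)
        {
            string? line;
            try
            {
                line = await reader.ReadLineAsync(); // the risky await lives inside the try...
            }
            catch (IOException)
            {
                line = null; // ...a specific failure is mapped to "no more items"
            }
            if (line is null)
                yield break;
            yield return line; // ...and the yield stays outside the try/catch
        }
    }
}
```

Note the asymmetry: yield return is legal inside a try/finally (which is why the using statement works), just not inside a try that has a catch clause.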