File Upload & Download — Multipart Forms, Streaming, and Azure Blob
Handle file uploads and downloads correctly in ASP.NET Core — IFormFile for small files, streaming for large files, multipart upload, and serving files from Azure Blob Storage without loading them into memory.
File handling is one of the areas where "it works on my machine with a 10KB test file" blows up in production with a 500MB video. This lesson covers small-file uploads, streaming for large files, and serving files from Azure Blob Storage without memory pressure.
Single File Upload with IFormFile
For small files (under ~30 MB), IFormFile is the simplest approach:
[HttpPost("upload")]
[RequestSizeLimit(30_000_000)] // 30 MB max
[RequestFormLimits(MultipartBodyLengthLimit = 30_000_000)]
public async Task<IActionResult> Upload(
IFormFile file, CancellationToken ct)
{
if (file.Length == 0)
return BadRequest("File is empty.");
// Validate extension
var allowed = new[] { ".jpg", ".jpeg", ".png", ".pdf" };
var ext = Path.GetExtension(file.FileName).ToLowerInvariant();
if (!allowed.Contains(ext))
return BadRequest($"File type {ext} is not allowed.");
// Validate MIME type (don't trust the extension alone)
var allowedMime = new[] { "image/jpeg", "image/png", "application/pdf" };
if (!allowedMime.Contains(file.ContentType))
return BadRequest("Invalid content type.");
// Generate a safe storage name — never use the original file name directly
var storedName = $"{Guid.NewGuid()}{ext}";
Directory.CreateDirectory("uploads"); // make sure the target directory exists
var path = Path.Combine("uploads", storedName);
await using var stream = System.IO.File.Create(path);
await file.CopyToAsync(stream, ct);
return Ok(new { fileName = storedName, size = file.Length });
}
Never save file.FileName directly to disk — it can contain path traversal characters (../../../etc/passwd). Always generate your own storage name.
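The extension and Content-Type are both client-controlled, so for defence in depth you can also verify the file's leading "magic bytes" against the claimed type. Here is a minimal hand-rolled sketch; the signature table is deliberately tiny, the libraries named in the checklist below are far more thorough, and MatchesSignatureAsync is a helper name invented for this example:
// First bytes of each allowed format — illustrative, not exhaustive
private static readonly Dictionary<string, byte[]> Signatures = new()
{
    [".png"]  = new byte[] { 0x89, 0x50, 0x4E, 0x47 }, // PNG header
    [".jpg"]  = new byte[] { 0xFF, 0xD8, 0xFF },       // JPEG SOI marker
    [".jpeg"] = new byte[] { 0xFF, 0xD8, 0xFF },
    [".pdf"]  = new byte[] { 0x25, 0x50, 0x44, 0x46 }, // "%PDF"
};

private static async Task<bool> MatchesSignatureAsync(
    IFormFile file, string ext, CancellationToken ct)
{
    if (!Signatures.TryGetValue(ext, out var signature))
        return false;
    // OpenReadStream returns a fresh stream, so call this before saving
    await using var stream = file.OpenReadStream();
    var header = new byte[signature.Length];
    var read = await stream.ReadAtLeastAsync(
        header, signature.Length, throwOnEndOfStream: false, ct);
    return read == signature.Length && header.SequenceEqual(signature);
}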
Multiple File Upload
[HttpPost("upload-many")]
public async Task<IActionResult> UploadMany(
IFormFileCollection files, CancellationToken ct)
{
if (files.Count == 0)
return BadRequest("No files uploaded.");
if (files.Count > 10)
return BadRequest("Maximum 10 files per request.");
var results = new List<object>();
foreach (var file in files)
{
// Validate each file...
var storedName = $"{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";
var path = Path.Combine("uploads", storedName);
await using var stream = System.IO.File.Create(path);
await file.CopyToAsync(stream, ct);
results.Add(new { original = file.FileName, stored = storedName });
}
return Ok(results);
}
Upload With Metadata (File + JSON Together)
Combining file and JSON metadata in one multipart request:
POST /api/documents
Content-Type: multipart/form-data; boundary=----boundary
------boundary
Content-Disposition: form-data; name="metadata"
Content-Type: application/json
{"title":"Q1 Report","category":"Finance","tags":["quarterly","2026"]}
------boundary
Content-Disposition: form-data; name="file"; filename="report.pdf"
Content-Type: application/pdf
<binary data>
------boundary--
On the server, the JSON part arrives as an ordinary form field and is deserialized into a record:
public record DocumentMetadata(string Title, string Category, string[] Tags);
[HttpPost("documents")]
public async Task<IActionResult> UploadDocument(
    [FromForm] string metadata,  // the raw JSON string from the "metadata" part
    IFormFile file,
    CancellationToken ct)
{
    // [FromForm] binds complex types from individual form fields (metadata.Title, ...),
    // not from a JSON part — so deserialize the JSON field manually (System.Text.Json)
    var doc = JsonSerializer.Deserialize<DocumentMetadata>(metadata,
        new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    if (doc is null)
        return BadRequest("Invalid metadata.");
    var storedName = $"{Guid.NewGuid()}.pdf";
    // ... save file and metadata
    return Created($"/api/documents/{storedName}", new { storedName, metadata = doc });
}
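For reference, one way a client can produce exactly that request with HttpClient; the URL and file path are placeholders:
// Builds the multipart request shown above — URL and file path are illustrative
using var form = new MultipartFormDataContent();

// JSON part named "metadata" (JsonContent sets Content-Type: application/json)
form.Add(JsonContent.Create(new
{
    title = "Q1 Report",
    category = "Finance",
    tags = new[] { "quarterly", "2026" }
}), "metadata");

// File part named "file"
var pdf = new StreamContent(File.OpenRead("report.pdf"));
pdf.Headers.ContentType = new MediaTypeHeaderValue("application/pdf");
form.Add(pdf, "file", "report.pdf");

using var http = new HttpClient(); // in real code, reuse via IHttpClientFactory
var response = await http.PostAsync("https://localhost:5001/api/documents", form);
response.EnsureSuccessStatusCode();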
Streaming Large Files (No Memory Buffering)
IFormFile buffers the whole upload before your code runs: small files in memory, larger ones in a temp file on disk. For large files that buffering adds latency and disk I/O for no benefit. Use streaming instead:
// Program.cs — read the multipart body directly with MultipartReader
// instead of IFormFile, so nothing is buffered by model binding
app.MapPost("/api/uploads/stream", async (HttpContext ctx, CancellationToken ct) =>
{
    // Lift Kestrel's default ~30 MB request body limit for this endpoint
    // (IHttpMaxRequestBodySizeFeature lives in Microsoft.AspNetCore.Http.Features)
    var sizeFeature = ctx.Features.Get<IHttpMaxRequestBodySizeFeature>();
    if (sizeFeature is not null) sizeFeature.MaxRequestBodySize = null;
    if (!ctx.Request.HasFormContentType)
        return Results.BadRequest("Expected multipart/form-data.");
var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(ctx.Request.ContentType), 100);
var reader = new MultipartReader(boundary, ctx.Request.Body);
MultipartSection? section;
while ((section = await reader.ReadNextSectionAsync(ct)) is not null)
{
var hasContentDispositionHeader =
ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);
if (!hasContentDispositionHeader) continue;
if (contentDisposition!.IsFileDisposition())
{
var trustedFileName = Path.GetRandomFileName();
var path = Path.Combine("uploads", trustedFileName);
await using var targetStream = File.Create(path);
await section.Body.CopyToAsync(targetStream, ct);
// File written directly from network to disk — no full memory buffer
}
}
return Results.Ok();
})
.WithRequestTimeout(TimeSpan.FromMinutes(10)); // large files take time (requires AddRequestTimeouts + UseRequestTimeouts)
The MultipartReader reads the stream section by section. Each file section is copied directly from the network stream to the output stream without ever holding the whole file in memory.
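Note that MultipartRequestHelper is not part of the framework; it comes from the official ASP.NET Core file-upload sample. A minimal version of the one method used above:
using Microsoft.Net.Http.Headers;

public static class MultipartRequestHelper
{
    // Extracts the boundary parameter from the Content-Type header, e.g.
    // "multipart/form-data; boundary=----boundary" → "----boundary"
    public static string GetBoundary(MediaTypeHeaderValue contentType, int lengthLimit)
    {
        var boundary = HeaderUtilities.RemoveQuotes(contentType.Boundary).Value;
        if (string.IsNullOrWhiteSpace(boundary))
            throw new InvalidDataException("Missing content-type boundary.");
        if (boundary.Length > lengthLimit)
            throw new InvalidDataException(
                $"Multipart boundary length limit {lengthLimit} exceeded.");
        return boundary;
    }
}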
Upload to Azure Blob Storage
Saving files locally doesn't work in cloud deployments (pods are ephemeral). Azure Blob Storage is the standard target:
dotnet add package Azure.Storage.Blobs
public class BlobStorageService
{
private readonly BlobContainerClient _container;
public BlobStorageService(IConfiguration config)
{
var client = new BlobServiceClient(config["AzureStorage:ConnectionString"]);
_container = client.GetBlobContainerClient(config["AzureStorage:ContainerName"]);
}
public async Task<Uri> UploadAsync(
Stream content, string contentType, CancellationToken ct = default)
{
var blobName = Guid.NewGuid().ToString();
var blob = _container.GetBlobClient(blobName);
        // Use the BlobUploadOptions overload — in the (stream, headers, ...) overload
        // the third positional parameter is metadata, not a cancellation token
        await blob.UploadAsync(content,
            new BlobUploadOptions { HttpHeaders = new BlobHttpHeaders { ContentType = contentType } },
            ct);
return blob.Uri;
}
public async Task DeleteAsync(string blobName, CancellationToken ct = default)
        => await _container.GetBlobClient(blobName).DeleteIfExistsAsync(cancellationToken: ct);
}
Register the service in DI (builder.Services.AddSingleton<BlobStorageService>();) and inject it into a controller. The upload action hands the IFormFile stream straight to the service:
[HttpPost("upload")]
public async Task<IActionResult> Upload(IFormFile file, CancellationToken ct)
{
await using var stream = file.OpenReadStream();
var uri = await _blobStorage.UploadAsync(stream, file.ContentType, ct);
return Ok(new { url = uri });
}
Serving File Downloads
Option 1: Files stored locally
[HttpGet("download/{fileName}")]
public IActionResult Download(string fileName)
{
// Sanitize — prevent path traversal
var safeName = Path.GetFileName(fileName);
    // PhysicalFile() requires an absolute path, so resolve the relative one
    var path = Path.GetFullPath(Path.Combine("uploads", safeName));
if (!System.IO.File.Exists(path))
return NotFound();
// File() streams the file without loading it all into memory
return PhysicalFile(
path,
"application/octet-stream",
safeName,
enableRangeProcessing: true // supports partial content / resume
);
}
enableRangeProcessing: true supports HTTP Range requests — video players and download managers use these to resume interrupted downloads.
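For example, a client resuming an interrupted download sends a Range header and receives 206 Partial Content back (the path and byte counts here are illustrative):
GET /api/files/download/report.pdf HTTP/1.1
Range: bytes=1048576-

HTTP/1.1 206 Partial Content
Content-Range: bytes 1048576-5242879/5242880
Accept-Ranges: bytes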
Option 2: Redirect to a SAS URL on Azure Blob
Don't proxy large files through your API — generate a short-lived SAS URL and redirect the client directly to Blob Storage:
[HttpGet("download/{blobName}")]
public async Task<IActionResult> Download(string blobName, CancellationToken ct)
{
var blob = _container.GetBlobClient(blobName);
if (!await blob.ExistsAsync(ct))
return NotFound();
// Generate a SAS URL valid for 5 minutes
var sasUri = blob.GenerateSasUri(BlobSasPermissions.Read,
DateTimeOffset.UtcNow.AddMinutes(5));
return Redirect(sasUri.ToString()); // 302 — client fetches directly from Azure
}
The client gets redirected to Azure and downloads at Azure's bandwidth — not your API server's. Scales to any file size with zero server memory pressure. One caveat: GenerateSasUri only works when the BlobClient was constructed with a storage account key (shared key credential); with Azure AD credentials you need a user delegation SAS instead.
Option 3: Stream through the API (for access control)
When you need to control access per-download (audit logging, subscription checks):
[Authorize]
[HttpGet("download/{blobName}")]
public async Task Download(string blobName, CancellationToken ct)
{
// Check user has access to this file...
var blob = _container.GetBlobClient(blobName);
    var properties = await blob.GetPropertiesAsync(cancellationToken: ct);
Response.ContentType = properties.Value.ContentType;
Response.Headers.ContentDisposition =
$"attachment; filename=\"{Path.GetFileName(blobName)}\"";
Response.ContentLength = properties.Value.ContentLength;
// Stream directly from Azure to the HTTP response — no memory buffer
await blob.DownloadToAsync(Response.Body, ct);
}
Security Checklist for File Handling
✅ Validate file extension — whitelist only (.jpg, .pdf, etc.)
✅ Validate MIME type from Content-Type header
✅ Read magic bytes to verify actual file type (libraries: FileMagic, MimeDetective)
✅ Never use the original filename as the storage path
✅ Set a hard size limit with [RequestSizeLimit]
✅ Scan uploads with antivirus if storing user content (Azure Defender for Storage)
✅ Store outside the web root — files in /uploads should not be directly served
✅ Use a CDN/SAS URL for downloads — don't proxy through your API if avoidable
✅ Log every upload: user, IP, filename, size, timestamp (see the sketch below)
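For the logging item, a minimal sketch with ILogger, assuming a controller action with an injected _logger and the file variable from the upload examples above:
// Hypothetical audit entry — structured fields, names illustrative
_logger.LogInformation(
    "Upload: user={User} ip={IP} file={FileName} size={Size} at={Timestamp}",
    User.Identity?.Name,
    HttpContext.Connection.RemoteIpAddress,
    file.FileName,
    file.Length,
    DateTimeOffset.UtcNow);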
Quick Reference
// Small upload
IFormFile file → IFormFile.CopyToAsync(stream)
// Large upload (streaming, no buffer)
MultipartReader + ReadNextSectionAsync → section.Body.CopyToAsync(target)
// Azure upload
BlobClient.UploadAsync(stream, headers)
// Download — local file
PhysicalFile(path, contentType, enableRangeProcessing: true)
// Download — Azure SAS redirect (best for large files)
blob.GenerateSasUri(Read, expires) → Results.Redirect(sasUri)
// Download — stream through API (for access control)
blob.DownloadToAsync(Response.Body)