Handling large file uploads in ASP.NET Core applications requires careful configuration and performance optimization. By default, ASP.NET Core enforces request size limits to protect applications from denial-of-service attacks and excessive memory consumption. However, enterprise applications such as document management systems, media platforms, and data processing tools often require support for large file uploads. Implementing streaming, request size configuration, and secure validation ensures scalable and reliable file handling.
This article explains how to configure ASP.NET Core for large file uploads, avoid memory bottlenecks, use streaming for performance, and apply best practices for production environments.
Understanding Default Upload Limits in ASP.NET Core
ASP.NET Core limits the maximum request body size by default: Kestrel rejects bodies larger than roughly 30 MB (30,000,000 bytes). If a file exceeds this limit, the application returns an HTTP 413 Payload Too Large error.
To allow large file uploads, you must configure request size limits at different levels:
Kestrel server configuration
IIS configuration (if hosted on IIS)
Controller-level request limit attributes
Configure Kestrel Server Limits
In Program.cs, configure Kestrel to increase the maximum request body size:
builder.WebHost.ConfigureKestrel(serverOptions =>
{
serverOptions.Limits.MaxRequestBodySize = 524288000; // 500 MB
});
Alternatively, configure a per-action limit using the [RequestSizeLimit] attribute:
[HttpPost("upload")]
[RequestSizeLimit(524288000)] // 500 MB
public async Task<IActionResult> Upload(IFormFile file)
{
    // Process the uploaded file here.
    return Ok();
}
Configure IIS for Large Uploads
If hosting on IIS, update web.config:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="524288000" />
    </requestFiltering>
  </security>
</system.webServer>
This ensures IIS does not block large file requests before reaching ASP.NET Core.
Avoid Loading Large Files into Memory
Reading an IFormFile into a MemoryStream with CopyToAsync buffers the entire file in memory and can exhaust server memory for very large files. Instead, use streaming to process files efficiently.
Example of streaming to disk:
[HttpPost("upload")]
public async Task<IActionResult> Upload()
{
    var file = Request.Form.Files[0];

    // Never trust the client-supplied file name: strip any path components
    // to prevent path traversal attacks.
    var safeFileName = Path.GetFileName(file.FileName);

    Directory.CreateDirectory("Uploads");
    var filePath = Path.Combine("Uploads", safeFileName);

    using (var stream = new FileStream(filePath, FileMode.Create))
    {
        await file.CopyToAsync(stream);
    }
    return Ok("File uploaded successfully");
}
For extremely large files, use Request.Body directly to stream content without buffering.
Disable Form Value Model Binding for Streaming
To optimize performance, disable form value model binding when handling large files, so the framework does not read and buffer the form before your action runs. Note that [DisableFormValueModelBinding] is not built into ASP.NET Core; it is a small custom resource filter shown in the official file upload documentation.
[HttpPost("upload-large")]
[DisableFormValueModelBinding]
[RequestSizeLimit(long.MaxValue)]
public async Task<IActionResult> UploadLargeFile()
{
    // Copy the raw request body straight to disk without buffering it in memory.
    using var stream = new FileStream("largefile.dat", FileMode.Create);
    await Request.Body.CopyToAsync(stream);
    return Ok();
}
This approach avoids loading the entire file into memory.
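A minimal sketch of the [DisableFormValueModelBinding] filter, modeled on the sample in the official ASP.NET Core documentation. It removes the value-provider factories that would otherwise read (and buffer) the form body before the action executes:

```csharp
using System;
using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.AspNetCore.Mvc.ModelBinding;

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class DisableFormValueModelBindingAttribute : Attribute, IResourceFilter
{
    public void OnResourceExecuting(ResourceExecutingContext context)
    {
        // Without these factories, model binding no longer triggers a
        // read of the multipart form, leaving the body untouched.
        var factories = context.ValueProviderFactories;
        factories.RemoveType<FormValueProviderFactory>();
        factories.RemoveType<FormFileValueProviderFactory>();
        factories.RemoveType<JQueryFormValueProviderFactory>();
    }

    public void OnResourceExecuted(ResourceExecutedContext context) { }
}
```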
Validate File Size and Type
Always validate file size and content type to prevent malicious uploads.
if (file.Length > 524288000) // 500 MB
{
    return BadRequest("File size exceeds limit.");
}

var allowedExtensions = new[] { ".pdf", ".jpg", ".png" };
var extension = Path.GetExtension(file.FileName);

if (!allowedExtensions.Contains(extension, StringComparer.OrdinalIgnoreCase))
{
    return BadRequest("Invalid file type.");
}
Server-side validation is essential for security: client-side checks are trivially bypassed, and an extension can be renamed, so for sensitive scenarios also verify the file's content signature.
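Because an extension check alone can be fooled by renaming a file, a stricter validation inspects the file's leading bytes ("magic numbers"). A minimal sketch; the signature table below covers only PNG, JPEG, and PDF and would need extending for other types:

```csharp
using System;

public static class FileSignatureValidator
{
    // Well-known leading bytes for a few common formats.
    private static readonly (string Extension, byte[] Signature)[] Signatures =
    {
        (".png", new byte[] { 0x89, 0x50, 0x4E, 0x47 }),
        (".jpg", new byte[] { 0xFF, 0xD8, 0xFF }),
        (".pdf", new byte[] { 0x25, 0x50, 0x44, 0x46 }), // "%PDF"
    };

    // Returns true only if the claimed extension is known and the
    // file header starts with that format's signature.
    public static bool Matches(string extension, ReadOnlySpan<byte> header)
    {
        foreach (var (ext, sig) in Signatures)
        {
            if (ext.Equals(extension, StringComparison.OrdinalIgnoreCase))
            {
                return header.Length >= sig.Length &&
                       header.Slice(0, sig.Length).SequenceEqual(sig);
            }
        }
        return false; // unknown extension: reject
    }
}
```

In an upload action, read the first few bytes of the stream, pass them to Matches, and rewind the stream before saving.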
Use Chunked Upload for Very Large Files
For multi-gigabyte uploads, implement chunked upload strategy:
Split file into smaller chunks on frontend
Upload chunks sequentially
Reassemble file on server
This reduces failure risk and improves reliability over unstable networks.
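The server side of this strategy can be sketched with two endpoints: one that receives a chunk and one that reassembles the parts. The routes, the uploadId scheme, and the chunk-naming convention here are illustrative, not a fixed API:

```csharp
// Hypothetical chunk-receiving endpoint: the client posts each chunk with an
// upload id and a sequence index, then calls the complete endpoint.
[HttpPost("upload/chunk")]
public async Task<IActionResult> UploadChunk(string uploadId, int index)
{
    // Validate uploadId (e.g., require a GUID) before using it in a path.
    var dir = Path.Combine("Uploads", "chunks", uploadId);
    Directory.CreateDirectory(dir);

    // Zero-padded index so an alphabetical sort restores chunk order.
    var partPath = Path.Combine(dir, $"{index:D6}.part");
    using var stream = new FileStream(partPath, FileMode.Create);
    await Request.Body.CopyToAsync(stream);
    return Ok();
}

[HttpPost("upload/complete")]
public async Task<IActionResult> CompleteUpload(string uploadId, string fileName)
{
    var dir = Path.Combine("Uploads", "chunks", uploadId);
    var target = Path.Combine("Uploads", Path.GetFileName(fileName));

    // Concatenate the chunks in order, then discard them.
    using (var output = new FileStream(target, FileMode.Create))
    {
        foreach (var part in Directory.GetFiles(dir, "*.part").OrderBy(p => p))
        {
            using var input = new FileStream(part, FileMode.Open);
            await input.CopyToAsync(output);
        }
    }
    Directory.Delete(dir, recursive: true);
    return Ok();
}
```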
Configure Multipart Body Length Limit
You can configure form options globally:
// Requires: using Microsoft.AspNetCore.Http.Features;
builder.Services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = 524288000; // 500 MB
});
This ensures multipart uploads support large file sizes.
Performance Best Practices
Use streaming instead of buffering
Store files outside application root
Use asynchronous file operations
Enable HTTPS for secure transfer
Consider cloud storage integration (Azure Blob Storage or AWS S3)
Monitor disk space and upload throughput
Proper configuration ensures scalability while protecting system resources.
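For the cloud storage option above, uploads can be streamed straight to the storage service rather than landing on local disk first. A sketch using the Azure.Storage.Blobs package; the connection string and container name are placeholders:

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class BlobUploadService
{
    private readonly BlobContainerClient _container;

    public BlobUploadService(string connectionString)
    {
        // "uploads" is an illustrative container name.
        _container = new BlobContainerClient(connectionString, "uploads");
    }

    public async Task UploadAsync(string blobName, Stream content)
    {
        // The SDK uploads the stream in blocks, so the whole file is not
        // buffered in application memory at once.
        var blob = _container.GetBlobClient(blobName);
        await blob.UploadAsync(content, overwrite: true);
    }
}
```

In an action, pass Request.Body (with model binding disabled, as shown earlier) or file.OpenReadStream() as the content stream.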
Common Mistakes to Avoid
Forgetting IIS request limit configuration
Loading entire file into memory
Not validating file type
Allowing unlimited upload size
Storing sensitive files without encryption
Careful handling of large uploads improves reliability and protects backend infrastructure.
Summary
Handling large file uploads in ASP.NET Core requires configuring request size limits in Kestrel and IIS, using streaming instead of memory buffering, validating file size and type, and implementing chunked upload strategies for very large files. By leveraging asynchronous file operations, disabling unnecessary model binding, and applying proper security validations, developers can build scalable and secure web applications capable of processing large file uploads efficiently without overwhelming server resources.