This article explains how to build a production-grade large file upload system for 1–5 GB files, with chunked upload, resume support, parallel chunk uploads, and checksum verification.
It includes practical backend and frontend code, workflows, and diagrams.
1. Introduction
Uploading multi-GB files cannot rely on a normal single-request upload: browsers time out, networks fluctuate, and server memory limits are exceeded.
A robust solution is chunk upload, where files are split into small segments (2 MB–10 MB).
Each chunk is uploaded separately, and the server stitches them.
This approach supports:
Resume upload after interruption
Efficient bandwidth usage
Parallel chunk uploads
Easy retry of failed chunks
Verification using checksum
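To get a feel for the numbers involved, the chunk count is simply the file size divided by the chunk size, rounded up:

```typescript
// Number of chunks needed for a given file size and chunk size.
function chunkCount(fileSize: number, chunkSize: number): number {
  return Math.ceil(fileSize / chunkSize);
}

// Example: a 3 GB file with 5 MB chunks needs 615 uploads.
const totalChunks = chunkCount(3 * 1024 ** 3, 5 * 1024 ** 2); // 615
```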
2. Key Features of Our Chunk Upload System
Resumable uploads that survive network interruptions
Parallel upload of independent chunks
Automatic retry of failed chunks
Progress tracking via UploadedBytes and CompletedChunks
Checksum-based verification of the final file
3. Database Design
Table: UploadSession
UploadSessionId (PK)
FileName
FileSize
UploadedBytes
TotalChunks
CompletedChunks
Status (InProgress / Completed / Failed)
CreatedDate
Table: UploadChunks
ChunkId (PK)
UploadSessionId (FK)
ChunkNumber
Size
IsUploaded (bit)
UploadedDate
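On the client, the two tables can be mirrored as TypeScript interfaces. The names and casing below are illustrative, matching ASP.NET Core's default camelCase JSON serialization:

```typescript
// Hypothetical TypeScript mirrors of the UploadSession and UploadChunks
// tables, useful for typing API responses on the Angular side.
type UploadStatus = 'InProgress' | 'Completed' | 'Failed';

interface UploadSession {
  uploadSessionId: number;   // PK
  fileName: string;
  fileSize: number;          // bytes
  uploadedBytes: number;
  totalChunks: number;
  completedChunks: number;
  status: UploadStatus;
  createdDate: string;       // ISO timestamp
}

interface UploadChunk {
  chunkId: number;           // PK
  uploadSessionId: number;   // FK -> UploadSession
  chunkNumber: number;       // 0-based index
  size: number;              // bytes
  isUploaded: boolean;
  uploadedDate: string | null;
}
```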
4. ER Diagram
+-------------------+    1 → N     +-------------------+
|   UploadSession   |------------->|   UploadChunks    |
+-------------------+              +-------------------+
| UploadSessionId   |              | ChunkId           |
| FileName          |              | UploadSessionId   |
| FileSize          |              | ChunkNumber       |
| Status            |              | IsUploaded        |
+-------------------+              +-------------------+
5. Architecture Diagram (Visio Style)
+----------------+          +---------------------+
|  Angular App   | <------> |  ASP.NET Core API   |
| (Chunk Split)  |          |    /upload/chunk    |
+----------------+          +----------+----------+
                                       |
                                       v
                           +-----------------------+
                           | Temporary Chunk Store |
                           |     (Disk / Blob)     |
                           +-----------+-----------+
                                       |
                                       v
                             +--------------------+
                             |  Final Merge File  |
                             +--------------------+
6. Workflow / Flowchart
[Start Upload]
      |
      v
[Create Upload Session]
      |
      v
[Split File Into Chunks]
      |
      v
[Upload Chunk #1]
      |
      +----> Failed? Retry
      |
      v
[Upload Remaining Chunks]
      |
      v
[All Chunks Uploaded?]
      |
      v
[Merge Chunks into Final File]
      |
      v
[Mark Upload Completed]
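The workflow above can be sketched end to end as a single driver function. The HTTP calls are injected here as an interface so the control flow itself stands out; all names are illustrative:

```typescript
// Sketch of the upload workflow: start session -> check resume state ->
// upload missing chunks -> merge. C is the chunk type (Blob in a browser).
interface UploadApi<C> {
  startSession(fileName: string, fileSize: number, totalChunks: number): Promise<number>;
  getUploadedChunks(sessionId: number): Promise<number[]>;
  uploadChunk(sessionId: number, chunkNumber: number, chunk: C): Promise<void>;
  merge(sessionId: number, fileName: string, totalChunks: number): Promise<void>;
}

async function uploadLargeFile<C>(
  api: UploadApi<C>,
  file: { name: string; size: number; slice(start: number, end: number): C },
  chunkSize = 2 * 1024 * 1024,
): Promise<void> {
  const totalChunks = Math.ceil(file.size / chunkSize);
  const sessionId = await api.startSession(file.name, file.size, totalChunks);
  // Resume support: ask the server which chunks it already has.
  const done = new Set(await api.getUploadedChunks(sessionId));
  for (let i = 0; i < totalChunks; i++) {
    if (done.has(i)) continue; // skip chunks uploaded before the interruption
    const start = i * chunkSize;
    await api.uploadChunk(sessionId, i, file.slice(start, Math.min(start + chunkSize, file.size)));
  }
  await api.merge(sessionId, file.name, totalChunks);
}
```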
7. ASP.NET Core Backend (Chunk Upload API)
7.1 Create Upload Session
[HttpPost("start")]
public IActionResult StartUpload([FromBody] StartUploadRequest req)
{
    var session = new UploadSession
    {
        FileName = req.FileName,
        FileSize = req.FileSize,
        TotalChunks = req.TotalChunks,
        Status = "InProgress",
        CreatedDate = DateTime.Now
    };
    _db.UploadSession.Add(session);
    _db.SaveChanges();
    return Ok(new { session.UploadSessionId });
}
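A matching client call could look like the sketch below. The `/upload` base route and the camel-cased `uploadSessionId` in the response are assumptions that depend on your controller's [Route] attribute and JSON serializer settings:

```typescript
// Hypothetical client call for the "start" endpoint above.
async function startUpload(fileName: string, fileSize: number, totalChunks: number): Promise<number> {
  const res = await fetch('/upload/start', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ fileName, fileSize, totalChunks }),
  });
  if (!res.ok) throw new Error(`start failed: ${res.status}`);
  const body = await res.json();
  return body.uploadSessionId; // session id returned by the API
}
```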
7.2 Upload Each Chunk
[HttpPost("chunk")]
public async Task<IActionResult> UploadChunk([FromForm] UploadChunkRequest req)
{
    string tempDir = Path.Combine("temp", req.UploadSessionId.ToString());
    Directory.CreateDirectory(tempDir);

    string chunkPath = Path.Combine(tempDir, $"{req.ChunkNumber}.part");
    using (var fs = new FileStream(chunkPath, FileMode.Create))
    {
        await req.Chunk.CopyToAsync(fs);
    }

    // mark in database (skip the insert if this chunk was already recorded,
    // so a client retry does not create duplicate rows)
    bool alreadyRecorded = _db.UploadChunks.Any(x =>
        x.UploadSessionId == req.UploadSessionId &&
        x.ChunkNumber == req.ChunkNumber);
    if (!alreadyRecorded)
    {
        var chunk = new UploadChunk
        {
            UploadSessionId = req.UploadSessionId,
            ChunkNumber = req.ChunkNumber,
            Size = req.Chunk.Length,
            IsUploaded = true,
            UploadedDate = DateTime.Now
        };
        _db.UploadChunks.Add(chunk);
        _db.SaveChanges();
    }
    return Ok();
}
7.3 Resume Support API
[HttpGet("resume/{sessionId}")]
public IActionResult GetUploadedChunks(int sessionId)
{
    var chunks = _db.UploadChunks
        .Where(x => x.UploadSessionId == sessionId)
        .Select(x => x.ChunkNumber)
        .ToList();
    return Ok(chunks);
}
7.4 Merge Chunks
[HttpPost("merge")]
public IActionResult MergeChunks([FromBody] MergeRequest req)
{
    string tempDir = Path.Combine("temp", req.UploadSessionId.ToString());
    Directory.CreateDirectory("uploads");
    string finalPath = Path.Combine("uploads", req.FileName);

    using (var fs = new FileStream(finalPath, FileMode.Create))
    {
        // Chunk numbers are 0-based, matching the Angular code below.
        for (int i = 0; i < req.TotalChunks; i++)
        {
            string chunkPath = Path.Combine(tempDir, $"{i}.part");
            using (var chunkStream = System.IO.File.OpenRead(chunkPath))
            {
                // Stream copy avoids loading an entire chunk into memory.
                chunkStream.CopyTo(fs);
            }
        }
    }
    Directory.Delete(tempDir, recursive: true); // clean up temp chunks
    return Ok("Merged Successfully");
}
8. Angular Frontend (Chunking + Upload)
8.1 Split File into Chunks
splitFile(file: File, chunkSize = 2 * 1024 * 1024): Blob[] {
  const chunks: Blob[] = [];
  let start = 0;
  while (start < file.size) {
    const end = Math.min(start + chunkSize, file.size);
    chunks.push(file.slice(start, end));
    start = end;
  }
  return chunks;
}
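As a quick sanity check of the chunk math, here is the same logic as a standalone function (Blob is available natively in browsers and in Node 18+):

```typescript
// Standalone version of splitFile, outside the Angular service.
function splitBlob(file: Blob, chunkSize = 2 * 1024 * 1024): Blob[] {
  const chunks: Blob[] = [];
  let start = 0;
  while (start < file.size) {
    const end = Math.min(start + chunkSize, file.size);
    chunks.push(file.slice(start, end));
    start = end;
  }
  return chunks;
}

// A 5 MB blob splits into 2 MB + 2 MB + 1 MB.
const parts = splitBlob(new Blob([new Uint8Array(5 * 1024 * 1024)]));
// parts.map(p => p.size) → [2097152, 2097152, 1048576]
```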
8.2 Upload Chunks with Resume Support
async uploadChunks(sessionId: number, file: File) {
  const chunks = this.splitFile(file);
  const uploaded = await this.getUploadedChunks(sessionId);
  for (let i = 0; i < chunks.length; i++) {
    if (uploaded.includes(i)) continue; // resume: skip chunks the server already has
    const form = new FormData();
    form.append('UploadSessionId', sessionId.toString());
    form.append('ChunkNumber', i.toString());
    form.append('Chunk', chunks[i]);
    // firstValueFrom replaces the deprecated toPromise() in recent RxJS versions
    await firstValueFrom(this.http.post('/upload/chunk', form));
  }
}
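The loop above uploads chunks one at a time. Since chunks are independent, they can also go in parallel with a bounded number of workers and per-chunk retry. This is a minimal sketch; the `uploadOne` callback stands in for the FormData POST shown above:

```typescript
// Parallel chunk upload with a shared work queue, a concurrency limit,
// and a simple retry-with-backoff per chunk. Names are illustrative.
async function uploadChunksParallel(
  chunkNumbers: number[],
  uploadOne: (chunkNumber: number) => Promise<void>,
  concurrency = 4,
  maxRetries = 3,
): Promise<void> {
  const queue = [...chunkNumbers];
  async function worker(): Promise<void> {
    while (queue.length > 0) {
      const n = queue.shift()!;
      for (let attempt = 1; ; attempt++) {
        try {
          await uploadOne(n);
          break; // chunk done, take the next one
        } catch (err) {
          if (attempt >= maxRetries) throw err; // give up after maxRetries
          await new Promise(r => setTimeout(r, 500 * attempt)); // linear backoff
        }
      }
    }
  }
  // Start up to `concurrency` workers pulling from the same queue.
  await Promise.all(Array.from({ length: Math.min(concurrency, queue.length) }, worker));
}
```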
8.3 Merge Request
mergeFile(sessionId: number, file: File, totalChunks: number) {
  return this.http.post('/upload/merge', {
    uploadSessionId: sessionId,
    fileName: file.name,
    totalChunks
  });
}
9. Sequence Diagram (Chunk Upload + Resume)
User → Angular → Start Upload → API
User → Angular → Upload Chunk N → API
API → Save Chunk → Disk + DB
User → Angular → Resume Check → API → Uploaded List
User → Angular → Upload Remaining Chunks
User → Angular → Merge Request → API → Final File
10. Best Practices for Production
Use Azure Blob Storage or AWS S3 instead of disk
Enable parallel chunk uploads
Use SHA-256 checksum for verification
Track progress using UploadedBytes
Use SignalR for real-time progress updates
Add virus scanning (ClamAV or any service)
11. Conclusion
Chunk uploading is the most reliable way to handle very large files (1–5 GB).
With ASP.NET Core and Angular, you can build a fast, resumable, fault-tolerant upload system suitable for video processing, large archives, backups, or enterprise document storage.