Introduction
This article shows how to build a production-ready file manager: a secure ASP.NET Core API for storing and serving files, and an Angular UI for uploading, browsing, previewing, and managing them. The guide focuses on practical code you can copy and run, with attention to security, scalability, and real-world concerns such as chunked uploads, streaming downloads, thumbnails, and access control.
High-level architecture
[Angular SPA] <--- HTTPS/JWT ---> [ASP.NET Core API] <---> [Blob Storage (S3 / Azure Blob)]
                                          |
                                          +--> [SQL / Metadata store]
                                          |
                                          +--> [Virus Scan Service] (optional)
                                          |
                                          +--> [CDN for public assets]
Responsibilities
API: authentication, authorization, upload endpoints (direct, chunked), metadata, thumbnails, streaming download, signed URLs for CDN.
UI: upload (single and multiple files, drag & drop), chunked/resumable upload, preview, search, pagination, delete, permission UI.
Storage choices and trade-offs
Blob storage (Azure/AWS S3/GCS): recommended for production. Cheap, scalable, supports presigned URLs and CDN integration.
Database: store metadata only (file id, name, size, mime, owner, permissions, storage path, createdAt, hash).
Local disk: fine for simple demos, but not for multi-instance deployments.
Use blob storage for file content and keep small metadata in SQL Server or PostgreSQL.
Database model (simple)
Files(
    Id GUID PK,
    FileName nvarchar(512),
    ContentType nvarchar(128),
    Size bigint,
    OwnerId GUID,
    StoragePath nvarchar(1024),
    Hash varchar(64),
    IsPublic bit,
    CreatedAt datetimeoffset,
    UpdatedAt datetimeoffset
)

FilePermissions(
    FileId FK,
    UserId FK,
    CanRead bit,
    CanWrite bit,
    CanDelete bit
)
Index on OwnerId, CreatedAt, and FileName (include full-text or trigram index for search).
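As a sketch, the same indexes can be declared in EF Core model configuration (treating FilePermission as the C# counterpart of the table above; full-text or trigram search needs provider-specific features on top):

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<FileEntity>(e =>
    {
        e.HasIndex(f => f.OwnerId);
        e.HasIndex(f => f.CreatedAt);
        e.HasIndex(f => f.FileName);
    });
    // composite key for the permissions join table
    modelBuilder.Entity<FilePermission>().HasKey(p => new { p.FileId, p.UserId });
}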
Backend: Project setup
Create ASP.NET Core Web API project and add packages:
dotnet new webapi -n FileManagerApi
cd FileManagerApi
dotnet add package Azure.Storage.Blobs                      # or AWSSDK.S3
dotnet add package Microsoft.EntityFrameworkCore.SqlServer
dotnet add package Swashbuckle.AspNetCore                   # Swagger
Register DbContext, authentication (JWT), CORS and blob client in Program.cs.
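A minimal sketch of that wiring (names like AzureBlobStorage and the configuration keys are assumptions; JWT needs the Microsoft.AspNetCore.Authentication.JwtBearer package):

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDbContext<AppDbContext>(o =>
    o.UseSqlServer(builder.Configuration.GetConnectionString("Default")));

// BlobServiceClient is thread-safe; register it once and reuse it
builder.Services.AddSingleton(new BlobServiceClient(builder.Configuration["Storage:ConnectionString"]));
builder.Services.AddSingleton<IFileStorage, AzureBlobStorage>();

builder.Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(o =>
    {
        o.Authority = builder.Configuration["Jwt:Authority"];
        o.Audience = builder.Configuration["Jwt:Audience"];
    });

builder.Services.AddCors(o => o.AddDefaultPolicy(p =>
    p.WithOrigins(builder.Configuration["Cors:Origin"]!).AllowAnyHeader().AllowAnyMethod()));

builder.Services.AddControllers();

var app = builder.Build();
app.UseCors();
app.UseAuthentication();
app.UseAuthorization();
app.MapControllers();
app.Run();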
Backend: File entity and DTOs
public class FileEntity
{
    public Guid Id { get; set; }
    public string FileName { get; set; } = string.Empty;
    public string ContentType { get; set; } = string.Empty;
    public long Size { get; set; }
    public Guid OwnerId { get; set; }
    public string StoragePath { get; set; } = string.Empty; // blob path
    public string? Hash { get; set; }
    public bool IsPublic { get; set; }
    public DateTimeOffset CreatedAt { get; set; } = DateTimeOffset.UtcNow;
}
public class FileUploadResultDto
{
    public Guid Id { get; set; }
    public string FileName { get; set; } = string.Empty;
    public string Url { get; set; } = string.Empty;
}
Backend: Storage service (abstraction)
Create an interface IFileStorage and two implementations: AzureBlobStorage (production) and LocalFileStorage (dev). Core methods:
public interface IFileStorage
{
    Task<string> UploadAsync(Stream content, string path, string contentType, CancellationToken ct);
    Task<Stream> DownloadAsync(string path, CancellationToken ct);
    Task DeleteAsync(string path, CancellationToken ct);
    Task GenerateThumbnailAsync(string sourcePath, string thumbPath);
    Task<string> CreatePresignedUrlAsync(string path, TimeSpan expiry);
}
Keep the blob client usage inside the storage implementation. This keeps controller code clean and testable.
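For example, the presigned-URL method in the Azure implementation can be a thin SAS wrapper (a sketch; it assumes the BlobContainerClient _container was created with a shared-key credential, otherwise GenerateSasUri throws):

public Task<string> CreatePresignedUrlAsync(string path, TimeSpan expiry)
{
    var blob = _container.GetBlobClient(path);
    // GenerateSasUri signs locally, so this makes no network call
    var uri = blob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.Add(expiry));
    return Task.FromResult(uri.ToString());
}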
Backend: Controller endpoints (examples)
FilesController endpoints:
POST /api/files — simple multi-part upload (small files)
POST /api/files/chunk/init — initiate chunked upload => returns uploadId and chunk size
PUT /api/files/chunk/{uploadId}/{chunkIndex} — upload a chunk
POST /api/files/chunk/complete/{uploadId} — finalize and assemble
GET /api/files — list files (filters, pagination)
GET /api/files/{id} — get metadata
GET /api/files/{id}/download — stream download or redirect to presigned URL
DELETE /api/files/{id} — delete file
GET /api/files/{id}/thumbnail — return small thumbnail or presigned URL
Sample upload action (simple)
[HttpPost]
[Authorize]
public async Task<IActionResult> Upload(IFormFile file)
{
    if (file == null || file.Length == 0) return BadRequest();
    var ownerId = User.GetUserId(); // custom claims-principal extension returning the caller's id
    // server-side validations
    if (file.Length > _options.MaxFileSize) return StatusCode(413);
    var id = Guid.NewGuid();
    // note: sanitise file.FileName before using it in a storage path
    var storagePath = $"files/{id}/{file.FileName}";
    await _storage.UploadAsync(file.OpenReadStream(), storagePath, file.ContentType, HttpContext.RequestAborted);
    var entity = new FileEntity { Id = id, FileName = file.FileName, ContentType = file.ContentType, Size = file.Length, OwnerId = ownerId, StoragePath = storagePath };
    _db.Files.Add(entity);
    await _db.SaveChangesAsync();
    var url = await _storage.CreatePresignedUrlAsync(storagePath, TimeSpan.FromMinutes(60));
    return Ok(new FileUploadResultDto { Id = id, FileName = file.FileName, Url = url });
}
Chunked upload flow
Client calls init with metadata. Server stores an upload record with temporary storage path and expected chunks.
Client PUTs each chunk. Server stores chunks in temporary blob location or local temp. Validate chunk sizes.
On complete, server assembles chunks in order into the final blob (many storage SDKs support multipart uploads, e.g., S3 multipart upload or Azure Block Blob PutBlock/PutBlockList).
Use storage-native multipart APIs to avoid re-assembling files in app memory; a sketch follows below.
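A sketch of the chunk endpoints on Azure Block Blob staging (the _container field and the upload-record bookkeeping are assumed; S3 multipart upload follows the same shape):

// requires: using System.Text; using Azure.Storage.Blobs.Specialized;
[HttpPut("chunk/{uploadId}/{chunkIndex:int}")]
public async Task<IActionResult> UploadChunk(string uploadId, int chunkIndex)
{
    var blob = _container.GetBlockBlobClient($"files/{uploadId}");
    // block ids must be base64 strings of identical length within one blob
    var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(chunkIndex.ToString("D6")));
    await blob.StageBlockAsync(blockId, Request.Body, cancellationToken: HttpContext.RequestAborted);
    return Ok();
}

[HttpPost("chunk/complete/{uploadId}")]
public async Task<IActionResult> CompleteUpload(string uploadId, [FromQuery] int totalChunks)
{
    var blob = _container.GetBlockBlobClient($"files/{uploadId}");
    var blockIds = Enumerable.Range(0, totalChunks)
        .Select(i => Convert.ToBase64String(Encoding.UTF8.GetBytes(i.ToString("D6"))));
    // the storage service assembles the staged blocks; nothing is buffered in the API
    await blob.CommitBlockListAsync(blockIds, cancellationToken: HttpContext.RequestAborted);
    return Ok();
}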
Backend: Security and validations
Authentication: JWT for API calls.
Authorization: owner or allowed user can read/write/delete. Use policies and [Authorize] with role checks.
Content validation: check ContentType and an extension whitelist (a validation sketch follows this list).
Size limits: enforce MaxFileSize and per-user quotas.
Virus scanning: integrate a scanning step in upload pipeline (e.g., call ClamAV or third-party service) — mark file as Safe=false until scanned.
Rate limiting: throttle uploads to avoid abuse.
Signed URLs: for public downloads use short-lived presigned URLs instead of streaming through API for performance.
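A small sketch of the whitelist and size checks (the allowed set and _options are illustrative):

private static readonly HashSet<string> AllowedExtensions =
    new(StringComparer.OrdinalIgnoreCase) { ".jpg", ".png", ".pdf", ".docx" };

private bool IsUploadAllowed(IFormFile file, out string error)
{
    error = string.Empty;
    if (!AllowedExtensions.Contains(Path.GetExtension(file.FileName)))
    {
        error = "File type not allowed."; return false;
    }
    if (file.Length > _options.MaxFileSize)
    {
        error = "File too large."; return false;
    }
    // also sniff the first bytes (magic numbers); the ContentType header is client-controlled
    return true;
}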
Backend: Streaming downloads and partial content
For large files, support the Range header so clients can resume and stream. Example:
[HttpGet("{id}/download")]
public async Task<IActionResult> Download(Guid id)
{
    var file = await _db.Files.FindAsync(id);
    if (file == null) return NotFound();
    var stream = await _storage.DownloadAsync(file.StoragePath, HttpContext.RequestAborted);
    return File(stream, file.ContentType, file.FileName, enableRangeProcessing: true);
}
enableRangeProcessing: true lets ASP.NET Core handle Range and partial content.
Backend: Thumbnail generation
Generate thumbnails asynchronously after upload using a background worker (Hangfire / BackgroundService); a sketch follows below.
Store thumbnails in thumbs/{id}.jpg and serve via presigned URL or API endpoint.
For office documents, use server-side converters (LibreOffice) or third-party conversion service.
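A sketch of the image-thumbnail step inside the storage service, using SixLabors.ImageSharp (the library choice is an assumption; any imaging library works):

public async Task GenerateThumbnailAsync(string sourcePath, string thumbPath)
{
    await using var source = await DownloadAsync(sourcePath, CancellationToken.None);
    using var image = await Image.LoadAsync(source);
    // fit inside 256x256 while preserving aspect ratio
    image.Mutate(x => x.Resize(new ResizeOptions { Mode = ResizeMode.Max, Size = new Size(256, 256) }));
    using var output = new MemoryStream();
    await image.SaveAsJpegAsync(output);
    output.Position = 0;
    await UploadAsync(output, thumbPath, "image/jpeg", CancellationToken.None);
}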
Frontend: Angular setup and libraries
Angular 17 project scaffold
Use ngx-file-drop or <input type="file" multiple> for drag-and-drop
Use @azure/storage-blob or aws-sdk for direct-to-blob client uploads if using presigned URLs
Use Angular services with HttpClient for metadata/API calls
Install example packages
npm install ngx-file-drop @azure/storage-blob
Frontend: FileService (Angular)
file.service.ts responsibilities:
Call API to list files, metadata
Initiate chunked upload and send chunks
Request presigned URL for direct upload
Request download URL or start streaming download
Example simplified upload (small files)
upload(file: File) {
  const fd = new FormData();
  fd.append('file', file);
  return this.http.post<FileUploadResult>('/api/files', fd);
}
For direct-to-blob using presigned URL:
Call POST /api/files/presign with metadata. Server returns presigned URL.
Use fetch or HttpClient.put to upload file directly to blob storage.
Notify API POST /api/files/confirm to persist metadata.
This avoids proxying binary through API and improves throughput.
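A sketch of that endpoint pair (the DTO shapes and CreateUploadUrlAsync, a write-capable variant of CreatePresignedUrlAsync, are assumptions):

[HttpPost("presign")]
[Authorize]
public async Task<IActionResult> Presign([FromBody] PresignRequestDto req)
{
    var id = Guid.NewGuid();
    var path = $"files/{id}/{req.FileName}";
    var url = await _storage.CreateUploadUrlAsync(path, TimeSpan.FromMinutes(15)); // write SAS
    return Ok(new { id, path, url });
}

[HttpPost("confirm")]
[Authorize]
public async Task<IActionResult> Confirm([FromBody] ConfirmUploadDto req)
{
    // persist metadata only after the client's direct-to-blob PUT has succeeded
    _db.Files.Add(new FileEntity
    {
        Id = req.Id, FileName = req.FileName, ContentType = req.ContentType,
        Size = req.Size, OwnerId = User.GetUserId(), StoragePath = req.Path
    });
    await _db.SaveChangesAsync();
    return Ok(new { req.Id });
}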
Frontend: Chunked upload example (basic)
async uploadInChunks(file: File) {
  const chunkSize = 5 * 1024 * 1024; // 5 MB
  // firstValueFrom (from 'rxjs') replaces the deprecated toPromise()
  const init = await firstValueFrom(
    this.http.post<{ uploadId: string }>('/api/files/chunk/init', { fileName: file.name, size: file.size })
  );
  const totalChunks = Math.ceil(file.size / chunkSize);
  for (let i = 0; i < totalChunks; i++) {
    const start = i * chunkSize;
    const chunk = file.slice(start, Math.min(start + chunkSize, file.size));
    await firstValueFrom(
      this.http.put(`/api/files/chunk/${init.uploadId}/${i}`, chunk, {
        headers: { 'Content-Type': 'application/octet-stream' },
      })
    );
  }
  await firstValueFrom(this.http.post(`/api/files/chunk/complete/${init.uploadId}`, {}));
}
Handle retry for each chunk and resume by asking server which chunks already uploaded.
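On the server, a status endpoint can back that resume logic (the UploadChunks tracking table is an assumption):

[HttpGet("chunk/{uploadId}/status")]
public async Task<IActionResult> ChunkStatus(string uploadId)
{
    // indices the client can skip when it retries
    var uploaded = await _db.UploadChunks
        .Where(c => c.UploadId == uploadId)
        .Select(c => c.ChunkIndex)
        .ToListAsync();
    return Ok(new { uploaded });
}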
Frontend: UI components
FileBrowserComponent: list, pagination, search, filters (type, date)
UploadComponent: drag-drop, progress bar, error handling, retry
FilePreviewComponent: image preview, document preview via iframe, audio/video player
FileActions: download (direct link), share (generate short presigned URL), delete, rename
UX notes
Show progress per file and overall progress.
Allow canceling uploads.
Provide meaningful error messages (size, type, quota exceeded).
Security: Sharing and access control
Implement IsPublic flag: if true store object in public container or issue long-lived CDN URL.
For private files return short presigned URLs or stream via API with authentication.
Permission model: owner-only by default, support group shares and link-sharing with optional password and expiry.
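A sketch of the read check this model implies (FilePermission as an EF entity is assumed):

private async Task<bool> CanReadAsync(FileEntity file, Guid userId)
{
    if (file.IsPublic || file.OwnerId == userId) return true;
    return await _db.FilePermissions.AnyAsync(p =>
        p.FileId == file.Id && p.UserId == userId && p.CanRead);
}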
Scaling and CDN
Use presigned URLs + CDN for downloads to reduce load on API.
Store files in region close to users.
Use background workers to generate thumbnails and virus scans.
Use lifecycle policies on blob storage to move old files to cold storage.
Monitoring and observability
Track upload success/fail rates, average upload size, total storage used, top file types.
Log events: upload started/completed/failed, download events (see the snippet below).
Alert on sudden spikes in failures or storage usage.
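For the log events above, structured message templates keep the data queryable (the templates are illustrative, not a fixed schema):

_logger.LogInformation("Upload completed {FileId} {FileName} {Size}", entity.Id, entity.FileName, entity.Size);
_logger.LogWarning("Upload failed {FileName} {Reason}", file.FileName, reason);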
Testing and deployment
Unit test storage service with in-memory or local emulator.
Integration test upload/download flows using local storage emulators (Azurite for Azure Blob, LocalStack for S3).
Use CI/CD to run tests and deploy API and Angular app. Use env vars for storage keys.
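As a sketch, an integration test against Azurite can be a simple round trip ("UseDevelopmentStorage=true" is the standard development-storage connection string; the AzureBlobStorage constructor is this article's assumed implementation):

[Fact]
public async Task Upload_then_download_round_trips_content()
{
    var container = new BlobContainerClient("UseDevelopmentStorage=true", "test-files");
    await container.CreateIfNotExistsAsync();
    IFileStorage storage = new AzureBlobStorage(container);

    var payload = Encoding.UTF8.GetBytes("hello");
    await storage.UploadAsync(new MemoryStream(payload), "t/hello.txt", "text/plain", CancellationToken.None);

    using var downloaded = await storage.DownloadAsync("t/hello.txt", CancellationToken.None);
    using var ms = new MemoryStream();
    await downloaded.CopyToAsync(ms);
    Assert.Equal(payload, ms.ToArray());
}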
Example: Minimal FilesController (abridged)
[ApiController]
[Route("api/[controller]")]
public class FilesController : ControllerBase
{
    private readonly IFileStorage _storage;
    private readonly AppDbContext _db;

    public FilesController(IFileStorage storage, AppDbContext db) { _storage = storage; _db = db; }

    [HttpPost]
    [Authorize]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file == null || file.Length == 0) return BadRequest();
        var ownerId = User.GetUserId();
        // validations omitted for brevity
        var id = Guid.NewGuid();
        var storagePath = $"files/{id}/{file.FileName}";
        await _storage.UploadAsync(file.OpenReadStream(), storagePath, file.ContentType, HttpContext.RequestAborted);
        var entity = new FileEntity { Id = id, FileName = file.FileName, ContentType = file.ContentType, Size = file.Length, OwnerId = ownerId, StoragePath = storagePath };
        _db.Files.Add(entity);
        await _db.SaveChangesAsync();
        var url = await _storage.CreatePresignedUrlAsync(storagePath, TimeSpan.FromMinutes(60));
        return Ok(new FileUploadResultDto { Id = id, FileName = file.FileName, Url = url });
    }

    [HttpGet("{id}/download")]
    public async Task<IActionResult> Download(Guid id)
    {
        var file = await _db.Files.FindAsync(id);
        if (file == null) return NotFound();
        var stream = await _storage.DownloadAsync(file.StoragePath, HttpContext.RequestAborted);
        return File(stream, file.ContentType, file.FileName, enableRangeProcessing: true);
    }
}
Final best practices (short)
Avoid streaming large files through the API when possible — use presigned URLs.
Use storage-native multipart APIs for chunked uploads.
Keep metadata in SQL and content in blob storage.
Scan uploaded files for malware before marking them as safe.
Enforce quotas, size limits, and content-type whitelists.
Use CDN and presigned URLs for fast downloads.
Next steps
From here, natural extensions include:
a runnable repo with the backend and the Angular frontend scaffold,
full storage implementations for Azure Blob and S3,
working Angular components (Upload, Browser, Preview),
EF Core migrations and seed data,
PNG/SVG diagrams for the architecture and upload/download flows.