Introduction
In modern enterprise applications, logs are your first line of defense against production issues.
When your system runs across multiple APIs, microservices, and background jobs, collecting and analyzing logs becomes a challenge.
That’s where Centralized Logging comes in — a single place to collect, search, and analyze logs in real time.
In this article, you’ll learn how to build a Centralized Logging and Monitoring system using:
Serilog for structured logging in ASP.NET Core
Elastic Stack (ELK) — Elasticsearch, Logstash, and Kibana — for log aggregation, processing, and visualization
Why Centralized Logging?
Let’s say you’re running a web application with multiple APIs.
When a user reports,
“Something failed when placing an order,”
you might have to check multiple log files across servers to find what went wrong.
Centralized logging solves that by:
Storing all logs in one place
Making logs searchable and filterable
Offering real-time dashboards and alerts
Enabling pattern detection for performance or security issues
Architecture Overview
Here’s how everything connects:
ASP.NET Core App
│
▼
[Serilog]
│
▼
[Logstash]
│
▼
[Elasticsearch]
│
▼
[Kibana Dashboard]
Explanation
The ASP.NET Core app writes structured logs via Serilog.
Logs are sent to Logstash, which parses and processes them.
Elasticsearch stores and indexes the logs efficiently.
Kibana visualizes the logs in dashboards and reports.
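Before wiring anything up, it helps to see what "structured" means in practice. Here is a minimal sketch (the property names and values are illustrative):

using Serilog;

Log.Logger = new LoggerConfiguration().WriteTo.Console().CreateLogger();

// {OrderId} and {Amount} become named properties on the event,
// not just text baked into the message string.
Log.Information("Order {OrderId} placed for {Amount}", 1042, 59.99m);

// In Elasticsearch the event keeps those properties as separate fields,
// so you can filter on OrderId or Amount directly. Roughly (exact shape
// depends on the formatter):
// { "message": "Order 1042 placed for 59.99", "OrderId": 1042, "Amount": 59.99 }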
Step 1: Setting Up ELK Stack
Option 1: Using Docker (Recommended for Local Setup)
Create a docker-compose.yml file:
version: '3.3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.10.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false   # local development only; keep security on in production
      - ES_JAVA_OPTS=-Xms1g -Xmx1g
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:8.10.0
    ports:
      - "5044:5044"
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:8.10.0
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
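With the file in place, run docker compose up -d and give the containers a minute to start; Elasticsearch should then respond at http://localhost:9200. Note that Elasticsearch 8.x turns authentication and TLS on by default, which is why the compose file above disables security for local development; in production, keep it enabled and configure credentials.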
Logstash Configuration Example (logstash.conf)
input {
  tcp {
    port => 5044
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "aspnetcore-logs-%{+YYYY.MM.dd}"
  }
}
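The tcp input with the json_lines codec reads one JSON document per line, which is also the newline-delimited framing the Serilog network sink uses, so no filter block is needed for basic parsing. The index setting creates one index per day, which keeps indices small and makes retention cleanup straightforward.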
Step 2: Add Serilog to an ASP.NET Core Project
Install NuGet packages
dotnet add package Serilog.AspNetCore
dotnet add package Serilog.Sinks.Console
dotnet add package Serilog.Sinks.File
dotnet add package Serilog.Sinks.Network
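A quick note on the packages: Serilog.AspNetCore wires Serilog into the ASP.NET Core hosting pipeline (and brings the core Serilog package with it), while Serilog.Sinks.Network is a community-maintained sink that supplies the TCPSink method used in the next step.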
Step 3: Configure Serilog in Program.cs
using Serilog;

var builder = WebApplication.CreateBuilder(args);

// Configure Serilog
Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Console()
    .WriteTo.File("Logs/app-.log", rollingInterval: RollingInterval.Day)
    .WriteTo.TCPSink("localhost", 5044) // Send to Logstash
    .CreateLogger();

builder.Host.UseSerilog();
builder.Services.AddControllers();

var app = builder.Build();
app.MapControllers();
app.Run();
What this does
Logs are written to the console and a local file.
At the same time, logs are sent to Logstash over TCP.
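Two optional refinements to the same Program.cs are worth considering (a sketch, not required for the basic pipeline): flush Serilog on shutdown so events still queued for the TCP sink are not lost, and add Serilog's request-logging middleware, which emits one structured event per HTTP request.

using Serilog;

var builder = WebApplication.CreateBuilder(args);
// ... Serilog configuration as above ...
builder.Host.UseSerilog();
builder.Services.AddControllers();

try
{
    var app = builder.Build();

    // One structured event per request, e.g. "HTTP POST /api/order/place responded 200 in 12 ms".
    app.UseSerilogRequestLogging();

    app.MapControllers();
    app.Run();
}
finally
{
    // Flush anything still buffered, including events queued for the TCP sink.
    Log.CloseAndFlush();
}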
Step 4: Create a Sample Controller
Log Example
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("api/[controller]")]
public class OrderController : ControllerBase
{
    private readonly ILogger<OrderController> _logger;

    public OrderController(ILogger<OrderController> logger)
    {
        _logger = logger;
    }

    [HttpPost("place")]
    public IActionResult PlaceOrder()
    {
        _logger.LogInformation("Order placed successfully at {time}", DateTime.UtcNow);
        return Ok("Order created");
    }

    [HttpGet("error")]
    public IActionResult ThrowError()
    {
        try
        {
            throw new Exception("Test exception in order processing");
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "Order processing failed");
            return StatusCode(500, "Error");
        }
    }
}
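One detail worth noticing: {time} in LogInformation is a message template, not string interpolation. Templates keep the value as a structured property, which is exactly what makes the Kibana filtering in the next step possible. For example (the orderId variable is illustrative):

// Good: OrderId is captured as a searchable property on the event.
_logger.LogInformation("Order {OrderId} placed at {PlacedAt}", orderId, DateTime.UtcNow);

// Avoid: interpolation flattens the values into plain text,
// so Kibana can no longer filter on them as fields.
_logger.LogInformation($"Order {orderId} placed at {DateTime.UtcNow}");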
Step 5: Verify Logs in Kibana
Once your ASP.NET Core app starts sending logs, open Kibana:
👉 http://localhost:5601
Create a data view (called an index pattern in older Kibana versions) matching aspnetcore-logs-*
You’ll now see logs from your .NET app appear live in Kibana’s “Discover” tab.
You can also:
Filter by log level (Information, Warning, Error)
Group by controller or service name
Create dashboards showing error frequency, API response time, and more.
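Queries in Discover use KQL, for example level : "Error" to show only errors, or Application : "OrderService" once you add the enrichment from Step 6. The exact field names depend on the JSON formatter your sink emits, so check a sample document in Discover first.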
Step 6: Adding More Context with Enrichment
Serilog lets you enrich logs with contextual data, such as the application name, environment, and machine name.
Example
Log.Logger = new LoggerConfiguration()
    .Enrich.WithProperty("Application", "OrderService")
    .Enrich.WithProperty("Environment", builder.Environment.EnvironmentName)
    .Enrich.FromLogContext()
    .WriteTo.Console()
    .WriteTo.TCPSink("localhost", 5044)
    .CreateLogger();
This helps you filter and analyze logs by app or environment directly in Kibana.
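For values that change per request rather than per application, Serilog's LogContext (already enabled above via Enrich.FromLogContext()) can push properties for the duration of a scope. A small sketch, with an illustrative OrderId value:

using Serilog.Context;

// Every event logged inside this scope carries OrderId as a property.
using (LogContext.PushProperty("OrderId", 1042))
{
    _logger.LogInformation("Reserving stock");   // includes OrderId = 1042
    _logger.LogInformation("Charging payment");  // includes OrderId = 1042
}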
Step 7: Visualizing Metrics & Alerts in Kibana
Kibana isn’t just for log browsing. You can:
Create error trend charts
Visualize API call patterns
Set email or Slack alerts for high error rates
Correlate logs with performance metrics (e.g., CPU or memory usage)
This transforms your logs into an observability system, not just error tracking.
Step 8: Security & Maintenance Tips
✅ Always secure your ELK endpoints using authentication and SSL.
✅ Use log rotation to prevent disk space issues.
✅ Avoid logging sensitive data (passwords, tokens).
✅ Use separate indices for different environments.
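On the "avoid logging sensitive data" point, one option is to enforce redaction in the logger configuration itself using Serilog's Destructure.ByTransforming, which rewrites an object before any sink serializes it. A sketch with a hypothetical LoginRequest type:

using Serilog;

Log.Logger = new LoggerConfiguration()
    // Rewrite LoginRequest before it reaches any sink.
    .Destructure.ByTransforming<LoginRequest>(r => new { r.UserName, Password = "***" })
    .WriteTo.Console()
    .CreateLogger();

Log.Information("Login attempt: {@Request}", new LoginRequest("alice", "s3cret"));
// -> Login attempt: { UserName: "alice", Password: "***" }

// Hypothetical DTO, used only for this example.
public record LoginRequest(string UserName, string Password);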
Real-World Use Cases
Multi-tenant SaaS Platforms: Monitor logs per tenant or region.
E-Commerce Systems: Trace transaction failures in real time.
Microservice Architectures: Aggregate logs from multiple services into one dashboard.
DevOps Teams: Use log trends for root cause analysis and uptime metrics.
Flow Diagram – Serilog + ELK Stack Integration
+------------------------+
| ASP.NET Core App |
| (Serilog Integration) |
+-----------+------------+
|
v
+------------------------+
| Logstash (Parser) |
+-----------+------------+
|
v
+------------------------+
| Elasticsearch (Index) |
+-----------+------------+
|
v
+------------------------+
| Kibana Dashboard |
| (Search, Visualize) |
+------------------------+
Conclusion
Centralized logging with Serilog and ELK Stack brings structure, visibility, and control to your ASP.NET Core applications.
You no longer need to jump between servers or check log files manually — everything is in one place, searchable, visual, and intelligent.
If you’re running a large enterprise or cloud-based system, setting up this pipeline is one of the best investments you can make for reliability and operational insight.