Logging With ElasticSearch, Kibana, and Serilog Using ASP.NET Core and Docker

In this article, I'll show you how to set up logging for your application using ElasticSearch, Kibana, and Serilog.
 
Before diving deep into implementation, let's understand the basics.
 
You can find the source code here
 
What is ElasticSearch?
 
Elasticsearch is a distributed, open source search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured. Elasticsearch is built on Apache Lucene and was first released in 2010 by Elasticsearch N.V. (now known as Elastic). Sourced from here.
 
What is Kibana?
 
Kibana is a UI application that sits on top of ElasticSearch. It provides search and visualization capabilities for data indexed in ElasticSearch.
 
What is Serilog?
 
I have already written an in-depth article on Serilog; I highly encourage you to go through it here.
 
Why logging with ElasticSearch and Kibana?
 
Traditionally, we often log to flat files. This approach comes with a few drawbacks:
  • Accessing the log file on the server is a tedious job.
  • Searching for errors in the log file is quite cumbersome and time consuming.
These drawbacks can be rectified using ElasticSearch. It makes logs easily accessible and searchable using a simple query language, coupled with the Kibana interface.
 
Prerequisites
 
To follow along, make sure you have the following installed:
  • Visual Studio / Visual Studio Code
  • Docker Desktop
  • .NET Core SDK 3.1

Project Creation and Nuget package

 
Let's begin by creating an ASP.NET Core Web API application and naming the project "ElasticKibanaLoggingVerify".
 
After project creation, make sure you have the following NuGet packages installed.
  dotnet add package Serilog.AspNetCore
  dotnet add package Serilog.Enrichers.Environment
  dotnet add package Serilog.Sinks.Elasticsearch

Docker Compose of ElasticSearch and Kibana

 
Before jumping into implementation, let's spin up the docker container for ElasticSearch and Kibana.
 
Docker supports single-node and multi-node ElasticSearch. A single node is recommended for development and testing, whereas multi-node is recommended for pre-production and production environments.
 
Create a new folder named docker and a new file named docker-compose.yml.
  version: '3.4'

  services:
    elasticsearch:
      container_name: elasticsearch
      image: docker.elastic.co/elasticsearch/elasticsearch:7.9.1
      ports:
        - 9200:9200
      volumes:
        - elasticsearch-data:/usr/share/elasticsearch/data
      environment:
        - xpack.monitoring.enabled=true
        - xpack.watcher.enabled=false
        - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
        - discovery.type=single-node
      networks:
        - elastic

    kibana:
      container_name: kibana
      image: docker.elastic.co/kibana/kibana:7.9.1
      ports:
        - 5601:5601
      depends_on:
        - elasticsearch
      environment:
        # Use the service name, not localhost, so Kibana can reach
        # ElasticSearch inside the Docker network
        - ELASTICSEARCH_URL=http://elasticsearch:9200
      networks:
        - elastic

  networks:
    elastic:
      driver: bridge

  volumes:
    elasticsearch-data:
Note
At the time of writing this article, the latest version of ElasticSearch and Kibana is v7.9.1. I highly recommend checking this link to ensure that you are working with the latest version.
 
There is also an easier way to create both ElasticSearch and Kibana using a single container command, described here.
 
Note
I haven't tried the above container, and running both ElasticSearch and Kibana in a single container is not recommended for production environments.
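With the compose file in place, the containers can be started from the docker folder. This is a minimal sketch using standard Docker Compose commands; it assumes Docker Desktop is running:

```shell
# Start ElasticSearch and Kibana in the background
docker-compose up -d

# Follow the container logs until both services report they are ready
docker-compose logs -f
```

The first startup can take a minute or two while the images are pulled and ElasticSearch initializes.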
 

Verify that ElasticSearch and Kibana are up and running

 
Navigate to http://localhost:9200 to verify that ElasticSearch is running.
 
Navigate to http://localhost:5601 to verify that Kibana is running.
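You can also verify ElasticSearch from the command line. A quick sketch, assuming the containers from the compose file above are up:

```shell
# Should return a JSON document with the cluster name and version
curl http://localhost:9200

# A healthy single-node cluster reports "green" or "yellow" status
curl "http://localhost:9200/_cluster/health?pretty"
```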
 
 

Removing the out-of-the-box logging configuration

 
As discussed in the previous article on Serilog, the out-of-the-box logging configuration in appsettings.json is not necessary. Only the configuration below is required from the default appsettings.json.
  {
    "AllowedHosts": "*"
  }
Now add the ElasticSearch URL to appsettings.json.
  "ElasticConfiguration": {
    "Uri": "http://localhost:9200"
  },
  "AllowedHosts": "*"
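Because the logger also calls ReadFrom.Configuration, you can optionally control log levels from appsettings.json as well. A minimal sketch, assuming the standard Serilog configuration keys:

```json
"Serilog": {
  "MinimumLevel": {
    "Default": "Information",
    "Override": {
      "Microsoft": "Warning",
      "System": "Warning"
    }
  }
}
```

This keeps framework noise out of the index while your own events are logged at Information and above.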
Configure logging
 
The next step is to configure logging in Program.cs.
  using System;
  using System.Reflection;
  using Microsoft.AspNetCore.Hosting;
  using Microsoft.Extensions.Configuration;
  using Microsoft.Extensions.Hosting;
  using Serilog;
  using Serilog.Sinks.Elasticsearch;

  namespace ElasticKibanaLoggingVerify
  {
      public class Program
      {
          public static void Main(string[] args)
          {
              var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
              var configuration = new ConfigurationBuilder()
                  .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
                  .AddJsonFile($"appsettings.{environment}.json", optional: true)
                  .Build();

              Log.Logger = new LoggerConfiguration()
                  .Enrich.FromLogContext()
                  .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(configuration["ElasticConfiguration:Uri"]))
                  {
                      AutoRegisterTemplate = true,
                      IndexFormat = $"{Assembly.GetExecutingAssembly().GetName().Name.ToLower()}-{DateTime.UtcNow:yyyy-MM}"
                  })
                  .Enrich.WithProperty("Environment", environment)
                  .ReadFrom.Configuration(configuration)
                  .CreateLogger();

              CreateHostBuilder(args).Build().Run();
          }

          public static IHostBuilder CreateHostBuilder(string[] args) =>
              Host.CreateDefaultBuilder(args)
                  .ConfigureWebHostDefaults(webBuilder =>
                  {
                      webBuilder.UseStartup<Startup>();
                  })
                  .ConfigureAppConfiguration(configuration =>
                  {
                      configuration.AddJsonFile("appsettings.json", optional: false, reloadOnChange: true);
                      configuration.AddJsonFile($"appsettings.{Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT")}.json", optional: true);
                  })
                  .UseSerilog();
      }
  }
In the previous article on Serilog, we have seen how important the enrichment and the SinkOptions are.
 
You can register the ElasticSearch sink in code as follows:
  .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(configuration["ElasticConfiguration:Uri"]))
  {
      AutoRegisterTemplate = true,
      IndexFormat = $"{Assembly.GetExecutingAssembly().GetName().Name.ToLower()}-{DateTime.UtcNow:yyyy-MM}"
  })
With this IndexFormat, logs are written to a monthly index named after the assembly, e.g. elastickibanaloggingverify-2020-09 for events logged in September 2020.

Create a Controller to validate the behavior

 
You can create a controller to verify the logging details in Kibana.
  using System;
  using Microsoft.AspNetCore.Mvc;
  using Microsoft.Extensions.Logging;

  [Route("api/[controller]")]
  public class ElasticSearchController : Controller
  {
      private readonly ILogger<ElasticSearchController> _logger;

      public ElasticSearchController(ILogger<ElasticSearchController> logger)
      {
          _logger = logger;
      }

      // GET: api/elasticsearch
      [HttpGet]
      public int GetRandomValue()
      {
          var random = new Random();
          var randomValue = random.Next(0, 100);
          _logger.LogInformation($"Random Value is {randomValue}");
          return randomValue;
      }
  }
The above controller is self-explanatory: it generates a random value between 0 and 99 (Random.Next's upper bound is exclusive) and logs it using
  _logger.LogInformation($"Random Value is {randomValue}");

Start logging events to ElasticSearch and configure Kibana

 
Now, run the Web API application by pressing F5 and navigate to https://localhost:5001/api/ElasticSearch
 
Now, let's configure an index pattern in Kibana (for example, elastickibanaloggingverify-*, matching the IndexFormat configured earlier).
 
 
 
After creating the index pattern, you can filter messages by using:
  message: "59"
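You can also query ElasticSearch directly over HTTP, bypassing Kibana. A sketch, assuming the monthly index name produced by the IndexFormat above and that the containers are running:

```shell
# Search all monthly indexes of the application for events mentioning "59"
curl "http://localhost:9200/elastickibanaloggingverify-*/_search?q=message:59&pretty"
```

The response lists matching log documents, including the message, level, and enriched properties such as Environment.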
 

Logging error to ElasticSearch

 
Let's add a new HTTP GET method to the ElasticSearch controller.
  [HttpGet("{id}")]
  public string ThrowErrorMessage(int id)
  {
      try
      {
          if (id <= 0)
              throw new Exception($"id cannot be less than or equal to 0. Value passed is {id}");
          return id.ToString();
      }
      catch (Exception ex)
      {
          _logger.LogError(ex, ex.Message);
      }
      return string.Empty;
  }
The above code is straightforward: any non-positive id throws an exception, which is caught and logged as an error. Let's test the ThrowErrorMessage method by passing an id equal to 0.
 
 
You can narrow down to the error logs using:
  level: "error"
You can filter further by combining multiple conditions:
  message: "value passed" and level: "error"

Conclusion

 
Earlier, setting up logging was quite a tedious job, and then retrieving flat files from servers and searching them for errors was another pain.
 
With Docker in place, setting up the environment is effortless, especially with a microservice architecture: it's much easier to create ElasticSearch indexes, and the logged data can be visualized using Kibana.
 
Logging becomes even more powerful when you perform the same setup on a multi-node Kubernetes/Azure Kubernetes Service cluster.
 
There's really no excuse for developers/architects to not incorporate logging using ElasticSearch, Kibana and Docker.
 
Cheers!
 
I hope you like the article. If you found it interesting, kindly like and share it.