Creating An Azure API For Custom Logging In Azure Log Analytics

Introduction

 
Anyone who has done application development in any capacity almost certainly understands the need to log actions, events, and errors for an application and its modules. Logging does not provide direct value to end-users, but it is extremely helpful when errors occur and is critical for following best practices. The logs we push should pinpoint the exact module or component and provide all the necessary information about the error or exception the application has encountered, helping the engineer figure out the cause and fix the problem appropriately.
 
Traditionally, these logs were stored in application databases and then accessed by engineers using log-reading applications that present the logs in an understandable format. The problem with this approach is that the log data is generally huge: application events, user accesses, server actions, and so on are logged in the same space as error information, which is a huge overhead for the engineer trying to find the needle in this haystack. Another problem is that each application has its own logging space, so more often than not we have to keep tabs on multiple log stores, map each one to the right application, and then find the errors. In addition, there are inconsistencies in the way custom logs are pushed, as there is no standardized approach and each implementer has done the logging in the way that worked for them. Last but not least, the engineer needs access to all of those log spaces as well. While our hardworking fellow engineers make this work in most environments, it is best if we can introduce standardization, have a single space for collecting and reviewing failure logs, and minimize the overall maintenance effort for applications and logs.
 
With this context, we started looking at our possible solutions. Azure offers a very powerful service in Azure Log Analytics which, apart from integrating well with the Azure Monitor service, provides the capability to define, read, write, query, and search logs. For this article, we are going to leverage the API store architecture that we have defined in our earlier article to host a new API that will push our application logs into Log Analytics workspace which we will then view and analyze using the Log Analytics interface. Let's jump right into it.
 
 

Implementation

 
First, we are going to define a schema that we will use for our custom logs. Each application's log requirements are going to be different and would want to log details relevant to the application to help us at a later stage to figure out the issues and errors. So, we cannot really have a restricted schema and should give enough flexibility to the consuming application that they can log what they desire, in the format they desire.
 
For this reason, we are going to define just a basic schema for the Log Analytics workspace that we will create for custom logs. We are going in with the following fixed fields:
  • LogFileName: Name of the custom log that you create in your Log Analytics workspace
  • AutomationName: Unique name/identifier for your application or automation, which can later be used to map a log entry back to its source
  • ModuleName: Name of the module/function/method for which the logging is being done
  • LogData: A JSON field that can hold any number of attributes and details as key-value pairs (nested or plain). This has to be valid JSON in order for us to make use of the visual and parsing capabilities in Log Analytics
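Putting the schema together, a request body for the logging API might look like the following. This is a minimal Python sketch; the log name, application name, and the attributes inside LogData are made-up placeholders, since each application can log whatever it needs:

```python
import json

# Sample payload matching the base schema; LogData is free-form JSON.
payload = {
    "LogFileName": "MyAppLogs",          # custom log name (hypothetical)
    "AutomationName": "OrderProcessor",  # application identifier (hypothetical)
    "ModuleName": "PaymentModule",       # module being logged (hypothetical)
    "LogData": {                         # any valid JSON, nested or plain
        "Severity": "Error",
        "Details": {"Code": 500, "Message": "Payment gateway timeout"},
    },
}

body = json.dumps(payload)
# Round-tripping proves the body is valid JSON before sending it to the API.
parsed = json.loads(body)
print(sorted(parsed.keys()))
# → ['AutomationName', 'LogData', 'LogFileName', 'ModuleName']
```

Only the four top-level fields are fixed; everything under LogData is up to the consuming application.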
Okay. Now that we have decided on the schema, let's go ahead and create our helper class under the Helper folder in our solution and paste the following code into it. We will create three methods in this helper file. The first builds the signature hash string required for authorizing against the Azure Log Analytics endpoint; the hash is computed with HMAC-SHA256 over the message, using the workspace shared key as the secret. The second ingests the log into the Azure Log Analytics endpoint using the generated signature and the log data. The third and final method is the root method that our custom Azure Function invokes on trigger to push the log data into the Log Analytics workspace. We are using the Azure Log Analytics Data Collector API for this.
 
LogAnalyticsHelper.cs 
using System;
using System.Text;
using System.Threading.Tasks;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using Newtonsoft.Json;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

namespace Reusable.Functions
{
    public class LogAnalyticsHelper
    {
        /// <summary>
        /// Get the name of the timestamp field used for Log Analytics ingestion
        /// </summary>
        /// <returns></returns>
        public static string GetTimeStampField()
        {
            return ConstantsHelper.TimeStampField;
        }

        /// <summary>
        /// Validate user input JSON
        /// </summary>
        /// <param name="strInput"></param>
        /// <param name="log"></param>
        /// <returns></returns>
        public static bool IsValidJson(string strInput, ILogger log)
        {
            strInput = strInput.Trim();
            if ((strInput.StartsWith("{") && strInput.EndsWith("}")) || // For object
                (strInput.StartsWith("[") && strInput.EndsWith("]")))   // For array
            {
                try
                {
                    var obj = JToken.Parse(strInput);
                    return true;
                }
                catch (JsonReaderException jex)
                {
                    // Exception in parsing JSON
                    log.LogInformation($"\n IsValidJson method got Exception \n Time: {DateTime.Now} \n Exception: {jex.Message}");
                    return false;
                }
                catch (JsonException je)
                {
                    // Exception in parsing JSON
                    log.LogInformation($"\n IsValidJson method got Exception \n Time: {DateTime.Now} \n Exception: {je.Message}");
                    return false;
                }
                catch (Exception ex) // Some other exception
                {
                    log.LogInformation($"\n IsValidJson method got Exception \n Time: {DateTime.Now} \n Exception: {ex.Message}");
                    return false;
                }
            }
            else
            {
                return false;
            }
        }

        /// <summary>
        /// Push logs into the Log Analytics workspace
        /// </summary>
        /// <param name="json"></param>
        /// <param name="logFileName"></param>
        /// <param name="workspaceId"></param>
        /// <param name="sharedKey"></param>
        /// <param name="log"></param>
        /// <returns></returns>
        public static async Task<bool> PushLogsToLogAnalytics(string json, string logFileName, string workspaceId, string sharedKey, ILogger log)
        {
            try
            {
                // Create a hash for the API signature
                var datestring = DateTime.UtcNow.ToString("r");
                var jsonBytes = Encoding.UTF8.GetBytes(json);
                string stringToHash = "POST\n" + jsonBytes.Length + "\napplication/json\n" + "x-ms-date:" + datestring + "\n/api/logs";
                string hashedString = LogAnalyticsHelper.BuildSignature(stringToHash, sharedKey, log);
                log.LogInformation($"HashedString : {hashedString}");
                string signature = "SharedKey " + workspaceId + ":" + hashedString;
                log.LogInformation($"Signature : {signature}");
                bool ingestionStatus = await LogAnalyticsHelper.IngestToLogAnalytics(signature, datestring, json, logFileName, workspaceId, log);
                return ingestionStatus;
            }
            catch (Exception e)
            {
                log.LogInformation($"PushLogsToLogAnalytics got Exception \n Time: {DateTime.Now} \n Exception: {e.Message} and complete Exception: {e}");
                return false;
            }
        }

        /// <summary>
        /// Build the API signature for the log data (HMAC-SHA256 over the message, keyed with the shared key)
        /// </summary>
        /// <param name="message"></param>
        /// <param name="secret"></param>
        /// <param name="log"></param>
        /// <returns></returns>
        public static string BuildSignature(string message, string secret, ILogger log)
        {
            log.LogInformation($"Begin BuildSignature \n Start Time: {DateTime.Now}");
            var encoding = new System.Text.ASCIIEncoding();
            byte[] keyByte = Convert.FromBase64String(secret);
            byte[] messageBytes = encoding.GetBytes(message);
            using (var hmacsha256 = new HMACSHA256(keyByte))
            {
                byte[] hash = hmacsha256.ComputeHash(messageBytes);
                return Convert.ToBase64String(hash);
            }
        }

        /// <summary>
        /// Ingest the payload into Log Analytics via the Data Collector API
        /// </summary>
        /// <param name="signature"></param>
        /// <param name="date"></param>
        /// <param name="datajson"></param>
        /// <param name="logFile"></param>
        /// <param name="workspaceId"></param>
        /// <param name="log"></param>
        /// <returns></returns>
        public static async Task<bool> IngestToLogAnalytics(string signature, string date, string datajson, string logFile, string workspaceId, ILogger log)
        {
            try
            {
                string url = "https://" + workspaceId + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01";
                HttpClient client = new HttpClient();
                client.DefaultRequestHeaders.Add("Accept", "application/json");
                client.DefaultRequestHeaders.Add("Log-Type", logFile);
                client.DefaultRequestHeaders.Add("Authorization", signature);
                client.DefaultRequestHeaders.Add("x-ms-date", date);
                client.DefaultRequestHeaders.Add("time-generated-field", GetTimeStampField());

                HttpContent httpContent = new StringContent(datajson, Encoding.UTF8);
                httpContent.Headers.ContentType = new MediaTypeHeaderValue("application/json");

                var response = await client.PostAsync(new Uri(url), httpContent);
                if (response.IsSuccessStatusCode)
                {
                    HttpContent responseContent = response.Content;
                    var result = await responseContent.ReadAsStringAsync().ConfigureAwait(false);
                    log.LogInformation("Ingestion of Logs is completed with status code : " + response.StatusCode);
                    return true;
                }
                else
                {
                    HttpContent responseContent = response.Content;
                    string result = await responseContent.ReadAsStringAsync().ConfigureAwait(false);
                    log.LogInformation("Ingestion of Logs has failed with status code : " + response.StatusCode);
                    return false;
                }
            }
            catch (Exception e)
            {
                log.LogInformation($"IngestToLogAnalytics got Exception \n Time: {DateTime.Now} \n Exception: {e.Message} and complete Exception: {e}");
                return false;
            }
        }
    }
}
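To make the signature scheme concrete: the Data Collector API expects an HMAC-SHA256 over the string "POST\n{content-length}\napplication/json\nx-ms-date:{rfc1123-date}\n/api/logs", keyed with the Base64-decoded workspace shared key. The sketch below mirrors BuildSignature and the header assembly in PushLogsToLogAnalytics in Python; the workspace ID and shared key are made-up placeholders:

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str,
                    content_length: int, date_rfc1123: str) -> str:
    """Compute the SharedKey authorization value for the Data Collector API."""
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    key_bytes = base64.b64decode(shared_key)  # the shared key is Base64-encoded
    digest = hmac.new(key_bytes, string_to_hash.encode("ascii"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Placeholder credentials for illustration only.
sig = build_signature(
    workspace_id="00000000-0000-0000-0000-000000000000",
    shared_key=base64.b64encode(b"not-a-real-key").decode(),
    content_length=128,
    date_rfc1123="Mon, 01 Jan 2024 00:00:00 GMT",
)
print(sig.startswith("SharedKey "))
# → True
```

Note that the content length in the signed string must match the byte length of the body you actually POST, and the same x-ms-date value must be sent in the request header.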
Next, let's add a new Azure Function to the existing solution that we set up in the previous article. We will add this function to API Management at a later point to provide the API endpoint to our consumers via a subscription. We will define an HTTP-trigger function for pushing the log data to the Log Analytics workspace. Since our log schema expects the application log details in JSON format, we will also validate that the log information being sent is valid JSON. If it is not, we will not log it, as it would interfere with the visual presentation and parsing in the Log Analytics workspace. Once we have validated the JSON, we fetch the Log Analytics workspace ID and key from Azure Key Vault using the custom API that we created in our earlier article. You may hardcode these values, but from a security best-practice point of view they should be stored in and fetched from Azure Key Vault or any other secure vault of your choice. Once we have these details, we simply invoke the method from our LogAnalyticsHelper class to push the log into Azure Log Analytics.
 
PushLogsToLogAnalytics.cs
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.Net;
using System.Net.Http;
using Newtonsoft.Json.Linq;

namespace Reusable.Functions
{
    /// <summary>
    /// This function is to inject logs into a Log Analytics workspace
    /// </summary>
    public static class PushLogsToLogAnalytics
    {
        [FunctionName("PushLogsToLogAnalytics")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequestMessage req,
            ILogger log)
        {
            try
            {
                log.LogInformation("PushLogsToLogAnalytics Function Called");
                // Get request data
                dynamic data = await req.Content.ReadAsAsync<object>();
                string customLogFile = data.LogFileName;
                string automationName = data.AutomationName;
                string moduleName = data.ModuleName;
                string logData = Convert.ToString(data.LogData);

                // Parsing provided LogData JSON
                JObject logDataObj = JObject.Parse(logData);
                string logDataJson = logDataObj.ToString(Newtonsoft.Json.Formatting.Indented);

                // Preparing final JSON for Log Analytics ingestion
                dynamic obj = new JObject();
                obj.AutomationName = automationName;
                obj.ModuleName = moduleName;
                obj.Log = logDataJson;
                string myJson = obj.ToString(Newtonsoft.Json.Formatting.Indented);
                log.LogInformation("PreparedFinalJson : " + myJson);

                // Validating JSON - user-provided log data JSON and prepared final JSON
                bool isChildJsonValid = LogAnalyticsHelper.IsValidJson(logDataJson, log);
                bool isParentJsonValid = LogAnalyticsHelper.IsValidJson(myJson, log);

                if (isChildJsonValid && isParentJsonValid)
                {
                    log.LogInformation("Fetching details from KeyVault");
                    log.LogInformation("Invoking FetchKeyVaultSecret method");
                    string workspaceId = await KeyVaultHelper.FetchKeyVaultSecret(ConstantsHelper.GetEnvironmentVariable(ConstantsHelper.logAnalyticsWorkspaceID), log);
                    string primaryKey = await KeyVaultHelper.FetchKeyVaultSecret(ConstantsHelper.GetEnvironmentVariable(ConstantsHelper.logAnalyticsWorkspaceSharedKey), log);
                    log.LogInformation("FetchKeyVaultSecret executed successfully");

                    // Invoking PushLogsToLogAnalytics method to ingest the logs into the workspace
                    bool status = await LogAnalyticsHelper.PushLogsToLogAnalytics(myJson, customLogFile, workspaceId, primaryKey, log);
                    if (status)
                    {
                        log.LogInformation("Ingestion of log analytics is completed.");
                        return req.CreateResponse(HttpStatusCode.OK, "[Info] Ingestion of log analytics is completed.");
                    }
                    else
                    {
                        log.LogInformation("Ingestion of log analytics has failed");
                        return req.CreateResponse(HttpStatusCode.BadRequest, "[Error] Ingestion of log analytics has failed");
                    }
                }
                else
                {
                    return req.CreateResponse(HttpStatusCode.BadRequest, $"[Warning] Invalid Json Provided");
                }
            }
            catch (System.Exception ex)
            {
                // Unexpected failure while processing the request
                return req.CreateResponse(HttpStatusCode.InternalServerError, $"{ex.Message}");
            }
        }
    }
}
 
We have the code ready, and we can now push it to the Azure Function resource that we created, either by using the publish command from the Azure Functions extension in VS Code or through DevOps pipelines if you have them set up for publishing to Azure Functions. Once the function is deployed, the next step is to bind it to our API Management resource in Azure and add it to the product and subscriptions we want it exposed through. Other settings, such as the UAMI (user-assigned managed identity), were already done in our earlier article.
 
That's it. We are now ready to invoke the custom logging API. Next, let's see it in action. Below are sample C# and PowerShell snippets for invoking the API and passing different log information from separate applications.
 
C# Sample
using (var client = new HttpClient())
{
    string Uri = "Your-API-Uri";
    string myJson = @"{
        ""LogFileName"": ""Your-Custom-Log-File-In-LogAnalytics"",
        ""AutomationName"": ""Your-Automation-Name"",
        ""ModuleName"": ""Your-Automation-Module-Name"",
        ""LogData"": {
            ""Attribute1"": ""some-value"",
            ""Attribute2"": ""some-value"",
            ""Attribute3"": {
                ""Attribute1"": ""some-value"",
                ""Attribute2"": ""some-value"",
                ""Attribute3"": ""some-value"",
                ""Attribute4"": ""some-value"",
                ""Attribute5"": ""some-value"",
                ""Attribute6"": ""some-value""
            }
        }
    }";

    // Injecting logs into the Log Analytics workspace
    client.DefaultRequestHeaders.Add(ConstantsHelper.ocp_Apim_Subscription_Key, "Your-API-Subscription-Key");
    var response = await client.PostAsync(Uri, new StringContent(myJson, System.Text.Encoding.UTF8, "application/json"));
    if (response.StatusCode == System.Net.HttpStatusCode.OK)
    {
        log.LogInformation("Logging is completed successfully with status code : " + response.StatusCode);
    }
    else
    {
        log.LogInformation("Logging has failed with status code : " + response.StatusCode);
    }
}
PowerShell Sample
$body = @{
    LogFileName = "Your-Custom-Log-File-In-LogAnalytics"
    AutomationName = "Your-Automation-Name"
    ModuleName = "Your-Automation-Module-Name"
    # You can define your own JSON for logging unstructured logs; use your custom attributes in place of the ones below
    LogData = @{
        Attribute1 = "Some-Value"
        Attribute2 = "Some-Value"
        Attribute3 = "Some-Value"
        Attribute4 = "Some-Value"
        Attribute5 = "Some-Value"
        Attribute6 = "Some-Value"
    }
}
$body = $body | ConvertTo-Json

$api = "Your-API-Uri"
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$headers = @{}
$headers.Add("Ocp-Apim-Subscription-Key", "Your-API-Subscription-Key")

# Invocation of API using Invoke-RestMethod
$response = Invoke-RestMethod -Uri $api -Method Post -Headers $headers -Body $body -ContentType "application/json"
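Because the endpoint is a plain HTTPS POST, any platform can call it. As one more illustration, here is a hedged Python sketch that builds the same request with the standard library; the URL and subscription key are placeholders, and the request is constructed but not sent:

```python
import json
import urllib.request

# Payload mirroring the C# and PowerShell samples above.
payload = {
    "LogFileName": "Your-Custom-Log-File-In-LogAnalytics",
    "AutomationName": "Your-Automation-Name",
    "ModuleName": "Your-Automation-Module-Name",
    "LogData": {"Attribute1": "some-value"},
}

req = urllib.request.Request(
    url="https://example.azure-api.net/your-function",  # placeholder APIM URL
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Ocp-Apim-Subscription-Key": "Your-API-Subscription-Key",  # placeholder
    },
    method="POST",
)
# The request is only constructed here; urllib.request.urlopen(req) would send it.
print(req.get_method())
# → POST
```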
So, the above implementations demonstrate that our API can accept and log different logging schemas as long as the payload is valid JSON. Additionally, it is very easy to integrate into different applications, modules, and platforms, since our logging method is now an API endpoint. Let's now view and analyze the results in the Azure Log Analytics workspace.
 
 
You can now see how we have benefited from defining just the base schema and leaving the rest of the logging schema as a JSON field. Log Analytics has the capability to parse this JSON field and give you a proper visualization, with clearly defined properties and values extracted from the JSON provided. If we need to search by a specific application or time, we have separate fields for that. However, if we want to query on the actual application log attributes inside the JSON, we run into some limitations in KQL and will have to do some customization. So, my recommendation is to promote the primary search fields to top-level columns so that we can leverage the full out-of-the-box search capabilities offered in Log Analytics. The JSON-based field does give us flexibility in logging a variety of data, but the trade-off with schema-less fields needs to be evaluated appropriately, as we definitely lose some search capabilities with JSON fields.
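To illustrate the querying point, nested attributes typically have to be parsed out of the string column first. A hedged KQL sketch, assuming the custom log was named MyAppLogs (Log Analytics appends _CL to custom log names and _s to string columns, so our AutomationName, ModuleName, and Log fields surface as AutomationName_s, ModuleName_s, and Log_s):

```
// Parse the JSON stored in the Log_s field and filter on a nested attribute
MyAppLogs_CL
| extend LogJson = parse_json(Log_s)
| where AutomationName_s == "Your-Automation-Name"
| project TimeGenerated, ModuleName_s, LogJson.Attribute1
```

The top-level columns filter with full index support, while anything inside LogJson goes through parse_json at query time, which is exactly the trade-off discussed above.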
 
 

Conclusion

 
We have discussed and demonstrated a custom logging API built with C#, hosted on Azure Functions, and exposed via the Azure API Management service. The schema for your logs needs to be an informed decision that you make before implementing, based on your environment, application requirements, and purpose of logging. The complete code and implementation are available in our GitHub repository.
 
Happy Coding!
