Introduction
Anyone who has done application development in any capacity understands the need to log actions, events, and errors for an application and its modules. Logging does not provide direct value to end users, but it is extremely helpful when an application runs into error conditions and is critical for operational best practices. The logs we push should pinpoint the exact module or component and provide all the necessary information about the error or exception the application encountered, so that an engineer can determine the cause and fix the problem appropriately.
Traditionally, all of these logs were stored in application databases and then read by engineers using log-reading tools that present them in an understandable format. The problem with this approach is that the custom logs live in the application's own database or logging mechanism, the log data is generally huge, and application events, user accesses, server actions, and error information are all logged in the same space. That leaves the engineer searching for a needle in a haystack of log information. Another problem is that each application has its own logging space, so more often than not we have to keep tabs on multiple log stores, map each one to the right application, and only then hunt for errors. In addition, there is no standardized approach to pushing custom logs, so every implementer logs in whatever way worked for them, and the engineer also needs access to every one of those log spaces. While our hardworking fellow engineers make this work in most environments, it is far better if we can introduce standardization, have a single place for collecting and viewing failure logs, and minimize the overall maintenance effort for applications and logs.
With this context, we started looking at possible solutions. Azure offers a very powerful service in Azure Log Analytics which, apart from integrating well with Azure Monitor, provides the capability to define, read, write, query, and search logs. In this article, we are going to leverage the API store architecture defined in our earlier article to host a new API that pushes our application logs into a Log Analytics workspace, which we will then view and analyze using the Log Analytics interface. Let's jump right into it.
Implementation
First, we are going to define a schema for our custom logs. Each application's logging requirements are different, and each application will want to log the details relevant to it so that issues and errors can be diagnosed later. So we cannot impose a rigid schema; we should give consuming applications enough flexibility to log whatever they want, in the format they want.
For this reason, we are going to define just a basic schema for the Log Analytics workspace that we will create for custom logs. The fixed fields are described below, followed by a sample payload:
- LogFileName: Name of the custom log (table) in your Log Analytics workspace that the records will be written to
- AutomationName: Unique name/identifier of the application or automation, used later to map a log entry back to its source
- ModuleName: Name of the module/function/method for which the logging is being done
- LogData: A JSON field that can hold any number of attributes and details as key-value pairs (nested or plain). This has to be valid JSON in order for us to make use of the parsing and visualization capabilities in Log Analytics
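For illustration, a request payload following this schema might look like the one below; the names and values are placeholders, and the contents of LogData are entirely up to the consuming application.
Sample payload
- {
-     "LogFileName": "Your-Custom-Log-File-In-LogAnalytics",
-     "AutomationName": "Your-Automation-Name",
-     "ModuleName": "Your-Automation-Module-Name",
-     "LogData": {
-         "Status": "Failed",
-         "ErrorMessage": "some-error-details"
-     }
- }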
Now that we have decided on the schema, let's go ahead and create our helper class under the Helper folder in our solution and paste the following code into it. The helper exposes three core methods, plus small utilities for returning the timestamp field name and validating JSON. The first core method builds the signature hash string required to authorize against the Azure Log Analytics endpoint; the signature is an HMAC-SHA256 hash computed over the request message using the workspace's shared key. The second ingests the log into the Azure Log Analytics endpoint using the generated signature and the log data. The third is the root method that our custom Azure Function invokes on trigger to push the log data into the Log Analytics workspace. We are using the Azure Log Analytics HTTP Data Collector API for this.
LogAnalyticsHelper.cs
- using System;
- using System.Text;
- using System.Threading.Tasks;
- using System.Net.Http;
- using System.Net.Http.Headers;
- using System.Security.Cryptography;
- using Newtonsoft.Json;
- using Microsoft.Extensions.Logging;
- using Newtonsoft.Json.Linq;
-
- namespace Reusable.Functions
- {
- public class LogAnalyticsHelper
- {
-
- /// <summary>
- /// Returns the name of the field that Log Analytics should treat as the record's timestamp.
- /// </summary>
- public static string GetTimeStampField()
- {
- return ConstantsHelper.TimeStampField;
- }
-
- /// <summary>
- /// Checks whether the supplied string is well-formed JSON (object or array).
- /// </summary>
- public static bool IsValidJson(string strInput, ILogger log)
- {
- strInput = strInput.Trim();
- if ((strInput.StartsWith("{") && strInput.EndsWith("}")) ||
- (strInput.StartsWith("[") && strInput.EndsWith("]")))
- {
- try
- {
- var obj = JToken.Parse(strInput);
- return true;
- }
- catch (JsonReaderException jex)
- {
-
- log.LogInformation($"\n IsValidJson method got Exception \n Time: { DateTime.Now} \n Exception{ jex.Message}");
- return false;
- }
- catch (JsonException je)
- {
-
- log.LogInformation($"\n IsValidJson method got Exception \n Time: { DateTime.Now} \n Exception{ je.Message}");
- return false;
- }
- catch (Exception ex)
- {
-
- log.LogInformation($"\n IsValidJson method got Exception \n Time: { DateTime.Now} \n Exception{ ex.Message}");
- return false;
- }
- }
- else
- {
- return false;
- }
- }
-
- /// <summary>
- /// Root method invoked by the Azure Function: builds the authorization signature and
- /// pushes the supplied JSON log data into the Log Analytics workspace.
- /// </summary>
- public static async Task<bool> PushLogsToLogAnalytics(string json, string logFileName, string workspaceId, string sharedKey, ILogger log)
- {
- try
- {
-
- var datestring = DateTime.UtcNow.ToString("r");
- var jsonBytes = Encoding.UTF8.GetBytes(json);
- string stringToHash = "POST\n" + jsonBytes.Length + "\napplication/json\n" + "x-ms-date:" + datestring + "\n/api/logs";
- string hashedString = LogAnalyticsHelper.BuildSignature(stringToHash, sharedKey, log);
- log.LogInformation($"HashedString : {hashedString}");
- string signature = "SharedKey " + workspaceId + ":" + hashedString;
- log.LogInformation($"Signature : " + signature);
- bool ingestionStatus = await LogAnalyticsHelper.IngestToLogAnalytics(signature, datestring, json, logFileName, workspaceId, log);
- return ingestionStatus;
- }
- catch (Exception e)
- {
- log.LogInformation($"PushLogsToLogAnalytics got Exception \n Time: {DateTime.Now} \n Exception{e.Message} and complete Exception:{e}");
- return false;
- }
-
- }
-
- /// <summary>
- /// Builds the HMAC-SHA256 signature hash used to authorize against the Data Collector API.
- /// </summary>
- public static string BuildSignature(string message, string secret, ILogger log)
- {
- log.LogInformation($"Begin BuildSignature \n Start Time: {DateTime.Now}");
- var encoding = new System.Text.ASCIIEncoding();
- byte[] keyByte = Convert.FromBase64String(secret);
- byte[] messageBytes = encoding.GetBytes(message);
- using (var hmacsha256 = new HMACSHA256(keyByte))
- {
- byte[] hash = hmacsha256.ComputeHash(messageBytes);
- return Convert.ToBase64String(hash);
- }
- }
-
- /// <summary>
- /// Sends the signed request to the Log Analytics Data Collector API endpoint.
- /// </summary>
- public static async Task<bool> IngestToLogAnalytics(string signature, string date, string datajson, string logFile, string workspaceId, ILogger log)
- {
- try
- {
- string url = "https://" + workspaceId + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01";
- HttpClient client = new HttpClient();
- client.DefaultRequestHeaders.Add("Accept", "application/json");
- client.DefaultRequestHeaders.Add("Log-Type", logFile);
- client.DefaultRequestHeaders.Add("Authorization", signature);
- client.DefaultRequestHeaders.Add("x-ms-date", date);
- client.DefaultRequestHeaders.Add("time-generated-field", GetTimeStampField());
-
- HttpContent httpContent = new StringContent(datajson, Encoding.UTF8);
- httpContent.Headers.ContentType = new MediaTypeHeaderValue("application/json");
-
- var response = await client.PostAsync(new Uri(url), httpContent);
- if (response.IsSuccessStatusCode)
- {
- HttpContent responseContent = response.Content;
- var result = await responseContent.ReadAsStringAsync().ConfigureAwait(false);
- log.LogInformation("Ingestion of Logs is completed with status code : " + response.StatusCode);
- return true;
- }
- else
- {
- HttpContent responseContent = response.Content;
- string result = await responseContent.ReadAsStringAsync().ConfigureAwait(false);
- log.LogInformation("Ingestion of Logs has failed with status code : " + response.StatusCode);
- return false;
- }
- }
- catch (Exception e)
- {
- log.LogInformation($"IngestToLogAnalytics got Exception \n Time: {DateTime.Now} \n Exception{e.Message} and complete Exception:{e}");
- return false;
- }
- }
- }
- }
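The helper above references a ConstantsHelper class that is part of the solution but not shown here; the Azure Function in the next step also uses it for app setting names and the APIM subscription header. Purely as a reference, a minimal sketch of what it needs to expose could look like the following; the constant values shown are assumptions and should match your own configuration and the implementation in the repository.
ConstantsHelper.cs (illustrative sketch)
- using System;
-
- namespace Reusable.Functions
- {
-     public static class ConstantsHelper
-     {
-         // Name of the JSON field Log Analytics should treat as TimeGenerated.
-         // Leaving it empty makes Log Analytics fall back to the ingestion time.
-         public static string TimeStampField = "";
-
-         // App setting names that hold the Key Vault secret names for the workspace credentials.
-         public static string logAnalyticsWorkspaceID = "LogAnalyticsWorkspaceID";
-         public static string logAnalyticsWorkspaceSharedKey = "LogAnalyticsWorkspaceSharedKey";
-
-         // Header used when calling the API through Azure API Management.
-         public static string ocp_Apim_Subscription_Key = "Ocp-Apim-Subscription-Key";
-
-         // Reads a value from the Function App's application settings.
-         public static string GetEnvironmentVariable(string name)
-         {
-             return Environment.GetEnvironmentVariable(name, EnvironmentVariableTarget.Process);
-         }
-     }
- }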
Next, let's add a new Azure Function to the solution that we set up in the previous article. We will add this function to API Management at a later point to provide the API endpoint to our consumers via subscriptions. We will define an HTTP-trigger function for pushing the log data to the Log Analytics workspace. Since our log schema expects the application log details in JSON format, we will also add a method to validate that the log information being sent contains valid JSON; if it does not, we will not log it, as it would interfere with the visual presentation and parsing in the Log Analytics workspace. Once we have validated the JSON, we fetch the Log Analytics workspace ID and shared key from Azure Key Vault using the custom API that we created in our earlier article. You may hard-code these values, but from a security best-practice point of view they should be stored in and fetched from Azure Key Vault or another secure vault of your choice. Once we have these details, we simply invoke the method from our LogAnalyticsHelper class to push the log into Azure Log Analytics.
PushLogsToLogAnalytics.cs
- using System;
- using System.Threading.Tasks;
- using Microsoft.Azure.WebJobs;
- using Microsoft.Azure.WebJobs.Extensions.Http;
- using Microsoft.Extensions.Logging;
- using System.Net;
- using System.Net.Http;
- using Newtonsoft.Json.Linq;
-
- namespace Reusable.Functions
- {
- /// <summary>
- /// HTTP-triggered function that validates the incoming log payload and pushes it to Log Analytics.
- /// </summary>
- public static class PushLogsToLogAnalytics
- {
- [FunctionName("PushLogsToLogAnalytics")]
- public static async Task<HttpResponseMessage> Run(
- [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequestMessage req,
- ILogger log)
- {
- try
- {
- log.LogInformation("PushLogsToLogAnalytics Function Called");
-
- dynamic data = await req.Content.ReadAsAsync<object>();
- string customLogFile = data.LogFileName;
- string automationName = data.AutomationName;
- string moduleName = data.ModuleName;
- string logData = Convert.ToString(data.LogData);
-
- // Re-serialize the LogData payload as indented JSON
- JObject logDataObj = JObject.Parse(logData);
- string logDataJson = logDataObj.ToString(Newtonsoft.Json.Formatting.Indented);
-
- // Wrap the log data with the automation and module metadata to form the final payload
- dynamic obj = new JObject();
- obj.AutomationName = automationName;
- obj.ModuleName = moduleName;
- obj.Log = logDataJson;
- string myJson = obj.ToString(Newtonsoft.Json.Formatting.Indented);
- log.LogInformation("PreparedFinalJson : " + myJson);
-
- // Validate both the inner log data and the final payload before ingesting
- bool isChildJsonValid = LogAnalyticsHelper.IsValidJson(logDataJson, log);
- bool isParentJsonValid = LogAnalyticsHelper.IsValidJson(myJson, log);
-
- if (isChildJsonValid && isParentJsonValid)
- {
- log.LogInformation("Fetching details from KeyVault");
- log.LogInformation("Invoking FetchKeyVaultSecret method");
- string workspaceId = await KeyVaultHelper.FetchKeyVaultSecret(ConstantsHelper.GetEnvironmentVariable(ConstantsHelper.logAnalyticsWorkspaceID), log);
- string primaryKey = await KeyVaultHelper.FetchKeyVaultSecret(ConstantsHelper.GetEnvironmentVariable(ConstantsHelper.logAnalyticsWorkspaceSharedKey), log);
- log.LogInformation("FetchKeyVaultSecret executed successfully");
-
- // Push the prepared payload to the Log Analytics Data Collector API
- bool status = await LogAnalyticsHelper.PushLogsToLogAnalytics(myJson, customLogFile, workspaceId, primaryKey, log);
- if (status)
- {
- log.LogInformation("Ingestion of log analytics is completed.");
- return req.CreateResponse(HttpStatusCode.OK, "[Info] Ingestion of log analytics is completed.");
- }
- else
- {
- log.LogInformation("Ingestion of log analytics is failed");
- return req.CreateResponse(HttpStatusCode.BadRequest, "[Error] Ingestion of log analytics is failed");
- }
- }
- else
- {
- return req.CreateResponse(HttpStatusCode.BadRequest, $"[Warning] Invalid Json Provided");
- }
- }
- catch (System.Exception ex)
- {
- return req.CreateResponse(HttpStatusCode.InternalServerError, $"{ex.Message}");
- }
- }
- }
- }
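The function above also relies on the KeyVaultHelper.FetchKeyVaultSecret method from our earlier article, which is not reproduced here. Purely for context, a minimal equivalent sketch using the Azure.Identity and Azure.Security.KeyVault.Secrets packages could look like the following; the KeyVaultUri app setting name is an assumption, and the actual implementation in the repository may differ.
KeyVaultHelper.cs (illustrative sketch)
- using System;
- using System.Threading.Tasks;
- using Azure.Identity;
- using Azure.Security.KeyVault.Secrets;
- using Microsoft.Extensions.Logging;
-
- namespace Reusable.Functions
- {
-     public static class KeyVaultHelper
-     {
-         /// <summary>
-         /// Fetches a secret value from Azure Key Vault using the Function App's managed identity.
-         /// </summary>
-         public static async Task<string> FetchKeyVaultSecret(string secretName, ILogger log)
-         {
-             // "KeyVaultUri" is an assumed app setting, e.g. https://<your-vault>.vault.azure.net/
-             var vaultUri = new Uri(Environment.GetEnvironmentVariable("KeyVaultUri"));
-
-             // DefaultAzureCredential resolves the managed identity when running in Azure;
-             // for a user-assigned identity you may need to supply its client ID via DefaultAzureCredentialOptions.
-             var client = new SecretClient(vaultUri, new DefaultAzureCredential());
-
-             KeyVaultSecret secret = await client.GetSecretAsync(secretName);
-             log.LogInformation($"Fetched secret '{secretName}' from Key Vault");
-             return secret.Value;
-         }
-     }
- }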
The code is now ready, and we can push it to the Azure Function resource we created, either by using the publish command from the Azure Functions extension in VS Code or through DevOps pipelines if you have them set up for deploying to Azure Functions. Once the function is deployed, the next step is to bind it to our API Management resource in Azure and add it to the product and subscriptions through which we want to expose it. Other settings, such as the UAMI (user-assigned managed identity), were already configured in our earlier article.
That's it. We are now ready to invoke the custom logging API, so let's see it in action. Below are sample C# and PowerShell snippets for invoking the API and passing different log information from separate applications.
C# Sample
- using (var client = new HttpClient())
- {
- string Uri = "Your-API-Uri";
- string myJson = @"{ ""LogFileName"":""Your-Custom-Log-File-In-LogAnalytics"",
- ""AutomationName"": ""Your-Automation-Name"",
- ""ModuleName"": ""Your-Automation-Module-Name"",
- ""LogData"": {
- ""Attribute1"": ""some-value"",
- ""Attribute2"": ""some-value"",
- ""Attribute3"": {
- ""Attribute1"": ""some-value"",
- ""Attribute2"": ""some-value"",
- ""Attribute3"": ""some-value"",
- ""Attribute4"": ""some-value"",
- ""Attribute5"": ""some-value"",
- ""Attribute6"": ""some-value""
- }
- }
- }";
-
- client.DefaultRequestHeaders.Add(ConstantsHelper.ocp_Apim_Subscription_Key, "Your-API-Subscription-Key");
- var response = await client.PostAsync(Uri, new StringContent(myJson, System.Text.Encoding.UTF8, "application/json"));
- if (response.StatusCode == System.Net.HttpStatusCode.OK)
- {
- log.LogInformation("Logging is completed successfully with status code : " + response.StatusCode);
- }
- else
- {
- log.LogInformation("Logging is failed with status code : " + response.StatusCode);
- }
- }
PowerShell Sample
- $body = @{
- LogFileName = "Your-Custom-Log-File-In-LogAnalytics"
- AutomationName = "Your-Automation-Name"
- ModuleName = "Your-Automation-Module-Name"
- # You can define your own custom JSON for logging unstructured data. Use your own attributes in place of the ones below
- LogData = @{
- Attribute1 = "Some-Value"
- Attribute2 = "Some-Value"
- Attribute3 = "Some-Value"
- Attribute4 = "Some-Value"
- Attribute5 = "Some-Value"
- Attribute6 = "Some-Value"
- }
- }
- $body = $body | ConvertTo-Json
-
- $api = "Your-API-Uri"
- [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
- $headers = @{}
- $headers.Add("Ocp-Apim-Subscription-Key", "Your-API-Subscription-Key")
-
- # Invocation of API using RestMethod
- $response = Invoke-RestMethod -Uri $api -Method Post -Headers $headers -Body $body -ContentType "application/json"
The above implementations demonstrate that our API can accept and log information with different schemas as long as it is valid JSON. Additionally, it is very easy to integrate into different applications, modules, and platforms, since our logging method is now an API endpoint. Let's now view these logs in the Azure Log Analytics workspace.
You can now see how we benefit from defining just the base schema and leaving the application-specific details in a JSON field. Log Analytics is able to parse this JSON field and present it with clearly defined properties and values. If we need to search by a specific application or time, we have dedicated fields for that. However, if we want to query on the contents of the application's JSON log data itself, we run into some limitations in KQL and have to do some customization, such as parsing the field at query time. So my recommendation is to promote the primary search fields into their own top-level columns so that we can leverage the full out-of-the-box search capabilities offered by Log Analytics. The JSON-based field does give us flexibility in logging a variety of data, but the trade-off of a schema-less field needs to be evaluated carefully, as we definitely lose some search capability on it.
Conclusion
We have discussed and demonstrated a custom logging API built in C#, hosted on Azure Functions, and exposed via the Azure API Management service. The log schema should be an informed decision made before implementation, based on your environment, application requirements, and the purpose of logging. The complete code and implementation are available in our
GitHub Repository.
Happy Coding!