Real World Cloud App - From Start To Finish - The Business Layer

In the third article in this series, I showed how to code the data layer for this cloud solution. Since I covered the business entities in the last article, in this one I will focus on queueing with Azure Service Bus and microservices with Azure Functions.

For years, I have been talking about using queues on both the client side and the server side to increase processing speed and improve the user experience. In this app, every time a user clicks on an advertisement link, my WinForms app will add the event information to a queue on the back-end for evaluation. Using a queue dramatically speeds this up, since no processing of the data is done when it is added to the queue.

Then a microservice, in this case, an Azure Function, will be triggered each time data is added to the queue. The function will then save the data into Cosmos DB. As you will see, using Azure Functions makes this process incredibly easy.


On a contract I worked on last year, I was able to decrease the time it took to process data from a mobile client to the back-end from 7-70 seconds to under 500 milliseconds by using queues and microservices. I discuss this more in my article titled “Are Microservices Just a New Marketing Term”.

ServiceBus Queue

Creating a Service Bus Queue is very simple. I first created a Service Bus namespace called “dotNetTips”, then created a queue called “adclicks”.

During the setup, I chose the following options.

  • Message time to live: I changed the default to 28 days. I figure that if my Azure Function is down for that amount of time, I have bigger things to worry about than the user clicking on an ad.
  • Lock duration: I set this to 3 minutes, since each item put in the queue will fire off a function instance.
  • Duplicate detection history: I set this to 10 minutes, but I doubt there will ever be a duplicate.
  • Maximum delivery count: I won’t be using this since I won’t be processing ad clicks in batches.


To secure the sending of items into the queue, I configured a Shared Access Policy that only allows sending. Later, when I wire up the client app for user testing, I will use the Primary Connection String to connect to this queue.
That’s all I had to do to set up the queue. Very, very easy!

Sending Ad Clicks to the Queue

An example of the AdClickMessage data serialized to JSON looks like this.

{
  "adId": "4baca6f3-8446-4729-b587-5e015f8cf12d",
  "clickedOn": "2019-01-05T20:26:55.8470841+00:00",
  "isoLanguage": "eng",
  "isoRegion": "USA"
}
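The data-layer article defines the AdClickMessage entity. As a reference, a minimal sketch of a class that round-trips the JSON above could look like the following; the property names and Newtonsoft.Json attributes are my assumptions based on the JSON, not necessarily the exact entity from that article.

```csharp
using System;
using Newtonsoft.Json;

// Hypothetical sketch of the AdClickMessage entity, inferred from the JSON
// above; the real entity is defined in the data-layer article.
public class AdClickMessage
{
    [JsonProperty("adId")]
    public Guid AdId { get; set; }

    [JsonProperty("clickedOn")]
    public DateTimeOffset ClickedOn { get; set; }

    [JsonProperty("isoLanguage")]
    public string IsoLanguage { get; set; }

    [JsonProperty("isoRegion")]
    public string IsoRegion { get; set; }
}
```

Calling JsonConvert.SerializeObject on an instance of this class produces JSON in the shape shown above.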

Since I have not yet written the endpoint that receives ad click data from the client, for now I am testing adding items to the queue from a unit test. First, I wrote a class to interact with a Service Bus queue. After installing the Microsoft.Azure.ServiceBus NuGet package, I wrote the following code:

public class Queue
{
    private readonly IQueueClient _queue;

    public Queue(string serviceBusConnectionString, string queueName)
    {
        _queue = new QueueClient(serviceBusConnectionString, queueName);
    }

    public void Send(string messageBody)
    {
        var message = new Message(Encoding.UTF8.GetBytes(messageBody));

        _queue.SendAsync(message).Wait();
    }
}

As you can see in the code above, once the connection to the queue is made, it takes only two lines of code to send a message. First, I create a new queue Message from the string passed to Send. Then I call SendAsync and I’m all done.
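With the Queue class in place, the unit test that adds an item boils down to something like this sketch; the variable names are mine, and the connection string is the one from the Send-only Shared Access Policy.

```csharp
// Sketch of the unit-test body: serialize an AdClickMessage and queue it.
// serviceBusConnectionString holds the Primary Connection String from the
// Send-only Shared Access Policy created earlier.
var queue = new Queue(serviceBusConnectionString, "adclicks");

var adClick = new AdClickMessage();
// ... populate the ad id, click time, language and region here ...

queue.Send(JsonConvert.SerializeObject(adClick));
```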

One thing to note: I found out the hard way that calling SendAsync with the await operator did not work from my unit test, most likely because the test method itself was not asynchronous. After removing await and adding Wait() to the end of the call, it started working.
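If you would rather keep the call asynchronous end to end, a variant like the sketch below could be added to the Queue class; the caveat (and my assumption about the symptom above) is that every caller, including the test method, must then be declared async and await the call.

```csharp
// Async variant of Send for the Queue class. Awaiting SendAsync from a
// caller that is not itself async (such as a synchronous test method) is
// the usual reason the await version appears not to work.
public async Task SendAsync(string messageBody)
{
    var message = new Message(Encoding.UTF8.GetBytes(messageBody));

    await _queue.SendAsync(message).ConfigureAwait(false);
}
```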

Azure Function

Now that ad click data can be queued, it’s time to write the Azure Function that will be fired each time an item is added so that it can be stored in Cosmos DB.

Creating a Function App in the Azure Portal is straightforward. I created one called “dotNetTipsAppAdsFunctions”, along with a new Resource Group and Storage Plan, in the West US 2 location. The rest of the settings I left at their defaults. Afterwards, I configured Application Insights so I can view traces, performance metrics and issues.

StorageQueue Function

I decided to write the Azure Function in a C# Visual Studio project. When creating the project, I chose to have this function triggered when data is added to a Service Bus queue called “adclicks”.

Since the data will be stored in Cosmos DB (see the third article in this series), the default function that Visual Studio creates is not ideal. As you will see, as part of the function definition, the data can be saved to Cosmos DB automatically.

Let’s go over what is happening in the definition for the ProcessAdClick function.

[FunctionName("ProcessAdClick")]
public static void Run(
    [ServiceBusTrigger("adclicks",
        Connection = "ServiceBusQueueConnectionString")]
    string queueItem,
    ILogger log,
    [CosmosDB(databaseName: "Ads", collectionName: "adclicks",
        CreateIfNotExists = true,
        ConnectionStringSetting = "AdsCosmosDBConnection")]
    out dynamic document)


The first part of the definition configures how the function auto-magically creates a connection to the Service Bus queue discussed earlier in this article.

  • QueueName: The first parameter is for the name of the Service Bus queue which in this case is “adclicks”.
  • Connection: Here, for security and configuration reasons, I am defining the name of the application setting that will hold the connection string.
  • queueItem: The ad click data from the Service Bus queue will be sent in this parameter as a string.

The ILogger parameter holds the reference to the logger; all entries written to it are stored in Application Insights.


Next comes the configuration for Cosmos DB, which will auto-magically store the ad click data in the database.

  • databaseName: The database the data will be stored in, which is “Ads”.
  • collectionName: The database collection, which is “adclicks”.
  • CreateIfNotExists: I have set this to true, just in case the collection has not been created yet.
  • ConnectionStringSetting: Here, for security and configuration reasons, I am setting this to the application setting “AdsCosmosDBConnection”.
  • document: This will be set to the AdClickMessage entity (see code below).
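Both setting names used in the function definition have to exist in the Function App’s application settings and, for local debugging, in local.settings.json. A sketch of the local file, with placeholder values, might look like this:

```
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "ServiceBusQueueConnectionString": "<Service Bus connection string>",
    "AdsCosmosDBConnection": "<Cosmos DB connection string>"
  }
}
```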

The Code

Since a lot of the work is being done by configuration, there is very little code in this function.

if (string.IsNullOrEmpty(queueItem) == false)
{
    log.LogInformation("Processing ad click {0}.", queueItem);

    document = JsonConvert.DeserializeObject<AdClickMessage>(queueItem);
}
else
{
    log.LogError("Could not process queue item.");
    document = null;

    throw new ArgumentNullException(nameof(queueItem), "Ad click cannot be null.");
}

If you take away the null check and the logging, there is only one line of code! I simply convert the JSON string to an AdClickMessage entity and assign it to the document out parameter. That’s it! The Cosmos DB binding takes care of connecting and saving it to the database. It can’t get any simpler than that.

I initially had issues with this function because I strongly typed queueItem as AdClickMessage. I assumed that the function runtime would take care of deserializing the JSON, but I could not get that to work. I then changed queueItem to a string and deserialized it myself using JsonConvert.


Deploying a simple Azure Function like this is easy using Visual Studio. First, go to the Function App in the Azure Portal and download the publish profile. Then right-click the project and select Publish. In the “Pick a publish target” window, select Import Profile and choose the profile that you saved. At this point, Visual Studio will perform the first deployment of the function.

I then go into Profile Settings and select the settings I want. After that, I go into Manage Application Settings and configure the settings for AdsCosmosDBConnection and ServiceBusQueueConnectionString. Then, I hit “Publish”, which builds the project again and deploys it to Azure. As soon as that is done, which takes only a few seconds, the function is running!

Monitoring with Application Insights

Since I set up Application Insights for this Function App, it’s easy to monitor, including errors. In the Azure Portal, I go to Application Insights and navigate to dotNetTipsAppAdsFunctions. After selecting it, I can drill into features such as metrics, failures, performance, search and more. In Search, I can easily see a graphical view along with information on exceptions, traces, dependency events and more.

In Failures, I can easily see and drill down into errors for a specific time frame. This is where I found out I had a serialization issue, after viewing this message: “Exception while executing function: ProcessAdClick Exception binding parameter 'queueItem' There was an error deserializing the object of type dotNetTips.App.Ads.Entities.Models.AdClickMessage. The input source is not correctly formatted.”
Compared to viewing logs in AWS CloudWatch, Application Insights blows CloudWatch out of the water! The logs in CloudWatch are unusable via the AWS portal; everyone I talked to about CloudWatch says the same thing. Essentially, you will need to pay for a service like Logentries, which starts at $48 a month, to view and monitor them, whereas in Azure this comes for free.

My only complaint about Application Insights is that it exposes so much data that it’s difficult to know where to go to view what I need, but I’m sure I will get better at it the more I use it. Also, I hope they start saving configuration settings, like the time range, so that the next time I go back into a view, Application Insights remembers it.

Changes & Issues

The publication of this article was delayed due to issues I ran into. I’m documenting them here in case you run into them too.

  1. At some point, I was unable to publish the function anymore. I’m not sure if something changed in Azure or I changed something, but I spent weeks trying to figure it out. In the end, deleting the Function App in the Azure Portal and then recreating it overcame the issue.
  2. I did cause an issue by putting the wrong application configuration setting in the definition of the function. This is what I get for copying and pasting code from the internet!
  3. I also had an issue with the unit tests when using the connection string for the Service Bus from the Azure Portal. The Primary Connection String shown in the portal does not work as-is with the Service Bus client in C#. From the portal, it looks like this: Endpoint=sb://;SharedAccessKeyName=SendPolicy;SharedAccessKey=Prbum/XZNcNeaCwQ=;EntityPath=adclicks. It does not work until EntityPath is removed, since the entity (queue) name is a separate parameter when connecting to a queue.
  4. When I tried to test processing the data from the queue in the function, it took me a while to figure out how to add test messages to the queue (other than using a unit test). First, I went to the Portal; there is no way to do it there. I then fired up the Azure Storage Explorer, but for some reason it does not support Service Bus (something they REALLY need to fix). In the end, I just relied on firing off my unit test that adds an ad click to the queue. In contrast, the Visual Studio add-in for AWS makes adding and deleting messages very easy.
  5. I tried to update my Newtonsoft.Json NuGet package to the latest version. This caused lots of build issues, since one of the NuGet packages in the solution supports only an older version. So I had to back it out and keep version 11.0.2. I call this “living in NuGet hell”.
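The EntityPath problem in item 3 can also be handled in code instead of editing the string by hand. The Microsoft.Azure.ServiceBus package includes a ServiceBusConnectionStringBuilder type that splits the two apart; in this sketch, portalConnectionString is a placeholder for the string copied from the portal.

```csharp
// The portal's Primary Connection String ends in ";EntityPath=adclicks",
// but the queue name must be passed as its own parameter in C#.
// ServiceBusConnectionStringBuilder parses the string and separates the two.
var builder = new ServiceBusConnectionStringBuilder(portalConnectionString);

var queue = new Queue(builder.GetNamespaceConnectionString(), builder.EntityPath);
```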

Another issue I had was learning how to code using Azure Service Bus queues, Cosmos DB and Azure Functions. Azure is changing so rapidly that anything written goes out of date very quickly. I’m sure even the code in this article won’t work six months from now, but I hope it does.

Here are some of the things I have changed since I released the architecture and design article:

  1. I decided to use a Service Bus queue instead of a regular Azure Storage queue since it has more capabilities.


Learning curve aside, creating and working with queues and microservices is so much easier in Azure when compared to AWS. Whether you are on the Microsoft stack or not, your team needs to spend time looking into it on Azure. I would also seek out developers that have used both for their opinion and guidance. Cost numbers on a spreadsheet do not tell the entire story!

If you are looking at orchestrating (tying together in a workflow) your microservices, then Azure durable functions are hands down the way to go! I want to write an article on them later since they overcome most of the complexities that I guarantee you will encounter if you choose a cloud service other than Azure.

I would like to thank Azure MVP Sam Cogan along with a few others for helping me with this article.


  1. Microsoft Azure Developer: Create Serverless Functions Pluralsight course.
  2. Source for this solution can be found here.
  3. Utility Dev App - This is the app I will be connecting to this cloud app at the end of this series. I hope you will check it out.

Next Article

In the next article, I will be coding the communications layer, so check back here soon. Please leave any comments below.

McCarter Consulting
Software architecture, code & app performance, code quality, Microsoft .NET & mentoring. Available!