Azure Batch Service - Deploy a Batch Application Using the C# SDK

Before starting, you should know what the Azure Batch service is, so follow this URL. Once you have read it, create a Batch account with one storage account attached to it. You will then have the elements below.

From the Batch account,
  1. Batch account name
  2. Batch account key
  3. Batch account URL
From the storage account,
  1. name
  2. key
The code uploaded here does not contain the above-mentioned keys; you are expected to fill them in with your own.
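As a minimal sketch of how these five values are wired into the SDK client objects (the placeholder strings below are stand-ins for your own account values, and the class name is hypothetical):

```csharp
// Requires the Microsoft.Azure.Batch and WindowsAzure.Storage NuGet packages.
using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

class CredentialSetup
{
    // Placeholders -- replace with your own account values.
    private const string BatchAccountName = "<your-batch-account-name>";
    private const string BatchAccountKey = "<your-batch-account-key>";
    private const string BatchAccountUrl = "<your-batch-account-url>";
    private const string StorageAccountName = "<your-storage-account-name>";
    private const string StorageAccountKey = "<your-storage-account-key>";

    static void Main()
    {
        // Shared-key credentials for the Batch service
        BatchSharedKeyCredentials cred =
            new BatchSharedKeyCredentials(BatchAccountUrl, BatchAccountName, BatchAccountKey);

        // Storage account client, used later for the input/output blob containers
        CloudStorageAccount storageAccount = new CloudStorageAccount(
            new StorageCredentials(StorageAccountName, StorageAccountKey), useHttps: true);

        using (BatchClient batchClient = BatchClient.Open(cred))
        {
            // ... create pool, job, and tasks here ...
        }
    }
}
```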

One point to remember is that the Azure Batch service is a PaaS offering. You need to upload your batch application to it separately in order to use the service, so the Azure Batch service and batch applications are different entities. Once you get that point, you are ready to proceed.

The source code uploaded has two C# projects.
  1. AzureBatchService
    This is a console application and also the startup project. All of the Azure C# SDK coding has been done here. From this project we connect to Azure and upload everything to the existing Batch service. The project also contains three text files, which are the inputs to the batch application described below.

  2. TaskApplication
    This is the batch application. Every batch application consists of at least one file to be processed and one application to process it. This specific application was downloaded from a portal. It is also a console application; it processes text files and returns the MAX count of words occurring in those files. It therefore needs text files as input, and those text files are included in the first project mentioned above.
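To make the word-counting idea concrete, here is a hedged sketch of what a task application like TaskApplication might look like internally; this is an illustration, not the actual downloaded sample:

```csharp
// Reads the text files passed as command-line arguments and reports the
// word with the highest occurrence count across all of them.
using System;
using System.IO;
using System.Linq;

class TaskApplication
{
    static void Main(string[] args)
    {
        // Count word occurrences across all input text files
        var counts = args
            .SelectMany(path => File.ReadAllText(path)
                .Split(new[] { ' ', '\t', '\r', '\n' },
                       StringSplitOptions.RemoveEmptyEntries))
            .GroupBy(word => word.ToLowerInvariant())
            .ToDictionary(g => g.Key, g => g.Count());

        // Report the most frequent word (the MAX count)
        var top = counts.OrderByDescending(kv => kv.Value).First();
        Console.WriteLine($"Most frequent word: '{top.Key}' ({top.Value} occurrences)");
    }
}
```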
Our Goal

We will upload the batch application mentioned above (the second project) along with its input files and then verify the output returned. We will verify this in the Azure portal. For uploading to Azure, we use the first project mentioned in the previous section, implemented with the C# SDK.

Expected Output

We should have the below entities and behaviors in our Azure Batch service,
  1. Input text files uploaded to a separate blob container
  2. Output text file dropped into a separate blob container
  3. Batch application uploaded to a separate blob container
  4. Jobs, tasks, and pools created in the Batch account
  5. The assigned task should run our batch application with its input files, using the pools defined
  6. A scheduler created in the Batch account with the expected interval
  7. The task should run as defined in the scheduler, with the expected output
Now please download the source code. Below are the keys mentioned above; you need to use your own values here.
  // Update the Batch and Storage account credential strings below with the values unique to your accounts.
  // These are used when constructing connection strings for the Batch and Storage client objects.

  // Batch account credentials
  private const string BatchAccountName = "eagbatchservice";
  private const string BatchAccountKey = "";
  private const string BatchAccountUrl = "";

  // Storage account credentials
  private const string StorageAccountName = "batchservicesouthindia";
  private const string StorageAccountKey = "";
Below is the code for connecting to the Batch account and creating the entities.
  using (BatchClient batchClient = BatchClient.Open(cred))
  {
      // Create the pool that will contain the compute nodes that will execute the tasks.
      // The ResourceFile collection that we pass in is used for configuring the pool's StartTask,
      // which is executed each time a node first joins the pool (or is rebooted or reimaged).
      await CreatePoolIfNotExistAsync(batchClient, PoolId, applicationFiles);

      // Create the job that will run the tasks.
      await CreateJobAsync(batchClient, JobId, PoolId);

      // Add the tasks to the job. We need to supply a container shared access signature for the
      // tasks so that they can upload their output to Azure Storage.
      await AddTasksAsync(batchClient, JobId, inputFiles, outputContainerSasUrl);

      // Monitor task success/failure, specifying a maximum amount of time to wait for the tasks to complete.
      await MonitorTasks(batchClient, JobId, TimeSpan.FromMinutes(30));
  }
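One possible shape for the MonitorTasks helper called above is shown here; this is a sketch of the common pattern, and the actual helper in the sample may differ:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Common;

// Waits until all tasks in the job reach the Completed state,
// up to the given timeout.
private static async Task MonitorTasks(BatchClient batchClient, string jobId, TimeSpan timeout)
{
    // List the tasks that were added to the job
    List<CloudTask> tasks = batchClient.JobOperations.ListTasks(jobId).ToList();

    // Block until every task completes, or the timeout elapses
    TaskStateMonitor monitor = batchClient.Utilities.CreateTaskStateMonitor();
    await monitor.WhenAll(tasks, TaskState.Completed, timeout);

    Console.WriteLine("All tasks reached the Completed state.");
}
```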
Below is the screenshot showing that the batch application upload has completed. Don't delete any entities yet, as you still need to verify them.


Verifying the Three Containers


Verifying Pools


Verifying Scheduler

Verifying Jobs


Verifying Output


Now our uploaded batch application runs every two minutes with its input files, as scheduled. After verification you may delete the created pools to get rid of processing costs. As a follow-up, try doing all of this using an ARM template as well.
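The two-minute schedule mentioned above can be created with a CloudJobSchedule; the following is a sketch with hypothetical pool and schedule IDs, assuming `batchClient` is already open:

```csharp
using System;
using Microsoft.Azure.Batch;

// Kick off a new job every two minutes
Schedule schedule = new Schedule
{
    RecurrenceInterval = TimeSpan.FromMinutes(2)
};

// The JobSpecification ties each recurring job to the pool
JobSpecification jobSpec = new JobSpecification
{
    PoolInformation = new PoolInformation { PoolId = "MyPool" }  // hypothetical pool id
};

CloudJobSchedule jobSchedule = batchClient.JobScheduleOperations.CreateJobSchedule(
    "MyJobSchedule",  // hypothetical schedule id
    schedule,
    jobSpec);

await jobSchedule.CommitAsync();
```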
