Azure Cognitive Services - Anomaly Detection

In this article, we’ll use various services provided by Azure, such as the Anomaly Detector in Azure Cognitive Services, Azure Synapse Analytics, Azure Key Vault, and an Apache Spark pool, to perform Anomaly Detection.

Azure Cognitive Services  

Azure Cognitive Services is a set of services offered by Microsoft Azure that enables a wide range of organizations to build cognitive intelligence into applications using client library SDKs and REST APIs. It allows developers to integrate cognitive features into applications without prior expertise in Machine Learning, Data Science, or Artificial Intelligence. From Computer Vision to Natural Language Processing and Conversational AI, Azure Cognitive Services supports a diverse range of applications.

Anomaly Detection 

Anomaly detection is the analysis used to find outliers: observations or data points that deviate from the normal behaviour of a particular dataset, typically indicating errors or unusual activity. Numerous use cases, such as credit card fraud detection, event detection in networks, and cyber security intrusion identification, can be addressed with anomaly detection.
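To make the idea of an outlier concrete, here is a minimal sketch of one classic approach, flagging points that lie too many standard deviations from the mean (a z-score test). This is only an illustration of the concept; the Anomaly Detector service used later in this article applies more sophisticated time-series models.

```python
from statistics import mean, stdev

def zscore_outliers(values, threshold=3.0):
    """Flag points that deviate more than `threshold` standard deviations from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    return [abs(v - mu) / sigma > threshold for v in values]

# Six ordinary readings around 10 and one spike at 42.
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 42.0, 10.2]
flags = zscore_outliers(readings, threshold=2.0)
print(flags)  # -> [False, False, False, False, False, True, False]
```

The spike at 42 is the only point flagged; every other reading sits well within two standard deviations of the mean.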


Before we start with this tutorial on Anomaly Detection, we need to set up a few services in Azure. We will need an Azure Synapse workspace, an Anomaly Detector resource in Azure Cognitive Services, an Azure Key Vault, and linked services in Azure Synapse Analytics connecting the Key Vault and the Cognitive Service. Follow the listed articles in order and make sure these services are created before initiating the steps below for Anomaly Detection.

  1. Azure Synapse Analytics – Create a Synapse Workspace  
  2. Azure Synapse Analytics – Create Apache Spark Pool  
  3. Azure Cognitive Services – Create Anomaly Detector
  4. Azure Key Vault – Create Key Vault  
  5. Azure Synapse Analytics – Link Services 

Once the above services are created and set up, we can now proceed with the steps below.   

Step 1  

Visit the Azure Portal, where you’ll see your services running. Open the Synapse workspace you created. Here, mine is ojashworkspace.

Similarly, you’ll have the key vault, Anomaly Detector, data lake, and numerous other services running in the resource group, here ojash-rg.

Step 2  

Visit the Azure Synapse Analytics Workspace. The welcome screen will look similar to this.  

Furthermore, the workspace is connected to the Azure Key Vault and the Cognitive Service through the linked services created earlier.

Step 3 

In the Azure Synapse Workspace, you’ll have the Apache Pool running and the linked services connected with the Azure Key Vault, Azure Cognitive Service and Storage.   
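For reference, a Synapse linked service to Azure Key Vault is defined by a small JSON document like the sketch below. The name and vault URL here are illustrative placeholders, not values from this tutorial; in practice the Synapse Studio UI generates this definition for you.

```json
{
  "name": "KeyVaultLinkedService",
  "properties": {
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://<your-key-vault-name>.vault.azure.net/"
    }
  }
}
```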

Step 4 

Now, select Develop from the menu on the left.

Click on the + sign.  

Under it, select Import.  

Step 5 

Now, download the file to your system and, while importing, select this notebook file – prepare_anomaly-detector_data.

Step 6 

Now, as this file is imported, select the Apache Spark Pool.  

Now, you can run the query.  
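The imported notebook prepares a test table for the Anomaly Detector. As an illustrative sketch only (not the actual notebook contents), the rows it produces follow the shape the Predict-with-a-model wizard expects: a timestamp, a numeric value, and a group label per row. The group names here are hypothetical.

```python
from datetime import date
import random

random.seed(0)

# Build monthly time-series rows: one value per month for each series.
rows = []
for group in ("series_a", "series_b"):
    for month in range(1, 13):
        rows.append({
            "timestamp": date(2021, month, 1).isoformat(),
            "value": round(100 + random.uniform(-5, 5), 2),
            "group": group,
        })

print(len(rows), rows[0]["timestamp"])  # -> 24 2021-01-01
```

In the real notebook, rows like these are written out as a Spark table so they appear under the workspace’s Data section in the next step.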

Step 7 

Once the query is executed, we’ll see a Job Execution Success message.

During the process, you can check the job from Apache Spark Applications under the Activities section of the Monitor hub.

Step 8 

This will create a table named anomaly_detector_testing_data.

Now, we can refresh the workspace under the Data section.

We’ll find the table anomaly_detector_testing_data in the data lake that was created while we set up the Azure Synapse workspace.

Now, right-click on the table name > select Machine Learning > Predict with a model.

Step 9

Here, we select the Anomaly Detector.  

In the previous article, we worked with Sentiment Analysis on Azure Cognitive Services - Sentiment Analysis Tutorial.  

Step 10

Now, fill in the details. Select the Azure Cognitive Service which is linked. Under Granularity, choose Monthly. Under the Timestamp column, select timestamp (string). Next, choose value (double) under Time series value column. Finally, select group (string) under Grouping column.

Once everything is selected as above, click on Open Notebook.

Step 11

Now, here we have the prediction model code set up for anomaly detection.

Select your Apache Spark pool and run the code.

If the node size or number of vCores isn’t enough, you can switch by clicking on Manage pools under Apache Spark pools.

Below, I’ve upgraded from 4 vCores / 32 GB to 16 vCores / 128 GB.

Step 12 

Once the code has executed successfully, we can see the results in the View section.

Here, we’ve selected the Table view option.

Under isAnomaly, a value of True marks a detected anomaly. This will come in handy in numerous Machine Learning tasks as well as Business Analytics jobs for developing better, more accurate systems.
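As a small sketch of how you might consume these results downstream, the rows below are hypothetical examples shaped like the output table: the original columns plus the isAnomaly flag returned by the Anomaly Detector.

```python
# Hypothetical rows shaped like the prediction output table.
results = [
    {"timestamp": "2021-01-01", "value": 101.2, "group": "series_a", "isAnomaly": False},
    {"timestamp": "2021-02-01", "value": 99.8, "group": "series_a", "isAnomaly": False},
    {"timestamp": "2021-03-01", "value": 181.5, "group": "series_a", "isAnomaly": True},
]

# Keep only the flagged points for follow-up investigation.
anomalies = [r for r in results if r["isAnomaly"]]
print([r["timestamp"] for r in anomalies])  # -> ['2021-03-01']
```

Filtering on the flag like this is a typical first step before alerting on or visualizing the detected anomalies.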


Thus, in this article, we learned about Anomaly Detection and went through a step-by-step process to perform it on our own uploaded data by importing a notebook. Numerous other Azure services came into play to achieve this, from the Azure Synapse workspace, Azure Key Vault, and Azure Cognitive Services to Azure Data Lake Storage and the Apache Spark pool. Similarly, we can perform other machine learning analysis and predictions with various other offerings in Azure.
