Implementation Of Face API Using Microsoft Azure And Visual Studio

In this article, I am going to show how to implement the Face API in C# with the help of Microsoft Azure and Visual Studio. We will create a WPF application that detects the faces in an image, draws a red frame around each face, and shows a description of a face in the status bar when the cursor moves over its frame.
 
Prerequisites
  • An active Azure Subscription.
  • Visual Studio 2015 or higher.
The flow of this article is as follows:
  • Creating the Face API in Azure Portal.
  • Accessing & Managing the Face API Keys in WPF Applications.
  • Creating the WPF Application using the Face API in Visual Studio 2017.
Follow these steps to create the Face API on the Azure portal.
 
Step 1
 
Sign in to the Azure portal.
 
Step 2

Press "+New", then choose "AI + cognitive services", and select "Face API".

 
Step 3
 
In the "Create Face API" blade, enter the name of your API, choose the desired subscription, and select the desired location for the API. Select the pricing tier appropriate for your use, and create (or choose) a resource group for the API. Then, press "Create".
 
Copy the endpoint and press "Show / Manage Keys" to display the keys that we will use in the WPF application.

 

Step 4
 
The "Manage Keys" blade shows the keys that are going to work as the access keys for our Face API. Click the "Copy" button and paste the key into Notepad for later use.
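Rather than hard-coding a key copied from Notepad into source control, it is safer to read it at run time. The sketch below shows one way to do this with environment variables; the variable names `FACE_API_KEY` and `FACE_API_ENDPOINT` are my own choice, not anything the portal or SDK prescribes.

```csharp
using System;

static class FaceApiConfig
{
    // Hypothetical helper: reads the Face API key and endpoint from
    // environment variables so they never appear in committed source code.
    public static string GetKey() =>
        Environment.GetEnvironmentVariable("FACE_API_KEY")
        ?? throw new InvalidOperationException("FACE_API_KEY is not set.");

    public static string GetEndpoint() =>
        Environment.GetEnvironmentVariable("FACE_API_ENDPOINT")
        ?? throw new InvalidOperationException("FACE_API_ENDPOINT is not set.");
}

class ConfigDemo
{
    static void Main()
    {
        // Simulate the variables being set outside the program.
        Environment.SetEnvironmentVariable("FACE_API_KEY", "sample-key");
        Environment.SetEnvironmentVariable("FACE_API_ENDPOINT",
            "https://westcentralus.api.cognitive.microsoft.com/face/v1.0");

        Console.WriteLine(FaceApiConfig.GetKey());
        Console.WriteLine(FaceApiConfig.GetEndpoint());
    }
}
```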


Implementing Face API in Visual Studio using C#
 
Step 5
 
Follow these steps to create a Windows-based WPF application in Visual Studio.
  • Open Visual Studio 2015 or 2017.
  • From the menu bar, click File >> New >> New Project.
  • In the New Project dialog, choose Installed --> Templates --> Visual C# --> Windows Classic Desktop --> WPF App (.NET Framework).
  • Enter FaceApiDemo as the name of our application and press OK. The new project opens in Solution Explorer.
 
 
Step 6
 
We need to add two packages to our WPF application. Right-click the project in Solution Explorer and choose "Manage NuGet Packages" to add them.

 
 
JSON.NET is a popular JSON framework for .NET applications. We will use it because the Face API returns its results as JSON documents describing the various details about the images.
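As a sketch of what that deserialization looks like, the snippet below parses a trimmed-down example of a Detect response with JSON.NET's `JsonConvert.DeserializeObject`. The `DetectedFace` and `FaceBox` classes here are simplified stand-ins of my own, not the real contract types (those come from the Microsoft.ProjectOxford.Face package added next).

```csharp
using System;
using Newtonsoft.Json;

// Simplified stand-ins for the Face API response shape.
public class DetectedFace
{
    public string FaceId { get; set; }
    public FaceBox FaceRectangle { get; set; }
}

public class FaceBox
{
    public int Left { get; set; }
    public int Top { get; set; }
    public int Width { get; set; }
    public int Height { get; set; }
}

class JsonDemo
{
    static void Main()
    {
        // A trimmed example of the JSON array the Detect endpoint returns.
        string json = @"[{""faceId"":""abc123"",
                          ""faceRectangle"":{""left"":10,""top"":20,""width"":100,""height"":100}}]";

        // JSON.NET matches camelCase JSON names to PascalCase properties by default.
        DetectedFace[] faces = JsonConvert.DeserializeObject<DetectedFace[]>(json);
        Console.WriteLine(faces[0].FaceRectangle.Width);  // prints 100
    }
}
```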

In the NuGet Package Manager, browse for Newtonsoft.Json and press "Install".

 

Also, browse for Microsoft.ProjectOxford.Face and press "Install". This package connects our application to the Face API over HTTPS; it provides a .NET client library that encapsulates the Face API REST requests.

 

Step 7
 
Copy and paste the following code into MainWindow.xaml. This is the layout code for our Windows application.
  1. <Window x:Class="FaceApiDemo.MainWindow"  
  2.         xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"  
  3.         xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"  
  4.         Title="MainWindow" Height="700" Width="960">  
  5.     <Grid x:Name="BackPanel">  
  6.         <Image x:Name="FacePhoto" Stretch="Uniform" Margin="0,0,0,50" MouseMove="FacePhoto_MouseMove" />  
  7.         <DockPanel DockPanel.Dock="Bottom">  
  8.             <Button x:Name="BrowseButton" Width="72" Height="20" VerticalAlignment="Bottom" HorizontalAlignment="Left"  
  9.                     Content="Browse..."  
  10.                     Click="BrowseButton_Click" />  
  11.             <StatusBar VerticalAlignment="Bottom">  
  12.                 <StatusBarItem>  
  13.                     <TextBlock Name="faceDescriptionStatusBar" />  
  14.                 </StatusBarItem>  
  15.             </StatusBar>  
  16.         </DockPanel>  
  17.     </Grid>  
  18. </Window>  
Step 8
 
Open MainWindow.xaml.cs and replace the code with this one.
  1. using System;  
  2. using System.Collections.Generic;  
  3. using System.IO;  
  4. using System.Text;  
  5. using System.Threading.Tasks;  
  6. using System.Windows;  
  7. using System.Windows.Input;  
  8. using System.Windows.Media;  
  9. using System.Windows.Media.Imaging;  
  10. using Microsoft.ProjectOxford.Common.Contract;  
  11. using Microsoft.ProjectOxford.Face;  
  12. using Microsoft.ProjectOxford.Face.Contract;  
  13.   
  14. namespace FaceApiDemo  
  15.   
  16. {  
  17.     public partial class MainWindow : Window  
  18.     {  
  19.         // Replace the first parameter with your valid subscription key.  
  20.         //  
  21.         // Replace or verify the region in the second parameter.  
  22.         //  
  23.         // You must use the same region in your REST API call as you used to obtain your subscription keys.  
  24.         // For example, if you obtained your subscription keys from the westus region, replace  
  25.         // "westcentralus" in the URI below with "westus".  
  26.         //  
  27.         // NOTE: Free trial subscription keys are generated in the westcentralus region, so if you are using  
  28.         // a free trial subscription key, you should not need to change this region.  
  29.         private readonly IFaceServiceClient faceServiceClient =  
  30.             new FaceServiceClient("keys", "Endpoints");  
  31.   
  32.         Face[] faces;                   // The list of detected faces.  
  33.         String[] faceDescriptions;      // The list of descriptions for the detected faces.  
  34.         double resizeFactor;            // The resize factor for the displayed image.  
  35.         //private object faceDescriptionStatusBar;  
  36.   
  37.         public MainWindow()  
  38.         {  
  39.             InitializeComponent();  
  40.         }  
  41.   
  42.         // Displays the image and calls Detect Faces.  
  43.   
  44.         private async void BrowseButton_Click(object sender, RoutedEventArgs e)  
  45.         {  
  46.             // Get the image file to scan from the user.  
  47.             var openDlg = new Microsoft.Win32.OpenFileDialog();  
  48.   
  49.             openDlg.Filter = "JPEG Image(*.jpg)|*.jpg";  
  50.             bool? result = openDlg.ShowDialog(this);  
  51.   
  52.             // Return if canceled.  
  53.             if (result != true)  
  54.             {  
  55.                 return;  
  56.             }  
  57.   
  58.             // Display the image file.  
  59.             string filePath = openDlg.FileName;  
  60.   
  61.             Uri fileUri = new Uri(filePath);  
  62.             BitmapImage bitmapSource = new BitmapImage();  
  63.   
  64.             bitmapSource.BeginInit();  
  65.             bitmapSource.CacheOption = BitmapCacheOption.None;  
  66.             bitmapSource.UriSource = fileUri;  
  67.             bitmapSource.EndInit();  
  68.   
  69.             FacePhoto.Source = bitmapSource;  
  70.   
  71.             // Detect any faces in the image.  
  72.             Title = "Detecting...";  
  73.             faces = await UploadAndDetectFaces(filePath);  
  74.             Title = String.Format("Detection Finished. {0} face(s) detected", faces.Length);  
  75.   
  76.             if (faces.Length > 0)  
  77.             {  
  78.                 // Prepare to draw rectangles around the faces.  
  79.                 DrawingVisual visual = new DrawingVisual();  
  80.                 DrawingContext drawingContext = visual.RenderOpen();  
  81.                 drawingContext.DrawImage(bitmapSource,  
  82.                     new Rect(0, 0, bitmapSource.Width, bitmapSource.Height));  
  83.                 double dpi = bitmapSource.DpiX;  
  84.                 resizeFactor = 96 / dpi;  
  85.                 faceDescriptions = new String[faces.Length];  
  86.   
  87.                 for (int i = 0; i < faces.Length; ++i)  
  88.                 {  
  89.                     Face face = faces[i];  
  90.   
  91.                     // Draw a rectangle on the face.  
  92.                     drawingContext.DrawRectangle(  
  93.                         Brushes.Transparent,  
  94.                         new Pen(Brushes.Red, 2),  
  95.                         new Rect(  
  96.                             face.FaceRectangle.Left * resizeFactor,  
  97.                             face.FaceRectangle.Top * resizeFactor,  
  98.                             face.FaceRectangle.Width * resizeFactor,  
  99.                             face.FaceRectangle.Height * resizeFactor  
  100.                             )  
  101.                     );  
  102.   
  103.                     // Store the face description.  
  104.                     faceDescriptions[i] = FaceDescription(face);  
  105.                 }  
  106.   
  107.                 drawingContext.Close();  
  108.   
  109.                 // Display the image with the rectangle around the face.  
  110.                 RenderTargetBitmap faceWithRectBitmap = new RenderTargetBitmap(  
  111.                     (int)(bitmapSource.PixelWidth * resizeFactor),  
  112.                     (int)(bitmapSource.PixelHeight * resizeFactor),  
  113.                     96,  
  114.                     96,  
  115.                     PixelFormats.Pbgra32);  
  116.   
  117.                 faceWithRectBitmap.Render(visual);  
  118.                 FacePhoto.Source = faceWithRectBitmap;  
  119.   
  120.                 // Set the status bar text.  
  121.                 faceDescriptionStatusBar.Text = "Place the mouse pointer over a face to see the face description.";  
  122.             }  
  123.         }  
  124.   
  125.         // Displays the face description when the mouse is over a face rectangle.  
  126.   
  127.         private void FacePhoto_MouseMove(object sender, MouseEventArgs e)  
  128.         {  
  129.             // If the REST call has not completed, return from this method.  
  130.             if (faces == null)  
  131.                 return;  
  132.   
  133.             // Find the mouse position relative to the image.  
  134.             Point mouseXY = e.GetPosition(FacePhoto);  
  135.   
  136.             ImageSource imageSource = FacePhoto.Source;  
  137.             BitmapSource bitmapSource = (BitmapSource)imageSource;  
  138.   
  139.             // Scale adjustment between the actual size and displayed size.  
  140.             var scale = FacePhoto.ActualWidth / (bitmapSource.PixelWidth / resizeFactor);  
  141.   
  142.             // Check if this mouse position is over a face rectangle.  
  143.             bool mouseOverFace = false;  
  144.   
  145.             for (int i = 0; i < faces.Length; ++i)  
  146.             {  
  147.                 FaceRectangle fr = faces[i].FaceRectangle;  
  148.                 double left = fr.Left * scale;  
  149.                 double top = fr.Top * scale;  
  150.                 double width = fr.Width * scale;  
  151.                 double height = fr.Height * scale;  
  152.   
  153.                 // Display the face description for this face if the mouse is over this face rectangle.  
  154.                 if (mouseXY.X >= left && mouseXY.X <= left + width && mouseXY.Y >= top && mouseXY.Y <= top + height)  
  155.                 {  
  156.                     faceDescriptionStatusBar.Text = faceDescriptions[i];  
  157.                     mouseOverFace = true;  
  158.                     break;  
  159.                 }  
  160.             }  
  161.   
  162.             // If the mouse is not over a face rectangle.  
  163.             if (!mouseOverFace)  
  164.                 faceDescriptionStatusBar.Text = "Place the mouse pointer over a face to see the face description.";  
  165.         }  
  166.   
  167.         // Uploads the image file and calls Detect Faces.  
  168.   
  169.         private async Task<Face[]> UploadAndDetectFaces(string imageFilePath)  
  170.         {  
  171.             // The list of Face attributes to return.  
  172.             IEnumerable<FaceAttributeType> faceAttributes =  
  173.                 new FaceAttributeType[] { FaceAttributeType.Gender, FaceAttributeType.Age, FaceAttributeType.Smile, FaceAttributeType.Emotion, FaceAttributeType.Glasses, FaceAttributeType.Hair };  
  174.   
  175.             // Call the Face API.  
  176.             try  
  177.             {  
  178.                 using (Stream imageFileStream = File.OpenRead(imageFilePath))  
  179.                 {  
  180.                     Face[] faces = await faceServiceClient.DetectAsync(imageFileStream, returnFaceId: true, returnFaceLandmarks: false, returnFaceAttributes: faceAttributes);  
  181.                     return faces;  
  182.                 }  
  183.             }  
  184.             // Catch and display Face API errors.  
  185.             catch (FaceAPIException f)  
  186.             {  
  187.                 MessageBox.Show(f.ErrorMessage, f.ErrorCode);  
  188.                 return new Face[0];  
  189.             }  
  190.             // Catch and display all other errors.  
  191.             catch (Exception e)  
  192.             {  
  193.                 MessageBox.Show(e.Message, "Error");  
  194.                 return new Face[0];  
  195.             }  
  196.         }  
  197.   
  198.         // Returns a string that describes the given face.  
  199.   
  200.         private string FaceDescription(Face face)  
  201.         {  
  202.             StringBuilder sb = new StringBuilder();  
  203.   
  204.             sb.Append("Face: ");  
  205.   
  206.             // Add the gender, age, and smile.  
  207.             sb.Append(face.FaceAttributes.Gender);  
  208.             sb.Append(", ");  
  209.             sb.Append(face.FaceAttributes.Age);  
  210.             sb.Append(", ");  
  211.             sb.Append(String.Format("smile {0:F1}%, ", face.FaceAttributes.Smile * 100));  
  212.   
  213.             // Add the emotions. Display all emotions over 10%.  
  214.             sb.Append("Emotion: ");  
  215.             EmotionScores emotionScores = face.FaceAttributes.Emotion;  
  216.             if (emotionScores.Anger >= 0.1f) sb.Append(String.Format("anger {0:F1}%, ", emotionScores.Anger * 100));  
  217.             if (emotionScores.Contempt >= 0.1f) sb.Append(String.Format("contempt {0:F1}%, ", emotionScores.Contempt * 100));  
  218.             if (emotionScores.Disgust >= 0.1f) sb.Append(String.Format("disgust {0:F1}%, ", emotionScores.Disgust * 100));  
  219.             if (emotionScores.Fear >= 0.1f) sb.Append(String.Format("fear {0:F1}%, ", emotionScores.Fear * 100));  
  220.             if (emotionScores.Happiness >= 0.1f) sb.Append(String.Format("happiness {0:F1}%, ", emotionScores.Happiness * 100));  
  221.             if (emotionScores.Neutral >= 0.1f) sb.Append(String.Format("neutral {0:F1}%, ", emotionScores.Neutral * 100));  
  222.             if (emotionScores.Sadness >= 0.1f) sb.Append(String.Format("sadness {0:F1}%, ", emotionScores.Sadness * 100));  
  223.             if (emotionScores.Surprise >= 0.1f) sb.Append(String.Format("surprise {0:F1}%, ", emotionScores.Surprise * 100));  
  224.   
  225.             // Add glasses.  
  226.             sb.Append(face.FaceAttributes.Glasses);  
  227.             sb.Append(", ");  
  228.   
  229.             // Add hair.  
  230.             sb.Append("Hair: ");  
  231.   
  232.             // Display baldness confidence if over 1%.  
  233.             if (face.FaceAttributes.Hair.Bald >= 0.01f)  
  234.                 sb.Append(String.Format("bald {0:F1}% ", face.FaceAttributes.Hair.Bald * 100));  
  235.   
  236.             // Display all hair color attributes over 10%.  
  237.             HairColor[] hairColors = face.FaceAttributes.Hair.HairColor;  
  238.             foreach (HairColor hairColor in hairColors)  
  239.             {  
  240.                 if (hairColor.Confidence >= 0.1f)  
  241.                 {  
  242.                     sb.Append(hairColor.Color.ToString());  
  243.                     sb.Append(String.Format(" {0:F1}% ", hairColor.Confidence * 100));  
  244.                 }  
  245.             }  
  246.   
  247.             // Return the built string.  
  248.             return sb.ToString();  
  249.         }  
  250.     }  
  251. }  
In line 30 of the code, enter the key and the endpoint URL that we acquired from the Azure Face API.
  1. private readonly IFaceServiceClient faceServiceClient =  
  2.         new FaceServiceClient("_key_", "End-point URL");  
Replace the key and the endpoint with the values from your Face API resource on the Azure portal.
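For concreteness, a filled-in constructor might look like the fragment below. The key shown is a made-up placeholder, and the endpoint follows the region-based URL pattern; substitute your own key and the region you chose when creating the resource.

```csharp
// Hypothetical values for illustration only -- use your own key and region.
private readonly IFaceServiceClient faceServiceClient =
    new FaceServiceClient(
        "0123456789abcdef0123456789abcdef",                            // Key 1 from the "Manage Keys" blade
        "https://westcentralus.api.cognitive.microsoft.com/face/v1.0"  // endpoint for the westcentralus region
    );
```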
 
Step 9

Save and start the program. Then, click the "Browse" button to select an image in which to detect faces.

 
 
Step 10
 
Wait a few seconds while the cloud API processes the image; red boxes are then drawn around the detected faces. Moving the cursor over a face rectangle shows the description of that face in the status bar.
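The mouse-over behavior in `FacePhoto_MouseMove` boils down to a scaled point-in-rectangle test: the face rectangle is reported in original-image pixels, so it must be multiplied by the display scale before comparing against the mouse position. A minimal, UI-free sketch of that math (all values are made up):

```csharp
using System;

public static class HitTestDemo
{
    // Returns true when a mouse point (in display coordinates) falls inside
    // a face rectangle reported in original-image pixel coordinates.
    public static bool IsMouseOverFace(double mouseX, double mouseY,
                                       int left, int top, int width, int height,
                                       double scale)
    {
        double l = left * scale, t = top * scale;
        double w = width * scale, h = height * scale;
        return mouseX >= l && mouseX <= l + w &&
               mouseY >= t && mouseY <= t + h;
    }

    static void Main()
    {
        // Face at (100, 50), 200x200 pixels in the source image,
        // with the image displayed at half size (scale = 0.5),
        // so the on-screen box runs from (50, 25) to (150, 125).
        Console.WriteLine(IsMouseOverFace(120, 80, 100, 50, 200, 200, 0.5));  // True: inside the scaled box
        Console.WriteLine(IsMouseOverFace(40, 20, 100, 50, 200, 200, 0.5));   // False: outside it
    }
}
```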

 
 
Conclusion
 
I hope this article has shown you how to create a Face API resource on the Azure portal and implement the Face API using C# in Visual Studio.