Using Microsoft Cognitive Services With Xamarin.Forms

A few days back, I watched a great video on Channel 9 by James Montemagno and learned about Microsoft Cognitive Services. The services are impressive, and they are currently free (up to a certain number of calls). The sample application James showed in the video is built with Xamarin native, so I thought I would build the same application in Xamarin.Forms.

In this article, we will create a Xamarin.Forms mobile application that uses the following Microsoft Cognitive Services APIs: the Emotion API, the Face API, and the Bing Image Search API. James' sample app uses two of these services; I added one more as a value-add.

Before starting with the code, get your Microsoft Cognitive Services API keys (these steps may change if Microsoft updates its site):

  • Visit the Microsoft Cognitive Services site.
  • Click the ‘Get Started for Free’ button.
  • Click the ‘Let’s Go’ button on the next page.
  • Sign in with your Microsoft account (if you have one) or create a new one.
  • Once signed in, you will be presented with a screen containing all your Cognitive Services API keys. These keys are used to access the services, and Microsoft uses them to track your number of service calls.

Now, create a new Xamarin.Forms project and follow the steps given below.

As the code of the application is a bit long, I will share only the main code snippets. Because of this, you won’t be able to run the application just by following the steps and copying the code from this article. Instead, I suggest you read this article as an explanation of the full code available on my GitHub.

  • The best part of using Microsoft Cognitive Services is that there are NuGet packages specifically targeted at consuming them. The first step is to install the required packages, which are:

    • Newtonsoft.Json
    • Microsoft.Net.Http
    • Microsoft.ProjectOxford.Common
    • Microsoft.ProjectOxford.Emotion
    • Microsoft.ProjectOxford.Face

To add these, right-click the solution file and select the ‘Manage NuGet Packages for Solution’ option from the context menu.

  • Since I am reusing the code of James' sample, the project structure will be the same, and like that project, this application will also follow the MVVM architecture.

  • Create a folder named ‘Model’. Add a class file named ‘ImageResult.cs’, which is used to display the images returned by Bing Image Search. We keep this model class in a separate folder because it is an application-specific model, not a service-specific one.

  • Create a folder named ‘Services’. This folder will store the code related to web service calls.

  • Create another folder named ‘BingSearch’ inside the ‘Services’ folder. As the application invokes the Bing Image Search API directly, this folder will store the class/model files used to interact with it and manage its results. The classes inside this folder are:

    • Image
    • SearchResult
    • Suggestion
    • Thumbnail
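    As a rough sketch, the two models used later in the article might look like the following. The property names are inferred from the fields the view-model reads (Name, ContentUrl, ThumbnailUrl, HostPageUrl, EncodingFormat), and the `[JsonProperty("value")]` mapping is an assumption based on the shape of Bing's v5.0 JSON response; the repository's actual classes may differ.

    ```csharp
    using System.Collections.Generic;
    using Newtonsoft.Json;

    namespace MSCogServiceEx.Services.BingSearch
    {
        public class Image
        {
            public string Name { get; set; }           // image title shown in the list
            public string ContentUrl { get; set; }     // URL of the full-size image
            public string ThumbnailUrl { get; set; }   // URL of the thumbnail
            public string HostPageUrl { get; set; }    // page that hosts the image
            public string EncodingFormat { get; set; } // e.g. "jpeg" or "png"
        }

        public class SearchResult
        {
            // Bing v5.0 returns the result list under "value"; remapped here
            // so calling code can read result.Images (assumption).
            [JsonProperty("value")]
            public List<Image> Images { get; set; }
        }
    }
    ```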

  • Create a class named ‘EmotionService’ inside the ‘Services’ folder. It will contain the code given below, which is used to invoke the Microsoft Emotion Service API.
    using Microsoft.ProjectOxford.Emotion;
    using Microsoft.ProjectOxford.Emotion.Contract;
    using System;
    using System.IO;
    using System.Linq;
    using System.Threading.Tasks;

    namespace MSCogServiceEx.Services
    {
        public class EmotionService
        {
            private static async Task<Emotion[]> GetHappinessAsync(Stream stream)
            {
                var emotionClient = new EmotionServiceClient(CognitiveServicesKeys.Emotion);

                var emotionResults = await emotionClient.RecognizeAsync(stream);

                if (emotionResults == null || emotionResults.Count() == 0)
                {
                    throw new Exception("Can't detect face");
                }

                return emotionResults;
            }

            //Average happiness calculation in case of multiple people
            public static async Task<float> GetAverageHappinessScoreAsync(Stream stream)
            {
                Emotion[] emotionResults = await GetHappinessAsync(stream);

                float score = 0;
                foreach (var emotionResult in emotionResults)
                {
                    score = score + emotionResult.Scores.Happiness;
                }

                return score / emotionResults.Count();
            }

            public static string GetHappinessMessage(float score)
            {
                score = score * 100;
                double result = Math.Round(score, 2);

                if (score >= 50)
                    return result + "% :-)";
                else
                    return result + "% :-(";
            }
        }
    }
    As you can see from the code given above, the ‘GetHappinessAsync’ method sends the image stream to the Emotion Service and returns the emotion results. The remaining methods then compute and format the average happiness of the people present in the image.
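    As a hedged usage sketch (the calling class and the source of the stream are hypothetical, not code from the repository), a view-model could consume the service like this:

    ```csharp
    using System.IO;
    using System.Threading.Tasks;
    using MSCogServiceEx.Services;

    public static class EmotionDemo
    {
        // Pass any readable image stream (e.g. from the camera or photo picker);
        // returns the formatted happiness message produced by the service.
        public static async Task<string> DescribeHappinessAsync(Stream photoStream)
        {
            float score = await EmotionService.GetAverageHappinessScoreAsync(photoStream);
            return EmotionService.GetHappinessMessage(score);
        }
    }
    ```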

  • Create a class named ‘FaceService’ inside the ‘Services’ folder to contain the following code, which interacts with the Microsoft Face API:
    using Microsoft.ProjectOxford.Face;
    using Microsoft.ProjectOxford.Face.Contract;
    using System;
    using System.IO;
    using System.Linq;
    using System.Threading.Tasks;

    namespace MSCogServiceEx.Services
    {
        public class FaceService
        {
            public static async Task<FaceRectangle[]> UploadAndDetectFaces(Stream stream)
            {
                var fClient = new FaceServiceClient(CognitiveServicesKeys.FaceKey);

                var faces = await fClient.DetectAsync(stream);
                var faceRects = faces.Select(face => face.FaceRectangle);

                if (faceRects == null || faceRects.Count() == 0)
                {
                    throw new Exception("Can't detect the faces");
                }
                return faceRects.ToArray();
            }
        }
    }
    In this code, the Face service returns an array of FaceRectangle objects (a class containing the rectangular coordinates of each face present in the picture). The length of this array gives the number of faces present in the image.
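    A hedged sketch of how a caller might consume this service (the class below is hypothetical, not the repository's exact code):

    ```csharp
    using System.IO;
    using System.Threading.Tasks;
    using MSCogServiceEx.Services;

    public static class FaceDemo
    {
        // Returns a message with the number of faces detected in the photo stream,
        // using the array length as described above.
        public static async Task<string> CountFacesAsync(Stream photoStream)
        {
            var faceRects = await FaceService.UploadAndDetectFaces(photoStream);
            return $"Faces detected: {faceRects.Length}";
        }
    }
    ```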

  • Create another class named ‘CognitiveServicesKeys’ inside the ‘Services’ folder. This class stores the keys passed along with each service call (Microsoft uses the key to identify your API account and count the number of calls). As the number of calls is limited, I have not shared my keys; you can get your own by following the steps mentioned at the beginning of this article.
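    Based on the key names referenced throughout the code (CognitiveServicesKeys.Emotion, FaceKey, and BingSearch), this class is presumably a simple holder of constants, along these lines (the placeholder values are mine, not real keys):

    ```csharp
    namespace MSCogServiceEx.Services
    {
        // Placeholder keys: replace with the keys from your own
        // Cognitive Services account.
        public static class CognitiveServicesKeys
        {
            public const string Emotion = "YOUR_EMOTION_API_KEY";
            public const string FaceKey = "YOUR_FACE_API_KEY";
            public const string BingSearch = "YOUR_BING_SEARCH_API_KEY";
        }
    }
    ```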

  • Create a new folder named ‘ViewModel’. As we are following the MVVM design, this folder will store the view-models that interact with the UI.

  • Create a class named ‘ImageSearchViewModel’ inside the ‘ViewModel’ folder, implementing the ‘INotifyPropertyChanged’ interface. The main method of this class, which invokes the Bing Image Search API, is as follows:
    public async Task SearchForImages()
    {
        if (IsBusy)
            return;

        IsBusy = true;
        //Bing Image API
        var url = $"https://api.cognitive.microsoft.com/bing/v5.0/images/" +
                  $"search?q={searchString}" +
                  $"&count=20&offset=0&mkt=en-us&safeSearch=Strict";

        try
        {
            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", CognitiveServicesKeys.BingSearch);

                var json = await client.GetStringAsync(url);

                var result = JsonConvert.DeserializeObject<SearchResult>(json);

                Images = result.Images.Select(i => new ImageResult
                {
                    ContextLink = i.HostPageUrl,
                    FileFormat = i.EncodingFormat,
                    ImageLink = i.ContentUrl,
                    ThumbnailLink = i.ThumbnailUrl,
                    Title = i.Name
                }).ToList();
            }
        }
        catch (Exception ex)
        {
            //  ("Unable to query images: " + ex.Message);
        }
        finally
        {
            IsBusy = false;
        }
    }
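    For context, a hedged sketch of how the members this method relies on might be wired up inside ‘ImageSearchViewModel’. Only SearchString, IsBusy, Images, and GetImagesCommand appear in the article's code and XAML bindings; the helper method and exact wiring below are assumptions:

    ```csharp
    // Fragment assumed to live inside ImageSearchViewModel : INotifyPropertyChanged.
    string searchString;
    public string SearchString
    {
        get { return searchString; }
        set { searchString = value; OnPropertyChanged(nameof(SearchString)); }
    }

    public ICommand GetImagesCommand { get; }

    public ImageSearchViewModel()
    {
        // Xamarin.Forms Command bound to the Search button in the XAML view.
        GetImagesCommand = new Command(async () => await SearchForImages());
    }

    void OnPropertyChanged(string name) =>
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(name));
    ```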
  • Create another class named ‘EmotionViewModel’ inside the ‘ViewModel’ folder, implementing the ‘INotifyPropertyChanged’ interface, to manage the Emotion service UI view.
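    A photo plugin such as James Montemagno's Media Plugin (Plugin.Media) can supply the image stream for this view-model. As a hedged sketch (the method below is my assumption, not the repository's exact code), the take-photo flow might look like this; the bound property and command names match the EmotionEx XAML:

    ```csharp
    // Assumes the Plugin.Media NuGet package; called from TakePhotoCommand.
    async Task TakePhotoAndAnalyzeAsync()
    {
        var file = await Plugin.Media.CrossMedia.Current.TakePhotoAsync(
            new Plugin.Media.Abstractions.StoreCameraMediaOptions());
        if (file == null)
            return;

        IsBusy = true;
        try
        {
            // Show the captured photo and run it through the Emotion service.
            SelectedImage = ImageSource.FromStream(() => file.GetStream());
            float score = await Services.EmotionService.GetAverageHappinessScoreAsync(file.GetStream());
            Message = Services.EmotionService.GetHappinessMessage(score);
        }
        catch (Exception ex)
        {
            Message = ex.Message;
        }
        finally
        {
            IsBusy = false;
        }
    }
    ```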

  • Create another class named ‘FaceViewModel’ inside the ‘ViewModel’ folder, implementing the ‘INotifyPropertyChanged’ interface, to manage the Face service UI view.
  • Create a folder named ‘View’ to store the UI views of the application.

  • Create a file named ‘ImageSearch’ for Bing Image Search. Its XAML code is:
    <?xml version="1.0" encoding="utf-8" ?>
    <ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
                 xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
                 x:Class="MSCogServiceEx.View.ImageSearch">
      <StackLayout Padding="5" Spacing="5">
        <StackLayout Orientation="Horizontal">
          <Entry Text="{Binding SearchString}"/>
          <Button Text="Search" Command="{Binding GetImagesCommand}"/>
        </StackLayout>
        <ActivityIndicator IsRunning="{Binding IsBusy}" IsVisible="{Binding IsBusy}"/>
        <ListView ItemsSource="{Binding Images}" CachingStrategy="RecycleElement">
          <ListView.SeparatorColor>
            <OnPlatform x:TypeArguments="Color" iOS="Transparent"/>
          </ListView.SeparatorColor>
          <ListView.ItemTemplate>
            <DataTemplate>
              <ViewCell>
                <StackLayout Orientation="Horizontal" Padding="10,0,0,0">
                  <Image HeightRequest="50" WidthRequest="50"
                         Source="{Binding ThumbnailLink}"/>
                  <StackLayout Padding="10" Spacing="5">
                    <Label Text="{Binding Title}"
                           TextColor="#3498db"
                           Style="{DynamicResource ListItemTextStyle}"/>
                    <Label Text="{Binding FileFormat}"
                           Style="{DynamicResource ListItemDetailTextStyle}"/>
                  </StackLayout>
                </StackLayout>
              </ViewCell>
            </DataTemplate>
          </ListView.ItemTemplate>
        </ListView>
      </StackLayout>
    </ContentPage>
  • Create a file named ‘EmotionEx’ for the Emotion API. Its XAML code is:
    <?xml version="1.0" encoding="utf-8" ?>
    <ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
                 xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
                 x:Class="MSCogServiceEx.View.EmotionEx">
      <ContentPage.Resources>
        <ResourceDictionary>
          <Style TargetType="Button">
            <Setter Property="BorderRadius" Value="10" />
            <Setter Property="BorderWidth" Value="2" />
            <Setter Property="WidthRequest" Value="350" />
            <Setter Property="HeightRequest" Value="50" />
            <Setter Property="HorizontalOptions" Value="Center" />
            <Setter Property="VerticalOptions" Value="Center" />
            <Setter Property="FontSize" Value="Medium" />
            <Setter Property="BackgroundColor" Value="Blue" />
            <Setter Property="TextColor" Value="White" />
          </Style>
        </ResourceDictionary>
      </ContentPage.Resources>
      <ScrollView>
        <StackLayout Spacing="10" Padding="10" HorizontalOptions="Center">
          <Label Text="Emotion Service Example" FontSize="Large" FontAttributes="Bold" />
          <StackLayout Spacing="10">
            <Button Text="Take Photo to Check Emotion" Command="{Binding TakePhotoCommand}"/>
            <Button Text="Pick Photo to Check Emotion" Command="{Binding PickPhotoCommand}"/>
          </StackLayout>
          <ActivityIndicator IsRunning="{Binding IsBusy}" IsVisible="{Binding IsBusy}"/>
          <Label Text="{Binding Message}" FontSize="Large" FontAttributes="Bold" />
          <Image Source="{Binding SelectedImage}" />
        </StackLayout>
      </ScrollView>
    </ContentPage>
  • Create a file named ‘FaceEx’ for the Face API. Its XAML is similar to the ‘EmotionEx’ page above (buttons to take or pick a photo, an activity indicator, a message label, and an image), so it is not repeated here; see the full markup in the GitHub repository.
  • Inside the App class constructor, create the view objects, bind them to their view-models, and add them as tabs, as shown in the following code:
    public App()
    {
        var tabs = new TabbedPage
        {
            Title = "MS Cognitive Services",
            Children =
            {
                new ImageSearch() { Title = "Search Image", BindingContext = new ImageSearchViewModel() },
                new EmotionEx() { Title = "Emotion Ex.", BindingContext = new EmotionViewModel() },
                new FaceEx() { Title = "Face Ex.", BindingContext = new FaceViewModel() }
            }
        };

        MainPage = new NavigationPage(tabs)
        {
            BarBackgroundColor = Color.FromHex("3498db"),
            BarTextColor = Color.White
        };
    }

This is how the application looks on the iPhone Simulator.


So, this is how we can use Microsoft Cognitive Services with Xamarin.Forms. Let me know if I have missed anything.