Image Classification - Face Detection And Identification Using Azure Face API Cognitive Service

Preface

 
This article series is a complete end-to-end tutorial that explains the concept of face recognition and face detection using a modern AI-based Azure cognitive service, i.e. the Azure Face API service.
 

Introduction

 
This article gives a walkthrough of a face classification application that performs face detection, identification, grouping, and finding look-alike faces. Readers are expected to go through the prior three articles before reading this one.
 
 

Tutorial Series

 
The entire series on learning the Face API cognitive service is divided into four parts. The first part focuses on Azure Functions, serverless computing, and creating and testing the Face API on the Azure portal.
 
The second part explains the use of the Face API SDK. The third part focuses on face identification, where person groups are created and faces are identified by training the model, i.e. via machine learning techniques. The fourth part is the most interesting one: it gives a walkthrough of the face classification application, which performs face detection, identification, grouping, and finding look-alike faces. Following is the four-part series.
  1. Face API Cognitive Service Day 1: Face API on Azure Portal
  2. Face API Cognitive Service Day 2: Exploring Face API SDK
  3. Face API Cognitive Service Day 3: Face Identification using Face API
  4. Face API Cognitive Service Day 4: Face classification app using Face API.

Face Classification App

 
Getting the Code
 
I have already created the app, and you can get it from the downloaded source code or from the Git URL: https://github.com/akhilmittal/Face-API. In this app we’ll perform operations like creating person groups and persons, detecting and identifying faces, verifying faces, grouping faces, and finding look-alike (i.e. similar-looking) faces.
  1. Go to the Git URL and click on the Clone or download button to get the Git URL of the code. Copy that URL.

  2. Open the command prompt, but before that make sure Git is installed on your computer. Move to the directory where you want to fetch the source code and run git clone <git URL> there on the command prompt.

  3. Once the cloning is done, it's time to open the application. I am opening it in VS Code, so if you are using VS Code, just go into the fetched code directory from the command prompt and type the command “code .”. This will open VS Code with the fetched code.

  4. Let’s install the packages before we get started. In the command window, type npm install to install all the packages needed by the application.


Set up the code

  1. Once the code is downloaded, opened in the code editor, and the packages are installed, we can move ahead to see what lies within. Open the face-api-service.service.ts file and, at the top of the file, provide your base URL for the baseUrl variable.

  2. Similarly, provide the key for your API in the Ocp-Apim-Subscription-Key field at the end of the same file, as shown in the snippet below.

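Taken together, these are the only two values you need to change in face-api-service.service.ts. A sketch with placeholder values (use the endpoint region and key of your own Face API resource):

// Near the top of the service class:
private baseUrl = 'https://<your-region>.api.cognitive.microsoft.com/face/v1.0';

// At the bottom of the file:
const httpOptions = {
  headers: new HttpHeaders({
    'Content-Type': 'application/json',
    'Ocp-Apim-Subscription-Key': '<your-face-api-key>'
  })
};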
Code for the file is as follows,
import { Injectable } from '@angular/core';
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Observable } from 'rxjs/Observable';
import 'rxjs/add/operator/mergeMap';
import 'rxjs/add/observable/forkJoin';
import 'rxjs/add/observable/of';

@Injectable()
export class FaceApiService {

  private baseUrl = 'https://centralindia.api.cognitive.microsoft.com/face/v1.0';

  constructor(private http: HttpClient) { }

  // ***** Person Group Operations *****

  getPersonGroups() {
    return this.http.get<any[]>(`${this.baseUrl}/persongroups`, httpOptions);
  }

  createPersonGroup(personGroup) {
    return this.http.put<any[]>(`${this.baseUrl}/persongroups/${personGroup.personGroupId}`, personGroup, httpOptions);
  }

  deletePersonGroup(personGroupId) {
    return this.http.delete(`${this.baseUrl}/persongroups/${personGroupId}`, httpOptions);
  }

  trainPersonGroup(personGroupId) {
    return this.http.post<any[]>(`${this.baseUrl}/persongroups/${personGroupId}/train`, null, httpOptions);
  }

  getPersonGroupTrainingStatus(personGroupId) {
    return this.http.get<any>(`${this.baseUrl}/persongroups/${personGroupId}/training`, httpOptions);
  }

  // ***** Persons Operations *****

  getPersonsByGroup(personGroupId) {
    return this.http.get<any[]>(`${this.baseUrl}/persongroups/${personGroupId}/persons`, httpOptions);
  }

  getPerson(personGroupId, personId) {
    return this.http.get<any[]>(`${this.baseUrl}/persongroups/${personGroupId}/persons/${personId}`, httpOptions);
  }

  // ***** Person Operations *****

  createPerson(personGroupId, person) {
    return this.http.post<any>(`${this.baseUrl}/persongroups/${personGroupId}/persons`, person, httpOptions);
  }

  deletePerson(personGroupId, personId) {
    return this.http.delete<any[]>(`${this.baseUrl}/persongroups/${personGroupId}/persons/${personId}`, httpOptions);
  }

  // ***** Person Face Operations *****

  getPersonFaces(personGroupId, personId) {
    return this.http.get<any>(`${this.baseUrl}/persongroups/${personGroupId}/persons/${personId}`, httpOptions).flatMap(person => {
      let obsList = [];
      if (person.persistedFaceIds.length) {
        for (const faceId of person.persistedFaceIds) {
          obsList.push(this.getPersonFace(personGroupId, personId, faceId));
        }
        return Observable.forkJoin(obsList);
      } else {
        return Observable.of([]);
      }
    });
  }

  getPersonFace(personGroupId, personId, faceId) {
    return this.http.get(`${this.baseUrl}/persongroups/${personGroupId}/persons/${personId}/persistedfaces/${faceId}`, httpOptions);
  }

  addPersonFace(personGroupId, personId, url) {
    return this.http.post<any>(`${this.baseUrl}/persongroups/${personGroupId}/persons/${personId}/persistedfaces?userData=${url}`, { url: url }, httpOptions);
  }

  deletePersonFace(personGroupId, personId, faceId) {
    return this.http.delete(`${this.baseUrl}/persongroups/${personGroupId}/persons/${personId}/persistedfaces/${faceId}`, httpOptions);
  }

  // ***** Face List Operations *****

  createFaceList(faceListId) {
    return this.http.put(`${this.baseUrl}/facelists/${faceListId}`, { name: faceListId }, httpOptions);
  }

  addFace(faceListId, url) {
    return this.http.post(`${this.baseUrl}/facelists/${faceListId}/persistedFaces`, { url: url }, httpOptions);
  }

  // ***** Face Operations *****

  detect(url) {
    return this.http.post<any[]>(`${this.baseUrl}/detect?returnFaceLandmarks=false&returnFaceAttributes=age,gender,smile,glasses,emotion,facialHair`, { url: url }, httpOptions);
  }

  identify(personGroupId, faceIds) {
    let request = {
      personGroupId: personGroupId,
      faceIds: faceIds,
      confidenceThreshold: 0.4
    };
    return this.http.post<any[]>(`${this.baseUrl}/identify`, request, httpOptions);
  }

  group(faceIds) {
    return this.http.post<any>(`${this.baseUrl}/group`, { faceIds: faceIds }, httpOptions);
  }

  findSimilar(faceListId, faceId) {
    let request = { faceId: faceId, faceListId: faceListId };
    return this.http.post<any>(`${this.baseUrl}/findsimilars`, request, httpOptions);
  }
}

// private (non-exported)

const httpOptions = {
  headers: new HttpHeaders({
    'Content-Type': 'application/json',
    'Ocp-Apim-Subscription-Key': '<key>'
  })
};
If you look closely at the code, all the face operations that we performed previously are defined here. They just need to be called from the UI, which is exactly what the application does.
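For instance, any component can inject the service and call one of these methods. Here is a minimal sketch (a hypothetical component method, not code from the repo) that detects faces in an image URL and logs what came back:

// Hypothetical usage of FaceApiService (illustrative only)
constructor(private faceApi: FaceApiService) { }

checkImage(url: string) {
  this.faceApi.detect(url).subscribe(faces => {
    console.log(`Detected ${faces.length} face(s)`);
    // age and gender are present because detect() requests them via returnFaceAttributes
    faces.forEach(f => console.log(f.faceId, f.faceAttributes.age, f.faceAttributes.gender));
  });
}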
 

Compile and Run the Application

 
It is an Angular application, so you can run it from the VS Code terminal or from the command window as well. I am running it from the command window. So, type the ng serve command on the command window and press Enter.
 
 
Once the application is compiled and the server is running, you’ll get the URL of the application. In my case, it is running on localhost at port 4200.
 
 
Copy that URL and open it in the browser. We see the application running. The home page has Set-up and Face Detection buttons.
 
 

Create a person group

  1. Click on Set-up and add a person group. Note that the UI is bound to all the API calls in the background. We already explored all the API calls for creating persons, groups, and faces, so just go through the application code to explore the files and see how they are bound to make calls to the API.
  2. Add a Person Group and name it family.

  3. Once the person group is created and shown, add persons to that person group.


Create a person

 
Once a person is added, add a new face to that person.
 
 

Add face

 
In the “Add Face” popup, provide the URL of an image of the person’s face and click save. The added face then shows up for that person.
 
 
Similarly, add more persons to that person group. For example, I added Akhil Mittal, Arsh, and Udeep as persons, with three faces for Akhil Mittal, four faces for Arsh, and three faces for Udeep.
 

Train the person group

 
Now, if you remember, the next thing we did after adding person groups, persons, and faces was to train the model. So, click on Train Model, which in the background is bound to the train API endpoint; it will train our person group model and make it ready for detection and identification. Once you hit the “Train Model” button, you see the “Training initiated” message.
 
 
Once training is done, you see the “Training Succeeded” message when you press the “Check Training Status” button.
 
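If you would rather not keep pressing “Check Training Status”, a small helper can poll the status until training finishes. This is a minimal sketch, not part of the sample app; it assumes the FaceApiService shown earlier is injected as faceApi:

// Hypothetical helper: re-checks the training status every 2 seconds until it is no
// longer 'notstarted' or 'running' (the Face API then reports 'succeeded' or 'failed').
pollTrainingStatus(personGroupId: string) {
  this.faceApi.getPersonGroupTrainingStatus(personGroupId).subscribe(result => {
    if (result.status === 'notstarted' || result.status === 'running') {
      setTimeout(() => this.pollTrainingStatus(personGroupId), 2000);
    } else {
      console.log('Training finished with status:', result.status);
    }
  });
}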
 

Code for all the operations

 
In configuration.component.ts we have the component that performs all these operations.
import { Component, OnInit } from '@angular/core';
import { FaceApiService } from '../services/face-api-service.service';
import { InputBoxService } from '../input-box/input-box.service';
import * as _ from 'lodash';
import { ToasterService } from 'angular2-toaster';

@Component({
  selector: 'app-configuration',
  templateUrl: './configuration.component.html',
  styleUrls: ['./configuration.component.css']
})
export class ConfigurationComponent implements OnInit {
  public loading = false;
  public personFaces = [];
  public personGroups = [];
  public personList = [];
  public selectedGroupId = '';
  public selectedPerson: any;

  constructor(private faceApi: FaceApiService, private inputBox: InputBoxService, private toastr: ToasterService) { }

  ngOnInit() {
    this.faceApi.getPersonGroups().subscribe(data => this.personGroups = data);
  }

  addPersonGroup() {
    this.inputBox.show('Add Person Group', 'Person Group Name:').then(result => {
      let newPersonGroup = { personGroupId: _.kebabCase(result), name: result };
      this.faceApi.createPersonGroup(newPersonGroup).subscribe(data => {
        this.personGroups.push(newPersonGroup);
        this.selectedGroupId = newPersonGroup.personGroupId;
        this.onGroupsChange();
      });
    });
  }

  deletePersonGroup() {
    this.faceApi.deletePersonGroup(this.selectedGroupId).subscribe(() => {
      _.remove(this.personGroups, x => x.personGroupId === this.selectedGroupId);
      this.selectedGroupId = '';
    });
  }

  onGroupsChange() {
    if (this.selectedGroupId) {
      this.loading = true;
      this.faceApi.getPersonsByGroup(this.selectedGroupId).subscribe(data => {
        this.personList = data;
        this.selectedPerson = null;
        this.personFaces = [];
        this.loading = false;
      });
    }
  }

  personClick(person) {
    this.selectedPerson = person;
    this.faceApi.getPersonFaces(this.selectedGroupId, this.selectedPerson.personId).subscribe(data => {
      this.personFaces = data;
    });
  }

  addPerson() {
    this.inputBox.show('Add Person', 'Person Name:').then(result => {
      let newPerson: any = { name: result };
      this.faceApi.createPerson(this.selectedGroupId, { name: result }).subscribe(data => {
        newPerson.personId = data.personId;
        this.personList.push(newPerson);
        this.selectedPerson = newPerson;
      });
    });
  }

  deletePerson(personId) {
    this.faceApi.deletePerson(this.selectedGroupId, this.selectedPerson.personId).subscribe(() => {
      _.remove(this.personList, x => x.personId === this.selectedPerson.personId);
      this.selectedPerson = null;
    });
  }

  addPersonFace() {
    this.inputBox.show('Add Face', 'URL:').then(result => {
      this.faceApi.addPersonFace(this.selectedGroupId, this.selectedPerson.personId, result).subscribe(data => {
        let newFace = { persistedFaceId: data.persistedFaceId, userData: result };
        this.personFaces.push(newFace);
      });
    });
  }

  deletePersonFace(persistedFaceId) {
    this.faceApi.deletePersonFace(this.selectedGroupId, this.selectedPerson.personId, persistedFaceId).subscribe(() => {
      _.remove(this.personFaces, x => x.persistedFaceId === persistedFaceId);
    });
  }

  trainPersonGroup() {
    this.loading = true;
    this.faceApi.trainPersonGroup(this.selectedGroupId).subscribe(() => {
      this.toastr.pop('info', 'Training Initiated', 'Training has been initiated...');
      this.loading = false;
    });
  }

  getGroupTrainingStatus() {
    this.loading = true;
    this.faceApi.getPersonGroupTrainingStatus(this.selectedGroupId).subscribe(result => {
      switch (result.status) {
        case 'succeeded':
          this.toastr.pop('success', 'Training Succeeded');
          break;
        case 'running':
          this.toastr.pop('info', 'Training still in progress...', 'Check back later');
          break;
        case 'failed':
          this.toastr.pop('error', 'Error during Training', result.message);
          break;
        default:
          break;
      }
      this.loading = false;
    });
  }
}
It keeps track of all the IDs and performs the necessary operations with those IDs on button clicks.
 
 

Face Detection

 
On the top right side of the application, you can find the Face Recognition tab, which has a submenu with Face Detection, Face Grouping, and Look-alike Faces options. Click on Face Detection.
 
 
Selecting the Face Detection option opens a screen where you provide the image on which faces need to be detected. Put the URL of the image in the Image URL text box and click on Detect. Note that I have used the same image that I used initially with the API to detect faces. This time again, the same API call is made, and we see the detected faces marked with yellow squares.
 
 
Following is the code for face-tester.component.html under the src->app->face-tester folder.
<div class="container">
  <ngx-loading [show]="loading" [config]="{ backdropBorderRadius: '14px' }"></ngx-loading>

  <div class="card">
    <h3 class="card-header">Test Faces</h3>
    <div class="card-body">

      <div class="form-group">
        <label>Person Group</label>
        <select [(ngModel)]="selectedGroupId" name="personGroups" class="form-control">
          <option value="">(Select)</option>
          <option *ngFor="let group of personGroups" [value]="group.personGroupId">
            {{group.name}} ({{group.personGroupId}})
          </option>
        </select>
      </div>
      <div class="form-group">
        <label>Image URL:</label>
        <input type="text" class="form-control" name="groupName" [(ngModel)]="imageUrl">
      </div>

      <button class="btn btn-primary mr-sm-2" (click)="detect()">Detect</button>
      <button class="btn btn-primary" (click)="identify()">Identify</button>

      <hr/>

      <div *ngIf="selectedFace" class="text-primary">
        <pre class="text-primary">{{selectedFace | json}}</pre>
      </div>
      <div *ngIf="selectedFace && selectedFace.identifiedPerson">
        <ngb-alert>
          Subject Identified: {{selectedFace.name}}
        </ngb-alert>
      </div>
    </div>
  </div>

  <div class="card">

    <div class="mainImgContainer" *ngIf="imageUrl">
      <img #mainImg class="card-img main-img" [src]="imageUrl" (load)="imageLoaded($event)" />

      <div [ngClass]="{'face-box-green': item.identifiedPerson, 'face-box-yellow': !item.identifiedPerson}" *ngFor="let item of detectedFaces"
        (click)="faceClicked(item)" [style.top.px]="item.faceRectangle.top * multiplier" [style.left.px]="item.faceRectangle.left * multiplier"
        [style.height.px]="item.faceRectangle.height * multiplier" [style.width.px]="item.faceRectangle.width * multiplier"></div>

    </div>
  </div>
</div>
Code for face-tester.component.ts is below.
import { Component, OnInit, ViewChild } from '@angular/core';
import { FaceApiService } from '../services/face-api-service.service';
import * as _ from 'lodash';
import { forkJoin } from 'rxjs/observable/forkJoin';

@Component({
  selector: 'app-face-tester',
  templateUrl: './face-tester.component.html',
  styleUrls: ['./face-tester.component.css']
})
export class FaceTesterComponent implements OnInit {
  loading = false;
  public detectedFaces: any;
  public identifiedPersons = [];
  public imageUrl: string;
  public multiplier: number;
  public personGroups = [];
  public selectedFace: any;
  public selectedGroupId = '';
  @ViewChild('mainImg') mainImg;

  constructor(private faceApi: FaceApiService) { }

  ngOnInit() {
    this.loading = true;
    this.faceApi.getPersonGroups().subscribe(data => {
      this.personGroups = data;
      this.loading = false;
    });
  }

  detect() {
    this.loading = true;
    this.faceApi.detect(this.imageUrl).subscribe(data => {
      this.detectedFaces = data;
      console.log('**detect results', this.detectedFaces);
      this.loading = false;
    });
  }

  faceClicked(face) {
    this.selectedFace = face;
    if (this.selectedFace.identifiedPersonId) {
      let identifiedPerson = _.find(this.identifiedPersons, { 'personId': face.identifiedPersonId });
      this.selectedFace.name = identifiedPerson.name;
    }
  }

  identify() {
    let faceIds = _.map(this.detectedFaces, 'faceId');
    this.loading = true;

    // NOTE: for Production app, max groups of 10
    this.faceApi.identify(this.selectedGroupId, faceIds).subscribe(identifiedFaces => {
      console.log('**identify results', identifiedFaces);
      let obsList = [];

      _.forEach(identifiedFaces, identifiedFace => {
        if (identifiedFace.candidates.length > 0) {
          let detectedFace = _.find(this.detectedFaces, { faceId: identifiedFace.faceId });
          detectedFace.identifiedPerson = true;
          detectedFace.identifiedPersonId = identifiedFace.candidates[0].personId;
          detectedFace.identifiedPersonConfidence = identifiedFace.candidates[0].confidence;
          obsList.push(this.faceApi.getPerson(this.selectedGroupId, identifiedFace.candidates[0].personId));
        }
      });

      // Call getPerson() for each identified face
      forkJoin(obsList).subscribe(results => {
        this.identifiedPersons = results;
        this.loading = false;
      });
    });
  }

  imageLoaded($event) {
    this.selectedFace = null;
    this.detectedFaces = [];
    let img = this.mainImg.nativeElement;
    this.multiplier = img.clientWidth / img.naturalWidth;
  }
}
This code gets the detected faces in the provided image and puts a yellow square over each face. When you click on a face, it shows the JSON for that person’s face.
 
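For reference, each entry in detectedFaces (and the JSON shown when a face is clicked) follows the Face API detect response. A trimmed-down sketch of its shape, limited to the fields this app actually uses:

// Illustrative shape of one /detect result entry (trimmed; the attributes come from the
// returnFaceAttributes=age,gender,smile,glasses,emotion,facialHair query string in the service).
interface DetectedFace {
  faceId: string;                 // temporary id, reused later by identify, group and find similar
  faceRectangle: { top: number; left: number; width: number; height: number };
  faceAttributes: {
    age: number;
    gender: string;
    smile: number;
    glasses: string;
    facialHair: { moustache: number; beard: number; sideburns: number };
    emotion: { anger: number; happiness: number; neutral: number; sadness: number; surprise: number };
  };
}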
 
Let's perform another detect to make sure it works fine. I have uploaded one more image, of my friend and me together. Click on Detect, and we get two yellow squares over both faces. This time, select the person group as well, i.e. the family person group that we created earlier. Note that we are now detecting images of me and my friend, who were already added as persons in the person group, and we trained that person group earlier as well. So, we get two yellow squares on detect.
 
 

Face Identification

 
Now, since these persons are part of the person group, ideally they should be identifiable. Click on Identify, which sends the identify call to the API. Once we get a response, we see that the yellow squares change to green, which means the identification is done and successful.
 
 
Cross-verify that by clicking on a face, and we see the JSON corresponding to the identified face. The first one is a subject identified as “Udeep”.
 
 
And the second one is identified as “Akhil”. These faces are identified because they already have an entry in the person group, and their faces were already there when the person group was trained.
 
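For reference, each element of the identify response that the component inspects has roughly this shape (a sketch; the field names follow the Face API identify call):

// Illustrative shape of one /identify result entry.
interface IdentifyResult {
  faceId: string;                  // faceId previously returned by /detect
  candidates: {
    personId: string;              // person from the trained person group
    confidence: number;            // 0..1; candidates below the confidenceThreshold (0.4 in the service) are filtered out
  }[];
}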
 

Face Grouping

 
Let’s perform the face grouping operation. We’ll provide a few URLs separated by newline characters and execute grouping. These image URLs are a few images of Udeep, Arsh, and Akhil. Ideally, grouping should put similar faces together into groups.
 
 
Once the grouping request is made, we see that the images are grouped per person: out of the 11 URLs provided for grouping, five faces were identified as mine, three as Arsh’s, and three as Udeep’s. It worked perfectly. Note that for my images, it also picked my face out of a group photo that was among the provided images.
 
 
Code for face-grouping.component.html is as follows.
<div class="container">
  <ngx-loading [show]="loading" [config]="{ backdropBorderRadius: '14px' }"></ngx-loading>

  <div class="card">
    <h3 class="card-header">Face Grouping</h3>
    <div class="card-body">

      <textarea rows="8" cols="80" [(ngModel)]="imageUrls">
      </textarea>

      <hr/>

      <button class="btn btn-primary" (click)="executeGrouping()">Execute Grouping</button>

      <div *ngFor="let group of groupingResults.groups">
        <h3>Group</h3>
        <div class="row">
          <div class="col-md-3" *ngFor="let face of group">
            <div class="card text-center">
              <div class="card-body card-block-img-container">
                <span class="img-container">
                  <img class="img-person-face img-thumnail" [src]="getUrlForFace(face)" height="140" width="140" />
                </span>
              </div>
            </div>
          </div>
        </div>
      </div>

      <div *ngIf="groupingResults.messyGroup">
        <h3>Mixed Group</h3>
        <div class="row">
          <div class="col-md-3" *ngFor="let face of groupingResults.messyGroup">
            <div class="card text-center">
              <div class="card-body card-block-img-container">
                <span class="img-container">
                  <img class="img-person-face img-thumnail" [src]="getUrlForFace(face)" height="140" width="140" />
                </span>
              </div>
            </div>
          </div>
        </div>
      </div>

    </div>
  </div>
</div>
Code for face-grouping.component.ts is as follows.
import { Component, OnInit } from '@angular/core';
import * as _ from 'lodash';
import { FaceApiService } from '../services/face-api-service.service';
import { forkJoin } from 'rxjs/observable/forkJoin';

@Component({
  selector: 'app-face-grouping',
  templateUrl: './face-grouping.component.html',
  styleUrls: ['./face-grouping.component.css']
})
export class FaceGroupingComponent implements OnInit {
  public imageUrls: string[];
  public faces: any[];
  public groupingResults: any = {};
  public loading = false;

  constructor(private faceApi: FaceApiService) { }

  ngOnInit() { }

  executeGrouping() {
    let urls = _.split(this.imageUrls, '\n');

    let detectList = [];
    _.forEach(urls, url => {
      if (url) {
        detectList.push(this.faceApi.detect(url));
      }
    });

    this.loading = true;
    forkJoin(detectList).subscribe(detectResults => {
      this.faces = [];
      _.forEach(detectResults, (value, index) => this.faces.push({ url: urls[index], faceId: value[0].faceId }));
      let faceIds = _.map(this.faces, 'faceId');

      this.faceApi.group(faceIds).subscribe(data => {
        this.groupingResults = data;
        this.loading = false;
      });
    });
  }

  getUrlForFace(faceId) {
    var face = _.find(this.faces, { faceId: faceId });
    return face.url;
  }
}
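The groupingResults object bound in the template above follows the shape of the Face API group response: an array of groups, each holding the faceIds judged to belong to the same person, plus a messyGroup of faces that could not be grouped. A rough sketch:

// Illustrative shape of the /group response.
interface GroupingResults {
  groups: string[][];     // each inner array is a set of faceIds for one person
  messyGroup: string[];   // faceIds that could not be matched with any other face
}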

Finding Similar Faces

 
In this module of the application, we’ll try to find similar faces within a group of supplied image URLs. We’ll supply a few image URLs among which to search, and one URL of the face for which we want to find look-alikes. For example, I used an image of my own face as the query face.
 
 
Now, in the Find Similar screen, provide newline-separated URLs of images of the same or other persons among which you want to look for look-alikes, and in the next box give the URL of the face you want to match. Click on the “Find Similar” button, and it gives you the matching faces. If no face matches, it returns nothing.
 
 
The find-similar component sits alongside the other components in the application source (find-similar.component.ts and find-similar.component.html).
 
 
Code for find-similar.component.ts is as follows.
import { Component, OnInit } from '@angular/core';
import { FaceApiService } from '../services/face-api-service.service';
import * as _ from 'lodash';
import { forkJoin } from 'rxjs/observable/forkJoin';

@Component({
  selector: 'app-find-similar',
  templateUrl: './find-similar.component.html',
  styleUrls: ['./find-similar.component.css']
})
export class FindSimilarComponent implements OnInit {
  public faces: any[];
  public loading = false;
  public imageUrls: string[];
  public queryFace: string = 'https://www.codeproject.com/script/Membership/Uploads/7869570/Akhil_5.png';
  public findSimilarResults: any[];

  constructor(private faceApi: FaceApiService) { }

  ngOnInit() { }

  findSimilar() {
    this.loading = true;

    // 1. First create a face list with all the imageUrls
    let faceListId = (new Date()).getTime().toString(); // comically naive, but this is just for demo
    this.faceApi.createFaceList(faceListId).subscribe(() => {

      // 2. Now add all faces to the face list
      let facesSubscribableList = [];
      let urls = _.split(this.imageUrls, '\n');
      _.forEach(urls, url => {
        if (url) {
          facesSubscribableList.push(this.faceApi.addFace(faceListId, url));
        }
      });

      forkJoin(facesSubscribableList).subscribe(results => {
        this.faces = [];
        _.forEach(results, (value, index) => this.faces.push({ url: urls[index], faceId: value.persistedFaceId }));

        // 3. Call Detect on the query face so we can establish a faceId
        this.faceApi.detect(this.queryFace).subscribe(queryFaceDetectResult => {
          let queryFaceId = queryFaceDetectResult[0].faceId;

          // 4. Call Find Similar with the query face and the face list
          this.faceApi.findSimilar(faceListId, queryFaceId).subscribe(finalResults => {
            console.log('**findsimilar Results', finalResults);
            this.findSimilarResults = finalResults;
            this.loading = false;
          });
        });
      });
    });
  }

  getUrlForFace(faceId) {
    var face = _.find(this.faces, { faceId: faceId });
    return face.url;
  }
}
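Each entry in findSimilarResults carries the persisted face that matched and a confidence score, which the template below renders. A rough sketch of its shape (the field names follow the Face API find similar response):

// Illustrative shape of one /findsimilars result entry.
interface FindSimilarResult {
  persistedFaceId: string;   // face from the face list that matched the query face
  confidence: number;        // similarity confidence between 0 and 1
}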
Code for find-similar.component.html is as follows.
<div class="container">
  <ngx-loading [show]="loading" [config]="{ backdropBorderRadius: '14px' }"></ngx-loading>

  <div class="card">
    <h3 class="card-header">Find Similar</h3>
    <div class="card-body">

      <textarea rows="8" cols="80" [(ngModel)]="imageUrls">
      </textarea>

      <input type="text" class="form-control" placeholder="Query Face" [(ngModel)]="queryFace" />

      <hr/>

      <button class="btn btn-primary" (click)="findSimilar()">Find Similar</button>

      <div *ngIf="queryFace">
        <h3>Query Face</h3>
        <div class="row">
          <div class="col-md-3">
            <div class="card text-center">
              <div class="card-body card-block-img-container">
                <span class="img-container">
                  <img class="img-person-face img-thumnail" [src]="queryFace" height="140" width="140" />
                </span>
              </div>
            </div>
          </div>
        </div>
      </div>

      <div *ngIf="findSimilarResults">
        <h3>Find Similar Results</h3>
        <div class="row">
          <div class="col-md-3" *ngFor="let face of findSimilarResults">
            <div class="card text-center">
              <div class="card-body card-block-img-container">
                <span class="img-container">
                  <img class="img-person-face img-thumnail" [src]="getUrlForFace(face.persistedFaceId)" height="140" width="140" />
                </span>
                <hr/>
                <span>Confidence: {{face.confidence}}</span>
              </div>
            </div>
          </div>
        </div>
      </div>

    </div>
  </div>
</div>

Conclusion

 
This was an end-to-end article showing the capabilities of the Azure Face API, one of Azure’s cognitive services. The API is quite intelligent and powerful, leveraging AI and machine learning capabilities to perform these actions. We saw in detail how to create an Azure account, how to create a Face API resource, and how to get it up and running. We saw how CRUD operations can be performed against the Face API for person groups, persons, and faces. Beyond detection, the API also returns the facial attributes of a detected face, identifies faces against a trained model, and performs grouping and finding similar faces as well. I hope it was fun.
 
 
References
  1. https://github.com/smichelotti/ps-face-api-explorer
  2. https://www.nuget.org/packages/Microsoft.Azure.CognitiveServices.Vision.Face/
  3. https://azure.microsoft.com/en-in/services/cognitive-services/face/
  4. https://centralindia.dev.cognitive.microsoft.com/docs/services/563879b61984550e40cbbe8d/operations/563879b61984550f30395236

Code

  1. SDK Code
  2. Image Classification Application

