Student Pass or Fail Prediction Experiment on Azure Machine Learning

Here are the steps for performing a Student Pass or Fail Prediction experiment on Azure Machine Learning.

  1.  First, you need to get or create a dataset file (.csv).
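
    If you do not already have a dataset, a small C# sketch like the one below can generate one. This is only an illustration: the column names match those used later in this article's web service code, and the labeling rule (Pass when the SSC result is at least 40%) is an assumption for demonstration, not part of the article's actual data.

    using System;
    using System.IO;

    class MakeDataset
    {
        static void Main()
        {
            // Same columns as the web service call at the end of this article.
            string header = "Full name,Illness (in %) per semester,Attendance (in %) per semester," +
                            "SSC result (in %),HSC result (in %),Father Education,Mother Education," +
                            "Hostel,Study hours (per day),Sports,Disability,Medium,Result";
            var rng = new Random(42);
            using (var writer = new StreamWriter("students.csv"))
            {
                writer.WriteLine(header);
                for (int i = 0; i < 100; i++)
                {
                    int ssc = rng.Next(30, 100); // SSC result in %
                    // Hypothetical labeling rule: Pass if the SSC result is >= 40%.
                    string result = ssc >= 40 ? "Pass" : "Fail";
                    writer.WriteLine($"Student {i},{rng.Next(0, 20)},{rng.Next(40, 101)},{ssc}," +
                                     $"{rng.Next(30, 100)},Graduate,Graduate,Yes,{rng.Next(1, 8)},Yes,No,English,{result}");
                }
            }
        }
    }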



  2. Sign in to the Microsoft Azure Portal.



    Click New -> Data Services -> Machine Learning -> Quick Create, then enter a workspace name, select a location, choose Create a new storage account, enter a name for the new storage account, and click the check mark.

  3. The workspace will be created.



    The workspace will be created and shown as online. Select your workspace name (if there is more than one workspace) and click the Open In Studio button below. Machine Learning Studio (https://studio.azureml.net) will then load in a new browser tab.

  4. Machine Learning Studio will open in a separate browser tab.



    The left-side menu contains Experiments, Web Services, Datasets, Trained Models, and Settings.

  5. Now, to create a new experiment with your own dataset (.csv) file:



    Click DATASETS in the left menu. The Datasets page has two tabs, MY DATASETS and SAMPLES: MY DATASETS lists the datasets you have uploaded or used, and SAMPLES lists the sample datasets already available in the studio. To upload your .csv file, click the NEW button at the bottom.

  6. Then click DATASET -> FROM LOCAL FILE.



  7. Browse to your .csv file and enter a name for the dataset, then click the check mark button below. Your dataset file (.csv) will be uploaded, and you can see it under MY DATASETS.



  8. Now we will build the experiment.



    Click EXPERIMENTS in the main left menu. You will find MY EXPERIMENTS (experiments you have already created) and SAMPLES (ready-made experiments available to study). Since we want to create our own, click the NEW button at the bottom.

  9. New Experiment



    Select EXPERIMENT and click the Blank Experiment button (the + logo).

  10. Your new experiment will be loaded.



    Click the default title, Experiment created on 5/6/2015, and rename the experiment (give it your own name).

  11. Type the name of the dataset you just uploaded into the search bar on the left side; the dataset will be listed below under My Datasets. Simply drag that dataset from the left-side menu and drop it into the experiment canvas.



  12. Likewise, search for Split, Train Model, Score Model, and Evaluate Model, and drag and drop each item into the experiment canvas one by one, as shown in the snapshot above.

    Split – Used for data transformation; we will give 70% of the CSV data to train the model and the remaining 30% to test the trained model.

    Train Model – Has two inputs: the first takes the ML algorithm and the second takes the 70% of the data from Split.

    Score Model – Also has two inputs: one from Train Model and a second taking the 30% of the data from Split, used to test the trained model.

    Evaluate Model – Evaluates the scored results and calculates the ML metrics of the classification algorithm, such as True Positives, True Negatives, False Positives, False Negatives, the ROC curve, Precision, Recall, Accuracy, F1 Score, and AUC.



  13. Now Split the Data



    Connect the .csv dataset to Split. Select Split and open Properties in the right-side menu, where you can divide the data between the train and score models (e.g. 80%-20%, 75%-25%, 70%-30%); ideally, 70% goes to Train Model and the remaining 30% to Score Model. So enter 0.7 in the Fraction of rows property (meaning 70% of the data for Train Model and 30% for Score Model).
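
    Conceptually, Split with Fraction of rows = 0.7 behaves like the sketch below (a minimal C# illustration of a randomized 70/30 row split, not the module's actual implementation):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    static class SplitSketch
    {
        // Shuffle the rows, then send the first 70% to one output (train)
        // and the remaining 30% to the other output (test).
        public static (List<string> Train, List<string> Test) Split(List<string> rows, double fraction)
        {
            var rng = new Random(0); // fixed seed, like the module's Random seed property
            var shuffled = rows.OrderBy(_ => rng.Next()).ToList();
            int cut = (int)(shuffled.Count * fraction); // fraction = 0.7 -> 70% of the rows
            return (shuffled.Take(cut).ToList(), shuffled.Skip(cut).ToList());
        }
    }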

  14. Now make the connections as shown in the snapshot: the first output of Split to Train Model, the second output of Split to Score Model, the output of Train Model to the first input of Score Model, and the output of Score Model to Evaluate Model.



  15. Now apply a classification algorithm to train the model.



    Type Classification in the search bar on the left side; it will load 14 classification algorithms. Drag the Two-Class Decision Jungle algorithm from the left side and drop it above Train Model. Connect the output of that algorithm to the first input of Train Model.

  16. Now we want to train the model on the Result field of the .csv file.



    Select Train Model, click Properties on the right side, and select Launch column selector.

  17. In the column selector, choose Include -> Column Names -> Result, and click the check mark button.



  18. Now name the experiment and save it. Give the experiment a name (Student Pass Fail Prediction), then click the SAVE button in the menu at the bottom.



  19. Run the experiment by clicking the RUN button in the menu at the bottom.



  20. After Running the Experiment,



    You will find a green check mark on each module, which means none of the modules has errors.

  21. See the output of Score Model.



    Right-click the output port of Score Model and select Visualize.

  22. Output of Score Model,



    Two extra fields are added after the Result field: Scored Labels and Scored Probabilities. Scored Labels holds the trained model's predicted Result for the 30% of the data that we passed to Score Model.
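
    For a two-class model, the Scored Label is typically derived from the Scored Probability by thresholding at 0.5. A one-line sketch of the idea (not Studio's internal code):

    // Scored Probabilities is the model's estimated probability of the positive class;
    // Scored Labels is the predicted class, conventionally probability >= 0.5 => positive.
    static string ScoredLabel(double scoredProbability) =>
        scoredProbability >= 0.5 ? "Pass" : "Fail";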

  23. See the ML metrics of the algorithm in Evaluate Model.



    Right-click the output port of Evaluate Model and select Visualize.

  24. ROC Graph of Score Dataset,



  25. Parameters



    The result above shows that the accuracy of the experiment is almost 68%.

    - A = True Positive, B = False Negative, C = False Positive, D = True Negative
    - Accuracy = (A+D) / (A+B+C+D)
    - Precision = A / (A+C)
    - Recall = A / (A+B)
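
    These formulas are easy to check in code. A small C# helper that computes the same metrics from the confusion-matrix counts (A, B, C, D as defined above):

    using System;

    static class Metrics
    {
        // tp = A (True Positive), fn = B (False Negative),
        // fp = C (False Positive), tn = D (True Negative).
        public static void Print(int tp, int fn, int fp, int tn)
        {
            double accuracy  = (double)(tp + tn) / (tp + fn + fp + tn); // (A+D)/(A+B+C+D)
            double precision = (double)tp / (tp + fp);                  // A/(A+C)
            double recall    = (double)tp / (tp + fn);                  // A/(A+B)
            double f1        = 2 * precision * recall / (precision + recall);
            Console.WriteLine($"Accuracy={accuracy:P1}, Precision={precision:P2}, Recall={recall:P2}, F1={f1:P2}");
        }
    }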

  26. Create a scoring experiment to publish as a web service.



    Once again, save the experiment and run it; the Create Scoring Experiment button will then become active. Click Create Scoring Experiment.

  27. Scoring Experiment Created



    Your experiment is now split into two tabs, Training Experiment and Scoring Experiment. In the Scoring Experiment, Split and the Two-Class Decision Jungle algorithm are merged into the Student Pass Fail Prediction trained model, Score Model gains a Web service input and a Web service output, and the .csv dataset is disabled.

    Now press the Save button and then the Run button of the Scoring Experiment.

  28. Convert the scoring experiment to Web Service view and apply an output filter.



    After running the scoring experiment, the Score Model shows a green check mark, which means the experiment ran successfully. Next, convert the scoring experiment to Web Service view by clicking the Switch to Web Service View button at the bottom.

  29. Web Service view enabled



    We want the scored label (the predicted Result) as the web service output for the values we will supply as web service input, so we need to restrict the web service output to the Scored Labels column.

  30. Add a Project Columns module to set the web service output field to Scored Labels.



    Search for Project Columns in the search bar, then drag and drop the Project Columns module between Score Model and the Web service output.

  31. Connect the Score Model output to Project Columns and the Project Columns output to the Web service output. Then select Project Columns -> click Properties on the right side -> select Launch column selector.



  32. Select Begin With No Columns -> Include -> Column Names -> select Scored Labels from the dropdown (do not type it). Click the check mark.



  33. Save the experiment and run it. A green check mark on Score Model and Project Columns means the run succeeded.



  34. The experiment is now ready to publish as a web service.



    To do this, click the PUBLISH WEB SERVICE button in the menu at the bottom.

  35. Dashboard of Web Service.



  36. The dashboard shows the web service API key. Press the TEST button to test the web service and get output; this web service will predict whether the student will pass or fail.



  37. Enter data to see the predicted result for the student, whether they will pass or fail.



    Enter the values one by one and press the check mark button.

  38. Processing



  39. The web service's predicted result, PASS ( >= 40 ), is highlighted.



  40. Request Response



  41. When you click Request Response, you will find the API documentation.
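
    The documentation includes the JSON request body the service expects. For this experiment it has roughly the following shape (column names are those of this experiment; the values shown are placeholders, and the "..." stands for the remaining columns):

    {
      "Inputs": {
        "input1": {
          "ColumnNames": [ "Full name", "Illness (in %) per semester", "...", "Result" ],
          "Values": [ [ "value", "0", "...", "value" ] ]
        }
      },
      "GlobalParameters": {}
    }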






  42. Sample code is given in three languages: C#, Python, and R.

  43. C# code for calling this web service from Visual Studio:

    // This code requires the NuGet package Microsoft.AspNet.WebApi.Client to be installed.
    // Instructions for doing this in Visual Studio:
    // Tools -> NuGet Package Manager -> Package Manager Console
    // Install-Package Microsoft.AspNet.WebApi.Client
    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Net.Http.Formatting;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    namespace CallRequestResponseService
    {
        // Matches the table format the Azure ML request-response service expects.
        public class StringTable
        {
            public string[] ColumnNames { get; set; }
            public string[,] Values { get; set; }
        }

        class Program
        {
            static void Main(string[] args)
            {
                InvokeRequestResponseService().Wait();
            }

            static async Task InvokeRequestResponseService()
            {
                using (var client = new HttpClient())
                {
                    // Build the request: one input table with the experiment's columns
                    // and one row of values per record to score.
                    var scoreRequest = new
                    {
                        Inputs = new Dictionary<string, StringTable>()
                        {
                            {
                                "input1",
                                new StringTable()
                                {
                                    ColumnNames = new string[]
                                    {
                                        "Full name", "Illness (in %) per semester", "Attendance (in %) per semester",
                                        "SSC result (in %)", "HSC result (in %)", "Father Education", "Mother Education",
                                        "Hostel", "Study hours (per day)", "Sports", "Disability", "Medium", "Result"
                                    },
                                    Values = new string[,]
                                    {
                                        { "value", "0", "0", "0", "0", "0", "0", "value", "0", "value", "value", "value", "value" },
                                        { "value", "0", "0", "0", "0", "0", "0", "value", "0", "value", "value", "value", "value" }
                                    }
                                }
                            },
                        },
                        GlobalParameters = new Dictionary<string, string>() { }
                    };

                    const string apiKey = "abc123"; // Replace this with the API key for the web service
                    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);
                    client.BaseAddress = new Uri("https://ussouthcentral.services.azureml.net/workspaces/7c38001675154abdaf92f8ce0b9f9e27/services/9cfb7036562044b78a724398339fb12a/execute?api-version=2.0&details=true");

                    // WARNING: The 'await' statement below can result in a deadlock if you are calling this code from the UI thread of an ASP.NET application.
                    // One way to address this would be to call ConfigureAwait(false) so that the execution does not attempt to resume on the original context.
                    // For instance, replace code such as:
                    //     result = await DoSomeTask()
                    // with the following:
                    //     result = await DoSomeTask().ConfigureAwait(false)
                    HttpResponseMessage response = await client.PostAsJsonAsync("", scoreRequest);

                    if (response.IsSuccessStatusCode)
                    {
                        string result = await response.Content.ReadAsStringAsync();
                        Console.WriteLine("Result: {0}", result);
                    }
                    else
                    {
                        Console.WriteLine(string.Format("The request failed with status code: {0}", response.StatusCode));
                        // Print the headers - they include the request ID and the timestamp, which are useful for debugging the failure
                        Console.WriteLine(response.Headers.ToString());
                        string responseContent = await response.Content.ReadAsStringAsync();
                        Console.WriteLine(responseContent);
                    }
                }
            }
        }
    }