Partying on the Azure stack. Cortana, ML, Power BI and Azure Functions…

At Fred we pride ourselves on innovation, and one of the key tenets of this is our annual Geekend. This is a 36-hour intensive hackfest in which we smash out a retail workflow using as much cool tech as possible, in the interest of leveraging it in the coming year. This year I was playing with machine learning, attempting to use it to provide a demand forecasting solution for Dynamics AX.

The idea was to utilize an Azure ML model to predict and generate a list of items to order based on past sales. The model itself was provided by my co-worker and all-round machine learning wizard, Praveen.

With the prediction engine in place, we attempted to use as many other features of the Azure platform as possible and came up with a pretty sweet story to tell. The components involved were:

  • Azure Scheduler
  • Azure Functions
  • Azure Machine Learning
  • Azure SQL
  • Power BI
  • Cortana Intelligence Suite

Although the final set of products to be ordered would ultimately be consumed by AX master planning, for the sake of the demo we were just surfacing the results in a Power BI visualization.

Buckle up…

The whole story is summarized pretty nicely in the diagram below.

[Image: Geekend solution architecture flow]

  1. Azure Scheduler allows me to execute anything on some pretty complex schedules, but in this case I’m just hitting an HTTP endpoint to kick off an Azure Function.
  2. The Azure Function hits the machine learning model through an exposed web API endpoint, and the resulting predictions are returned as JSON.
  3. The response is then deserialized and persisted in Azure SQL.
  4. A Power BI visualization pointing at the Azure SQL data source can be used to explore the resulting data set.
  5. We then provided Cortana visualizations and connected her to our Office 365 subscription so we could ask for the demand forecast directly from Windows 10.

I’ll break down each step in some detail starting with Azure Scheduler.

I jumped into the new Azure portal and created a new scheduler instance from the marketplace. After giving the scheduler a name, I basically had to configure two things: the action, and when I wanted it to run.

I added the HTTP endpoint of the Azure Function that I’ll create in the next step as the action, and specified the schedule as a simple recurring time value. I could have used the native timer trigger in Azure Functions (sketched below), but I liked the more intuitive Azure Scheduler and it gave me an excuse to use another component. Azure Scheduler also has a nice history tool for viewing past executions from within the portal.

[Image: Azure Scheduler execution history]
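Had I used the native timer trigger instead, the entry point would have looked something like this minimal C# script sketch (the hourly CRON expression mentioned in the comment is an assumption, not what we actually ran):

// run.csx - a timer-triggered alternative to the Scheduler-plus-HTTP approach.
// The schedule itself lives in function.json, e.g. "schedule": "0 0 * * * *" for hourly.
public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"Demand forecast trigger fired at: {DateTime.Now}");
    // Call the ML endpoint here, exactly as in the HTTP-triggered function below.
}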

Next it was time to hit up Azure Functions, and I had been madly thinking of something to do with these since they were announced at Build. In essence, Azure Functions are pretty much the simplest micro-service implementation you can dream up: a simple piece of logic that can be triggered by a timer, an HTTP endpoint or a myriad of other events, such as the creation of a new blob in Azure Storage. On top of this, Functions have a really neat pricing model (the first million executions are free) and the ability to scale dynamically. For this little exploration of their potential I simply had an HTTP endpoint that I could hit from the Azure Scheduler. Big thanks to Andrew Coates and Jeremy Thake from Microsoft, who helped point me in the right direction when I was trying to work out how to use Functions in the wee hours of the morning at Geekend.
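As an aside for anyone wanting to reproduce this: the HTTP trigger itself is declared in a function.json file that sits alongside the script. A minimal sketch along the lines of the standard template (the authLevel and binding names here are template defaults rather than anything specific to our setup; $return binds the function’s return value to the HTTP response):

{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in"
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ]
}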

The aim of the Azure Function was to call the REST service exposed by Azure Machine Learning and then persist the results for Power BI to display. I first went into Azure ML Studio and opened the web service endpoint that we had exposed, clicked the Test button, and entered some information to drive the model.

[Image: Testing the ML web service in Azure ML Studio]

This returned a nice little JSON payload, which I threw into my friendly neighbourhood json2csharp converter to generate the C# class hierarchy for the response. I saved the classes and included them in my function so I could deserialize the result.
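For reference, the payload follows the standard Azure ML request-response shape, which is what the generated classes further down mirror; the column names and values here are placeholders rather than the real model output:

{
  "Results": {
    "output1": {
      "type": "table",
      "value": {
        "ColumnNames": ["...", "..."],
        "ColumnTypes": ["String", "Numeric"],
        "Values": [["...", "..."], ["...", "..."]]
      }
    }
  }
}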

Now that I had the payload and some classes to represent it I punched out a quick Azure Function to do the work.


#r "System.Data"
#r "Newtonsoft.Json"

using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    using (var client = new HttpClient())
    {
        // Build the request payload expected by the Azure ML web service.
        var scoreRequest = new
        {
            Inputs = new Dictionary<string, StringTable>()
            {
                {
                    "input1",
                    new StringTable()
                    {
                        ColumnNames = new string[] { "test.length", "seasonality", "observation.freq", "timeformat" },
                        Values = new string[,] { { "1", "2", "1", "utc" } }
                    }
                }
            },
            GlobalParameters = new Dictionary<string, string>()
        };

        const string apiKey = ""; // Replace this with the API key for the web service
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);
        client.BaseAddress = new Uri(""); // Replace this with the web service URI

        HttpResponseMessage response = await client.PostAsJsonAsync("", scoreRequest);

        if (response.IsSuccessStatusCode)
        {
            string result = await response.Content.ReadAsStringAsync();
            RootObject predicts = JsonConvert.DeserializeObject<RootObject>(result);

            using (SqlConnection con = new SqlConnection("")) // Replace this with the Azure SQL connection string
            {
                con.Open();

                SqlCommand cmd = con.CreateCommand();
                cmd.CommandText = "INSERT INTO Common.DemandForecast (ItemId, ProductName, OrderQuantity) VALUES (@ItemId, @ProductName, @OrderQuantity)";
                cmd.Parameters.Add("@ItemId", SqlDbType.VarChar);
                cmd.Parameters.Add("@ProductName", SqlDbType.VarChar);
                cmd.Parameters.Add("@OrderQuantity", SqlDbType.VarChar);

                // Each prediction is a list of column values; positions 1 and 9
                // hold the item id and forecast quantity in this model's output.
                foreach (var prediction in predicts.Results.output1.value.Values)
                {
                    cmd.Parameters["@ItemId"].Value = prediction[1];
                    cmd.Parameters["@ProductName"].Value = "";
                    cmd.Parameters["@OrderQuantity"].Value = prediction[9];

                    cmd.ExecuteNonQuery();
                }
            }
        }
        else
        {
            log.Info($"The request failed with status code: {response.StatusCode}");

            // The headers include the request ID and the timestamp, which are useful for debugging the failure.
            log.Info(response.Headers.ToString());

            string responseContent = await response.Content.ReadAsStringAsync();
            log.Info(responseContent);
        }
    }

    return req.CreateResponse(HttpStatusCode.OK);
}

public class Value
{
    public List<string> ColumnNames { get; set; }
    public List<string> ColumnTypes { get; set; }
    public List<List<string>> Values { get; set; }
}

public class Output1
{
    public string type { get; set; }
    public Value value { get; set; }
}

public class Results
{
    public Output1 output1 { get; set; }
}

public class RootObject
{
    public Results Results { get; set; }
}

public class StringTable
{
    public string[] ColumnNames { get; set; }
    public string[,] Values { get; set; }
}
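With the function saved, wiring it into the pipeline was just a matter of pasting its URL, including the function key, into the Azure Scheduler action; something along the lines of the following (the app and function names here are placeholders):

https://myfunctionapp.azurewebsites.net/api/DemandForecast?code=<function key>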

With a little magic from Newtonsoft.Json I deserialized the response, iterated through the predictions and inserted them into a table in Azure SQL. As the table I inserted the rows into was a data source for Power BI, I could then refresh the dataset and view the visualizations in the browser. What would be really nice is a way to refresh the data source using the REST API, but I couldn’t find a way to do this. I read through the Data Refresh documentation and the answer wasn’t forthcoming, until I found this UserVoice request for programmatic refresh. I guess it’s manual for now; alternatively, a refresh schedule could be established in the Power BI interface.
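For completeness, the Common.DemandForecast table the function writes to needs nothing more than a sketch like this (the column types and sizes are assumptions based on the VarChar parameters in the function):

CREATE TABLE Common.DemandForecast (
    ItemId        VARCHAR(50),
    ProductName   VARCHAR(100),
    OrderQuantity VARCHAR(20)
);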

Finally, we wanted to show off the deep integration provided by the Cortana Intelligence Suite and how quickly this can be surfaced in Windows 10. As we had visualizations laid out specifically for Cortana, we first connected Cortana to our Office 365 tenant using the Connected Account configuration.

[Image: Cortana Connected Account configuration]

Cortana then sprang to life with all sorts of Office 365 goodness, but also with some serious Power BI integration once we enabled the dataset for Cortana. This meant I could say, “Hey Cortana, show me the demand forecast” and retrieve a visualization formatted for the Windows 10 Cortana workspace.

[Images: Cortana demand forecast query and results]

While this visualization showed some really pretty data, the Show more details in Power BI link let the user get serious with the full Power BI experience and everything it provides.

[Images: Power BI demand forecast table and treemap]

While the result was just a pretty report, the next step would be to actually use the data within AX to feed master planning and produce orders. Even without that extension, the hackfest produced a pretty viable story and let us play with a whole plethora of components from the Azure landscape.

With that all wrapped up, I’ll leave you with a picture of the Azure ML model. I don’t really understand it, but you have to admit it’s impressive…

[Image: The Azure ML model]
