Azure IoT Hub and Power BI Streaming dataflows

OK, time for some serious prototyping today. Based on the announcement below, I had the perfect use case to finally test some of the functionality provided by Azure IoT Hub and Power BI streaming dataflows.

Streaming dataflows in Power BI premium now available in public preview | Microsoft Power BI Blog | Microsoft Power BI

Some time ago I bought this cool DevKit after I had attended a Microsoft workshop in Munich.

An all-in-one IoT kit built for the cloud (microsoft.github.io)

The walkthrough in this article is very precise.

Connect an MXCHIP AZ3166 to Azure IoT Hub quickstart | Microsoft Docs

I was able to successfully install all the required software on my local machine. The same is true for the Azure part, and so I ended up with this: a piece of hardware sending real-time data to my Azure IoT Hub. How cool is that?
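The quickstart firmware sends the DevKit's sensor readings to IoT Hub as JSON telemetry. A minimal sketch of what such a message looks like, in Python (the field names are my assumptions based on the quickstart's temperature/humidity telemetry, not taken from the actual firmware):

```python
import json

def build_telemetry(temperature_c: float, humidity_pct: float) -> str:
    """Build a JSON telemetry message like the one the DevKit sends to IoT Hub."""
    payload = {
        "temperature": temperature_c,  # assumed field name
        "humidity": humidity_pct,      # assumed field name
    }
    return json.dumps(payload)

message = build_telemetry(21.5, 48.0)
print(message)
```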

The Power BI part was the last thing to do. Again, I followed this well written article.

Streaming dataflows (preview) – Power BI | Microsoft Docs

Et voilà, I ended up with data which went all the way from my local machine to Azure, from there to Power BI, and back to my local Power BI Desktop. I am still very impressed with the result. For me it was so much fun to glue all this stuff together. This really made me smile for the rest of the day.

Solving Real World Scenarios with Microsoft Flow – Challenge No. 3 – Part two

To get things started I use a simple daily schedule – nothing special here:

Then I drag a Data Operations – Compose step onto the flow. I use this to calculate the previous day. I found out that I have to enter the expression with leading and trailing double quotes, as otherwise it won’t work.
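The expression boils down to "yesterday, formatted as yyyy-MM-dd" (in Flow this is something along the lines of addDays on utcNow). A sketch of the same logic in Python:

```python
from datetime import date, timedelta

def previous_day(today: date) -> str:
    """Return yesterday's date as yyyy-MM-dd, the format the fixer API expects."""
    return (today - timedelta(days=1)).isoformat()

print(previous_day(date(2017, 1, 2)))  # -> 2017-01-01
```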

What I find very strange here is the fact that I can’t edit the item anymore after I have saved it. Seems like a bug to me…

Next comes the HTTP Get step:

The API expects the following format:

http://api.fixer.io/2017-01-01?symbols=USD,GBP

I use the CheckDate variable from the previous step to get the rates for the previous day.
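A sketch of how the request URL is assembled from the CheckDate value (Python, no real request made here):

```python
def fixer_url(check_date: str, symbols=("USD", "GBP")) -> str:
    """Build the fixer.io historical-rates URL for a given date."""
    return f"http://api.fixer.io/{check_date}?symbols={','.join(symbols)}"

print(fixer_url("2017-01-01"))  # -> http://api.fixer.io/2017-01-01?symbols=USD,GBP
```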

Now it’s time for the new Parse JSON step; the input is the body of the HTTP step.

To generate the schema, all I have to do is paste the JSON returned by the API.
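What the Parse JSON step does, sketched in Python against a sample response (the fixer.io payload has a base, a date and a rates object; the rate values below are made up for the example):

```python
import json

sample_body = '{"base": "EUR", "date": "2016-12-30", "rates": {"USD": 1.0541, "GBP": 0.8562}}'

data = json.loads(sample_body)
print(data["date"])          # the date the rates are valid for
print(data["rates"]["USD"])  # EUR/USD rate
```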

For information purposes I send myself a mail with the relevant data.

The result looks like this.

As there are only new rates for weekdays, I use a condition in the next step: only if the date returned equals the previous day does the flow proceed to the last step.
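The condition boils down to a string comparison between the date in the response and the CheckDate from the Compose step. A Python sketch:

```python
def should_push(check_date: str, response_date: str) -> bool:
    """Proceed only when the API actually has rates for the previous day;
    on weekends the API returns the last business day's date instead."""
    return check_date == response_date

print(should_push("2017-03-17", "2017-03-17"))  # Friday: dates match -> True
print(should_push("2017-03-18", "2017-03-17"))  # Saturday: no new rates -> False
```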

First I have to define a streaming dataset in Power BI. Again, this is pretty straightforward.

In Microsoft Flow I have defined the counterpart as follows.
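Under the hood, the Power BI step POSTs rows to the streaming dataset's push URL. A hedged sketch of the equivalent call (the real push URL, including its key, comes from the dataset's API info page in Power BI; the one below is a placeholder, and the column names mirror my dataset, so treat them as assumptions):

```python
import json

# Placeholder: the real push URL (including the key) is shown on the
# streaming dataset's API info page in Power BI.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<id>/rows?key=<key>"

def build_rows_payload(check_date: str, usd: float, gbp: float) -> str:
    """JSON body for the push URL: an array of row objects matching the
    dataset schema (column names here are assumptions)."""
    return json.dumps([{"date": check_date, "USD": usd, "GBP": gbp}])

body = build_rows_payload("2017-03-17", 1.0737, 0.8686)  # illustrative rates
print(body)
# To actually push, POST `body` with Content-Type: application/json to PUSH_URL.
```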

And that’s all. The rates are transferred to Power BI on a daily basis, and only for weekdays (see the gap between 03/17/17 and 03/20/17).

One problem remains. I don’t know why but Live Tiles show no data for my streaming dataset. A bug?

 

So that’s the story so far. On my wish list I have one big point: Microsoft, please provide access to the streaming dataset, e.g. in Power BI Desktop or Power Query …

 

Solving Real World Scenarios with Microsoft Flow – Challenge No. 3

So I really like playing around with Microsoft Flow. In this “challenge” I did basically the following:

  • Call the fixer API on a daily basis
  • Parse the JSON response
  • Do some logical tests
  • Send myself an e-mail with the current FX rates
  • Add the content to a Power BI streaming dataset
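The steps above can be sketched end to end in Python; the http_get, send_mail and push_rows callables stand in for the Flow connectors and are hypothetical, as are the field names and rate values:

```python
from datetime import date, timedelta
import json

def daily_flow(today: date, http_get, send_mail, push_rows):
    """Sketch of the whole flow using stand-ins for the Flow connectors."""
    check_date = (today - timedelta(days=1)).isoformat()          # Compose step
    body = http_get(f"http://api.fixer.io/{check_date}?symbols=USD,GBP")
    data = json.loads(body)                                        # Parse JSON step
    send_mail(subject=f"FX rates {data['date']}", body=body)       # mail step
    if data["date"] == check_date:                                 # weekday condition
        push_rows([{"date": check_date, **data["rates"]}])         # Power BI step

# Tiny demo with stubs; 2017-03-17 was a Friday, so the push fires.
calls = []
def fake_get(url):
    return '{"base": "EUR", "date": "2017-03-17", "rates": {"USD": 1.0737, "GBP": 0.8686}}'
def fake_mail(subject, body):
    calls.append(("mail", subject))
def fake_push(rows):
    calls.append(("push", rows))

daily_flow(date(2017, 3, 18), fake_get, fake_mail, fake_push)
print(calls)
```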

Although the result looks quite simple it took me a while to figure things out. I will go through each step in the next posting…