24 Days of PowerPlatform – Day 17 – Build your PowerPlatform Wish-App

workingondata

Welcome to today’s PowerPlatform blog post!

Christmas is approaching fast, and I would like to use the Power Platform tools to build a Christmas WishApp with PowerApps, Power BI, and Microsoft Flow.

We will build a PowerApp that gathers wishes from requesters. The wishes are displayed on a Power BI real-time dashboard, which is populated by a Microsoft Flow.

Create the Power BI Dataset

  • Open a browser and navigate to: powerbi.com
  • Log in with your Power BI User Account
  • We need to create a Power BI streaming dataset, which serves as the data source for the Santa Wish dashboard


  • Add a new streaming dataset to your workspace (select the API option)
  • Create the dataset with the following settings:


Dataset name: SantaWishes

Fields:

  • WishName – TEXT
  • Quantity – NUMBER
  • Requester – TEXT

Historic data analysis: ON
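Once the streaming dataset exists, rows can be pushed to it over HTTPS. As a minimal sketch (the push URL below is a placeholder; copy the real one, including its `?key=` secret, from the dataset's "API info" page in the Power BI service), a row matching the SantaWishes schema could be built and posted like this:

```python
import json
from urllib import request

# Placeholder push URL; the real value (with its key) comes from the
# Power BI service "API info" page of the SantaWishes streaming dataset.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>"

def build_wish_row(wish_name, quantity, requester):
    """Build one row matching the SantaWishes schema (WishName / Quantity / Requester)."""
    return {"WishName": wish_name, "Quantity": quantity, "Requester": requester}

def push_rows(rows, url=PUSH_URL):
    """POST a batch of rows to the streaming dataset; the body shape is {"rows": [...]}."""
    body = json.dumps({"rows": rows}).encode("utf-8")
    req = request.Request(url, data=body, headers={"Content-Type": "application/json"})
    return request.urlopen(req)

# Build a sample payload (not sent anywhere here).
row = build_wish_row("LEGO Millennium Falcon", 1, "Tom")
payload = json.dumps({"rows": [row]})
```

In the finished app this POST is what the Microsoft Flow performs on our behalf whenever a new wish arrives from the PowerApp.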

  • Create a new Power BI report based on the dataset


  • Create a report with…

View original post (431 more words)

Dataflows in Power BI: Overview Part 7 – External CDM Folders

BI Polar

One key aspect of Power BI dataflows is that they store their data in CDM Folders in Azure Data Lake Storage gen2.[1] When a dataflow is refreshed, the queries that define the dataflow entities are executed, and their results are stored in the underlying CDM Folders in the data lake.

By default, the Power BI service hides the details of the underlying storage: only the Power BI service can write to the CDM folders, and only the Power BI service can read from them.

NARRATOR:

But Matthew knew that there are other options beyond the default…

Because the CDM folder format is an open standard, any service or application can create them. A CDM folder can be produced by Azure Data Factory, Azure Databricks, or any other service that can output text and JSON files. Once the CDM folder exists, we just need to let Power BI know…
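The way we "let Power BI know" is the folder's `model.json` metadata file, which describes the entities and points at the data files. The sketch below builds an illustrative `model.json` in the CDM folder shape (entity, attribute, and storage-account names here are invented for the example):

```python
import json

# Illustrative model.json for a CDM folder: a metadata document that names the
# entities, their attributes, and the CSV partitions holding the data.
# All names and the ADLS gen2 URL below are made-up examples.
model = {
    "name": "SalesModel",
    "version": "1.0",
    "entities": [
        {
            "$type": "LocalEntity",
            "name": "Orders",
            "attributes": [
                {"name": "OrderId", "dataType": "string"},
                {"name": "Amount", "dataType": "double"},
            ],
            "partitions": [
                {
                    "name": "Orders-partition-1",
                    "location": "https://<account>.dfs.core.windows.net/powerbi/SalesModel/Orders/part-1.csv",
                }
            ],
        }
    ],
}

model_json = json.dumps(model, indent=2)
```

Any tool that can emit CSV files plus a metadata document like this (Azure Data Factory, Azure Databricks, or a plain script) can therefore produce a CDM folder that Power BI can attach to as an external dataflow.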

View original post (836 more words)

Blogging & “The Gap”

Mark Brummel Blog | Microsoft Dynamics NAV

A little over a week ago I attended the Dutch Dynamics Community again, for the first time in a while. It was good to catch up and exchange news with peers in my network.

“You’ve been quiet, Mark!” is what I’ve been told a few times, and that’s true. For some time I have tried to stay quiet and focus on more personal things, like being a dad of five now and managing the project of fixing up my Land Rover Defender.

It’s no secret that for quite a while I strongly disagreed with the roadmap Microsoft was following, and my resignation as a Microsoft MVP was a direct result of that.

“You have enemies? Good. That means you’ve stood up for something, sometime in your life.” ― Winston Churchill

Last week I joined over 1,400 of my fellow Navision enthusiasts at NAVTechDays. In the…

View original post (990 more words)

Positioning Power BI Dataflows

BI Polar

It’s been two weeks since Power BI dataflows became publicly available in preview, and there’s been a flood of interest and excitement as people have started exploring and using this new capability for self-service data preparation.

There has also been a growing trickle of confusion. To illustrate this confusion, I will share a few examples.

Here’s the first example: a comment thread from a few weeks ago on this post, initiated by Neville:


The potential confusion here may come from the fact that Power BI dataflow entities are similar in purpose to tables in a data warehouse or data mart. But due to many factors (including the underlying storage being in files in a data lake) it is not generally appropriate to think of dataflows as a replacement for a data warehouse. While there may be exceptions to this rule, any time you hear someone saying you don’t need…

View original post (759 more words)

Solving Real World Scenarios with the Microsoft Power Platform

Time to play with dataflows. I have the following situation: for transactions in foreign currencies I need accurate FX rates. Can a dataflow solve this requirement?

The first thing I did was sign up for a free plan here:

 

Then, in the Power BI service, I created my very first dataflow. Basically, the flow consists of one simple API call (well described on the Fixer page):
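To make the call concrete, here is a small sketch of what that Fixer request and its response look like (the access key is a placeholder, and the rate values in the sample response are invented; the free plan serves latest rates with a EUR base):

```python
import json
from urllib import request

# Placeholder access key; on the free Fixer plan the base currency is EUR.
FIXER_URL = "http://data.fixer.io/api/latest?access_key=<your-key>&symbols=USD,GBP,CHF"

def fetch_latest(url=FIXER_URL):
    """Call the Fixer 'latest' endpoint and decode the JSON response."""
    with request.urlopen(url) as resp:
        return json.load(resp)

def rates_to_rows(payload):
    """Flatten a Fixer response into one row per currency, ready for a rates table."""
    return [
        {"Date": payload["date"], "Base": payload["base"], "Currency": cur, "Rate": rate}
        for cur, rate in sorted(payload["rates"].items())
    ]

# Sample response in the shape Fixer returns (values are made up):
sample = {"success": True, "date": "2018-12-17", "base": "EUR",
          "rates": {"USD": 1.13, "GBP": 0.90, "CHF": 1.13}}
rows = rates_to_rows(sample)
```

The Power Query steps in the dataflow do essentially the same flattening: expand the `rates` record into one row per currency alongside the date and base currency.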

And with some basic steps in the Power Query editor I ended up with this:

Now I can link every transaction in a foreign currency to my central FX rates table. This is great, but one important thing was missing: I couldn’t figure out a way to store historical results the way the streaming dataset feature allows.

So I was searching for an alternative solution. Why not use CDS for Apps to store the data, and then point the Power BI dataflow to the CDS entity:

I created a new entity, as I could not find a suitable one among the standard CDS entities:

 

Afterwards I created a custom connector to the Fixer API:

 

The next thing I needed was a Flow to pull the data into my newly created entity:

And last of all I created another dataflow in Power BI pointing to the CDS entity:

 

Now I have the result I was looking for: a central FX rates table with historical data.

I have to admit, figuring all of this out was fun. And best of all, I could solve a real business need.