Something for the holidays…
One key aspect of Power BI dataflows is that they store their data in CDM Folders in Azure Data Lake Storage Gen2. When a dataflow is refreshed, the queries that define the dataflow entities are executed, and their results are stored in the underlying CDM Folders in the data lake.
By default, the Power BI service hides the details of the underlying storage: only the Power BI service can write to the CDM folders, and only it can read from them.
But Matthew knew that there are other options beyond the default…
Because the CDM folder format is an open standard, any service or application can create them. A CDM folder can be produced by Azure Data Factory, Azure Databricks, or any other service that can output text and JSON files. Once the CDM folder exists, we just need to let Power BI know…
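To make the "any service can create them" point concrete, here is a minimal sketch of producing a CDM folder's two ingredients: a `model.json` metadata file describing an entity, and a headerless CSV data file. The exact metadata fields (`$type`, `attributes`, `partitions`, and so on) are assumptions based on the general CDM folder concept; the authoritative schema lives in the CDM folder specification, not here.

```python
import csv
import io
import json

def build_cdm_folder(entity_name, columns, rows):
    """Return (model_json, csv_text) describing one entity and its data.

    The model.json field names below are illustrative assumptions,
    not a verified copy of the CDM metadata schema.
    """
    model = {
        "name": "ExampleModel",
        "version": "1.0",
        "entities": [{
            "$type": "LocalEntity",
            "name": entity_name,
            "attributes": [{"name": c, "dataType": "string"} for c in columns],
            "partitions": [{
                "name": f"{entity_name}-part-1",
                "location": f"{entity_name}/part-1.csv",
            }],
        }],
    }
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)  # data files carry no header row
    return json.dumps(model, indent=2), buf.getvalue()

model_json, data_csv = build_cdm_folder(
    "FxRates",
    ["Date", "Currency", "Rate"],
    [["2018-12-01", "USD", "1.1354"]],
)
```

In a real pipeline, Azure Data Factory or Databricks would write these two files to the lake; the sketch only shows how little the format demands: JSON metadata plus text files.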
This approach to modeling FIFO inventory management in DAX really has something to it. I find it very exciting for analyzing, for example, movements from ERP systems. Of course it is not yet a full inventory valuation, but it is a great "starter pack" to build on. Many thanks to PHILIP SEAMARK
A little over a week ago I attended the Dutch Dynamics Community again, for the first time in a while. It was good to catch up and exchange news with peers in my network.
"You've been quiet, Mark!" is what I've been told a few times, and that's true. For some time I've tried to stay quiet and focus on more personal things, like being a dad of five now and managing the project of fixing up my Land Rover Defender.
It's no secret that for quite a while I strongly disagreed with the roadmap Microsoft was following, and my resignation as a Microsoft MVP was a direct result of that.
“You have enemies? Good. That means you’ve stood up for something, sometime in your life.” ― Winston Churchill
Last week I joined over 1,400 of my fellow Navision enthusiasts at NAVTechDays. In the…
It’s been two weeks since Power BI dataflows became publicly available in preview, and there’s been a flood of interest and excitement as people have started exploring and using this new capability for self-service data preparation.
There has also been a growing trickle of confusion. To illustrate it, here are a few examples.
Here’s the first example: A comment thread from a few weeks ago on this post, initiated by Neville:
The potential confusion here may come from the fact that Power BI dataflow entities are similar in purpose to tables in a data warehouse or data mart. But due to many factors (including the underlying storage being in files in a data lake) it is not generally appropriate to think of dataflows as a replacement for a data warehouse. While there may be exceptions to this rule, any time you hear someone saying you don’t need…
Time to play with dataflows. My situation: for transactions in foreign currencies I need accurate fx rates. Can a dataflow meet this requirement?
The first thing I did was sign up for a free plan here:
Then in the Power BI service I created my very first dataflow. Basically, the flow consists of one simple API call (well described on the fixer page):
And with some basic steps in the Power Query editor I ended up with this:
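The shaping the Power Query steps perform can be sketched outside the service as well. The sample payload below follows the shape the fixer "latest" endpoint returns (a base currency plus a `rates` dictionary); it is hard-coded here instead of a live HTTP call so the flattening logic runs standalone, and the function is an illustration, not the actual Power Query script.

```python
import json

# Hard-coded sample in the shape of a fixer.io "latest" response
# (base currency plus a nested rates object).
sample = json.loads("""
{"success": true, "base": "EUR", "date": "2018-12-01",
 "rates": {"USD": 1.1354, "GBP": 0.8905, "CHF": 1.1327}}
""")

def to_rows(payload):
    """Flatten the nested rates object into (date, base, currency, rate)
    rows, roughly what the Power Query steps in the dataflow produce."""
    return [(payload["date"], payload["base"], currency, rate)
            for currency, rate in sorted(payload["rates"].items())]

rows = to_rows(sample)
# rows[0] == ("2018-12-01", "EUR", "CHF", 1.1327)
```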
Now I can link every transaction in a foreign currency to my central fx rates table. This is great, but one important thing was missing: I couldn't figure out a way to store the results over time, as is possible with the streaming dataset feature.
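The lookup itself is a plain join on date and currency. A minimal sketch, with illustrative column names that are not taken from the actual model:

```python
# Central fx rates table keyed by (date, currency); rates are base->currency,
# so a foreign amount is divided by the rate to express it in the base (EUR).
rates = {
    ("2018-12-01", "USD"): 1.1354,
    ("2018-12-01", "GBP"): 0.8905,
}

transactions = [
    {"id": 1, "date": "2018-12-01", "currency": "USD", "amount": 500.0},
    {"id": 2, "date": "2018-12-01", "currency": "GBP", "amount": 200.0},
]

def convert_to_base(tx, rates):
    """Look up the matching rate and convert the amount to the base currency."""
    rate = rates[(tx["date"], tx["currency"])]
    return round(tx["amount"] / rate, 2)

eur_amounts = [convert_to_base(tx, rates) for tx in transactions]
```

In the report this would be a relationship between the transactions table and the rates entity rather than Python, but the matching logic is the same.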
So I was searching for an alternative solution. Why not use CDS for Apps to store the data, and then point the Power BI dataflow to the CDS entity:
I created a new entity, as I could not find a suitable one among the standard CDS entities:
Afterwards I created a custom connector to the fixer API:
The next thing I needed was a Flow to pull the data into my newly created entity:
And last of all I created another dataflow in Power BI pointing to the CDS entity:
Now I have the result I was looking for. A central fx rates table with historical data.
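The accumulation idea behind this can be sketched as an upsert keyed by date and currency, so a rerun of the Flow for the same day does not create duplicate rows. In the actual solution the CDS entity plays this role; a plain dict stands in for it here, and the function name is my own invention.

```python
# History of rates keyed by (date, currency). Each scheduled run upserts
# that day's snapshot, so reruns are idempotent instead of duplicating rows.
history = {}

def upsert_snapshot(history, date, rates):
    """Insert or overwrite one day's rates in the history store."""
    for currency, rate in rates.items():
        history[(date, currency)] = rate

upsert_snapshot(history, "2018-11-30", {"USD": 1.1329})
upsert_snapshot(history, "2018-12-01", {"USD": 1.1354})
upsert_snapshot(history, "2018-12-01", {"USD": 1.1354})  # rerun: no duplicate
```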
I have to admit, figuring all of this out was fun. And best of all, it solved a real business need.