Power BI dataflows to store historical data

Data Azure

Power BI dataflows are a nice capability of Power BI. They are actually not a new Microsoft product: if you have used Excel and Power BI long enough, there is a big chance that you have already used Power Query at least once. To keep things simple, a dataflow is basically just Power Query, except that you edit it in a web browser, in the Power BI Service.

A Pro license is your entrance ticket to this nice capability. Although Premium licenses (per capacity or per user) provide more advanced features, many applications of Power BI dataflows are already available with a Pro license. One of them is storing historical data and, of course, showing it in reports.

In this blog post, I will briefly introduce Power BI dataflows (with some useful links if you want to learn more). Next, I’ll share how to store historical data in your own storage account and show…

View original post 3,046 more words

DAX is simple, but it is not easy

Inspired by this great video, and as a follow-up to my last post about the Mastering DAX workshop, I want to tell you about my findings regarding the DAX language:

DAX is simple,

but it is not easy.

This is so true. Sitting in the classroom, I realized with every example we did how true this statement is. When you see a working DAX formula, you are tempted to say: “OK, no problem, this is easy”. But writing a DAX measure that really works all the time – no matter what the user does in terms of filtering and grouping – is hard work.

If your measure is working in one situation but not in all situations

then your measure was wrong from the beginning.

Again, in many cases it seems that a measure is working. But as soon as you change some parameters or add an extra level of complexity, it does not work at all. In many cases this means going back to the start and doing your work all over again.
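
A classic illustration of this (the Sales and Product tables, their columns, and the measure names below are made up for this sketch, not taken from the workshop): a “sales at list price” measure that looks correct as long as a single product is filtered, but returns a wrong number on subtotals and the grand total.

    -- Seems fine when exactly one product is in the filter context,
    -- but on totals MAX() picks a single list price for all products:
    Sales at List Price (wrong) =
    SUM ( Sales[Quantity] ) * MAX ( 'Product'[List Price] )

    -- Works in every filter context, because the iteration pairs each
    -- sales row with the list price of its own related product:
    Sales at List Price =
    SUMX ( Sales, Sales[Quantity] * RELATED ( 'Product'[List Price] ) )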

Stick with the basics

Don’t bother with sophisticated functions.

Put all your effort into filter context, row context, iterations, and context transition. I am still learning the hard way that this is the only path to success.

You can’t learn DAX by copying formulas

or patterns from the internet or books.

I tend to learn new things by browsing the internet. If I find a solution that points me in the right direction, I usually succeed in solving my problem. With DAX, I first thought: OK, cool, I’ll do it the same way. Sometimes it seemed to work, but unfortunately other times it did not. So learning the basics is so important: it is not about inserting some CALCULATE command here and there.

You can’t learn DAX by reading books

Only by practicing.

I have bought quite a few books in the past, and by studying them I was fooling myself. I thought I knew enough about DAX by now. But the truth is, I knew nothing. As soon as you work on real-life examples, you realize that without a solid understanding of the basics you are caught in an endless trial-and-error loop.

Having learned all of this, I feel I am at least pointed in the right direction.

With every project I will gain confidence, but nevertheless frustration will be my good companion for quite some time.

Stick to the basics

and try to think the DAX way.

Mastering DAX workshop

Although I have been following as many areas of Power BI as possible for a really long time now, the DAX language is still challenging for me. By now, though, I think that is the case for everyone…

That is why, as a refresher, I recently attended the Mastering DAX workshop run by the well-known SQLBI heroes.

https://www.sqlbi.com/cert/yjlt1zzu-211003/

I can now compare the online course that is also offered with a class “in real life”.

I have to say, the two are worlds apart. So many small details that have a very big impact in DAX can really only be brought home face to face.

The most important takeaways for me are nothing new, but no matter what level you are at in DAX, you have to keep reminding yourself of them:

Obey the rules

There are often several ways to write a DAX formula, but in principle there is almost always only one correct solution.

  • The filter context filters data
  • The row context computes data

 (nothing else)

Simply put: if the result of a formula is wrong, 99.99% of the time this is the cause of the problem.
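
As a tiny sketch of what these two rules mean in practice (table and column names are again hypothetical): the filter context coming from slicers and the rows of a visual decides which rows of Sales are visible, while the row context created by an iterator such as SUMX only computes an expression row by row; it does not filter anything on its own.

    Sales Amount =
    SUMX (
        Sales,                                -- the filter context decides which Sales rows get iterated
        Sales[Quantity] * Sales[Net Price]    -- the row context lets the expression read the current row
    )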

CALCULATE solves many problems

What is crucial, however, is to keep reminding yourself when CALCULATE does what, why CALCULATE does it, and when CALCULATE is not needed at all.

  • CALCULATE creates new challenges
  • Keyword: context transition
    • It transforms the current row context into an equivalent filter context
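
A minimal sketch of context transition, assuming a 'Product' table related to Sales and a [Sales Amount] measure defined as SUM ( Sales[Amount] ) (all names invented for the example): referencing a measure inside an iterator wraps it in an implicit CALCULATE, which turns the current row context on 'Product' into an equivalent filter context.

    -- No context transition: SUM() ignores the current 'Product' row,
    -- so every iterated row returns the grand total of all sales:
    Best Product Sales (wrong) =
    MAXX ( 'Product', SUM ( Sales[Amount] ) )

    -- The measure reference is wrapped in an implicit CALCULATE, so the
    -- row context on 'Product' becomes a filter on Sales; the result is
    -- the sales amount of the best-selling product:
    Best Product Sales =
    MAXX ( 'Product', [Sales Amount] )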

NEVER filter a table, always filter fields

Self-explanatory; this is mostly about performance.
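
A short sketch under the same assumed model: filtering the whole 'Product' table is usually far more expensive than filtering a single column, because the table filter drags the entire (expanded) table into the filter context.

    -- Filters an entire table: heavier and harder for the engine to optimize
    Red Sales (table filter) =
    CALCULATE ( [Sales Amount], FILTER ( 'Product', 'Product'[Color] = "Red" ) )

    -- Filters a single column: shorthand for FILTER ( ALL ( 'Product'[Color] ), ... )
    Red Sales =
    CALCULATE ( [Sales Amount], 'Product'[Color] = "Red" )

Note that the two versions are not always equivalent: the column predicate replaces any existing filter on the Color column, so wrap it in KEEPFILTERS if you need to keep that filter.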

Do I need Power BI Premium?

Data - Marc

I often get the question from customers: “Can you assign my workspace to a Premium capacity?” But frequently they do not really need Power BI Premium. It remains a difficult topic to decide whether someone needs Power BI Premium or not. Therefore, I decided to set up a decision tree that helps you decide.

This decision tree highlights a bunch of Premium-specific requirements and features, such as breaking the data size limits, XMLA endpoints, unlimited content sharing, and much more!

View original post 182 more words

Azure IoT Hub and Power BI Streaming dataflows

OK, time for some serious prototyping today. Based on the announcement below, I had the perfect use case to finally test some of the functionality provided by Azure IoT Hub and Power BI streaming dataflows.

Streaming dataflows in Power BI premium now available in public preview | Microsoft Power BI Blog | Microsoft Power BI

Some time ago I bought this cool DevKit after I had attended a Microsoft workshop in Munich.

An all-in-one IoT kit built for the cloud (microsoft.github.io)

The walkthrough in this article is very precise.

Connect an MXCHIP AZ3166 to Azure IoT Hub quickstart | Microsoft Docs

I was able to successfully install all the required software on my local machine. The same is true for the Azure part, and so I ended up with this: a piece of hardware sending real-time data to my Azure IoT Hub. How cool is that?

The Power BI part was the last thing to do. Again, I followed this well-written article.

Streaming dataflows (preview) – Power BI | Microsoft Docs

Et voilà, I ended up with data that travelled all the way from my local device to Azure, from there to Power BI, and back to Power BI Desktop on my local machine. I am still very impressed with the result. It was so much fun to glue all this stuff together. This really made me smile for the rest of the day.

Back on track – my learning journey

The first half of the year was very challenging for me. After a vacation that was, as always, too short, I’m ready to start learning again.

Today I took another Fundamentals exam. Some questions are pretty basic, some are not. Nevertheless, I like to take such exams, mainly to get a feeling for what is expected from new candidates in a certain field.

Microsoft Certified: Azure Data Fundamentals – Credly

Apart from that, I will spend three days refreshing my DAX skills. I will attend the Mastering DAX Workshop offered by sqlbi.com. I know some of their video courses, and they are brilliant. So I’m really looking forward to meeting these experts in person.

Mastering DAX Workshop – Amsterdam – September 30-October 2, 2021 – SQLBI

Microsoft Exam DA-100

In 2018 I passed the MCSA BI Reporting exam, which is no longer available.

So it was time to renew the Power BI part of that certification, which nowadays seems to be Exam DA-100.

To pass the exam you should have spent some time with Power BI. As I’m more interested in data design, connectors, and other technical stuff behind the scenes, the exam was a bit of a challenge for me. Nevertheless, it worked out…

https://www.credly.com/badges/a9b50481-2792-4399-950b-09419748a606/public_url