This is definitely an area I will be putting a lot of energy into in 2022. Welcome to the new world.
Some time ago I wrote a post on how to save a file from Dynamics 365 Business Central SaaS to an FTP server. The solution can be found here.
After that post I received lots of requests about how to do the opposite: if someone uploads a file to an FTP server, how can I retrieve the file, parse it, and then save the data into Dynamics 365 Business Central SaaS?
There are certainly different ways of doing that, but (as promised to some of you over the past days) in this post I want to describe what I think is one of the best solutions in terms of performance, scalability, and reliability. The schema of the solution is the following:
Here, we’re using an Azure Logic App for connecting to an FTP server (via the FTP connector). The Logic App retrieves the content of the uploaded…
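As a rough, hypothetical sketch of that shape (the trigger and action names, the recurrence, and the Business Central endpoint below are placeholders of mine, not taken from the original post; the real workflow is built visually in the Logic Apps designer), a workflow definition along these lines would poll the FTP server and hand the file content on:

```json
{
  "definition": {
    "triggers": {
      "When_a_file_is_added": {
        "type": "ApiConnection",
        "recurrence": { "frequency": "Minute", "interval": 5 }
      }
    },
    "actions": {
      "Get_file_content": {
        "type": "ApiConnection",
        "runAfter": {}
      },
      "Post_to_Business_Central": {
        "type": "Http",
        "runAfter": { "Get_file_content": [ "Succeeded" ] },
        "inputs": {
          "method": "POST",
          "uri": "https://<business-central-api-endpoint>",
          "body": "@body('Get_file_content')"
        }
      }
    }
  }
}
```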
View original post (1,245 more words)
In preparation for the PL-200 exam, I decided to give the PL-100 exam a try.
And I’m pleased it worked out quite well.
You still have almost a month to sign up for the challenge.
Don’t miss the opportunity to get a free exam voucher.
Power BI dataflows are a nice capability of Power BI. It’s actually not a new Microsoft product. If you’ve used Excel and Power BI long enough, there is a big chance that you’ve already used Power Query at least once. And basically, to keep things simple, dataflows are just Power Query, except that you edit it in a web browser, in the Power BI Service.
A Pro license is your entrance ticket to this nice capability. Although Premium licenses (per capacity or per user) provide more advanced features, many applications of Power BI dataflows are already available with a Pro license. One of them is storing historical data and, of course, showing it in reports.
In this blog post, I will briefly introduce Power BI dataflows (with some useful links if you want to learn more). Next, I’ll share how to store historical data in your own storage account and show…
View original post (3,046 more words)
Inspired by this great video, and as a follow-up to my last post about the Mastering DAX workshop, I want to tell you about my findings regarding the DAX language:
DAX is simple,
but it is not easy.
This is so true. When I was sitting in the classroom, every example we did showed me how true this statement is. If you see a working DAX formula, you are tempted to say: “OK, no problem, this is easy.” But writing a DAX measure that really works all the time – no matter what the user does in terms of filtering and grouping – is hard work.
If your measure is working in one situation but not in all situations
then your measure was wrong from the beginning.
Again, in many cases it seems that a measure is working. But as soon as you change some parameters or add an extra layer of complexity, it doesn’t work at all. In many cases this means going back to the start and doing your work again.
Stick with the basics
Don’t bother with sophisticated functions.
Put all your efforts into filter context, row context, iterations, and context transition. I am still learning the hard way that this is the only way to success.
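To make those basics concrete, here is a small, made-up illustration (the table and column names are mine, not from the workshop) of how row context and context transition show up in even the simplest measures:

```dax
-- Row context: SUMX iterates Sales row by row and evaluates the
-- expression once per row.
Sales Amount := SUMX ( Sales, Sales[Quantity] * Sales[Net Price] )

-- Context transition: referencing the measure [Sales Amount] inside an
-- iterator implicitly wraps it in CALCULATE, turning the row context
-- over Customer into a filter context, so each customer gets its own
-- total before AVERAGEX averages them.
Avg Sales per Customer := AVERAGEX ( Customer, [Sales Amount] )
```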
You can’t learn DAX by copying formulas
or patterns from the internet or books.
I tend to learn new things by browsing the internet. If I find a solution that points me in the right direction, I usually succeed in solving my problem. With DAX, I first thought: OK, cool, I’ll do it the same way. Sometimes it seemed to work, but unfortunately other times it didn’t. So learning the basics is essential; don’t just insert some CALCULATE command here and there.
You can’t learn DAX by reading books
Only by practicing.
I have bought quite a few books in the past, and by studying them I was fooling myself. I thought I knew enough about DAX. But the truth is I knew nothing. As soon as you work on real-life examples, you realize that without a solid understanding of the basics you are caught in an endless trial-and-error loop.
So having learned all of this, I feel I am at least pointed in the right direction.
With every project I will gain confidence but nevertheless frustration will be my good companion for quite some time.
Stick to the basics
and try to think the DAX way.
Although I have been following as many areas of Power BI as possible for a really long time now, the DAX language is still challenging for me. By now, though, I think that’s the case for everyone…
So, as a refresher, I recently attended the Mastering DAX workshop run by the well-known SQLBI heroes.
I can now compare it with the online course they also offer and with an in-person training.
I have to say, there are worlds between them. So many small details that have very big effects in DAX can only really sink in face to face.
The most important takeaways for me are nothing new, but no matter what level you are at with DAX, you have to keep reminding yourself of them:
Obey the rules
There are often several ways to write a DAX formula, but in principle there is almost always only one correct solution.
- The filter context filters data
- The row context computes data
Simply put: if the result of a formula is wrong, 99.99% of the time this is the cause of the problem.
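A small, hypothetical example of the two rules (the model names are illustrative, not from the workshop):

```dax
-- Row context computes data: in a calculated column the expression is
-- evaluated once per row of Sales; no filtering happens here.
Sales[Line Amount] = Sales[Quantity] * Sales[Net Price]

-- Filter context filters data: the slicers, rows, and columns of the
-- report reduce the rows visible to SUM before it aggregates.
Total Amount := SUM ( Sales[Line Amount] )
```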
CALCULATE solves many problems
The key, however, is to keep reminding yourself when CALCULATE does what, why CALCULATE does it, and when CALCULATE is not needed at all.
- CALCULATE creates new challenges
- Keyword: context transition
- It transforms the current row context into an equivalent filter context
NEVER filter a table, always filter fields
Self-explanatory; this is mainly about performance.
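As a sketch of what that rule looks like in practice (hypothetical model, assuming a [Sales Amount] measure exists):

```dax
-- Avoid: FILTER over the whole table forces the engine to materialize
-- every column of Sales just to test one of them.
Red Sales Slow :=
CALCULATE ( [Sales Amount], FILTER ( Sales, Sales[Color] = "Red" ) )

-- Prefer: a column predicate, which internally expands to a filter
-- over just the single column, FILTER ( ALL ( Sales[Color] ), ... ).
Red Sales :=
CALCULATE ( [Sales Amount], Sales[Color] = "Red" )
```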
I often get this question from customers: “Can you assign my workspace to a Premium capacity?” But frequently they do not really need Power BI Premium. It remains a difficult topic to decide whether someone needs Power BI Premium or not. Therefore, I decided to set up a decision tree that helps you decide whether you need Power BI Premium.
This decision tree highlights a bunch of Premium specific requirements and features like breaking the data size limits, XMLA Endpoints, unlimited content sharing and much more!
View original post (182 more words)
Find the right course
And you’re ready to put your knowledge to the test:
OK, time for some serious prototyping today. Based on the announcement below, I had the perfect use case to finally test some of the functionality provided by Azure IoT Hub and Power BI streaming dataflows.
Some time ago I bought this cool DevKit after I had attended a Microsoft workshop in Munich.
The walkthrough in this article is very precise.
I was able to successfully install all the required software on my local machine. The same was true for the Azure part, and so I ended up with this – a piece of hardware sending real-time data to my Azure IoT Hub. How cool is that?
The Power BI part was the last thing to do. Again, I followed this well written article.
Et voilà – I ended up with data that went all the way from my local machine to Azure, from there to Power BI, and back to my local Power BI Desktop. I am still very impressed with the result. For me it was so much fun gluing all this stuff together. It really made me smile for the rest of the day.
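For flavor, here is a minimal Python sketch of the kind of JSON telemetry message such a device pushes to IoT Hub. The sensor values, field names, and device ID are simulated stand-ins of my own; on the real DevKit the firmware does this, and with the Azure IoT device SDK you would hand the resulting string to the hub (check the SDK docs for the exact call).

```python
import json
import random
import time

def read_sensor():
    """Simulated DevKit reading; stand-in values for the board's
    temperature/humidity sensors, purely for illustration."""
    return {
        "temperature": round(random.uniform(20.0, 30.0), 2),
        "humidity": round(random.uniform(40.0, 60.0), 2),
    }

def make_telemetry(reading, device_id="my-devkit"):
    """Build the JSON telemetry string the device would send to IoT Hub."""
    payload = dict(reading)
    payload["deviceId"] = device_id
    payload["timestamp"] = time.time()
    return json.dumps(payload)

# On real hardware this string would go to the azure-iot-device SDK's
# send_message call instead of stdout.
print(make_telemetry(read_sensor()))
```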