Activity

[Contribution activity heatmap, Dec to Nov, with a Less/More intensity legend]

Memberships

Learn Microsoft Fabric

Public • 5.5k • Free

16 contributions to Learn Microsoft Fabric
Fabric Capacity pricing vs resources (F2, F4, ...)
The price of Fabric capacities doubles when going up in F capacities. Does that mean a given F capacity has twice as many resources as the one below it? Apart from Memory and Compute, is any other resource increase factored into the price increase? If the increase in Memory and Compute alone governs the price increase, does going up a level in F capacity mean doubling the Memory and Compute? Can we say F64 has 8 times as many resources as F8?
0
0
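For the arithmetic behind the last question: assuming each F SKU is rated at a number of Capacity Units (CUs) equal to its SKU number (so F2 = 2 CU, F64 = 64 CU), the doubling works out as in the minimal sketch below. Whether memory and other resources scale by the same factor is exactly what the question asks, so treat the CU mapping as an assumption here.

```python
# Sketch of the F SKU doubling arithmetic, assuming CUs equal the SKU number.
f_skus = [2, 4, 8, 16, 32, 64, 128]

for smaller, larger in zip(f_skus, f_skus[1:]):
    print(f"F{larger} has {larger / smaller:.0f}x the CUs of F{smaller}")

# 8x, matching the question's F64 vs F8 comparison (in CU terms only)
print(f"F64 vs F8: {64 / 8:.0f}x the CUs")
```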
Future of Power Query vs Python for ETL in Fabric
Power Query (Dataflows Gen2) has a user-friendly GUI to generate ETL code, and hand coding is occasionally required to handle tricky cases. In contrast, implementing ETL in Python mostly involves hand-writing code. What is the future direction in MS Fabric in terms of Power Query vs Python? Is the Power Query engine being improved so that Python will not be required? Or is Python going to be the de facto ETL language in Fabric? Instead of investing time and effort to master both of these languages, is it worthwhile to focus on one and master it?
5
7
New comment Sep 12
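For context on the "hand-written code" side of the question above, this is a minimal sketch of what a PySpark ETL step in a Fabric notebook typically looks like; the table and column names are hypothetical, and Dataflow Gen2 would express the equivalent steps through the Power Query editor instead of code.

```python
# Minimal PySpark sketch of a hand-coded ETL step in a Fabric notebook.
# Table and column names (bronze_orders, gold_daily_sales, etc.) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: read a raw Lakehouse table
orders = spark.read.table("bronze_orders")

# Transform: filter, derive a date column, aggregate
daily_sales = (
    orders
    .filter(F.col("status") == "Completed")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_sales"))
)

# Load: write to a curated ("gold") Lakehouse table
daily_sales.write.mode("overwrite").saveAsTable("gold_daily_sales")
```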
Migrating Power BI Reports into the Fabric Eco System
This post might highlight there is so much to learn, which is why I value any steer from the community. We are invested in using Fabric, but we have lots of 'legacy' Power BI reports being produced via the Power BI Report Server and Power BI Service platforms. We have been testing the 'best way' to migrate a 'legacy' Power BI report into the Fabric ecosystem. So far we have identified two possible approaches:

1) Start from scratch, but copy the 'transformation steps' of the Power BI report into a dataflow. Modify the source from legacy to an appropriate Fabric Lakehouse that holds the same data. We are pointing the output destination of the dataflow to a 'gold' Lakehouse, which is essentially equivalent to the tables of the legacy Power BI report. We then need to manually recreate relationships and add any measures that existed in the legacy Power BI report... it becomes a manual nightmare!

2) Start with the legacy Power BI report: create a Desktop version, modify the source so it points to the relevant Fabric-hosted Lakehouse tables, and then publish it to the relevant Fabric workspace. This seems to work... until it doesn't. We get the Semantic Model and Report Fabric items; the latter has all of the legacy measures, but we struggle when we need to fix things, and we are currently having to go back to the Desktop report, adjust things there, and then republish.

Option 1 seems a lot of manual work, and option 2 requires a standalone Desktop Power BI report to exist, so neither option is sustainable. I am specifically referring to the migration of legacy Power BI Report Server reports; I am assuming Power BI Service hosted reports will be somewhat easier. "How on earth do you modify the semantic models within Fabric if you have published the report to Fabric from Power BI Desktop?" might be a better title for this post :-)
3
5
New comment Aug 28
0 likes • Aug 21
Have you considered uploading your *.pbix files downloaded from Power BI Report Server (PBIRS) to Microsoft Fabric? If your data sources remain on-premises, you'll need to configure an On-premises Data Gateway so that Power BI reports in the cloud can fetch data from on-prem sources. We have found that *.pbix files meant for PBIRS can also be uploaded to Power BI Service / Fabric without having to make any modifications (except the data source connectivity). If your on-prem *.pbix files use RLS, then those reports need some conversion and reconfiguration for RLS in the cloud.
0 likes • Aug 28
@Mark Thacker You can upload *.pbix files to MS Fabric. If you don't have the *.pbix files, you can download them from PBIRS. AFAIK, it is not possible to deploy *.pbix files directly from PBIRS to Fabric.
Slowly Changing Dimensions (SCD) Type 2 with Dataflow Gen2
Microsoft just released a new tutorial on implementing Type 2 SCD with a Dataflow: 👉🔗 https://learn.microsoft.com/en-us/fabric/data-factory/slowly-changing-dimension-type-two
21
19
New comment Sep 16
2 likes • Aug 27
@Axel C. If you are asking about the purpose of SCD Type 2, this is from Kimball, who formalised the SCD types: https://www.kimballgroup.com/2008/09/slowly-changing-dimensions-part-2/
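To make the SCD Type 2 mechanics concrete, here is a minimal PySpark sketch of the expire-and-insert pattern; the table and column names (dim_customer, staging_customer, address, valid_from, valid_to, is_current) are hypothetical, and the linked Microsoft tutorial implements the same idea with Dataflow Gen2 rather than code.

```python
# Minimal PySpark sketch of SCD Type 2: close out the old version of a changed
# row and append a new current version. Names are hypothetical; the linked
# tutorial builds the equivalent logic in Dataflow Gen2.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

dim = spark.read.table("dim_customer")       # customer_id, address, valid_from, valid_to, is_current
src = spark.read.table("staging_customer")   # customer_id, address (latest source snapshot)

current = dim.filter(F.col("is_current"))
history = dim.filter(~F.col("is_current"))

# Keys whose tracked attribute changed since the current dimension version
changed_keys = (
    current.alias("d")
    .join(src.alias("s"), "customer_id")
    .filter(F.col("d.address") != F.col("s.address"))
    .select("customer_id")
)

# 1) Expire the old current version of changed rows
expired = (
    current.join(changed_keys, "customer_id", "left_semi")
    .withColumn("valid_to", F.current_date())
    .withColumn("is_current", F.lit(False))
)
unchanged = current.join(changed_keys, "customer_id", "left_anti")

# 2) Insert a new current version for changed rows
new_versions = (
    src.join(changed_keys, "customer_id", "left_semi")
    .withColumn("valid_from", F.current_date())
    .withColumn("valid_to", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
)

# 3) Rebuild the dimension; in practice a Delta MERGE would update it in place
result = history.unionByName(unchanged).unionByName(expired).unionByName(new_versions)
result.write.mode("overwrite").saveAsTable("dim_customer_updated")
```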
List of scheduled ETL tasks (Dataflows, Pipelines, Notebooks)
Is there a unified place in Fabric that contains a list of all the ETL tasks that have been scheduled for execution? Similar to "SQL Server Agent > Jobs" in on-prem SQL Server?
0
2
New comment Aug 22
0 likes • Aug 22
@Will Needham What is a "metadata-driven approach to running ETL"?
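For readers hitting the same term: "metadata-driven" ETL usually means the tasks to run are listed in a control table and one generic job loops over them, instead of scheduling each dataflow or notebook separately. The sketch below is purely illustrative; the control table and its columns are hypothetical and not tied to any specific Fabric feature.

```python
# Illustrative sketch of a metadata-driven ETL loop in a Fabric notebook.
# The control table and its columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Control table: one row per source table to load (source_table, target_table, load_type)
tasks = spark.read.table("etl_control").collect()

for task in tasks:
    df = spark.read.table(task["source_table"])
    if task["load_type"] == "overwrite":
        df.write.mode("overwrite").saveAsTable(task["target_table"])
    else:  # append
        df.write.mode("append").saveAsTable(task["target_table"])
```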
Surm Man
Level 3 • 43 points to level up
@surm-man-2004
Power BI & Data Engineer. (MEng, DP-600, 70-778, DP-900, SSIS, SSAS, SSRS, SQL, Power BI)

Active 3d ago
Joined Mar 26, 2024