
Memberships

Learn Microsoft Fabric

Public • 4.2k • Free

4 contributions to Learn Microsoft Fabric
Migrating Power BI Reports into the Fabric Ecosystem
This post might highlight how much there is to learn, which is why I value any steer from the community. We are invested in using Fabric, but we have lots of 'legacy' Power BI reports being produced via the Power BI Report Server and Power BI Service platforms, and we have been testing the best way to migrate a legacy Power BI report into the Fabric ecosystem. So far we have identified two possible approaches:

1) Start from scratch: copy the transformation steps of the Power BI report into a dataflow, and change the source from the legacy location to an appropriate Fabric Lakehouse that holds the same data. We point the output destination of the dataflow to a 'gold' Lakehouse, which is essentially equivalent to the tables of the legacy Power BI report. We then need to manually recreate relationships and add any measures that existed in the legacy report... it becomes a manual nightmare!

2) Start with the legacy Power BI report: create a desktop version, modify the source so it points to the relevant Fabric-hosted Lakehouse tables, and then publish it to the relevant Fabric workspace. This seems to work... until it doesn't. We get the Semantic Model and Report Fabric items, and the latter has all of the legacy measures, but we struggle when we need to fix things: we currently have to go back to the desktop report, adjust things there, and then republish.

Option 1 is a lot of manual work, and option 2 requires a standalone desktop Power BI report to exist, so neither option is sustainable. I am specifically referring to the migration of legacy Power BI Report Server reports; I am assuming Power BI Service-hosted reports will be somewhat easier. "How on earth do you modify semantic models within Fabric when you have published the report to Fabric from Power BI Desktop?" might be the better title for this post :-)
New comment 22d ago
0 likes • 22d
Hi @Surm Man, thanks for your comment. Just to check: when you refer to 'uploading' PBIRS to MS Fabric, is that the same as publishing via Desktop? Is there a way to upload directly from PBIRS?
1 like • 22d
@Will Needham thanks so much, this is so relevant and very helpful.
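One way to avoid the round-trip through Power BI Desktop described in option 2 is to repoint the published semantic model's connection programmatically. As a rough, hedged sketch (the workspace/dataset IDs and server names below are hypothetical, and this assumes the Power BI REST `UpdateDatasources` endpoint applies to your source type), the rebind request could be built like this:

```python
import json

POWERBI_API = "https://api.powerbi.com/v1.0/myorg"

def build_rebind_request(group_id: str, dataset_id: str,
                         old_server: str, old_db: str,
                         new_server: str, new_db: str):
    """Build the URL and body for Power BI's UpdateDatasources call,
    which repoints a published semantic model at a new SQL endpoint
    (e.g. a Fabric Lakehouse SQL endpoint) without editing the .pbix.
    The actual call is a POST with an AAD bearer token."""
    url = (f"{POWERBI_API}/groups/{group_id}"
           f"/datasets/{dataset_id}/Default.UpdateDatasources")
    body = {
        "updateDetails": [{
            # Which existing datasource to replace...
            "datasourceSelector": {
                "datasourceType": "Sql",
                "connectionDetails": {"server": old_server,
                                      "database": old_db},
            },
            # ...and what to point it at instead.
            "connectionDetails": {"server": new_server,
                                  "database": new_db},
        }]
    }
    return url, body

# Hypothetical IDs and servers, for illustration only.
url, body = build_rebind_request(
    "ws-1234", "ds-5678",
    "legacy-sql.internal", "SalesDW",
    "mylakehouse.datawarehouse.fabric.microsoft.com", "SalesLakehouse",
)
print(url)
print(json.dumps(body, indent=2))
```

After a rebind like this succeeds, a refresh of the semantic model pulls from the Lakehouse instead of the legacy source; measure edits can then be made against the model in the service rather than in Desktop.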
Triggering Fabric Pipelines
Our current strategy is to use Databricks as a 'data engine', bringing relevant data into Databricks before shortcutting it to Fabric (Bronze) for subsequent processing and consumption. My question: is there a way to orchestrate the start of a Fabric data pipeline after the Databricks job that moves the raw data to Bronze finishes? It might seem a little unconventional to use Databricks to extract the data initially, but that is the direction we have adopted. Any guidance on how we might trigger the pipeline (in Fabric) after the Databricks extraction would be much appreciated.
New comment 29d ago
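One common pattern for the question above is to add a final task to the Databricks job that calls Fabric's job-scheduler REST endpoint to run the pipeline on demand. A minimal sketch, assuming the Fabric "run item job" endpoint and hypothetical workspace/pipeline IDs (you would still need to obtain an AAD token with the Fabric API scope):

```python
def build_trigger_request(workspace_id: str, pipeline_id: str):
    """Build the URL and query params for Fabric's on-demand
    item-job endpoint, which starts a Data Pipeline run."""
    url = (f"https://api.fabric.microsoft.com/v1"
           f"/workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances")
    params = {"jobType": "Pipeline"}
    return url, params

def trigger_pipeline(token: str, workspace_id: str, pipeline_id: str):
    """Fire the actual POST (needs the 'requests' package and a valid
    bearer token). On success the service typically returns 202 with a
    Location header you can poll for job status."""
    import requests  # third-party; available on Databricks clusters
    url, params = build_trigger_request(workspace_id, pipeline_id)
    resp = requests.post(url, params=params,
                         headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.headers.get("Location")

# Hypothetical IDs, for illustration only.
url, params = build_trigger_request("ws-abc", "pl-def")
print(url, params)
```

Running `trigger_pipeline(...)` as the last step of the Databricks workflow gives you "start Fabric pipeline when Bronze load finishes" without any polling on the Fabric side.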
Gaining stakeholder support for Fabric - with a security lens
Hi community, I am making good, steady progress with our Fabric deployment (still in POC stage). What I am struggling with is joining some dots and presenting a straightforward story to certain members of our security stakeholder community who are concerned about hosting data in Fabric. What I was hoping the community could help me with is a list of the top ten/twenty objections to deploying Fabric, so I can craft a suitable proactive response to each. I watched I, Robot for the first time at the weekend (recommended, 20+ years old now!) and what I relate to is this comment: "you must ask the right questions", or put another way, "have the answers to the questions before they are asked" :-)
New comment Jul 30
Monitoring Fabric Usage - best practice
I have just started a Fabric pilot and have a single F4 capacity. Our user community is being encouraged to start the Fabric trial licence, which I understand means they individually enjoy F64-SKU capacity. The plan is to migrate people over to our purchased capacity as and when their trials end. We want to gather some intelligence about resource usage across each trial to anticipate any challenges with our lower-SKU capacity. With this context, I have some questions: 1) How do I monitor the usage of the trial capacities? 2) Is the only way to accurately cross-charge Fabric usage via individual capacities, or is it possible to use reporting to show usage by workspace? Every day I learn a little more, so please share your own experiences.
New comment May 2
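The gap between the trial and purchased capacities in the post above is easy to quantify: F SKUs are rated in capacity units (CUs), and the daily smoothing budget is simply the CU rating sustained over 24 hours. A quick back-of-envelope sketch (assuming the trial behaves like an F64, as the post states):

```python
def cu_seconds_per_day(sku_cus: int) -> int:
    """A capacity delivers its CU rating every second, so the daily
    budget in CU-seconds is the rating times 86,400 seconds."""
    return sku_cus * 24 * 60 * 60

F4_CUS, F64_CUS = 4, 64

print(cu_seconds_per_day(F4_CUS))    # daily CU-second budget of the F4
print(F64_CUS // F4_CUS)             # how many times larger the trial is
```

The ratio (a trial capacity is 16x an F4) is the practical takeaway: workloads that run comfortably during the trial may throttle after migration, which is why measuring per-workspace consumption during the trials matters.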
Mark Thacker
@mark-thacker-6922
I help companies improve their data quality and data management by driving alignment between data and business objectives.

Active 2d ago
Joined Apr 29, 2024
United Kingdom