Activity

[Contribution calendar widget]

Memberships

Learn Microsoft Fabric

Public • 7k • Free

Fabric Dojo 织物

Private • 210 • $39/m

18 contributions to Learn Microsoft Fabric
Fabric Losses?
There's a section here for Fabric wins, but could we have one for losses? Or misses? Ha! I'm only half kidding. I wanted to share something I learned recently about Fabric, specifically related to deployment pipelines. We use deployment pipelines in most of our client projects. Typically we follow dev, test, prod, but sometimes we use dev, test, qa, prod. We really enjoyed and benefited from deployment pipelines for several years before using Fabric; they work wonderfully with semantic models and reports. However, with Fabric data engineering items, not so much. Though, I'm told there's hope that there will be improvements soon.

Things I've learned about Fabric items when using deployment pipelines:
1. Dataflows don't work with deployment pipelines. You must manually export the JSON and import it into your next workspace, which means any changes made in dev that are ready to move to test have to be moved over by hand.
2. Data pipeline connections have to be manually configured whenever you deploy from one workspace to another. Connections in pipelines can't currently be parameterized, so you can't set up deployment rules to switch the connection like you can for other items (such as the lakehouse a semantic model points to, or a parameterized connection string in a semantic model).
3. SQL tables will migrate from a data warehouse, but the data won't come with them. The data will need to be loaded manually, or with a notebook or some other automation (see the notebook sketch below).
4. Similarly, manually loaded tables in a lakehouse don't get copied over. They will need to be manually created in the new lakehouse (tables created from notebooks can be set to auto-create in the new workspace, provided you copy your notebook over as well).
5. Shortcuts don't work with deployment pipelines; they also need to be created manually.

I'm told that parameterization of data pipeline connections is coming in Q1 2025, and that dataflows are also set to start working in deployment pipelines in Q1 (though they were originally supposed to be available in Q4 2024).
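For points 3 and 4, one option is a small notebook that reloads the table after each deployment. Here is a minimal PySpark sketch, assuming it runs in a Fabric notebook with the target lakehouse attached as the default; the source path "Files/raw/sales.csv" and table name "sales" are made-up placeholders, not anything from the post:

```python
# Minimal sketch: reload a lakehouse table after deploying to a new workspace.
# Assumes a Fabric notebook where `spark` is the pre-created SparkSession and
# a default lakehouse is attached; the path and table name are placeholders.
df = (
    spark.read
    .option("header", "true")
    .csv("Files/raw/sales.csv")      # source file in the lakehouse Files area
)

(
    df.write
    .format("delta")
    .mode("overwrite")               # recreate the table on every run
    .saveAsTable("sales")            # register the Delta table in the lakehouse
)
```

If the notebook is included in the deployment pipeline, running it in the target workspace after each deploy is one way to repopulate the tables that don't carry their data over.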
3
4
New comment 3h ago
2 likes • 5h
Oh, and lakehouse schemas don't come over via deployment pipelines either.
Approach to attempt DP-600 exam
Hi community members, I have the DP-600 exam scheduled on 30/12. For exam prep I have gone through Will's 6-hour video and the links provided under further learning resources. My question is: would this be enough to pass the exam with 700 points? Thanks
4
14
New comment 3h ago
1 like • 5h
congrats!
Introducing Lakehouse Schemas 👀(public preview)
A new (long awaited) feature has been released, but we are still waiting for an official release blog; for now, there is a documentation page for the feature. It's the ability to create schemas in a Lakehouse! Traditionally, it was only in the Data Warehouse that you could create schemas, but now it's possible in the Lakehouse too. This opens up a lot of different possible architectural patterns! Have a play around and see what you think. When you create a new Lakehouse, you will have the option to select 'Lakehouse Schemas'. Then, when you open your Lakehouse, you will see a few new options:
- New schema
- New schema shortcut (interesting!)
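If you want to try the feature from a notebook rather than the UI, here is a small sketch using Spark SQL through the notebook's built-in `spark` session; the schema name "silver" and table name "dim_customer" are just example names, not from the post:

```python
# Sketch: create a schema and a table inside it, in a schema-enabled lakehouse.
# Assumes a Fabric notebook attached to a lakehouse created with the
# 'Lakehouse Schemas' option; "silver" and "dim_customer" are example names.
spark.sql("CREATE SCHEMA IF NOT EXISTS silver")

spark.sql("""
    CREATE TABLE IF NOT EXISTS silver.dim_customer (
        customer_id   INT,
        customer_name STRING
    ) USING DELTA
""")
```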
44
29
New comment 5h ago
Introducing Lakehouse Schemas 👀(public preview)
0 likes • 5h
I was on a call with a Microsoft MVP and one of his employees earlier this morning and we were talking about Deployment Pipeline limitations and schemas in lakehouses came up. Apparently one of the current limitations with deploying lakehouses is that schemas aren't carried over from one workspace lakehouse to another. So that's another thing to keep in mind.
The fastest way to (really) learn Fabric
The DP-600 exam is great for giving you an appreciation for a wide range of topics, but does it really prepare you for life as an Analytics Engineer? From speaking with a few of you, it seems you don't feel ready or comfortable yet, which is completely understandable! The DP-600 tests your understanding of the theory, but in the real world you are measured by your experience and ability to build solutions (how you can apply the theory). As such, I always advise people to get hands-on as early and as frequently as possible. One of the best Microsoft resources is this list of hands-on exercises, which guide you through a number of scenarios and help you build confidence with the different tools that Fabric provides. Have you tried these resources yet? Let me know what you think!
PS: I'm thinking about building a collection of hands-on tutorials which cover a lot of the core analytics engineering / data engineering (and maybe wider?) tasks that you could expect in your career - would you find that useful?
56
52
New comment 1h ago
The fastest way to (really) learn Fabric
0 likes • 5h
@Anthony Baulo - I agree with this. I have taken some courses from Maven, and I really like that they give a "real world" example of a "request" that you have to develop a solution for. They tell you to attempt it on your own first, and then in the next module of the video they walk you through the solution.
Passed DP-600!
Yesterday, I passed the DP-600 exam with a score of 883! I dedicated two weeks to studying the theoretical material, but my three months of hands-on experience with Microsoft Fabric at work were absolutely essential, particularly for the Data Engineering portion of the exam. Here are some tips for anyone preparing for this certification:
1. MS Learn strategy: Save MS Learn for the final part of the exam to review marked questions. It can help you answer 3–4 questions with certainty. Syntax-related questions (Spark, SQL, KQL, etc.) are usually the easiest to cross-check in MS Learn.
2. Pre-November 15th content: Material from before November 15th is still highly relevant. Special thanks to @WillNedham for the incredible 6-hour video resource!
3. First-time OnVUE exams: If this is your first OnVUE exam, read tips about the exam process itself. Understanding the format (sandbox environment, practice tests, case studies, etc.) is crucial for being well prepared.
4. I found the dataflow-related questions challenging because they included large images and were quite detailed.
Good luck to everyone studying for the DP-600!
Credential Link: https://learn.microsoft.com/en-us/users/alejandroferrerahernandez-6233/credentials/9c251753e99e7888?ref=https%3A%2F%2Fwww.linkedin.com%2F
13
8
New comment 3h ago
2 likes • 5h
congrats!!! I agree that the pre-Nov video is still VERY applicable to the current exam. I took it on 12/20.
Lori Keller
3
20 points to level up
@lori-keller-7509
I am a Director for a data engineering and analytics consulting firm. I am tasked with ensuring our clients have scalable solutions.

Active 7m ago
Joined Dec 16, 2024