Activity
[contribution activity heatmap, Jan–Dec]

Memberships

Learn Microsoft Fabric — Public • 6.5k • Free
Learn Power Apps — Private • 2k • $3/m

10 contributions to Learn Microsoft Fabric
Connecting multiple Apache Airflow jobs
I have three Airflow jobs, each with different tasks: job_1 has task_1 and task_2, job_2 has task_3, and job_3 has task_4. What would be the best way to connect the jobs in that order? I noticed the data pipeline doesn't have a provision for an Airflow job activity (or something like that).
1
1
New comment 17d ago
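One common way to run separate Airflow DAGs in a fixed order is to have each DAG trigger the next one when its own tasks finish. Below is a minimal sketch, assuming a recent Airflow 2.x and reusing the DAG/task names from the question (job_1 with task_1 and task_2, triggering job_2); the EmptyOperator tasks are placeholders for the real work:

```python
# Minimal sketch, assuming a recent Airflow 2.x; DAG and task names follow the
# question (job_1 -> job_2 -> job_3) and the task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

with DAG("job_1", start_date=datetime(2024, 1, 1), schedule=None, catchup=False) as dag:
    task_1 = EmptyOperator(task_id="task_1")  # placeholder for the real task
    task_2 = EmptyOperator(task_id="task_2")  # placeholder for the real task

    # Kick off job_2 only after job_1's own tasks have completed;
    # job_2 would end with a similar trigger for job_3.
    trigger_job_2 = TriggerDagRunOperator(
        task_id="trigger_job_2",
        trigger_dag_id="job_2",
    )

    task_1 >> task_2 >> trigger_job_2
```

An alternative is to keep each downstream DAG on its own schedule and gate it with an ExternalTaskSensor that waits for the upstream DAG's final task.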
Reminder: Request a FREE DP-600 voucher starting today.
Hi everyone. The submission window for FREE DP-600 voucher requests opened an hour ago!... Make sure to submit a request before the vouchers run out. Check here 👍
19
26
New comment 20d ago
1 like • 21d
@Muhammad fahad Muhammad jawed on November 21st. I completed the challenge series on the 20th, and my progress didn't reflect. Someone from Microsoft contacted me this week; I shared the screenshots to prove I had completed the challenge and also sent the concern to the Fabric community. They sent me a voucher code in my inbox the next day.
1 like • 20d
@Muhammad fahad Muhammad jawed log in to Home - Microsoft Fabric Community and check your private messages.
Applying for Azure data engineering jobs
I'm looking for work and got a call from a recruiter. The job is an Azure data engineering job; the client uses Data Factory, Synapse and Databricks among others (apparently Fabric in future too, but that was not on the req list). I've been studying for DP-600 and preparing for the exam. I also have prior background in other data tools. I filled in a skill assessment form for the job and put most things to "no experience" or "1" for e.g. Synapse, which is true (on a scale of 1-5). The recruiter called me back saying "hey, you know very much already and are experienced, just put like 3 or 4 there". I'm a bit hesitant, but are the skills from e.g. Fabric Data Factory so transferable that I can just put 3 or 4 on Azure Data Factory? For context, I'm a Finn and in my culture belittling one's own skills is seen as a virtue. The recruiter was just laughing and said that if I was from some other culture I would have just put all 4s and 5s across the board :D
4
5
New comment 9d ago
3 likes • 20d
Well, it seems you are having imposter syndrome, which is fine; most people get that when getting into a new role, especially a senior one. Just be confident in what you are familiar with and use your first few weeks to play catch-up with the tools and terminology you are not familiar with, even if it means scheduling your free time to do that.
Having Trouble With The DP-600 Exam Voucher?
Hi guys. So, apparently there has been a challenge in getting vouchers after completing the Microsoft Ignite challenge: Microsoft Fabric. This is mainly because the challenge progress is not updating after completing all the modules. This results in Microsoft contacting you, requesting that you finish the modules so as to get the free voucher. The best way to resolve this and get a voucher is to send back a private message to whoever contacted you with a screenshot of the email you received after completing the challenge and one of the modules-completed page. You can also send the same (indicating your Fabric community username) via email to fabric-ready@microsoft.com. I hope it helps someone.
2
1
New comment 20d ago
Help Needed: Pipeline->Dataflows->Lakehouse->PowerBI
In the pre-Fabric days I was fairly good with Power BI and would use the Desktop for all the steps of importing data, transforming and then creating reports. The client I am working with has Fabric and we want to do it "properly", but I find I am getting lost at a few stages. I have a workspace with the premium feature enabled (the diamond icon). Can someone explain if this is possible? I may have the steps or technical terms mixed up, but this is my general understanding of what I'm trying to achieve:
1. Import an on-premises SQL database into Fabric (data pipeline?)
2. Create a Lakehouse for this data
3. Transform and clean the data (Dataflow)
4. Have a custom (or default) semantic model attached
5. Import the Lakehouse as a data source into Power BI Desktop so that it inherits the semantic model AND data
6. Create reports/dashboards in Desktop
7. Publish: once reports/dashboards are published they are refreshed based on the Lakehouse (frequency set by the Dataflow?)
8. Be able to modify the entire workflow as the needs evolve
At the moment this last step (modifying the workflow) seems to be the hardest part... If this is too vague then I can provide some specific examples of the steps where I feel like I am close to achieving this but am blocked. Thanks!
2
4
New comment 26d ago
2 likes • Nov 19
I think the main issue here is confidence. You have a pretty good understanding of what you need to do to build a good report for your client, and from what you have outlined as your steps, I think they are quite alright.

For importing on-premises data via a data pipeline, the copy data activity should come in handy: you should be able to create a new lakehouse (say Lakehouse_1) and new tables. You will be able to monitor the output and also create a schedule for when/how frequently to run the pipeline.

Before diving into Lakehouse_1 to create a Dataflow Gen 2, I would suggest you create Lakehouse_2 in your workspace first. Head back to Lakehouse_1, create a Dataflow Gen 2 and get data from Lakehouse_1. Do all your transformations using Power Query and set the destination for the transformed data to Lakehouse_2. You can create a refresh schedule here too.

In Lakehouse_2, manually add your table to the default semantic model. In the SQL analytics endpoint, head to the lakehouse settings and turn on sync for the default Power BI semantic model. Then head to your Power BI Desktop, get data via Power BI semantic models, select Lakehouse_2 and create your report.

I don't quite understand No. 8; maybe you can elaborate more so we can help figure out how to go about it.
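The comment above does the transform step with a Dataflow Gen 2 in the UI; purely as an illustration of the same Lakehouse_1 → Lakehouse_2 hop, here is a minimal PySpark sketch of what that step could look like in a Fabric notebook instead (the table and column names are assumed, and both lakehouses are assumed to be attached to the notebook):

```python
# Minimal sketch only, not the commenter's method: a Fabric PySpark notebook
# alternative to the Dataflow Gen 2 transform step. Assumes both lakehouses are
# attached to the notebook and that Lakehouse_1 holds a raw table named
# "sales_raw" with an "order_date" column (hypothetical names for illustration).
from pyspark.sql import functions as F

# `spark` is the session a Fabric notebook provides.
raw = spark.read.table("Lakehouse_1.sales_raw")  # raw table landed by the copy data activity

# Example clean-up: drop exact duplicates and standardise the date column.
clean = (
    raw.dropDuplicates()
       .withColumn("order_date", F.to_date("order_date"))
)

# Write the cleaned table to Lakehouse_2, where the default semantic model lives;
# the notebook can then be scheduled much like the dataflow's refresh.
clean.write.mode("overwrite").saveAsTable("Lakehouse_2.sales_clean")
```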
1 like • Nov 19
Yeah. Identify the number of data sources you have, map out the points at which you want to ingest them and when you plan to merge the data, and you will be fine. Then later, you can sit down with the team and identify how to optimize the process.
Wilfred Kihara
3
45 points to level up
@wilfred-kihara-4301
All about Data

Active 15h ago
Joined Mar 9, 2024