Activity

Memberships

Learn Microsoft Fabric

Public • 4.2k • Free

Fabric Dojo 织物

Private • 96 • $29/m

18 contributions to Learn Microsoft Fabric
Naming conventions for Fabric
Do you know of or recommend a naming convention or standard for Fabric items and workspaces?
3
8
New comment 15d ago
2 likes • 19d
This has been an area of hot debate in my organization. The most robust convention I’ve found online is https://www.advancinganalytics.co.uk/blog/2023/8/16/whats-in-a-name-naming-your-fabric-artifacts?hs_amp=true. At first I thought I liked it and brought it forward. The idea gained some traction in the discussion, but as soon as I went to implement it I immediately hated it. I feel like there are enough other indicators of the artifact type to avoid needing to include the type in the name. Using that convention, at least in my brain, makes for crowded, unreadable names. So I too am very curious to hear what others are doing!
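To make the "crowded names" point concrete, here is a tiny sketch of what a type-prefixed convention like the one in the linked post produces. The prefix table and helper below are my own illustration, not the convention's actual spec:

```python
# Hypothetical type prefixes for Fabric items; the exact abbreviations in the
# linked convention may differ -- these are illustrative guesses only.
TYPE_PREFIXES = {
    "lakehouse": "LH",
    "warehouse": "WH",
    "notebook": "NB",
    "pipeline": "PL",
    "semantic_model": "SM",
}

def convention_name(item_type: str, domain: str, descriptor: str) -> str:
    """Build an item name of the form <TYPE>_<Domain>_<Descriptor>."""
    return f"{TYPE_PREFIXES[item_type]}_{domain}_{descriptor}"

print(convention_name("lakehouse", "Sales", "Bronze"))     # LH_Sales_Bronze
print(convention_name("notebook", "Sales", "LoadOrders"))  # NB_Sales_LoadOrders
```

Since the Fabric UI already shows an icon and item type next to every name, the `LH_`/`NB_` prefixes mostly repeat information you can already see, which is the commenter's objection.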
2 likes • 19d
I do like the folders for helping give structure to the project, but even more I enjoy using task flows. Having a logical map of a workspace has been so helpful, and I love just clicking a task flow item in the flow diagram and having it filter out all the artifacts for that step. It’s really become how the projects are structured in my brain and makes finding things trivial (provided they have meaningful, descriptive names as you mentioned!!)
Career Advice required
Hi, I am currently working as a BI Analyst and I have good experience with SSIS, SQL, and Power BI. Now I would like to take DP-600 and look for work as a Fabric analytics engineer. But I am a bit skeptical about taking up DP-600 as I do not have experience using ADF or ADB. Please suggest whether I can take the DP-600 certification directly, or should get some work experience in ADF and ADB first. Thanks
5
5
New comment 21d ago
4 likes • 24d
I wouldn’t attempt the test without playing in Fabric a fair bit. Set up a trial tenant and work through each of the areas. Convert some projects you’ve done in SSIS into Fabric: build them as dataflows, then do them again in notebooks, and try them in both a lakehouse and a warehouse. Only when you try to do real-world stuff do you learn the nuances and limitations of each component. That’s what I’ve been doing lately, and I’m hoping it will make taking the test really simple. It’s definitely making the information stick in my head better than just doing the Learn courses.
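The "redo an SSIS package as a notebook" advice above can be sketched in miniature. This is plain stdlib Python purely for illustration; in a real Fabric notebook you would typically read with Spark and write to a lakehouse table, and the sample data here is made up:

```python
# Minimal extract-transform-aggregate sketch of an SSIS-style package
# redone as notebook code. Stand-in data; no Spark or Fabric APIs used.
import csv
import io

raw = io.StringIO("order_id,amount\n1,10.50\n2,19.99\n")

rows = list(csv.DictReader(raw))          # extract
for r in rows:                            # transform: cast and derive a flag
    r["amount"] = float(r["amount"])
    r["large_order"] = r["amount"] > 15

total = sum(r["amount"] for r in rows)    # aggregate (load step stand-in)
print(round(total, 2))  # 30.49
```

Porting even a toy flow like this into a dataflow, then a notebook, quickly surfaces where each Fabric component's transformations, typing, and destinations differ, which is the point of the exercise.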
Delete activity in a pipeline
Hi All, I create a small number of JSON files through an API, which get overwritten every time the pipeline runs. There is a very small chance that one day a JSON file may not be overwritten, because we have gotten rid of the object and are no longer interested in its history (it will be rare but possible). I thought: easy, I'll insert a Delete activity. But I can't seem to get the wildcard functionality to work (I am prolly being a dumbass here). I followed this link from MS, and wildcards should work; I have used them to ingest data in similar ways many times (https://learn.microsoft.com/en-us/fabric/data-factory/delete-data-activity). The activity step is super simple. If I take the wildcard out and leave the filename blank, it will delete all files in the subfolder, which could work, though I would like to build in some more finesse, incorporating it in a loop and targeting filenames more selectively. What am I missing here? Cheers, Hans
3
4
New comment 24d ago
1 like • 29d
I believe you need to select the wildcard file path radio button if you want to use wildcards.
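For intuition on what the wildcard should match once that option is selected: the `*`/`?` wildcards described in the linked Delete activity docs behave much like standard shell-style globbing. The snippet below illustrates that matching logic only (file names are made up; this is not the pipeline's actual engine):

```python
# Shell-style wildcard matching, as a stand-in for the Delete activity's
# "Wildcard file path" behaviour. Example file names are hypothetical.
from fnmatch import fnmatch

files = ["obj_100.json", "obj_101.json", "retired_200.json", "notes.txt"]

pattern = "obj_*.json"
to_delete = [f for f in files if fnmatch(f, pattern)]
print(to_delete)  # ['obj_100.json', 'obj_101.json']
```

Testing a pattern against your real file names like this is a quick way to check whether a wildcard is too broad or too narrow before wiring it into a loop in the pipeline.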
Request for Feedback: Resume with 1.5 Years of Data Consulting Experience
I'd appreciate your feedback on my resume, which reflects 1.5 years of experience as a Data Consultant. I was accepted by a major company, but the CEO, who didn't interview me, mentioned wanting someone with more experience. I also had an interview with another company, but they offered a lowball deal of about $5 per hour, with 40% taxes on top of that. I've been applying to many positions, but I'm still not getting interviews. I'm using the following as part of my cover letter.

Dear Hiring Manager xxxx,

I am writing to express my interest in the Data Analyst position at xxxx. With experience in data consulting and a history of improving analytics solutions, I believe I can contribute well to your team.

In my current role as a Data Consultant, I have the opportunity to work closely with clients, gaining an understanding of their unique data requirements and providing continuous support throughout the project lifecycle. My solid experience in reverse engineering legacy Power BI reports, optimizing data models, and generating enhanced reports to meet evolving business needs has refined my skills in Power BI, SQL, and reverse engineering. Additionally, my grasp of DAX has allowed me to introduce new KPIs and improve analytics, thereby enhancing decision-making for clients. Coupled with my Microsoft DP-600 and PL-300 certifications, these skills equip me with the expertise to excel in this role.

Moreover, I am committed to continuous learning and staying updated on best practices, and I am currently exploring Microsoft Azure (DP-203 certification). I am eager to leverage my capabilities to drive impactful insights and solutions for clients.

Thank you for considering my application. I am excited about the opportunity to discuss how my experience and skills align with the needs of your team. Please find my resume attached for your review.

Sincerely,
xxx
Data Consultant
1
6
New comment 28d ago
2 likes • 29d
Looks like a solid entry-level resume. There’s nothing super special about it, but it’s hard when you don’t have a ton of experience. Last hiring round for an entry-level job at my company, we had 60 similar resumes, and it’s really hard to pick between them. One thing I always like is resumes with links to GitHub or a portfolio. It really helps to see some actual code a candidate has written to get a feel for where they actually are.
Running a notebook on Fabric
Good day, team. A quick one: is it possible to run a notebook script that was created in Jupyter Notebook on a Fabric notebook, querying an on-premises database and saving the output to a local file folder? Will the processing be much improved in terms of running the notebook script? Thank you.
0
2
New comment 29d ago
1 like • 30d
Fabric notebooks do not currently have access to on-premises databases or file locations via the data gateway. You need to use dataflows or data pipelines to load on-prem data into a lakehouse prior to using Spark notebooks. The only way I can think of to get it in directly via notebooks is if you publicly expose your database, but I don’t know too many organizations willing to do that.
Robert Lavigne
Level 3 • 28 points to level up
@robert-lavigne-4079
Healthcare data engineer and Fabric enthusiast.

Active 4h ago
Joined Aug 8, 2024
INTJ
Canada