Activity

[Contribution activity heatmap]

Memberships

Learn Microsoft Fabric
Public • 5.5k • Free

Nightscape Photography School
Private • 497 • Free

Content Savage Squad
Public • 331 • Free

Film & TV Skool
Private • 303 • Free

Landscape Photography Academy
Private • 74 • $7/m

Event Photography 101
Public • 137 • Free

Real Estate Photography
Private • 26 • Free

Fabric Dojo 织物
Private • 204 • $39/m

10 contributions to Learn Microsoft Fabric
Creating .zip in notebook
I have a bunch of Delta tables which I need to save as .json files and zip together to submit to a third-party tool for further processing. I was able to get all the Delta tables into .json files in a single folder. My consolidated .json folder is 'Files/temp/consolidated/' (see image below), but when trying to zip them together I get a path-not-found error on zipf.write(json_file_path, arcname=f).

Code:

    with zipfile.ZipFile('sample.zip', 'w') as zipf:
        for f in filenames:
            json_file_path = f"Files/temp/consolidated/{f}"
            print(json_file_path)
            zipf.write(json_file_path, arcname=f)

What I learnt:
- Microsoft Fabric context: the Fabric environment uses a virtual file system, which doesn't provide direct file path access as in traditional file systems.
- zipfile.ZipFile.write limitation: this method expects a direct file path, which isn't available for Fabric-relative paths.

Has anyone had this scenario, and how did you get around it?
0
1
New comment 25d ago
Creating .zip in notebook
2 likes • 25d
This is now sorted out; it should be the API (mounted) path to the folder.

    import zipfile

    files = [file.name for file in mssparkutils.fs.ls('Files/temp/json/consolidated') if file.isFile and file.name.endswith('.json')]
    archive = "/lakehouse/default/Files/temp/json/consolidated/brdsalljsondeflated.zip"

    # Open the zip archive in write mode with DEFLATE compression
    with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for file in files:
            # Provide the full path to the file being added
            zf.write(f'/lakehouse/default/Files/temp/json/consolidated/{file}', arcname=file)

Good reference: All The Ways to Compress and Archive Files in Python
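A related pattern that avoids writing the archive directly into OneLake is to build it on the driver's local disk first and copy it across when finished. A minimal sketch, assuming the default lakehouse is mounted at /lakehouse/default and that mssparkutils.fs.cp accepts a file:// source path (paths and file names here are illustrative):

    import os
    import zipfile

    src_dir = "/lakehouse/default/Files/temp/json/consolidated"  # mounted lakehouse path
    local_zip = "/tmp/consolidated.zip"                          # driver-local scratch file

    with zipfile.ZipFile(local_zip, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for name in os.listdir(src_dir):
            if name.endswith(".json"):
                # arcname keeps the archive flat instead of embedding the full path
                zf.write(os.path.join(src_dir, name), arcname=name)

    # Copy the finished archive to a lakehouse-relative location
    mssparkutils.fs.cp(f"file://{local_zip}", "Files/temp/json/consolidated.zip")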
Lakehouse - Folders / Files
Hi everyone, has anyone experienced issues with folders or files being deleted in a Fabric Lakehouse? I had a folder named 'Attachments' in the Lakehouse, containing around 350k documents as part of a data migration process. These documents were converted from base64 strings and saved into the Lakehouse folder. However, this folder has now disappeared (not sure how :)). Does anyone know how to track the audit trail to determine what caused its deletion, or whether there's a way to restore it? Raised a ticket with MS and awaiting a response.

MS doc on soft delete for OneLake files: https://learn.microsoft.com/en-us/fabric/onelake/onelake-disaster-recovery#soft-delete-for-onelake-files
1
5
New comment Oct 23
1 like • Oct 21
The capacity the workspace was attached to did not have BCDR or soft delete enabled. MS says that to recover the folder they first need to confirm it was actually deleted, for which they need to look into the audit logs. BCDR is now enabled on the capacity :)
3 likes • Oct 23
We got this sorted out. We looked into the audit logs for 'DeleteFileOrBlob' but couldn't find anything; digging further into other actions, we found that the folder had been mistakenly moved into a sub-folder by one of the contributors. The thing is, when you delete a file or folder you get a user confirmation, but not with a move action. We checked with the Product Group whether a folder can be locked against delete and move, and the response was: "Write (e.g., move, delete) includes Read, so denying write also denies read; therefore we currently cannot disable move/delete."
Notebooks
Newbie question: I'm assigning values to variables in a notebook and I want to submit those variables as parameters to another notebook. What is the recommended way of doing this? Thanks
3
2
New comment Sep 17
3 likes โ€ข Sep 17
check this :)
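For anyone finding this later: the usual pattern is mssparkutils.notebook.run, which accepts a parameter dict. A minimal sketch, assuming a child notebook named "child_notebook" (hypothetical) in the same workspace whose parameter cell defines run_date and table_name:

    # mssparkutils is built into Fabric notebooks; the explicit import also works
    from notebookutils import mssparkutils

    run_date = "2024-06-01"
    table_name = "sales"

    # Run the child notebook with a 300-second timeout; the dict values override
    # the variables declared in the child's parameter cell. The return value is
    # whatever the child passes to mssparkutils.notebook.exit().
    result = mssparkutils.notebook.run(
        "child_notebook",
        300,
        {"run_date": run_date, "table_name": table_name},
    )
    print(result)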
What are you working on this week (preferably with Fabric)?
Happy Monday to all the Fabricators! I'm interested... what are you building this week? Any blockers that we can help with?
8
36
New comment Jul 11
What are you working on this week (preferably with Fabric)?
1 like • Jul 8
I am looking into configuring a Service Principal for an external application to access a Fabric workspace, read data via the SQL endpoint, and write back to the Fabric Warehouse. Has anyone experienced this, and what challenges did you encounter?
0 likes • Jul 8
@Sujitkumar Chavan check this, may be helpful => Microsoft Fabric – So how do I build a Data Warehouse without Identity Columns and Merge? - Purple Frog Systems
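For reference, a minimal connectivity sketch for the scenario above, assuming a service principal that has already been granted workspace access, pyodbc with the "ODBC Driver 18 for SQL Server" installed, and placeholder values for the endpoint and credentials:

    import pyodbc

    conn_str = (
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"  # from the SQL endpoint's connection settings
        "Database=<your_warehouse>;"
        "Authentication=ActiveDirectoryServicePrincipal;"
        "UID=<client_id>;"      # the service principal's application (client) ID
        "PWD=<client_secret>;"
        "Encrypt=yes;"
    )

    with pyodbc.connect(conn_str) as conn:
        # Simple smoke test: list one table through the SQL endpoint
        row = conn.cursor().execute("SELECT TOP 1 name FROM sys.tables").fetchone()
        print(row)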
Creating surrogate key with an Identity column in Delta lake
Just wanted to know how others are dealing with creating a surrogate key with auto-increment in a Delta table. This is supported in Databricks, but not in Fabric.

In Databricks:

    CREATE OR REPLACE TABLE demo (
        id BIGINT GENERATED ALWAYS AS IDENTITY,
        product_type STRING,
        sales BIGINT
    );

Going forward, the identity column "id" will auto-increment whenever you insert new records into the table. You can then insert new data like so:

    INSERT INTO demo (product_type, sales) VALUES ("Batteries", 150000);
0
2
New comment Jun 20
0 likes • Jun 20
That's the way we've got it working now.
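For anyone looking for the workaround pattern itself, one common approach in Fabric Spark is to offset new rows by the table's current maximum key using row_number(). A minimal sketch, assuming a notebook with the implicit spark session and an existing Delta table demo (as in the Databricks example above) with a BIGINT id column:

    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    new_rows = spark.createDataFrame([("Batteries", 150000)], ["product_type", "sales"])

    # Current maximum surrogate key (0 if the table is empty)
    max_id = spark.table("demo").agg(F.coalesce(F.max("id"), F.lit(0))).first()[0]

    # row_number() needs an ordering; ordering by a constant pulls all rows into
    # one partition, which is acceptable for modest insert batches
    w = Window.orderBy(F.lit(1))
    keyed = new_rows.withColumn("id", F.row_number().over(w) + F.lit(max_id))

    keyed.select("id", "product_type", "sales").write.format("delta").mode("append").saveAsTable("demo")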
1-10 of 10
Sreedhar Vengala
Level 2
1 point to level up
@sreedhar-vengala-7284
Cloud Data Architect primarily working with Microsoft Technologies, based in Brisbane, Australia.

Active 4h ago
Joined Jun 7, 2024