Activity

[contribution activity calendar]

Memberships

Learn Microsoft Fabric

Public • 5.5k • Free

CyberDojo

Private • 48.1k • Free

#TheTechHustle Community 🤝🏽

Private • 260 • $15/m

16 contributions to Learn Microsoft Fabric
Varchar max limit increased for data warehouse?
I found the blog post below, which states that Microsoft Fabric has increased the VARCHAR(MAX) limit for the data warehouse: values will no longer be truncated at 8 KB and can now be megabytes in size. It is mentioned as a private preview. Has anybody tried this feature? I just want to know whether it really works. If it does work and large string values are not truncated, I can wait for it before implementing a feature in my current project. Thanks. Article link: https://blog.fabric.microsoft.com/en-us/blog/working-with-large-data-types-in-fabric-warehouse/
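One way to sanity-check the preview is to round-trip a string well past the old 8,000-byte cap. A minimal Python sketch follows; the table and column names and the pyodbc round trip (commented out) are assumptions for illustration, not from the post:

```python
# Build a string well beyond the old 8,000-byte VARCHAR limit and
# confirm its size before sending it to the warehouse.
payload = "x" * 100_000  # ~100 KB, far past the old 8 KB cap

assert len(payload.encode("utf-8")) > 8000
print(f"payload is {len(payload):,} bytes")

# With the preview enabled, a round trip like this (pyodbc, hypothetical
# table/column names) should return the full, untruncated string:
#
# import pyodbc
# conn = pyodbc.connect(connection_string)
# cur = conn.cursor()
# cur.execute("INSERT INTO dbo.BigText (val) VALUES (?)", payload)
# row = cur.execute("SELECT val FROM dbo.BigText").fetchone()
# assert len(row[0]) == len(payload)  # not truncated at 8 KB
```

If the stored value comes back shorter than the payload, the warehouse is still truncating at the old limit.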
3
2
New comment 22d ago
Varchar max limit increased for data warehouse?
0 likes • 22d
@Mubaraq Abdulmaleek Yes, this is working now!
Seeking Efficient Method to Retrieve Latest Incremental Files in Microsoft Fabric Data Pipeline
I'm working on a Microsoft Fabric Data Pipeline where my use case is to get the latest files from a specific folder and process them incrementally. I have implemented the following logic:

1. Get Metadata activity (activity name: Get Meta data_PF) – retrieves the list of files inside the parent folder along with their properties:
   - Child Items
   - Exists
   - Item Name
   - Item Type
   - Last Modified
2. ForEach activity – iterates over the items returned by the Get Meta data_PF activity and applies the "Filter by Last Modified" option. The filtering uses:
   - Start Time (UTC): @variables('start_date')
   - End Time (UTC): @variables('end_date')

This pipeline runs daily and pulls files incrementally based on the date range. The files are fetched from an FTP server and copied to the Lakehouse for further processing.

Currently, I process two files per day (e.g., 2024-10-21_Find.csv and 2024-10-21_Auto.csv), but I'm concerned about scalability. If the folder eventually contains thousands or even 100k files, iterating through all of them to find the required ones could be a performance bottleneck.

My current idea: Since each file follows a date-based naming convention (e.g., 2024-10-21_Find.csv and 2024-10-21_Auto.csv), my plan is to:

1. Fetch the last successful run date from an audit table (e.g., 2024-10-19).
2. Find the difference between the last run date and the current date (e.g., 2024-10-22 - 2024-10-19 = 3 days).
3. Generate an array containing the expected dates for those days: ["2024-10-20","2024-10-20","2024-10-21","2024-10-21"], then transform it into: ["2024-10-20_Find.csv","2024-10-20_Auto.csv","2024-10-21_Find.csv","2024-10-21_Auto.csv"]
4. Use this array in the Copy activity to fetch only the relevant files without iterating through all files in the folder.

Senior's suggestion: A senior colleague suggested that instead of using ForEach to iterate over all files, I could use a Lookup activity to fetch the file list and order it by the "Last Modified" timestamp in descending order.
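The filename-generation idea (steps 1–3 above) can be sketched in plain Python; the function name and the suffix list are assumptions, and in the pipeline itself the same logic would live in a pipeline expression or a small notebook:

```python
from datetime import date, timedelta


def expected_files(last_run: date, today: date,
                   suffixes=("Find", "Auto")) -> list[str]:
    """Build the expected file names for every day after the last
    successful run, up to and including today."""
    names = []
    for i in range(1, (today - last_run).days + 1):
        day = (last_run + timedelta(days=i)).isoformat()
        names.extend(f"{day}_{s}.csv" for s in suffixes)
    return names


files = expected_files(date(2024, 10, 19), date(2024, 10, 21))
print(files)
# ['2024-10-20_Find.csv', '2024-10-20_Auto.csv',
#  '2024-10-21_Find.csv', '2024-10-21_Auto.csv']
```

The resulting list can then drive the Copy activity directly, so no listing or filtering of the full folder is needed.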
0
0
Array Variable passed to Notebook activity help
Hi everyone, it seems there is no straightforward way of passing an array variable from a ForEach activity as input to a Notebook activity. I would love to hear how this is being handled in the community. Any docs or examples would be fantastic. Thank you.
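One common workaround (an assumption here, not confirmed in the thread): since notebook base parameters only accept scalar values, serialize the array to a JSON string in the pipeline (e.g. with an expression like @string(variables('file_list'))) and parse it back inside the notebook. A minimal sketch of both sides in Python:

```python
import json

# Pipeline side (conceptually): the array is flattened to a JSON string
# before being handed to the Notebook activity as a string parameter.
param_value = json.dumps(["2024-10-20_Find.csv", "2024-10-20_Auto.csv"])

# Notebook side: the parameter arrives as a plain string and is parsed
# back into a Python list before use.
file_list = json.loads(param_value)

assert isinstance(file_list, list)
for f in file_list:
    print(f)
```

The same round trip works for any JSON-serializable structure, not just arrays of strings.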
0
6
New comment Oct 9
0 likes • Oct 9
I think this might help anyone who needs an immediate code-based solution.
How to Configure Tumbling Window Triggers in Microsoft Fabric Similar to Azure Data Factory for Dependent Pipeline Execution?
How can I implement a Tumbling Window Trigger in Microsoft Fabric, similar to Azure Data Factory, where the trigger starts a pipeline based on the successful completion of another pipeline? What are the steps to configure such a dependent trigger in Microsoft Fabric, and how does it compare to Azure Data Factory's Tumbling Window Trigger? Thanks in advance for any help or insights! @Will Needham, would you please share your thoughts on this?
0
0
Data Warehouse Pipeline - files from FTP server
Hello, I’m really new to Fabric and I’m trying to configure a Data Warehouse pipeline to get CSV files from a `root` folder on an FTP server. The challenge is that it doesn’t pick up the whole folder, only the first file. I’ve already checked this with ChatGPT; it says the menu should have an option to select a folder, but that option doesn’t appear. Any hints on how this could be fixed? Thanks in advance. Daniel
0
4
New comment Oct 6
Data Warehouse Pipeline - files from FTP server
0 likes • Oct 4
If you need to select only one specific file from the root, I'm not sure how to do that part. But if you don't want the copy to look into other folders, there's an option in the Copy activity: disable the setting called "Recursively". That stops it from searching inside any subfolder of your root directory.
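The effect of the "Recursively" flag can be illustrated with plain Python (local folders standing in for the FTP source, purely as an analogy):

```python
import os
import tempfile

# Sample layout: root/ holds one file directly plus one file in a subfolder.
root = tempfile.mkdtemp()
open(os.path.join(root, "top.csv"), "w").close()
os.mkdir(os.path.join(root, "sub"))
open(os.path.join(root, "sub", "nested.csv"), "w").close()

# "Recursively" enabled: every subfolder is walked, all files are found.
recursive = sorted(f for _, _, files in os.walk(root) for f in files)

# "Recursively" disabled: only direct children of the root are copied.
flat = [e.name for e in os.scandir(root) if e.is_file()]

print(recursive)  # ['nested.csv', 'top.csv']
print(flat)       # ['top.csv']
```

With the flag off, `sub/nested.csv` is never touched, which matches the behavior described above.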
0 likes • Oct 6
@Daniel Rodríguez OK, then would you please try the option "Schema agnostic (binary copy)"? Store the data as files in the Lakehouse, and then you can do further transformation within the Lakehouse or move it to the Warehouse.
Avinash Sekar
2
10 points to level up
@avinash-sekar-5216
Learner

Active 17d ago
Joined Jun 10, 2024