Azure Data Factory: How to Handle Multiple Sources with Dynamic Content in a Single Pipeline
I'm converting an old project for deployment to production and have been struggling with this for a few days now, so I could really use some help.

I'm working on an Azure Data Factory (ADF) pipeline that needs to process data from multiple sources, specifically Blob Storage and SQL Database, and I'm running into issues with dynamic content. I want to read data from different containers and tables based on the parameters passed to the pipeline, but I'm not sure how to set this up dynamically.

I've created a pipeline with a parameter for the source type. For example, if the source type is 'Blob', I want to dynamically set the container name and file path. However, when I use dynamic content in the Copy Data activity, I get the error:

> Failed to resolve linked service with name 'BlobLinkedService'.

Here's a simplified version of my pipeline JSON, including the parameters:

```json
{
    "name": "CopyDataPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlob",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "BlobSourceDataset",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "SqlSinkDataset",
                        "type": "DatasetReference"
                    }
                ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "SqlSink" }
                }
            }
        ],
        "parameters": {
            "sourceType": { "type": "String" },
            "containerName": { "type": "String" },
            "filePath": { "type": "String" }
        }
    }
}
```

I've also defined the datasets with parameters for the container name and file path, but the linked service doesn't seem to be resolved dynamically. I've tried using expressions in the dataset settings, such as `@pipeline().parameters.containerName`, but I keep hitting the same error.

Is there a specific way to ensure that the linked service and datasets resolve correctly when parameters are involved? Any guidance or best practices for dynamically handling multiple sources in Azure Data Factory would be greatly appreciated. (My development environment is Linux, in case that matters.) Thanks for taking the time to read this!
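For reference, here's a simplified sketch of how I've parameterized the source dataset. I've kept the legacy `AzureBlob` type since this comes from an old project; the format settings are just illustrative, and my real dataset has more configuration:

```json
{
    "name": "BlobSourceDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "BlobLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "containerName": { "type": "String" },
            "filePath": { "type": "String" }
        },
        "typeProperties": {
            "folderPath": {
                "value": "@dataset().containerName",
                "type": "Expression"
            },
            "fileName": {
                "value": "@dataset().filePath",
                "type": "Expression"
            },
            "format": { "type": "TextFormat" }
        }
    }
}
```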
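One of the approaches I tried was passing the pipeline parameters down to the dataset from the Copy activity's input reference, roughly like this (sketched from memory, so it may not exactly match my real JSON), but it still produced the same linked service error:

```json
"inputs": [
    {
        "referenceName": "BlobSourceDataset",
        "type": "DatasetReference",
        "parameters": {
            "containerName": "@pipeline().parameters.containerName",
            "filePath": "@pipeline().parameters.filePath"
        }
    }
]
```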