CodexBloom - Programming Q&A Platform

Azure Data Factory: Error Copying Data from Azure SQL Database to Blob Storage with Incorrect Format

👀 Views: 0 💬 Answers: 1 📅 Created: 2025-06-12
azure data-factory blob-storage json

I tried several approaches but none seem to work. I'm trying to use Azure Data Factory (ADF) to copy data from my Azure SQL Database to Azure Blob Storage, but I keep running into an error where the data never lands in the Blob in the expected format. I'm using ADF version 2 and have set up a pipeline that uses the Copy Data activity. The source dataset is configured to pull data from my Azure SQL Database, and the sink dataset points to a Blob Storage container. I've set the sink format to 'DelimitedText' for CSV output.

In my pipeline, I have the following settings:

- Source Query: `SELECT * FROM Employees WHERE Active = 1`
- Sink Format: DelimitedText with the following settings:
  - Column delimiter: Comma
  - Row delimiter: Newline
  - Escape character: Backslash

Running the pipeline fails with the following error: `Failed to copy data: The data type 'text' is not supported in the sink format 'DelimitedText'. (Error Code: 2200)`.

I've tried changing the data type of the `Notes` field in my SQL table from `text` to `nvarchar(max)`, but I still hit the same error. I also verified that my SQL Database allows ADF to access it, and I can run queries directly from the query editor without any problems.

Here is the JSON definition for my Copy Data activity:

```json
{
    "name": "CopyData",
    "type": "Copy",
    "inputs": [
        {
            "referenceName": "SourceAzureSQL",
            "type": "DatasetReference"
        }
    ],
    "outputs": [
        {
            "referenceName": "SinkBlob",
            "type": "DatasetReference"
        }
    ],
    "typeProperties": {
        "source": {
            "type": "SqlSource"
        },
        "sink": {
            "type": "BlobSink",
            "copyBehavior": "PreserveHierarchy"
        }
    }
}
```

Is there a specific data type compatibility issue between Azure Blob Storage and delimited text formats that I'm not aware of, or do I need to modify my dataset configurations further? Any guidance on how to troubleshoot this error would be greatly appreciated! Am I approaching this the right way?
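Edit: adding a few details that might help. The column change I applied was along these lines; a rough sketch, assuming `dbo` as the schema (my real schema may differ):

```sql
-- Convert the deprecated 'text' column to nvarchar(max); 'text' is the type
-- the Copy activity complained about. (Schema name 'dbo' is an assumption.)
ALTER TABLE dbo.Employees
ALTER COLUMN Notes NVARCHAR(MAX) NULL;
```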
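The sink dataset, trimmed to the relevant properties, looks roughly like this. Since JSON doesn't allow comments, to be clear: the linked service name `AzureBlobStorageLS`, the container name `exports`, and `firstRowAsHeader` are placeholders I've filled in for this post, not necessarily my exact values; the delimiter settings match what I listed above:

```json
{
    "name": "SinkBlob",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "exports"
            },
            "columnDelimiter": ",",
            "rowDelimiter": "\n",
            "escapeChar": "\\",
            "firstRowAsHeader": true
        }
    }
}
```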
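One thing I've considered but haven't tried yet is casting the legacy column in the source query itself, so nothing typed `text` ever reaches the sink. A sketch of what I mean (`EmployeeId` and `Name` are stand-ins for my actual columns):

```sql
-- List columns explicitly instead of SELECT *, casting the deprecated
-- 'text' column so the Copy activity only ever sees nvarchar(max).
-- (EmployeeId and Name are placeholder column names.)
SELECT EmployeeId,
       Name,
       CAST(Notes AS NVARCHAR(MAX)) AS Notes
FROM Employees
WHERE Active = 1;
```

Would casting in the source query be a reasonable approach here, or should the dataset/mapping configuration be handling this instead?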