CodexBloom - Programming Q&A Platform

Azure Data Factory: incremental load fails with 'Invalid Column Count' error

👀 Views: 222 💬 Answers: 1 📅 Created: 2025-06-08
azure-data-factory sql-server incremental-load json

I'm working on a project and hit a roadblock; I'm relatively new to this, so bear with me. I have an Azure Data Factory pipeline that performs an incremental load from an on-premises SQL Server database to an Azure SQL Database. The pipeline uses a copy activity with a source dataset pointing to a SQL Server table and a sink dataset for the Azure SQL Database. However, I'm hitting an 'Invalid Column Count' error during execution. The error message is:

```
The number of columns in the source does not match the number of columns in the destination.
```

I've verified the schemas of both the source and destination tables, and they appear to match in column names and data types. I am using the following settings in my copy activity:

```json
{
  "source": {
    "type": "SqlSource",
    "query": "SELECT * FROM dbo.MyTable WHERE LastModified > @LastRunDate"
  },
  "sink": {
    "type": "SqlSink",
    "tableName": "dbo.MyTable"
  },
  "enableStaging": false
}
```

I also set up a parameter `@LastRunDate` to control the incremental load, but I suspect the error might be caused by how the column mapping is handled. I've tried explicitly defining column mappings in the copy activity, but the error persists.

Additionally, when running the pipeline in debug mode, I notice that it retrieves an empty dataset from SQL Server, which leads me to believe the mapping might not be applied correctly. I double-checked the parameter values being passed in, and they seem to be correct.

Has anyone encountered this before, or can someone point me in the right direction to resolve this 'Invalid Column Count' error? Any tips on configuring the copy activity for incremental loads would be greatly appreciated!
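For reference, the explicit column mapping I tried uses ADF's `TabularTranslator` in the copy activity. The column names below are just illustrative placeholders for my actual schema:

```json
{
  "translator": {
    "type": "TabularTranslator",
    "mappings": [
      { "source": { "name": "Id" },           "sink": { "name": "Id" } },
      { "source": { "name": "Name" },         "sink": { "name": "Name" } },
      { "source": { "name": "LastModified" }, "sink": { "name": "LastModified" } }
    ]
  }
}
```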
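While debugging the empty result set, I also experimented with interpolating the pipeline parameter into the query using ADF's expression syntax instead of a bare `@LastRunDate` token. Simplified sketch (the actual parameter name and datatype quoting in my pipeline may differ):

```json
{
  "source": {
    "type": "SqlSource",
    "query": "SELECT * FROM dbo.MyTable WHERE LastModified > '@{pipeline().parameters.LastRunDate}'"
  }
}
```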