Azure Data Factory Copy Data Activity: 'Data Type Mismatch' error when copying from Azure SQL to Cosmos DB
I'm using Azure Data Factory to copy data from an Azure SQL Database to a Cosmos DB container, and I keep running into a 'Data Type Mismatch' error during the Copy Data Activity. My Azure SQL table has a column defined as `VARCHAR(50)`, while the target Cosmos DB container is configured to accept `String`.

Here's the relevant snippet from my Copy Data Activity configuration:

```json
{
    "source": {
        "type": "SqlSource",
        "sqlReaderQuery": "SELECT Id, Name FROM dbo.Users"
    },
    "sink": {
        "type": "CosmosDbSink",
        "writeBehavior": "Insert"
    }
}
```

I have verified that there are no null values in the `Name` column, and I also tried temporarily removing other columns to pinpoint the problem. The error message I receive is:

```
Data type mismatch: expected String but got DataType of column 'Name' from source.
```

I've also checked the mappings and they seem correct. I even set the mapping manually like this:

```json
{
    "mappings": [
        { "source": "Name", "sink": "name" },
        { "source": "Id", "sink": "id" }
    ]
}
```

I've tried updating the schema in Cosmos DB as well, but the error persists. Is there something I could be overlooking in my data types or mappings? Any help would be appreciated!
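One direction I'm considering, but haven't verified, is replacing the plain mapping with a full `TabularTranslator` that carries explicit type information and enables type conversion. The property names below are my best reading of the ADF docs rather than a tested config, and the `Int32` type for `Id` is only a guess at my column's interim type, so please treat this as a sketch:

```json
{
    "translator": {
        "type": "TabularTranslator",
        "mappings": [
            {
                "source": { "name": "Id", "type": "Int32" },
                "sink": { "path": "$.id" }
            },
            {
                "source": { "name": "Name", "type": "String" },
                "sink": { "path": "$.name" }
            }
        ],
        "typeConversion": true,
        "typeConversionSettings": {
            "allowDataTruncation": true,
            "treatBooleanAsNumber": false
        }
    }
}
```

The idea is to let the copy activity convert between the source and sink interim types instead of failing outright, with the `$.id` / `$.name` paths targeting the JSON properties in the Cosmos DB documents. Is something along these lines the right approach, or is the problem likely elsewhere (for example in the sink type or the dataset schemas)?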