CodexBloom - Programming Q&A Platform

SQL Server: Handling Large Batch Inserts with Error Logging When Using Table-Valued Parameters

👀 Views: 35 đŸ’Ŧ Answers: 1 📅 Created: 2025-06-06
sql-server table-valued-parameters error-handling SQL

I'm working with SQL Server 2019 and trying to perform large batch inserts using table-valued parameters (TVPs). While testing with around 10,000 rows, I occasionally encounter this error: `The INSERT statement conflicted with the FOREIGN KEY constraint`. I want to log any errors that occur during the insert without stopping the entire batch, but I'm not sure how to implement that effectively.

Here's the stored procedure I've set up for inserting the data:

```sql
CREATE TYPE MyDataType AS TABLE
(
    ID INT PRIMARY KEY,
    Name NVARCHAR(100),
    RelatedID INT  -- the FK to RelatedTable(ID) is defined on MyTable; table types can't declare foreign keys
);
GO

CREATE PROCEDURE InsertMyData
    @Data MyDataType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRY
        INSERT INTO MyTable (ID, Name, RelatedID)
        SELECT ID, Name, RelatedID FROM @Data;
    END TRY
    BEGIN CATCH
        -- Log the error
        DECLARE @ErrorMessage NVARCHAR(4000),
                @ErrorSeverity INT,
                @ErrorState INT;

        SELECT @ErrorMessage = ERROR_MESSAGE(),
               @ErrorSeverity = ERROR_SEVERITY(),
               @ErrorState = ERROR_STATE();

        INSERT INTO ErrorLog (ErrorMessage, ErrorDate)
        VALUES (@ErrorMessage, GETDATE());

        THROW; -- Rethrow the error after logging
    END CATCH
END;
```

I call this stored procedure from C# using Dapper like so:

```csharp
var dataTable = new DataTable();
dataTable.Columns.Add("ID", typeof(int));
dataTable.Columns.Add("Name", typeof(string));
dataTable.Columns.Add("RelatedID", typeof(int));

// Populate dataTable with 10,000 rows of data...

using (var connection = new SqlConnection(connectionString))
{
    var parameters = new DynamicParameters();
    parameters.Add("@Data", dataTable.AsTableValuedParameter("MyDataType"));
    connection.Execute("InsertMyData", parameters, commandType: CommandType.StoredProcedure);
}
```

The problem is that when a foreign key constraint is violated, the entire insert aborts and the transaction rolls back. Instead, I want to continue inserting the valid rows while logging errors for the problematic ones.
I've tried reworking the `INSERT` into a `MERGE` statement to handle the errors, but that didn't work as expected. Is there a recommended approach for this scenario? How can I modify the stored procedure so it continues with the valid inserts and logs errors for the rest? Any advice or examples would be greatly appreciated! For reference, this is a production application.
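To illustrate the behavior I'm after, here's a rough sketch of the direction I've been experimenting with inside the procedure body: filter the TVP rows set-based before inserting, and log the ones that would violate the FK. The table and column names match my schema above, but the anti-join/logging pattern itself is just my guess at a pre-validation step, so treat it as a sketch rather than something I've verified:

```sql
-- Sketch of a possible InsertMyData body: insert only rows whose RelatedID
-- exists in RelatedTable, and log the offending rows instead of failing the batch.
INSERT INTO MyTable (ID, Name, RelatedID)
SELECT d.ID, d.Name, d.RelatedID
FROM @Data AS d
WHERE EXISTS (SELECT 1 FROM RelatedTable AS r WHERE r.ID = d.RelatedID);

INSERT INTO ErrorLog (ErrorMessage, ErrorDate)
SELECT CONCAT('FK violation for ID ', d.ID, ': RelatedID ', d.RelatedID, ' not found'),
       GETDATE()
FROM @Data AS d
WHERE NOT EXISTS (SELECT 1 FROM RelatedTable AS r WHERE r.ID = d.RelatedID);
```

I realize this could race if `RelatedTable` changes between the two statements (or between the check and the insert), which is part of why I'm asking whether there's a recommended pattern for this.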