SQL Server 2019: Query Performance Degradation After Index Creation on Large Table

I've encountered a significant performance regression in my SQL Server 2019 application after creating an index on a large table. The table, `Orders`, contains over 10 million rows, and the index was intended to optimize queries on the `CustomerId` column. However, after adding the index, a previously performant query now takes over 30 seconds to execute, compared to about 2 seconds before the index was created.

Here's the query I'm running:

```sql
SELECT *
FROM Orders
WHERE CustomerId = @CustomerId;
```

I used the following T-SQL to create the index:

```sql
CREATE INDEX IX_CustomerId ON Orders (CustomerId);
```

After creating the index, I ran `UPDATE STATISTICS Orders` and performed a database consistency check to ensure there were no issues, but performance has not improved. I also checked the execution plan and noticed that the query is using an index scan instead of an index seek, which seems counterintuitive. I've tried rebuilding the index and updating statistics multiple times, but that hasn't resolved the issue. I've also reviewed the fill factor setting, and it's at the default of 100%. Could this be part of the problem?

Are there any specific configurations or best practices I should be aware of when indexing large tables, particularly when queries don't seem to benefit from the index? I'm fairly new to SQL Server; I'm running it on Ubuntu 22.04. Has anyone dealt with something similar? Any insights or suggestions would be greatly appreciated!
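For reference, here is roughly how I've been capturing the I/O numbers and the plan while testing (a minimal sketch; the `@CustomerId` value `42` is just a placeholder I used for testing, not a meaningful ID):

```sql
-- Enable per-query I/O and timing output (shown in the Messages tab).
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Hypothetical test value; in the real app this arrives as a parameter.
DECLARE @CustomerId INT = 42;

SELECT *
FROM Orders
WHERE CustomerId = @CustomerId;

-- The plan itself was captured via "Include Actual Execution Plan"
-- in SSMS / Azure Data Studio, which is where I see the index scan
-- on IX_CustomerId rather than a seek.
```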
I'm trying to implement I've been researching this but I'm stuck on something that should probably be simple. I've encountered a important performance degradation in my SQL Server 2019 application after creating an index on a large table. The table, `Orders`, contains over 10 million rows, and the index was intended to optimize queries for the `CustomerId` column. However, after adding the index, my previously performant query is now taking over 30 seconds to execute compared to just 2 seconds before the index was created. Here's the query I'm running: ```sql SELECT * FROM Orders WHERE CustomerId = @CustomerId; ``` I used the following T-SQL to create the index: ```sql CREATE INDEX IX_CustomerId ON Orders (CustomerId); ``` After creating the index, I ran `UPDATE STATISTICS Orders` and performed a database consistency check to ensure there were no issues. However, the performance has not improved. I also checked the execution plan and noticed that the query is using the index scan instead of an index seek, which seems counterintuitive. I've tried rebuilding the index and updating statistics multiple times, but it hasn't resolved the scenario. Additionally, Iβve reviewed the fill factor setting, and itβs set to the default of 100%. Could this be part of the question? Are there any specific configurations or best practices I should be aware of when indexing large tables, particularly when the queries do not seem to benefit from the index? Any insights or suggestions would be greatly appreciated! My development environment is Ubuntu. How would you solve this? This is my first time working with Sql latest. Has anyone dealt with something similar? I'm on Ubuntu 22.04 using the latest version of Sql. Thanks for your help in advance! I'm coming from a different tech stack and learning Sql. Any help would be greatly appreciated!