CodexBloom - Programming Q&A Platform

SQL Server: Poor performance parsing large JSON data with OPENJSON in SQL Server 2019

👀 Views: 5 💬 Answers: 1 📅 Created: 2025-06-05
sql-server json performance sql

I'm refactoring a project and I'm stuck on a significant performance problem when parsing large JSON data with the `OPENJSON` function in SQL Server 2019. One column in my table contains a JSON string that can be quite large, sometimes exceeding 50KB. When I query the table, I use `OPENJSON` to extract specific values from that string, and performance degrades noticeably as the JSON grows. Here's a simplified version of what I'm doing:

```sql
SELECT t.Id, t.Name, j.value
FROM MyTable t
CROSS APPLY OPENJSON(t.JsonColumn)
    WITH (value NVARCHAR(100) '$.desiredField') j
WHERE t.Name = 'Example';
```

In tests with moderate-sized JSON strings (around 20KB), the query completes in under a second, but with larger strings I've seen execution times balloon to several seconds. I've tried indexing `JsonColumn`, but it didn't seem to help much (what I tried is sketched below). I'm also not sure whether the way I'm using `CROSS APPLY` is optimal for performance. I've considered breaking the JSON down into smaller parts, or normalizing it into separate rows (also sketched below), but that would require considerable changes to my current schema.

Is there any way to improve the performance of parsing large JSON using `OPENJSON`? Are there best practices I should follow to handle these larger JSON objects more effectively? Any insights would be greatly appreciated! For context, I'm running SQL Server 2019 on Debian.
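In case it helps, here's roughly the indexing approach I attempted. Since an `NVARCHAR(MAX)` column can't be an index key directly, I went through a persisted computed column over `JSON_VALUE` (the column and index names here are just placeholders):

```sql
-- Computed column extracting the one field I filter on, persisted so
-- it is materialized, plus an ordinary nonclustered index on top of it.
-- CAST keeps the key within a safe index key length.
ALTER TABLE MyTable
    ADD DesiredField AS CAST(JSON_VALUE(JsonColumn, '$.desiredField') AS NVARCHAR(100)) PERSISTED;

CREATE NONCLUSTERED INDEX IX_MyTable_DesiredField
    ON MyTable (DesiredField);
```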
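Related to the `CROSS APPLY` question: since I only pull a single scalar field, I also wondered whether plain `JSON_VALUE` would behave any differently than `OPENJSON` here, i.e. something like:

```sql
-- Same result as the OPENJSON version, but extracting the single
-- scalar field directly instead of going through CROSS APPLY.
SELECT t.Id, t.Name,
       JSON_VALUE(t.JsonColumn, '$.desiredField') AS DesiredField
FROM MyTable t
WHERE t.Name = 'Example';
```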
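And this is roughly the normalization I was weighing: extracting the fields once at write time into a child table instead of re-parsing the JSON on every read. The table and column names are hypothetical, and it assumes `Id` is the primary key of `MyTable`:

```sql
-- Hypothetical child table holding extracted values per parent row.
CREATE TABLE MyTableFields (
    Id           INT           NOT NULL REFERENCES MyTable (Id),
    DesiredField NVARCHAR(100) NULL
);

-- One-off backfill; new rows would be populated at insert/update time.
INSERT INTO MyTableFields (Id, DesiredField)
SELECT t.Id, j.DesiredField
FROM MyTable t
CROSS APPLY OPENJSON(t.JsonColumn)
    WITH (DesiredField NVARCHAR(100) '$.desiredField') j;
```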