CodexBloom - Programming Q&A Platform

PostgreSQL: How to efficiently update rows with a JSONB field using a subquery?

πŸ‘€ Views: 78 πŸ’¬ Answers: 1 πŸ“… Created: 2025-06-12
postgresql jsonb performance sql

I'm working on a personal project and trying to optimize an update against a PostgreSQL table that has a JSONB column. My table structure is as follows:

```sql
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    profile JSONB,
    last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```

I need to update the `profile` field based on some conditions and also set `last_updated` to the current timestamp (the column default only applies on insert, so the UPDATE has to set it explicitly). Here's what I tried:

```sql
UPDATE users
SET profile = jsonb_set(profile, '{age}', '30'),
    last_updated = CURRENT_TIMESTAMP
WHERE profile->>'city' = 'New York';
```

While this works, it's quite slow when dealing with thousands of rows, and it locks the affected rows for a noticeable amount of time. I also experimented with a subquery to limit the number of rows being updated:

```sql
UPDATE users
SET profile = jsonb_set(profile, '{age}', '30'),
    last_updated = CURRENT_TIMESTAMP
WHERE id IN (
    SELECT id
    FROM users
    WHERE profile->>'city' = 'New York'
);
```

However, this approach still doesn't seem any more efficient, and I often get `ERROR: deadlock detected` when multiple transactions try to run simultaneously.

Is there a better way to handle this update operation, especially for large datasets? Any best practices or optimizations would be greatly appreciated! For context: my development environment is Linux.
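One idea I've sketched but not yet benchmarked is to process the update in small batches so each transaction stays short, taking row locks in a consistent order and skipping rows another transaction already holds. The batch size of 1000 is an arbitrary guess on my part:

```sql
-- Batched variant (sketch): lock up to 1000 matching rows in id order,
-- skipping rows already locked by a concurrent transaction, then update
-- just that batch. Repeated from application code until it affects 0 rows.
WITH batch AS (
    SELECT id
    FROM users
    WHERE profile->>'city' = 'New York'
    ORDER BY id              -- consistent lock order to reduce deadlocks
    LIMIT 1000               -- batch size: arbitrary assumption
    FOR UPDATE SKIP LOCKED   -- don't block on rows locked elsewhere
)
UPDATE users
SET profile = jsonb_set(profile, '{age}', '30'),
    last_updated = CURRENT_TIMESTAMP
FROM batch
WHERE users.id = batch.id;
```

I also wonder whether an expression index such as `CREATE INDEX ON users ((profile->>'city'));` would speed up the row selection, since the filter always goes through that key. Would this batching pattern plus the index be the right direction?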