CodexBloom - Programming Q&A Platform

Improving SQL Query Performance for Large Dataset Pagination in a Node.js API

👀 Views: 183 💬 Answers: 1 📅 Created: 2025-09-06
Node.js PostgreSQL performance SQL

I've been tasked with optimizing a Node.js REST API that serves paginated results from a large PostgreSQL database. The current implementation fetches all records for a given page, leading to performance issues as the dataset scales. Here's a simplified version of the query I'm using:

```sql
SELECT * FROM users ORDER BY created_at DESC LIMIT $1 OFFSET $2;
```

The `LIMIT`/`OFFSET` approach works for smaller datasets, but as the number of records grows, I'm noticing slow response times. I've tried adding an index on `created_at`, but it hasn't made a significant difference.

To improve performance, I explored keyset pagination instead. This would involve modifying the endpoint to accept a cursor rather than page numbers. Here's what I've come up with:

```sql
SELECT * FROM users WHERE created_at < $1 ORDER BY created_at DESC LIMIT $2;
```

The cursor would be the `created_at` value of the last item from the previous page. This approach should reduce the amount of data scanned by PostgreSQL, but I'm concerned about handling edge cases, especially when records are inserted or deleted between requests.

I've also looked into caching strategies using Redis to store the results of frequently accessed queries. However, implementing this correctly without serving stale data has proven challenging, especially with a rapidly changing user base. I know a cache may help, but I'm unsure how best to invalidate or update it.

What are the best practices for implementing efficient pagination in this context? Any insights into the cursor method or caching strategies would be greatly appreciated. Is there a better approach? I've been using SQL for about a year now. Thanks for your help in advance!
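For context, here is a minimal sketch of the cursor handling I'm imagining on the Node.js side. It assumes rows carry a unique `id` alongside `created_at` (as a tiebreaker, since two rows can share a timestamp); the helper names `encodeCursor`/`decodeCursor` are just illustrative:

```javascript
// Opaque cursor helpers for keyset pagination.
// Assumption: each row has a unique `id` in addition to `created_at`,
// so ties on the timestamp can't cause skipped or duplicated rows.

function encodeCursor(row) {
  // Pack the sort key plus the unique tiebreaker into an opaque token.
  const payload = JSON.stringify({ createdAt: row.created_at, id: row.id });
  return Buffer.from(payload).toString('base64url');
}

function decodeCursor(cursor) {
  const payload = Buffer.from(cursor, 'base64url').toString('utf8');
  return JSON.parse(payload);
}

// The matching SQL would compare the composite key row-wise, e.g.:
//   SELECT * FROM users
//   WHERE (created_at, id) < ($1, $2)
//   ORDER BY created_at DESC, id DESC
//   LIMIT $3;

const lastRow = { created_at: '2025-09-06T12:00:00Z', id: 42 };
const cursor = encodeCursor(lastRow);
const decoded = decodeCursor(cursor);
console.log(decoded.id); // 42
```

My understanding is that the composite `(created_at, id)` comparison is what makes concurrent inserts/deletes harmless for forward paging, but I'd appreciate confirmation.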
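On the caching side, the simplest pattern I've considered is cache-aside with a short TTL, bounding staleness instead of invalidating on every write. A rough sketch (an in-memory `Map` stands in for Redis here; with Redis the equivalent would be `GET` plus `SET key value EX ttl`, and `cachedQuery` is a hypothetical helper, not an existing API):

```javascript
// Cache-aside with TTL: return a fresh cached value if present,
// otherwise run the query and cache the result for ttlMs.
const cache = new Map();

async function cachedQuery(key, ttlMs, fetchFn) {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value; // still fresh, skip the database
  }
  const value = await fetchFn(); // e.g. run the paginated SQL query
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}

// Usage: the first call hits the "database", the second is served
// from cache within the TTL window.
(async () => {
  let dbCalls = 0;
  const fetchPage = async () => { dbCalls++; return ['alice', 'bob']; };
  await cachedQuery('users:first-page', 5000, fetchPage);
  await cachedQuery('users:first-page', 5000, fetchPage);
  console.log(dbCalls); // 1
})();
```

Is a short TTL like this considered acceptable for paginated listings, or do people generally do explicit key invalidation on writes?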