CodexBloom - Programming Q&A Platform

Node.js MySQL connection pool timing out when using Sequelize with large data sets

👀 Views: 5470 💬 Answers: 1 📅 Created: 2025-06-10
node.js sequelize mysql JavaScript

I'm running into a frustrating issue where my Node.js application, which uses Sequelize to talk to a MySQL database, times out when querying large data sets. I'm using Node.js 14.x (LTS) and Sequelize 6.6.2, running in a Docker container on CentOS.

My MySQL connection pool is configured as follows:

```javascript
const { Sequelize } = require('sequelize');

const sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql',
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000,
  },
});
```

When I execute a query that returns a large number of rows, such as:

```javascript
const { Op } = require('sequelize');

const users = await sequelize.models.User.findAll({
  where: { age: { [Op.gt]: 30 } },
});
```

I get `Error: connect ETIMEDOUT`. This happens consistently whenever the result set exceeds roughly 10,000 records.

I've already tried increasing the `acquire` time in the pool configuration, but that hasn't resolved the problem. I've also verified that the database server handles the same queries efficiently outside of Node.js. I suspect the connection pool is struggling with the large data transfer, or that Sequelize needs additional configuration for large result sets. Any guidance on optimizing Sequelize, or other strategies to avoid this timeout, would be appreciated.
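One workaround I've been considering is paging the query with `limit`/`offset` instead of pulling everything in one `findAll`. Here's a minimal sketch of what I have in mind (`findAllInBatches` is my own hypothetical helper, not a Sequelize API; the batch size and `id` ordering column are assumptions) — I'm not sure whether this is the idiomatic fix or just a band-aid:

```javascript
// Hypothetical helper: page through a findAll-style query in fixed-size
// batches so no single query transfers tens of thousands of rows at once.
async function findAllInBatches(model, query, batchSize = 1000) {
  const results = [];
  for (let offset = 0; ; offset += batchSize) {
    const batch = await model.findAll({
      ...query,
      limit: batchSize,
      offset,
      order: [['id', 'ASC']], // stable ordering so pages don't overlap
    });
    results.push(...batch);
    if (batch.length < batchSize) break; // short page => no more rows
  }
  return results;
}
```

Usage would look something like `const users = await findAllInBatches(sequelize.models.User, { where: { age: { [Op.gt]: 30 } } });`. Is this the right direction, or is there a better way to make Sequelize handle large result sets?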