CodexBloom - Programming Q&A Platform

Node.js application crashing due to high memory usage when processing large CSV files

πŸ‘€ Views: 30 πŸ’¬ Answers: 1 πŸ“… Created: 2025-06-03
node.js csv memory-usage javascript

I'm relatively new to Node.js, so bear with me. I'm running into a problem with an application that processes large CSV files (around 500MB) using the `csv-parser` library. The application crashes with an 'out of memory' error when handling files over a certain size. I've tried increasing the Node.js heap size with the `--max-old-space-size=4096` flag, but that hasn't resolved the issue. Here's a snippet of the code I'm using:

```javascript
const fs = require('fs');
const csv = require('csv-parser');

const results = [];

function processCSV(file) {
  fs.createReadStream(file)
    .pipe(csv())
    .on('data', (data) => results.push(data))
    .on('end', () => {
      console.log('CSV processing completed.');
      // Perform operations on results
    })
    .on('error', (err) => {
      console.error('Error while processing CSV:', err);
    });
}

processCSV('large-file.csv');
```

I thought using streams would help manage memory usage, but I still run into issues as the file size grows. I've also checked for memory leaks with `node --inspect`, and memory usage spikes significantly during the read operation. Is there a more efficient way to handle large files in Node.js, or specific patterns I should follow to avoid this memory problem? What am I doing wrong? The project is a CLI tool built with JavaScript.
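To make the question more concrete, here is a rough sketch of the row-at-a-time pattern I think I should be aiming for instead of pushing every row into `results`. This is not my actual code: `handleRow`, `row.status`, and the `'active'` value are hypothetical placeholders, and I'm not sure this handles asynchronous per-row work or backpressure correctly.

```javascript
const fs = require('fs');
const csv = require('csv-parser');

// Sketch: handle each row as it arrives instead of buffering the whole
// file into an array. `handleRow` is a hypothetical per-row callback.
function processCSV(file, handleRow) {
  return new Promise((resolve, reject) => {
    let rowCount = 0;
    fs.createReadStream(file)
      .pipe(csv())
      .on('data', (row) => {
        handleRow(row); // do the per-row work here; nothing is retained
        rowCount += 1;
      })
      .on('end', () => resolve(rowCount))
      .on('error', reject);
  });
}

// Example usage: count rows matching a condition without keeping them in memory.
let matches = 0;
processCSV('large-file.csv', (row) => {
  if (row.status === 'active') matches += 1; // hypothetical column name
})
  .then((total) => console.log(`Processed ${total} rows, ${matches} matched.`))
  .catch((err) => console.error('Error while processing CSV:', err));
```

Is this the right general direction, or is there a better pattern (e.g. batching rows, or a Transform stream) for a CLI tool like this?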