GCP Pub/Sub message delivery latency spikes with batch processing in Node.js Cloud Functions
I'm performance testing and I've been banging my head against this for hours. I'm seeing significant latency spikes when processing messages from a Pub/Sub topic in my Node.js Cloud Function. The function handles batch processing of messages, but under certain conditions the processing time increases dramatically, sometimes exceeding 30 seconds per batch. This is particularly evident when the batch size exceeds 10 messages.

Here's a snippet of how I'm currently processing the messages:

```javascript
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub();

exports.processMessages = async (message, context) => {
  // The published payload is a base64-encoded JSON array of messages.
  const messages = JSON.parse(Buffer.from(message.data, 'base64').toString());
  const batchPromises = messages.map(async (msg) => {
    // Kick off processing for every message in the batch concurrently.
    await processMessage(msg);
  });
  await Promise.all(batchPromises);
};

async function processMessage(msg) {
  // Simulating some asynchronous processing (~2 s per message)
  return new Promise((resolve) => setTimeout(resolve, 2000));
}
```

I'm using Node.js 14, and the function is configured to trigger on a Pub/Sub topic. It's deployed with a maximum of 400 MB of memory and a 60-second timeout. My understanding is that Pub/Sub should handle concurrent execution, but I'm not sure why I'm seeing such latencies when I increase the batch size. I've also tried adjusting the max instances setting in the Cloud Function configuration, but that doesn't seem to alleviate the issue.

Additionally, I monitored the Pub/Sub metrics and noticed occasional spikes in message backlog during these latency periods.

Could this be a configuration issue, or is there something specific in how I'm handling batch processing that might be causing these delays? Any insights or recommendations would be greatly appreciated. Thanks for taking the time to read this!
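Edit: for comparison, this is roughly what a concurrency-capped variant of my processing loop would look like, processing the batch in fixed-size chunks instead of firing everything at once. The `CHUNK_SIZE` value here is arbitrary and `processInChunks` is just a name I made up; I haven't confirmed yet whether capping concurrency actually helps with the latency spikes:

```javascript
// Hypothetical variant: process the batch in fixed-size chunks so that at
// most CHUNK_SIZE messages are in flight at the same time.
const CHUNK_SIZE = 5; // arbitrary value for illustration

async function processInChunks(messages, processOne) {
  const results = [];
  for (let i = 0; i < messages.length; i += CHUNK_SIZE) {
    const chunk = messages.slice(i, i + CHUNK_SIZE);
    // Wait for the current chunk to finish before starting the next one.
    results.push(...await Promise.all(chunk.map(processOne)));
  }
  return results;
}
```

The idea is that results come back in the original batch order, but the downstream work (and any quota it consumes) is bounded per chunk rather than per batch.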