CodexBloom - Programming Q&A Platform

Laravel 10: queued jobs timing out when handling large data sets

πŸ‘€ Views: 12 πŸ’¬ Answers: 1 πŸ“… Created: 2025-06-12
laravel queues jobs performance PHP

I'm converting an old project and I'm sure I'm missing something obvious here, but I'm running into a problem with queued jobs in Laravel 10 when processing large data sets. I have a job that imports CSV data, which can sometimes exceed 10,000 rows. When I run the job, it times out and fails with the error message `The job was released back to the queue because the process was killed.` I've already increased the timeout setting in the queue configuration to 300 seconds, but it still fails.

Here's a snippet of my job class:

```php
<?php

namespace App\Jobs;

use App\Models\User;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ImportCsvJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle(): void
    {
        $csvData = $this->getCsvData(); // Reads the entire CSV into memory

        foreach ($csvData as $row) {
            User::create($row); // One INSERT (plus model events) per row
        }
    }
}
```

I've tried splitting the CSV into smaller chunks and dispatching multiple jobs, but that doesn't seem efficient: it increases the overall processing time. I've also looked at Eloquent's `chunk` method, but that seems to apply to fetch operations rather than the rows I'm inserting.

Is there a way to optimize this job or handle larger data sets without hitting the timeout? Would increasing the `queue:work` process memory limit help? Any suggestions would be appreciated! My development environment is Linux, and for reference this is a production service. Sketches of what I've tried so far are below.
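First, the timeout. I set a `$timeout` property on the job and start the worker with `php artisan queue:work --timeout=300`. My understanding is that the connection's `retry_after` value in `config/queue.php` also has to be larger than the timeout, or the job gets released while it's still running, but I may have that wrong:

```php
class ImportCsvJob implements ShouldQueue
{
    // ... traits and handle() as above ...

    // Seconds the job may run before the worker kills it.
    // This should stay below retry_after in config/queue.php,
    // otherwise the job is released back to the queue mid-run.
    public int $timeout = 300;
}
```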
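Second, the chunk-and-dispatch attempt, roughly as I wrote it (simplified; `ImportCsvChunkJob` and the 500-row chunk size are stand-ins for my actual code):

```php
use App\Jobs\ImportCsvChunkJob; // hypothetical per-chunk job

$csvData = $this->getCsvData();

// Dispatch one job per 500 rows instead of one giant job.
collect($csvData)
    ->chunk(500)
    ->each(fn ($rows) => ImportCsvChunkJob::dispatch($rows->values()->all()));
```

Each chunk finishes well within the timeout, but the total wall-clock time goes up, which is why I'm not happy with this approach.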
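Finally, on the Eloquent `chunk` point: since that only chunks reads, I've been wondering whether batch inserts are the right fix for the write side, something like the sketch below. I haven't committed to it because `insert()` goes through the query builder, so it skips model events and doesn't fill timestamps automatically:

```php
use App\Models\User;
use Illuminate\Support\Carbon;

$now = Carbon::now();

// One INSERT statement per 1,000 rows instead of one per row.
// insert() bypasses Eloquent model events, so timestamps have
// to be added by hand here.
foreach (array_chunk($csvData, 1000) as $batch) {
    $rows = array_map(
        fn (array $row) => $row + ['created_at' => $now, 'updated_at' => $now],
        $batch
    );

    User::insert($rows);
}
```

Is that the right direction, or is there a better pattern for large imports in Laravel 10?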