CodexBloom - Programming Q&A Platform

Django + Celery: task timeout with long-running background jobs

👀 Views: 29 đŸ’Ŧ Answers: 1 📅 Created: 2025-06-16
django celery asynchronous Python

I'm running into a timeout issue in my Django application, where I use Celery for background tasks. A task takes longer than the configured time limit and is killed prematurely. I'm using Django 3.2 and Celery 5.1.

The task is defined as follows:

```python
from celery import shared_task

@shared_task(time_limit=300)
def long_running_task(param):
    # Simulating a long-running process
    result = 0
    for i in range(1, 10000000):
        result += i
    return result
```

When the task runs, the Celery worker logs show:

```
[ERROR/MainProcess] Task myapp.tasks.long_running_task[...] failed: TimeoutError()
```

I've tried increasing `time_limit` in the task decorator, but it seems to have no effect. I've also set `CELERY_TASK_TIME_LIMIT` in my Django settings:

```python
CELERY_TASK_TIME_LIMIT = 600  # 10 minutes
```

In my `celery.py`, the broker is configured as follows:

```python
from celery import Celery

app = Celery('myapp', broker='redis://localhost:6379/0')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
```

Despite these settings, the task still times out. I've confirmed that Redis is up and responding. Is there something I'm missing? Any advice on how to properly handle longer-running tasks in Celery would be greatly appreciated. This happens in both development and production on macOS. Am I approaching this the right way?