
Django app using Celery timing out with Redis backend for long-running tasks

πŸ‘€ Views: 52 πŸ’¬ Answers: 1 πŸ“… Created: 2025-06-09
django celery redis python

I'm working on a Django application that uses Celery with Redis as the message broker. I've defined a long-running task that processes a large amount of data with Pandas and NumPy, but I keep hitting a timeout. Whenever the task runs for more than 300 seconds, it fails with `Task <TaskName> timed out`. I've tried increasing the `task_time_limit` and `task_soft_time_limit` settings in my Celery configuration, but the timeout still occurs.

Here's a simplified version of my task:

```python
from celery import shared_task
import pandas as pd
import numpy as np
import time


@shared_task(time_limit=600, soft_time_limit=550)
def long_running_task(data):
    df = pd.DataFrame(data)
    time.sleep(400)  # Simulate a long processing delay
    processed_data = df.apply(np.sqrt)  # Example processing
    return processed_data.values.tolist()  # .values needed: a DataFrame has no .tolist()
```

In my `celery.py`, the setup looks like this:

```python
from celery import Celery

app = Celery('myapp', broker='redis://localhost:6379/0')
app.conf.task_time_limit = 600
app.conf.task_soft_time_limit = 550
```

I also checked the Redis server configuration to make sure it can handle long-running tasks. The worker logs show the task being killed due to the timeout. Am I missing something in the configuration, or is there a different approach to handling long tasks effectively?
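Two things I've considered but haven't verified yet. First, catching the soft limit inside the task so it can clean up before the hard limit kills it. Here's a minimal sketch of what I mean, assuming `SoftTimeLimitExceeded` from `celery.exceptions` is the right hook (I haven't confirmed this addresses the underlying timeout):

```python
from celery import shared_task
from celery.exceptions import SoftTimeLimitExceeded
import pandas as pd
import numpy as np
import time


@shared_task(time_limit=600, soft_time_limit=550)
def long_running_task_with_cleanup(data):
    try:
        df = pd.DataFrame(data)
        time.sleep(400)  # Simulate a long processing delay
        return df.apply(np.sqrt).values.tolist()
    except SoftTimeLimitExceeded:
        # Soft limit reached: log or clean up here before the
        # hard time_limit terminates the task outright.
        return None
```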
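Second, I've seen the Redis `visibility_timeout` broker transport option mentioned in connection with long-running tasks being redelivered and killed. I'm not certain it applies to my setup, but this is roughly what I'd add to `celery.py` (the 7200 value is a guess, just meant to exceed my longest expected task):

```python
from celery import Celery

app = Celery('myapp', broker='redis://localhost:6379/0')
app.conf.task_time_limit = 600
app.conf.task_soft_time_limit = 550

# Guess: raise the Redis visibility timeout above the longest task duration
# so an unacknowledged long-running task isn't redelivered to another worker.
app.conf.broker_transport_options = {'visibility_timeout': 7200}  # seconds
```

Would either of these be the right direction, or is the 300-second cutoff coming from somewhere else entirely?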