Django Celery Task Not Retrieving Updated Model State After Database Commit
I've been struggling with this for a few days now and could really use some help. A Django Celery task is not seeing the updated state of a database model after I save it in the same request. I'm using Django 4.2 and Celery 5.2. The task is supposed to process data that depends on the model's state, but it seems to operate on the old state instead.

Here's the relevant part of my code:

```python
# views.py
from django.http import HttpResponse
from django.shortcuts import get_object_or_404

from .models import MyModel
from .tasks import my_celery_task


def my_view(request, pk):
    obj = get_object_or_404(MyModel, pk=pk)
    obj.some_field = 'new value'
    obj.save()  # This commits the new state to the database
    my_celery_task.delay(obj.pk)  # Called after the commit
    return HttpResponse('Task enqueued')
```

And my Celery task looks like this:

```python
# tasks.py
from celery import shared_task

from .models import MyModel


@shared_task
def my_celery_task(obj_id):
    obj = MyModel.objects.get(pk=obj_id)  # This fetches the object
    print(obj.some_field)  # This always prints the old value
```

In `my_celery_task`, even though I'm committing the change before calling the task, it still prints the old value of `some_field`. I tried adding a delay before the task call, thinking the task might run too quickly after the commit, but that didn't change anything (a sketch of that attempt is at the end of this post). I've also verified that the database connection is correctly closed after the save.

Is there something I'm missing about how Celery tasks interact with the Django ORM? Do I need to ensure the task uses a fresh database connection somehow (second sketch below)? Is that even possible? Any insights would be appreciated.

For reference, this is a production application running Python in a Docker container on Ubuntu 22.04. The issue appeared after updating to Python 3.9.
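For completeness, here's roughly what the delay attempt lookedked like. This is a sketch rather than my exact code, and the two-second wait is arbitrary:

```python
# views.py -- sketch of the delay attempt; the sleep duration is arbitrary
import time

from django.http import HttpResponse
from django.shortcuts import get_object_or_404

from .models import MyModel
from .tasks import my_celery_task


def my_view(request, pk):
    obj = get_object_or_404(MyModel, pk=pk)
    obj.some_field = 'new value'
    obj.save()
    time.sleep(2)  # Wait before enqueueing, in case the worker was simply too fast
    my_celery_task.delay(obj.pk)
    return HttpResponse('Task enqueued')
```

The task still printed the old value, so raw timing doesn't seem to be the problem.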
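And this is the kind of "fresh connection" handling I was imagining on the task side. It's only a guess at what such a fix would look like; `django.db.close_old_connections` is the hook I had in mind, but I don't know whether it's the right one:

```python
# tasks.py -- sketch of forcing a fresh connection; I'm not sure this is correct
from celery import shared_task
from django.db import close_old_connections

from .models import MyModel


@shared_task
def my_celery_task(obj_id):
    close_old_connections()  # Close any stale or expired connections first
    obj = MyModel.objects.get(pk=obj_id)  # Re-fetch on a (hopefully) fresh connection
    print(obj.some_field)
```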