How to configure async database connection pooling in FastAPI with SQLAlchemy 1.4
I've been banging my head against this for hours, so I'm hoping someone has experience with SQLAlchemy's async capabilities in FastAPI. I'm using `AsyncSession` to manage my database connections, but I keep hitting this error:

```
sqlalchemy.exc.OperationalError: (asyncpg) could not connect to server: Connection refused
```

It happens intermittently, especially under heavy load. Here's a simplified version of my code:

```python
from fastapi import FastAPI, Depends
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker

from myapp.models import Item  # my ORM model (definition omitted)

DATABASE_URL = "postgresql+asyncpg://user:password@localhost/dbname"

engine = create_async_engine(DATABASE_URL, echo=True)

# SQLAlchemy 1.4 style: async_sessionmaker only exists in 2.0+
SessionLocal = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

app = FastAPI()


async def get_db() -> AsyncSession:
    async with SessionLocal() as session:
        yield session


@app.get("/items/{item_id}")
async def read_item(item_id: int, db: AsyncSession = Depends(get_db)):
    result = await db.execute(select(Item).filter(Item.id == item_id))
    return result.scalars().first()
```

I've verified that my PostgreSQL server is running and accessible. The application works fine most of the time, but under load it fails to establish new connections. I've also tried adjusting the connection pool settings, but they don't seem to have any effect.

Am I missing something crucial in my configuration? For context, this is a production service running Python in a Docker container on CentOS. Any help would be greatly appreciated.
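For completeness, here is roughly how I've been tweaking the pool settings on the engine. The specific numbers are just values I tried, not recommendations, and I added `pool_pre_ping` after reading about stale connections in the pooling docs:

```python
from sqlalchemy.ext.asyncio import create_async_engine

DATABASE_URL = "postgresql+asyncpg://user:password@localhost/dbname"

engine = create_async_engine(
    DATABASE_URL,
    echo=True,
    pool_size=10,        # base number of pooled connections (default is 5)
    max_overflow=20,     # extra connections allowed beyond pool_size under load
    pool_timeout=30,     # seconds to wait for a free connection before raising
    pool_recycle=1800,   # recycle connections older than 30 minutes
    pool_pre_ping=True,  # test each connection before handing it out
)
```

Even with these settings, the `Connection refused` errors still show up under load, which makes me suspect the problem is on the server or container-networking side rather than in the pool itself.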