
FastAPI streaming response fails with 'ValueError: too many values to unpack' when using an async generator

👀 Views: 33 💬 Answers: 1 📅 Created: 2025-06-08
fastapi sqlalchemy asyncio Python

I need some guidance. I'm trying to implement a streaming response in my FastAPI application that reads data from a database and returns it in chunks. I have an async generator that fetches rows from an SQLAlchemy session, but when I try to return the response, I get a `ValueError: too many values to unpack (expected 2)`. Here's the relevant part of my code:

```python
from fastapi import FastAPI, Response
from sqlalchemy import Column, Integer, String, select
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine, async_sessionmaker
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    value = Column(Integer)

# Database setup
DATABASE_URL = "postgresql+asyncpg://user:password@localhost/db"
engine = create_async_engine(DATABASE_URL, echo=True)
SessionLocal = async_sessionmaker(bind=engine, expire_on_commit=False)

app = FastAPI()

async def get_items():
    async with SessionLocal() as session:
        result = await session.execute(select(Item))
        async for item in result.scalars():
            yield f'{item.name}: {item.value}\n'

@app.get('/items/stream')
async def stream_items():
    return Response(get_items(), media_type='text/plain')
```

I have tried changing the way I yield from the generator and using `result.all()` instead of `scalars()`, but that adds complexity and doesn't resolve the error. I also experimented with returning a list instead of a streaming response, which works, but I really need the streaming behaviour for performance reasons. The error seems to occur when FastAPI tries to handle the async generator output. How can I properly implement this streaming response without running into unpacking errors? This happens in both development and production on CentOS.
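For reference, the non-streaming variant that does work looks roughly like this (the `/items/list` path is just a placeholder of mine; it reuses the model and session setup from above):

```python
# Works, but materialises every row in memory before responding,
# which is exactly the overhead I'm trying to avoid by streaming.
@app.get('/items/list')
async def list_items():
    async with SessionLocal() as session:
        result = await session.execute(select(Item))
        return [f'{item.name}: {item.value}' for item in result.scalars()]
```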