#64 — BUG: Async warning on saving LLM chat

Repo: Twill-AI/facade · State: closed · Status: done · Assignee: Unassigned

Created: 2024-08-08 · Updated: 2025-09-15

Description

Found while working on #56. Offending code:

    asyncio.create_task(
        self.persist_history(
            tenant=tenant,
            llmchat_id=llmchat_id,
            raw_history=raw_history,
            llmchat_items=llmchat_items,
        )
    )

Related to https://github.com/encode/starlette/issues/919, which at least offers ideas for supporting such "abandoned coroutines".

Repro:

  • Start an LLM chat and wait for the answer to complete.
  • Check Facade’s logs.

Expected: no warnings or errors in the logs.

Actual: the following error is logged:

12:58:58.239 ERROR runners:118 - Task exception was never retrieved
future: <Task finished name='Task-490' coro=<<async_generator_athrow without __name__>()> exception=RuntimeError('async generator ignored GeneratorExit')>
RuntimeError: async generator ignored GeneratorExit
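The second half of the traceback is a separate symptom: `RuntimeError: async generator ignored GeneratorExit` is raised when an async generator is closed (e.g. the client disconnects mid-stream) but catches `GeneratorExit` and yields again instead of finishing. A minimal reproduction, independent of Facade's code:

```python
import asyncio

async def bad_gen():
    try:
        while True:
            yield 1
    except GeneratorExit:
        # Swallowing GeneratorExit and yielding again is illegal and
        # is what produces "async generator ignored GeneratorExit".
        yield 2

async def main() -> str:
    gen = bad_gen()
    await gen.__anext__()          # advance to the first yield
    try:
        await gen.aclose()         # throws GeneratorExit into the generator
    except RuntimeError as exc:
        return str(exc)
    return "no error"

print(asyncio.run(main()))  # prints "async generator ignored GeneratorExit"
```

So besides retrieving the task's exception, the generator driving the LLM response stream should let `GeneratorExit` propagate (or re-raise it after cleanup) rather than yielding past it.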
