#227 — BUG: Error in LLM Chat after all responses were sent in WS

Repo: Twill-AI/facade State: closed | Status: done Assignee: Unassigned

Created: 2024-12-03 · Updated: 2025-03-24

Description

Steps to reproduce:

  1. Go to the expanded widget page
  2. Ask for a modification
  3. The modification is applied, but the frontend also receives an error over the WebSocket that prevents it from saving the result correctly

Expected behavior: The backend should simply emit the final “DONE” message.

Actual behavior:

The logs point to a serialization error raised while persisting the `chat_history` JSON produced by LangGraph.

See https://portal.azure.com#@ca84ad29-936a-41a9-a523-35c3760c1302/blade/Microsoft_OperationsManagementSuite_Workspace/Logs.ReactView/resourceId/%2Fsubscriptions%2F9fada0e8-ad9d-4af4-ae3e-c50dadb7296c%2FresourceGroups%2Frg-staging-twill-eastus%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Fworkspacergstagingtwilleastusad9c/source/LogsBlade.AnalyticsShareLinkToQuery/q/H4sIAAAAAAAAA2WPzarCQAyF9z5F6No%252BwiykC0H8ARW3EttjjYwzZZJ75V58eKcqWjCLEHLOF06qGIwlIE26ropBo8c8trqv5qMbXU9IoGpgWfIFeyXnqKi5VONWQlvaVbwvj1xzgxKs9qMFcWgon8ru%252BnlAqVgdzqiN4pHsrwM1bDC5gEQpRKPZZrUkRRL28s8HjyKH6FLsoRHl2maz69sUOU%252Bmm%252FFj%252F0rmvqM%252B9R2SSgxujV%252Fph6H2Zj70UF5AlVu4xyt3WTu5ni4BAAA%253D/timespan/P1D/limit/1000

Here is an example trace of that type of error:

```
chunks = self.iterencode(o, _one_shot=True)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 258, in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 180, in default
    raise TypeError(f'
```
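This `TypeError` is what `json.dumps` raises when an object in the payload has no native JSON representation, which is consistent with `chat_history` containing LangGraph message objects rather than plain dicts. A minimal sketch of both the failure and one possible fix, using a hypothetical `ChatMessage` class as a stand-in for the real message type (names and shapes here are assumptions, not the actual facade code):

```python
import json
from dataclasses import dataclass, asdict, is_dataclass

# Hypothetical stand-in for a LangGraph chat-history entry; the real
# objects are similarly not JSON-serializable out of the box.
@dataclass
class ChatMessage:
    role: str
    content: str

def to_jsonable(obj):
    # Fallback invoked by json.dumps for objects its encoder can't handle.
    if is_dataclass(obj):
        return asdict(obj)
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

history = [ChatMessage("user", "resize the widget"),
           ChatMessage("assistant", "DONE")]

# Without a fallback this reproduces the traceback above:
try:
    json.dumps(history)
except TypeError as exc:
    print("unserializable:", exc)

# With the fallback the history persists cleanly:
payload = json.dumps(history, default=to_jsonable)
print(payload)
```

If the real entries are LangChain message objects, the fix would more likely use their own dict-conversion helpers before persisting, but the mechanism (convert to plain dicts before `json.dumps`) is the same.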

## Notes

_Add implementation notes, blockers, and context here_

## Related

_Add wikilinks to related people, meetings, or other tickets_