#227 — BUG: Error in LLM Chat after all responses were sent in WS
Repo: Twill-AI/facade State: closed | Status: done Assignee: Unassigned
Created: 2024-12-03 · Updated: 2025-03-24
Description
Steps to reproduce:
- Go to the expanded widget page
- Ask for a modification
- The modification is applied, but the frontend also receives an error that prevents the result from being saved correctly
Expected behavior: I expected the backend to just produce the “DONE” message
Actual behavior:
Looking through the logs, this appears to be a JSON serialization error raised while persisting the langgraph chat_history.
Here is an example trace of that type of error:
```
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 258, in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 180, in default
    raise TypeError(f'
```
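This is the failure mode `json.dumps` hits when the chat history contains objects (e.g. langgraph/LangChain message instances) that the default encoder cannot serialize. A minimal sketch of the error and one possible fallback, using a hypothetical `AIMessage` dataclass as a stand-in for the real message type (the `to_jsonable` helper is an illustrative assumption, not code from this repo):

```python
import json
from dataclasses import dataclass, asdict, is_dataclass

# Hypothetical stand-in for a langgraph message object,
# which is not JSON-serializable out of the box.
@dataclass
class AIMessage:
    content: str
    type: str = "ai"

def to_jsonable(obj):
    # Fallback encoder passed via json.dumps(default=...):
    # convert dataclass-like objects to plain dicts, otherwise
    # fall back to their string representation.
    if is_dataclass(obj):
        return asdict(obj)
    return str(obj)

history = [AIMessage(content="DONE")]

# json.dumps(history) alone raises:
#   TypeError: Object of type AIMessage is not JSON serializable
# which matches the trace above (json/encoder.py, default()).
payload = json.dumps(history, default=to_jsonable)
print(payload)  # → [{"content": "DONE", "type": "ai"}]
```

Whether the real fix should convert messages to dicts before persisting, or install a custom encoder at the persistence layer, depends on how the chat_history is stored.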
## Notes
_Add implementation notes, blockers, and context here_
## Related
_Add wikilinks to related people, meetings, or other tickets_