#71 — [Bug]: Query page doesn’t handle LlmEngineError on first prompt.
Repo: Twill-AI/twill-ai-ui · State: closed · Status: done · Assignee: Unassigned
Created: 2024-10-21 · Updated: 2025-03-11
Description
Steps to reproduce
- Log in to staging as any user, open a new Query page, and send a prompt that forces LlmEngine to raise an error. As of 2024-10-21, https://github.com/Twill-AI/twill-llm-engine/issues/169 can be used to trigger one.
- Wait a while → see "First error" under Screenshots
- Refresh the page and wait again → see "Second error" under Screenshots
What was the expected behavior?
- Expected: an error message indicating that something failed on the back-end and the Twill thread won't answer, plus the ability to retry the prompt.
- Actual: the page gives no indication that an error happened; it just looks like it's been loading for a long time.
Screenshots
- First error
Everything is stuck: there is no indication of what happened and no way to retry the prompt.
The fact that an LlmEngineError happened can be seen in the GET /llmchats/1/items response.
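A minimal sketch of how the Query page could surface this instead of spinning forever: scan the items payload for an error item and switch the UI into an error-with-retry state. The item shape and the `LlmEngineError` type string below are assumptions for illustration; the real `/llmchats/{id}/items` response schema is not specified in this ticket.

```typescript
// Assumed (hypothetical) shape of one chat item from GET /llmchats/{id}/items.
interface ChatItem {
  type: string;       // e.g. "ChatResponseSummary" or "LlmEngineError" (assumed names)
  message?: string;   // optional human-readable detail
}

// Return the first engine-error item, if any, so the caller can render
// an error banner with a "Retry prompt" action instead of an endless spinner.
function findEngineError(items: ChatItem[]): ChatItem | undefined {
  return items.find((item) => item.type === "LlmEngineError");
}
```

With something like this in the items-polling path, both the first-load and the post-refresh cases would stop looking like a page that never finished loading.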
- Second error
Note that this time the front-end makes a GET https://staging.twillai.com/api/v1/llmchats/1/items?limit=1&types=ChatResponseSummary request and does not try to load the other LLM Chat item types (i.e. only types=ChatResponseSummary is requested), so the error item never reaches the client.
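The observed request can be reproduced with a small URL builder like the one below; omitting the `types` filter (or including the error type) would let the error item come back after a refresh. The endpoint path and query parameters are taken from the request in this ticket; the builder itself is a sketch, not the actual client code.

```typescript
// Sketch: build the items URL observed in this ticket. Dropping `types`
// entirely would fetch all item types, including any error items.
function buildItemsUrl(
  chatId: number,
  opts: { limit?: number; types?: string[] } = {},
): string {
  const url = new URL(`https://staging.twillai.com/api/v1/llmchats/${chatId}/items`);
  if (opts.limit !== undefined) url.searchParams.set("limit", String(opts.limit));
  if (opts.types && opts.types.length > 0) {
    url.searchParams.set("types", opts.types.join(","));
  }
  return url.toString();
}
```

For example, `buildItemsUrl(1, { limit: 1, types: ["ChatResponseSummary"] })` reproduces the exact URL from the screenshot, while `buildItemsUrl(1)` would request every item type.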
Notes
Add implementation notes, blockers, and context here
Related
Add wikilinks to related people, meetings, or other tickets