#71 — [Bug]: Query page doesn’t handle LlmEngineError on first prompt.

Repo: Twill-AI/twill-ai-ui · State: closed · Status: done · Assignee: Unassigned

Created: 2024-10-21 · Updated: 2025-03-11

Description

Steps to reproduce

  1. In staging, log in as any user, open a new Query page, and send a prompt that forces LlmEngine to raise an error. As of 2024-10-21, https://github.com/Twill-AI/twill-llm-engine/issues/169 can be used for this.
  2. Wait some time; see Expected behavior 1.
  3. Refresh the page, wait again; see Expected behavior 2.

What was the expected behavior?

  1. The page should show an error indicating that something failed on the back-end and the Twill thread will not answer, and should offer a way to retry the prompt.
  2. After refreshing, the page should again indicate that the error happened; currently it gives no indication and just looks as if it has not loaded for a long time.

Screenshots

  1. First error (screenshot)

Everything is stuck: there is no indication of what happened and no way to retry the prompt.

[screenshot]

The fact that an LlmEngineError happened can be seen in the GET /llmchats/1/items response.
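For illustration, a minimal client-side check of the items response might look like the sketch below. The field names (`type`, `error`, `code`) are assumptions for this example, not the actual /llmchats/{id}/items schema:

```typescript
// Hypothetical shape of one LLM chat item; the real API schema may differ.
interface LlmChatItem {
  type: string; // e.g. "ChatResponseSummary"
  error?: { code: string; message: string };
}

// Returns true if any returned item signals an LlmEngineError.
function hasEngineError(items: LlmChatItem[]): boolean {
  return items.some(
    (item) => item.error !== undefined && item.error.code === "LlmEngineError"
  );
}
```

The UI could run such a check against the fetched items and switch out of the loading state when it returns true.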

  2. Second error (screenshot)

[screenshot]

Note that this time the front-end makes a GET https://staging.twillai.com/api/v1/llmchats/1/items?limit=1&types=ChatResponseSummary request and does not try to load the other LLM Chat items (i.e., only types=ChatResponseSummary is requested).

Notes

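One possible direction for the fix, sketched with assumed names (not the actual twill-ai-ui code): derive the thread's view state from the fetched chat items so that an engine error maps to an explicit error state with a retry option, instead of the page staying in "loading" indefinitely.

```typescript
// Sketch only: the types and field names below are assumptions, not the real
// twill-ai-ui models or API schema.
type ThreadState =
  | { kind: "loading" }
  | { kind: "ready"; summary: string }
  | { kind: "error"; message: string; canRetry: true };

interface LlmChatItem {
  type: string; // e.g. "ChatResponseSummary"
  text?: string;
  error?: { code: string; message: string };
}

// Map fetched items (null = not yet fetched) to an explicit view state.
function deriveThreadState(items: LlmChatItem[] | null): ThreadState {
  if (items === null) return { kind: "loading" };
  const failed = items.find((i) => i.error?.code === "LlmEngineError");
  if (failed) {
    // Surface the back-end failure and let the user retry the prompt.
    return { kind: "error", message: failed.error!.message, canRetry: true };
  }
  const summary = items.find((i) => i.type === "ChatResponseSummary");
  if (summary && summary.text !== undefined) {
    return { kind: "ready", summary: summary.text };
  }
  return { kind: "loading" };
}
```

With this shape, both the first-prompt case and the after-refresh case collapse to the same rendering decision: `kind === "error"` shows the error banner plus a retry button.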