#26 — Implement LLM chat flow on frontend

Repo: Twill-AI/twill-ai-ui · State: closed · Status: done · Assignee: Unassigned

Created: 2024-08-06 · Updated: 2025-03-11

Description

Implement the LLM chat flow so that the LLM can return feedback when queried.

AC:

  • Support the Facade API message types: ChatPromptText, ChatResponseStatus, ChatResponseText, ChatResponseEcharts.

  • LLM plots must not download detailed images, especially when the ‘table’ view is selected.
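The four Facade API message types above suggest a discriminated union on the frontend, with a renderer dispatching on the message kind. A minimal sketch, assuming field shapes (`text`, `status`, `option`) that are not specified in the ticket and are hypothetical:

```typescript
// Hypothetical field shapes; only the four type names come from the ticket.
type ChatMessage =
  | { kind: "ChatPromptText"; text: string }
  | { kind: "ChatResponseStatus"; status: "pending" | "done" | "error" }
  | { kind: "ChatResponseText"; text: string }
  | { kind: "ChatResponseEcharts"; option: object }; // ECharts option payload

// Exhaustive dispatch over all four kinds; the compiler flags any
// message type added to the union but not handled here.
function describeMessage(msg: ChatMessage): string {
  switch (msg.kind) {
    case "ChatPromptText":
      return `prompt: ${msg.text}`;
    case "ChatResponseStatus":
      return `status: ${msg.status}`;
    case "ChatResponseText":
      return `response: ${msg.text}`;
    case "ChatResponseEcharts":
      return "echarts payload"; // pass msg.option to an ECharts instance
  }
}
```

Keeping `ChatResponseEcharts` as an option payload (rather than a rendered image) also fits the second AC: the chart is drawn client-side, so no detailed images need to be downloaded.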

Notes
