#106 — Prepare and provide LLM chat for dashboard
Repo: Twill-AI/facade State: closed | Status: done Assignee: Unassigned
Created: 2024-09-13 · Updated: 2025-09-15
Description
GET /dashboards/{id} doesn’t return a “serving LLM chat” ID. In addition, this chat should contain some information about the dashboard from the start, and it must not be recreated on repeated API calls.
AC:
- GET /dashboards/{id} creates an LLM chat for the dashboard if one doesn’t exist, and populates it with LlmEngine-generated information (in a background task, i.e. after the response is returned).
- POST /dashboards, GET /dashboards/{dashboard_id}, and PATCH /dashboards/{dashboard_id} provide a serving_llmchat_id field.
- Design and file a task with the same purpose for widgets; it would probably be a separate GET /dashboards/{id}/widget/{id} API endpoint. Update: included into scope here; a separate API for widgets is not required.
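The get-or-create behavior in the acceptance criteria could be sketched roughly as follows. This is an illustrative, framework-free Python sketch, not the actual facade code: `Dashboard`, `create_empty_chat`, `populate_chat`, and the background-task list are all hypothetical stand-ins (in a FastAPI-style service the deferred call would go through something like `BackgroundTasks`).

```python
# Hypothetical sketch of GET /dashboards/{id} chat provisioning.
# All names here are illustrative; they are not the facade codebase API.
import uuid
from dataclasses import dataclass
from typing import Callable, Optional

CHATS: dict[str, list[str]] = {}  # chat_id -> messages (in-memory stand-in)

@dataclass
class Dashboard:
    id: str
    serving_llmchat_id: Optional[str] = None

def create_empty_chat() -> str:
    """Create an empty LLM chat and return its id."""
    chat_id = str(uuid.uuid4())
    CHATS[chat_id] = []
    return chat_id

def populate_chat(chat_id: str, dashboard: Dashboard) -> None:
    """Stand-in for the LlmEngine-generated intro text (see issue #111).
    Idempotent: a chat that already has content is left untouched."""
    if not CHATS[chat_id]:
        CHATS[chat_id].append(f"Intro for dashboard {dashboard.id}")

def get_dashboard(dashboard: Dashboard,
                  background_tasks: list[Callable[[], None]]) -> dict:
    """GET /dashboards/{id}: create the serving chat only if it is missing,
    and schedule population to run after the response is returned."""
    if dashboard.serving_llmchat_id is None:
        chat_id = create_empty_chat()
        dashboard.serving_llmchat_id = chat_id
        background_tasks.append(lambda: populate_chat(chat_id, dashboard))
    return {"id": dashboard.id,
            "serving_llmchat_id": dashboard.serving_llmchat_id}
```

A repeated call returns the same `serving_llmchat_id` and schedules no second population task, which is the "don’t recreate on multiple calls" requirement.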
Implementation details
New LLM chats are created during dashboard creation/patching to serve the dashboard and its inner widgets. The chats start empty; they will be populated with the “first/explanation” text in scope of https://github.com/Twill-AI/facade/issues/111
The POST /dashboards, GET /dashboards/{dashboard_id}, and PATCH /dashboards/{dashboard_id} APIs all return a serving_llmchat_id field for the dashboard and its inner widgets in the response.
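For reference, a response carrying these fields might look like the following. The exact payload shape and field names other than serving_llmchat_id are assumptions for illustration, not the actual facade schema:

```json
{
  "id": "dashboard-uuid",
  "serving_llmchat_id": "dashboard-chat-uuid",
  "widgets": [
    {
      "id": "widget-uuid",
      "serving_llmchat_id": "widget-chat-uuid"
    }
  ]
}
```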
Notes
Add implementation notes, blockers, and context here
Related
Add wikilinks to related people, meetings, or other tickets