#224 — BUG: prompt is visible in logs

Repo: Twill-AI/facade · State: closed · Status: done · Assignee: Unassigned

Created: 2024-12-03 · Updated: 2025-09-15

Description

Caused by https://twill-network.slack.com/archives/C07TPN6FCBX/p1733181323097299

Steps to reproduce:

  1. Start an LLM chat and, while the LLM is generating its answer, close the tab (exact trigger not confirmed).

Expected: the prompt text does not appear in the logs.

Actual behavior: the full prompt text appears in an ERROR log entry:

2024-12-02 23:11:36.928|ERROR|-no-tenant|ws_connection_manager:337|901aed68-659d-4f33-9bc8-ff89e51a74b6: Can't handle message with data='sid=6 error=None type=<PacketType.ChatPromptText: 'ChatPromptText'> data=ChatPromptText(text='show me a bar chart instead', origin_type=<LlmChatOriginTypes.Widget: 'widget'>, origin_id=8)' log
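The leak comes from the packet's auto-generated dataclass repr being interpolated into a generic "can't handle message" error log. A minimal sketch of one possible fix, assuming the packet is a Python dataclass like the ChatPromptText shown above (the field defaults here are hypothetical): mark free-text fields as repr=False so any generic log of the object omits them.

```python
from dataclasses import dataclass, field

@dataclass
class ChatPromptText:
    # repr=False excludes the prompt text from the auto-generated
    # __repr__, so a generic "Can't handle message with data=..." log
    # no longer leaks user prompts. Defaults below are illustrative.
    text: str = field(repr=False)
    origin_type: str = "widget"
    origin_id: int = 0

pkt = ChatPromptText(text="show me a bar chart instead", origin_id=8)
# repr(pkt) now shows origin fields but not the prompt text.
print(repr(pkt))
```

This keeps the metadata (origin_type, origin_id) available for debugging while redacting only the sensitive field, regardless of where the object gets logged.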
