AI-powered chat with streaming SSE response
POST
/chat/stream
Same intent classification as POST /chat, but streams the answer as Server-Sent Events. For stats and analysis intents, LLM tokens are streamed incrementally. For search/conversation/action intents, the answer is sent as a single token event. A final "done" event contains metadata (conversation_id, results, action, display_in).
Authorizations
BearerAuth
JWT token from login
Type
HTTP (bearer)
or
AccessToken
Personal access token. Value: Token <your-token>
Type
API Key (header: Authorization)
Request Body
application/json
JSON
{
  "query": "string",
  "conversation_id": "string",
  "project_id": "string"
}
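A request to this endpoint can be sketched as follows. This is a minimal illustration using only the Python standard library; the base URL and token are placeholders, not values from this documentation.

```python
import json
import urllib.request

# Hypothetical values -- substitute your deployment's base URL and credential.
BASE_URL = "https://api.example.com"
TOKEN = "your-jwt-here"  # or use "Token <your-token>" for a personal access token

def build_stream_request(query, conversation_id=None, project_id=None):
    """Build the POST /chat/stream request: JSON body, bearer auth, SSE accept header."""
    body = {"query": query}
    if conversation_id is not None:
        body["conversation_id"] = conversation_id
    if project_id is not None:
        body["project_id"] = project_id
    return urllib.request.Request(
        f"{BASE_URL}/chat/stream",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
            "Accept": "text/event-stream",
        },
        method="POST",
    )
```

Pass the request to `urllib.request.urlopen` and read the response line by line to consume the stream.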
Responses
SSE stream with events: token ({"text":"..."}), done ({"conversation_id":"...", "results":[], "action":null, "display_in":"panel|chat"}), error ({"error":"..."})
text/event-stream
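The token/done/error events above can be consumed with a small parser. The sketch below assumes the server uses standard SSE framing (`event:` and `data:` fields, a blank line terminating each event); multi-line `data:` payloads and comment lines are not handled.

```python
import json

def parse_sse(lines):
    """Yield (event_name, parsed_json) pairs from an iterable of SSE lines.

    Minimal sketch: tracks the current event name, buffers data fields,
    and emits the event when a blank separator line arrives.
    """
    event, data = "message", []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and data:
            yield event, json.loads("\n".join(data))
            event, data = "message", []

# Example input mirroring the event shapes documented above.
sample = [
    "event: token",
    'data: {"text": "Hello"}',
    "",
    "event: done",
    'data: {"conversation_id": "abc", "results": [], "action": null, "display_in": "chat"}',
    "",
]
events = list(parse_sse(sample))
```

For stats and analysis intents, expect many `token` events before `done`; for search, conversation, and action intents, a single `token` event carries the whole answer.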