

Set stream: true to receive incremental tokens as Server-Sent Events (SSE) instead of one final response. Time-to-first-token drops to roughly a single network round-trip.

OpenAI-compatible (Chat / Responses)

curl -N https://api.clearmaas.com/v1/chat/completions \
  -H "Authorization: Bearer sk-clearmaas-..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [{"role":"user","content":"Tell me a haiku."}],
    "stream": true
  }'
Each line is data: {...}, and the stream ends with data: [DONE]. To receive the final usage object inside the stream, pass stream_options: { include_usage: true }; the chunk just before [DONE] includes token counts.
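The chunk format above can be consumed with a small client-side parser. A minimal Python sketch follows; the sample chunks are illustrative, abbreviated payloads shaped like Chat Completions stream events, not verbatim API output:

```python
import json

def parse_sse_stream(lines):
    """Collect content deltas from OpenAI-style SSE lines, plus the final
    usage object (present only when stream_options.include_usage is set)."""
    text_parts, usage = [], None
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # sentinel: stream is complete
        chunk = json.loads(payload)
        if chunk.get("usage"):
            usage = chunk["usage"]  # arrives in the chunk just before [DONE]
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {}).get("content")
            if delta:
                text_parts.append(delta)
    return "".join(text_parts), usage

# Illustrative chunks (abbreviated):
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: {"choices":[],"usage":{"prompt_tokens":5,"completion_tokens":2}}',
    'data: [DONE]',
]
text, usage = parse_sse_stream(sample)
print(text)   # Hello
```

In a real client the lines would come from the HTTP response body rather than a list; the parsing logic is the same.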

Anthropic Messages

Anthropic uses named SSE events. On ClearMaas’s first-class Anthropic surface, the full set of events Anthropic emits passes through unchanged:
event: message_start
event: content_block_start
event: ping
event: content_block_delta
event: content_block_stop
event: message_delta
event: message_stop
Each event is followed by a data: {...} JSON line.
curl -N https://api.clearmaas.com/v1/messages \
  -H "Authorization: Bearer sk-clearmaas-..." \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "anthropic/claude-sonnet-4.6",
    "max_tokens": 256,
    "messages": [{"role":"user","content":"Tell me a haiku."}],
    "stream": true
  }'
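A client for this surface dispatches on the event name rather than a [DONE] sentinel. A minimal Python sketch, using illustrative, abbreviated payloads (not verbatim API output):

```python
import json

def parse_anthropic_stream(lines):
    """Handle Anthropic-style named SSE events: an 'event: <name>' line
    followed by a 'data: {...}' JSON line. Accumulate text deltas and
    stop at message_stop."""
    text_parts = []
    event = None
    for line in lines:
        if line.startswith("event: "):
            event = line[len("event: "):]
        elif line.startswith("data: "):
            data = json.loads(line[len("data: "):])
            if event == "content_block_delta":
                delta = data.get("delta", {})
                if delta.get("type") == "text_delta":
                    text_parts.append(delta.get("text", ""))
            elif event == "message_stop":
                break
            # ping, message_start, etc. carry no text; ignored here
    return "".join(text_parts)

# Illustrative event stream (abbreviated):
sample = [
    "event: message_start",
    'data: {"type":"message_start"}',
    "event: content_block_delta",
    'data: {"delta":{"type":"text_delta","text":"Hai"}}',
    "event: content_block_delta",
    'data: {"delta":{"type":"text_delta","text":"ku"}}',
    "event: message_stop",
    'data: {"type":"message_stop"}',
]
result = parse_anthropic_stream(sample)
print(result)   # Haiku
```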

Errors during a stream

Errors emitted mid-stream cannot use HTTP status codes: the status line was already sent when the stream opened. See Operations / Errors for the in-band error shapes.
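One way a client can surface such in-band failures is to raise on an error chunk. The {"error": ...} chunk shape below is purely an assumption for illustration; Operations / Errors defines the real shapes:

```python
import json

class StreamError(Exception):
    """Raised when a failure arrives in-band after the stream opened."""

def consume(lines):
    """Accumulate deltas, but raise StreamError on an error chunk.
    The {"error": {...}} shape is an assumption for illustration only;
    see Operations / Errors for the actual in-band error shapes."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        if "error" in chunk:
            # The HTTP status was already 200; the failure is in-band.
            raise StreamError(chunk["error"].get("message", "stream error"))
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {}).get("content")
            if delta:
                parts.append(delta)
    return "".join(parts)

# Illustrative failure mid-stream:
sample = [
    'data: {"choices":[{"delta":{"content":"par"}}]}',
    'data: {"error":{"message":"upstream timeout"}}',
]
try:
    consume(sample)
except StreamError as exc:
    print(f"stream failed mid-flight: {exc}")
```

Raising keeps partial text from being silently treated as a complete response.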

Streaming and fallback

Once any byte of the response has been sent to the client, ClearMaas can no longer fall back to the next chain entry — see the streaming caveat in Model Fallbacks.

Next steps

Tool calling

Stream tool-call deltas as they arrive.

Errors

Handle mid-stream failures.