Opsmeter
AI Cost & Inference Control

Developer Docs

Find out what drove your AI bill. Ship telemetry fast with provider-agnostic LLM cost tracking and per-call token and cost attribution.

Updated for 2026 · API v1 · GitHub


No-proxy LLM telemetry: track cost after each call

Opsmeter is production-ready without an SDK wrapper. Keep your provider call path unchanged, map provider usage fields, and send telemetry asynchronously with a short timeout.

Implementation model

  1. Call the provider SDK/API and read its usage metadata.
  2. Normalize the payload with endpointTag, promptVersion, userId, and token fields.
  3. Send the ingest request asynchronously with a short timeout, swallowing errors so telemetry never adds request latency.
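The steps above can be sketched in a few lines of Python. This is a minimal illustration, not the official client: the ingest URL is a placeholder, and the provider `usage` attribute names (`prompt_tokens`, `completion_tokens`) are assumptions modeled on common SDKs; adapt them to your provider.

```python
import json
import threading
import urllib.request

# Hypothetical ingest endpoint -- replace with your Opsmeter ingest URL.
OPSMETER_INGEST_URL = "https://opsmeter.example.invalid/v1/ingest"

def record_llm_call(response, *, endpoint_tag, prompt_version, user_id):
    """Normalize provider usage metadata and send it without blocking."""
    # Step 1: read usage metadata from the provider response (field names
    # vary by SDK; many expose a `usage` object with token counts).
    usage = getattr(response, "usage", None)

    # Step 2: normalize into the attribution payload.
    payload = {
        "endpointTag": endpoint_tag,
        "promptVersion": prompt_version,
        "userId": user_id,
        "inputTokens": getattr(usage, "prompt_tokens", 0),
        "outputTokens": getattr(usage, "completion_tokens", 0),
    }

    # Step 3: fire-and-forget on a daemon thread so the request path's
    # latency is unchanged even if the ingest endpoint is slow or down.
    threading.Thread(target=_send, args=(payload,), daemon=True).start()
    return payload

def _send(payload):
    req = urllib.request.Request(
        OPSMETER_INGEST_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=2)  # short timeout
    except Exception:
        pass  # swallow: telemetry failures must never break user requests
```

Because the send happens off the request thread and any failure is swallowed, your provider call path is identical with or without telemetry enabled.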

What is available now (No-SDK)

  • Endpoint, promptVersion, and user/tenant attribution.
  • Budget warning and exceeded workflows.
  • Export, retention policy, and compare workflows.

What is on the roadmap (SDK-enabled runtime)

  • Automatic wrappers for common frameworks.
  • Runtime clamp/fallback/queue enforcement patterns.
  • Policy contracts for machine-readable guardrail actions.