Opsmeter
AI Cost & Inference Control
Integration docs

Developer Docs

Ship telemetry fast with LLM cost tracking and OpenAI cost monitoring. Request-level budget alerts are available on Pro+ plans.

Updated for 2026 · API v1 · GitHub

Integration examples

Provider changes the usage mapping. The Opsmeter payload stays the same.

Provider call + usage extraction

Varies by provider
// Install: dotnet add package OpenAI
// OpenAI usage extraction (.NET)
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
var response = await openAiClient.Responses.CreateAsync(/* ... */);
stopwatch.Stop();
var latencyMs = stopwatch.ElapsedMilliseconds; // feeds the telemetry payload below

// Usage property names vary across SDK versions, so fall back
// defensively and default to 0 when usage is absent.
var usage = response.Usage;
var inputTokens = usage?.InputTokens ?? usage?.PromptTokens ?? 0;
var outputTokens = usage?.OutputTokens ?? usage?.CompletionTokens ?? 0;
var totalTokens = usage?.TotalTokens ?? inputTokens + outputTokens;
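To illustrate that only the usage mapping changes per provider, here is the same extraction against a raw Anthropic Messages API response. The JSON shape (`usage.input_tokens` / `usage.output_tokens`) follows Anthropic's HTTP API; the sample values are illustrative, and Anthropic returns no total, so it is derived:

```csharp
using System.Text.Json;

// Sample Anthropic Messages API response fragment (values illustrative).
var body = """{"usage":{"input_tokens":120,"output_tokens":45}}""";
using var doc = JsonDocument.Parse(body);
var u = doc.RootElement.GetProperty("usage");
var inputTokens = u.GetProperty("input_tokens").GetInt32();
var outputTokens = u.GetProperty("output_tokens").GetInt32();
var totalTokens = inputTokens + outputTokens; // no total field in the response
```

The Opsmeter payload is then filled exactly as in the OpenAI case, with provider = "anthropic".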

Opsmeter telemetry client

Always the same
using System.Net;
using System.Net.Http;      // HttpClient (already in scope with .NET 6+ implicit usings)
using System.Net.Http.Json; // PostAsJsonAsync

// TODO: Move apiKey to env/secret manager in production.
var apiKey = "YOUR_API_KEY";
var payload = new {
  provider = "openai",
  model = response.Model,
  promptVersion = "summarizer_v3",
  endpointTag = "summary",
  inputTokens,
  outputTokens,
  totalTokens,
  latencyMs,
  status = "success",
  dataMode = "real", // real | test | demo
  environment = "prod" // prod | staging | dev
};

await SendTelemetryAsync(payload, apiKey);

static async Task SendTelemetryAsync(object payload, string apiKey)
{
  if (string.IsNullOrWhiteSpace(apiKey)) return;
  // Requires: System.Net.Http.Json (built-in in .NET 6+).
  // In production, reuse a single static HttpClient instead of creating
  // one per call, to avoid socket exhaustion under load.
  using var http = new HttpClient();
  http.DefaultRequestHeaders.Add("X-API-Key", apiKey);
  // Keep the timeout short: telemetry must never block the request path.
  using var cts = new CancellationTokenSource(TimeSpan.FromMilliseconds(600));
  try
  {
    var res = await http.PostAsJsonAsync("https://api.opsmeter.io/v1/ingest/llm-request", payload, cts.Token);
    if (res.StatusCode == HttpStatusCode.TooManyRequests)
    {
      var retryAfter = (int?)res.Headers.RetryAfter?.Delta?.TotalSeconds ?? 1;
      Console.WriteLine($"Telemetry throttled, retry after {retryAfter}s");
      return;
    }
    if ((int)res.StatusCode == 402)
    {
      Console.WriteLine("Telemetry paused due to plan/budget policy.");
      return;
    }
  }
  catch
  {
    // Swallow errors: failed telemetry must not fail the user request.
  }
}
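For debugging outside .NET, the same ingest call can be expressed as a raw HTTP request. The endpoint and X-API-Key header are as above; the field values are illustrative:

```http
POST /v1/ingest/llm-request HTTP/1.1
Host: api.opsmeter.io
X-API-Key: YOUR_API_KEY
Content-Type: application/json

{
  "provider": "openai",
  "model": "gpt-4o-mini",
  "promptVersion": "summarizer_v3",
  "endpointTag": "summary",
  "inputTokens": 120,
  "outputTokens": 45,
  "totalTokens": 165,
  "latencyMs": 830,
  "status": "success",
  "dataMode": "real",
  "environment": "prod"
}
```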

If the provider response omits usage, send 0 tokens and backfill the record later, once token counts and pricing are known.

userId is optional; if omitted, requests are grouped under unknown. Do not send PII: hashed or surrogate identifiers are recommended.
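One way to keep userId free of PII, as recommended above, is to send a deterministic hash of your internal user id. A minimal sketch using only the standard library (HashUserId is a hypothetical helper name, and the raw id is illustrative):

```csharp
using System.Security.Cryptography;
using System.Text;

// Hypothetical helper: derive a stable surrogate id so raw PII never leaves your service.
static string HashUserId(string rawUserId)
{
    var bytes = SHA256.HashData(Encoding.UTF8.GetBytes(rawUserId));
    return Convert.ToHexString(bytes).ToLowerInvariant(); // 64 hex chars
}

// The payload could then carry: userId = HashUserId(internalUserId)
```

Because the hash is deterministic, the same user still groups consistently across requests without the raw identifier ever reaching Opsmeter.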