Opsmeter.io
AI Cost & Inference Control


OpenAI token pricing changes: keep your cost table updated

Model prices change over time. If your cost model is not versioned, your historical reporting becomes unreliable.


Full guide: LLM pricing tables: keep costs accurate and handle unknown models

What this guide answers

  • What changed in cost, cost per request, or budget posture.
  • Which endpoint, prompt, model, or tenant likely drove the delta.
  • Which validation step or control to apply next in Opsmeter.io.

What to send (payload example)

{
  "externalRequestId": "req_01HZXB6MQZ2WQ9D2KCF9M4V2QY",
  "provider": "openai",
  "model": "gpt-4o-mini",
  "endpointTag": "catalog.pricing_reconcile",
  "promptVersion": "pricing_v2",
  "userId": "tenant_acme_hash",
  "inputTokens": 320,
  "outputTokens": 120,
  "latencyMs": 892,
  "status": "success",
  "dataMode": "real",
  "environment": "prod"
}
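The payload above can be sent as a plain HTTPS POST. A minimal sketch, assuming a hypothetical ingestion endpoint and API key (`INGEST_URL` and `API_KEY` are placeholders, not documented values; substitute the real ones from your workspace settings):

```python
import json
import urllib.request

# Hypothetical endpoint and key -- replace with the values
# from your Opsmeter.io workspace settings.
INGEST_URL = "https://api.opsmeter.io/v1/requests"
API_KEY = "opm_live_placeholder"

def build_ingest_request(event: dict) -> urllib.request.Request:
    """Serialize one request event into an authenticated HTTP POST."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        INGEST_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

event = {
    "externalRequestId": "req_01HZXB6MQZ2WQ9D2KCF9M4V2QY",
    "provider": "openai",
    "model": "gpt-4o-mini",
    "inputTokens": 320,
    "outputTokens": 120,
    "status": "success",
    "dataMode": "real",
    "environment": "prod",
}
req = build_ingest_request(event)
# urllib.request.urlopen(req)  # uncomment to actually send
```

Keeping `dataMode` and `environment` on every event is what lets later steps exclude test and demo traffic from pricing reconciliation.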

Common mistakes

  • Overwriting historical price snapshots instead of versioning by effective date.
  • Ignoring unknown-model rows until dashboards become untrustworthy.
  • Missing cached/reasoning token nuances and mispricing requests.
  • Mixing test/demo traffic into production pricing reconciliation.

How to verify in the Opsmeter.io dashboard

  1. Open Catalog to confirm model mapping and pricing effective dates.
  2. Check unknown-model visibility and resolve pending pricing rows.
  3. Spot-check cost snapshots on recent requests to validate ingestion accuracy.
  4. Reconcile aggregates against provider usage exports for the same window.

Core rule: snapshot request cost at ingest

Store inputCostUsd, outputCostUsd, and totalCostUsd on each request row.

Historical rows should remain stable even if provider price tables change later.
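The snapshot rule can be sketched as a small pricing function run once at ingest. The rate table below is illustrative (hardcoded per-million-token rates stand in for your versioned catalog), and `Decimal` avoids float rounding drift on very small per-request amounts:

```python
from decimal import Decimal

# Illustrative rates, USD per 1M tokens -- in production these come
# from the versioned catalog, not a hardcoded dict.
RATES = {
    ("openai", "gpt-4o-mini"): {
        "input_per_mtok": Decimal("0.15"),
        "output_per_mtok": Decimal("0.60"),
    },
}

def snapshot_cost(provider, model, input_tokens, output_tokens):
    """Compute and freeze the cost of one request at ingest time.

    The returned values are stored on the request row itself, so later
    catalog changes never alter historical reporting."""
    rate = RATES[(provider, model)]
    input_cost = rate["input_per_mtok"] * input_tokens / 1_000_000
    output_cost = rate["output_per_mtok"] * output_tokens / 1_000_000
    return {
        "inputCostUsd": input_cost,
        "outputCostUsd": output_cost,
        "totalCostUsd": input_cost + output_cost,
    }

snap = snapshot_cost("openai", "gpt-4o-mini", 320, 120)
```

Once these three fields are persisted per row, a later price change only affects rows ingested after its effective date.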

Use this workflow

Turn diagnosis into action

Identify the cost driver, validate it with attribution, then apply one durable control before the next billing cycle.

Apply in your workspace

Re-run this workflow on your own spend data

Follow the same path from article insight to telemetry verification, then validate with your own cost signals.

  • Quickstart path: send a first payload, confirm attribution, then return here for operations context. (Open quickstart)
  • Evaluation path: pair this guide with trust proof, status, and compare surfaces during review. (Open trust proof pack)

Pricing sync workflow

  1. Pull source pricing feed on schedule.
  2. Update model catalog entries with effective timestamps.
  3. Mark unknown or disabled models explicitly.
  4. Validate catalog updates before applying to ingestion path.

Failure modes to avoid

  • Backfilling historical requests with new prices
  • Silent unknown-model fallback without visibility
  • Mixing test/demo traffic into production pricing decisions

Version rates by effective date (not by deploy date)

Pricing changes are a time-series problem. The same model name can have different effective rates at different dates.

Store effectiveFrom timestamps and apply the correct rate at ingest time so charts remain consistent.

  • Keep one catalog row per provider/model/effectiveFrom.
  • Do not overwrite older catalog rows; append new versions.
  • Record the pricing source and verification timestamp.
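The append-only catalog and effective-date lookup can be sketched as follows. Dates and rates here are placeholders for illustration, not published prices; the lookup picks the latest row whose `effectiveFrom` is at or before the request timestamp:

```python
from bisect import bisect_right
from datetime import datetime, timezone

# Append-only catalog: one row per provider/model/effectiveFrom,
# kept sorted by effectiveFrom. Rates are illustrative placeholders.
CATALOG = {
    ("openai", "gpt-4o-mini"): [
        # (effectiveFrom, input USD/1M tok, output USD/1M tok)
        (datetime(2024, 7, 18, tzinfo=timezone.utc), 0.15, 0.60),
        (datetime(2025, 1, 1, tzinfo=timezone.utc), 0.12, 0.48),
    ],
}

def rate_at(provider, model, ts):
    """Return the catalog row with the latest effectiveFrom <= ts."""
    rows = CATALOG[(provider, model)]
    idx = bisect_right([row[0] for row in rows], ts) - 1
    if idx < 0:
        raise LookupError("no rate effective at this timestamp")
    return rows[idx]
```

Because rows are appended rather than overwritten, a request ingested in December 2024 and a request ingested in June 2025 each resolve to their own correct rate from the same catalog.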

Unknown models are an operations queue

Unknown models are not just a data issue; they are a reporting risk. If you cannot price traffic, you cannot trust cost totals.

Treat unknown-model rows as a daily triage workflow with an owner and SLA.

  1. Alert when unknown-model ratio exceeds threshold.
  2. Create a pricing request with evidence and effective date.
  3. Approve and publish the catalog update.
  4. Verify new ingests are priced without rewriting history.
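Step 1 of the triage workflow can be sketched as a ratio check over recent request rows. The `pricingStatus` field name and the 2% threshold are assumptions for illustration; pick a threshold that matches your tolerance for unpriced spend:

```python
def unknown_model_ratio(rows):
    """Fraction of request rows whose model was not found in the catalog."""
    if not rows:
        return 0.0
    unknown = sum(1 for r in rows if r.get("pricingStatus") == "unknown_model")
    return unknown / len(rows)

# Illustrative threshold: alert when more than 2% of traffic is unpriced.
UNKNOWN_RATIO_THRESHOLD = 0.02

def should_alert(rows):
    return unknown_model_ratio(rows) > UNKNOWN_RATIO_THRESHOLD
```

Routing the alert to a named owner with an SLA is what turns this from a metric into a queue.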

Token classes: cached, reasoning, and other usage fields

Providers may expose multiple usage fields that do not map cleanly to simple input/output billing. Keep raw usage fields and normalize with documented rules.

When usage is missing, treat it as uncertainty rather than zero.

  • Store raw usage fields (source of truth) alongside normalized totals.
  • Document mapping rules and update them when provider fields change.
  • Reconcile against invoice exports to validate the mapping.
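A normalization sketch for these rules, keeping raw usage as the source of truth and treating missing usage as `None` rather than zero. The nested field names follow the shape of OpenAI's chat completions usage object at the time of writing (`prompt_tokens`, `completion_tokens`, `prompt_tokens_details.cached_tokens`); verify them against your provider's current documentation:

```python
def normalize_usage(raw):
    """Map raw provider usage fields to normalized totals.

    The raw dict is stored untouched for later reconciliation;
    missing usage becomes None (uncertainty), never 0."""
    if raw is None:
        return {
            "raw": None,
            "inputTokens": None,
            "outputTokens": None,
            "cachedInputTokens": None,
        }
    cached = raw.get("prompt_tokens_details", {}).get("cached_tokens", 0)
    return {
        "raw": raw,                   # source of truth, kept verbatim
        "inputTokens": raw.get("prompt_tokens"),
        "outputTokens": raw.get("completion_tokens"),
        "cachedInputTokens": cached,  # often billed at a discounted rate
    }
```

When the provider adds or renames a usage field, only this mapping changes; historical raw payloads remain intact for re-reconciliation.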

Reconciliation: keep finance trust

  1. Export provider usage totals for the billing period.
  2. Compare to internal aggregates for the same UTC window.
  3. Investigate deltas: unknown models, missing usage, window mismatches.
  4. Record the outcome with owner and timestamp.
  5. Attach reconciliation notes to your monthly CFO pack.
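Steps 2 and 3 above reduce to a delta check between two totals for the same UTC window. A minimal sketch; the 1% relative tolerance is an illustrative default, not a recommendation:

```python
def reconcile(provider_total_usd, internal_total_usd, tolerance=0.01):
    """Compare provider-exported spend to internal aggregates
    for the same UTC billing window.

    tolerance is a relative bound (1% here, illustrative)."""
    delta = internal_total_usd - provider_total_usd
    rel = abs(delta) / provider_total_usd if provider_total_usd else float("inf")
    return {
        "deltaUsd": round(delta, 6),
        "relativeDelta": rel,
        "withinTolerance": rel <= tolerance,
    }
```

A delta outside tolerance points back at the usual suspects from step 3: unknown models, missing usage fields, or mismatched time windows.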

Related guides

Open catalog docs · View operations · Compare alternatives

Evaluation resources

For security and procurement reviews, use our trust summary before final tool selection.

Open trust proof pack