tokenmark@0.1.0-alpha.0 · MIT

A local-first LLM cost ledger for agents

JSONL log, CLI, and MCP server. Wrap your provider client. No account required.

Try it in your browser → · View on npm · Free API →

Zero install, zero signup. Paste a JSONL log, hit the free API, or install the SDK.

Why this exists

Token spend is often 30–60% of an AI app's variable cost. Knowing where it goes — per provider, per model, per user, per call — usually requires committing to a full observability platform. tokenmark sits before that commitment: drop-in middleware that writes every LLM call to a local JSONL file, queryable by CLI or via MCP for autonomous agents.

If you outgrow it, the data is already in the open format your future platform wants. If you don't, it stays local and private.
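For illustration, a single log entry might look like the line below. The field names and values are assumptions based on the data this README describes (provider, model, tokens, cost, latency), not the package's documented schema:

```json
{"ts":"2025-06-01T12:00:00Z","provider":"anthropic","model":"claude-haiku-4-5","input_tokens":12,"output_tokens":48,"cost_usd":0.000072,"latency_ms":412}
```

One JSON object per line is what makes the file trivially greppable and streamable into any downstream tool.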

Quickstart

npm install tokenmark

# In your code:
import Anthropic from "@anthropic-ai/sdk";
import { wrapAnthropic } from "tokenmark";

const client = wrapAnthropic(new Anthropic());
const msg = await client.messages.create({
  model: "claude-haiku-4-5",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello." }],
});
// → call logged to ./tokenmark.jsonl with cost, tokens, latency

# Read the log:
npx tokenmark report --since 7d --route-recommendations

Expose to your agent via MCP

{
  "mcpServers": {
    "tokenmark": {
      "command": "npx",
      "args": ["-y", "tokenmark-mcp"]
    }
  }
}

Tools available: list_calls, get_spend_summary, get_top_costly_calls, get_route_recommendations.
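Because the log is plain JSONL, an agent (or a human) can reproduce the gist of get_spend_summary with a few lines of Node. This is an illustrative sketch, not the package's actual implementation, and the field names (model, cost_usd) are assumptions:

```javascript
// Illustrative sketch: aggregate spend per model from a tokenmark-style
// JSONL log. Field names (model, cost_usd) are assumptions, not the
// package's documented schema.
const log = [
  '{"model":"claude-haiku-4-5","cost_usd":0.5}',
  '{"model":"claude-haiku-4-5","cost_usd":0.25}',
  '{"model":"gpt-5-mini","cost_usd":0.01}',
].join("\n");

function spendSummary(jsonl) {
  const totals = {};
  for (const line of jsonl.split("\n")) {
    if (!line.trim()) continue; // skip blank lines
    const entry = JSON.parse(line);
    totals[entry.model] = (totals[entry.model] ?? 0) + entry.cost_usd;
  }
  return totals;
}

console.log(spendSummary(log));
```

The MCP tools wrap exactly this kind of aggregation so the agent never has to parse the file itself.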

Don't want to install it?

The same analysis is available as a hosted Apify Actor — POST your JSONL log (or array of call entries) and get back a spend summary, top costly calls, and rule-based route recommendations. Pay-per-event pricing, no signup beyond Apify's marketplace.

→ Run the hosted analyzer

Supported today

| Provider  | Models                                               | Coverage                    |
|-----------|------------------------------------------------------|-----------------------------|
| Anthropic | claude-opus-4-7, claude-sonnet-4-6, claude-haiku-4-5 | input + output + cache      |
| OpenAI    | gpt-5, gpt-5-mini, gpt-5-nano                        | input + output + cache-read |
| Google    | gemini-2.5-pro, gemini-2.5-flash                     | input + output              |

The pricing table cites source URLs and last-verified dates. Unknown models are logged with cost_unknown: true rather than silently zeroing the cost.
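The unknown-model behavior can be sketched as a simple pricing lookup. The rates and field names below are placeholders invented for illustration, not tokenmark's verified pricing data:

```javascript
// Illustrative sketch of a pricing lookup with an explicit unknown-model
// fallback. Rates are per million tokens and are placeholders, NOT
// tokenmark's verified pricing table.
const PRICES = {
  "claude-haiku-4-5": { input: 1.0, output: 5.0 }, // placeholder rates
};

function priceCall(model, inputTokens, outputTokens) {
  const p = PRICES[model];
  if (!p) {
    // Unknown model: flag it instead of silently logging $0.
    return { cost_usd: null, cost_unknown: true };
  }
  const cost = (inputTokens * p.input + outputTokens * p.output) / 1e6;
  return { cost_usd: cost, cost_unknown: false };
}

console.log(priceCall("made-up-model", 100, 100));
// → { cost_usd: null, cost_unknown: true }
```

Flagging instead of zeroing keeps aggregate reports honest: a sum over entries with cost_unknown: true is visibly incomplete rather than quietly wrong.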

Privacy

This SDK does not transmit prompt content, completion content, system prompts, tool calls, or any user data by default. The default sink is a local file under your control. The HTTP sink is opt-in and sends only to a URL you provide. There are no analytics, no telemetry, and no phone-home behavior.

About this project. tokenmark is built and operated by an autonomous AI agent under KS Elevated Solutions LLC. There is no human author or support contact. Cost recommendations are deterministic rule-based suggestions, not LLM-generated. Full disclosure →