See how much your OpenAI Codex sessions cost - broken down by day, month, or session - right from your terminal.
Inspired by @ccusage/codex, codexusage is a ground-up Rust rewrite for users who need a significantly faster, lighter way to scan large session histories.
```sh
cargo install codexusage
```

```sh
# Daily usage summary
codexusage daily

# Monthly report as JSON
codexusage monthly --json

# Last 7 days only
codexusage daily --last-days 7

# Custom session directory
codexusage session --session-dir /path/to/sessions
```

Filter by date range:

```sh
codexusage daily --since 2026-03-01 --until 2026-03-31
```

Skip the network and use cached pricing:

```sh
codexusage daily --offline
```

JSON output for scripts and dashboards:

```sh
codexusage daily --json
```

Monitor today's usage live in your terminal. The screen refreshes automatically and shows a rolling burn rate so you can see how fast tokens are being consumed:

```sh
codexusage watch
```

Custom refresh interval (default 5 seconds):

```sh
codexusage watch --interval 10
```

Opt in to per-model burn-rate columns inside the watch table:

```sh
codexusage watch --per-model-burn-rate
```

Sample output:
```text
Current Day Codex Usage Watch
Date: 2026-03-11    Window: 60 minutes
+-----------+------------+----------------+
| Metric    | Today      | Burn Rate (/h) |
+-----------+------------+----------------+
| Input     | 450K       | 56K            |
| Cache     | 120K       | 15K            |
| Output    | 30K        | 4K             |
| Reasoning | 12K        | 2K             |
| Total     | 612K       | 77K            |
| Cost      | $0.85      | $0.11          |
| Updated   | 2026-03-11 | 14:32:23       |
+-----------+------------+----------------+
```
Watch mode does not support `--json`, `--since`, `--until`, or `--last-days`.
Use `--per-model-burn-rate` to append one burn-rate column per active model; the aggregate Burn Rate (/h) stays the rightmost column.
Commands:

- `daily` - group usage by day
- `monthly` - group usage by month
- `session` - group usage by session
- `watch` - live current-day monitor with burn rate
Flags:

- `--json` - emit structured JSON instead of a table
- `--since` / `--until` - inclusive date filters
- `--last-days N` / `-L N` - show the last N calendar days (daily only)
- `--timezone` - IANA timezone for grouping
- `--offline` - use cached pricing, never hit the network
- `--refresh-pricing` - force a pricing refresh
- `--session-dir` - override the session directory (repeatable)
- `--threads N` - scanner worker count
- `--number-format full` - show full token counts instead of K/M/B/T
- `watch --per-model-burn-rate` - show per-model burn-rate columns in the live watch table

`--last-days` cannot be combined with `--since` or `--until`.
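When `--last-days` does not fit (for example, "the previous calendar month"), the same window can be expressed with explicit `--since`/`--until` dates. A minimal sketch using GNU `date` arithmetic (macOS/BSD `date` uses `-v` instead of `-d`); the last line only prints the command it would run, so the sketch stands on its own:

```shell
# First day of the current month, in ISO format.
month_start=$(date +%Y-%m-01)

# First and last day of the previous month (GNU date arithmetic).
since_date=$(date -d "$month_start -1 month" +%Y-%m-%d)
until_date=$(date -d "$month_start -1 day" +%Y-%m-%d)

# Print the command rather than running it.
echo codexusage daily --since "$since_date" --until "$until_date"
```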
By default, codexusage scans `$CODEX_HOME/sessions` (or `~/.codex/sessions` when `CODEX_HOME` is not set).
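That fallback is a plain environment-variable default, so you can reproduce the resolved path in one line of shell (this mirrors the documented behavior, not codexusage internals):

```shell
# $CODEX_HOME/sessions when CODEX_HOME is set, else ~/.codex/sessions.
sessions_dir="${CODEX_HOME:-$HOME/.codex}/sessions"
echo "$sessions_dir"
```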
```text
Daily Codex Usage Report
+------------+--------------+-------+-------+--------+-----------+-------+-------+
| Date       | Model        | Input | Cache | Output | Reasoning | Total | Cost  |
+------------+--------------+-------+-------+--------+-----------+-------+-------+
| 2026-03-01 | TOTAL        | 120K  | 30K   | 8K     | 4K        | 158K  | $0.20 |
|            | gpt-5        | 120K  | 30K   | 8K     | 4K        | 158K  | $0.20 |
+------------+--------------+-------+-------+--------+-----------+-------+-------+
| 2026-03-02 | TOTAL        | 90K   | 20K   | 6K     | 2K        | 118K  | $0.03 |
|            | gpt-5-mini   | 90K   | 20K   | 6K     | 2K        | 118K  | $0.03 |
|            | GRAND TOTAL  | 210K  | 50K   | 14K    | 6K        | 276K  | $0.23 |
+------------+--------------+-------+-------+--------+-----------+-------+-------+
```
Pricing data is cached locally and refreshed automatically when stale. Use `--offline` to skip network access entirely, or `--refresh-pricing` to force an update.
Live pricing refreshes use the system CA store through reqwest's Rustls backend. Minimal containers or stripped-down CI images may need a CA bundle such as ca-certificates installed for refreshes to succeed.
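For a Debian- or Ubuntu-based container image, installing that CA bundle is a one-line provisioning step (package names differ on other distros; this is an illustrative fragment, not part of codexusage):

```shell
# Debian/Ubuntu-based image or CI runner.
apt-get update && apt-get install -y --no-install-recommends ca-certificates
```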
codexusage was built because @ccusage/codex became unusably slow on large session directories. The table below shows a side-by-side comparison on the same dataset.
Test environment: 4.4 GB across 7726 session files, Intel i7-1185G7 (8 logical CPUs).
| Tool | Wall time | Peak RSS |
|---|---|---|
| codexusage | 1.82 s | 193 MB |
| @ccusage/codex | 5 min 36 s | 11,547 MB |
Notes:

- @ccusage/codex was already cached by Bun; the one-time download is not counted.
- Results reflect this specific machine and dataset, not a universal guarantee.
Build from source:

```sh
cargo build --release
```

Source builds require a working C toolchain and CMake, because the reqwest TLS stack pulls in aws-lc-rs.
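A quick way to check those prerequisites before kicking off a build (sketch only; `cc` may be named `gcc` or `clang` on some systems):

```shell
# Report whether the native build prerequisites for aws-lc-rs are present.
for tool in cc cmake; do
  if command -v "$tool" > /dev/null; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```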
Quality commands:

```sh
just fmt
just clippy
just test
just bench
```