© 2026 Calcis · LLM cost estimates before you run anything. Calcis produces estimates, not invoices. See the accuracy disclaimer.

Session cost simulator

See how conversation costs compound across turns, or analyse what your context file costs per call.
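Why conversation costs compound rather than add: on most chat APIs each turn resends the full conversation history as input tokens, so the input bill grows with every turn. A minimal sketch of that effect, with illustrative per-token prices and message sizes (not Calcis's actual rates or model):

```python
# Sketch of multi-turn cost compounding, assuming each turn replays
# the full conversation history as input tokens. The prices and
# token counts below are illustrative assumptions, not real rates.

INPUT_PRICE_PER_MTOK = 3.00    # assumed $ per million input tokens
OUTPUT_PRICE_PER_MTOK = 15.00  # assumed $ per million output tokens

def session_cost(turns, user_tokens=200, reply_tokens=300, system_tokens=500):
    """Total cost of a session where every turn resends prior history."""
    history = system_tokens
    total = 0.0
    for _ in range(turns):
        history += user_tokens                       # new user message
        total += history * INPUT_PRICE_PER_MTOK / 1e6
        total += reply_tokens * OUTPUT_PRICE_PER_MTOK / 1e6
        history += reply_tokens                      # reply joins the history
    return total

# A 20-turn session costs far more than four 5-turn sessions,
# because input cost grows roughly quadratically with turn count.
print(session_cost(5), session_cost(20))
```

The superlinear growth is the point of the simulator: per-turn cost depends on everything said before, so doubling the turn count more than doubles the bill.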

You have hit the 50-estimate monthly soft cap.

Anonymous use stays free. Sign up free for a 200-estimate monthly allowance and the ability to save your estimates. Upgrade to Pro for the LLM-assisted predictor (the most accurate layer in the stack), the Bayesian P10 to P90 confidence band, and an unlimited per-tier quota.

  • Sign up free, no card needed
  • Pro adds the LLM-assisted predictor (Precise mode)
  • Cancel anytime, full refund within 30 days
Sign up free · Go Pro
Example analysis
CONTEXT.md
3K tokens · 11.4 KB
Cost per call

Model                Per call     3,000 calls/mo
Claude Haiku 4.5     $0.002847    $8.54
Claude Sonnet 4.6    $0.008541    $25.62
Claude Opus 4.6      $0.0142      $42.71
GPT-4o               $0.007118    $21.35
GPT-4o mini          $0.000427    $1.28

100 calls/day on Sonnet: $25.62/mo
Fits all major models:   Yes
Cheapest fit:            GPT-4o mini
Yearly on Sonnet:        $307.48
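The per-call figures are straight input-token arithmetic: file tokens times the provider's input rate per million tokens, then scaled by call volume. A minimal sketch that reproduces the table's shape, assuming the "3K tokens" file is 2,847 tokens and using illustrative $/MTok input rates (check each provider's pricing page for current numbers):

```python
# Per-call and monthly cost from input-token arithmetic.
# FILE_TOKENS and the rates below are assumptions for illustration.
PRICES_PER_MTOK = {
    "Claude Haiku 4.5": 1.00,
    "Claude Sonnet 4.6": 3.00,
    "GPT-4o mini": 0.15,
}
FILE_TOKENS = 2_847        # assumed exact size of the ~3K-token file
CALLS_PER_MONTH = 3_000    # 100 calls/day over a 30-day month

for model, rate in PRICES_PER_MTOK.items():
    per_call = FILE_TOKENS * rate / 1_000_000
    monthly = per_call * CALLS_PER_MONTH
    print(f"{model}: ${per_call:.6f}/call, ${monthly:.2f}/mo")
```

Note this counts input tokens only; the simulator's full numbers would also include output tokens at each model's (higher) output rate.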

This is a live preview

Upgrade to Pro to analyse your own context files.

Upgrade to Pro