Zero Markup Policy - We pass through exact provider pricing with no hidden fees. LLMLayer only charges $0.004 per search for our infrastructure. You get the same rates as going direct to providers, plus our powerful search capabilities.

Why LLMLayer Pricing?

  • βœ… No markup on models - Pay exactly what providers charge
  • βœ… One API, all models - No need for multiple provider accounts
  • βœ… $0.004 per search - Predictable infrastructure cost
  • βœ… Real-time web search - Built into every request
  • βœ… Instant switching - Change models with a single line of code
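"Instant switching" works because the model is just a string parameter on each request. A minimal sketch of the idea (the function name and request shape below are illustrative assumptions, not the official LLMLayer SDK):

```python
# Hypothetical request builder -- illustrates that switching models
# is a one-line change; this is not the official LLMLayer client.
def build_search_request(query: str, model: str) -> dict:
    """Assemble a search request; the model identifier is one field."""
    return {"model": model, "query": query}

# Swapping premium reasoning for a budget model changes only the string:
request = build_search_request("latest AI news", "openai/gpt-5.1")
request = build_search_request("latest AI news", "openai/gpt-4o-mini")
```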

Model Pricing

All prices are per million tokens. We update these automatically when providers change their rates.

OpenAI Models

| Model | Input ($/M tokens) | Output ($/M tokens) | Best For |
|---|---|---|---|
| openai/gpt-5.1 | $1.25 | $10.00 | Premium reasoning |
| openai/gpt-4o-mini | $0.15 | $0.60 | Fast, affordable searches |

LLMLayer Models

| Model | Pricing | Best For |
|---|---|---|
| llmlayer-web | $0.007/query flat (no token costs) | Finetuned GPT-4o-mini for AI-powered web answers |
| llmlayer-fast | $0.009/query flat (no token costs) | GPT-OSS-120B on Cerebras for blazing-fast answers |

llmlayer-web is a finetuned version of GPT-4o-mini optimized specifically for web search and answers. Pay a flat $0.007 per query with no token charges. Max 4 queries per request.

llmlayer-fast is GPT-OSS-120B served via Cerebras' blazing-fast inference engine, optimized for fast web answers. Pay a flat $0.009 per query with no token charges. Recommended when speed is a priority. Max 4 queries per request.
Coming Soon - We’re constantly adding new models and providers. Pricing updates are reflected immediately when providers change their rates.

The LLMLayer Advantage

πŸš€ Production-Ready Infrastructure

  • Automatic retries - Handle failures gracefully
  • Global CDN - Low latency worldwide

πŸ’° Unbeatable Value

  • No API key juggling - One key for all models
  • Pay-as-you-go - No minimums or commitments

Pricing Calculator

Want to estimate your costs? Here’s a simple formula (token prices are quoted per million tokens, so divide token counts by 1,000,000):

Standard models:
Total Cost = $0.004 (per search) + (Input Tokens Γ· 1M Γ— Input Price) + (Output Tokens Γ· 1M Γ— Output Price)

LLMLayer models (flat pricing):
llmlayer-web:  Total Cost = $0.007 Γ— number of queries
llmlayer-fast: Total Cost = $0.009 Γ— number of queries
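The formulas above translate directly into code. A minimal Python sketch, with rates hard-coded from the tables on this page (note the division by one million, since token prices are per million tokens):

```python
SEARCH_FEE = 0.004  # flat infrastructure fee per search
FLAT_RATES = {"llmlayer-web": 0.007, "llmlayer-fast": 0.009}  # per query

def standard_cost(input_tokens: int, output_tokens: int,
                  input_price: float, output_price: float) -> float:
    """Cost of one search on a token-priced model ($/M token rates)."""
    return (SEARCH_FEE
            + input_tokens / 1_000_000 * input_price
            + output_tokens / 1_000_000 * output_price)

def flat_cost(model: str, queries: int) -> float:
    """Cost for flat-priced LLMLayer models (no token charges)."""
    return FLAT_RATES[model] * queries

# e.g. one gpt-4o-mini search with 10k input / 1k output tokens:
# 0.004 + 0.0015 + 0.0006 = $0.0061
cost = standard_cost(10_000, 1_000, 0.15, 0.60)
```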

FAQ

Do you mark up model prices?
No. You pay the exact same model costs as going direct to OpenAI. The only additional cost is $0.004 per search for our infrastructure - a tiny price for integrated web search, relevant-source ranking, and unified API access.

What does the provider_key parameter do?
The provider_key parameter is deprecated and currently ignored by the API. It remains available for backward compatibility.

How often is pricing updated?
We validate pricing every week. Any changes are reflected as soon as possible in your billing.

Do new users get free credits?
Yes, new users get $2 in free credits to try out the service.

Start Building Today

Get API Key

Sign up and get free credits

View Answer API

Start with the core Answer API

API Reference

Explore all features

Contact Sales

Enterprise & volume pricing

Questions? Join our Discord or email support@llmlayer.ai