Pricing




WorkflowAI Cloud uses pay-as-you-go pricing, similar to Amazon Web Services: there are no fixed costs, no minimum spend, no annual commitment, and no need to talk to sales to get started.

Our Price match guarantee

We offer a price-match guarantee for all LLM providers: WorkflowAI Cloud charges the same per-token price as using each provider directly. Currently, we support models from OpenAI, Anthropic, Google, Mistral, DeepSeek, Grok, and Llama (provided by FireworksAI).

If you have credits with Amazon, Google, or Azure, you can continue to use them via WorkflowAI Cloud by providing your own API keys.

What does WorkflowAI Cloud charge for?

WorkflowAI Cloud only charges for:

  • the tokens generated by your AI features (per token generated)

  • the tools used by your AI features (per tool used)

We do not charge for:

  • data storage

  • quantity of AI features

  • users in your organization

  • bandwidth or CPU usage
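The billing model above can be sketched as a small calculation. The rates below are hypothetical placeholders for illustration only, not actual WorkflowAI prices:

```python
# Sketch of how a pay-as-you-go invoice is composed under this model.
# price_per_token and price_per_tool_call are hypothetical placeholder rates.

def run_cost(output_tokens: int, tool_calls: int,
             price_per_token: float, price_per_tool_call: float) -> float:
    """Cost of a single run: tokens generated plus tools used.

    Nothing else contributes: storage, seats, and bandwidth add zero.
    """
    return output_tokens * price_per_token + tool_calls * price_per_tool_call

# Example: 1,200 generated tokens at an assumed $10 per 1M tokens,
# plus 2 tool calls at an assumed $0.001 each.
cost = run_cost(1200, 2, 10 / 1_000_000, 0.001)
print(f"${cost:.6f}")  # prints "$0.014000"
```

The point of the sketch is the shape of the formula: the only billable quantities are tokens generated and tools used.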

Then how does WorkflowAI Cloud make money?

We make our margin by buying LLM tokens at a bulk discount and reselling them to you at the standard public price.

To break this down further, look at what actually drives the cost of inference: mostly GPUs and electricity, not tokens themselves. When you buy tokens from an LLM provider, you are effectively paying for that electricity and GPU time. Paying for GPU usage is most efficient when the GPU runs at maximum capacity all the time. That is hard to achieve on your own, but possible for WorkflowAI because we pool demand from all our users. Pooling demand lets WorkflowAI keep your LLM costs low while still giving you access to a wide range of providers.
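The pooling argument can be made concrete with some back-of-the-envelope arithmetic. All numbers here are illustrative assumptions, not real provider or WorkflowAI figures:

```python
# Why pooled demand lowers per-token cost: a GPU has a roughly fixed hourly
# cost (hardware plus electricity), so the cost per token falls as
# utilization rises. All numbers below are illustrative assumptions.

def cost_per_million_tokens(gpu_hourly_cost: float,
                            tokens_per_hour_at_full_load: float,
                            utilization: float) -> float:
    """Effective cost per 1M tokens at a given GPU utilization (0..1)."""
    tokens_per_hour = tokens_per_hour_at_full_load * utilization
    return gpu_hourly_cost / tokens_per_hour * 1_000_000

# A single tenant might keep a GPU only 10% busy; a pooled fleet serving
# many users might sustain 90% utilization on the same hardware.
solo   = cost_per_million_tokens(4.0, 10_000_000, 0.10)  # $4.00 per 1M tokens
pooled = cost_per_million_tokens(4.0, 10_000_000, 0.90)  # ~$0.44 per 1M tokens
```

Under these assumed numbers, the same GPU hour yields roughly 9x cheaper tokens when utilization is kept high, which is the headroom that makes reselling at the public per-token price viable.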
