
T2V Velocity: An AI-Native Business Metric

Sean · 5 min read

Token Consumption ≠ Business Value: A Blind Spot the AI Industry Ignores

I currently run two businesses.

One is an AI proxy service—essentially LLM API reselling. The other is a travel company that uses AI to write short video scripts.

When selling tokens, I noticed that programmers consume the most—easily tens of thousands, even hundreds of thousands of tokens for debugging, refactoring, and generating code.

When using tokens, I found that a short video script only needs about 100 words. Post one, check the data, adjust, post the next. You can run several iterations a day.

Then I discovered a fascinating fact:

Programmers consume 1,000x more tokens than short video scripts, but their willingness to pay might only be 1/10th.

PS: I'm just stating a business fact here, with no other implications; this is simply how the economics work.

Why Is This the Case?

After some observation and reflection, here is what I found:

Programmers writing code: consume 100,000 tokens, then debugging consumes more, integration testing consumes more still, and after launch you wait for user feedback. Got the architecture wrong? Start over.

Short video scripts: consume 200 tokens, post immediately after writing, check the views two hours later. Didn't work? Post the next one. Went viral? Cash out immediately.

One closed loop takes at least 1 month; the other takes less than half a day.

Compare the extremes in the table below, a 60-day refactoring loop versus a 0.1-day video loop, and the gap in cycle time alone is 600x.

Factor in the difference in value density, and the T2V (Token-to-Value) gap could be 160,000x.

T2V Velocity: A New Metric

I needed a concept to describe this phenomenon. I searched through economics and SaaS literature and found several related concepts:

  • Time-to-Value: In SaaS, how quickly customers start to perceive value
  • Capital Velocity: In finance, the speed of capital turnover
  • Throughput: In TOC (Theory of Constraints), output per unit time

All related, but none precisely describes this phenomenon in the token era.

So we coined our own:

Token-to-Value (T2V) Velocity = Business Value / (Token Cost × Days to Close Loop)

It measures how quickly each token you invest converts into verifiable business value.
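To make the formula concrete, here is a minimal Python sketch. The unit choices are my own assumptions for illustration (value in dollars, token cost in thousands of tokens, loop time in days); different units rescale the absolute T2V numbers, but the gap between scenarios stays in the same order of magnitude as the 160,000x figure below.

```python
# Minimal sketch of the T2V Velocity formula as defined above.
# Assumed units (illustrative only): value in dollars, token cost in
# thousands of tokens, loop time in days. Only relative comparisons matter.

def t2v_velocity(business_value: float, token_cost: float, days_to_close_loop: float) -> float:
    """Business value generated per unit of token cost per day of loop time."""
    return business_value / (token_cost * days_to_close_loop)

# Hypothetical comparison: a viral short video vs. a code refactoring project.
video = t2v_velocity(business_value=1_400, token_cost=0.2, days_to_close_loop=0.1)    # 200 tokens = 0.2K
refactor = t2v_velocity(business_value=2_800, token_cost=100, days_to_close_loop=60)  # 100K tokens

print(f"Viral video T2V:   {video:,.0f}")    # 70,000
print(f"Code refactor T2V: {refactor:.2f}")  # 0.47
print(f"Gap: {video / refactor:,.0f}x")      # ~150,000x
```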

One Table Explains It All

| Scenario | Token Consumption | Loop Cycle | Business Value | T2V |
|---|---|---|---|---|
| Viral Short Video | 200 | 0.1 days | $1,400 | 500,000 |
| Daily Copywriting | 500 | 0.5 days | $70 | 2,000 |
| Feature Development | 50,000 | 30 days | $7,000 | 33 |
| Code Refactoring | 100,000 | 60 days | $2,800 | 3 |

The T2V of a viral short video is 160,000x that of code refactoring.

This isn't a decimal point difference. This is a business model difference.

The figures above are rough estimates, intended only for order-of-magnitude comparison.

Of course, this doesn't mean programming token consumption has no value. Rather, the programming industry inherently has longer conversion cycles and is scale-dependent. As a token seller, targeting such industries inevitably leads to long closed-loop cycles. However, once the loop closes, it becomes a massive-scale business with deep moats. This direction tends to suit large companies rather than startups.

This is precisely why Deepractice pivoted from early AI programming products to breaking through into other industries.

What Does This Mean for the AI Industry?

1. Your "Big Customers" May Only Have Scale, Not Profit

Traditional thinking: High token usage = High revenue = Good customer

T2V thinking: High token usage + Long loop cycle = High cost + Weak value perception = Could churn anytime, or get squeezed by costs

The real golden customers are high-T2V customers—they use fewer tokens, but every token quickly converts to cash, and their willingness to pay is actually the strongest.

2. Customers Don't Pay for Tokens; They Pay for Closed Loops

Why are short video customers more willing to pay?

Because they can calculate their ROI clearly.

100 tokens → One script → Publish → Check data in 2 hours → Make money or learn something

This loop is fast enough, clear enough—customers know what they're paying for.

In programming scenarios, the chain from tokens to value is too long and too fuzzy. Customers themselves aren't sure whether the money is well spent.

This relates to the concept of outcome delivery I discussed earlier this year.

3. High T2V Scenario Checklist

If building AI products, I'd prioritize capturing these scenarios:

  • Short video/livestream scripts
  • E-commerce product descriptions
  • Social media operations
  • Marketing ad creatives
  • Customer service scripts
  • Custom proposals

Common characteristics: Few tokens, fast loops, quantifiable value.

Fortunately, these industries are far less competitive than AI programming.

One-Sentence Summary

Customers don't pay for tokens; they pay for closed loops.

T2V Velocity measures the efficiency from investment to value verification—this is the core metric that determines customer value.

Token consumption is just cost; loop efficiency is value.

The entire AI industry is competing on token prices, but what they should really compete on is: how to help customers complete their value loops faster.

This is an AI-native economics concept. Open for discussion.