
Tokens

The basic units AI models use to process text — roughly 1 token per 0.75 words in English.

The Full Picture

Tokens are how LLMs break text down. A token might be a whole word, part of a word, or a punctuation mark: "Hello world" is 2 tokens, but "authentication" might be split into 3.

Understanding tokens matters for vibe coding because: (1) you're billed per token for API calls, (2) your context window is measured in tokens, and (3) code tends to use more tokens per concept than prose because of syntax, variable names, and formatting.
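The rule of thumb above can be sketched as a quick estimator. This is only the ~0.75-words-per-token heuristic from this entry, not a real tokenizer, and the price argument is a placeholder you would fill in from your provider's pricing page; for exact counts, use the tokenizer that matches your model (e.g. OpenAI's tiktoken library).

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~0.75-words-per-token rule of thumb.

    Real tokenizers split on subwords, so actual counts differ,
    especially for code, which is denser in tokens than prose.
    """
    words = len(text.split())
    return round(words / 0.75)


def estimate_cost(text: str, price_per_million_tokens: float) -> float:
    """Ballpark API cost in dollars. The price is a hypothetical parameter,
    not a real rate -- check your provider's current pricing."""
    return estimate_tokens(text) / 1_000_000 * price_per_million_tokens


prompt = "Refactor the authentication module to use async handlers."
print(estimate_tokens(prompt))  # 8 words -> ~11 tokens
```

Treat the output as a budgeting ballpark only; a tokenizer-accurate count is the thing your context window and bill are actually measured against.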

