A chunk of text the AI processes, roughly ¾ of a word. The unit that rate limits and pricing are measured in.
A token is the smallest unit of text an AI model handles. It's not exactly a word; it's roughly ¾ of one. The word "thanks" is one token. The word "preposterously" is two or three, depending on the model. Models break your text into tokens before processing, then generate their reply one token at a time.
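To see this concretely, OpenAI's open-source tiktoken library (pip install tiktoken) exposes the same encodings its models use. A minimal sketch; the exact counts depend on which encoding you pick:

```python
import tiktoken

# cl100k_base is the encoding used by GPT-4-era models; newer models
# use different encodings, so counts will vary.
enc = tiktoken.get_encoding("cl100k_base")

for word in ["thanks", "preposterously"]:
    token_ids = enc.encode(word)
    print(f"{word!r} -> {len(token_ids)} token(s): {token_ids}")
```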
Tokens matter for three reasons:
1. Pricing: API usage is billed per token, with separate rates for input and output.
2. Rate limits: providers cap how many tokens you can send and receive per minute.
3. Context windows: a model can only hold a fixed number of tokens in view at once.
For everyday chat use (not the API), this doesn't matter: you pay a flat subscription, typically $20/month for a Pro plan. Tokens only matter if you build with the API.
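If you do build with the API, the billing arithmetic is simple: input and output tokens are counted separately and billed per token. A sketch with placeholder prices; the rates below are assumptions, not real prices, so check your provider's pricing page:

```python
# Placeholder rates for illustration only; real per-token prices vary
# by model and provider.
PRICE_PER_M_INPUT = 3.00    # assumed USD per 1M input tokens
PRICE_PER_M_OUTPUT = 15.00  # assumed USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one API call, with input and output billed separately."""
    return (input_tokens * PRICE_PER_M_INPUT
            + output_tokens * PRICE_PER_M_OUTPUT) / 1_000_000

# A 1,500-word prompt is roughly 2,000 tokens (1,500 / 0.75).
print(f"${request_cost(2_000, 500):.4f}")  # $0.0135 at the assumed rates
```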
Estimate: divide your word count by 0.75. For exact counts, OpenAI offers an interactive tokenizer at platform.openai.com/tokenizer; Anthropic's tokenization is similar but varies slightly by model.
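That rule of thumb as a one-liner; it's only an approximation, and a real tokenizer is the way to get exact counts:

```python
def estimate_tokens(text: str) -> int:
    # One token is roughly 3/4 of a word, so tokens ~= words / 0.75.
    return round(len(text.split()) / 0.75)

print(estimate_tokens("Models break your text into tokens before processing."))
# -> 11 (8 words / 0.75); a real tokenizer may count slightly differently
```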