Tokenmaxxing: Pushing AI Agents to Token Limits

Source: nytimes.com

AI agents are getting smarter and more autonomous, but they guzzle tokens, the basic text chunks LLMs process, driving up costs and quickly hitting context limits. The piece dives into "tokenmaxxing," the push to optimize every token in multi-agent setups where bots chat, decide, and act together. It matters because as agents scale up to real work such as trading or analysis, unchecked token bloat could stall adoption.
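The article does not include code, but the core idea of token budgeting can be sketched simply: before each turn, trim the shared history so it fits a fixed token budget. A minimal illustration, where `estimate_tokens` and `trim_to_budget` are hypothetical names and the whitespace-based count stands in for a real model tokenizer:

```python
def estimate_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer: count whitespace-separated words."""
    return len(text.split())

def trim_to_budget(messages: list[str], budget: int) -> list[str]:
    """Keep the most recent messages whose combined estimate fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):  # newest messages are usually most relevant
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

# Hypothetical multi-agent exchange, trimmed to a 9-token budget.
history = [
    "agent-a: fetched the price feed",
    "agent-b: computed the moving average",
    "agent-a: proposing a trade",
]
print(trim_to_budget(history, budget=9))
```

Real systems would use the model's own tokenizer and smarter strategies (summarizing old turns rather than dropping them), but the budget-enforcement loop is the same shape.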