Anthropic Expands Claude's Memory to 1 Million Tokens, Boosting Coding Capabilities
Summary
Anthropic boosts Claude AI's coding abilities by expanding its context window to 1 million tokens, enabling the model to handle prompts of up to roughly 750,000 words or 75,000 lines of code; prompts exceeding 200,000 tokens are billed at higher rates.
Key Points
- Anthropic increases the Claude AI model's context window to 1 million tokens, allowing it to handle prompts of up to roughly 750,000 words or 75,000 lines of code
- The longer context window helps Claude perform better on extended coding tasks by letting it keep earlier steps of a project in view
- Anthropic charges higher rates for prompts over 200,000 tokens, at $6 per million input tokens and $22.50 per million output tokens (see the cost sketch after this list)
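
To make the new pricing tier concrete, here is a minimal back-of-the-envelope sketch in Python. The constant and function names are hypothetical, and it is not Anthropic's billing logic; it simply applies the $6 and $22.50 per-million-token rates quoted above to a request whose prompt exceeds the 200,000-token threshold. Standard-tier rates are not given in the article, so the sketch covers only the long-context case.

```python
# Illustrative cost estimate for a Claude request whose prompt exceeds the
# 200,000-token threshold, using the long-context rates quoted in the article.
# Assumption: both input and output tokens for such a request are billed at
# the higher rates; Anthropic's actual billing rules may differ.

LONG_CONTEXT_THRESHOLD = 200_000       # tokens
LONG_INPUT_RATE = 6.00 / 1_000_000     # USD per input token above the threshold
LONG_OUTPUT_RATE = 22.50 / 1_000_000   # USD per output token above the threshold


def estimate_long_context_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request billed at the long-context rates."""
    if input_tokens <= LONG_CONTEXT_THRESHOLD:
        raise ValueError(
            "Prompt is within the standard window; standard-tier rates "
            "are not given in this article."
        )
    return input_tokens * LONG_INPUT_RATE + output_tokens * LONG_OUTPUT_RATE


if __name__ == "__main__":
    # Example: a 1,000,000-token prompt (the new maximum) with a 20,000-token reply.
    # 1M * $6/M input + 20K * $22.50/M output ≈ $6.45
    cost = estimate_long_context_cost(1_000_000, 20_000)
    print(f"Estimated cost: ${cost:.2f}")
```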