Artificial Intelligence


  • 2,575 Responses
  • ********
    -6

    "Claude Sonnet 4 now supports up to 1 million tokens of context on the Anthropic API—a 5x increase that lets you process entire codebases with over 75,000 lines of code or dozens of research papers in a single request.

    Long context support for Sonnet 4 is now in public beta on the Anthropic API and in Amazon Bedrock, with Google Cloud’s Vertex AI coming soon."

    https://www.anthropic.com/news/1…
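    A quick back-of-envelope check of the quoted figures. The tokens-per-line ratio below is derived purely from the announcement's own numbers (1 million tokens vs. "over 75,000 lines of code"), not from any official tokenizer statistic:

    ```python
    # Sanity check: what average tokens-per-line does the claim imply?
    # Both constants come straight from the quoted announcement.
    CONTEXT_TOKENS = 1_000_000   # new Sonnet 4 context window
    CLAIMED_LINES = 75_000       # "over 75,000 lines of code"

    tokens_per_line = CONTEXT_TOKENS / CLAIMED_LINES
    print(f"~{tokens_per_line:.1f} tokens per line of code")  # → ~13.3
    ```

    Roughly 13 tokens per line is a plausible average for real source code, so the "entire codebase" framing is at least internally consistent.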

    • This is more useful than the GPT-5 launch :)
    • Got about 30 minutes of work done today on the Claude paid plan before I hit my daily limit and the message "Subscribe to Max". FFS.
    • But don't models break down anyway past 300k-500k tokens?

      1M context windows are pure marketing and can't actually be used
