{"id":31787,"date":"2023-05-11T21:35:46","date_gmt":"2023-05-11T21:35:46","guid":{"rendered":"https:\/\/scienceandnerds.com\/2023\/05\/11\/anthropics-latest-model-can-take-the-great-gatsby-as-input\/"},"modified":"2023-05-11T21:35:47","modified_gmt":"2023-05-11T21:35:47","slug":"anthropics-latest-model-can-take-the-great-gatsby-as-input","status":"publish","type":"post","link":"https:\/\/scienceandnerds.com\/2023\/05\/11\/anthropics-latest-model-can-take-the-great-gatsby-as-input\/","title":{"rendered":"Anthropic\u2019s latest model can take \u2018The Great Gatsby\u2019 as input"},"content":{"rendered":"

Source: https://techcrunch.com/2023/05/11/anthropics-latest-model-can-take-the-great-gatsby-as-input/
Anthropic's latest model can take 'The Great Gatsby' as input
2023-05-11 21:35:46


Historically and even today, poor memory has been an impediment to the usefulness of text-generating AI. As a recent piece in The Atlantic aptly puts it, even sophisticated generative text AI like ChatGPT has the memory of a goldfish. Each time the model generates a response, it takes into account only a very limited amount of text, preventing it from, say, summarizing a book or reviewing a major coding project.

But Anthropic's trying to change that.

Today, the AI research startup announced that it's expanded the context window for Claude, its flagship text-generating AI model (still in preview), from 9,000 tokens to 100,000 tokens. The context window refers to the text the model considers before generating additional text, while tokens represent units of raw text (e.g., the word "fantastic" would be split into the tokens "fan," "tas" and "tic").
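To make "tokens" concrete, here is a minimal sketch using OpenAI's open-source tiktoken tokenizer. This is an illustrative assumption: Claude uses Anthropic's own tokenizer, so the exact splits will differ from the "fan"/"tas"/"tic" example above.

```python
# Illustration only: tiktoken is OpenAI's tokenizer, not Anthropic's,
# so the token boundaries it produces are not Claude's.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

token_ids = enc.encode("fantastic")
pieces = [enc.decode_single_token_bytes(t).decode("utf-8") for t in token_ids]

print(token_ids)  # integer IDs the model actually sees
print(pieces)     # the subword strings those IDs stand for
```

A 100,000-token window is measured in these subword units, not in words, which is roughly what lets an entire novel fit into a single prompt.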

So what's the significance, exactly? Well, as alluded to earlier, models with small context windows tend to "forget" the content of even very recent conversations, leading them to veer off topic. After a few thousand words or so, they also forget their initial instructions, instead extrapolating their behavior from the last information within their context window rather than from the original request.
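This "forgetting" is mechanical rather than mysterious: with a fixed-size window, the oldest text is simply truncated to make room for the newest. Below is a schematic sketch of that behavior, with hypothetical helper names, not any vendor's actual implementation:

```python
def fit_to_window(messages, max_tokens, count_tokens):
    """Keep only the most recent messages that fit in the window.

    Hypothetical illustration: once a conversation outgrows max_tokens,
    the earliest messages, typically the original instructions, are
    silently dropped.
    """
    kept, used = [], 0
    for msg in reversed(messages):   # walk from newest to oldest
        cost = count_tokens(msg)
        if used + cost > max_tokens:
            break                    # everything older is lost
        kept.append(msg)
        used += cost
    return list(reversed(kept))      # restore chronological order

# Toy usage: count "tokens" as whitespace-separated words.
history = ["SYSTEM: Always answer in French."] + [f"turn {i}" for i in range(50)]
window = fit_to_window(history, 40, lambda m: len(m.split()))
print(window[0])  # no longer the system instruction
```

A 100,000-token window does not eliminate this truncation; it just pushes the point where it happens out by an order of magnitude.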

Given the benefits of large context windows, it's not surprising that figuring out ways to expand them has become a major focus of AI labs like OpenAI, which devoted an entire team to the issue. OpenAI's GPT-4 held the previous crown in terms of context window sizes, weighing in at 32,000 tokens on the high end, but the improved Claude API blows past that.

With a bigger "memory," Claude should be able to converse relatively coherently for hours, even several days, as opposed to minutes. And perhaps more importantly, it should be less likely to go off the rails.

In a blog post, Anthropic touts the other benefits of Claude's increased context window, including the ability for the model to digest and analyze hundreds of pages of materials. Beyond reading long texts, the upgraded Claude can help retrieve information from multiple documents or even a book, Anthropic says, answering questions that require "synthesis of knowledge" across many parts of the text.
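As a rough sketch of what that looks like in practice, the call below sends an entire document as a single prompt using Anthropic's Python SDK. The model name, file path and question are placeholders, and at the time of this article the 100,000-token Claude was still in limited preview:

```python
# Sketch under assumptions: ANTHROPIC_API_KEY is set in the environment,
# and the placeholder model accepts a prompt of this size.
import anthropic

client = anthropic.Anthropic()

# Hypothetical input file: any text that fits in the context window.
with open("great_gatsby.txt", encoding="utf-8") as f:
    book = f.read()

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": f"{book}\n\nIn two paragraphs, summarize the novel above.",
    }],
)
print(response.content[0].text)
```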

Anthropic lists a few possible use cases: