Over the past year, since the launch of everyone’s favorite AI chatbot ChatGPT, there have been many changes and updates. This is the fastest-moving area of tech, after all!
One of these changes has been to the word count limit: at launch, prompts were capped at roughly 500 words and generated text at about 3,000 words. Many believed this was because of the AI’s propensity to ‘hallucinate’, which has improved with further iterations.
With the release of GPT-4 Turbo, has this word count limit changed? And what’s the difference between tokens and words, anyway? In this article, we’ll explain ChatGPT’s token and word limits.
ChatGPT Word Limit Explained
With the release of GPT-4 Turbo, rumors have been circulating that there is no longer a word limit when inputting prompts or receiving generated text. However, as with all new tech, a lot of these rumors are just wishful thinking.
Technically, there is still no definitive answer to whether GPT-4 Turbo has a word limit. However, it is clear that GPT-4 Turbo (the model available to those who subscribe to ChatGPT Plus) has a much larger context window than its predecessor, GPT-4.
GPT-4 Turbo offers a 128,000-token context window, four times the size of GPT-4’s largest (32K) model. GPT-3.5, in comparison, has a much smaller 4,096-token context window.
GPT-4 Turbo is able to consider around 96,000 words in one go, which is longer than many novels. It is worth noting, however, that the character limit for GPT-4 has been reduced over time, with some Reddit users reporting character limits as low as 2048 characters.
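Where do figures like “96,000 words” come from? OpenAI’s rough rule of thumb is that 100 tokens correspond to about 75 English words, so the conversion is simple arithmetic. Here’s a minimal sketch (the function names are our own, and the ratio is an approximation, not an exact conversion):

```python
def tokens_to_words(tokens: int) -> int:
    """Rough estimate using OpenAI's rule of thumb: ~100 tokens per 75 words."""
    return tokens * 75 // 100

# GPT-4 Turbo's 128K context window:
print(tokens_to_words(128_000))  # 96000 -- roughly 96,000 words
# GPT-3.5's 4,096-token window:
print(tokens_to_words(4_096))    # 3072 -- roughly 3,000 words
```

Running the same estimate on GPT-3.5’s window is where the “roughly 3,000 words” figure mentioned later in this article comes from.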
The word limit set by OpenAI, the company behind ChatGPT, is part of its ongoing development of the AI chatbot. So, if your request is deemed too complex for the AI to handle, ChatGPT might simply stop producing content.
However, it’s still possible to work around this limitation by writing prompts with the word limit in mind, or by feeding longer pieces of text in separate prompts. What GPT-4 Turbo does offer is the ability to browse the web, write and run code, analyze data, and more.
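The “separate prompts” approach is easy to script. Here’s a minimal sketch (our own helper, not an official tool) that splits a long document into word-bounded chunks, each small enough to paste as its own prompt:

```python
def chunk_text(text: str, max_words: int = 500) -> list[str]:
    """Split a long document into chunks of at most max_words words,
    so each chunk fits comfortably inside the model's limits."""
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]

# Then feed each chunk as its own prompt, e.g. asking the model to
# reply "OK" after each part and only respond fully after the last one.
```

A common pattern is to prefix each chunk with “Part N of M, just acknowledge until the last part”, so the model waits for the whole document before answering.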
And it’s true that this new version has a larger context window of 128K tokens, the equivalent of over 300 pages of text in a single prompt. This is much larger than the default version of ChatGPT, GPT-3.5, which allows only 4,096 tokens, roughly 3,000 words.
This means that GPT-4 Turbo is able to comprehend prompts with questions that are more comprehensive, and does a better job of following instructions, given its capacity to take in around 96,000 words in one go, which is longer than many novels.
It also means it can give more concise and thoughtful responses, which should lead to fewer hallucinations. By comparison, GPT-4 features an 8,000-token context window, with a 32K model available through an API for some developers.
That said, many users (us included) have noted that ChatGPT will forget its instructions as the chat window gets longer and longer. Users often have to remind it what to do!
So, while there is no definitive answer as to whether GPT-4 Turbo has a word limit, it’s worth noting that some users have reported character limits as low as 2048 characters. It is also important to consider that GPT-4 Turbo comes in two versions — one that’s strictly for analyzing text, and another version that understands the context of both text and images.
Two Versions of GPT-4 Turbo?
Yes, you read that right: there are two versions of GPT-4 Turbo available to users. The first is the text-analyzing model, which is available now via an API. OpenAI plans to make it generally available in the coming weeks.
The second version of GPT-4 Turbo can understand the context of both text and images. This version, known as GPT-4 with Vision, or GPT-4V, enables the chatbot to accept images as input and analyze them, handle text-to-speech prompts, and integrate with DALL-E 3.
GPT-4 with Vision also has the enlarged 128K context window, which means that it can take prompts equivalent to around 100 pages of text.
It’s also worth keeping in mind that GPT-4 Turbo is more capable and has knowledge of world events up to April 2023, whereas GPT-4 is current only through September 2021. In addition, GPT-4 Turbo is optimized for performance and is offered at a lower price per input and output token compared to GPT-4.
However, as with all new AI software, some users have already reported performance issues with GPT-4 Turbo, in terms of following instructions and generating output, which is to be expected while OpenAI sorts out all the teething problems.
Remember, the inputs and outputs of ChatGPT are measured in tokens, not characters or words. A token is roughly a word or a piece of a word: a short, common word like “of” is a single token, while a long word like “antidisestablishmentarianism” gets split into several tokens.
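For exact counts you’d use OpenAI’s tiktoken library, but a handy back-of-the-envelope rule is that one token is about four characters of English text. Here’s a rough estimator built on that assumption (the function is our own illustration, not an official API):

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate using the ~4-characters-per-token rule
    of thumb for English. For exact counts, use OpenAI's tiktoken
    library instead (e.g. tiktoken.get_encoding("cl100k_base"))."""
    return max(1, len(text) // 4)

print(estimate_tokens("of"))  # 1 -- a short word is a single token
print(estimate_tokens("antidisestablishmentarianism"))  # 7 -- several tokens
```

The point is simply that long words cost more tokens than short ones, so token budgets and word counts never line up exactly.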
When we started doing our own research, we first decided to ask the AI chatbot itself, to see if it would answer without hallucinating (that is, confidently stating things that aren’t true). We asked GPT-4 Turbo about word and token limits.
Do you currently have a word limit? And how does word limit relate to token limits?
Here’s what it said:
As of my current design, I don’t have a strict word limit for the text I generate, but there is a token limit. A token can be a word, a part of a word, or even punctuation. In general, the maximum token limit for my responses is around 4096 tokens. This limit includes the tokens in the user’s prompt and all of my responses in a session, meaning both the question and the answer, as well as any previous interaction in the same session.
To give you a rough idea, 4096 tokens are typically equivalent to about 750-1000 words, depending on the complexity of the language used. This limit is sufficient for most detailed responses, but in cases of very long conversations or extremely detailed inquiries, it might be reached, at which point I won’t be able to provide additional information in the same session.
Amusingly, this is incorrect.
I then reminded it that it is GPT-4 Turbo, not GPT-3.5:
Is that the case for gpt 3.5 or gpt-4 turbo? I thought gpt 4 turbo had a 128k token limit?
This is how it responded:
You’re correct in noting the differences between GPT-3.5 and GPT-4 Turbo in terms of token limits. GPT-3.5, like earlier versions, typically has a token limit of around 4096 tokens per session. However, GPT-4 Turbo, which I am based on, has a significantly increased token limit, allowing up to 128,000 tokens in a session.
This increase in token limit with GPT-4 Turbo offers a substantial enhancement in handling longer conversations and more complex queries. It means that in a single session, I can handle much more extensive interactions, provide longer and more detailed responses, and maintain context over a broader range of discussion topics without reaching the token limit as quickly as earlier models.
Bottom line: ChatGPT still hallucinates, even GPT-4! Always fact check it, or have it fact check itself. Regardless, if you’re using GPT-3.5 to any significant extent, you’ll still run into the token limit. This will be much less frequent with GPT-4, but can still happen.
Working Around The Limit
There are a few additional steps that you can take to work around ChatGPT’s word limits, whether that’s for your own prompts or for the text that you want it to generate.
Subscribe to ChatGPT Plus
For example, the most obvious, and perhaps easiest, option is to upgrade your ChatGPT account, which gives you access to functionality that isn’t available for free. It really is much more powerful, much better at coding, and worth trying out for a bit.
Paying for ChatGPT’s $20 per month Plus subscription also means you’ll be less likely to have to deal with the recurring outages that happen when the server overloads during peak usage hours.
Keep Your Prompts Concise
Another way to work around the limit is to make your prompts logical and concise, so that ChatGPT understands exactly what you want. And if you’re not happy with the response that you get, then you can ask it to regenerate the output as much as you like until you are satisfied.
Ask ChatGPT to Summarize Your Conversation
A great way to consolidate your conversation is to ask ChatGPT to summarize it up to that point, in a way that gives a new conversation all the details it needs to continue. Think of it as ChatGPT making a Zip file of your conversation, which it can then “unzip” in an entirely new conversation and carry on.
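If you’re doing this via the API rather than the chat window, the same “zip and unzip” trick amounts to seeding a fresh message list with the summary. Here’s a minimal sketch using the role/content message format from OpenAI’s Chat Completions API (the prompt wording and helper name are our own):

```python
# The prompt you'd send at the end of the old, overlong conversation:
SUMMARIZE_PROMPT = (
    "Summarize our conversation so far, preserving every detail a new "
    "conversation would need in order to continue where we left off."
)

def seed_new_conversation(summary: str, next_question: str) -> list[dict]:
    """Build the message list for a fresh chat, 'unzipping' the old
    conversation's summary as context for the model."""
    return [
        {"role": "system",
         "content": f"Context from a previous conversation: {summary}"},
        {"role": "user", "content": next_question},
    ]

messages = seed_new_conversation(
    summary="We were outlining a fantasy novel; chapter 3 is next.",
    next_question="Great, let's outline chapter 3.",
)
```

You’d then pass `messages` to a new chat session, and the model picks up with the summary as its starting context instead of the full (token-hungry) history.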
Overall, GPT-4 Turbo offers several benefits over the default version of the AI chatbot, including more power and up-to-date information, a larger context window, cheaper costs and improved functionality, as well as customization options — although not necessarily an unlimited word count.
Therefore, if you’ve been a heavy user over the past year, you may want to consider subscribing to GPT-4 Turbo. It’s definitely worth it, especially if this ChatGPT word limit update has tickled your fancy and you want to see if it works for you.
Just remember: it still hallucinates sometimes!