ChatGPT’s Evolution: From GPT-3.5 to GPT-4 Turbo

The publicly available version of ChatGPT initially relied on GPT-3.5. A few months later, OpenAI launched GPT-4, a significantly more advanced iteration of the model, which has since been further upgraded to GPT-4 Turbo.

When ChatGPT transitioned to GPT-4, it became accessible exclusively through the ChatGPT Plus subscription. Nonetheless, users still had the option to continue using the free version of ChatGPT, which remained based on GPT-3.5.


Following an agreement between OpenAI and Microsoft, ChatGPT was made available through Bing at no cost. Initially, it was the GPT-3.5 version that was integrated, and later, it was updated to include GPT-4. However, with the introduction of GPT-4 Turbo, users have been curious about when Bing Chat (now known as Copilot) will be adapted to support this latest iteration of OpenAI’s language model.

In contrast to previous updates, the integration of GPT-4 Turbo into Bing Chat is still pending. Mikhail Parakhin, the head of the web and Windows experiences team, revealed that due to the abrupt departure of Panos Panay a few weeks ago, the company has opted to consolidate various teams.

This is also partly because GPT-4 Turbo can only be utilized at the API level, making it accessible solely to developers. Until Microsoft completes its restructuring process, prompted by Panay’s move to Amazon, there is no fixed timeline for the implementation of GPT-4 Turbo within Bing Chat.

Therefore, if you wish to use the new GPT-4 Turbo, you must register as a developer on the OpenAI website, fund your account with a minimum of $1, and use a client that supports the API to submit queries to this advanced model.
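As a rough sketch of what such an API query looks like, the snippet below builds the headers and JSON body for OpenAI's chat completions endpoint. The model identifier `"gpt-4-turbo"` and the exact payload shape are assumptions for illustration; check OpenAI's current API reference before relying on them.

```python
import json

# OpenAI's chat completions endpoint (see the official API reference).
API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(api_key: str, prompt: str) -> tuple[dict, str]:
    """Return the headers and JSON body for a single-turn query."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": "gpt-4-turbo",  # assumed model identifier for GPT-4 Turbo
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body

# The request would then be sent with any HTTP client, e.g.:
#   requests.post(API_URL, headers=headers, data=body)
```

Any HTTP client works here, which is why third-party chat front-ends can offer GPT-4 Turbo long before it reaches consumer products like Bing Chat.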

GPT-4 vs GPT-4 Turbo:

With the introduction of GPT-4, OpenAI marked a significant advancement in ChatGPT’s capabilities. In GPT-4, the model’s training data was substantially expanded, and its ability to generate responses to longer queries was enhanced.

The first update that GPT-4 received was GPT-4 32K, a modest improvement over the original GPT-4. However, with the launch of GPT-4 Turbo, often referred to as “128K,” the model’s context window was expanded much further. This enhancement allows ChatGPT to analyze much longer inputs and provide longer and more precise responses.

Moreover, GPT-4 Turbo’s training data is up-to-date as of April 2023, whereas GPT-3.5, the model behind the free version of ChatGPT, relies on data only up to September 2021.

Essentially, this new version empowers users to input and consider longer texts when crafting prompts or questions for which they seek answers. Consequently, ChatGPT can now take into account more intricate details and data, thereby generating more accurate and comprehensive responses.

ChatGPT operates on tokens, and while there is no one-to-one equivalence between tokens and words, the two are closely related. GPT-3.5 could handle up to 4,096 tokens per request, equivalent to roughly 3,000 words shared between the prompt and the response.

The GPT-4 32K variant could process up to 32,000 tokens. With the release of GPT-4 Turbo, the model can analyze and consider up to 128,000 tokens when generating a response. This means that users can formulate much more complex questions with a wealth of data and details, resulting in significantly more complete and precise answers.
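To put those context windows in perspective, a common rule of thumb is that one token corresponds to about 0.75 English words. The ratio is an approximation, not an exact conversion, but it lets us translate each limit into a rough word count:

```python
# Rough word-capacity estimates for each context window.
# 0.75 words per token is a widely cited rule of thumb; actual
# token counts depend on the tokenizer and the text itself.
WORDS_PER_TOKEN = 0.75

context_windows = {
    "GPT-3.5": 4_096,
    "GPT-4 32K": 32_000,
    "GPT-4 Turbo": 128_000,
}

for model, tokens in context_windows.items():
    approx_words = int(tokens * WORDS_PER_TOKEN)
    print(f"{model}: {tokens:,} tokens ~ {approx_words:,} words")
```

Under this approximation, GPT-3.5’s 4,096 tokens come out to about 3,000 words, consistent with the figure above, while GPT-4 Turbo’s 128,000 tokens correspond to roughly 96,000 words, on the order of an entire book.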