Can you create a service like ChatGPT or Dall-E at home?

For some time now, graphics cards have been gaining capabilities that go well beyond rendering realistic graphics on our screens: they now also power the rise of artificial intelligence. This has enabled Internet services for automatic image generation, such as DALL-E, and conversational services such as ChatGPT. But what does the hardware behind an AI service actually cost?

Although artificial intelligence still has a long way to go, services have appeared in which, given a few key words or phrases, the system can generate a story in text or, failing that, an image of varying quality. And despite how often this technology fails, many people are fascinated by it without being aware of the level of complexity behind it, which is impossible to replicate even on the most powerful computer you could assemble with the most expensive components available right now.


How much does ChatGPT or Dall-E hardware cost?

Well, many thousands of euros, if not tens of thousands, because the amount of data these services handle and the computing power they need call for configurations of tens or even hundreds of graphics cards. These are needed not only to generate responses from the inference algorithm, which is what the user interacts with, but also to train the AI, that is, to learn the weights and draw its own conclusions.

NVIDIA DGX AI server

The hardware used in many of these cases is the NVIDIA DGX SuperPOD, a server built by NVIDIA out of hundreds of graphics cards, not gaming cards but the kind used for high-performance computing. Consider that a single NVIDIA H100 can cost some 5,000 euros, and there are models that reach five figures. This is far more than an ordinary user will spend on an entire computer, even one with a latest-generation i9 and an RTX 4090 today.

And mind you, it does not end there. The volume of data is such that it does not fit on a single graphics card, so several of them must be used. For example, ChatGPT requires servers of 8 graphics cards of this type, a cost of at least 40,000 euros per server. And if we talk about Dall-E, which handles images and is more complex, the cost can climb several times higher. So we are still a long way from having something like this at home, and we may have to wait a whole decade before a domestic PC offers this kind of capacity.
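A back-of-envelope calculation shows why a single card is not enough. The figures below are illustrative assumptions, not confirmed specifications: a GPT-3-class model of 175 billion parameters stored as 16-bit floats, running on cards with 80 GB of memory each (roughly the H100 class mentioned above):

```python
# Rough estimate of GPU memory needed just to hold a large model's weights.
# All numbers are illustrative assumptions, not official specifications.

PARAMS = 175e9          # assumed parameter count (GPT-3 scale)
BYTES_PER_PARAM = 2     # 16-bit floating-point weights
GPU_MEMORY_GB = 80      # assumed memory per card (H100/A100 class)

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
gpus_needed = -(-weights_gb // GPU_MEMORY_GB)  # ceiling division

print(f"Weights alone: {weights_gb:.0f} GB")
print(f"Minimum GPUs just to hold the weights: {gpus_needed:.0f}")
```

Under these assumptions the weights alone take 350 GB, which already needs five 80 GB cards before counting the extra memory that inference itself consumes, which is why such models are served from multi-GPU machines rather than a single card.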

Memory is the biggest bottleneck to achieving it

All this is down to the amount of information the artificial intelligence algorithm requires to draw its conclusions, just as the human brain draws conclusions from the information and knowledge it has. The model needs to store data gathered from across the Internet as the basis for its work, a dataset so huge that it forces the use of extremely expensive infrastructure.

Astronaut on a horse, Dall-E GPT

In addition, these systems are not entirely reliable: you only have to look at the nonsense ChatGPT produces in response to certain questions, or the senseless, nightmare-worthy drawings Dall-E sometimes shows us, with no way of knowing how it arrived at such a result. It must be admitted that some are curious and even worth framing, but many years will pass before these systems stop having such a high margin of error between what they are asked and what they show.