How to train my own ChatGPT

They paint it for us as if we were just a couple of clicks away from creating our own artificial intelligence. All it takes is opening YouTube or reading a technology blog to be bombarded with ads about how to train your own AI in minutes: ChatGPT here, Gemini there, or Claude from Anthropic, each supposedly answering questions better than any competitor's assistant. It seems as if AI is within everyone's reach, but how true is that perception?

Behind these artificial intelligences lies a reality that few know. Creating an AI model like ChatGPT or LLaMA from Meta is not something that just anyone can do on their desktop computer or even on an average server. The story begins with a question: where do they get so much information from?

Data Hunger: Stealing Knowledge from the Web

Training a language AI like ChatGPT is not just about getting an encyclopedia or Wikipedia content. While Wikipedia has become a reference point when talking about general knowledge, it actually represents less than 1% of the data needed to train these gigantic neural networks. So, where do they get the rest from?

The answer is quite blunt: AI feeds on the entire web. The creators of these artificial intelligences build bots or "scrapers" that scour the internet like tireless spiders, crawling, copying, and storing all the text they find in their path: news articles, blog posts, social media comments, product reviews, even the lyrics to your favorite songs. And they don't limit themselves to text. YouTube videos are converted to text through speech recognition algorithms, transforming hours of visual content into line after line of processable information.
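To make that idea concrete, here is a minimal sketch of what a text-collecting scraper might look like, assuming a couple of hypothetical seed URLs and an output file name; the real crawlers behind these datasets run distributed across millions of pages and handle deduplication, filtering, and politeness rules at a completely different scale.

```python
# Minimal sketch of a text-collecting scraper (illustrative only).
# The seed URLs and output file are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

SEED_URLS = [
    "https://example.com/article-1",
    "https://example.com/article-2",
]

def extract_text(url: str) -> str:
    """Download a page and keep only its visible text."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    for tag in soup(["script", "style"]):  # drop non-visible content
        tag.decompose()
    return soup.get_text(separator="\n", strip=True)

if __name__ == "__main__":
    with open("dataset.txt", "a", encoding="utf-8") as dataset:
        for url in SEED_URLS:
            try:
                dataset.write(extract_text(url) + "\n")
            except requests.RequestException as error:
                print(f"Skipping {url}: {error}")
```

Repeat that loop over billions of pages and you end up with the kind of raw text collection described next.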

The result of this "data hunger" is a huge collection of text files called a dataset. And when I say huge, I mean thousands of terabytes of information. A dataset is, in essence, the raw material used to train an artificial intelligence. It's like the food you feed the neural network, and the AI will only be as good as the quality and quantity of the dataset that nourishes it.

What Does It Take to Train a World-Class AI?

Okay, we have the data, but what else do we need to train an artificial intelligence of this caliber? This is where things get complicated. If you thought a good latest-generation CPU and a few days of work were enough, I have bad news for you: forget about training a network of this level on your desktop PC.

To train a natural language processing (NLP) AI like ChatGPT or LLaMA, you need thousands of cutting-edge graphics cards, such as Nvidia's A100 GPUs. And I'm not talking about one or two GPUs; I'm talking about server farms packed to the brim with GPUs connected in parallel. Each GPU can perform trillions of mathematical operations per second, allowing the model to learn from millions of examples at the same time. A CPU is not suited to this task; it simply does not have the capacity to process so much information in parallel.
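To give a flavor of what "GPUs connected in parallel" means in code, here is a minimal sketch of data-parallel training using PyTorch's DistributedDataParallel; the tiny linear model and random batches are placeholders standing in for a transformer with billions of parameters, and a real GPT-scale run would also shard the model itself across thousands of GPUs.

```python
# Minimal sketch of multi-GPU data-parallel training (illustrative only).
# Launch with: torchrun --nproc_per_node=<num_gpus> train_sketch.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # One process per GPU; torchrun sets LOCAL_RANK for each of them.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model standing in for a billion-parameter transformer.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        batch = torch.randn(32, 1024, device=local_rank)  # fake data
        loss = model(batch).pow(2).mean()
        loss.backward()  # gradients are averaged across every GPU
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Each process handles its own slice of the data and gradients are synchronized across all GPUs after every step, which is exactly why adding more cards speeds up training.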

Even with all this computing power, training a large model can take weeks or months, at an enormous cost in electricity. To give you an idea, OpenAI has spent millions of dollars training its GPT models. Is it viable, then, for an average developer to create their own AI from scratch? Probably not.

What Options Are Left for "Mortals"?

So, who can train an AI? The reality is that large technology companies like Google, Meta, OpenAI, and Microsoft dominate this field because they have the resources to do so. The rest of us are playing a different game: using pre-trained models. These companies have opened their models for other developers to use, fine-tune, and optimize on a smaller scale.

In practical terms, this means that anyone with technical knowledge can use a model like GPT to build custom applications without having to train it from scratch. It's not the same as developing an AI from the ground up, but it lets small businesses and entrepreneurs benefit from this technology without the prohibitive cost of creating their own model.
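As a rough sketch of that "build on a pre-trained model" path, the snippet below loads a small openly available model through the Hugging Face transformers library and generates text; the model name and prompt are just examples, and a real application would add prompt design, evaluation, and safety filtering on top.

```python
# Minimal sketch of building on a pre-trained model instead of training one.
# Requires: pip install transformers torch
from transformers import pipeline

# "gpt2" is a small, openly available model used here purely as an example;
# swap in whichever pre-trained model fits your application.
generator = pipeline("text-generation", model="gpt2")

prompt = "Write a short product description for a handmade ceramic mug:"
result = generator(prompt, max_new_tokens=60, num_return_sequences=1)

print(result[0]["generated_text"])
```

A few lines like these are the realistic entry point for most developers: all the expensive training already happened on someone else's server farm.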

The Illusion of Access: Why Do They Make Us Believe Anyone Can Create Their Own AI?

A large part of the "anyone can create their own AI" narrative is built on marketing. Google and OpenAI want more developers to use their tools, because that increases the adoption of their products and, in the long run, their profitability. If a small entrepreneur builds an app on top of OpenAI's GPT model, for example, that entrepreneur will keep paying OpenAI for every use of the model. It's an ecosystem designed to attract developers, not so that they create their own AI, but so that they join the wave using what already exists.

Is It Really Possible to Create Your Own AI?

Yes and no. If we're talking about training an AI from scratch with the same level of capability as those of the tech giants, then the answer is a resounding no for most of us. The cost, the access to data, and the necessary infrastructure are beyond the reach of most independent developers. However, if it's about taking pre-trained models, adapting them, and building applications on top of them, then the answer becomes a yes.

Ultimately, although they want us to believe that we're just a coffee chat away from creating our own AI, the truth is that the real power still lies in the hands of those who control the resources: data and hardware. So, before getting excited about the idea of "building your own artificial intelligence," ask yourself whether you really have what it takes to do it, or whether you'll just be using a scaled-down version of something that already exists.

For now, it remains a game dominated by the big players, although with small openings for those who wish to explore its possibilities.
