
Report: Global Tech Revolution Hindered by High Costs of Training AI Models
By Abbas Badmus

According to a recent report from Stanford University, the soaring cost of training Artificial Intelligence (AI) models is increasingly becoming a barrier that prevents non-industry players from participating in the technological revolution.

The Artificial Intelligence Index Report 2024 sheds light on the significant financial investments required, with estimates suggesting costs reaching millions of dollars and continuing to escalate.

While many AI companies remain tight-lipped about the specific expenses involved in training their models, industry insiders widely speculate on the staggering figures.

Notably, Sam Altman, CEO of OpenAI, disclosed in 2023 that the training cost for their GPT-4 model surpassed $100 million.

This revelation underscores the immense financial burden associated with developing cutting-edge AI technologies.

The report underscores the repercussions of these escalating costs, particularly on academic institutions, traditionally regarded as hubs for AI research. The astronomical expenses effectively preclude universities from independently developing leading-edge foundation models, thereby impeding their involvement in advancing AI technologies.

Illustrating the exorbitant costs, the report contrasts the relatively modest expenses associated with earlier models, such as the original Transformer, which cost around $900 to train. In stark contrast, contemporary models like OpenAI’s GPT-4 and Google’s Gemini Ultra demand staggering investments, with training compute costs estimated at $78 million and $191 million, respectively.

Furthermore, the report highlights the looming challenge of data scarcity in AI development. As Large Language Models (LLMs) increasingly rely on vast datasets, concerns arise regarding the depletion of available training data for future generations of computer scientists.

Projections cited in the report suggest that high-quality language data could be exhausted by 2024, followed by low-quality language data within two decades, and image data by the late 2030s to mid-2040s.

Amidst these challenges, the Nigerian government’s recent launch of its Large Language Model (LLM) aims to position the country as a leader in AI innovation within Africa.

Dr. Bosun Tijani, Minister of Communications, Innovation, and Digital Economy, emphasizes the importance of inclusivity in AI development, with the LLM designed to incorporate five low-resource languages and accented English to enhance language representation in AI datasets.

As the AI landscape evolves, stakeholders must address the dual challenges of escalating training costs and data scarcity to foster a more inclusive and sustainable AI ecosystem.