The development and training of Artificial Intelligence (AI) models is often associated with high costs, especially for the required hardware. Powerful graphics processing units (GPUs) are essential for training complex AI models, and many companies invest in expensive high-end GPUs and operate large data centers. A startup is now pursuing a different approach, relying on inexpensive gaming graphics cards.
While large companies such as OpenAI rely on hundreds of thousands of specialized, expensive GPUs to train extensive AI models like ChatGPT or Sora, the startup Fastino is experimenting with a significantly cheaper alternative: commercially available gaming graphics cards, at a total hardware cost of under $100,000.
Fastino focuses on the development of so-called Task-Specific Language Models (TLMs). In contrast to large language models (LLMs), which are trained on a broad spectrum of tasks, TLMs are tailored to specific applications. This focus allows Fastino to train smaller and more efficient models that achieve good results even on less powerful hardware.
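Fastino has not published details of its training setup. Purely as an illustration of the general pattern — a small model fine-tuned for a single task on a single consumer GPU — the following sketch uses the open-source Hugging Face transformers library and the small t5-small model for summarization; both the library and the model are assumptions for this example, not Fastino's actual stack.

```python
# Illustrative only: fine-tuning a small model for one task (summarization)
# on a single consumer GPU. Model, library, and data are assumptions for
# this example, not Fastino's actual setup.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("t5-small")            # ~60M parameters
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small").to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# Tiny placeholder dataset of (document, summary) pairs.
pairs = [
    ("summarize: The startup trains small task-specific models on gaming GPUs.",
     "Startup trains small models on gaming GPUs."),
]

model.train()
for epoch in range(3):
    for source, target in pairs:
        inputs = tokenizer(source, return_tensors="pt", truncation=True).to(device)
        labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids.to(device)
        loss = model(**inputs, labels=labels).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

Because the model is small and the task is narrow, a run like this fits comfortably in the memory of a single gaming GPU, which is the core of the cost argument.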
The startup offers its customers various AI-powered services, including text summarization, support for creative writing, and the automated redaction of sensitive data. The smaller models enable faster response times and lower resource consumption.
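The article does not describe how Fastino implements this redaction. As a generic illustration of the task itself, a minimal sketch that masks e-mail addresses and phone numbers with regular expressions (a deliberately simple stand-in for a learned model) might look like this:

```python
# Generic illustration of sensitive-data redaction; not Fastino's method.
import re

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d ()/-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace matches of each pattern with a placeholder tag."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact Jane at jane.doe@example.com or +1 555 123 4567."))
# -> Contact Jane at [EMAIL] or [PHONE].
```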
Fastino emphasizes the cost efficiency of its approach. Not only are the hardware costs significantly lower, but the pricing model for customers also differs from the common billing methods based on input and output tokens. Instead, Fastino offers fixed subscription prices, ranging from a free plan with 10,000 requests per month to an enterprise subscription with three million requests for $1,275.
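For comparison with token-based billing, the per-request price of the enterprise tier follows directly from the published figures:

```python
# Per-request cost of the enterprise plan, computed from the figures above.
plan_price_usd = 1275
included_requests = 3_000_000
cost_per_request = plan_price_usd / included_requests
print(f"${cost_per_request:.6f} per request "
      f"(${cost_per_request * 1000:.3f} per 1,000 requests)")
# -> $0.000425 per request ($0.425 per 1,000 requests)
```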
The startup has already won its first customers and has raised a total of around $25 million from investors across several financing rounds, a sign of growing interest in cost-efficient approaches to AI training and deployment.
Fastino's approach raises the question of whether expensive high-end GPUs are always necessary to train powerful AI models. For specialized applications, smaller, more efficient models running on cheaper hardware could represent an attractive alternative. It remains to be seen whether this trend will intensify in the future and how it will affect the development and deployment of AI.
Developments in the field of AI hardware and software are dynamic. New approaches and technologies could further reduce the costs of AI training and applications and increase accessibility for companies of all sizes.
Source: t3n.de/news/statt-teurer-gpus-dieses-startup-trainiert-seine-ki-mit-billigen-gaming-grafikkarten-1686320/