Tech industry watchers were aghast years ago as Uber, Stripe, and other startups raised billions of dollars in new investment capital. Granted, those rounds came when the companies were already established, but the amounts were still staggering. We are poised to see that kind of activity again, this time with AI startups. But the money won't all go toward hiring more workers or developing the next product on the company's roadmap. It'll largely be used to cover compute and hardware costs, from licensing the large language models (LLMs) their apps use to any GPUs and infrastructure they own to help train their models.
Perhaps the biggest investment AI companies will make is in hardware. The cost of buying servers and semiconductor chips, and even building data centers, will be enormous, but it's necessary if LLMs are to get the computing power they need to be trained effectively. Only a year ago, we witnessed a shopping frenzy in which companies scrambled to gobble up any available Nvidia chip, and it wasn't to help power their crypto pursuits. There is also growing eagerness to challenge Nvidia, whose business has prospered thanks to the AI era.
Today, well-funded companies are contemplating developing their own AI chips or massive data centers that could cost hundreds of billions of dollars. Less capitalized startups, however, will find themselves in a money crunch trying to keep up, even as technological innovation may lower the cost of processing data. And in a game of "Keeping Up with the Joneses," founders could ultimately turn to investors for exorbitant sums of money just to stay afloat.
Deep-pocketed investors are getting ready, too. Silicon Valley venture capitalists are known to invest hundreds of millions, and even wealthier players outside the United States are seeking to plant their flag in the AI space. Saudi Arabia's government is reportedly in talks with investment firm Andreessen Horowitz and others to establish a $40 billion AI fund.
Could this all be a sign of an AI bubble? According to Google DeepMind co-founder and CEO Demis Hassabis, all this frantic investment in AI "brings with it a whole attendant bunch of hype and maybe some grifting."
My point is this: as AI tools and services see adoption grow, the costs of running those operations must be borne somewhere, most likely by the provider. Training the models these systems run on carries its own costs, including paying for access to data and for the power needed to produce updated LLMs. Don't be surprised if, in the near future, some AI startups begin to tighten their purse strings as they look for ways to remain competitive while combating rising software and hardware costs.