
When does the AI craze end?

Bret Waters


“The AI craze will be over by June”. That’s the final slide in my 2024 Trends presentation, and it provokes all sorts of lively conversation every time I give the talk.

But here we are in mid-March, and generative AI continues to dominate tech news. Yesterday Bloomberg reported that Apple and Google are in talks on an AI partnership, Apple released a research paper on its AI work (unusual for them), and meanwhile the EU was busy enacting new AI regulations that everyone is scrambling to understand and comply with.

By ending my 2024 Trends talk with “The AI craze will be over by June”, I don’t mean to suggest that generative AI is a meaningless flash in the pan (it isn’t), but I’ve seen many of these Silicon Valley cycles in my career, and we always seem to go through the standard Gartner Hype Cycle. In this case the “Technology trigger” was the release of ChatGPT in November of 2022, the “Peak of inflated expectations” came toward the end of 2023, and 2024 will likely bring the “Trough of disillusionment”. It has always been thus.

Here’s the thing: Last year we all played around with ChatGPT, and it appeared to be easy, cheap, powerful, and amazing. Suddenly every startup in the world edited their pitch deck to call themselves “a revolutionary new AI startup!”, and investors took the bait. A new gold rush was on.

But as always, expectations were probably a bit over-inflated. It turns out that building a useful LLM costs tens of millions of dollars, the compute power required to run the servers is enormous, and the answers generated by such engines are still uneven.

Estimates suggest it costs OpenAI more than $700,000 per day to keep ChatGPT running and available for you to play around with (and they’ve added a disclaimer at the bottom that says “ChatGPT can make mistakes”).

So it costs $700K/day to run and may or may not give you correct answers. Wonderful.

Reports are coming in that sales teams at AWS and Microsoft are being told to tamp down customer expectations for generative AI systems (when salespeople are told to tamp down expectations about something they’re selling, that’s worth noting!).

And then there’s the carbon footprint. Training a single AI model can emit 285 tons of CO2, and something like 1% of all carbon emissions worldwide are now generated by AI servers. So in the last year, while we’ve been desperately trying to reduce the pace of global warming, AI servers have been busy making it worse.

Software has been getting smarter and smarter for a very long time. LLMs definitely appear to represent a step-function improvement, but they use a brute-force approach that is expensive, consumes tremendous computing power, and doesn’t really render intelligence so much as rote memorization of a huge corpus. All of this will improve over time, of course, as Moore’s Law prevails.

This new generation of AI technology is definitely here to stay, but I believe AI as a topic will fade into the background — it will just be inside every software application we use. Think about how semiconductors, once the most important product in Silicon Valley, have faded into the background. They’re simply inside every device we touch.



Bret Waters

Silicon Valley guy. Teaches at Stanford. Eats fish tacos.