Will the AI Bubble Affect Publishing?

I’ve been dormant on this blog for a few months, busy busy busy with lots of exciting stuff! That’s the risk of putting a blog on your site—initial enthusiasm gradually fades, leaving you with the onerous task of updating a blog while not necessarily having anything to say. I have some fantastic guest posts that I need to get up here too… all that for 2026! Stay tuned!

One issue I feel compelled to address relates to the so-called “AI Bubble.” So-called because it’s a financial term: it means the value of “AI” is built on hype, not on actual results. So much money is being invested (or, more accurately, promised to be invested) that the valuations of companies exceed their likely eventual returns. It’s a fad, albeit one capable of reducing everyone’s investments dramatically when (not if) the bubble bursts.

As a publisher, we don’t have much stake in the market. If there’s an economic downturn, we feel the pinch like everyone else. Fewer authors interested in publishing. Fewer people buying books. But our cycles aren’t really driven by economics, and we don’t invest any company funds (though in a downturn, interest rates may be lowered, and I will miss the interest we earn on our savings!). So we don’t have much to worry about from a bubble popping.

But what about the technology itself? Part of the assessment of the “bubble” is the fact that no one has figured out how to earn revenue from “AI” that even approaches its costs. And when I say AI, I’m referring mostly to generative models that can (tongue firmly in cheek) replace all software developers, write novels, create hit songs, and generate your next favorite movie. These generative models are like meeting someone who can do sleight-of-hand magic at a party. It’s awesome to see them do it and let yourself be wowed. But when the conversation turns to how that person makes a living, no one is surprised when the answer is something like “lawyer” instead of “professional magician.”

There’s no money in party tricks, and so far, no company has found a profitable use of generative text beyond party tricks. The word profitable is key here because there are three elements that must be factored in when assessing these AI tools.

  1. Do they work? And the sad answer to that question right now is “not very well.” Like the magician at the party, you start to notice what’s really going on after seeing the trick a few times in a row. So when it comes to uses like “replacing software developers” or “writing a novel,” we see pretty quickly that the tools themselves are bad at the job, and that their ability to get iteratively better over time is limited by the nature of how they perform their tricks. There’s no additional information to “feed” the models that will suddenly make them more efficient or less prone to making things up. These limitations are baked into the design.
  2. Do they work efficiently? Even if the models did what we have been told they can do… someday… just wait… it’ll happen…, there’s still the question of the resources required to make them work. This is why the only company making money from the AI bubble is Nvidia. They make the chips that are perfect for running generative text models. Making the chips is profitable. But running and maintaining those chips for AI tasks is incredibly expensive. And just like the models themselves, there doesn’t seem to be a path to reducing these costs. And with new chips every year, and new software to take advantage of their new capabilities, we run into a case where making models better can also just add to costs. More features are exciting, but they may also be a way for companies to lose even more money every time someone uses an AI tool.
  3. Do they do anything that people will pay for? This, to me, is the crux of the issue, but also one of the hardest to talk about. If we trust media coverage on technology, then our answer here is an emphatic yes. But looking at the numbers more closely, even companies with established customer bases paying for software (like Microsoft) are getting only a small fraction of users paying for additional AI features. Plenty of people may be willing to use the services for free, but when it comes time to convert that user to a paying customer, there is no evidence that users find the offering valuable.

When we combine these things, we see a product that is not as useful as promised, that costs a lot to run, and that doesn’t generate enough revenue to come close to offsetting expenses.

Of course, this doesn’t mean the tools will go away, even when the bubble pops, but I expect the features will end up looking a lot more like Apple’s implementation—integrated into software, running on-device as much as possible, and with added features that run only on the latest devices (thus providing an incentive for more hardware sales). In that form, we won’t see the tools become capable of producing writing that displaces the role of humans in authorship. And even if they could produce work that rivals humans, we also have to note the centrality of the author in selling that work. I don’t think the same enthusiasm will ever materialize for a book generated by an algorithm. Publishing, in my view, remains secure, no matter when the bubble pops and no matter where the technology goes next.
