Unless you're buying proprietary technology, a knowledge base, or a network effect, commodity SaaS no longer needs to exist. It has become cheaper for many companies, even small ones, to build than to subscribe. This would have seemed absurd ten, even five years ago. What changed?
Machine learning, at its core, is pattern recognition: the extraction of structure from data to anticipate what comes next. In this sense, it mirrors human cognition. T.S. Eliot argued in "Tradition and the Individual Talent" that creative work is never wholly original; it emerges from deep absorption of the past, synthesized into something new. The poet's mind, he wrote, is a "receptacle" for images, phrases, and feelings accumulated over a lifetime of study. But Eliot insisted this synthesis required "great labour."
AI removes the labour. It offers the synthesis on demand.
The economic framework for this shift appears in Prediction Machines, where three professors at the University of Toronto argue that technology disrupts by making something once rare cheap. Light. Arithmetic. Search. With AI, prediction has become cheap. Ivan Zhao, the founder of Notion, strikes a similar chord in a recent essay. He compares AI to steam and steel—technologies that revolutionized the modern economy. Knowledge work, he observes, once the proprietary domain of humans, is now being transformed by AI agents.
I'd push the argument further. What AI makes cheap isn't just prediction. It's knowledge itself. But this raises a question the economists don't answer. Does knowledge have value because it required labor, or because it produced useful outcomes? If the latter, AI is simply a more efficient means to the same end. If the former, something else is lost.
Consider how discoveries actually happen. When Alexander Fleming returned from holiday in 1928 to find mold contaminating his petri dishes, he noticed that the mold had killed the bacteria around it. Fleming's prepared mind recognized discovery where others saw only contamination. This was not the output of an algorithm, but a judgment shaped by experience applied to the unexpected.
Zhao's vision is telling. His cofounder no longer writes code; he orchestrates AI agents. But if no one labors through the years of learning to code, who remains qualified to manage? The ladder is pulled up behind the last generation to climb it.
Eliot understood something the efficiency argument misses. The labor of acquiring tradition wasn't a cost—it was constitutive. The poet becomes a poet through the effort of absorption. Remove the process, and you don't just get cheaper synthesis. You get a different kind of mind.
The Sex and the City you watch in your twenties is a distinct experience from the one you watch in your forties. Not because the show has changed, but because you have. Experience is not fungible. AI can synthesize the past, but it cannot live through time, nor discover what it has not seen. The question is whether that still matters, and to whom.