ChatGPT reaching 800 million users in 17 months: unprecedented. The number of companies hitting high annual recurring revenue, and the speed at which they're getting there: also unprecedented. The pace at which usage costs are dropping: unprecedented. While the cost of training a model (also unprecedented) runs as high as $1 billion, inference costs (what users pay to use the tech) have already dropped 99% over two years, measured per 1 million tokens, she writes, citing research from Stanford. The pace at which competitors are matching one another's features at a fraction of the cost, including open-source options and particularly Chinese models: unprecedented…

Meanwhile, custom chips from Google, like its TPU (tensor processing unit), and Amazon's Trainium are being developed at scale for their respective clouds, and that work is moving quickly, too. "These aren't side projects — they're foundational bets," she writes.
"The one area where AI hasn't outpaced every other tech revolution is in financial returns…" the article notes.

“[T]he jury is still out over which of the current crop of companies will become long-term, profitable, next-generation tech giants.”