Monday, 9.44pm
Sheffield, U.K.
Truth, like gold, is to be obtained not by its growth, but by washing away from it all that is not gold. – Leo Tolstoy
I’ve been enjoying reading newspapers again.
If you had told me 20 years ago that I would go from reading a lot to a point where I simply scanned lots of small articles lined up like cards, and rarely read a long-form piece, I would have thought you were nuts.
But that’s what’s happened. We’ve gone from making time to read to “consuming” content.
It’s time to slow down. Like the slow food movement, perhaps we should get back into slow reading.
And thinking about what we’ve read.
I have spent a lot of time over the last year or so thinking about AI.
It’s professionally important to me. Could AI do what I do? Could it help me do what I do better? It seems like my areas of competence – which centre around decision making and the sustainable transition, with a bit of technology thrown in – are affected by, and relevant to, the development of AI.
For example, I have spent much of the last couple of decades immersed in power markets. In deregulated markets, power prices are set by supply and demand. AI is power hungry, and there is an explosion in new data centres that will use a lot more power than the current fleet – which means a need for new grid connections, new contracts, and potentially spikes in wholesale power prices.
Of course, you can build renewable generation near your data centre – and there are strategies to mitigate the risks – but the point I’m making is that power is important.
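To make that concrete, here is a minimal sketch of how a merit-order market clears – every plant, capacity and cost in it is a made-up illustration, not real market data. The cheapest generators are dispatched first, and the last unit needed to meet demand sets the wholesale price for everyone; add a block of data-centre demand and a more expensive plant suddenly sets that price.

```python
# Minimal merit-order sketch. All plants, capacities and costs below are
# illustrative assumptions, not real market data.

# (name, capacity in MW, marginal cost per MWh)
supply_stack = [
    ("wind",    3000, 5),
    ("nuclear", 2000, 20),
    ("ccgt",    4000, 70),
    ("peaker",  1500, 180),
]

def clearing_price(demand_mw):
    """Dispatch cheapest-first; the last unit needed to meet demand sets the price."""
    remaining = demand_mw
    for name, capacity, cost in sorted(supply_stack, key=lambda plant: plant[2]):
        remaining -= capacity
        if remaining <= 0:
            return cost
    raise ValueError("demand exceeds available capacity")

base_demand = 8000          # MW, illustrative
data_centre_load = 1200     # MW of new data-centre demand, illustrative

print(clearing_price(base_demand))                     # 70  – gas plant sets the price
print(clearing_price(base_demand + data_centre_load))  # 180 – peakers now set the price
```

In this toy stack, 1,200 MW of extra demand is enough to shift price-setting from the gas plant to the peaking plant – which is the spike mechanism I’m gesturing at.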
The growth area in the last year has been in building and kitting out data centres to support these new large language models. That means lots of chips to run the models and big server farms to handle the global demand. ChatGPT runs very fast – much faster than the local LLM on my computer – because it runs across many more computers.
Cooling those computers requires power and water – and we’re back to the resource requirements.
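As a rough illustration of the scale involved, here is a back-of-envelope calculation – every figure in it (GPU count, per-chip draw, overhead, PUE) is an assumption I’ve picked for the sketch, not a measurement of any real site.

```python
# Rough back-of-envelope for facility power draw.
# Every figure here is an illustrative assumption, not real data.

gpus = 20_000            # accelerators in a hypothetical data centre
watts_per_gpu = 700      # assumed draw per accelerator at full load
overhead_per_gpu = 300   # assumed CPUs, memory, networking and storage per accelerator
pue = 1.3                # power usage effectiveness: total facility load / IT load

it_load_mw = gpus * (watts_per_gpu + overhead_per_gpu) / 1e6
facility_mw = it_load_mw * pue

print(f"IT load:       {it_load_mw:.0f} MW")   # ~20 MW
print(f"Facility load: {facility_mw:.0f} MW")  # ~26 MW, running continuously
```

Even with these modest assumptions you end up in the tens of megawatts of continuous demand – which is why the new connections and contracts matter.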
But really, all we have so far is the equivalent of picks and shovels – where’s the gold?
Well, that’s the next thing to figure out. Which companies are going to make more money – either through increased revenues or decreased costs through the use of AI?
This is actually much harder to do than you might think.
I think I’ve written about this before, but the savings from an improved process often go to the supplier, rather than the customer.
For example, will you really save money using ChatGPT?
OpenAI has already talked about how it wants to take a share of the savings made by firms using its technology.
A few months ago, I could use the tool and get a useful answer. I could show people how useful it was.
In my most recent tests, it has searched the web and given me less useful information – almost as if it’s teasing me, saying it does know something but I’m going to have to pay more to find out.
That creates friction – and increases costs.
This next stage is the really hard bit, and I don’t think many companies are going to get it right.
On the one hand, you have a flaky technology – with its makers trying to figure out how to make money from it.
It’s like having a shovel that randomly turns into ice cream while you’re trying to use it.
On the other hand, you have to figure out what the value proposition really is – all too often we don’t place much value on what we get. AI writing, for example, isn’t the kind of stuff you’re dying to read. Instead, it’s the stuff you submit for a term paper because you can’t be bothered to do the reading.
Are you really going to pay for that sort of output?
Have you seen anything produced by AI that you would pay a premium for?
I don’t know how things are going to turn out, but that’s what makes things interesting.
Cheers,
Karthik Suresh
