Note: This originally appeared in the weekend commentary section of my InsideAI newsletter.
Andrew Ng has repeatedly compared AI to electricity, positioning it as a technology that will be everywhere and in everything. But if you study the history of electricity adoption in the United States, it was not always on track for ubiquity. For all the attention paid to Tesla, Edison, Westinghouse, and others, the real brains behind the mass adoption of electricity belonged to Samuel Insull.
Insull realized that electricity could be everywhere only if it was cheap, and making it cheap required building larger generation plants with better economies of scale. He was a Jeff Bezos-style entrepreneur, always trying to lower unit prices. That didn't always sit well with others, but it worked in the long run. The more the per-unit price of electricity dropped, the more households used it. From that perspective, Insull was very successful. (Though he ended up in some trouble over his financial holding companies.)
I bring up Insull because, at the moment, I don't see AI positioned in a way that can mirror the adoption pattern of electricity, despite the constant comparisons. Today, AI is very application specific. So I want to ask: what changes are needed for AI to be adopted the way electricity was, and what person (or company) will be its Samuel Insull?
To really be "the new electricity," AI needs to be more generic. Someone has to come up with a way to provide Intelligence-as-a-Service. This isn't picture classification as a service, or sentiment analysis as a service, or chatbot dialogue as a service. This is general intelligence as a service. Until we figure that out, I don't see how we can parallel electricity adoption.
Once that happens (if it does; I could argue in another post that it won't, at least for a very long time), we have to think about where the economies of scale are for intelligence, then architect our systems to take advantage of them. Let's think through what that might mean.
Economies of scale usually come from high fixed costs spread over many units of something. In the case of AI, we could start by looking at labeled data. But data doesn't really provide economies of scale; it acts more as a barrier to entry. It can be hard to get, but once you have it, it's very valuable, at least until you hit a point of diminishing marginal returns where more data doesn't improve your models much. To draw a parallel to electricity, data might be the coal needed to power a generator: getting coal is necessary, but it isn't your advantage. In a world of AI as electricity, I'm not sure data is the advantage either.
The next place to look is inference, the point at which an AI makes a decision. Inference has a computational cost, and lowering that cost lets you do more and more of it. I believe inference is constrained by hardware right now, which drove my investments in Mythic and Rain; AI hardware innovation is going to help significantly. So the next wave that moves us toward the AI-as-electricity world is dropping the per-unit cost of inference.
The place to look after that might be training, and by training I don't mean training a neural net. The current model of training is far too targeted to deliver the generic benefit the AI-as-electricity framework needs. At some point, I think training will become a more generic process: humans training machines, and machines learning by reacting to a broad environment the way humans do, not just to narrowly targeted applications. Broad-based training is where the real economies of scale are: train a thing once, then see it execute that training as many times as the world needs, across all kinds of applications.
Once we hit that point, you can see how scaling a large training business would drop the per-unit cost of intelligence. If you think of intelligence as performing "one smart process," that cost could continually drop through a combination of hardware-driven inference price drops and human-plus-software training fixed costs being spread over many units of intelligence.
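The economics here can be sketched as a simple amortization model: a fixed cost to train a capability once, spread over every inference that capability serves, plus a marginal hardware cost per inference. All the numbers below are hypothetical, just to show the shape of the curve:

```python
def unit_cost(fixed_training_cost, cost_per_inference, num_inferences):
    """Amortized cost of one 'smart process': the one-time training
    cost spread over all inferences, plus the marginal hardware cost
    of running a single inference."""
    return fixed_training_cost / num_inferences + cost_per_inference

# Hypothetical figures: train once for $1M, serve at $0.001 per inference.
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} inferences -> ${unit_cost(1_000_000, 0.001, n):,.4f} each")
```

As volume grows, the training cost per unit vanishes and the price floor becomes the marginal inference cost, which is exactly why cheaper inference hardware and broadly reusable training both matter.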
In his book "Age of Em," economist Robin Hanson pointed out that the cost of intelligence could eventually drop close to the cost of electricity, so that the speed and duration at which you want to run would determine the price you pay more than the task you are performing.
We are a long way from this view of intelligence-as-a-service, and many technical hurdles have to be cleared before we get there. But the AI world is moving fast, and I'm keeping an eye out, as an investor, for the companies that could drive this view of the world. I'm hoping to invest in the next Samuel Insull, and to help drive the price of intelligence down to the point where it is embedded everywhere.
If AI Is The New Electricity, Who Is The Samuel Insull? was originally published in Hacker Noon on Medium.