In the media, there's a lot of speculation about, are we in an AI bubble? Is AI going to be as big as the internet? How will AI transform work? How will it transform businesses?
And there are a lot of media pundits who have a lot of opinions.
And what I'm here to do is to provide a series of frameworks to actually help think through those questions.
I think I've got a couple of useful frameworks that might not predict the future, but have been pretty fundamental in showing how technology revolutions form over a series of decades, and have formed multiple times over the last 250 years.
So this being an AI group, I think I'm preaching to the choir here when I say AI is bigger than ChatGPT writing your emails. This is a fundamental technology paradigm shift.
As I said, I'm providing a collection of useful frameworks, and I'd like to address the question: are we in an AI bubble? And I have a definitive answer to that, and I have a couple of ideas on where I think the technology will go in the future as well.
So I've started three companies.
I started an innovation consulting firm that sells innovation strategy to large multinational corporations. I've launched AI projects for publicly traded corporations.
I also founded a venture capital firm and we actually invested in a breast cancer detection AI company.
I wouldn't say I'm an AI expert, but what I am an expert in is technology and investing, and the intersection of the two.
And so this is what I do on evenings and weekends. I sit and think about these types of things.
My original background is in robotics and electrical engineering, and I just love technology and I love investing.
So first framework I want to give you is a little bit of an odd one, and it comes from Mark Twain. History doesn't repeat itself, but it definitely rhymes.
The cycle that we're just starting to go through over the last three to five years, we've been through this cycle before.
And I'm going to walk through as an example of how we walked through the last cycle with the microprocessor and how we're repeating the exact same cycle today.
The second framework is called Amara's Law. And it says that as humans, we tend to overestimate the effect of technology in the short run.
Will the AI bubble pop tomorrow?
I don't know if anybody watched the NVIDIA earnings a couple weeks ago.
Will AI take all of our jobs?
And we tend to underestimate the effect of technology in the long run.
And what I mean by that is, if we were in 1978, which was the very early days of the PC era and the very middle days of the mainframe era, it would be almost impossible to anticipate or predict the iPhone, or really the proliferation of the internet. And my thesis is that we are in the 1978 days of the AI era.
And what that's going to look like over the next couple of decades is almost impossible to predict.
So there have been five of these technology revolutions since 1770.
I will map all of them out.
They generally last from 45 to 60 years. The shortest one was, I think, 33.
There are seven definable stages. I'll walk through all of them.
And there are three fundamental components of how these technology revolutions form.
Technology revolutions are not based on a single technology. They're based on a cluster of technologies.
And I'll walk through that as well.
And again, there is a repeatable, definable pattern in how these technology revolutions form, because history doesn't repeat itself, but it definitely rhymes.
So these are the seven definable stages. It starts with gestation; then there is a point-in-time big bang; then eruption; frenzy; crash (we all know what that looks like if anybody's old enough to remember the dot-com bubble); synergy, which is comparable to the era that we just came out of with mobile, SaaS, and cloud; and then maturity, which is actually what we're living in right now with the CPU era.
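The seven stages above can be sketched as a simple data structure. This is a minimal illustration; the microprocessor-era examples in the comments are the rough mapping from the talk, not exact dates:

```python
# The seven stages of a technology revolution, per the talk, with a
# rough, illustrative mapping to the microprocessor cycle.
STAGES = [
    ("gestation", "researchers create the invention (the transistor, 1947)"),
    ("big bang",  "a point-in-time commercial launch (the Intel 4004, 1971)"),
    ("eruption",  "infrastructure build-out; invention becomes product (the PC)"),
    ("frenzy",    "value accrues to applications; speculation builds (late 90s)"),
    ("crash",     "the bubble pops (the dot-com bust)"),
    ("synergy",   "steady build-out (mobile, SaaS, cloud)"),
    ("maturity",  "the technology is taken for granted (the CPU era today)"),
]

# Print the stages in order.
for i, (name, description) in enumerate(STAGES, start=1):
    print(f"{i}. {name}: {description}")
```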
And what's important is that cycles will overlap: a previous cycle will end, and a new cycle will start as another cycle is ending, and I think that's the situation we're going through right now. Here are all the cycles mapped out. Here are the first five of them, and I think we've started a sixth, beginning in the early 2000s. So again, this goes all the way back to 1770, and they generally last between 45 and 60 years.
The last cycle started in 1971 with the launch of the microprocessor, the Intel 4004.
And I think we're just in the very early days of the AI cycle, which should generally last another 40 years or more. And that's why the title of the presentation is The Next 40-Year Cycle.
So, definable pattern.
So each of these cycles starts with a technology paradigm shift. And what I mean by that is the rules of the previous cycle are either being rewritten or no longer apply. Meaning, the playbook that you might have used to build technology or invest in the previous software era is probably being rewritten, and the thing is, we don't know which of those rules no longer apply. And so experimentation is actually going to be rewarded, because we don't know which parts of the playbook are still relevant and which parts are not.
Just to prove that out: we've gone from serial computation to parallel computation. We've gone from deterministic software to emergent learning in neural networks. And we've gone from retrieval of static data (so if you go on your phone, pull down, and load a website, you're pulling static data from a server somewhere) to what we're progressing into now. This happened to me on LinkedIn the other day: we're getting on-the-fly, real-time generative content. That's not static content being pulled from a server; it's being generated automatically, and that's the inference part.
So this is a fundamental paradigm shift. The Von Neumann architecture of a standard CPU is actually even somewhat becoming irrelevant to a degree.
Okay, so each technology paradigm shift is based in a general purpose technology.
This is obviously different than the GPT in ChatGPT, which stands for generative pre-trained transformer.
But a general purpose technology is defined as a technology so pervasive and so profound that it actually rewrites the fabric of society.
So steam engine, the internal combustion engine, electricity, microprocessor, these are examples of a general purpose technology.
My argument is that a generative pre-trained transformer, or neural networks collectively, are a general purpose technology.
Don't take my word for it. You can ask Marc Andreessen, Mark Zuckerberg, or Jensen Huang, who all agree with that thesis.
Okay, so I mentioned that these technology cycles are not a single technology, they're a cluster of technologies.
My thesis is that there's three major components to a technology cycle.
So one is a raw material input, one is a work system, and another is a network connection. For the automobile, the raw material input is obviously gasoline, and the work system is the internal combustion engine, the technology that goes inside the product, which is the automobile. And you didn't see the proliferation of the automobile until post-World War II, with the rebuilding of Europe, and you actually started seeing suburbs being created, and that's because of the Federal-Aid Highway Act of 1956 under the Eisenhower administration. And so the point is, you start seeing the exponential growth rate of a technology when you start to network and connect them together.
Again, we saw this in the 90s with the Internet. The PC era, there was some proliferation of PC usage, but it didn't take off until 1994 with the real proliferation of the Internet. And that's where we start to see an exponential growth curve of the technology, and my argument is that's also where a bubble starts to form. And so I don't think that we've actually
entered into AI bubble territory. We've got the raw material input, which is clearly data; that's how the models are trained. The work system is the neural networks and GPUs. But we haven't figured out a way to network and connect the AI systems together. And I think
there's two possible ways that this could happen. One is potentially blockchain.
I'm not a huge crypto or blockchain fan. I've dabbled in it a little bit, but I made some money in it.
But I think this actually could be a real renaissance for blockchain.
I don't think that there are internet-native payment rails yet. This is what PayPal was supposed to be, and it really hasn't come to fruition. When I start seeing agents doing machine-to-machine transactions, there needs to be some kind of machine-to-machine payment rail to enable commercial transactions. The other potential is MCP from Anthropic, and that's a tool for agents to be able to actually communicate together, but it's not for commercial transactions.
It could be both, like I don't know, but I'm just saying that until we actually start seeing the work systems actually get networked together, that's where you'll see an exponential growth curve and that's actually where I think the bubble happens.
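The three-component framework above can be sketched as a small data structure, using only the mappings the talk gives: the automobile cycle is complete, while the AI cycle's network connection is still missing (the blockchain and MCP candidates are speculation, as discussed):

```python
# Three components of a technology cycle, per the framework above.
# A cycle needs a raw material input, a work system, and a network
# connection before exponential growth (and a bubble) appears.
CYCLES = {
    "automobile": {
        "raw_material": "gasoline",
        "work_system": "internal combustion engine",
        "network_connection": "highway system (Federal-Aid Highway Act, 1956)",
    },
    "ai": {
        "raw_material": "data",
        "work_system": "neural networks and GPUs",
        "network_connection": None,  # still missing: payment rails? MCP? both?
    },
}

def network_connected(cycle: str) -> bool:
    """A cycle enters its exponential-growth (and bubble) phase only
    once its work systems are networked together."""
    return CYCLES[cycle]["network_connection"] is not None

print(network_connected("automobile"))  # True
print(network_connected("ai"))          # False
```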
So I'll walk through the previous cycle of the microprocessor in the 70s and kind of show you how this all kind of fits together.
So gestation actually starts, this is the period of time when researchers, academics, scientists create an invention.
For this cycle, it would have been 1947, with William Shockley inventing the transistor, and also Claude Shannon inventing information theory, both of them at Bell Labs.
Then you went through three different companies, Shockley Semiconductor, Fairchild Semiconductor, and Intel, before you actually got a commercializable product. So you went from 1957 to about 1971.
Big Bang would have been 1971 with Intel releasing the 4004, which is the first commercializable microprocessor.
And then that moves into the eruption phase. So this is where you start seeing a lot of investment into infrastructure.
You start seeing new laws and protocols being created. And this is where an invention and a technology will start getting put into a usable commercializable product.
This is where we saw the microprocessor, the technology, go into the PC, the product.
The next phase is frenzy. This is where value accrues up from the infrastructure layer to the application layer, where we start seeing the network connection happen, and where the consumer product actually starts getting easier to use for general consumers. Navigating the internet prior to '92-ish was very, very difficult to do, until you started seeing Netscape.
There's a lot of parts of AI that are still very, very difficult for the average consumer to navigate.
And an interesting thing happens at this point in time is where we start seeing a reorganization of labor around the technology.
This actually happened in the beginning of the Industrial Revolution with electricity, and that's where we'll see a breakout of productivity.
Now, this is really important for the AI bubble. The thing that 1840, 1893, 1929, and 1999 all have in common is widespread financial speculation of the general population.
When people start quitting their jobs to day trade tech stocks, or when your Uber driver starts trying to sell you GPUs, is when we see a real AI bubble happen.
NVIDIA doing special purpose vehicles is not general speculation. And so that's the common part.
When we start seeing general speculation amongst the common public, when the common public are speculating on OpenAI stock, that's when we have an AI bubble.
Another element of where we start to see bubbles happen is when we have high amounts of debt financing.
So to give you an example: in the telecom bubble, for every dollar of revenue that came into telecom companies, $3 were spent on CapEx. Two of those dollars were financed by debt, and one was financed by revenue. It's about 30% today: for every dollar that comes into the hyperscalers, only about 30 cents is spent on CapEx, and only a small portion of that is actually done with debt.
The other major component is low utilization rates. The utilization rate of fiber in the late 90s was around 2%. Today, GPU utilization is around 100%.
So when we start seeing the utilization rates of GPUs drop, that's when we should be a little bit more concerned.
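The two financial indicators above, CapEx-to-revenue and utilization, can be put side by side in a quick sketch. The numbers are the rough figures quoted in the talk, used here purely for illustration, not audited data:

```python
# Rough bubble indicators from the talk (illustrative figures only).
def capex_ratio(capex: float, revenue: float) -> float:
    """Dollars of CapEx spent per dollar of revenue."""
    return capex / revenue

# Telecom bubble, late 1990s: $3 of CapEx per $1 of revenue,
# $2 of which was debt-financed.
telecom = capex_ratio(capex=3.0, revenue=1.0)
telecom_debt_share = 2.0 / 3.0  # roughly 2/3 of CapEx was debt

# Hyperscalers today: roughly $0.30 of CapEx per $1 of revenue,
# only a small portion of it debt-financed.
hyperscalers = capex_ratio(capex=0.30, revenue=1.0)

print(f"telecom:      {telecom:.2f}x revenue ({telecom_debt_share:.0%} debt)")
print(f"hyperscalers: {hyperscalers:.2f}x revenue")

# Utilization: fiber in the late 90s ran ~2%; GPUs today run near 100%.
fiber_utilization_1999 = 0.02
gpu_utilization_today = 1.00
print(f"fiber 1999: {fiber_utilization_1999:.0%}, GPUs: {gpu_utilization_today:.0%}")
```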
So the interesting thing about the network connection is this is where we see the exponential growth rate, but this is where we start to see also the financial speculation.
So the AI era: this is where we started.
Back in 1999, the invention of the GPU. In 2007, we see the CUDA framework put together.
In 2012, we saw the AlexNet team at the University of Toronto figure out that you could use a GPU and a neural network to do image recognition.
Then we've got AlphaGo in 2016, and the "Attention Is All You Need" paper in 2017.
This is the research and invention era that we just came out of. We had the Big Bang.
Honestly, I don't know if the Big Bang happened in May of 2020 with GPT-3, or in November 2022 with ChatGPT running GPT-3.5; it was one of the two.
Honestly, it doesn't really matter. And this is where I think we are today.
I think we're in the eruption phase.
Clearly, we're still spending a ton of money as a society building out infrastructure with data centers, and there's still very much more to go there.
We're still converting existing processes onto the new technology.
As an example, most people just use ChatGPT as a Google alternative. They're not actually using it for any real native AI purposes.
And so we haven't really reorganized the labor around the AI.
And we're starting to see new laws and protocols being created.
The GENIUS Act, passed by the Senate a couple of months ago, is a good example of this.
To give you another example of what happened back in 1980: the Copyright Act actually had to be rewritten, because software had never been sold outside of the hardware.
And so software was decoupled from the hardware, and ownership around software had to be rewritten in the Copyright Act. We'll start seeing things like that, where we'll run into legal frameworks that we've never had to explore before. So technology runs through a kind of major three-part evolution: adaptation, evolution, and revolution. And I think we're still in the adaptation phase here. This is where we're still converting the existing processes into the technology; we're not actually reorganizing the labor, or reorganizing our workflow to be native to the technology just yet. At this point in time, it's best to buy horizontal technology and customize it to vertical applications. Once we get into the evolution stage, the native stage, it'll be like you take a blank sheet of paper and say: regardless of what happened in the past, how do I reorient the workflow to this technology specifically? That's where we'll see a breakthrough in productivity. And then eventually, after that happens, second-order use cases will start to present themselves that we never would have predicted before.
To give you an example, in 2007 when we all got iPhones, nobody would really have predicted Uber or Airbnb.
But it wasn't until the GPS API was available, I think it was 2011, that Airbnb and Uber were actually a reasonable concept.
So I think
we're still in the adaptation phase. I think we still have a long distance to go before we get to evolution and native workflow.
And that's it. Thank you.