The Largest Infrastructure Project in Human History

I Never Expected This...

“Stargate” is coming… But first, this…

“Woah…”

As hard as it may be to believe (and it certainly was for me), as you’re reading this, we are more likely than not already inside ‘The Matrix’.

I always laughed at even the mention of the “Simulation Hypothesis”. After all, I considered myself a man who believed in “real” science. While science fiction was great entertainment for me (The Matrix is one of my favorite films), there was never any landscape in my mind where the Matrix and our reality were likely to overlap.

And then I was presented with the logic and, astoundingly, found myself no longer able to argue against it. It’s not that I’m certain we ARE in a computer simulation… it’s just that I now understand the math behind why we are more likely to be in one than not. Yeah, so add this to our already uncertain future.

As we get closer and closer to being able to simulate the real world inside computers, we become more and more likely to already be “living” inside one. Follow the logic with me, as difficult as this may be to hear. Once we are able to create AGI and ASI (and Microsoft is now betting EVERYTHING on this future, detailed below), those models, through continued exponential improvements in compute design, should quite easily be able to create a world inside a computer where virtual reality is as good as our currently perceived “real” reality.

Once the human mind is recreated inside a computer, there is no reason why there would not be layers upon layers of simulated worlds, until the math becomes overwhelming… by pure mathematical odds, we would be far more likely to be living inside one of those simulations than in the one single “base” reality. And since we would not be able to tell the difference, the logic against the simulation simply disappears. Poof! Personally, I have a problem with this, but it is an emotional one, not a logical one.
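To make the “pure mathematical odds” part concrete, here is a minimal sketch of the counting argument, under my own simplifying assumption that there is one base reality plus some number of simulations that are indistinguishable from it to their inhabitants:

```python
# Illustrative sketch of the counting argument above.
# Assumption (mine, not the author's): one base reality plus N simulated worlds,
# each indistinguishable from base reality to the people inside it.

def probability_base_reality(num_simulated_worlds: int) -> float:
    """With 1 base reality plus N indistinguishable simulations, a randomly
    placed observer has only a 1-in-(N+1) chance of being in the base world."""
    return 1 / (1 + num_simulated_worlds)

for n in (1, 10, 1_000, 1_000_000):
    print(f"{n:>9} simulated worlds -> P(base reality) = {probability_base_reality(n):.6f}")
```

The point is simply that as the number of nested simulations grows, the odds of being in the one base reality shrink toward zero.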

Does AI not seem “real” to you yet? I understand how you feel today, but let’s follow the money for a glimpse of what the next 2-6 years may be like…

The path we are on to AGI/ASI continues to astound me. Last week I laid out a detailed timeline for when we may reach these milestones, along with the potential consequences, both good and bad. But by the end of last week, how we will get there was revealed in further detail.

The mega-project for the largest U.S. supercomputer and data center is code-named “Stargate,” and it will be the largest infrastructure build that humanity has ever witnessed.

It turns out that no project this big can be kept secret. Microsoft, which currently has $8-10 billion invested in OpenAI, is apparently betting the entire farm that it can construct the computing facilities OpenAI needs to achieve Artificial General Intelligence. Microsoft is already spending up to an additional $10 billion on a new server facility in Mount Pleasant, WI, which broke ground nine months ago; the land alone cost $50 million. This is just “phase 3,” scheduled to be completed by 2026, with a commitment of up to $100 billion to complete “phase 5” for OpenAI by 2028. Keep in mind that as of December 2023, Microsoft had total cash reserves of $81 billion.

And it’s not just the millions of Nvidia (or other, yet-to-be-sourced) GPUs they estimate it will take to bring AGI/ASI online by 2028… it’s the 4.5 gigawatts of power they now estimate will be needed to run it. That’s enough power for more homes than the Los Angeles and Phoenix metro areas have combined, and equals 1.5x the power production of the largest U.S. nuclear power plant.
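If you want to sanity-check a gigawatt figure like that yourself, the arithmetic is simple division. The household and power-plant constants below are my own assumed placeholders, not numbers from the article; swap in whichever figures you trust:

```python
# Back-of-the-envelope conversion of the 4.5 GW estimate into familiar units.
# The constants are assumptions for illustration only.

STARGATE_POWER_W = 4.5e9         # 4.5 gigawatts, as estimated in the article
AVG_HOME_DRAW_W = 1.2e3          # assumed average U.S. household draw (~1.2 kW)
LARGEST_NUCLEAR_PLANT_W = 3.0e9  # placeholder for the largest U.S. plant's output

homes_equivalent = STARGATE_POWER_W / AVG_HOME_DRAW_W
nuclear_multiple = STARGATE_POWER_W / LARGEST_NUCLEAR_PLANT_W

print(f"Roughly {homes_equivalent / 1e6:.1f} million homes' worth of average draw")
print(f"About {nuclear_multiple:.1f}x the assumed output of the largest U.S. nuclear plant")
```

The exact multiples depend on which household and plant figures you plug in, but the scale (millions of homes, more than a full nuclear plant) holds either way.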

And this is already a problem: the training of GPT-6 recently hit a power snag. (Keep in mind that GPT-5 has already been trained but is not expected to be released to the public until later this year.) Microsoft data center engineers recently expressed some frustrations, which is how this story came to light. They were unable to pull the power they needed to train OpenAI’s GPT-6 without risking bringing down the state’s power grid, which effectively limits them to about 100,000 Nvidia H100 GPUs in use in one location at a time. They are now having to devise a distributed power-sourcing strategy before proceeding.
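For a rough sense of why 100,000 GPUs in one place is a grid problem, here is a sketch of the steady load such a cluster puts on a single site. The per-GPU draw and the facility overhead multiplier are my assumptions; real numbers vary by site design:

```python
# Rough sketch of the concentrated, around-the-clock load of a 100,000-GPU cluster.
# Assumptions (mine): ~700 W per H100 under training load, and roughly as much again
# for host servers, networking, and cooling (an effective ~2x facility multiplier).

NUM_GPUS = 100_000
GPU_POWER_W = 700          # assumed per-GPU draw during training
FACILITY_MULTIPLIER = 2.0  # assumed servers + networking + cooling overhead

gpu_power_mw = NUM_GPUS * GPU_POWER_W / 1e6
facility_power_mw = gpu_power_mw * FACILITY_MULTIPLIER

print(f"GPUs alone: ~{gpu_power_mw:.0f} MW")
print(f"With facility overhead: ~{facility_power_mw:.0f} MW, continuously, at one location")
```

Unlike homes, which spread demand across time and geography, a training run pulls that load constantly from one spot on the grid, which is what pushes builders toward splitting the cluster across sites.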

GPT-3 cost $12 million to train. GPT-4 cost $100 million to train and now costs $700,000/day to operate. We do not know the cost of GPT-5, but clearly GPT-6 is getting very expensive, and GPT-7 (probably the model with AGI) is not yet even feasible due to the current lack of infrastructure.
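Purely as an illustration of why the later generations become infrastructure projects rather than line items, here is what the trend looks like if the generation-over-generation cost multiple implied by the two figures above were to hold (a big assumption on my part, not a claim from any source):

```python
# Illustrative extrapolation only. The $12M and $100M inputs are from the article;
# the assumption that the same growth multiple continues is mine.

gpt3_cost = 12e6
gpt4_cost = 100e6
growth_per_generation = gpt4_cost / gpt3_cost  # ~8.3x

cost = gpt4_cost
for generation in (5, 6, 7):
    cost *= growth_per_generation
    print(f"GPT-{generation}: ~${cost / 1e9:.1f} billion (illustrative only)")
```

Even if the real multiple is smaller, a few more generations of compound growth lands training budgets in the tens of billions, which is exactly the territory the “phase 5” commitment describes.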

What are the takeaways here?

1) AGI/ASI may very well find a cheaper power source, or bring down its own training and inference (operating) costs by designing new algorithms for itself, but in the meantime we have a serious power-production and GPU-supply challenge to overcome.

2) Microsoft must have witnessed Q* firsthand. Documented in a late-2023 research paper to have advanced reasoning capabilities, solving encryption math problems it had never been trained on, Q* is the cornerstone of the belief that Sam Altman and OpenAI can achieve AGI before any other company. Otherwise, why would a company with Microsoft’s reputation bet the farm?