Michael Intrator: GPU technology’s adaptability beyond crypto, the monetization of AI through inference, and why GPU lifespan misconceptions are misleading | All-In Podcast
CoreWeave's AI cloud platform expansion challenges misconceptions about GPU depreciation and market value.
Key takeaways
- The shift from crypto mining to CGI rendering, batch computing, and AI demonstrates the adaptability of GPU technology.
- Early GPU purchases were the "tuition" that taught CoreWeave how to run and scale the business.
- Scaling laws drive AI model development; computing decommoditizes at scale.
- Inference is the monetization of the investment in artificial intelligence and a key economic driver of AI.
- CoreWeave positions itself as the "tip of the spear" in bringing Nvidia's new architectures into commercial production at scale.
- Intrator argues the GPU depreciation debate is driven by traders with short positions, not market realities.
- Customers sign five-to-six-year contracts, evidence that GPUs retain value well beyond short-term depreciation claims.
- Growing competition in AI infrastructure signals a healthy, profitable market with genuine demand.
- Innovative financing structures ("the box") are crucial for managing cash flow on large compute contracts.
Guest intro
Michael Intrator is co-founder, Chairman, President, and Chief Executive Officer of CoreWeave, Inc., a specialized cloud infrastructure company powering demanding AI workloads. Previously, he co-founded and served as CEO of Hudson Ridge Asset Management, a natural gas hedge fund, and as Principal Portfolio Manager at Natsource Asset Management, where he invested in global environmental markets and energy products. Under his leadership, CoreWeave has scaled into one of the world’s fastest-growing AI cloud platforms, partnering with Nvidia, OpenAI, and Microsoft.
The versatility of GPU technology
- "We immediately moved from crypto to CGI rendering, and we built projects that would allow folks that were trying to animate and render images… and then we moved to batch computing and started to look at medical research and different ways of using the compute to be able to drive science." — Michael Intrator
- The move out of crypto was a response to market volatility and a deliberate search for new use cases.
- The evolution of GPU applications, from rendering to medical research, reflects changing market demands and the technology's reach across industries.
Strategic investments and scaling
- "I kinda feel like buying those initial GPUs was the tuition we paid to learn how to run this business." — Michael Intrator
- Early GPU investments were less about immediate returns than about learning how to operate at scale; that knowledge became the foundation of the business.
- "What became very clear to us very, very early on was that the scaling laws were going to drive… computing decommoditizes at scale." — Michael Intrator
- Scaling laws, not commodity economics, govern AI compute: at sufficient scale, computing stops being a commodity.
- How effectively a provider scales shapes how transformative AI models are developed and deployed.
Monetization and AI infrastructure
- "I always think of inference as the monetization of the investment in artificial intelligence. So when we see our compute being used to stand up the massive scale of inference that's hitting our compute every day…" — Michael Intrator
- Inference is where AI investment turns into revenue; the volume of daily inference traffic is a direct signal of AI's economic traction.
- "We are the tip of the spear in bringing the new architecture out of Nvidia into commercial production at scale." — Michael Intrator
- Being first into commercial production with each new Nvidia architecture is central to CoreWeave's position in AI infrastructure.
The GPU depreciation debate
- "My take on the GPU depreciation debate is that it's nonsense. It's a debate that is being brought to the forefront by some traders that have a short position in the stock, and they're trying to talk down…" — Michael Intrator
- Intrator argues the depreciation debate is driven by short sellers' narratives rather than by how GPUs are actually used.
- "Our clients come in to us and they buy compute for five years, for six years; our average contract is five years…" — Michael Intrator
- Customers contract compute for five to six years, with a five-year average: direct market evidence that GPUs hold value well beyond aggressive depreciation schedules.
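The arithmetic behind the debate is simple to sketch. Below is a minimal illustration, using invented numbers (not CoreWeave's figures), of how the assumed useful life changes the annual depreciation charge on the same fleet under straight-line accounting:

```python
# Hypothetical illustration: straight-line annual depreciation for the same
# GPU fleet under two assumed useful lives. A longer assumed life spreads
# the same capital cost over more revenue-generating years.

FLEET_COST = 10_000_000  # hypothetical fleet purchase price, USD
RESIDUAL = 0             # assume no salvage value, for simplicity

def annual_depreciation(cost: float, salvage: float, useful_life_years: float) -> float:
    """Straight-line depreciation: equal expense in each year of useful life."""
    return (cost - salvage) / useful_life_years

# The bears' assumption: obsolete in ~2 years.
short_life = annual_depreciation(FLEET_COST, RESIDUAL, 2)
# Roughly the contract-backed view: ~6 years of commercial life.
contract_life = annual_depreciation(FLEET_COST, RESIDUAL, 6)

print(f"2-year life: ${short_life:,.0f}/year")    # $5,000,000/year
print(f"6-year life: ${contract_life:,.0f}/year")  # $1,666,667/year
```

The same hardware looks three times more expensive per year under the short-life assumption, which is why the assumed lifespan, not the sticker price, is the crux of the debate.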
The lifespan of GPUs
- "The concept that a GPU is no longer relevant or commercially viable after sixteen or eighteen months or two years, that's farcical. It just doesn't make sense." — Michael Intrator
- GPUs remain commercially viable far longer than the 18-to-24-month obsolescence window critics assume.
- Older GPU generations continue to find utility across a range of workloads, so assumptions about rapid obsolescence understate the hardware's real economic life.
Demand and competition in AI infrastructure
- "The fact that we are attracting competitors means that the business is healthy, and there's a lot of people trying to deliver this service, because the need for this infrastructure…" — Michael Intrator
- Demand for AI infrastructure is attracting competitors, which Intrator reads as a sign of a healthy, profitable market.
- Competition and growth in the sector are driven by the underlying need for this infrastructure.
Innovative financing structures
- "What I do is I create something, it's not a particularly creative name, it's called 'the box'… The box governs cash flow, and it has a waterfall of cash flow that comes into it and goes out of it." — Michael Intrator
- "The box" is a financing structure that governs a contract's cash: incoming payments flow through a defined waterfall of obligations before anything flows out.
- Financing structures like this are essential for managing cash flow on large, multi-year compute contracts.
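The actual terms of CoreWeave's "box" are not public, but a cash-flow waterfall is a standard structured-finance pattern: incoming revenue is allocated to claims in strict priority order, and only the residual flows to equity. A minimal sketch of that general mechanism, with hypothetical tier names and amounts:

```python
# Conceptual sketch only: the tiers and amounts below are invented to
# illustrate a generic cash-flow waterfall, not CoreWeave's structure.

def waterfall(cash_in: float, tiers: list[tuple[str, float]]) -> dict[str, float]:
    """Allocate cash_in across (name, amount_due) tiers in priority order.

    Each tier is paid in full before the next tier receives anything;
    whatever remains after all tiers is the equity residual.
    """
    payouts = {}
    remaining = cash_in
    for name, due in tiers:
        paid = min(remaining, due)
        payouts[name] = paid
        remaining -= paid
    payouts["residual_to_equity"] = remaining
    return payouts

# Hypothetical monthly flows for one contract period:
tiers = [
    ("operating_costs", 2_000_000),  # power, data-center opex paid first
    ("debt_service", 3_000_000),     # lenders paid before equity
    ("reserve_account", 500_000),    # maintenance / replacement reserve
]
print(waterfall(7_000_000, tiers))
```

In a strong month every tier is paid and equity keeps the residual; in a weak month the shortfall hits the lowest-priority claims first, which is exactly the protection lenders financing GPU fleets look for.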