Upcoming Events
SF & North Bay
Tue, June 11th: 🧠 GenAI Collective 🧠 Marin AI Jun Co-Working
Wed, June 12th: 🧠 GenAI Collective x Light Dao ☀️ NeuroAI - Accelerating Radical Shifts for Leaders & Investors
Tue, June 25th: 🧠 GenAI Collective x OctoAI 🐙 All About Fine-Tuning LLMs
Thu, June 27th: 🧠 GenAI Collective 🧠 Demo Spectacular 🚀
Thu, July 11th: 🧠 GenAI Collective 🧠 The Diversity AI Forum
🗓️ Hungry for even more AI events? Check out SF IRL, MLOps SF, or Cerebral Valley’s spreadsheet!
NYC
Tue, June 25th: GenAI Collective NYC x Fractal Tech: AI Lightning Talks ⚡
Tue, July 9th: GenAI Collective NYC Presents: SWE-Agent & the Future of Software Engineering
Wed, July 17th: GenAI Collective Panel Discussion @ The Inside Summit
The GenAI Collective is thrilled to announce that we will be hosting an exclusive panel discussion at the Inside Summit on July 17th, in partnership with The Room Podcast. Join us for an insightful conversation featuring distinguished guests: Dynamo AI CEO Vaikkunth Mugunthan, Tavus CEO Hassaan Raza, and Rilla Chief AI Officer Michael Castellanos. Together, we’ll explore the current state of Generative AI and dig into what it takes to build a sustainable, differentiated business in this rapidly evolving field. Don’t miss this chance to hear directly from industry leaders. Learn more here.
If you didn’t get a chance to listen to our most recent episode of the Collective Intelligence Community Podcast, go listen to our very own Thomas Joshi and Chappy Asel’s interview with Ofer Ronen, CEO of Tomato.ai. In this episode, they discuss the challenges of taking 0-to-1 companies all the way to a successful exit, drawing on Ofer’s experience as a 3x founder, and how to identify attractive market opportunities in the AI ecosystem.
An Update on the AI Chip Race!
As of this post, Nvidia’s market cap is hovering around $3.0T, a >8x increase since the start of 2023, pushing it past Big Tech stalwarts Google and Amazon to become the third most valuable company in the world (it even briefly surpassed #2 Apple on Wednesday afternoon). Why? GPUs once known primarily for gaming have become the backbone of AI systems globally. If you’re reading this article, you already know this intimately: Nvidia’s growth has become a proxy for the growth of AI. The chip maker just reported 3x year-over-year sales growth for the third straight quarter and is the undisputed category leader with more than 80% market share. Whether you’re training models, running inference, or simply using AI applications, you’re probably contributing to Nvidia’s top line. Nvidia’s historic rise proves the size of the opportunity (Bloomberg projects Generative AI alone will grow from $40B in 2022 to $1.3T over the next 10 years), but that success has also highlighted the industry’s dependence on a single provider. Nvidia would love to control 100% of the market, but the ecosystem is fighting back: Big Tech incumbents, rival chip makers, and emerging startups are beginning to challenge Nvidia’s dominance and capture a share of this AI revolution.
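For a sense of scale, it can help to translate Bloomberg’s projection into an annual growth rate. A minimal back-of-the-envelope sketch (the figures come from the paragraph above; the helper function name is ours):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by growing from `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

# Bloomberg's Generative AI projection: $40B (2022) growing to $1.3T over 10 years
growth = cagr(40e9, 1.3e12, 10)
print(f"Implied CAGR: {growth:.1%}")  # roughly 42% per year, every year, for a decade
```

Sustaining ~42% compound annual growth for ten years is extraordinary for any market, which is exactly why so many players are piling in.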
Big Tech fights back!
Big Tech is in an arms race to become the premier cloud computing platform for AI workloads. In Q1, the largest cloud providers, Google, Amazon, and Microsoft, spent $40B on capital expenditures, a >2x increase quarter-over-quarter. Even at this outsized spend, the performance gains their data centers get from Nvidia’s newest processors, like the H200 and Blackwell superchips, may be well worth the investment: Nvidia claimed on its last earnings call that “for every $1 spent on NVIDIA AI infrastructure, cloud providers have an opportunity to earn $5 in GPU instant hosting revenue over 4 years.” Still, the ROI for ramping up chip design and production in-house is clear as AI workloads become ubiquitous across the entire software landscape, and these incumbents (and others) have already invested substantial resources to begin rolling out their own custom silicon. They’ve also formed deep partnerships with leading foundation model providers to ensure this layer isn’t single-threaded to one provider.
Amazon
Amazon has made significant strides in AI chip development, releasing its Inferentia chip in 2019 and Trainium in 2022 and continuing to build more performant versions. The company’s recent $4B investment in Anthropic, as well as its commitment to supplying its own custom silicon, underscores its push to create viable alternatives for customers. By leveraging its own chips, Amazon aims to optimize performance for its cloud services and customer applications while reducing costs and dependency on an external provider.
Google
Google has been at the forefront of AI chip innovation, using its own Tensor Processing Units (TPUs) since 2015. In May, Google announced Trillium, the sixth generation of the TPU; earlier generations were used to train Gemini and Imagen. These chips remain integral to Google’s AI infrastructure, powering products like Gemini (formerly Bard) and numerous cloud services. Google’s investments in AI startups such as Cohere and Anthropic are also strategic, ensuring these companies can utilize Google’s TPUs, thereby expanding its ecosystem and reducing dependency on Nvidia.
Microsoft
Microsoft’s foray into AI chips includes the development of its Maia AI accelerators and Cobalt CPUs, which are designed to support its AI products. Microsoft’s substantial investment in OpenAI ($13B) aligns with this strategy: integrating OpenAI’s models with its own silicon to enhance performance and efficiency. Microsoft has also announced a fabrication partnership with Intel to accelerate the production timeline (currently slated for 2025).
Meta
Meta has announced plans to develop AI chips tailored to its foundation model development and AI applications. Although the chips are still in development, Meta says it will begin deploying them in its own data centers later this year. The initiative is another strategic move to gain better control over AI infrastructure costs and reduce reliance on Nvidia.
What about the other chip leaders?
AMD
Advanced Micro Devices (AMD) has positioned itself as a formidable competitor to Nvidia. AMD’s MI300 GPUs are touted as strong alternatives for AI workloads, particularly inference, and the recently announced MI325X accelerators deepen the rivalry. AMD’s success in capturing market share from Intel in the PC and server markets bolsters its credibility in the AI chip arena, and the company’s new Ryzen AI 300 series extends that push into AI PCs. AMD CEO Lisa Su has projected significant revenue growth from AI chips, reflecting the market’s confidence in the company’s ability to compete.
Intel
Intel, historically trailing in the AI chip race, has made a notable entry with its Gaudi 3 AI accelerator and recently unveiled its Xeon 6 processors. These chips are designed to offer superior performance and energy efficiency compared to Nvidia’s H100 and to potentially compete with Blackwell. Intel’s strategy also includes substantial investments in building out its own chip manufacturing facilities across the US and Europe, which could provide a long-term advantage in supply-chain stability.
Emerging startups are becoming formidable challengers
Cerebras Systems
Cerebras Systems stands out among AI chip startups with its Wafer Scale Engine (WSE). The WSE-3, the largest commercial chip ever made, claims to significantly outperform Nvidia in specific tasks, offering a unique approach to AI hardware. Cerebras bundles its chips into complete systems and sells access to them as cloud services, giving enterprises supercomputer-class performance without the overhead of operating the hardware themselves.
Groq
Groq has generated significant buzz of late with its Language Processing Unit (LPU), a specialized tensor streaming processor and inference engine designed for high-performance AI workloads. The architecture focuses on minimizing latency and maximizing throughput, offering a compelling alternative to traditional GPUs for inference tasks. In the last couple of months, developers have moved AI workloads to Groq’s platform because of its highly competitive speed and performance.
Other
Countless other startups are making noise in the ecosystem, but notable players like SambaNova Systems and Graphcore have demonstrated impressive capabilities. These companies build specialized architectures optimized for particular AI workloads, aiming to capture niches that Nvidia’s more general-purpose GPUs don’t address as efficiently, which can be a real advantage for customers training their own foundation models.
Conclusion
Nvidia’s ability to maintain its lead will depend on continued innovation and ecosystem advantages. The company has committed to releasing a new AI chip architecture every year and will keep rolling out software tools that more deeply entrench its chips in the applications customers build on top of them. That said, the strategic moves by Amazon, Google, Microsoft, and Meta, combined with the relentless drive of AMD, Intel, and innovative startups, indicate that the AI chip market will be fiercely competitive. As these companies balance collaboration and competition with Nvidia, the ultimate beneficiaries will be end users, who should continue to see lower costs and higher performance for their AI applications. Go capitalism and keep building!
Events Recap
🚀 We made SF TECH HISTORY with the GenAI Collective community's inaugural Demo Extravaganza! 🚀
With 50+ early-stage AI companies applying for the chance to demo in front of and receive feedback from 250+ attendees, this was undoubtedly our most anticipated community gathering yet! 🤩
🏅 AWARD WINNERS
• 🎨 Best Design: Dovetail
• 🤖 Best Technology: Indexical
• 🏆 Best Overall: Narada
This event would not have been the incredible success it was without the active participation of ~literally everyone~ in attendance, who submitted nearly 1,000 pieces of invaluable feedback to the demoers! 🤩
This is only the beginning of what we hope becomes one of SF's most legendary monthly event series! And remember the GenAI mantra: this is the worst it's ever going to be...
Come join us for the next event, where we're running it back with an EVEN MORE outrageously ambitious Demo Spectacular!! 🚀
Value Board
Join the GenAI Collective team!
Are you passionate about AI, meticulous about details, and a wizard at organizing events? The GenAI Collective is on the lookout for an Events Leader to bring our exciting events to life! This is your chance to dive deep into the AI community, enrich your network, and showcase your skills. If you’re ambitious, organized, and ready to make a splash in the AI scene, we want to hear from you. Let’s create unforgettable events together! Reach out at stephen@genaicollective.ai 😄
About Eric Fett
Eric leads the development of the newsletter and online presence. He is currently an investor at NGP Capital where he focuses on Series A/B investments across enterprise AI, cybersecurity, and industrial technology. He’s passionate about working with early-stage visionaries on their quest to create a better future. When not working, you can find him on a soccer field or at a sushi bar! 🍣