Elon Musk Podcast

Nvidia is coming for Tesla

13 min
Jan 7, 2026
Summary

Nvidia unveiled Alpamayo, an open-source AI model for self-driving cars, at CES 2026, with the first production vehicle being the 2025 Mercedes-Benz CLA launching in Q1 2026. This represents Nvidia's bid to become the 'Android of Autonomy' by offering automakers access to advanced autonomous driving technology that rivals Tesla's proprietary FSD system.

Insights
  • Nvidia's open-source approach could commoditize autonomous driving technology, removing Tesla's competitive moat in self-driving capabilities
  • The shift from proprietary to open-source autonomous driving platforms mirrors the smartphone industry's Android vs iOS competition
  • Explainable AI reasoning in autonomous vehicles addresses regulatory concerns about black-box decision making in safety-critical applications
  • Hardware redundancy using multiple sensor types (cameras, radar, LiDAR) may prove superior to Tesla's camera-only approach for safety and reliability
  • The democratization of autonomous driving technology could accelerate industry-wide adoption and reduce barriers to entry for automakers
Trends
  • Open-source autonomous driving platforms gaining traction over proprietary systems
  • Shift toward explainable AI in safety-critical applications for regulatory compliance
  • Multi-sensor fusion approaches replacing single-sensor autonomous driving systems
  • Automotive industry moving toward collaborative technology partnerships rather than in-house development
  • Accelerated timeline for Level 4 autonomous vehicle deployment by 2027-2028
  • Integration of cloud services with vehicle hardware for recurring revenue models
  • Regulatory emphasis on transparency and explainability in autonomous vehicle decision-making
  • Commoditization of advanced driver assistance systems across the automotive industry
Quotes
"Jensen Huang called this the ChatGPT moment for physical AI, the point where machines begin to understand reason and act in the real world."
Host
"Nvidia is offering five to six times more processing power than anyone else on the market."
Host
"Jensen said the goal is that someday every car and every truck will be autonomous."
Host
"This is a clear bid to become the kind of like Android of Autonomy. While Tesla continues to keep its FSD stack completely closed similar to Apple."
Host
"The technology that only a few companies had access to is beginning to be available for everybody."
Host
Full Transcript
3 Speakers
Speaker A

Oh, such a clutch off season pickup Dave.

0:00

Speaker B

I was worried we'd bring back the same team.

0:02

Speaker A

I meant those blackout motorized shades. Blinds.com.

0:04

Speaker B

Made it crazy affordable to replace our old blinds.

0:07

Speaker A

Hard to install?

0:09

Speaker B

No, it's easy. I installed these and then got some for my mom. She talked to a design consultant for free and scheduled a professional measure and install.

0:10

Speaker A

Install Hall of Fame, son.

0:17

Speaker B

They're the number one online retailer of custom window coverings in the world.

0:19

Speaker A

Blinds.com is the goat.

0:22

Speaker C

Visit blinds.com now for up to 45% off with minimum purchase plus a free professional measure.

0:23

Speaker A

Rules and restrictions apply.

0:28

Speaker C

Nvidia just released an open source AI model for self driving cars at CES 2026. Jensen Huang unveiled Alpamayo, a 10 billion parameter reasoning model that can explain why it makes every driving decision. And the first production vehicle to use it will be the 2025 Mercedes-Benz CLA, launching in the United States in the first quarter of 2026. This is a real car. You can buy it in a few months with Nvidia's full self driving stack built in. Jensen Huang called this the ChatGPT moment for physical AI, the point where machines begin to understand, reason and act in the real world. And the question is whether Nvidia just leapfrogged Tesla in the race to autonomous driving by giving every other automaker access to technology that Tesla has spent a decade developing in secret. Now, the technical specs are pretty good. Nvidia's platform runs on two Drive 4 chips delivering over 2,000 teraflops of compute power, roughly 1,000 trillion operations per second. Tesla's Hardware 4 is believed to be in the few hundred TOPS range, and Mobileye's upcoming EyeQ Ultra is rated at 176 TOPS. Nvidia is offering five to six times more processing power than anyone else on the market. The Alpamayo model weights are available on Hugging Face right now. You can check them out yourself. And the simulation framework is on GitHub. Nvidia released 1,700 hours of real driving data for anyone to use, and in this episode we're going to be talking about what the demo looked like, who's partnering with Nvidia, and why this open source approach could change everything. And we'll get right into that right after this very short break. Nvidia demonstrated the system live in San Francisco. The transportation editor from The Verge got a ride in a Mercedes-Benz CLA equipped with Nvidia's Level 2+ system: 40 minutes through city traffic, and the route included delivery trucks, cyclists, pedestrians, four-way stops, traffic lights, double-parked cars and unprotected left turns. 
The car handled all of it without disengagements, and according to this report there were no crashes and no hiccups during the entire ride. They said that what Nvidia showed would go toe to toe with Tesla's Full Self-Driving under the most complex circumstances. That's a significant claim. Now, the demo showed specific behaviors that suggest the AI is reasoning rather than just pattern matching. At one point the vehicle approached an intersection blocked by a truck. The Nvidia system first slowed to let pedestrians cross, then executed a wide maneuver around the obstacle. That kind of nuanced decision making, combining caution and assertiveness the way a human driver would, is what the Alpamayo model is designed to produce. The AI uses chain of thought reasoning, which means it generates a step by step explanation of its logic before executing each action. That internal monologue is not just for debugging; it's designed to help the system handle rare edge cases by explicitly reasoning about cause and effect. Tesla's FSD does not do this. Tesla's neural networks are black boxes that produce outputs without explanations. The sensor strategy is also different from Tesla's. Nvidia's platform supports cameras, radar, LiDAR and ultrasonics in a full 360 degree array. Tesla famously removed radar from its vehicles in 2021 and has never used LiDAR, relying entirely on cameras for Full Self-Driving. Nvidia is arguing that redundancy is necessary for safety, and the demo vehicle used a combination of cameras and radar to perceive its surroundings. Now, this approach is safer and more robust than camera-only systems, especially in poor visibility or when objects are occluded. Nvidia has partnered with sensor suppliers like Aeva, Hesai, Sony and Arbe to create a validated ecosystem of hardware that works with its software stack, and automakers can mix and match components without worrying about compatibility. 
There's no walled garden for this thing. Now, I've been digging through the analytics of the show, and I've noticed 37% of you are following this right now, and for you I'm forever grateful. Thank you so much for joining this community. The other 63% of you haven't hit the follow or subscribe button, and I'm going to tell you one thing that I think is really important. Supporting independent journalists is important in this day and age, because we used to have gatekeepers that would block everything. Independent journalists like myself, I've been doing this for, I don't know, six, seven years, covering tech, Elon Musk, SpaceX, space flight, anything like that, any sort of tech. I've been doing that for the last six, seven years, and I'm going to continue doing it for the next 10 years, and all I need from you is literally one second of your time, which is going to help the show tremendously. Hit the Subscribe or Follow button on whatever podcast platform you're listening or watching on right now. That's going to help me out tremendously, help out the show, and continue the growth of the show. We've grown so much because of you, and I can't wait to get to this next part of the journey because of you. Now, the partnership list is very impressive. Mercedes-Benz is first, launching the CLA with what they are calling MB Drive Assist Pro, capable of point to point city navigation under driver supervision, and Huang said the CLA is the safest car in the world, citing its Euro NCAP five star rating. Jaguar Land Rover has a multi-year deal to use Nvidia's platform for Level 4 autonomy in future vehicles. Lucid is integrating the full stack into upcoming models. Uber is back in the autonomous vehicle game, working with Nvidia to potentially deploy robotaxis or automated delivery vehicles. The Tier 1 suppliers are on board too. Bosch, ZF, Magna and Quanta are all building electronic control units around Nvidia's architecture. 
That means car companies can buy pre-integrated hardware from their existing suppliers and just drop in Nvidia's system without building everything from scratch. I'm an open source advocate. I love that Nvidia is doing this, and the open source approach is what makes this different from everything else in the market. Tesla's Full Self-Driving software is entirely proprietary. You can't buy it, you can't license it, they're not giving it away, you can't study it. We don't know how it works. Mobileye's technology is also proprietary. Waymo and Cruise developed full self driving stacks, but they use them exclusively for their own robotaxi fleets. Nvidia is the first company to release a production grade autonomous driving AI as open source. This is a clear bid to become the kind of like Android of Autonomy, while Tesla continues to keep its FSD stack completely closed, similar to Apple. And the implication is that any automaker without Tesla's decade of in-house AI development can now get equivalent technology right off the shelf. It's free, you get it. All you have to do is buy the parts and build the software around it, and you're good to go. Now, if Mercedes actually ships a car in Q1 2026 that has similar capabilities to Tesla's FSD and is based on an open source system any automaker can buy, it could commoditize Level 2+ autonomous systems. Advanced driver assistance would no longer be the secret sauce of a few companies; it would be a feature any carmaker can implement with Nvidia's help. That puts pressure on Tesla, which has never faced an equally capable rival system in production vehicles. It also challenges Mobileye, whose business model depends on being the dominant ADAS supplier. Qualcomm announced new AI features for cars at CES, but its Snapdragon Ride platform is still significantly behind Nvidia in raw performance. And Nvidia laid out a roadmap that compresses the timeline to widespread autonomy. 
By mid-2026 they expect Level 2+ urban and highway driving with automated lane changes, traffic light recognition and point to point navigation. By the end of 2026 they plan to cover the entirety of the United States in terms of operational design domain and add autonomous parking. Late 2026 will bring a small scale Level 4 trial, similar to Waymo's early robotaxi pilots. And by 2027 Nvidia expects partner robotaxi deployments to begin. By 2028 they're targeting personally owned Level 4 vehicles and Level 3 hands-off freeway and highway driving in production cars. That is an aggressive schedule. It depends on regulatory approvals and the confidence of automaker partners, but the message is very clear: Nvidia is committing to rapid iteration and deployment of this open source software. Now, the business model is significant. Nvidia's automotive division currently generates about 600 million in revenue, compared to the billions from their core AI and cloud business. Now, if this works, every vehicle that uses Nvidia's platform brings recurring revenue for chips, software licenses and cloud services. Nvidia also introduced the Rubin platform, a new data center system for training autonomous driving models. That means they capture value both in the vehicle and in the cloud as fleets scale up. Jensen Huang said the goal is that someday every car and every truck will be autonomous, and Nvidia is working toward that future with the foundation laid out at CES 2026. They have made a strong case that they will make the key architecture of that future. I think it's probably the best announcement at CES 2026, as far as a regular person goes and as far as automakers go, because everybody uses a car, right? And if you have ever used Uber, you know how important it is to get your car there on time and also get you to where you're going. But also, it's not a perfect system. 
Once people are taken out of that equation and we can have robotaxis, and we have to work with those robotaxis or work alongside those robots, it's not just going to be Tesla anymore. People will be able to compete, and Nvidia is helping them. So there's a strategic frame too. Listen to this. Jensen compared this to ChatGPT, and he believes this is the moment that reasoning AI will do for robots and vehicles what large language models did for text. For us, the ability to explain decisions is not just about debugging anymore. Governments want this. They want to understand why an autonomous vehicle made a particular choice before they approve it for public roads. And by exposing that reasoning trace, Nvidia is addressing concerns about black box AI in safety critical applications. Regulators and researchers can inspect how the system arrives at each decision, and that transparency can make Nvidia-powered vehicles easier to validate and certify than systems that cannot explain themselves. Now, the challenges are absolutely real here. There's a lot of work to do. Automakers traditionally want to control their own stack, and handing over the core driving AI to Nvidia means giving up a lot of that control. But the presence of Mercedes, JLR and Lucid on board suggests a shift toward collaboration. If the big players jump on board, I mean, Mercedes, come on now. Public acceptance will require flawless safety records, though. Nvidia's focus on safety, including their Halos framework for validation and functional safety, is aimed at building trust with governments, and also at building trust with the public. They want you and me to feel comfortable in a car with Nvidia's stack. The competitive response will also shape outcomes here. If Tesla opens parts of its stack or Mobileye undercuts Nvidia on cost, the market could go wild. 
Nvidia also released Alpasim, an open source simulation framework for testing autonomous vehicles in virtual environments. And this is cool, because engineers can create realistic scenarios with configurable traffic, weather and sensor modes that run the AI through edge cases without putting anyone at risk. They don't need an actual car to do this. It's going to be training the AI on infinite amounts of situations that we could hit on the road, then running the AI through edge cases without putting any people in a vehicle. The 1,700 hours of driving data released alongside it covers diverse geographic locations and conditions, including rare and complex situations that are hard to capture in normal testing. Like just bad drivers. You see them every day. You see a bad driver, or you'll see somebody sitting at a green light on their cell phone until you have to honk at them. Those kinds of things happen all the time. But you can do this virtually, so nobody gets hurt and nobody gets mad. So by providing both a simulator and real world data sets, Nvidia is enabling what they call a self-reinforcing development loop. Developers train models on a wide range of situations, test them in simulation and continuously improve the system. And all of this, which is crazy, is public. You and I can check it out. Alpasim, A-L-P-A-S-I-M, is on GitHub. The data is on Hugging Face. We can check it all out. The barrier to entry for autonomous vehicle development just dropped significantly. Tesla has some absolute killer competition. Nvidia has planted their flag. The technology that only a few companies had access to is beginning to be available for everybody. Now, this is going to be a crowded race here. Every vehicle manufacturer has access to this. 
And why would they spend years developing their own from here on out when they can just plug in Nvidia's stuff, and then plug in some sensors, plug in their own cameras, plug in their own stacks of other things? It seems like a no brainer to me. They're going to be, not the Apple, but the Android of vehicles. They're going to make so much money. They're going to be selling their cloud services, their sensors, their chips, et cetera. This is going to be so much money for Nvidia. I can't wait to see what happens in the future, because one, I want to take a ride in this thing. I want to see how great it actually performs compared to a Tesla. And two, just think about how easy it's going to be when you need to go someplace and you don't actually have to own a car anymore. That's going to be great. That's going to be one asset that doesn't depreciate when you drive it off the lot. It's going to be the democratization of driving.

0:30