Welcome to the Tech Brew Ride Home for Tuesday, January 6, 2026. I'm Brian McCullough. Today, NVIDIA launches its next chip platform. AMD says, hey, us too. Also, did you know NVIDIA is building self-driving car platforms? Did you know Dell is bringing the XPS brand back? And did you know that the most interesting product launch I've seen so far has come from LEGO? Here's what you missed today in the world of tech. You may have noticed that your customers love webinar and video content, but if you've ever put together a webinar or a video, then you know that it can eat up a lot of your time and budget. But now, thankfully, there's a single tool that can streamline your team's video and webinar workflows: Wistia. Wistia can scale your content output with AI-powered tools that help you create, edit, and repurpose videos and webinars quickly. And speaking of webinars, you can host engaging, easy-to-set-up webinars in Wistia, too, complete with built-in analytics. With Wistia, you don't have to pay for multiple video tools, hop between platforms, or constantly re-upload files. Create, edit, collaborate, and publish all in one place. Head to wistia.com slash brew to learn more. That's W-I-S-T-I-A dot com slash brew. With Wistia, you can expect less work and more plays. Well, NVIDIA comms did not rate me as cool enough to attend the keynote in person, but yesterday here in Vegas, Jensen Huang and NVIDIA launched the Vera Rubin platform, saying it will offer dramatic reductions in inference and training costs compared to Blackwell across six new chips. Good to see somebody is still treating CES as the platform for launching big things. Quoting The Verge: NVIDIA is kicking off 2026 with the early launch of its new Vera Rubin computing platform following a record-breaking year for the Rubin GPU's predecessor, Blackwell, fueled by the AI boom, or bubble.
During a press briefing ahead of today's keynote, Dion Harris, NVIDIA's Senior Director of HPC and AI Infrastructure Solutions, described Vera Rubin as six chips that make one AI supercomputer. Those six chips are the Vera CPU, the Rubin GPU, the sixth-generation NVLink switch, the ConnectX-9 NIC, the BlueField-4 DPU, and the Spectrum-X 102.4T CPO Ethernet switch. The platform will support third-generation confidential computing and, according to NVIDIA, will be the first rack-scale trusted computing platform. NVIDIA claims the Rubin GPU is capable of delivering five times as much AI training compute as Blackwell. The Vera Rubin architecture as a whole can train a large mixture-of-experts AI model in the same amount of time as Blackwell while using a quarter of the GPUs and at one-seventh the token cost. The Rubin launch was originally expected for late this year. Its arrival today comes just a couple of months after NVIDIA reported record-high data center revenue, up 66% over the prior year. That growth was driven by demand for Blackwell and Blackwell Ultra GPUs, which have set a high bar for Rubin's success and served as a bellwether for the AI bubble. Products and services running on Rubin will be available from NVIDIA's partners starting in the second half of 2026, end quote. Jensen also said yesterday that the Vera Rubin chips are in full production already, and that Rubin can train some LLMs with roughly one-fourth the chips Blackwell needs. Oh yeah, says AMD. Yesterday they teased their next-generation, CDNA 6-based MI500 AI chips, built on a 2-nanometer node and claiming 1,000x performance gains over predecessors. So yeah, take that, NVIDIA. Quoting Reuters: Advanced Micro Devices CEO Lisa Su showed off a number of the company's AI chips on Monday at the CES trade show in Las Vegas, including its advanced MI455 AI processors, which are components in the data center server racks that the company is selling to firms like ChatGPT maker OpenAI.
Su also unveiled the MI440X, a version of the MI400 series chip designed for on-premise use at businesses. The so-called enterprise version is designed to fit into infrastructure that is not specifically designed for AI clusters. The MI440X is a version of an earlier chip that the US plans to use in a supercomputer. AMD is one of NVIDIA's strongest rivals but has struggled to have as much success. In October, AMD signed a deal with OpenAI that, in addition to the financial upside, was a major vote of confidence in AMD's AI chips and software. But it is unlikely to dent NVIDIA's dominance, as the market leader continues to sell every AI chip it can make, analysts said. At the Monday event, OpenAI president Greg Brockman joined Su on stage and said chip advancements were critical to OpenAI's vast computing needs. Looking to the future needs of companies like OpenAI, Su previewed the MI500 and said it offered 1,000 times the performance of an older version of the processor. The company said the chips would launch in 2027, end quote. But back to NVIDIA for another second. They have their hand in basically everything now, right? Including robotics and such. But did you know that they've built a bunch of autonomous driving platforms too? The Verge says Tesla should be worried. Quote: The vehicle is using NVIDIA's new point-to-point Level 2, or L2, driver-assist system that is getting ready to roll out to more automakers in 2026. This is the chipmaker's big bet on driving automation, one it thinks can help grow its tiny automotive business into something more substantial and more profitable. Think of it as NVIDIA's answer to Tesla's Full Self-Driving. For roughly 40 minutes, we navigate a typically chaotic day in San Francisco, passing delivery trucks, cyclists, pedestrians, and even the occasional Waymo robotaxi. The Mercedes, under guidance from NVIDIA's AI-powered system as well as its own built-in cameras and radar, handles itself confidently.
Traffic signals, four-way stops, double-parked cars, and even the occasional unprotected left. At one point, it makes a wide right turn to avoid a truck that's blocking an intersection, but not before allowing a few slowly moving pedestrians to cross in front. Tesla fans would likely scoff at NVIDIA's demonstration, arguing that Full Self-Driving is orders of magnitude more capable. NVIDIA hasn't been working on this problem as long as Elon Musk's company, but what they showed me absolutely would go toe-to-toe with FSD under the most complex circumstances. And thanks to the redundancy provided by Mercedes' radar, some could argue it's safer and more robust than the camera-only FSD. The invitation to test out NVIDIA's new system came as a bit of a surprise. After all, the company isn't exactly known as a self-driving leader. And while NVIDIA has long supplied major automakers with chips and software for driver-assist systems, its automotive business is still relatively tiny compared to the billions it rakes in on AI. Its third-quarter revenues were $51.2 billion, but its automotive division only made $592 million, or 1.2% of the total haul. That could change soon as NVIDIA seeks to challenge Tesla and Waymo in the race to Level 4 autonomy: cars that can fully drive themselves under specific conditions. NVIDIA has invested billions of dollars over more than a decade to build a full-stack solution, says Xinzhou Wu, the head of the company's automotive division. This includes system-on-a-chip hardware along with operating systems, software, and silicon. And Wu says that NVIDIA is keeping safety at the forefront, claiming to be one of the few companies that meets high automotive safety requirements at both the silicon and the software levels. That includes the company's Drive AGX system-on-a-chip, similar to Tesla's Full Self-Driving chip or Mobileye's EyeQ.
The SoC runs on the safety-certified Drive OS operating system built on the Blackwell GPU architecture, which is capable of delivering 1,000 trillion operations per second of high-performance compute, the company says. Jensen always says the mission for me and my team is to really make everything that moves autonomous, Wu says. Wu outlines a roadmap in which NVIDIA will release Level 2 highway and urban driving capabilities, including automated lane changes and stop sign and traffic signal recognition, in the first half of 2026. This includes an L2++ system in which the vehicle will be able to navigate point-to-point autonomously under driver supervision. In the second half of the year, urban capabilities will expand to include autonomous parking. And by the end of the year, NVIDIA's L2++ system will encompass the entirety of the United States, Wu said. For L2 and L3 vehicles, NVIDIA plans on using its Orin-based Drive AGX SoC. For fully autonomous L4 vehicles, the company will transition to the new Thor generation. Software redundancy becomes critical at this level, so the architecture will use two electronic control units: a main ECU and a separate redundant ECU. A small-scale Level 4 trial similar to Waymo's robotaxis is planned for 2026, followed by partner-based robotaxi deployments in 2027, Wu says. And by 2028, NVIDIA predicts its self-driving tech will be in personally owned autonomous vehicles. Also in 2028, NVIDIA plans on supplying systems that can enable Level 3 highway driving, in which drivers can take their hands off the wheel and eyes off the road under certain conditions. Safety experts are highly skeptical about L3 systems, by the way. What makes this particularly notable is how quickly progress has been made: Tesla took roughly eight years to enable urban driving with FSD, whereas NVIDIA is expected to do the same within about a year. No other passenger car system besides Tesla's has achieved this, he boasts.
We're coming fast, he says, as the Mercedes slows itself down at another intersection. I'd say we're very close to FSD, end quote. Maybe Jensen isn't just blowing smoke when he says he envisions AI going into everything, and thus NVIDIA chips going into everything as well; maybe into every last car, at least. It's the holidays, which means you're probably trying to figure out what to get the people in your life who live in back-to-back meetings. This isn't some sci-fi concept. It's PLAUD, P-L-A-U-D. It snaps onto the back of your phone and records phone calls, meetings, and conversations. This isn't just note-taking, though. It can summarize meetings, generate to-do lists, draft emails, extract insights, analyze perspectives, and help you make better decisions, all with full contextual awareness across your past conversations and meetings. Black Friday is coming, and PLAUD is giving Tech Brew Ride Home listeners 20% off. Search P-L-A-U-D on Google or Amazon and get 20% off. I always try to report on Dell and Lenovo news for you when pertinent, because I figure a lot of you use those brands as your daily driver computers. So here's some news: Dell has revived the XPS brand with new 14- and 16-inch XPS laptops, offering new designs, Intel Core Ultra Series 3 chips, and tandem OLED screens, but no dedicated GPUs. Quoting Gizmodo: At last year's CES, Dell made the eyebrow-raising decision to axe all its legacy laptop brand names and instead opt for Apple-like conventions. Instead of XPS, we were forced to comprehend the differences between a Dell, a Dell Pro, a Dell Premium, and a Dell Pro Max. Now Dell is admitting it made a mistake. Whether or not you're happy with your Dell Pro Premium Plus Max Crunchwrap Supreme, the XPS line is back and now includes 14- and 16-inch models sporting up to an Intel Core Ultra X7 or X9 chip for more GPU capabilities. The revised XPS also includes the brand name stenciled on the laptop lid. Otherwise, the notebook is getting back to basics.
The 14-inch model starts at 3 pounds in weight and is barely more than half an inch thick, plus it offers a 2.8K OLED screen as an option. The new versions of the XPS laptops no longer have the glowing touch function row keys of recent editions, which proved terrible for accessibility. Dell has maintained the seamless trackpad and squared keyboard keys that help make the XPS laptops visually unique. If you're not a big fan of keyboards with no space between keys, you probably won't enjoy the XPS's big reunion tour. Whether or not this is Dell playing 5D chess and creating demand for XPS by taking it away, the next Dell XPS 14 starts at $2,050, while the 16-inch model will demand $2,200, starting January 6th. If you think that's high, that's because Dell revised its price points up from $1,650 and $1,850, respectively. Just as Dell hinted to Gizmodo back in December, it's taking targeted price adjustments in response to the ongoing chip shortage. A new XPS 13 will come later this year, end quote. Let's close today out by hitting some actual CES stuff. Chinese robot vacuum maker Roborock unveiled the Saros Rover, a concept robot vacuum with two wheeled legs that allow it to climb steps. Robotics is dominating CES, by the way; more on that in a second. Quoting Bloomberg: It's the first robot vacuum cleaner with two wheeled legs, according to the company, which is formally known as Beijing Roborock Technology. Those legs can be raised and lowered independently of each other, the firm said in a statement Tuesday, allowing it to climb steps and navigate other uneven surfaces while making sudden stops and small turns along the way. Roborock was a surprise hit at last year's CES when it unveiled a different robotic vacuum, the Saros Z70, which had a mechanical arm that could pick up stray socks.
While the company dazzled onlookers at the show with a tightly choreographed demo, the device was met with a lukewarm reaction from tech reviewers when it went on sale in the US a few months later. Aside from the high price, a common complaint was that the Z70 could only recognize a small handful of items, such as tissues, paper, and slippers, but not, say, kids' toys or pet toys. Following that ill-fated launch, Roborock is taking a different approach with the two-legged Rover, which does not have a confirmed launch date, according to the company. The Rover navigates using a combination of artificial intelligence, several motion sensors, and 3D spatial information. In a demonstration for media ahead of Tuesday's announcement, it successfully climbed several steps, rolled down a ramp, and pulled off a small jump, a maneuver it might use to go downstairs or bypass certain obstacles, a company spokesperson told Bloomberg. It was not immediately clear from the carefully staged presentation what happens if the Rover falls, and whether it can reorient itself without the help of a human. In the event of an accident, the robot will try to get up by itself, the spokesperson said, end quote. Now, one keynote I did attend yesterday was LEGO's. Yes, LEGO, because listen to this. LEGO has unveiled what it calls the most significant change to its iconic building system in nearly 50 years. It's called the Smart Brick: a fully functional computer that fits inside a traditional 2x4 LEGO brick. This new component is at the heart of LEGO's upcoming Smart Play platform, which aims to blend the classic tactile experience of LEGO with dynamic, interactive responses, all without screens. The Smart Brick uses built-in sensors, a speaker, and wireless technology to bring physical creations to life, essentially. It responds to special NFC-enabled smart tags embedded in tiles and minifigures, detects motion and gestures, and can communicate with other Smart Bricks via a Bluetooth mesh network.
This lets sets create effects like lightsaber hums, roaring engines, blaster sounds, or even music when characters are positioned in particular ways. The bricks are wirelessly rechargeable and designed to hold their charge even after long periods of inactivity, a big shift from past LEGO tech my kids have had, which relied on batteries you basically had to replace all the time. Unlike previous electronic LEGO products such as LEGO Mario (see my previous complaints about reliability there), the Smart Brick doesn't use a camera or AI, and its onboard microphone isn't for recording; it's just another sensor that reacts to simple sounds like claps or blowing. The first Smart Brick sets will launch in March, starting with three LEGO Star Wars models that incorporate these interactive elements. They're slightly smaller than traditional minifigure-scale Star Wars ships, in part due to the added cost of the technology. LEGO says Smart Play is just the beginning of a broader vision to make brick play even more imaginative, social, and responsive, potentially extending into future themes beyond Star Wars. And more on this from The Verge, quote: In a 30-minute interview, Julia Goldin, the LEGO Group's top executive in charge of product and marketing, hints again and again that Smart Play, in which bricks have light, sound, and the ability to detect movement, isn't an experiment but rather a tremendous opportunity for the future, and that it isn't just for kids. Goldin presided over the LEGO Group's massively successful expansion into sets for adults, including $50 flower bouquets and $850 Millennium Falcons, and she says she has no doubt in my mind these bricks will eventually make their way into adult sets as well. And when they come to adult sets, she hints that the Smart Play initiative might bring a different meaning to them. She says the company sees this as an optional part of the future, something that LEGO fans can engage with or not.
The smart bricks and tiles will be a multi-purpose tool in a LEGO builder's tool chest. Today, a bunch of LEGO hot dog pieces can be turned into flowers when you attach some petals. Tomorrow, a medical scanner smart tile can turn into an alarm when you attach it to your LEGO city's police jail and a minifigure tries to escape. Goldin says there may be all sorts of intriguing combinations in the future as LEGO produces more interactive tiles and minifigures, too. She asked me to imagine what happens when smart LEGO Star Wars minifigures meet smart LEGO Marvel minifigures, a crossover event, perhaps. LEGO partner Disney controls both universes, so it seems quite plausible, end quote. All right, I was really running late today. Too many things to do. I am finally heading out to the convention floor, so check my X account, @brianmcc, because I'll just post a whole bunch of pictures of all the cool things I see today, especially the robots. I'm going to find the robots first. I would cross-post to Bluesky as well, but that's probably too much work. So, @brianmcc on Twitter. Talk to you tomorrow.