The Vergecast

The Vergecast RAM Holiday Spec-Tacular

87 min
Dec 23, 2025
Summary

The Vergecast explores RAM as a critical bottleneck in AI infrastructure, examining how three companies control 93% of global DRAM supply and why consumer RAM prices have quadrupled as data centers consume massive volumes for AI training. Dylan Patel from SemiAnalysis explains the manufacturing constraints, geopolitical implications, and when consumers can expect relief.

Insights
  • RAM has shifted from a commodity component to a strategic constraint in AI infrastructure, with data centers now consuming supply that previously served consumer markets
  • The memory industry's cyclical boom-bust pattern, combined with years of underinvestment after the 2022 PC crash, created a perfect storm where AI demand coincided with zero capacity expansion
  • New fab construction takes 2-3 years, meaning memory prices will likely remain elevated through 2027 unless the AI bubble collapses, making computer purchasing decisions dependent on macroeconomic forecasts
  • The barrier to entry for new memory manufacturers is so high (tens of billions in capital, decades of R&D, proprietary processes) that the three-company oligopoly is unlikely to be disrupted
  • 3D DRAM represents the next breakthrough needed to resume cost scaling, similar to how 3D NAND revived SSD price declines, but this technology is still years away from mass production
Trends
  • AI infrastructure spending consolidating around six hyperscalers (Google, Microsoft, Amazon, Meta, Oracle, CoreWeave), creating unprecedented demand concentration
  • Memory vendors deprioritizing consumer markets in favor of enterprise/data center customers due to superior margins and inelastic demand
  • Geopolitical fragmentation of semiconductor supply chains, with export controls limiting China's access to cutting-edge chips and accelerating domestic investment
  • Vertical integration of memory into system-on-chip designs, reducing modularity and making RAM less visible but more ubiquitous in consumer devices
  • High-bandwidth memory (HBM) becoming a bottleneck as AI accelerators require specialized memory with even fewer suppliers than standard DRAM
  • Wafer-level manufacturing becoming the limiting factor rather than chip design, with ASML's EUV lithography machines as the critical chokepoint
  • Data center megaprojects (5+ gigawatt facilities costing $250B+) creating unprecedented capital intensity and long-term hardware depreciation cycles
  • Memory pricing behaving like a commodity market, with spot and contract price fluctuations directly impacting device manufacturer margins
  • Supply chain transparency gaps, where even major OEMs like Dell lack guaranteed long-term memory allocation despite historical supplier relationships
Topics
  • DRAM Manufacturing Capacity and Fab Construction
  • AI Data Center Infrastructure Buildout
  • Memory Pricing Dynamics and Market Oligopoly
  • High-Bandwidth Memory (HBM) Supply Constraints
  • Semiconductor Manufacturing Equipment (ASML EUV)
  • 3D DRAM Technology Development
  • Consumer vs Enterprise Memory Allocation
  • Geopolitical Semiconductor Competition with China
  • DDR4 to DDR5 Transition Economics
  • GPU and Accelerator Memory Requirements
  • Fab Construction Timeline and Capital Requirements
  • Memory Vendor Profitability and Market Consolidation
  • Supply Chain Risk Management for OEMs
  • Moore's Law and DRAM Scaling Limits
  • National Security and Semiconductor Export Controls
Companies
SK Hynix
Largest of three companies controlling 93% of global DRAM supply; prioritizing AI data center contracts over consumer...
Samsung
Second-largest DRAM vendor; contributing ~40% of supply to OpenAI infrastructure; strong in DDR5 but weaker in HBM
Micron
Third DRAM vendor; recently exited consumer business to focus on enterprise; building new fabs to expand capacity by ...
ASML
Dutch company with monopoly on EUV lithography machines critical for cutting-edge chip manufacturing; only dozens exi...
TSMC
Taiwanese foundry producing 93% of cutting-edge logic chips; competing with memory vendors for wafer production capacity
Nvidia
GPU manufacturer consuming massive amounts of HBM and DRAM for AI accelerators; locked in long-term supply contracts ...
OpenAI
AI company receiving ~40% of global memory supply from Samsung and SK Hynix for massive infrastructure buildout
Google
Hyperscaler building multi-gigawatt AI data centers requiring unprecedented volumes of memory and accelerators
Meta
Building five-gigawatt Louisiana facility costing ~$250B for AI infrastructure, driving memory demand
Microsoft
Hyperscaler investing heavily in AI infrastructure and competing for memory allocation with other cloud providers
Amazon
Hyperscaler building AI data centers and competing for constrained memory and GPU supply
Dell
PC manufacturer struggling to secure DRAM supply; CEO stated supply acquisition is primary focus for 2025
Apple
Consumer device maker affected by rising memory costs; historically absorbs component cost increases without price ad...
Intel
Historical DRAM manufacturer that exited memory market due to competitive pressures; now focuses on processors
AMD
GPU manufacturer deprioritizing consumer graphics cards to focus on AI GPU production due to margin differences
CXMT
Chinese state-backed memory company attempting to break oligopoly; missed government targets despite massive investment
Applied Materials
Semiconductor equipment supplier focused on helping memory vendors develop 3D DRAM technology
Lam Research
Semiconductor equipment supplier supporting memory manufacturing innovation and fab construction
CoreWeave
AI infrastructure provider competing with hyperscalers for GPU and memory resources
People
Dylan Patel
Founder of SemiAnalysis; AI infrastructure analyst providing detailed market forecasts and supply chain analysis
Sam Altman
OpenAI CEO receiving ~40% of global memory supply for AI infrastructure projects; referenced as major buyer
Sean Hollister
Vergecast host and tech journalist covering RAM shortage impacts on consumer devices and pricing
David Pierce
Vergecast host discussing macroeconomic implications of memory shortage on device purchasing decisions
Nilay Patel
Vergecast host exploring technical and business aspects of DRAM manufacturing and market dynamics
Quotes
"RAM is now, it's just utterly ubiquitous that way, right? Like it's just the same march: as smartphones became this gigantic, massive industry, smartphone chips went everywhere, and RAM has done the same thing"
Sean Hollister~25:00
"If I'm one of these three companies, basically, I've looked at the market and I'm saying, okay, I can sell to, like, Dell, who wants to make some laptops, and I can sell to, like, Nothing, who wants to make some smartphones, or I can sell essentially to, like, the five richest companies on earth who are all trying to build AI data centers"
David Pierce~35:00
"The most bullish person tends to go bankrupt. Right? And so that's the scary thing about this industry is if you overbuild the most, you end up going bankrupt. And that's how we've gone from 30 to three"
Dylan Patel~60:00
"If the memory cost goes up 20%, well, they're still selling this chip for $30,000 plus, right? So, you know, if it goes from $6,000 to $8,000, the elasticity there is not crazy"
Dylan Patel~75:00
"In 2027, we will have new wafer capacity coming online. And you're literally just talking about, like, the time it takes to build a building, right? Like, there's not something magic they have to do"
Dylan Patel~95:00
Full Transcript
Support for the show comes from L'Oreal Group, the global beauty leader, defining the future of beauty through science and technology. L'Oreal Group creates the beauty that moves the world. Support for today's show comes from Darktrace. Darktrace is the cybersecurity defenders deserve and the one they need to defend beyond. Darktrace is AI cybersecurity that can stop novel threats before they become breaches, across email, clouds, networks, and more. With the power to see across your entire attack surface, cyber defenders such as IT decision makers, CISOs, and cybersecurity professionals now have the ability to stop zero days before day zero. The world needs defenders. Defenders need Darktrace. Visit darktrace.com slash defenders for more information. The world moves fast. Your workday, even faster: pitching products, drafting reports, analyzing data. Microsoft 365 Copilot is your AI assistant for work, built into Word, Excel, PowerPoint, and other Microsoft 365 apps you use, helping you quickly write, analyze, create, and summarize, so you can cut through clutter and clear a path to your best work. Learn more at Microsoft.com slash M365 Copilot. Welcome to the Vergecast, the flagship podcast of the Daft Punk album Random Access Memories. And the Reddit thread I just found asking, what would be the names of the tracks on Daft Punk's Christmas album? Which I encourage everyone to go spend a lot of time reading. I'm your friend David Pierce. Nilay Patel is here. Nilay, hello. Hello. Sean Hollister, also here. Hello, Sean. I am also a friend. This is the Vergecast Holiday Spectacular. I would say, Nilay, is it fair to say this is just a joke that we made one time that has now gotten way away from us over time? Yes. To the point where I'm wearing a holiday muff. Okay. I'm glad you brought this up. What is that? So I was going to wear a Santa hat, but I couldn't find it. The last time I wore the Santa hat was last year on this show. So I don't know where it is. 
It should just be in the stream. Last year I acquired from my wife what can only be described as a holiday muff, which I am currently wearing. It's a scarf, but it's a loop. It's very warm. How many stars does it have? It's a lot. I look great. What I've come to learn is that I should be a scarf guy. You should, I actually agree with that. I look terrible in scarves, but you're kind of pulling this one off. I feel great about it. My neck is very warm because of the muff. I would like everyone, if you're listening to this: if you've ever seen a dog wearing the cone of shame, but it's not a cone, it's just a sort of inflatable donut they put around their necks, that's Nilay, but make it fashion. You know what I mean? Yeah, Becky's very fashionable. This looks great on her. I don't agree that it looks great on me, but I'm pulling it off right now, I want to say. Sean, you meanwhile managed to make this space that you're in go from not holiday at all to extremely holiday in like three minutes, which I'm very impressed by. It turns out that my wife, who's quite the Christmas decorator in every other part of this house, had lots of stuff to steal. So I stole all of it. That's great. So your room is the respite from joy and happiness and seasonal feelings, except for right now. Unless those feelings are starfives. I do want to say I was going to put some holiday art from the Samsung frame store on the Frame TV behind me, but I was worried about copyright infringement. So this is just a winter scene painted by Van Gogh, who is dead and can't sue us. That's right, Van Gogh. So what we do every year on this spectacular is we pick a spec or a technology or some piece of this world of tech that we live in and get like weirdly deep into it. And we've done, Nilay, help me remember, we've done Bluetooth in the past. We've done USB-C in the past. We've done, we did Matter last year. 
That was really fun. What else? We started with HDMI because the joke was that after the first Trump administration, we had so much politics on the show, we were going to clear the air. We were going to reset with a full hour on HDMI, to do the nerdiest possible thing that wasn't politics. And now we have to do a spectacular every year. A beloved, beloved thing that we all hold dear. It's time to do HDMI again. I think we're headed towards needing to do another HDMI episode. We're not going straight to 6G. So okay, Sean, I'm glad you brought this up. Here is a partial list of things that we could have done instead of the one that we've picked. We could have done 6G. It would have been a very short episode and it would have made all of us very angry. But that's an episode we could have done. We thought about, instead of doing Matter last year, we almost did Arm, just the idea of Arm chips, which I think was very important last year and continues to be very important. All things LLMs: you could spend a lot of time just digging into inference as a spec. We just got rid of it. That's not a spec. We could do TPUs. You want to do TPUs for an hour and a half here? He's right. He's got a spec. It's a spec, not an acronym, a spec. I am very down for less letting the LLMs play for us. That's true. We could do Wi-Fi 7. That's a spec. We could do satellite internet, which is not a spec. I'm sure there is a spec. I didn't do enough research to care, but there's a spec involved. Wi-Fi. There you go. Wi-Fi. I thought about doing the wireless power thing where you can like point a charger at yourself, and apparently it won't be horrible. That's what Sean's talking about. I proposed E Ink Kaleido. Travis shot that down fairly quickly. Travis suggested Labubu, which may or may not be a spec. We can debate that later. But the one that we decided on is RAM. We're going to spend this whole episode talking about RAM, because RAM contains within it specs. 
RAM, what is everything if not a bundle of specs? Think about that. We are all specs. But RAM, I think, Sean, my question to you is, would you have gone into 2025 expecting to spend as much time as you have covering specific sticks of RAM? Absolutely not. This stuff is commodity. It's so boring until it gets fascinating, until you find out how it's made. But like, you just take it for granted, right? It's everywhere. You don't need to think about it, I thought. Yeah, I feel like it's a thing we write about as a clause in a sentence in a phone review, about how much RAM it has. And that is most of the way that I've thought about RAM for my entire adult life. And now, all of a sudden, it is like a precious rare earth mineral in the world. And we're going to talk about why. So we're going to do this show in three parts. First, we're going to just build up some knowledge of RAM. Sean, you are going to just school us on what RAM is and how it works and how it came to be a thing that we all care deeply about. Then Travis, our producer, has some games that he has created slash vibe coded that are, I assume, going to make us all look very dumb and be deeply terrifying. And then we're going to have a conversation at the end about kind of the macroeconomic chip war of all of this. There is a big picture question about how chips get made and who's in charge and what it means for the future of technology. And we're going to get into all of that. But that is for later. And for now, Sean, we asked you to really give us the kindergarten starting education in how RAM works. Are you ready for this? Oh, boy. Can't wait. All right. So let's literally start from nothing. What is RAM and where did it come from? Okay. At its very, very basic: there are nowadays chips, either on a stick or possibly patterns etched into the top of your processor, that do this memory thing. And their job is to store charge. 
They will store a little bit of electricity, and that indicates a one; or they will not have quite that much charge in it, and that will be a zero. And these ones and zeros represent computer memory, because it is code. This is the digital zero and one that you hear about. This is where it is stored temporarily inside your computer, inside your phone, so that the rest of the computer can read it and say, what did I just compute? Now I can act on that previous thing I computed and do more with it. And so computers didn't always have this chip kind of memory. They had all kinds of other kinds of memory before this. There were switches you would have to flip, or rods or gears that would be turned to a certain position in a mechanical computer. That would be: this is my storage, my memory of what I just did a moment ago, so that I can do more computing on top of that. But all of these forms were known as memory, and the random access of that memory is the thing that the computer is doing all the time, randomly looking at various parts of it to find out what it stored temporarily in its memory that it can look back at. The kind of RAM we talk about most of the time is DRAM, Dynamic Random Access Memory. This is called dynamic because the charges that are stored in it need to constantly be refreshed. The memory doesn't remember what's inside of it when it doesn't have power; it needs to constantly have power refreshing those charges. The other kind of memory we talk about is storage. We have SSDs, which keep stuff in there longer without power. Not forever, but longer. Most of the time right now, though, the crunch that we're talking about is mostly about DRAM: memory that's going to give you that little short-term store for the computer to act on again. I feel like it's important to say here that historically DRAM has been much faster than the other kinds of storage on the computer, right? 
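The "dynamic" behavior Sean describes can be sketched in a few lines: a DRAM bit is a tiny capacitor whose charge leaks away, so the controller must periodically refresh every cell or the data decays into zeros. This is a toy model with made-up decay and threshold numbers, purely to illustrate the idea, not how any real memory controller is implemented.

```python
# Toy model of a DRAM cell: a bit is a capacitor whose charge leaks over
# time, so the controller must periodically "refresh" (re-read and
# re-write) it, or the stored 1 eventually reads back as a 0.
# All constants below are illustrative, not real DRAM parameters.

class DramCell:
    FULL = 1.0        # charge level representing a stored 1
    THRESHOLD = 0.5   # below this, the sense amplifier reads a 0

    def __init__(self, bit: int):
        self.charge = self.FULL if bit else 0.0

    def leak(self, steps: int = 1) -> None:
        # Charge decays a little each time step (capacitor leakage).
        for _ in range(steps):
            self.charge *= 0.9

    def read(self) -> int:
        return 1 if self.charge >= self.THRESHOLD else 0

    def refresh(self) -> None:
        # Re-write whatever the cell currently reads, at full strength.
        self.charge = self.FULL if self.read() else 0.0

cell = DramCell(1)
cell.leak(steps=3)   # a little leakage: still reads as 1
assert cell.read() == 1
cell.refresh()       # refresh restores the full charge
assert cell.charge == DramCell.FULL
cell.leak(steps=10)  # too long without a refresh: the 1 decays into a 0
assert cell.read() == 0
```

The refresh loop is exactly why DRAM forgets everything when the power goes out: no power, no refresh, and every capacitor drains below the threshold.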
Back in the day, the hard drive was really slow, quite linear actually in how it stored information, and you would want to have a lot of RAM because it was so much faster to get information out. But that's sort of equalized, right? Like modern SSDs and RAM, they're closer than they were, but they're still not very close, right? Yeah, memory is much, much faster than storage still. But we're at the point now where in most computers you'll have your random access memory that's doing the very quick calculations back and forth with the processor, and computer companies want to keep that memory as close to the CPU, the actual instruction parts of your computer, as possible. So they're baking it onto the board, they're creating special rings of logic where something can pop out of the CPU and into memory and back into a CPU thread so quickly that they're getting even more performance out of it than they would if they had it on a separate stick further away from the CPU. So they prefer that higher bandwidth, that faster access, if they can get it. But yes, solid state drives now are quick enough that you could have a cache on there. Some people will talk about how, back in the day, you could speed up your whole computer by using your DRAM as temporary storage for your programs. You'd just throw your game in there or something like that so you could load it even faster than from storage. Storage is fast enough now that we don't think about that much anymore. Okay, so RAM. Actually, I did, I would say, like four minutes of research, and it turns out RAM has been around much longer than I realized. Like the way I've always thought of RAM is basically: you have a hard drive that is sort of spinning and trying to find things in a linear fashion, in a row, and actually storage doesn't quite work that way. So everything is all over the place, so your drive has to constantly spin around and look for stuff, and that takes a while. 
And RAM can just look for it all at the same time, and that is much faster. This is basically a thing that I learned listening to music on knockoff iPods in the early 2000s. That's the extent of my RAM education. So this is a technology that's been around since like the middle of the 20th century, right? Where did this actually come from in the first place? I don't remember who quite did it first, but when you're looking at early computing, people will talk about the Babbage engines, and they would have rods that would be in a certain position, and that would tell you, you know, they would store the previous calculation. And then some of the early computers would have punch cards or tape that would have punches through it, and you would read that. There would be various switches that you would flip, or gears turned to a certain position. At one point they had vacuum tubes, where, you know, you've got this idea that a vacuum tube, when it's on, can store a little bit of energy in there. Not to interrupt you, but every single one of those things sounds so cool and futuristic. Yeah. That sounds sick. Let's do that. That's probably fair. And then it's like, no, these were all the bad ideas we had before we did this. These are great. The trouble is the amounts of energy required to do it, right? At one point they had CRTs, where you would store the bits on the screen of the CRT as a charge. You know, you've got your electron gun firing your charges at the screen. That lights up your phosphor as an image that you see on a CRT TV. Well, why not store the memory there too? And then there was core memory at some point, where you'd have wires kind of all wrapped in patterns, and inside of each of those places you could store the charge for longer than you could in some of the previous technologies. You can store many of them in a lattice. 
And I've gone to the Computer History Museum in Mountain View. You can see hand-woven memory as this thing. It looks like a bunch of cords knotted together. It's fascinating stuff. I still would bring all of that back. I'm looking at my desk now being like, my computer needs like several more vacuum tubes, and something very cool will be happening. As far as I can tell, RAM is one of those things where, there are a bunch of things in sort of the history of computing where a thing was created and everybody kind of looked at it and went, oh yeah, this is better. And it just immediately took off and won forever. And I feel like we're in year like 60 of RAM, which suggests that everybody just decided this was the right idea. The thing to know now, I think, is that they are computer chips. The RAM, the RAM bits on your memory stick that goes in your computer. It is created basically the same way. It is the same material. And so when we're talking about the shortages that we're seeing now, these shortages may intersect with other places. Because what you're doing fundamentally is you've got a silicon wafer. You're taking a single big crystal of silicon. You are cutting it, using like a diamond wire or a diamond-edged rotary saw, into incredibly thin layers. And these are being patterned with chemicals and ultraviolet light and then etched with acid or plasma into these little chips. And it's the same process you use for processors, for computer processors. The same kinds of wafers we're talking about, about 13 inches wide, are used for computer chips and they're used for memory chips. And so we've digitized all this and made this much more efficient and small, but now it's kind of the same thing. So the gap between all of these different things has actually shrunk a lot over time. 
So when we talk about storage and processors and RAM, the distinction, it seems like, matters a lot less than it used to. I hadn't really thought about that until just now, but it does. It is all becoming sort of a single system, in a way that it was not. Like, I still think of RAM as a giant stick of thing that I shove into my desktop computer, and that is, in almost every case, not what RAM is anymore. That's also not where the volume is, right? Like phones, there are more phones than laptops, right? There are more phones than cloud servers, and RAM is in all the phones. It's soldered into the phones. Those aren't sticks. You got a lot of feelings about that. I do think we should all put DIMM sticks into our phones, that'd be kind of cool. But you know, if you listen to the Apple earnings call or Dell's earnings call, all the analysts are always talking about fluctuations in DRAM pricing. Because there's only so many vendors of RAM, which I'm sure we'll get into, it is kind of a commodity. It moves up and down like oil. And if the price of RAM is high, Apple's margins are lower, because, you know, the price of the iPhone does not fluctuate with the price of RAM. So this is in the background of all these businesses: oh, RAM is a commodity, and the price moves up and down in a way that, Sean, I don't think the price of M4 chips moves up and down, right? In the same way that RAM is a commodity and the price moves up and down. Yeah. Now, if you want Intel or Apple or like a Samsung, if you want TSMC to create you a chip, you talk about, do they have the capacity at their fabs to make as many chips as you want? And if they do, you pay, you know, this much. And if they don't, you try to pay more so you bump somebody else off of the fab, or get the capacity that's coming later in the year, whatever it is. 
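The point about Apple's margins can be made concrete with some arithmetic. This is a toy sketch with entirely made-up numbers (the hypothetical $1,000 phone and its component costs are illustrative, not Apple's actual bill of materials): because the retail price stays fixed, a swing in the RAM price comes straight out of the device maker's gross margin.

```python
# Illustrative sketch, made-up numbers: a fixed retail price means
# component-cost swings land directly on the device maker's margin.

def gross_margin(price: float, bom_cost: float) -> float:
    """Gross margin as a fraction of the selling price."""
    return (price - bom_cost) / price

PHONE_PRICE = 1000.0  # retail price does not move with RAM prices
OTHER_BOM = 500.0     # all non-RAM component costs, held constant

cheap_ram, pricey_ram = 40.0, 160.0  # RAM quadruples, as in 2025

m_before = gross_margin(PHONE_PRICE, OTHER_BOM + cheap_ram)   # 0.46
m_after = gross_margin(PHONE_PRICE, OTHER_BOM + pricey_ram)   # 0.34

# The entire $120 RAM increase is absorbed as 12 points of margin.
assert round(m_before - m_after, 2) == 0.12
```

That asymmetry, a commodity input under a sticky retail price, is why analysts on those earnings calls obsess over DRAM spot prices.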
With DRAM, there are so many wafers that are going to these three companies, basically three companies that are creating this stuff, and they, you know, don't want to create too much of it, but they know generally how many PCs and how many phones are going to be sold in a given year, and they just produce a lot of it and everybody buys it off them. Yeah. RAM is one of those things, it is like so deeply unsexy, but it is increasingly everywhere. And right, I feel like we've been on this march, and we've talked about this a lot on the show, that like everything has become a computer, right? That my dishwasher has a computer chip that probably resembles a smartphone chip in it. And so does basically every other piece of electronics in our world. Has RAM scaled the same way? Like, can you sort of spin around in your chair and probably find a hundred things with RAM inside at the moment? I can tell you this Frame TV doesn't have enough RAM in it. I can tell you that right now. Yeah. I can also tell you that. I did just watch you go through the interface to select this photo. Something I didn't know until we were doing this report the other day is that solid state drives have RAM in them too. I figured you didn't need very much of that, but you know, there is some cache in there. There's some cache in there for the solid state drive so it can run faster. RAM is now, it's just utterly ubiquitous that way, right? Like it's just the same march: as smartphones became this gigantic, massive industry, smartphone chips went everywhere, and RAM has done the same thing, to the point where it is baked into many of those chips. Yeah. Your phone doesn't necessarily have an application processor soldered here and RAM soldered there. It has the RAM packaged into the processor in many cases. That's why they call them SOCs, right? 
That's the thing chip people, the chip influencers, get very mad about when you call them CPUs. They're SOCs, or systems on a chip, and part of that system is the memory, is the RAM. So your dishwasher actually has an SOC. By the way, I've been watching you through your dishwasher for weeks now. You're loading it all wrong. My wife would tell you that. But your dishwasher probably has an embedded SOC that is a complete system containing RAM that, you know, whoever made it just bought off the shelf and shoved in there. Yeah. So, okay, so Sean, tell us about 2025. What the hell happened this year that I have to think about RAM in my day to day life now? I don't know exactly how it began, but I don't remember anybody forecasting that all of a sudden AI data centers would be buying up so much of the world's RAM that the rest of us would be scrounging for leftovers. I do remember them saying that DDR4, an outgoing standard, would be a little bit pricier because they were trying to end-of-life it. It's like, okay, yeah, there's less of that to go around. So if you've got an older computer, you might pay a little more to get your older RAM. Double Data Rate 4 Synchronous Dynamic Random Access Memory, DDR4 SDRAM, good Lord. Okay, so they're trying to phase out DDR4, presumably to phase in DDR5. They're trying, yeah, and we got DDR5. It's everywhere. If you're buying a new computer, you're going to buy that instead of DDR4. You stick it in your graphics card, well, you stick a graphics version of it in there. There's even a six for graphics cards, and a seven. But moving on from that, yes, the general standard is DDR5. And it goes in all the computers, and it goes in the kind of computer you'd buy for yourself, and it goes in the laptop where you don't even think about it, and it also goes in the AI data center. And the AI data centers want as much of it as they can possibly get, just like they want as many GPUs as they can possibly get. Can I actually ask about that? 
So I heard there's a data center RAM crunch, and I thought, oh, it's because Nvidia is selling a lot of GPUs to data centers. Don't the GPUs use a different RAM? Don't they use VRAM? They have their video RAM. It's on a slightly different standard, but fundamentally, all the memory is coming from just a few companies who have a limited supply: wafers that they are cutting out of ingots of silicon and doing their chemical treatment processes on. And so everybody who's making these computer chips is going to be fighting for some of that supply, and the RAM makers in particular, these three companies, control 93% of the world's supply of RAM. Who are those three companies? This is Micron, SK Hynix, and Samsung. Oh, sure. The first three companies I think of when I think of tech. SK Hynix runs the world. You shut your mouth. We're in trouble with SK Hynix. SK Hynix is the biggest of the three. And Samsung also. These three companies: 93%. Of the other companies that there are, there is only one other company that has even 5%. The rest have one percent or below. Wow. Okay. And so if you want RAM, you're going to one of these three. One of them just said, we're done with consumer business. We're just going to focus on enterprise now. We're going to focus on data centers, because that's where the money is. And of the other two, Samsung and SK Hynix, they may have contributed as much as 40% of the world's entire supply of memory to a single project at OpenAI going on right now to create a massive set of AI infrastructure there. Wow. To be clear, it's not that the AI data centers need more regular RAM for the CPUs or more VRAM for the GPUs. It's just like RAM in general is needed for these formats. It's all of it. I mean, if you're building your computer inside the data center, it's going to need a GPU, and that GPU is going to have memory. It's going to need a CPU with normal memory to talk to, and so it'll have some normal memory in it. 
It's going to need SSDs, which are going to have a bit of memory on those. It's going to need all kinds of networking, fancy networking switches for their link or whatever. There's memory inside those. All of these things are consuming memory, and so much of it is going there now that companies are saying, if this is the way things are going, we don't need to worry about consumers so much. We're going to have plenty of money from where the profits lie. And we saw this already happen to some degree in GPUs. I mean, we saw AMD say, we're not going to make a high-end graphics card for gamers this year, because we want to put our resources toward AI GPUs instead. We saw Nvidia not say that, but then kind of do that anyway. Although they do have plenty of consumer GPUs, it's been harder to get them. They've been pricier. They know that they can charge gamers more, because if gamers don't buy as much, that's not a problem for them. They're making $30,000 off of one of the AI GPUs. They don't need to make $1,000 off of the gamer. So if I'm one of these three companies, basically, I've looked at the market and I'm saying, okay, I can sell to like Dell, who wants to make some laptops, and I can sell to like Nothing, who wants to make some smartphones, or I can sell essentially to like the five richest companies on earth, who are all trying to build AI data centers and want to buy from me in unbelievable volume. And like any reasonable shareholder-value-maximizing company is going to pick the thing where OpenAI just writes you the largest check you've ever seen in your life. Antonio ran into Dell's COO the other day and got a whole, like, long, mostly empty spiel about how Dell has all these relationships with these suppliers and they go back years, and it all kind of boiled down to: Dell needs to kind of tap these companies on the shoulder and be like, remember me, please give me some memory. 
We have these deals and we would like them to be fulfilled. He said that getting supply is their focus this year. Like, they haven't already negotiated the supply? Apparently not to the degree that they need. So how wild has this gotten? You saw and wrote about some, I would say, sincerely nuts outcomes of this RAM shortage. What have you seen out there in the world? I think the one that's gotten the most reach is that some stores are now selling it like lobster, in that they're selling it like the catch of the day, market price. You have to walk into the store and ask, hey, how much is RAM today? Or you might say, you know, I'd like to order the RAM. Can you tell me how much that'll be on my bill? It's like a fancy restaurant, and I say, I'll have the RAM, thank you. It's at the point where several of us at The Verge, myself, Richard Lawler, we've gone back and looked at how much we paid for RAM three months ago, six months ago, a year ago. And we've seen that the number is quadruple what it was, for the same sticks. You know, I see Reddit stories about somebody talking about how they're willing to trade their pricey GPU for a couple sticks of RAM. And a year ago that would have been unthinkable. The GPU was the item that everybody wanted. You'd pay a thousand, fifteen hundred, two thousand dollars for this GPU. That's how hard it was to get a good GPU. And now they're saying, oh yeah, maybe I should get fifteen hundred dollars' worth of sticks of RAM instead for my computer, so my computer has enough memory to open my hundred Chrome tabs. It feels like we keep doing this. It wasn't that long ago that all the Bitcoin miners made it impossible to get GPUs, and then all the AI companies made it impossible to get GPUs. And right before that, we had a chip shortage because everybody was in the pandemic buying, like, Pelotons and stuff. And so, like, how is this just a thing that keeps happening?
Like, at some point, these South Korean companies should just make more RAM. Why is this so complicated? Some of it is that it takes a long time to spin up new facilities like these. There are only so many extreme ultraviolet lithography machines in the world. We're talking about dozens, not hundreds, of those. Is that the thing that's made by that one, like, secretly powerful Dutch company that no one knows about, but it just kind of runs the world? Yep. If you want to make the cutting-edge chips, this is the only company that will sell you the machine. You can't buy them if you're in China, because they think that's a security risk. They have to fly parts all over the world for this. It takes six months just to take the parts for this machine and construct them into the machine. There are only dozens of them in the world. It's wild. That's for the cutting-edge stuff. Not everything you're making is going to require the cutting-edge machine. But still. I just have to say, I just googled, how much does DRAM manufacturing need EUV, extreme ultraviolet lithography? And there are answers, but Google's AI Overview is like, yes, it absolutely needs EUV lithography. Give it to me now. Like, this robot is like, I need it. I need it right this second. It's very funny. It's on a curve up. It is not yet dominant. Like, Micron only just adopted EUV. What? And Micron is building out more fabs. But to Sean's point, the fabs take a long time to come online. There are so few companies that control this. There's ASML for your EUV machines, the only company that's doing that. There are just the three memory manufacturers. When you come to the other side of the chip equation, TSMC, the big Taiwanese fab company, I think the New York Times said that single company was producing something like 93% of the world's cutting-edge chips at the time that New York Times report came out, like a couple of years ago.
So the power is in the hands of very few. And those companies don't want to risk creating oversupply. In some cases, it can be fundamentally detrimental to their bottom line. They could lose money if they create too much oversupply. Micron in particular is making massive profits now, but it wasn't always. There was a point one or two years ago when it was losing a bit of money. For the other companies, they're like, well, we'd like to maintain long-term profits, thank you very much. We're going to feast on this as long as we can and not create more than we think we can sell and profit from. So they're happy to see their average selling prices go up. They're looking at Nvidia becoming a $4 trillion company and going, well, that went well. It'll be us also. Yeah. Yeah. So that makes me feel like this thing is going to get worse and not better. I assume there's not some, like, next technology that is waiting to be implemented here. Like, the world runs on RAM in a real way. Are we stuck in this sort of chaotic shortage for the foreseeable future? Or, I guess, if the AI bubble pops and all these companies go out of business, things change. Short of that, how does this get better? One thought is that the AI bubble pops. Another thought is that maybe the world economy crashes because the robot wants to bang you. But then it'll be easier to get a laptop, though. That's true, but at least Kevin and Risa will find true love. Shout out to Kevin. Shout out to Kevin. My boy. Happy holidays, Kevin. Another thought is that maybe they will come up with a version of manufacturing RAM that works better for the AI companies, and they'll switch to that. One theory going around right now is that today we make little chips on wafers, and we cut out those little chips and we turn them into CPUs and memory modules and so on.
But maybe in the future we will make giant whole-wafer chips, and because those whole-wafer chips will be fundamentally much more efficient at moving data within them, maybe higher speeds from memory and so on, maybe the AI companies will switch to those. I don't know where the bottleneck is, but if the bottleneck is not in slicing up wafers but actually in printing out chips, maybe that will make the AI companies more happy. I don't know. Wait, the idea is that, let's just say, Sam Altman is going to buy 18-inch round memory chips? That's basically it, yeah. That's a perfect OpenAI idea. We will get so good at manufacturing memory that the yield will be perfect on 18-inch round silicon wafers. Sure. Sure. This is how we get back to vacuum tubes. Do you know how cool that would look sitting under a desk, though? It's just, like, the RAM guys are going up the stairs with their huge wafers. Like, instead of DIMMs, you're, like, feeding them in like giant CDs. Hell yeah. Sure. I love it. Then there's packaging stuff. There's, like, 3D packaging. Like, all the same chip manufacturing stuff is in RAM now, right? There are lots of thoughts about packaging things differently, but again, I don't know where the bottleneck is. If there are only so many wafers to go around, then we're kind of in trouble if the AI continues to eat all the wafers, you know? If it's somewhere else, if we just package this differently, we just manufacture it differently, and if that manufacturing can be spun out into a different supply chain that doesn't impact all the consumer stuff, that might be nice. Then again, I don't know. Right now the AI companies are talking a lot about HBM memory, high bandwidth memory. That's what HBM stands for.
And so if you're going to use this separate kind of HBM to get more bandwidth in your AI data center, we're thinking that's going to steal RAM production away from the consumer stuff, from the DDR. It takes up three times as much space, according to one of the analysts that Emma talked to for her RAM report. And so if that steals away, that's not good. But maybe we all start using HBM. There was an AMD set of graphics cards in 2015 (I used one in a PC) that used HBM memory. And it was great. It made for a very compact, efficient graphics card. Perhaps all of that will trickle down, and after AI isn't eating up so much of things, maybe everything gets better in consumer as a result. I don't know. Let's hope. All right. So the bubble pops, the AI buildout works out, or we all transition to a new standard that's unproven. These are good outcomes. Perfect. We're doing great. And then you can finally buy a PS5. Like, that's how it goes. Sure. Yeah, that sounds great. All right. We're going to come back to this thread of the story. But first we're going to take a break, and then we're going to come back and we're going to play some RAM games, which I am told are a thing that exists and is going to happen. We'll be right back. Support for this show comes from LinkedIn Ads. When you're running your own business, every decision can feel like make or break, and you can't afford to waste a penny. So if your B2B marketing is still falling short, it may be because you're preaching to the wrong choir, whether you know it or not. If you want to reach the right professionals, use LinkedIn Ads. LinkedIn has grown to a network of over 1 billion professionals and 130 million decision makers, according to their data. That's where they stand apart from other ad buys. You can target your buyers by job title, industry, company, role, seniority, skills, company revenue. Also, you can stop wasting budget on the wrong audience.
That's why LinkedIn Ads boast one of the highest B2B returns on ad spend of all online ad networks. Seriously, all of them. So get your ads in front of the right people and make your B2B strategy work. You can spend $250 on your first campaign on LinkedIn Ads and get a free $250 credit for the next one. Just go to LinkedIn.com slash vergecast. That's LinkedIn.com slash vergecast. Terms and conditions apply. Support for the show comes from L'Oréal Groupe, using the latest advancements in science and tech to create personalized beauty solutions for all. The global beauty leader recently introduced two breakthrough technologies that bring the power of light to hair care and skin care: a light straightening multi-styler and the new LED face mask, both of which were recognized as CES 2026 Innovation Award honorees. Learn more about both technologies on loreal.com. L'Oréal Groupe: create the beauty that moves the world. Support for the show comes from Anthropic. Some problems are just that: complex. And when you're trying to run a business, complex problems often mean there's an even bigger issue just under the surface. Thankfully, there's Claude. Claude is the AI for minds that don't stop at good enough. It's the collaborator that actually understands your entire workflow and thinks with you, whether you're debugging code at midnight or strategizing your next business move. Claude extends your thinking to tackle the problems that matter.
Plus, Claude's research capabilities go deeper than basic web search. It can offer comprehensive, reliable analysis with proper citations, turning hours of research into minutes. Ready to tackle bigger problems? Get started with Claude today at claude.ai slash vergecast. That's claude.ai slash vergecast. And check out Claude Pro, which includes access to all of the features mentioned in today's episode. Claude.ai slash vergecast. All right, we're back. It's time for some RAM games here on the holiday spec-tacular. What is this? What is happening to me? RAM games, Nilay. Get ready. This is what we do here. All right, Travis Larchuk, our producer, is here. Hi, Travis. Hello, everybody. Travis, you are in charge, and you have been let loose with several AI tools that make me nervous. You're in charge. What do you have for us? I have two games for you today. The first game. Nilay, earlier you said this was a spec-tacular, not an acronym-tacular. No, no. You're wrong. Because for this first game, as we all know, there's nothing tech people love more than jargony acronyms and initialisms. And as you know, RAM stands for random access memory. In this game, I will give you an initialism. For one point, tell me: does this initialism relate to RAM in any way, or is it some random other thing that I stuck in there? I don't like the way Sean is laughing. This suggests he's already won. I have to be really honest about this. I don't like this confident cackle that's coming out of my boy Sean Hollister. Look at all this nodding. I gave you all everything you need to win. Yeah, this is a question of whether Nilay and I were listening. Were you listening? This is the initialisms game. All right. Do we buzz in? How does this work? We will go in order. That's one point. Now, if the initialism turns out to indeed be related to RAM, there are bonus points available: tell me what it stands for.
If you get any of the words correct, I will give you one point; two points if you get it all correct. All right. Ready to rock? Yes. Okay. Let us randomly select the order of our game. Oh my God. And Sean is up first. Yes. I vibe-coded this, in case anyone's watching this interface. I cannot wait for this thing to start hallucinating. Your first initialism, or maybe acronym, is C-A-S: CAS. This is RAM. It is indeed RAM, for one point. Come on. Do you know what it stands for? I... shit. What's the initialism? It has to do with latency, but why can't I remember any of the words in it? No, I don't know what CAS stands for. All right. CAS stands for column address strobe. Oh my God. Column address strobe latency. This is from when RAM was made of, like, weird rods, right? So this column address strobe is a vacuum tube. Yeah. You just pull the rods as fast as you can. You have to strobe the rods, boy. That's what he does, he just strobes those rods. Sean, that is good for one point. Nilay. Oh, sorry. What are you doing? Go ahead. I have good news for you. Your initialism is DDR. Oh, that's great. It stands for Dance Dance Revolution. No, this is RAM. Double data rate. Nailed it. Two points. It is RAM, it does all the data rates. Three points total. All three points. David, you're just playing catch-up to Nilay here, I'd say. That's right. Your initialism or acronym is UNR. UNR. I definitely paid Travis under the table. I'm going to say not RAM. You are correct. It stands for the University of Nevada, Reno. Yes. To be fair, I think you should have let me guess, to get two more points. All right, Sean, we are back to you. Ready. You have U-M-A. That's a RAM thing. Correct, for one point. Universal... It's also UMass Amherst, just to say. It's all universities. All the not-RAMs are universities. I don't know the M. Crap. Universal something access, I think. How do you not know the M? Just wait, wait, wait. Think about what RAM stands for. Do you want to take a guess at what the M stands for? You can work it out.
OK, let's try memory, I don't know. OK, you are correct about the M. The M is memory. It's unified memory architecture. Unified. We were close. Wait, does he get two points, because he got two words? No, it's one point. It's not one point per word. You get two points total. OK, yeah. One point if you get at least one word, two extra points if you get all the words. This is when the CPU and GPU share the same pool of RAM. OK, Nilay. D-O-T-A, perhaps Dota. I'm going to say not RAM. That's a game that uses a lot of RAM. Yeah, I'm just saying, you have lads out there. Correct: Dota 2. That is Defense of the Ancients, the esports game. David. All right, hit me. D-I-M-M, DIMM. It's a RAM thing. Correct. This has come up already. Do you know what it stands for? This has come up already? That seems bad. Data... David. David. Memory master. Data intra-machine memory. You got memory. I get it. It is dual inline memory module. Wait, to be fair, technically I didn't get memory. Yeah, do you have to get one of the words right in the right spot? I had it on the wrong M. I think I don't get that point. The wrong M? All right, I'm willing to say. What an upstanding guy. No, because otherwise you just get to say memory every time and you'd probably get a point for that. No, I understand how to score points, David. I'm kind of willing to say... Sean, you were making a face like you would have gotten this one. I was. I got this one. Yeah, 100%. There was a big moment when I worked in the computer store in high school where Apple switched from SIMMs to DIMMs, and it was, like, all the talk of Macworld magazine in 1998. That's why I remember this. It's just PTSD. You were pretty cool in high school. I was the shit. All right. At the end of the second round, our scores are: Sean three, Nilay four, David two. Nilay with the easiest three points anyone's ever gotten. I would have gotten DIMM. You would have. That's fair. All right. This is the last round. Sean. C-T-U. Not a RAM thing. That is correct.
That is the counterterrorist unit from the TV series 24. All right. Nilay. E-C-C. That is RAM. It is. Error correction circuit? So close. Error correcting code. I knew it. I knew the first two words. That was good. All right. This allows RAM to catch errors, and it's generally used for mission-critical systems and not in consumer PCs. All right. David. Mathematically, there's no way for you to win, but let's find out what happens. David's is H-B-M. Oh, that hurts. High bandwidth memory. Hey. Correct. Thank you, Sean Hollister. Thank you, Sam Altman, for your giant wafers. We are playing the games first next time. The explanation comes after. Somehow we end the game with Sean with four points, David with five, and Nilay with six. Congratulations. That feels right. Sean, clearly the least educated about RAM. I'll take it on the chin. Okay. Your next game is on screen now. As we all know, RAM is the most valuable resource in the world right now. It is Mad Max times, but for RAM. So we are going to play a white elephant gift swap where the object is to end the game with the most RAM. Here is how it will work. I have nine secret gifts that are wrapped in Verge wallpaper. Each is a gadget containing an amount of RAM. I will not reveal how much RAM until the game is over. Before anyone writes in: I know that not all RAM is created equal. But for the purposes of this game, we only care about quantity of RAM and not quality of RAM. So, DDR4 versus 5? Exactly. It doesn't matter. It's all worth the same. Because I heard we're phasing out DDR4. I don't know if you're taking that into account here. I heard that, and I have not taken it into account. So, on your turn, you will choose. This is like a white elephant, Yankee swap. You will choose to either open a new gift, a random gift, or you can steal a gift from another player. If a gift gets stolen, the victim of the steal can then choose to either open a new gift or steal a gift from another player.
Each gift can only be stolen one time per round. And if a new gift is opened, the round is over, and we go to the next person to start the next round. Once everyone has three gifts, we will play one more round, because the person who opens the first gift has no choice during this game. In that round, your choice will be either to swap one of your gifts with somebody else's gift or to end the game. Okay? Oh, oooh. All right, so we will randomize our order to find out who will open the first gift. What if I don't want RAM? What if I want the other stuff? Well, that is up to you. Are you looking to win? In the one with my wife's family, everybody wants the lottery tickets. That's the way it goes. All right, Nilay, you get the first pick, so it's just one through nine, and that's all you get. Well, not number 7... I'm going 8. Gift 8 is... a Boox Palma. Perfect. Congratulations, Nilay. Nilay, you win. It's an e-reader that contains RAM. It does contain RAM. David, you can either steal that from Nilay or you can open a new gift. This is maybe the one advantage I have over both of you, as I know off the top of my head how much RAM the Boox Palma has. I'm going to open gift number four. Gift number four: you have a shiny new Cybertruck. I am also aware of the amount of RAM in that device. We are talking about the amount of RAM in the Cybertruck's entertainment system, specifically. Oh my God. So, Sean, you can either steal Nilay's Boox Palma or David's Cybertruck, or you can open a new gift. I don't want to end with a Cybertruck. Also, Verge wallpaper number seven is delightful, so I'm taking that one. I love this purpley reddish-blue wallpaper. And you got a Meta Ray-Ban Display. Oh. All right. Nilay, you can steal any of the gifts that are out there, or you can open a new gift. This is our second go-round here, round four. I'm debating whether or not the Cybertruck has more RAM than the Boox Palma. It's a real toss-up.
I'm going to open number three. All right, you're going to open gift number three. No stealing so far. That is an iPhone 13 mini. The favorite phone of many a Verge staffer. It was so good. Okay. David, you can steal any of the gifts that have been opened or open a new gift. So, the dilemma here is: did Travis pick a bunch of things that don't have much RAM, or have we just opened the things that don't have much RAM? I was doing the same calculation. I am going to make a potentially reckless decision, and I'm going to steal the Boox Palma. Yes. All right. Which I think has a non-zero chance of having the most RAM of anything on the board. David stole the Boox Palma from Nilay, which means, Nilay, you can now make a steal yourself, or you can open a new gift. I'm taking the Cybertruck. Straight-up stealing. Wow. Nilay steals the Cybertruck from David. David, it is your turn to either steal or open a new one. Give me number two. All right. Gift number two is... Nilay, do you agree that the Cybertruck either has one or a thousand gigs of RAM and nothing in between? Yeah, but I was doing the same calculation as you, and I was like, I know that the Boox Palma doesn't have a lot of RAM. I'm going to roll the dice on the Cybertruck. Also, it is only fitting that I end up with the giant wiper. Agreed. A Switch 2... it is. This is a good get. Yeah, a Switch 2. I feel good. I'm psyched. Are you going to keep the Switch 2, though? I'm not sure you are. It is Sean's turn. I am, in fact, taking the Nintendo Switch 2. The Nintendo Switch 2 has been stolen. Is there a maximum number of steals? Each item can be stolen once per round. So, like, I can't take it back. Right. But I could, in theory, take the Cybertruck. I'm going to open number five. Number five is... the Humane AI Pin. This is so cruel, and also the fact that I'm going to end up with the Boox Palma and the Humane AI Pin feels like I deserve this. You totally don't.
My 2025 suggests that I deserve this. Wait, so if there's not a maximum number of steals, the endgame here gets very complicated. Yes. So, yeah, each item can only be stolen one time in a round. But after that round, things are open to be stolen again. But then there's a swap at the end. There is a swap round at the end. Just to recap for the listeners: Nilay has an iPhone 13 mini and a Cybertruck. David has a Boox Palma and a Humane AI Pin. Regrets. Sean has a Meta Ray-Ban Display and a Nintendo Switch 2. And it is Nilay's turn. Can I just pause, by the way, to say that so far we have accidentally kind of all three picked our personalities? I was going to say, is this the gift guide right out of the gate, guys? Is this stuff we like? All right. The only correct move right now is to pick a new one. So I'm taking number six. I think that's right. Number six... the game theory is afoot. A Pixel 9a. You did pick stuff with not a lot of RAM. One of these is gonna have... this game is gonna be made or broken on, like, 16 megabytes of RAM. I can see it coming. There's gonna be, like, a Mac Studio at the end of this that just wins, and none of the rest of it's gonna matter. I'm holding out for the Framework Desktop, that's why. I do have two tiny little phones. It's adorable. And it's so... I didn't say it was adorable. Do I lose two? It's his two cell phones and his Cybertruck. Yo, it's our boy. Red flags all over the place. Is he gonna have Bitcoin? Give me the Switch 2. All right, David has stolen the Switch 2 from Sean. Sean, you can steal or open a new gift. But I can't steal the Switch 2 back. So... I think I'm going with... Have you heard of the soundtrack? No. Everyone loves it. He's opening a new gift, and the gift is... a Galaxy Z Fold 7. Now we're talking. That feels like it has a lot of RAM. I know exactly how much. Sean, it is also your turn to start the ninth round. There's one gift left. I'm taking the Switch 2 back.
All right, Sean takes back the Switch 2 from David. I can't take it back, because it's the same round, right? Right, but you're going to get to go next. Got it, I see. All right, give me number one. Let's do this. All right, David is opening number one, and number one is... an Apple Vision Pro. Oh, how much? It's so... poor David. So... wait, okay, serious question. For the two of you looking at this: do you have confidence about which one of these nine things has the most RAM in it? No. I think there's a chance it's the Cybertruck. I also think it's possible that I'm just dead wrong. But okay, I have the Vision... so great. I have a Boox Palma and an AI Pin and a Vision Pro. This sucks. Yeah, I'm sorry. This is how it's going to go: we'll come to your house. I want to try out that Apple Vision Pro. Travis, you can come over. Everybody else is on their own. David alone with the dead glowing eyes. It is a RAM-starved headset. All right, this is the 10th and final round. Nilay gets to start this round, because he went first and didn't have a choice in the beginning. On your turn, you can either swap one of your gifts with one of anybody else's gifts or just end the game. So, Nilay, it is your choice. All right, I'm going to swap the 13 mini for the Switch 2. I feel good about that decision. Okay, Nilay is stealing the Switch 2 from Sean and giving Sean the iPhone 13 mini. Sean, it is now your turn. You can swap or end the game. You have the Meta Ray-Ban Display, the Galaxy Z Fold 7, and the iPhone 13 mini. I am going to take the Vision Pro for the 13 mini. Okay, giving David the 13 mini. Finally, David has one useful device. David, same choice: swap or end the game. End my pain now. There's at least one move on that board that I would make. There's two, but I'm ending the game. I'm dancing with the ones who brought me. Let's do this. David, you amaze me. Guess what's going to happen when the Humane Pin turns out to have 32 gigs of RAM. Let's go.
David hates this game. All right, we are ending the game with Nilay having the Nintendo Switch 2, a Cybertruck, and a Pixel 9a. Party. David: Boox Palma, Humane AI Pin, iPhone 13 mini. Spoiler alert: he's going to lose the game. And Sean: Meta Ray-Ban Display, Galaxy Z Fold 7, and an Apple Vision Pro. All right. We all have a phone. We all have a phone. You can all call each other. This is great. Let's find out how much RAM. Ring ring, David. How's that Humane? It's okay. I'm looking at you on my laser projector. David, how much RAM does the Boox Palma have? I believe it's six gigs. It is six. Wow. The Humane AI Pin has four. I was going to guess four. I'm going to guess the 13 mini also has six. Four. Oh, right. Total of 14. Tough day for Dave. All right, Nilay, we're going to go to you next. Nintendo Switch 2. Does anyone know how much? My guess was 16. That's what was in my head. It is 12. Cybertruck. It's either zero or 1,000, per David. 64? It has 8. God damn. It has more inches of wiper than RAM. And the Pixel 9a has eight gigabytes of RAM. Actually, I forgot about this: Nilay's pink phone and his truck have the same. All of those things are, like, influencer guesses. That's what I'm doing. All right, Nilay has vaulted into the lead with 28 gigs of RAM. Sean. I've got a good feeling about that Ray-Ban Display. Does anyone know? Four. Not even. Two. Rough start. Big win for the AI Pin. The Galaxy Z Fold 7 has 12. And this is the base Z Fold 7: 12 gigabytes of RAM. All right, Sean is at 14. Nilay has 28. The Apple Vision Pro has 16 gigabytes of RAM, giving Sean the win with 30 gigs of RAM. Wait, David, you would have won if you had held on to two things. I think you would have won, right? Sean took the Vision Pro from me. I wasn't given that option. Yeah. I got you. Damn. Well played, Sean. I also really respect that you had both the most and the least. You're a full-spectrum RAM guy.
I just... my bet was that Apple always starves everything for RAM unless you pay an additional $800. So I was down on the Vision Pro. Yeah, but it's $3,500. I think we paid the additional $800 there. Fair enough. My assumption was that Samsung overstuffed the phone with RAM. Wait, can you buy a Z Fold 7 with additional RAM? You can. The highest-end model, I believe, has 16. That is ridiculous. You should not be able to change your amount of RAM. So, I have a Palm Pilot here. Anyone want to guess how much RAM is in here? 256 megabytes. It's a SIMM. That's a stick. It has two megabytes. Two megabytes of RAM. Incredible. That's wonderful. Back when a megabyte was the size of your thumb. Those were the days. All right, Travis, this was delightful. Thank you. Congratulations to Sean. Congratulations to Nilay. I lost, you know, magnanimously, as the host, just so everyone's very clear on that. We need to take one more break, and then we're going to come back and get back into talking about the chip shortage. We'll be right back. Hey, Kara Swisher here. I want to let you know that Vox Media is returning to South by Southwest in Austin for live tapings of your favorite podcasts. Join us from March 13th through the 15th for live tapings of Today, Explained, Tuffy Talks, Prof G Markets, and of course your two favorite podcasts, Pivot and On with Kara Swisher. The stage will also feature sessions from Brené Brown and Adam Grant, Marques Brownlee, Keith Lee, Vivian Tu, and Robin Arzón. It's all part of the Vox Media Podcast Stage at South by Southwest, presented by Odoo. Visit voxmedia.com slash sxsw to pre-register and get your special discount on your innovation badge. That's voxmedia.com slash sxsw to register. Really, you should register. We sell out, and we hope to see you there. This week on Version History, our chat show about the most interesting and important products in the history of technology, we're talking about the hottest toy from 1998.
That's right: of course, I mean the Furby. The little thing that sat on your desk and didn't have an off button and didn't speak English and annoyed everyone you knew, but you loved it to pieces anyway. It turns out there is a fascinating technology and even AI story behind what happened with Furby and why it took off. That's the story this week on Version History, wherever you get podcasts. All right, we're back. Nilay has gone. Sean is still here. I rearranged during the break, if you're watching. Please clap. We have replaced Nilay with somebody much better. Dylan Patel is here from SemiAnalysis. Dylan, welcome. Thank you for having me. Real quick, and I assume most people know it, since the brand penetration of SemiAnalysis is right up there with Coca-Cola and Walmart these days: what is SemiAnalysis? What do you do? Yeah, so we're a firm that does AI infrastructure research, consulting, et cetera. I started it five years ago. It was just a blog and me consulting on the side after I quit my job, and it's just blown up and up and up. There's the blog, as well as a company of 50 people from all up and down the stack: people who have worked on the equipment that makes chips, all the way up to data centers, people who've worked on models and things like that. We do a lot of consulting, research, et cetera. I want to start with a question that is sort of about the timing of launching your company, but is also just about this market in general. What I'm wondering is: is this just what the chip market is like? Did you pick just sort of the beginning of the most bonkers time in chip history to start doing this? Or is this just how chips are? Absolutely. Today is the most exciting time ever, the ability for people to make the biggest change ever. I think it's the craziest time ever, but chips have absolutely just been bonkers. Parts of it are annoyingly slow and parts of it are not, but you have to think about the market for, let's just take DRAM, right?
Memory prices are soaring; people freaking hate DRAM right now, especially the general public. But what's interesting is this market has existed this way for years. If I go back a decade, when I was more of a forum warrior on Reddit and stuff, every three to four years people would oscillate between "all memory is so cheap, I'm going to put way more memory than I need in my PC," and then, "oh my god, memory prices are so high, they're colluding against us, they hate the American public, they hate the public in general." I think semiconductors are a very tough business, right? Memory specifically went from over 30 companies on the leading edge to just three, right? In a couple decades. Every boom and bust, more companies go bankrupt. The barrier to entry is so high. Despite the fact that there's an oligopoly of sorts in many industries — in fact, most of the semiconductor industry has sort of leveled out to like, hey, there's a player with 70% share, there's a player with 25% share, and there's this crappy company with 5% barely hanging on, right? And obviously things are dynamic because it's technology, so stuff does shift. But at the end of the day, the barrier to entry is so high and the dynamics of the market are so intense that people are just going bankrupt left and right. It is a bonkers industry, and I've always thought it was bonkers. You've had these booms and busts. You've had this cyclical nature of the industry, and you get smaller and fewer players, but it's never been like this before, right? I've seen low memory prices and I've seen high memory prices, but I haven't seen: there are only three players, and one of them is getting out of the consumer market. And for consumers, it's not just that they can't buy it because of high prices. It's that the supply isn't even there, because there's this huge new industry eating up all this DRAM. It's never been like this before, right?
I definitely agree. If we're just talking about memory, we have seen prices spike about this fast before. Over the last six months, it's been like a 50%, 75% increase in contract pricing; spot pricing has risen more. But even then, I think we've seen this level of price increase before. The thing that's really going to break everyone's brain is the next six months. It's going to do that again, right? It's going to double again in price, or something like that. And that's what's going to really make people go crazy. And I think the funny thing — or not even funny, the sad thing — is why we're in this position. When we talk about the boom and bust of the memory industry: in 2022, the PC boom, huge amounts of memory were being deployed because everyone was just buying a new PC, and the memory companies built a lot of capacity. And then all of a sudden — oh, wait a second, everyone has a new PC. I keep my PC for seven years, right? What do I need a new PC for, as an enterprise or as a person? And the PC market falls off a cliff. And these memory companies are now losing money every quarter, right? And so through '23, '24, '25, they were not adding new wafer production capacity. The only thing they were doing, as AI came into vogue, was converting regular memory production to HBM, right? And so this has been the longest uninterrupted period of no capacity expansion as well. So you had this boom and bust, and it was also the worst bust, right? One of the worst busts — sustained multiple years of losses, rather than just a year or six months — because the boom of PCs and COVID was so large. So you have this coalescing of, oh my god, we just haven't added any capacity. And then, on the flip side, we have the craziest boom in human history, right?
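The price trajectory described above — a roughly 50-75% rise in contract pricing over six months, then a possible doubling on top of that — compounds quickly. A minimal sketch, using a made-up starting index of 100 rather than any real contract price:

```python
# Back-of-envelope sketch of the contract-price trajectory described in the
# episode: a ~75% rise already seen, then a further doubling expected.
# The starting value of 100 is an illustrative index, not a market quote.

def compound(price, multipliers):
    """Apply a sequence of price multipliers to a starting price."""
    for m in multipliers:
        price *= m
    return price

start = 100.0                                 # hypothetical index six months ago
after_spike = compound(start, [1.75])         # ~75% increase already seen
after_double = compound(after_spike, [2.0])   # the doubling expected next

print(after_spike)   # 175.0
print(after_double)  # 350.0 -> 3.5x the price of roughly a year earlier
```

The point of the sketch is just that two moves that each sound survivable stack into a 3.5x price in about a year.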
Like, you know, I think AI infrastructure is undoubtedly the biggest boom in the chip industry on a dollar basis. There's one way to look at this where these companies — there are so few of them, and they can make these massive, massive profits all of a sudden. But another way to look at this is that a couple years ago, Micron was actually losing money. I don't mean the revenue was going down. I mean it was in the red, right? Actually losing money. Yeah, for over a year or two, right? It's not even a little bit of time. They had to do layoffs. They shut down factory lines, right? All of that sort of stuff. Samsung, Hynix, and Micron — all three of them had to do this, right? So the more nuanced view is: yes, this is crazy, and the prices are soaring. At the same time, they kind of had no choice, you know? Who would have increased production right after just getting out of the red a year and a half ago? That's just silly. But even then, no one could have predicted this. So that is my question, though. Because even if there's nothing we could have done about it, even if it was inevitable, right — which I think you just made a solid case that it is — shouldn't we all have seen this coming? All of those things you just described are like three-year-old trends now. And they're not that hard, at least in retrospect, to put next to each other and be like, oh, of course we were headed toward something totally disastrous here. The thing about this industry is that it's very easy to be bullish, but the most bullish person tends to go bankrupt. Right? And so that's the scary thing about this industry: if you overbuild the most, you end up going bankrupt.
And that's how we've gone from 30 to three, right? And as far as, did anyone see this coming — of course, I could shill myself and say, look, we told all of our clients. We modeled all this production capacity. We modeled all the AI demand from a data center standpoint, from an accelerator standpoint, flowed through how many wafers are required, blah, blah, blah. Yeah, obviously we've been writing about how memory is going to go crazy for the last year and a half, roughly, right? But at the same time, I'm not someone who managed a company through the last 30 years, where 37 competitors went bankrupt. And the scars are there, right? The industry has these scars. So there's also that front of: is this a bubble? Well, I've seen 15 other bubbles, and that's what bankrupted all my competitors. Why would I go crazy this time? Was there anybody, though, who was predicting this? Who's like the guy who saw the subprime mortgage crisis coming? Was there anything like that here? Because I feel like what I'd heard was: DDR4 prices are going to rise because it's the old standard, they're not going to be producing as much of it anymore, let's get ready for DDR5. And then it felt like it was a surprise that DDR5 was in this huge crunch — that these data centers wanted so much of it. And even the companies that make their business on this — Dell. The CEO of Dell is telling us, hey, my focus now is to make sure we have supply. As if his company hadn't inked deals years ago to make sure they have enough supply for their consumer products, for their commercial products. Yeah, yeah. So at the end of the day, the memory market is kind of a funny one, in that you can say, oh, DDR4 pricing is going to go up because it's going out of production — but you have to look at it at the wafer production level, right?
There are some differences between DDR4, DDR5, and HBM — there are some process differences — but those process differences are not like, oh, it's a separate factory. It is the same factory. The time to change from one to the other is not that long, right? In the case of DDR4 to DDR5, it is just a mask change, which can take hours. Now, obviously, the production timeline to make memory wafers takes months, right? But the timeline to change over is quite quick. What happened is just that they didn't add wafer capacity, right? Now, whether that wafer capacity goes to DDR4, HBM, or DDR5 — there are some differences in capabilities, right? Samsung's not that great at HBM, but they're great at DDR5, right? There are these mixes and discontinuities in the market. But at the end of the day, the market is pretty efficient, and the margins for each don't differ that much, right? For DDR4 versus DDR5 versus HBM. So what's happened is just that AI is a sort of inelastic buyer of memory, if you will. Nvidia buys — let's take Blackwell, right? A Blackwell GPU has 192 gigabytes of memory. That GPU, Nvidia sells for over $30,000, right? But at the same time, that GPU costs them like $6,000 to make, and of that, memory is half the cost, right? But if the memory cost goes up 20%, well, they're still selling this chip for $30,000-plus, right? So if the build cost goes from $6,000 to $8,000, the elasticity there is not crazy. Who the elasticity is crazy for is the PC buyer, the mobile phone buyer. There, the phone costs, or the laptop costs, whatever it is — and if the price of memory doubles and it adds 100 bucks to the cost of the device, then all of a sudden, guess what? I might not buy it. Or it costs even more than that, right?
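The elasticity asymmetry here can be put in rough numbers. This is a back-of-envelope sketch using the approximate figures from the conversation ($30,000 sale price, $6,000 build cost, memory as half of it); the laptop memory cost is an assumed illustration, not a real bill of materials:

```python
# Rough illustration of the demand-elasticity asymmetry described above.
# All figures are the approximate round numbers from the conversation,
# not exact bills of materials.

def margin(sale_price, build_cost):
    """Gross margin as a fraction of sale price."""
    return (sale_price - build_cost) / sale_price

GPU_PRICE, GPU_COST, GPU_MEM_SHARE = 30_000, 6_000, 0.5
mem_cost = GPU_COST * GPU_MEM_SHARE           # ~$3,000 of memory per GPU
gpu_cost_after = GPU_COST + mem_cost * 0.20   # memory up 20% -> +$600 build cost

print(round(margin(GPU_PRICE, GPU_COST), 3))        # 0.8
print(round(margin(GPU_PRICE, gpu_cost_after), 3))  # 0.78 -> barely moves

# A hypothetical $1,000 laptop with ~$100 of memory feels a doubling directly:
LAPTOP_PRICE, LAPTOP_MEM = 1_000, 100
print((LAPTOP_PRICE + LAPTOP_MEM) / LAPTOP_PRICE)   # 1.1 -> a visible 10% price hike
```

A two-point margin move is invisible to the GPU buyer; a 10% sticker increase is very visible to the laptop buyer, which is why the consumer ends up being the one priced out.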
Like, that $1,000 laptop — for a little bit, there were even $1,000 laptops with 32 gigs of RAM, and it was glorious. Now it's kind of harder to find, and I bet next year it's going to be even tougher, right? And so there's that interaction in the market: who is the elastic buyer? And unfortunately, the consumer is the elastic buyer. AI infrastructure spend is going from like $10 billion to $100 billion to a trillion, right? We are scaling like crazy. And next year, if we look at the big top six hyperscalers — Google, Microsoft, Amazon, Oracle, CoreWeave... I'm missing one. Meta, right? You're looking at $500 billion of spend across them, right? On AI infrastructure. Of that, their spend on Nvidia is like $300 billion, and of Nvidia's $300 billion, there's the spend on memory after their margins — Nvidia's largest supplier is SK Hynix, a memory vendor. It's not TSMC, right? So you flow it through, and it's like, okay, actually, next year the total volume that AI is consuming is ridiculous. And as far as, hey, did anyone see this coming — there are a lot of people who were speculatively buying and things like that, because the dynamics of the market are clear, right? I don't remember which PC maker it was — whether it was Dell or Lenovo or HPE — but one of them actually, super smartly, about two quarters ago, secured a bunch of DRAM ahead of time. And there was a moment on the earnings call where their investors were like, why did you enter all these long-term supply agreements? You shouldn't do that. You should be doing just-in-time inventory, right? The Wall Street people were kind of criticizing them.
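The flow-through Dylan walks through a moment earlier — $500 billion of hyperscaler capex, roughly $300 billion of it to Nvidia, and memory as Nvidia's largest input cost — can be sketched as follows. The $500B and $300B figures are from the conversation; the 75% gross margin and the memory-as-half-of-build-cost split are assumptions carried over from the Blackwell example, not disclosed figures:

```python
# Back-of-envelope flow-through of hyperscaler AI capex down to memory vendors.
# The margin and memory-share numbers are assumptions for illustration.

HYPERSCALER_CAPEX_B = 500     # top-six hyperscaler AI capex, in billions
NVIDIA_REVENUE_B = 300        # portion of that spent on Nvidia systems
ASSUMED_GROSS_MARGIN = 0.75   # hypothetical Nvidia gross margin
MEMORY_SHARE_OF_COST = 0.5    # memory ~half of build cost, per the Blackwell example

build_cost_b = NVIDIA_REVENUE_B * (1 - ASSUMED_GROSS_MARGIN)  # ~$75B of input costs
memory_spend_b = build_cost_b * MEMORY_SHARE_OF_COST          # ~$37.5B to memory vendors

print(memory_spend_b)  # 37.5 -> tens of billions flowing to just three DRAM makers
```

Even under these rough assumptions, tens of billions of dollars of inelastic AI demand land on an industry with exactly three suppliers — which is the whole mechanism behind the consumer squeeze.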
And now they're looking pretty smart, because they secured memory at a lower price, longer term, right? But at the end of the day, not all of the memory works that way. The other aspect of this is the area for AI — HBM, right? That manufacturing process is a little bit more complicated. The suppliers are even fewer, right? Samsung's not so good at it. So Nvidia locked in their supply of HBM for all of next year. But the funniest thing is: because SK Hynix is the best, because they have the best HBM, and Nvidia decided, I'm going to sign a big contract with you at a good margin — all of a sudden, SK Hynix is going to be the least profitable of the big three memory vendors, because they don't get to sell all their memory at the super-inflated prices of a market that's going crazy. So we've laid a lot of the blame for all of this at the feet of AI data centers. But for people who don't follow this stuff all the time, these just seem like large buildings full of computers. And I think it's hard to put into real terms how big a project this actually is. Can you give us a sense of the scale of the AI data center build-out right now? If we go back just a couple of years, there were 4,000, 5,000 data centers in the world. But none of them were super, super big, right? The biggest data center in the world three years ago is laughably small compared to what's being built now. The largest data center building in the world was maybe 50 megawatts, 75 megawatts, in that range. Google was building one that was like 100 megawatts, and I was like, whoa. Now, when you look at the size of some of these campuses, it's like, oh, Stargate in Abilene, Texas is two gigawatts. And actually, that one is kind of small relative to some of the stuff that's going up, which is five-plus gigawatts, right? You know, Meta's talked about their Louisiana stuff.
The concentration of value within a data center is absolutely absurd, right? Think about it: this thing is the size of Manhattan, and it's five gigawatts. Once Meta builds this entire Louisiana facility out — five gigawatts, the size of Manhattan — the entire cost they put into it ends up being on the order of roughly $250 billion, right? $250 billion for one site in Louisiana that they're building over a couple of years. And by the way, those GPUs are not going to be bleeding edge in five years, right? They'll maybe still be useful in five years, but in seven or eight years, they won't be useful. It's just an insane factoid, because it's hard for humans to understand what $250 billion even means, right? That's more money than the richest person in the world has — but again, we can't fathom what the richest person in the world even has. So there's just no way to contextualize this. If that's where we're headed, a different market would suggest that the next thing that's going to happen is we're going to get a million chip startups who are going to come and try to make all of this stuff. They're in the "your margin is my opportunity" moment, right? That's the moment we're in. Every academic, everybody who's interested in chips — you should quit your job and go start a chip company. This is the gold rush, and we should be getting an entire new generation of folks coming out and saying: there are only three companies doing this. Let's get it back to 30. Let's get it to 300. And there's going to be demand for this going forward. Is that happening? Are we headed down that road? And if not, why not? So there are two layers in this stack, right? There are the design companies, and there are the manufacturing companies.
As far as the design companies, there are some startups. Many of them are going to fail. Maybe some of them will be successful. But that's true of all startups, right? This is not necessarily a wildly different kind of thing to try to do. Yeah. In terms of design, the barrier to entry is there — it's not insurmountable, but it is a high barrier to entry, right? You have to design your chip and then tape it out at TSMC, and all of this costs $50-plus million. And if there's any issue, you've got to do it again, right? So there is a barrier to entry there. But that barrier to entry is really low compared to manufacturing. Right? We are not going to get many new memory companies. There is one new memory company, but that's because it's a national-level priority for China — it's called CXMT. They have national government backing. We're not going to get a new memory company in the West. We just aren't, right? Or in Taiwan or Japan or Singapore or anywhere else. In the West, there's just not going to be a new memory company. Because the physical construction of a fab is tens of billions of dollars, right? Plus all of this know-how that is super trade-secreted and super confidential. And it's built upon year after year after year of tens of thousands of people's R&D — people who are all super, super smart, PhD-level engineers and researchers, just building and building and building, incrementally, on top of each other, right? For context: building a leading-edge chip — whether it's memory or, say, a phone CPU or phone memory, or laptop memory or a laptop chip — takes over 5,000 process steps, right? You take a silicon wafer; it's a perfect crystal structure.
Then you bombard it with ions — that's for doping — you deposit materials, you do lithography to define where things are, and you build this chip layer by layer by layer, and any defects ruin the chip. So then the other thing that might happen, it seems to me — the other thing you would hope for at the end of this — is that this is where we get the next breakthrough, right? And you've written about this. I found a blog post you wrote a couple of years ago basically saying we're sort of at the end of DRAM anyway: that it can't scale the way that it used to, that other parts of the process are kind of leaving its abilities behind, and that maybe what we need is something beyond DRAM. And again, we have perfect market conditions for somebody to show up and say, actually, I've found something that I can make cheaper, or make with a different set of materials, that is less reliant on one multimillion-dollar machine from a Dutch company that has to fly in on a Boeing 747. Is that happening? Is there a science breakthrough here somewhere? It takes multiple planes, not just one. Right, it's multiple. So if we can just get it down to one 747, we will have done something. Is there any indication of that on the horizon here? Yeah, so the interesting thing is — I'll zoom out to semiconductors generally — they've sort of had Moore's law, right? And Moore's law was actually originally about DRAM, but people just sort of ascribed it to logic as well. That's true. I didn't actually know that. Yeah, Gordon Moore was talking about DRAM. That's what Intel made at the time. They actually left that market. They were one of those companies — they didn't go bankrupt, they pivoted to processors. But they had to leave the DRAM market because the market was too tough.
Anyway: the number of transistors doubling every two years in a given area — I'm butchering the definition, but whatever — happened for decades, right? And Moore's law was coined before Intel's 8086, right? The initial coining of it was actually when they were making only DRAM. It happened for decades and decades, and the cost kept going down. As far as the DRAM vendors today, they do have a roadmap to continue to get 10, 15% cost reductions a year for another decade. They have some innovations they want to do over the next five years, next seven years, about changing the structure of DRAM and such. But the next big one is 3D DRAM, right? Interestingly enough, NAND flash already went through this. The cost reductions petered out — the physics just didn't let you shrink it smaller and smaller. So NAND ended up having this big revolutionary update where, instead of doing planar NAND, they did 3D NAND. They can make many layers at once, right? And that ended up resuming the cost scaling curves for NAND. And that's why SSD prices have fallen so much over the last decade — and even the last five years. Yes, there's a market dislocation right now, but generally prices have fallen a lot, whereas DRAM has not really fallen much. And so, if they can figure this out — that's what the DRAM vendors are all in on, right? And it's not just them; so is their supply chain. Applied Materials, Lam Research, ASML — all these companies in the equipment supply chain — all of their R&D effort for memory is also focused on helping these companies figure out 3D DRAM, because that's what they need next to keep scaling and keep the market growing and keep the industry moving forward.
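What "10, 15% cost reductions a year for another decade" compounds to can be sketched quickly (straight compounding is an assumption; real scaling curves are lumpier):

```python
# Compound effect of a steady annual per-bit cost reduction on the DRAM roadmap.

def cost_after(years, annual_reduction):
    """Fraction of today's cost remaining after compounding annual reductions."""
    return (1 - annual_reduction) ** years

print(round(cost_after(10, 0.10), 2))  # 0.35 -> roughly a third of today's cost
print(round(cost_after(10, 0.15), 2))  # 0.2  -> roughly a fifth of today's cost
```

So the existing roadmap still promises a 3-5x cheaper bit over a decade; the bet on 3D DRAM is about what happens after that roadmap runs out, the way planar NAND's did.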
You mentioned China earlier — that there is state backing there for one company in particular to start making a dent. But the big companies — SK Hynix and Samsung and Micron — they're making a lot of this stuff in China too, right? What keeps China from taking over? So, as far as what they're making in China: the vast majority of SK Hynix's and Samsung's capacity is in Korea, right? The vast majority of Micron's capacity is in Singapore, Japan, Taiwan, and the US. Micron has no capacity in China. Hynix and Samsung have some, but that's old capacity, and they actually haven't upgraded it in a long time. With that said, what prevents China from catching up? I mean, it's just a ton of hard engineering, right? There's a bit of, oh, people who worked at these companies coming back and being hired; there are all the claims of IP theft, blah blah blah. But at the end of the day, it's like: look, there's a ton of engineers who are working really hard, and they have effectively unlimited money, right? And even then, China still has not been able to get there. In 2015, they released a five-year plan, which laid out their targets for semiconductor production in 2020 and 2025. And they did it again in 2020, right? And they've missed every time, despite the fact that they've poured in hundreds of billions of dollars. Now, that's not to say they didn't make progress, or that they're not going to get there. They will — I really believe it, because Chinese engineers are really good, right? It just takes a really long time. Now, I don't think this is something a venture-backed business can just do, right? That amount of money is really nation-state-level stuff, right? And that's the challenge. And as far as China: the next breakthrough away from DRAM entirely could come from China as well, right? There's no reason why it wouldn't.
It's always easier to leapfrog than it is to incrementally catch up. Is this the moment, though? Because with the export controls, and Trump playing games around who can and who can't get Nvidia GPUs, and the national security concerns in the United States about whether we want to let China have all the GPUs — is now the moment that the nation-state changes its tune and says, we're going to 10X that investment in these companies to make it happen? Do you see China becoming a global power in logic and in memory the way that we already know it's a global power in all the other ways? Yes, I think that's absolutely going to happen one day, right? China is going to catch up. I want to just go back to the thing you alluded to at the very beginning, which is — my last question for you was going to be: when does this get better? When does my computer get cheaper again? And I think the answer is (a) maybe it doesn't, and (b) it's certainly not happening anytime soon. Is that right? Basically, this last year, Hynix, Micron, and Samsung decided to start investing in building new fabs again, right? Instead of just upgrading old fabs, keeping the old fabs going, changing the process, blah, blah, blah — they've decided to build new fabs. That takes two-ish or three years, depending on the construction timeline. So in 2027, we will have new wafer capacity coming online. And you're literally just talking about the time it takes to build a building, right? It's not like there's something magic they have to do to set up these new fabs. It's literally that they have to go do large-scale construction.
It's large-scale construction in the cleanest environment humans have, which is a clean room, where the particle count — the particles per cubic meter, or whatever the measure is — is absurdly low, right? To the point where human skin flakes are the dirtiest thing in the fab, right? That's why you wear the bunny suit. There's all the complex chemistry, right? They have sulfuric acid and hydrochloric acid, and all these different chemicals like photoresist — there are thousands of chemicals used, right? There are these tools that range all the way up to $200, $300, $500 million tools that get flown in on multiple 747s. The human logistics required to build these fabs is absurd. And so, yeah, it takes a couple of years, and then there's setting up the fab, making sure it's clean, running the process, et cetera. Best case, prices get cheap again in 2027 — or start to fall again in 2027. Okay. But it depends on what happens with AI. Because, as I mentioned, everyone who's building these fabs is super conservative, right? Right. They're these 55-year-old men who have seen many busts. And if you take our predictions for AI supply chain demand — or anyone else who's super in tune with this — what we think the number of GPUs, the types, the memory, all the volumes required will be, actually prices will get worse in 2027, too. But that's if the AI bubble doesn't pop, right? Now, I'm a maxi. I don't believe it'll pop. I think there are trillions of dollars of economic value that will be delivered through AI over the next few years. That's me. But if you do believe the AI bubble will pop, then '27 will have cheaper prices. If you think AI just continues to go, woo — then we're screwed, and actually memory will never be cheap again, and we might as well stop using our computers.
It is truly wild that you have to make your computer-buying decision in 2026 based on macroeconomic and giant geopolitical headwinds. This is where we are in the world. "Should I buy a Dell XPS 13" is actually a question about the state of the economy and the world. This is why we do an hour on RAM, friends. This is what we're doing here. All right, Dylan, Sean, thank you so much. This has been delightful. I appreciate you both doing this. Thank you so much. All right, that is it for the show. Thank you to Nilay and Sean and Dylan for being here. And thank you, as always, for watching and listening. This is actually our last Vergecast of the year. We're all going to disappear into the holidays for a couple of weeks, and we will be back at the beginning of January for CES. We're going to do an episode kind of right as all the news is happening, and then we're doing a live episode on Wednesday, January 7, in Vegas at the Brooklyn Bowl. So mark it on your calendars. It's going to be in the afternoon. Come hang out, come Vergecast. We're all going to go bowling. I've never been bowling with Nilay, and I have this sneaking suspicion that he's either very good or, like, hilariously terrible, and I'm very excited to find out which one it is. So come bowl with us. Watch the Vergecast. It's going to be very fun. Also, in the meantime, there's a bunch of Decoder for you to catch up on. We have a couple more Version History episodes coming out this season. We've got TiVo left to do. We've got the Nintendo Power Glove left to do. We've got Flappy Bird left to do. They're all very fun episodes, all coming out over the next few weeks. Make sure you go subscribe to all those shows, and subscribe to The Verge if you want to get them all ad-free. Until then, we're getting out of here. The Vergecast is a Verge production and part of the Vox Media Podcast Network. This show is produced by Eric Gomez, Brandon Kiefer, and Travis Larchuk.
I hope you have a wonderful holiday. We will see you very soon. Rock and roll.