TBPN

Intel Rips, Thrive Launches Eternal, GPT 5.5 | Diet TBPN

28 min
Apr 25, 2026
Summary

Intel surges 20% after Q1 earnings beat, driven by five converging demand narratives around AI agents, CPU-to-GPU ratio shifts, US government backing, and Elon Musk's TeraFab project. The episode also covers Thrive Capital's new permanent fund backing the San Francisco Giants, and GPT-5.5's improved reasoning capabilities.

Insights
  • Intel's stock recovery is narrative-driven rather than fundamentals-driven; five separate demand stories (AI agents, advanced packaging, US domestic fab mandate, Musk's TeraFab, hyperscaler supply) are collectively creating investor confidence despite ongoing losses
  • The CPU-to-GPU ratio shift from 1:8 to potentially 1:4 or even 8:1 represents a fundamental change in AI infrastructure economics that benefits Intel, but the actual ratio depends on whether inference workloads consolidate on large models or distribute across many smaller open-source models
  • Compute constraints are real and persistent; even tier-two and tier-three AI labs will be compute-constrained for years, creating pricing power for model labs until hardware supply chain margins expand
  • Permanent capital vehicles like Thrive Eternal signal a shift toward long-term ownership of iconic assets that cannot be replicated by technology, suggesting venture capital is diversifying beyond pure tech plays
  • AI model labs are losing money on inference despite high demand, but this is driven by strategic pricing and compute constraints rather than VC subsidies—Fortune 500 and institutional demand is substantial and real
Trends
  • CPU demand acceleration in AI infrastructure as agentic workflows shift focus from training to inference and orchestration
  • US government industrial policy in semiconductors creating demand guarantees and nationalization narratives around Intel and domestic fab capacity
  • Compute scarcity extending useful life of existing GPUs and creating pricing power for model labs across all tiers
  • Permanent capital structures emerging as alternative to traditional VC for long-duration, non-tech-replicable assets
  • Open-source AI model proliferation driving distributed inference workloads that require more CPU resources than centralized large-model inference
  • Margin expansion potential for inference providers as compute supply constraints persist and demand from enterprise/institutional customers grows
  • Vertical integration plays (Musk's TeraFab) challenging traditional foundry economics and creating new demand signals for manufacturing capacity
  • AI agent adoption creating new CPU-intensive workload categories (task orchestration, memory management, job routing) beyond traditional data center compute
  • Inference becoming the dominant revenue driver for AI labs, shifting economics from training-focused to inference-focused business models
  • Circular economy dynamics in AI where big tech investments in AI labs create demand that benefits their own infrastructure and hardware investments
Companies
Intel
Stock surged 20% on Q1 earnings beat; five demand narratives (AI agents, CPU-to-GPU ratio, US backing, TeraFab, hyper...
NVIDIA
Dominant beneficiary of AI training boom; CPU-to-GPU ratio shifts could reduce relative demand for GPUs in inference ...
TSMC
Fab capacity constraints discussed; CEO noted 2-3 years to build fabs plus 1-2 years to ramp; Arizona fab progress me...
OpenAI
Described as 'cooked' relative to Anthropic; raising capital for incremental compute from Oracle, CoreWeave, SoftBank...
Anthropic
In the lead but compute-constrained; raised $40B from Google; will be sold out of tokens even at higher pricing; pote...
Tesla
Part of Elon Musk's TeraFab vertical integration project; needs massive chip volumes for self-driving cars and humano...
SpaceX
Part of Elon Musk's TeraFab project; needs chips for space-based AI data centers and infrastructure
Cursor
Acquired for $60B; had negative 23% gross margins earlier this year; stickiness as AI entry point discussed; Composer...
Thrive Capital
Launched Thrive Eternal, a permanent capital holding company; first partnership with San Francisco Giants for long-du...
San Francisco Giants
First partnership for Thrive Eternal permanent capital fund; iconic sports franchise with century-old history and com...
ARM
Entering CPU design space; expected to work with TSMC in short term; relevant to CPU supply dynamics as AI demand grows
AMD
Part of the big three CPU makers (AMD, Intel, ARM); agents need far more CPUs than these three can produce
Cohere
Merging with German AI lab Aleph Alpha; CEO Aidan Gomez is a transformer paper alum; congratulated for strategic combin...
Aleph Alpha
German AI lab merging with Cohere to combine capabilities and resources
Jane Street
Generated $1B+ in revenue last year with only 3 employees; joked as having achieved AGI internally
Microsoft
Funding OpenAI compute; strategic investor in AI infrastructure and model development
Google
Raised $40B for Anthropic; strategic investor in AI labs and compute infrastructure
Amazon
Providing Trainium chips to OpenAI for incremental compute capacity
Oracle
Providing compute infrastructure to OpenAI for AI model training and inference
CoreWeave
Providing compute infrastructure to OpenAI for AI model training and inference
People
Lip-Bu Tan
Communicated CPU-to-GPU ratio shift from 1:8 to 1:4 on earnings call; articulated agentic AI narrative driving CPU de...
Josh Kushner
Announced Thrive Eternal permanent capital fund; first partnership with San Francisco Giants; backing emerging tech a...
Elon Musk
TeraFab project aiming for 100k-1M wafers/month; vertical integration of chip manufacturing for self-driving, robots,...
Dario Amodei
Previously gloated about sensible compute scaling vs OpenAI; now wishes Anthropic had more compute for inference demand
C.C. Wei
Noted that building and ramping fabs takes 3-5 years total; cautious on aggressive fab expansion timelines
Patrick O'Shaughnessy
Discussed Anthropic compute constraints vs OpenAI's ability to pay for incremental compute; inference demand economics
Dylan Patel
Appeared on Patrick O'Shaughnessy podcast; discussed expensive inference tokens and pricing pressure on model labs
Ben Thompson
Wrote bull case for Intel-US government deal; wrote on TSMC risk and need for increased CapEx
Mark Lipacis
Upgraded Intel from neutral to outperform; cited CPU-to-GPU ratio shift as driver of CPU demand growth
Bill Gurley
Argued that price is a leveler of supply and demand; questioned data center and energy constraint narratives
Elad Gil
Bullish on AI boom as once-in-a-lifetime transformation; advised many AI companies to exit in next 12-18 months despi...
Martin Scully
Debunked circular deal narrative; argued economy is broad with real Main Street and institutional demand for AI
Jim Cramer
Praised Intel CEO Lip-Bu Tan for turning Intel from bailout candidate to wealthy chip industry company in 13 months
Aidan Gomez
Transformer paper alum; leading Cohere merger with Aleph Alpha; described as 'Death Grips enjoyer'
Donald Trump
Referenced as enjoying bailouts; joked about applying wisdom to hyperscalers; compared Intel to baseball betting analogy
Quotes
"Intel is working again is the idea."
Host, early in Intel segment
"The CPU to GPU ratio flipping from one to eight to eight to one is absolutely wild. That's just a completely new world to what we've had so far."
VK Macro, Intel CPU-to-GPU ratio discussion
"Anthropic won't have enough compute to do that. And so, and presumably OpenAI and Google will hit that tier soon enough. Whoever hits that tier next, sure, Anthropic may get to charge 70 plus percent gross margins, but if OpenAI hits it next, they charge 50% gross margins."
Patrick O'Shaughnessy, inference economics segment
"Historically, in markets, price is a leveler of supply and demand. If you have a constraint, you price higher. You don't have surplus demand."
Bill Gurley, compute constraint discussion
"My view is the AI boom will only accelerate and is a once-in-a-lifetime transformation."
Elad Gil, AI boom outlook
Full Transcript
The big news, of course, is Intel. Intel is absolutely ripping. Intel jumped 20% after hours on the back of $13.6 billion in Q1 revenue, only 11% above analyst estimates, but there are five key factors coming together to create a new narrative around Intel that are driving the stock much higher. And it is almost at all-time highs. Shout-out to Bubble Boy on X: a lot of people have called this, and it is long overdue. The revenue is only up 7% year over year, but next quarter is already guided to be better, somewhere north of $14 billion probably. But there are still big losses under the hood. This quarter they lost $3.7 billion. Not good, but that was mostly driven by one-off charges related to Mobileye and derivative payments tied to the U.S. government's 10% stake. So if you strip those out, Intel actually earned $1.5 billion, which is much better than what people were expecting, which was basically breakeven. So the cruise ship of Intel is starting to turn somewhat, but the narrative has already completely shifted. So Intel is working again is the idea. The AI trade has mostly been NVIDIA, NVIDIA's memory suppliers, TSMC, power equipment, cloud CapEx, and a few software names that can prove real adoption. If you're accelerating your top line, you are an AI winner; that's sort of the rule of thumb. Of course, things might play out differently, but that's what's been happening in the market. Intel was the most embarrassing missing piece. Why isn't Intel booming when we're in a computing boom? It made no sense, but there were good reasons, and we'll go through them. The company that invented the modern CPU era missed mobile. They fell behind TSMC. They failed to produce a competitive AI GPU. They did fab a GPU at one point, and they tried to get into gaming at one point, but they never really found traction, especially in the data center and the servers for AI training. And so they have fallen behind.
And then they spent the last few years trying to convince the world that Intel could still matter. But now, oddly enough, the rise of AI agents is giving Intel a second shot. Do you have thoughts on Intel yet? I'm going to keep going. It's funny because at the moment the U.S. took a position in Intel, it felt a bit like a bailout, right? It felt like that, yeah. And you would expect that over time this would be very, very good for Intel, to have that vote of confidence, but I don't think anyone was really predicting that it would go up quite this much in this span of just half a year. I think Ben Thompson wrote a pretty strong bull case for the Intel-U.S. deal. Basically, you had to grapple with this idea that this doesn't feel like free-market capitalism. Yeah, I guess the only thing is that there wasn't this narrative around CPUs at the time the deal was happening, or at least it wasn't a very public narrative. Yes, yes. The bull case I heard most loudly at that time was not the CPU boom. It was just overall chips. It was the 14A, like, the leading-edge fab: basically, by having the US government as a shareholder, you see those dinners with all the AI lab leads and all the hyperscaler CEOs. And there's a world where there's another one of those meetings and the administration says, hey, we are backing Intel. I need all of you to commit to buy a ton of supply if Intel actually goes and builds the leading-edge fab, which is going to cost them billions of dollars. But if they have the demand guarantees, then they will actually be able to go do it. Tyler, what was your take? Yeah, I think Intel has long been kind of the Leopold take, which is that this is the company you want to own in the nationalization world of Taiwan risk. What are we going to do if we can't make chips in Taiwan? Intel is the very clear strategy.
Maybe it's not working right now, but at some point, this is not just economics. This is national security. It's extremely important that we have this capacity in the U.S., and Intel is the only one that's even remotely in the conversation. Yeah, that makes sense. There were some predictions from AI 2027 and other folks in the AI forecasting world around the rise of agents, that agents would arrive at this time. But I didn't see too many of the folks who were predicting strong, valuable, effective AI agents really predicting the CPU crunch, and that's exactly what's happened. So AI agents need CPUs to do things. Training frontier models, that's still a GPU story, but running agentic workflows across data centers, orchestrating tasks, routing jobs, managing memory, handling inference workloads, coordinating servers, increases demand for the boring old central processor. Intel's data center segment produced $5.1 billion in quarterly revenue, beating the $4.5 billion analysts expected. And Intel CEO Lip-Bu Tan said the next wave of AI is moving from foundational models to inference to agentic AI, and that shift increases the need for Intel CPUs, wafers, and advanced packaging. On the earnings call, he said that the CPU-to-GPU ratio is closer to one CPU for every four GPUs versus one for every eight in prior years. And there was an interesting forecast from Lip-Bu Tan as well, and VK Macro says the CPU-to-GPU ratio flipping from one-to-eight to eight-to-one is absolutely wild, just a completely new world to what we've had so far. Evercore ISI's Mark Lipacis has upgraded Intel directly from neutral to outperform and mentions that as AI workloads shift further toward inference and agents, the weight of CPU demand will rise sharply, and the CPU-to-GPU ratio could flip from 1-to-8 to 8-to-1, which is a massive, massive switch. Yeah, I mean, I don't know. Like, that's kind of a crazy ratio. It is.
Because, like, the lesson from frontier models is that the models are going to keep getting bigger. You're going to need way more chips to inference them. Like, Dylan Patel was on Patrick O'Shaughnessy's podcast; I think it came out yesterday or the day before. And he's just saying, you know, the models are going to get really expensive. Tokens are going to get super, super expensive. People are going to price down. So I don't know if I fully believe the narrative that you need all the CPUs. Yeah. You're going to need way more GPUs than CPUs. Well, I mean... That's all, folks. It really lines up perfectly. His camera is the main one that fits perfectly. The glasses land around. I love a Friday show. Yes, that's a good point. But there is this, I think at least in the mid-term... Yeah, there's, like, a capabilities overhang. And just one really smart AI system being inferenced on GPUs can spin off a ton of CPU workloads and do a lot of things that require CPUs. Yeah. Even in the SaaSpocalypse, like, you vibe-code Slack or whatever; well, that vibe system is running on a CPU. And if it took you, I don't know, 1,000 GPU hours to build, but then that system runs on CPUs that are running constantly for every user, you still wind up creating more CPU demand out of the GPU. I think this take makes a lot of sense if you are kind of, you know, long open source. We're going to have a lot of these open-source models. Yeah. They're quite small. You can run them on, like, a small amount of chips, but you just have a lot of them. You have a bunch of agents doing a bunch of stuff at the same time. Yeah. Rather than the single model, you know, inferenced on a massive amount of chips. Why is the horse here? Anyway, whether it's going to eight CPUs for one GPU or staying at one CPU to four GPUs, it's still twice as good as it used to be, because it used to be one to eight.
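The back-and-forth above is just ratio arithmetic. A toy sketch, with an entirely hypothetical fleet size (none of these counts come from the episode), shows how much CPU demand each quoted ratio implies:

```python
# Illustrative arithmetic for the CPU-to-GPU ratio shift discussed above.
# FLEET is a made-up round number, not a figure from the episode.

def cpus_needed(gpus: int, ratio: tuple[int, int]) -> int:
    """CPUs implied for a GPU fleet at a given (CPUs, GPUs) ratio."""
    cpus, gpus_per = ratio
    return gpus * cpus // gpus_per

FLEET = 800_000  # hypothetical GPU fleet size

old = cpus_needed(FLEET, (1, 8))   # prior era: 1 CPU per 8 GPUs
new = cpus_needed(FLEET, (1, 4))   # Tan's earnings-call figure: 1 CPU per 4 GPUs
flip = cpus_needed(FLEET, (8, 1))  # the speculative flip: 8 CPUs per GPU

print(old, new, flip)  # 100000 200000 6400000
print(f"1:4 is {new // old}x the old CPU demand; 8:1 is {flip // old}x")
```

Either way the hosts' point holds: even the conservative 1:4 figure doubles CPU demand per GPU, while the 8:1 flip would multiply it 64-fold.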
And so that is bullish for Intel, and Lip-Bu Tan is making that clear to the market and to the investors on his earnings call. Then there's the wild card: TeraFab, the Elon Musk project. It's very exciting, but it's clearly further off. Elon Musk wants to build a massively vertically integrated chip manufacturing operation, with Tesla, SpaceX, and possibly other Musk companies needing huge volumes of chips for self-driving cars, humanoid robots, and even space-based AI data centers. Intel is supposed to help design, manufacture, and package chips for the project. Now, the Wall Street Journal has a more cautious view today. Elon is aiming for TeraFab to reach 100,000 wafers a month, and then eventually 1 million wafers a month, which is just insane scale. So let's put it in context. 1 million wafers a month is about 70% of TSMC's total monthly output across all fabs. TSMC's largest fabs put out roughly 100,000 wafers a month into production. So you're talking about 10 leading-class TSMC fabs at Intel just on the TeraFab project, plus whatever else Intel is doing. So a pretty massive scale-up. And TSMC's CEO, C.C. Wei, has said that it's just not that easy to build a fab overnight and get it up and running. Fabs take two to three years to build and then another one to two years to actually ramp. And we've seen this with TSMC Arizona, which we've been very excited about, but it just takes time. And there is a bottleneck, and the bottleneck has been discussed at length. Ben Thompson wrote about TSMC risk on Stratechery, arguing that TSMC needs to spend more on CapEx. And I think that should be clearer now. We'll see what happens at the next TSMC earnings call, because maybe they will be waking up. But overall, Intel now has a collection of plausible demand stories. Even though the demand itself is not going vertical, the stock is going vertical on the back of these five key demand stories that are all pointing in the same direction: directly up.
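The wafer context above is a quick back-of-the-envelope check; using only the approximate figures quoted in the episode (not audited data), the two claims are mutually consistent:

```python
# Sanity-checking the TeraFab scale claims with the episode's approximate numbers.
TERAFAB_TARGET = 1_000_000   # wafers/month, Musk's eventual goal
TSMC_LARGEST_FAB = 100_000   # wafers/month, roughly one top TSMC fab
TSMC_SHARE = 0.70            # target as a share of TSMC's total monthly output

tsmc_total = TERAFAB_TARGET / TSMC_SHARE              # implied TSMC total output
fab_equivalents = TERAFAB_TARGET // TSMC_LARGEST_FAB  # the "10 fabs" comparison

print(f"implied TSMC output: ~{tsmc_total:,.0f} wafers/month")
print(f"TeraFab target = {fab_equivalents} leading-class fabs")
```

That is, a 1M-wafer/month target equals ten of TSMC's largest fabs and implies TSMC's total output is around 1.4 million wafers a month, which matches the "about 70%" framing.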
One, AI agents need more CPUs. Two, AI systems need more advanced packaging and a higher ratio of CPUs to GPUs; Intel can help with that. Three, the US government wants a domestic leading-edge foundry; this is just a national mandate. Four, Musk wants an impossible amount of silicon. And five, the hyperscalers want more supply. And so all of that is good news for Intel. There are more companies that are getting into the CPU design space. We talked about ARM recently, and ARM, of course, feels like they will be going with TSMC in the short term, but who knows what happens in the long term. So as CPUs continue to be important in the AI story, all good news for Intel. So suddenly investors are more willing to entertain a messy, expensive, strategic chip story than they were five years ago. Intel famously missed mobile, which meant TSMC ran away with enormous manufacturing volume and left Intel with a demand problem. Volume was destiny, as they say. You can't grow if you can't build the fab that's on the leading edge, and you can't build the fab unless you actually have the customers. And so if you missed mobile, you just have this gap, and you have to jump over it with the help of the government and a bunch of other people and the AI narrative and all these different five key pieces. So the pieces of the puzzle are coming together now, and that's good for Intel. It's also good for America's chip manufacturing prospects. So good news overall, and congrats to all the Intel shareholders that were believers early on and rode the wave. Everyone, every U.S. citizen. Every U.S. citizen. Congrats to all Americans. Every U.S. citizen. And every U.S. debt holder. So even the international folks that own treasuries are in a better financial position today. You've got to think about what this does to Trump's confidence level. You know, he's like, I enjoy a bailout. He's like, I'll indulge Spirit Airlines, American Spirit.
I think he's excited about the potential there. Sure, sure. But why stop at just bailouts, right? Why not start taking a position in hyperscalers? He's like, look, I think I can move your market cap by 3x. I've done it before. Yeah. Intel was falling apart. Yeah. Imagine if he applied all his wisdom to a company already winning. Certainly possible. Jim Cramer's excited about it. He said in 13 months, Lip-Bu Tan took Intel from a possible and unthinkable bailout candidate to one of the wealthiest companies in the chip industry. There is a big three of CPU makers: AMD, Intel, and ARM. And the agents need far more CPUs than these three can produce. So that means prices are going up. And they got the God candle. Yeah, look at this candle. We love a green candle. Should be white suits today. Well, before we move on, there's some news. Justin Bieber brought Biebercella back from the dead. Later in the show, we'll tell you what this says about late-stage venture reacceleration. We're going to tie those two together. In other news, for more than a year, Silicon Valley has buzzed about Cursor's growth and whispered about its margins. Now, on the cusp of a $60 billion bailout, Laura over at The Information... Wait, did I call it a $60 billion bailout? Buyout, buyout. I'm sorry, buyout. "We revealed the hot vibe coder's financials." A $60 billion acquisition, and I'm calling it a bailout; it's not a bailout. Their margins were rough. We were just talking about that. It got wormed into my head. Cursor had negative 23% gross margins earlier this year. Amir says that is low for a company generating as much revenue as it is. This is the question of how sticky Cursor will be as an entry point to AI, as an entry point to inference demand. Can they reroute? It feels like with Composer, they have been able to retain a lot of customers.
The company is still growing, and we know Cursor users who have stayed with the product while changing inference based on what their plan gives them. So they're a subscriber, and they will use whatever gets the job done for them within the budget of the plan. And so obviously, you see negative 23% gross margins and you're like, whoa. And then you look at the SpaceX deal and the xAI deal and all the compute that's sitting at Colossus 2, and you think, oh, well, what will the gross margins be once the inference is happening on a Cursor-trained model or an xAI-trained model? Are there going to be higher gross margins? It's pretty easy to imagine that the gross margins would increase. Did you want to play this piece from Patrick O'Shaughnessy? Let's do it. So clearly Anthropic is in the lead, right? And OpenAI is cooked. What's interesting is, because Anthropic has such bounds on compute and they can only grow it so fast, and sort of to the point of: Dario used to gloat about how OpenAI was being too aggressive on compute and Anthropic was more sensible in their scaling. And now Anthropic is like, fuck, I wish we had a lot more compute. OpenAI is able to pay the bills perfectly fine. In fact, they've raised a ton of money to get incremental compute in addition to the irresponsible levels of compute that they were buying from Oracle and CoreWeave and SoftBank and all these people and Microsoft. Now they're getting Trainium as well from Amazon. But what's interesting is, if you were to say Opus 4.6, you know, let's ignore models getting better over time. Let's just take diffusion of this technology. You and I may jump on the model immediately, day one, but other businesses take time, and it takes time for people to learn. And so by the end of the year, let's say a 4.6 Opus-tier model, the economy would spend $100 billion on. I don't think that's unreasonable. It's spending $40 billion right now.
Anthropic won't have enough compute to do that. And presumably OpenAI and Google will hit that tier soon enough. Whoever hits that tier next: sure, Anthropic may get to charge 70-plus percent gross margins, but if OpenAI hits it next, they charge 50% gross margins. They still get all of this incremental demand, and probably they also won't have enough compute to serve all the users. Sure, maybe there's a world where, if the world had enough compute, it'd be $500 billion of revenue or something crazy. There is such demand for these tokens and such limitations on compute. You know, and we see this with H100 prices skyrocketing, and the useful life of these GPUs continues to extend. It's pretty clear even the tier-two lab is going to be sold out of tokens, let alone the tier-one lab. The tier-one lab will have better margins, but the tier-two lab will be sold out, and probably the tier-three lab will also be close to sold out. The economic value that the best model can deliver is growing faster than our ability to actually serve those tokens to people via the infrastructure. And so this gap will continue to grow, and the model labs will continue to have expanding margins until people in the hardware supply chain, the infrastructure supply chain, are like, wait, no, why don't I just jack up my margins? Oh, yeah. I love that sound. I love that sound. Bill Gurley, let's check in with BG. He says, I find this conversation foreign, along with the argument that we are data center constrained or energy constrained. Historically, in markets, price is a leveler of supply and demand. If you have a constraint, you price higher. You don't have surplus demand. Yeah, it's a good point that the big labs are losing money, but it doesn't feel like the revenue of OpenAI and Anthropic is heavily VC-subsidized. They're not VC-subsidized. Yeah, that's what I'm saying. And their customers are not startups necessarily. It's like hedge funds and banks and stuff.
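The 70%-versus-50% gross-margin claim in that clip is simple pricing arithmetic; a toy sketch with made-up per-token costs and prices (none of these dollar figures are from the episode) reproduces the dynamic:

```python
# Toy model of the margin dynamic: a compute-constrained lab alone at a
# capability tier prices tokens well above cost; a competitor arriving at the
# same tier compresses prices. All numbers are hypothetical.
COST_PER_M_TOKENS = 3.00  # assumed serving cost per million tokens

def gross_margin(price_per_m_tokens: float) -> float:
    """Fraction of revenue left after serving cost."""
    return (price_per_m_tokens - COST_PER_M_TOKENS) / price_per_m_tokens

solo_price = 10.00       # sole lab at the tier: scarcity pricing
contested_price = 6.00   # a second lab reaches the tier: prices compress

print(f"solo: {gross_margin(solo_price):.0%}")            # 70%
print(f"contested: {gross_margin(contested_price):.0%}")  # 50%
```

The point is that the margin is set by how far scarcity lets price float above serving cost, which is why a second lab reaching the same tier matters more than the cost side.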
So hedge funds and banks that are printing by betting on semis right now. Yeah, which is different. Like, I'm totally on board with, well, you know, the Mag Seven are drawing down on cash flow and issuing debt and doing layoffs, and they're subsidizing demand, right? But the idea that VC dollars are the biggest drop in the bucket of subsidized demand feels like, I don't know, it just doesn't quite math out to me. The actual revenues and demand from Fortune 500 companies are so high, and the really big dollars in all of these rounds... I mean, Anthropic just raised something like $40 billion from Google today. And that's not a VC subsidy. I mean, it is a subsidy. Yeah, I'm surprised he's not talking about, like, big tech subsidies. Yeah, yeah. Maybe he is using the term VC dollars broadly, which is, like, fair. So if he's using the term VC subsidies, VC dollars, broadly, I think that is a reasonable point. But at the end of the day, you talk to most companies and most, you know, paid chat subscribers, and they're just like, I like the thinking model, so I pay $20 a month. I like the pro model, and I use it to create value. I'm getting that much value in, like, software for my business that probably doesn't even exist otherwise. Like all the vibe-coded software that we use here; it's just not available off the shelf. We're building completely net-new products. And I think that that's generally what's happening in the AI economy. There's been this discussion for a long time. Martin Scully was sort of debunking it, around, like, oh, are these all circular deals? And he was like, no, the economy is so broad. Yes, Google might take a position in Anthropic. Microsoft might have a position in OpenAI. But you also have completely Main Street customers and just average Joes who are paying, seeing ads, buying things.
There's companies that are profiting off of running ads, and other inference providers. There's, like, everyone. It's one big circle. Yes, it's one giant circle of life that is actually self-perpetuating, because it is a true economy that hits 25 different categories. Well, you might think it's over, but Elad Gil does not. He says, my view is the AI boom will only accelerate and is a once-in-a-lifetime transformation. This is orthogonal to whether many AI companies should exit in the next 12 to 18 months, as some may lack durability versus labs, new entrants, or weird market shifts. But he's extremely bullish. He also posted a funny thing about his new life plan. He said his new life plan is to move to Brooklyn, get a neck tat, ride a bike everywhere, cold brew his own coffee... also start drinking coffee? That's an odd thing to jump straight to, of course. Was he trying to... is this like a cipher of some sort? He wants to adopt a 12-year-old stray cat. Is there a secret message embedded in this post? Maybe. It sounds like he's advising his portfolio companies. Yes. I don't care how hard you're ripping. Yeah. A lot of you should probably exit. Yeah. Just given how much uncertainty there is. Well, later in the show, we have a fun story. A bear wandered into a backyard and took a dip in the family pool. We'll take you through it later in the show, and what this tells us about edge computing. Get ready for it. Has Jane Street achieved AGI internally? I think so. They definitely have. They did a billion in revenue last year, more than all the big Wall Street banks, and with only 3 employees. Yes, let's give it up for Jane Street. Fantastic. We have some exciting news out of Thrive Capital.
Josh Kushner announced a new fund, Thrive Eternal, a permanent capital holding company that will be concentrated in a small number of assets that we can own and steward over many decades. Across Thrive Capital and Thrive Holdings, we are building and investing through a moment of exponential change, backing emerging technologies, the infrastructure that powers them, and the businesses they can transform. Increasingly, he sees a fourth category: assets with qualities that cannot be replicated by technology, iconic franchises and cultural institutions rooted in tradition. And he says his first partnership is expected to be with the San Francisco Giants, an institution built on more than a century of shared identity and community, and among the most iconic sports franchises in America. I looked it up before the show started, the San Francisco Giants. If you're not familiar, it's an American professional baseball team based in San Francisco. So, baseball is a bat-and-ball sport played between two teams of nine players, each taking turns batting and fielding. The game occurs over the course of several plays, with each play beginning when a player on the fielding team, called the pitcher, throws a ball at a player on the batting team, called the batter. And then the batter tries to hit it with a bat. And so they play this game. They sell tickets to the game. And that's how the San Francisco Giants make money. And Josh Kushner, with Thrive Eternal, is getting in on the action. Someone should make a TBPN-style show for whatever you just described. I feel like it would be very cool. That would be cool. If you had maybe a couple of hosts, you know, on-screen graphics, and they could provide kind of live coverage maybe while that was happening. Extremely educational. This is a crazy story. Brandon Guerrero and our team were obsessed with it. A U.S.
Special Forces soldier involved in the capture of Venezuelan President Nicolas Maduro was arrested for allegedly betting on that operation, netting him $400,000 in profits. This is when the betting-on-yourself meme goes wrong, right? You should always be betting on yourself, but not literally if it's against the law. If you are involved in a government capacity, you should probably not be gambling on the outcome of your government work. Shame on you, but still an interesting story. Even Trump was like, he was betting on himself. And then he referenced some baseball player who isn't in the Hall of Fame because he was betting on himself. But Trump was seemingly implying that that wasn't so bad. People are really, really having fun with the new image gen model. I don't even know if we should pull this up because it's so rude. But this guy is effectively playing Minecraft through ChatGPT, because he has it generate him an image, and then he says, OK, walk closer, and then it just generates a new scene. I'm sure there's some kids out there that have figured out how to play Minecraft in ChatGPT, but this is very funny. I found this post when it had 40 likes this morning. Sniped. Sniped. Invested early. Cohere is merging with a German AI lab called Aleph Alpha. I am the biggest fan of Cohere. I love Aidan Gomez: transformer paper alum, Death Grips enjoyer. What's not to like about the legend Aidan Gomez? We've got to get him on the show, but this is a congratulations moment for Cohere and Aleph Alpha. GPT-5.5: yes, Kits is saying it might be AGI, because if you ask it, "Bro, the car wash is five minutes away, should I walk to it or go by car?" it says, "Car, bro. It's a car wash." Let that... let that sink in. Wait, what? Like, it just immediately clocks it. Oh, I like that. I've seen this exact test done in normal language, like with a proper prompt, not the casual bro slang.
And it actually feels more like AGI if it is able to pick up on the lingo and mirror that back. Car bro, it's a car wash. That's very funny. That feels very remarkable. It also got the R's in strawberry correct, right? And how many strawberries in the letter R? Zero. The letter R is not known for its berry storage. These are good answers. Being a helpful assistant while simultaneously rejecting nonsense has been particularly difficult for LLMs, so good progress here. We'll figure out more about what went into this and what the next generation of AI impacts are from this. Taj says, found a post from five and a half years ago. Elon says, the rate of improvement from the original GPT to GPT-3 is impressive. If this rate of improvement continues, GPT-5 or 6 could be indistinguishable from the smartest humans. Just my opinion, not an endorsement. I left OpenAI two to three years ago. I am a neutral outsider at this point. Greg says, thank you. A lot has changed, but he was accurate in his prediction. Funniest outcome is the most likely. And then, the last thing... I wanted to do a buddy cop film together. The last thing: there is a new app from X. It's called XChat. XChat. And I'm excited to try this out, because the current DMs on X have been pretty brutal. Yeah. Right? I haven't had that much of a problem. I do have this weird bug where when I click the chat button on desktop, it sort of loads and then it re-sorts after it loads. And sometimes, if I haven't been in it in a long time, it needs to reshuffle several times. And so that can be a little jarring. But overall, it seems like they made the migration to encryption. I haven't... I don't know. I seem to get messages. It seems to work fine. It's cool that there's a new app. I probably need to be better about answering DMs. I have a lot of them that are unread. So having a dedicated app for that sounds cool.
People were taking shots because the Everything App is supposed to be everything in one place. Instead, it's three apps to get everything in the Everything App. It's sort of silly. Who cares? The more apps, the better. I like apps. So go download it. Go check it out. And go like Makita's post, because he's been on a generational run. Also, GPT-5.5 is available in the API now. That's breaking news, I guess, from Craig. So anyway, we will see you on Monday. Have a great weekend. Enjoy the weekend. Leave us five stars on Apple Podcasts and Spotify. Sign up for the newsletter at tbpn.com. And we'll see you tomorrow. Flashbang. Have an incredible weekend.