Elon Musk's SpaceX has confidentially filed for an initial public offering, targeting a valuation of up to $1.75 trillion to $2 trillion, while simultaneously integrating the company with his artificial intelligence venture, xAI.
Musk went on the platform X to dismiss the $2 trillion figure as BS, but the confidential filings and massive target raises remain the focus of intense investor attention. Yeah, we are really looking at the deliberate fusing of SpaceX, xAI, and potentially Tesla into a single multi-trillion-dollar mega-conglomerate. Just merging them all together. Exactly. I mean, the entire strategy revolves around the concept of convergence. We are seeing the individual corporate boundaries completely dissolve. They are assembling a highly integrated, unified structure that's designed to control absolutely every aspect of hardware, software, energy production, and aerospace delivery. So can moving artificial intelligence compute into orbit actually rescue a struggling terrestrial car company? Well, to really answer that, we have to look at the recent all-stock merger between SpaceX and xAI, which valued the combined entity at $1.25 trillion. Okay, wow. Yeah, so xAI brings its Grok artificial intelligence model and the massive X social platform directly under the umbrella of the aerospace company.
So SpaceX just completely absorbs the entire software infrastructure. Exactly. Along with the real-time human data streams that are required to train those massive neural networks. So the primary motivation for combining an aerospace manufacturer with an artificial intelligence startup, I mean, the reason they are doing this, is that they are actively building orbital intelligence. Right, orbital intelligence. Because the technology sector is currently hitting a massive physical barrier. It's known as the AI energy wall. Terrestrial data centers are just maxing out local power grids all over the globe. Yeah, they really are. Training a large language model requires stringing together tens of thousands of high-performance graphics processing units. And those GPUs, they have to run constantly, right? Like at maximum capacity. Yeah, all the time. And the electricity required to power those specific chips, combined with the staggering amounts of municipal water and energy needed just to prevent them from melting down. Right. It's completely overwhelming local infrastructure. We are genuinely running out of physical land and available electrical capacity to support the required server farms here on Earth. Which brings us to the space aspect. Right, so instead of building a bigger power plant on Earth, the plan is to just outsource the servers to space. But how do we actually get all that heavy hardware up there affordably? Well, doing that removes the fundamental limits currently throttling artificial intelligence processing. By moving data centers, which they are referring to as Grok-sats, into space, they can harness direct solar energy. Because there's no atmosphere in the way. Exactly. Once you place hardware in orbit, there is no atmospheric interference blocking the sun. And if the satellites are positioned correctly, there is absolutely no day-night cycle interruption. Just constant power. Constant, uninterrupted solar radiation.
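The orbital solar advantage described here can be roughed out with simple arithmetic. This sketch uses standard physical figures that are not from the episode: the solar constant above the atmosphere (~1361 W/m²) versus a good terrestrial site's peak irradiance and an assumed ~20% capacity factor once night, weather, and sun angle are averaged in.

```python
# Back-of-the-envelope comparison of orbital vs. ground solar yield.
# Assumed figures (not from the episode): solar constant ~1361 W/m^2
# above the atmosphere; ~1000 W/m^2 clear-sky peak at the surface with
# a ~20% capacity factor after night and weather are averaged in.

SOLAR_CONSTANT_W_M2 = 1361        # irradiance in sunlit orbit, no atmosphere
GROUND_PEAK_W_M2 = 1000           # clear-sky peak at the surface
GROUND_CAPACITY_FACTOR = 0.20     # day/night + weather averaged

def annual_energy_kwh_per_m2(power_w_m2: float, availability: float) -> float:
    """Energy collected per square metre per year, in kWh."""
    hours_per_year = 365 * 24
    return power_w_m2 * availability * hours_per_year / 1000

# A correctly positioned orbit sees no night at all, so availability is 1.0.
orbital = annual_energy_kwh_per_m2(SOLAR_CONSTANT_W_M2, 1.0)
ground = annual_energy_kwh_per_m2(GROUND_PEAK_W_M2, GROUND_CAPACITY_FACTOR)
advantage = orbital / ground
print(f"orbit: {orbital:,.0f} kWh/m^2/yr, ground: {ground:,.0f}, ratio ~{advantage:.1f}x")
```

Under these assumptions, a square metre of panel in a sun-facing orbit collects roughly 6-7 times the annual energy of the same panel on the ground, which is the physical case for "no day-night cycle interruption."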
The hardware completely bypasses earthly electricity constraints. That makes sense. The entire burden of computing shifts away from our fragile terrestrial power grids. And it moves into an environment that offers unlimited solar radiation alongside the natural extreme cooling properties of deep space. Right, the cooling is just built into the environment. Which brings us to the Raptor 3 engine. Oh yeah! That's the specific piece of technology required to lift these massive orbital data centers via the Starship rocket. And the statistics on this particular engine are staggering. They're pretty wild. It produces 280 tons of thrust. 280 tons. Yeah, and it is incredibly light, and it does not require an external heat shield at all. It's just a huge deal. Because lifting heavy satellite constellations demands extreme power, combined with extreme efficiency. Any extra weight attached to the rocket itself directly reduces the amount of server hardware that can be successfully carried into orbit. And the engineering behind that weight reduction is, well, it's fascinating. SpaceX engineers removed all the external plumbing, the complex wiring harnesses, and the heavy protective shields that usually surround a rocket engine. They just took them all off. Yeah, they 3D printed the delicate electronics and the fluid cooling channels directly into the internal structure of the solid engine wall. So it's all hidden inside the metal? Exactly. This relies entirely on a concept called regenerative cooling. Think of it like a circulatory system built directly into the metal. Oh wow. The cryogenic liquid methane fuel flows through these hidden internal pathways before it ever reaches the main combustion chamber. So the freezing cold fuel absorbs the extreme heat of the firing engine? You got it. It effectively keeps the metal structure fully intact and cool from the inside out. 
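The thrust figure quoted above can be turned into a thrust-to-weight estimate. The 280-ton thrust comes from the discussion; the roughly 1,525 kg dry mass is a publicly cited figure for Raptor 3, not something stated in this episode, so treat it as an assumption.

```python
# Rough thrust-to-weight check for Raptor 3. The 280-ton thrust is the
# figure quoted in the discussion; the ~1,525 kg dry mass is a publicly
# cited number, assumed here for illustration.

G0 = 9.80665                      # standard gravity, m/s^2
THRUST_TONS = 280                 # tons-force, as quoted
ENGINE_MASS_KG = 1525             # assumed dry mass

thrust_newtons = THRUST_TONS * 1000 * G0       # ~2.75 MN
engine_weight_newtons = ENGINE_MASS_KG * G0
thrust_to_weight = thrust_newtons / engine_weight_newtons
print(f"thrust ~{thrust_newtons / 1e6:.2f} MN, T/W ~{thrust_to_weight:.0f}")
```

A thrust-to-weight ratio in the low hundreds is what "incredibly light" means in practice: almost none of the engine's own mass is wasted lifting itself.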
By eliminating the bolt-on components and the external tubes, they stripped away all the parasitic mass that was dragging down the payload capacity. Though Tory Bruno, the CEO of rival United Launch Alliance, looked at publicly released photos of the sleek Raptor 3 and claimed it was a trick. Oh right, the ULA CEO? Yeah, he stated publicly that it was only a partially assembled engine missing critical components. Really? He argued that vital mechanical elements like thrust vector control actuators, fluid management systems, and electronic controllers were visibly absent from the photographs, leading to his direct conclusion that SpaceX was simply exaggerating their manufacturing progress. Wait, back up. SpaceX immediately responded to that claim. Right, they didn't let that slide. No, President Gwynne Shotwell posted an uncut video of that exact, supposedly incomplete engine successfully test firing on a test stand. Just proving it works right there. Yeah, she captioned the video, works pretty good for a partially assembled engine. The critical components that Bruno assumed were missing were actually just integrated entirely inside the solid metal structure of the engine block. Which is incredible. And that structural change radically alters the economics of space delivery. It really does. The extreme reduction in mass and the severe drop in manufacturing complexity mean Starship can affordably lift the heavy infrastructure required for the convergence plan. It's all about payload. A much lighter engine directly increases the payload capacity for every single launch. Lowering the overall cost per kilogram to low Earth orbit is the only mechanical and financial pathway that makes launching thousands of heavy, GPU-laden Grok-sats financially viable. And Starlink is the financial engine making this all possible. Oh, absolutely. The satellite internet service has officially surpassed 10 million active customers. 10 million.
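Before moving on to Starlink, the cost-per-kilogram point just made can be sketched numerically. Every figure here is a hypothetical placeholder, not from the episode: an assumed marginal launch cost, an assumed payload to low Earth orbit, and an assumed mass saving from lighter engines.

```python
# Illustrative cost-per-kilogram arithmetic. All three figures below are
# assumptions for illustration, not numbers from the discussion.

def cost_per_kg(launch_cost_usd: float, payload_kg: float) -> float:
    """Launch cost spread over every kilogram delivered to orbit."""
    return launch_cost_usd / payload_kg

LAUNCH_COST = 10_000_000          # assumed marginal cost per flight, USD
BASE_PAYLOAD_KG = 100_000         # assumed payload to low Earth orbit
ENGINE_MASS_SAVED_KG = 2_000      # assumed mass shaved off the engine stack

# Mass removed from the rocket becomes payload capacity one-for-one,
# so the same launch cost spreads over more delivered kilograms.
before = cost_per_kg(LAUNCH_COST, BASE_PAYLOAD_KG)
after = cost_per_kg(LAUNCH_COST, BASE_PAYLOAD_KG + ENGINE_MASS_SAVED_KG)
print(f"${before:.0f}/kg -> ${after:.2f}/kg")
```

The mechanism, not the exact numbers, is the point: parasitic mass removed from the vehicle converts directly into payload, so cost per kilogram falls without the launch getting any cheaper.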
That subscriber base provides the massive recurring cash flow necessary to fund highly capital-intensive projects like the development of Starship and the multi-trillion-dollar xAI merger. The growth metrics for Starlink are pretty astounding. The service is adding over 20,000 new users every single day. Every day. And it's operating across 160 different countries. They're generating over $10 billion in revenue with massive profit margins. Yeah, and they are expanding far beyond standard residential users. Right. They are charging massive premiums for commercial maritime and aviation contracts. Airlines and massive cruise ship operators are paying incredibly high commercial rates to maintain high-speed, low-latency broadband connections over the open ocean, where traditional cellular towers simply do not exist. Hold on. If SpaceX is doing so incredibly well entirely on its own with Starlink, why merge with xAI, and why are investors talking about bringing Tesla into the mix? Well, there is an ultimate combination scenario actively floating around Wall Street involving a $3.5 trillion merger between SpaceX and Tesla. A $3.5 trillion merger. Yeah. The financial community is closely analyzing the shared technological dependencies forming between all of these entities. Because the current financial trajectories of the two companies present a stark contrast. They really do. While SpaceX valuations continue to soar into the trillions, Tesla's stock has dropped significantly due to intense pricing competition from rival electric vehicle makers. It's a really tough market right now. Some financial analysts project Tesla's stock could fall another 60%. Meanwhile, Tesla shareholders officially approved a $1 trillion compensation package requiring Musk to push Tesla's total market valuation to $8.5 trillion. Which is an enormous target.
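The Starlink figures quoted above, 10 million subscribers, 20,000 new users a day, and over $10 billion in revenue, can be cross-checked with quick arithmetic. The derived per-user numbers below follow from those quoted figures only; they are an illustration, not reported company metrics.

```python
# Cross-checking the quoted Starlink figures. Inputs are the numbers
# from the discussion; the per-user values are derived, not reported.

SUBSCRIBERS = 10_000_000
NEW_USERS_PER_DAY = 20_000
ANNUAL_REVENUE_USD = 10_000_000_000

annual_revenue_per_user = ANNUAL_REVENUE_USD / SUBSCRIBERS   # blended, all tiers
implied_monthly = annual_revenue_per_user / 12
annual_subscriber_growth = NEW_USERS_PER_DAY * 365
print(f"~${implied_monthly:.0f}/month per user, ~{annual_subscriber_growth:,} adds/yr")
```

The blended figure works out to roughly $83 per user per month, noticeably above a typical residential plan, which is consistent with the point about premium maritime and aviation contracts pulling the average up.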
Hitting that target requires immense, unprecedented growth that the traditional automotive manufacturing sector alone may not be able to support. So why would SpaceX shareholders want to attach themselves to a struggling car company? It's a fair question to ask. SpaceX holds a near-total monopoly on commercial launch services and operates a highly profitable, rapidly expanding global internet provider. Tesla is actively fighting a brutal automotive price war, repeatedly missing vehicle delivery forecasts, and dealing with massive capital expenditures for new factory expansions. Right, but the physical integration of the companies is already happening regardless of the corporate structure on paper. Okay, how so? Consider the development of TerraFab. This is a joint $20 billion chip manufacturing factory located in Texas, funded simultaneously by Tesla, SpaceX, and xAI. All three of them. Yes, and it is explicitly designed to produce up to 200 billion artificial intelligence chips annually. 200 billion chips. Annually. These are incredibly advanced, custom 2-nanometer silicon processors. SpaceX requires these specific chips to operate the orbital data centers. And Tesla requires those exact same chips to process their autonomous driving software and to serve as the brains for their humanoid robot. Exactly, and xAI requires them to continuously train the Grok language models. They are building a massive shared supply chain strictly because no single company can source that volume of advanced hardware on the open commercial market. That physical dependency severely limits the idea that Tesla functions purely as an automotive manufacturer. Yeah, it completely changes things. It changes Tesla into a dedicated hardware manufacturing arm for a unified artificial intelligence ecosystem. The passenger vehicles and the Optimus humanoid robots, they basically serve as physical roaming endpoints. Out in the world.
Right, they gather vast amounts of visual and spatial data from the real world, feed that data directly back into the xAI neural networks for training, and then receive updated, highly complex software instructions processed by SpaceX's orbital computing infrastructure. Which brings up Digital Optimus. It's been cheekily nicknamed MacroHard. I love that name. It's a shared artificial intelligence agent project operating between Tesla and xAI. The software is designed to automate complex corporate office workflows and eventually run entire companies. It observes exactly how human office workers interact with their computer interfaces, tracking every movement. Just watching what they do. Yeah, and then it learns to replicate those exact digital processes completely autonomously. The software architecture driving Digital Optimus runs on two distinct parallel processes. System 1 runs natively on Tesla's low-cost AI4 chips to process real-time screen video and coordinate precise mouse actions. Right, so think of System 1 as your physical reflexes. Like the fast executor. Exactly, it handles the immediate, localized physical inputs on the screen. And then System 2 utilizes xAI's massive Grok model for high-level reasoning and broad world understanding. So Grok acts as the cognitive brain. Or the strategic navigator. It analyzes the broader business context of the task at hand and constantly directs the System 1 agent on exactly what goal needs to be accomplished next. Let's keep this simple. The computer inside your parked car is going to be doing corporate accounting. Basically, yes. Tesla is equipping millions of their consumer vehicles with advanced AI4 hardware. And when the car is parked in a garage, that immense computing power sits completely idle. It's just doing nothing.
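The two-process split described here, a slow strategic planner directing a fast local executor, is a common agent pattern, and it can be sketched in a few lines. Every class, method, and task name below is invented for illustration; this is not Tesla or xAI code, just the shape of the architecture the hosts describe.

```python
# Hypothetical sketch of the System 1 / System 2 split: a slow
# "navigator" decides WHAT to do next, a fast "executor" decides HOW
# to act on the current screen frame. All names here are invented.

from dataclasses import dataclass

@dataclass
class ScreenState:
    """What the executor sees: one frame of the UI, simplified to text."""
    visible_text: str

class System2Navigator:
    """Slow, high-level reasoning: hands out one goal at a time."""
    def __init__(self, tasks):
        self.tasks = list(tasks)
    def next_goal(self, state: ScreenState):
        return self.tasks.pop(0) if self.tasks else None

class System1Executor:
    """Fast, local reflexes: turns the current goal into an action."""
    def act(self, goal: str, state: ScreenState) -> str:
        # A real agent would emit mouse/keyboard events; we log intent.
        return f"executing '{goal}' on screen showing '{state.visible_text}'"

def run_agent(navigator, executor, frames):
    """Main loop: each new frame, ask the planner for a goal and act on it."""
    log = []
    for frame in frames:
        goal = navigator.next_goal(frame)
        if goal is None:
            break
        log.append(executor.act(goal, frame))
    return log

log = run_agent(
    System2Navigator(["open invoice", "fill totals"]),
    System1Executor(),
    [ScreenState("inbox"), ScreenState("invoice form"), ScreenState("done")],
)
```

The design point is the division of labor: the executor never plans and the navigator never touches the screen, so the cheap local chip handles the high-frequency work while the large model is consulted only once per goal.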
By utilizing the Digital Optimus software, the car's internal computer can securely log into enterprise business networks and process highly repetitive administrative tasks while simply sitting in the driveway. And Tesla plans to deploy millions of these dedicated computing units directly at their commercial Supercharger stations. Oh, to tap into the power there. Yeah, to tap into 7 gigawatts of readily available electrical power. This effectively turns Tesla's physical charging network into a massive, highly distributed edge computing layer that directly supports SpaceX's orbital AI operations. Because standard electrical grids experience significant transmission loss when moving electricity over long distances from a power plant to a centralized server farm. Exactly. So by placing the physical compute nodes directly at the power source, they eliminate that transmission waste entirely. That's incredibly efficient. The ground-based Superchargers handle the localized, immediate processing tasks. And the orbital Grok-sats handle the heavy global data synthesis and long-term memory storage. The lines separating cars, rockets, and chatbots are completely disappearing. We're watching the real-time assembly of a single, vertically integrated artificial intelligence infrastructure that operates seamlessly on the ground and up in orbit. If the processing power that ultimately runs global businesses and critical terrestrial infrastructure is floating in orbit, completely detached from any single nation's power grid, who actually controls the off switch? If you're not subscribed yet, take a second and hit follow on whatever app you're using. It helps us keep making this. We appreciate you being here.