How AI-Powered Holograms Are Reimagining Fan Experiences at the Big Game - Ep. 288
Jia Li, co-founder of LiveX AI, discusses how the company's human-like AI holographic agents are transforming fan experiences at Super Bowl 2026 and in retail environments. The technology creates full-size, interactive holograms that provide wayfinding, customer service, and personalized assistance in real-world settings, bridging the gap between digital AI capabilities and physical-world interactions.
- 80% of retail shopping still happens in-person, creating a significant opportunity for AI agents in physical spaces
- Real-time performance and accuracy are critical challenges for mission-critical AI agent deployments in crowded venues
- On-premise deployment is essential for reliable performance in high-traffic environments where internet connectivity is poor
- Human-like appearance and interaction are crucial for user engagement compared to text-based interfaces
- The technology requires complex integration of multiple AI models including language, visual, and audio processing
"In retail in the U.S. 80% of the shopping is actually still happening in real world in the in person."
"As humans, we're not designed to type. It is because of the earlier technology constraints that we have to use keyboards to communicate with each other."
"We can increase the average token speed by six times comparing to a traditional inference framework."
"I hope in the foreseeable future, every place, as long as there is screen or somewhere we can touch and interact with, there is a human like AI agent behind it everywhere."
0:00
Welcome to the Nvidia AI Podcast. I'm Noah Kravitz. Before we get started, a quick note to let you know that registration is open for GTC 2026 in San Jose, California. Nvidia GTC is the premier global AI conference, where developers, researchers and business leaders come together to explore the next wave of AI innovation. Visit nvidia.com/gtc for more information and to register. Jia Li is with us today. Jia is co-founder, president and chief AI officer of LiveX AI, which builds human-like AI agents, including one that fans attending Super Bowl week this year are interacting with, as we'll talk about in just a moment. Jia, welcome to the AI Podcast, and thanks for taking the time to be with us.
0:10
Thank you, Noah. It's a great pleasure to discuss AI, agents and physical AI.
0:56
Absolutely. I'm really excited to hear more about what LivX AI is working on. So let's get right into it. Maybe to start, can you tell us a little bit about your own background and your role at LivX AI?
1:03
Thank you. Hi everyone. I'm an AI technologist and have been in the AI field since the not-so-popular days of AI, so it's really exciting to see how everything has been evolving. I've been at small companies like Snap when they were tiny, 100 to 200 people, and saw the company grow to over 3,000 people; I was leading AI and AR efforts there. And also at a big company like Google: I built and co-founded the Google Cloud AI organization and oversaw the products and full-stack engineering there, and had the opportunity to work with different industries and see the real-world impact. Since then I got super excited about the real-world impact of AI and taught at Stanford; the latest is teaching the future of AI agents, which is very core to the future of AI interaction. And of course, building my own company. LiveX AI, as Noah introduced, is building human-like AI agents. Everybody has been talking about AI agents and about physical AI, and the digital world of AI is already fairly prominent at the moment. We at LiveX AI really want to bridge the digital world and the physical world and bring everyone the face of AI, for all the B2C companies who want to provide a smooth and human-like interaction with their users and customers.
1:14
So I think I mispronounced. I apologize, it's LiveX AI, not LivX AI.
3:22
I think that's a very good question, because in a way we would love to follow how people pronounce our company's name. Originally the company started with a vision of, hey, we want to make a positive impact on everyday life and everyday living. So basically that's living: we want to improve the living experience, leveraging AI. So that's "live" multiplied by AI, and that's how LiveX AI came together. But now, with all the evolution toward a 24/7-available human-like AI agent, we are live.
3:28
You're live. Yeah.
4:16
Right, so, all the time. Right. People have been calling us LiveX AI.
4:18
Got it. Okay. Well, for people who maybe haven't heard of LiveX AI before today, you started to talk about it a little bit, but can you dive in a little more to what your technology does?
4:23
So basically the technology builds human-like AI agents: there is a human-like voice, human-like appearance and human-like intelligence. But everybody knows that the power of an AI agent is that it's able to take actions. So we can foster human-like actions by interacting with the users. If you are in a retail store, the human-like AI agent can recommend a product for you and check the inventory for you, so you don't have to wait for a long time. Once you find the right product, you can go directly to the checkout line, or with a click it can just be shipped to your address. Right. So that's one way for the human-like AI agent to take action.
4:37
To be clear for the audience, when you're saying human-like AI agent, this isn't a chatbot, this isn't a text window. This is an animated, as you said, human-like, lifelike agent. It's like interacting with a person who's on a screen or in holographic form.
5:53
Exactly. It's a human-size, full-size holographic agent, one that you can interact with, that really connects with people way better than, you know, typing. As humans, we're not designed to type; it is because of earlier technology constraints that we had to use keyboards to communicate with each other. But we appreciate that human connection, from how it looks, how it interacts and behaves, and how it sounds, and it really helps you like a human. I mentioned the example of a full-size human serving consumers when needed. In fact, in a few weeks, on New York's Fifth Avenue, one of the world's leading brands will have it in their flagship store.
6:11
Oh, amazing. Okay. So the same way that I would walk into a store and there might be a concierge greeting people, asking, you know, what are you here for, what are you looking for today, what can I help you find, instead I could walk in and be greeted by a life-size human-like hologram who could ask me these questions. Same thing: direct me to a place in the store, or even, as you said, check inventory, take my order and send me off to the checkout line to pay. Yeah, yeah, amazing.
7:16
So guiding the users, which aisle you can find some item in, and just having that human connection, the feeling of human connection.
7:47
Why is it necessary? What's the gap in retail experiences or live entertainment? What's the problem in fan engagement and customer engagement that LiveX AI is solving?
7:59
Yeah. So right now, as we can see, lots of AI agents or AI tools are available online. And in a few years, no matter if we are on a website, using a mobile phone or in the real world, we might be interacting with an AI agent; the interface won't matter in the future. But right now, most of those still exist online. And in fact, in retail in the U.S., 80% of the shopping is actually still happening in the real world, in person.
8:12
In person. 80%. Wow, that's higher than I would have thought.
8:55
Yeah, exactly. And right now there is a gap: we don't know what the user needs and how the users are served, and how do we bring that digital experience to the physical world so that everybody can be accommodated. Right, right.
8:59
We did an episode recently with Jason Goldberg, who's a retail expert, and he was talking about this past holiday season, when Amazon and Walmart rolled out online shopping assistants, AI agents to help with shopping, and they weren't widely used. Right. Most customers didn't use them, but for the ones who did, they were really helpful, and they drove, I think it was, 3x engagement versus the ones who didn't. So clearly there's something here and it's working. And as you said, we're just moving towards agents proliferating across more and more of our experiences, particularly in retail.
9:20
Completely. At the recent NRF, the National Retail Federation show, we actually deployed a full-size holographic human-like AI agent together with Nvidia and Supermicro, on the latest RTX Pro 6000 server, everything on-prem. And we have seen that so many retail owners, shopping mall owners and brands have been calling for such technology in the past, either due to the limited resources available to assist everyone with their urgent needs, or a missing channel to provide the latest and greatest about their products in person, at the moment, at the place. So that's where a technology like a human-like AI agent can really, really bridge the gap. Right?
9:56
Absolutely.
11:08
We've talked about what it looks like in retail, but this could also be for big events, like the Super Bowl.
11:09
So let's. Yeah, let's talk about the Super Bowl. So you're doing a project with the NFL, with the Super Bowl, which is happening. Well, Super Bowl week, so the week of February 2, 2026. February 4? I forget the date.
11:20
So the big game is on February 8th. But really, that entire week there are different innovative events, fan zone activities and concerts leading up to the big game. We have over 20 activations across different cities and locations, from the airport to fan zones, from parks to city hall, from concerts leading up to the big game to innovative executive-facing events, that can show not only the tech world but also people from the rest of the country how AI can interact with fans, with executives, with everybody. And that is going to bring a little bit of a future-looking experience to such an impactful sports event.
11:35
So can you detail what one of these activations might be like, say, for a fan who comes across one of your activations at the game or at one of the fan zones leading up to the game? What's the experience like?
12:46
The experience could be: hey, I just landed, and there is a holographic AI agent. We call it Lyra.
12:58
So Lyra's at the airport?
13:08
Yeah, Lyra and a few of her colleagues are at the airport, and they're great. So Lyra is LiveX AI, your real-world agent.
13:10
Got it.
13:27
That's what it stands for.
13:27
Yep, your real-world agent. Great.
13:28
Yeah. So basically, when you land: hey, where do I want to go, and where should I pick up my luggage? And also, where should I check things out while I have a little bit of time before the game really starts? What are the interesting companies or interesting areas I should check out, which restaurants should I go to, and where should I get my rideshare, et cetera. Right. So that's where the journey starts. And then fast-forward to: hey, I'm at one of the fan zones and I want to find out what is going on for the rest of the day, what events are available around it, and where should I go in order to find my seat, et cetera. Right. So that's wayfinding, and I can navigate around the neighborhood. And there are also scenarios when there is an emergency situation. You know, we always want to protect the fans, the users, to the best of our ability. Right? So in this area there might be some risk factor: there could be fire or, you know, an earthquake, et cetera.
13:30
Right?
14:55
So what's the way out that will be the safest? And even in a typical situation, right? When you leave after the concert, or when you arrive at the concert, how do you keep everything moving smoothly so there won't be a traffic jam, et cetera? Right. So that's another way to protect and to help in the entire scenario. And in some situations you might be able to interact with your favorite characters and take a celebration selfie that you can share with your family and friends, to show: hey, I'm here, and I had the best AI experience. Not only am I here for the big event happening here, but this is possible with AI, and this is possible with human-like AI agents.
14:55
I recently went on a two-week trip with my family that involved airports and train stations and unfamiliar cities, and I went to a big sporting event at a venue I'd never been to before. And I just kept thinking about all of those situations: being in an airport, looking at the large board of all the arrivals and departures, and the map of the airport and where the different gates are and all that, and thinking: oh, if it was an interactive agent and I could just ask for the specific piece of information I wanted, how much faster it would all go.
15:57
Exactly. Completely. So basically, for the wayfinding: hey, you want to go to a different gate, or if there is a gate change, right, up-to-date information. Right?
16:31
So yeah, absolutely. Yeah, yeah. Going back to the Super Bowl experience and thinking about sports leagues in particular: when teams from across the NFL, and I would imagine other sports leagues, have representatives at the Super Bowl during the week and they're exposed to this experience, do you think it will change the way they think about fan engagement and what fans should expect from the experience of going to a big game? And how do you see this changing the way that sports teams and leagues in particular think about the fan experience?
16:45
Completely. So far, for all the teams we've interacted with, everybody is asking: how can we provide the best fan experience? That's the core need for everyone, right? So you can have that lifetime moment, either the celebration of a great success, you know, winning a game, or, even for the kids and families, learning a little more about the teams, about the players, about the league overall. Right. So that's a core part, apart from the day-to-day support or experience that we can provide to people. Right. It's a long history that people can learn from and feel affiliated with or attached to.
17:21
Right. That makes me think about the relationship between fans and athletes. I'm a big sports fan, I've been a big sports fan my whole life, and the times when I've been able to interact with a professional athlete or a college athlete, get an autograph or take a selfie or whatever, those are, as you said, lifetime memories. I'll never forget, as a little kid, getting John McEnroe, the tennis player, to sign my program. He didn't even look at me, he just kept walking, but he signed it, so it was great. Right. Are you working with, or thinking about working with, individual athletes? Is that potentially part of the experience, an AI agent that represents a specific athlete and sort of builds that relationship with the fans? How might all of this technology impact the relationship between fans and athletes?
18:22
Yeah, I think celebrities and athletes always want to find a way to connect with their fans. This is the future of AI agents: they can represent them in an authentic way and provide that connection that everybody, the fans, would appreciate.
19:07
It's such a big part of sports and, as you said, of celebrity culture today with social media. That access, feeling like you have a connection with the celebrity, with the athlete, has become such a core part of the experience. Yeah, totally. I'm speaking with Jia Li. Jia is co-founder, president and chief AI officer of LiveX AI, which, as we've been talking about, builds human-like AI agents for all kinds of experiences: fan experiences at sporting events and other events, all manner of customer service and wayfinding services for consumers and travelers, as well as executive experiences. Jia, if we could, let's pivot a little bit and talk about the technology. What goes into it? What's the hardest part, technically speaking? What are some of the biggest technical challenges in making holograms that feel human-like, that look and act and respond as we'd expect them to?
19:32
Yeah. I think right now AI agents are pretty well received in the tech world already. And as many of us know, there are two big challenges for AI agents, especially for mission-critical tasks: efficiency and accuracy. If you are working on a marketing report, or even writing some code for a project, you can always wait. But not for mission-critical tasks, for example real-time interaction. Suppose you are in the stadium and there is a human-like AI agent that looks perfect, but you have to wait a minute to hear its response.
20:29
And the clock is ticking, halftime's ending, you need to go to the bathroom and there are people waiting behind you. Yeah, no, you can't have a lag, you can't wait.
21:23
Exactly. Another aspect is the accuracy. Right. So you can handcraft some of the answers so that you can be very efficient, but then you can't be intelligent or foster actions on top of it. Right. So through this journey we have been really grateful to have Nvidia as a great partner. Combining the hardware, software and algorithm advantages, we made it possible to have real-time interaction that is intelligent and smooth. We've seen this as the first time a 4K, high-quality, full-size holographic human-like AI agent can interact at the place, at the time, to serve your purpose.
21:31
You mentioned before that, I think it was, the activation at NRF was running on-prem on Nvidia hardware. If we could talk about the software stack for a moment: are you using Nvidia models as well?
22:42
Yeah. So we have benefited from Nemotron, from Triton and TensorRT, from NIM and NeMo, all these different components. Because building a human-like AI agent is very complex, not only on the model side but also on the system level. So we have a combination of different types of models, some trained by ourselves; some handle, for example, languages that we don't even understand, which are hard for us to evaluate, and of course we benefit a lot from companies who have a lot more experience than us. So it's a combination of software and hardware. And if you think of scenarios like NRF, which hosts over 40,000 people, many of them executives looking for the best experience: in such a crowded indoor scenario, the Internet is always awful. By deploying everything on-prem, on Nvidia GPUs, together with our partner Supermicro, we are able to serve crowds with that seamless experience. Similarly, in sports events like the Super Bowl, hosting millions of fans, how do we have a scalable experience, on time, right at the moment, at the place? That's very, very important.
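As a rough illustration of the kind of multi-model pipeline described here, and not LiveX AI's actual implementation, an agent turn typically chains speech recognition, a language model, and speech/animation synthesis, with end-to-end latency being the sum of the stages. The stage functions below are hypothetical stand-ins; in a real on-prem deployment each would call a locally served model.

```python
import time

def transcribe(audio: str) -> str:
    # Stand-in for a speech-to-text model (e.g. an ASR service).
    return f"text({audio})"

def generate_reply(text: str) -> str:
    # Stand-in for the language-model step that decides what to say.
    return f"reply({text})"

def synthesize(reply: str) -> str:
    # Stand-in for text-to-speech plus avatar rendering.
    return f"speech({reply})"

def agent_turn(audio: str) -> tuple[str, float]:
    """Run one interaction turn and measure its wall-clock latency.
    Every stage sits on the critical path, which is why inference
    speed matters so much for real-time, in-person agents."""
    start = time.perf_counter()
    out = synthesize(generate_reply(transcribe(audio)))
    return out, time.perf_counter() - start

out, latency = agent_turn("where is gate B12?")
print(out)  # speech(reply(text(where is gate B12?)))
```

The sequential structure is the point: any stage that stalls (for example, on a flaky venue network) delays the whole response, which is one motivation for running the full stack on-prem.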
22:55
Can you talk about how, as you mentioned before, Nvidia Triton and NIM microservices fit into your production setup, and how they help run these models at the scale you've been talking about?
24:58
Totally, yeah. So the NIM collaboration with Nvidia started actually at GTC a couple of years ago.
25:10
Okay, getting in another plug for GTC. So many great relationships started at GTCs past.
25:19
Totally. Yeah. Together, at that time, with Google Cloud and Nvidia, the three teams were thinking: hey, NIM is around the corner, we need to build human-like AI agents at scale, and how do we make that happen? And the three companies worked almost like one team and said, okay, let's make it available at Google Cloud Next, which is just a few weeks after GTC. And we made it. It is the combination of deploying the MoE-structured model onto Kubernetes at Google Cloud and leveraging autoscaling to scale up to serve the peak volume and scale down when there are fewer people interacting with it. The Nvidia team and the Google team highlighted at Google Cloud Next that the collaboration helped: by leveraging NIM, we can increase the average token speed by six times compared to a traditional inference framework.
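The scale-up/scale-down behavior described here can be sketched with the standard Kubernetes Horizontal Pod Autoscaler rule, desired = ceil(current × metricValue / target). The utilization numbers and replica bounds below are illustrative assumptions, not LiveX AI's actual configuration.

```python
import math

def desired_replicas(current: int, metric_value: float, target: float,
                     min_r: int = 1, max_r: int = 20) -> int:
    """Kubernetes HPA scaling rule: ceil(current * metric / target),
    clamped to the configured minimum and maximum replica counts."""
    desired = math.ceil(current * metric_value / target)
    return max(min_r, min(max_r, desired))

# Peak traffic: pods at 90% utilization against a 60% target scale out.
print(desired_replicas(current=4, metric_value=90.0, target=60.0))   # 6
# Quiet period: 10% utilization scales back toward the minimum.
print(desired_replicas(current=6, metric_value=10.0, target=60.0))   # 1
```

Serving each model as its own containerized microservice (the NIM pattern) is what makes this per-component autoscaling possible, since each stage can scale independently with its own load.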
25:26
Amazing. Yeah, that's fantastic.
26:56
So that means the AI agent could have multiple steps, and each of these steps means an LLM or VLM call. Right. So if you can speed up six times, that means when you are taking six steps, other people could be taking only one step.
26:58
Only one.
27:22
Right, right. So that really solved a huge bottleneck for us at that moment, and we made it possible to have that scalable and efficient human-like AI agent.
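The back-of-the-envelope math behind that point: if each agent step is a sequential LLM/VLM call, a 6x token-throughput gain divides the latency of every step, so a six-step pipeline finishes in roughly the time a single step took before. The token counts and baseline speed below are hypothetical, chosen only to make the arithmetic concrete.

```python
def pipeline_latency(steps: int, tokens_per_step: int, tokens_per_sec: float) -> float:
    """Total generation time for a sequential multi-step agent pipeline."""
    return steps * tokens_per_step / tokens_per_sec

BASELINE_TPS = 50.0   # hypothetical baseline decode speed (tokens/sec)
SPEEDUP = 6.0         # the 6x figure quoted in the conversation

slow = pipeline_latency(steps=6, tokens_per_step=200, tokens_per_sec=BASELINE_TPS)
fast = pipeline_latency(steps=6, tokens_per_step=200, tokens_per_sec=BASELINE_TPS * SPEEDUP)
one_step = pipeline_latency(steps=1, tokens_per_step=200, tokens_per_sec=BASELINE_TPS)

print(slow)      # 24.0 seconds for six steps at baseline speed
print(fast)      # 4.0 seconds -- six steps in the time one step used to take
print(one_step)  # 4.0 seconds for a single baseline step
```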
27:23
And are your teams using CUDA and TensorRT in your workflows?
27:40
Yes. Sometimes you need to get into the details of how to leverage the best kernel. Right. Of course, collaborating with the Nvidia team really squeezes every bit of computational efficiency out for the best scenario. So that has been really helpful for us through the entire development process.
27:44
You mentioned, again going back to the NRF example, that you're running on Nvidia RTX Pro 6000 GPUs. Is that both on-prem and in the cloud?
28:14
Yes. At NRF we deployed on-prem, so basically on a workstation with two RTX Pro 6000 GPUs that were just right behind the stairs. That's the on-prem scenario. And in the Super Bowl scenario, because of the scalability, we also added cloud deployment, again leveraging RTX machines.
28:25
What do these GPUs let your holograms do that a standard server just couldn't do?
29:03
Yeah. So it has the unique advantage of both the visual graphics generation and rendering side, as well as handling large models, language models, visual language models and audio models, altogether. So the performance, even at a consumer price, is way better than even some of the business-facing types of architectures. Yeah, it's the Blackwell architecture, which makes it very powerful, and we have been mainly leveraging it for inference, and sometimes looking into lightweight training as well.
29:10
Right, great. So if we could look ahead a little bit as we kind of wind the conversation down: Jia, beyond sports and retail, as we've been talking about, which industries do you think will adopt AI holograms next, and why those first?
30:04
I hope in the foreseeable future, every place, as long as there is a screen or somewhere we can touch and interact with, there is a human-like AI agent behind it, everywhere. Yeah, we are starting with the most crowded and most needed areas, like events, concerts, sports games, and also retail scenarios, shopping malls, et cetera. But we see there are more scenarios or verticals that could adopt this as well, for example travel and hospitality, and even scenarios like: hey, can we help every patient in the future, to provide that connection when they need it? Right. So yeah, I hope it can make more positive impact on our everyday life.
30:19
Absolutely. Going back to retail for a moment: I would imagine the answer is yes, but are big consumer brands excited about the tech? And if so, are there certain things that they're particularly excited about? Why is the buzz building around the idea of AI holograms in consumer applications?
31:28
Completely. So far, the Fortune 500 big brands are really looking forward to this kind of technology. As I mentioned, in the past most of the interactions, or most of the learned interactions, happened in the digital world: on the website, on mobile, et cetera. And all the brands want to provide the best experience for their users in the real world as well. Right. In the past, that information was completely missing, and the tools were completely missing there, so nobody knew what the users needed; it's a big gap in serving the users. Nowadays, because of the chatbots and smart search available on mobile, when users go into a big shopping mall or brand store and can't immediately find what they are looking for, they can pull out their mobile phone and start checking with some of the chatbots. But those chatbots don't represent the brands, and they won't know what is available at the moment or what is more suitable for the user, let alone that there might be much outdated information on the Internet that could completely confuse the users. So the brands are looking for an intelligent human-like AI agent that can represent their brands, speak and look like their voice, and provide the most accurate and up-to-date help that their users need when they walk into the store or the shopping mall.
31:48
I mean, we've been talking about the future, the near future, sort of a lot throughout this conversation. But to put a point on it: how do you see AI holographic technology changing industries, kind of broadly changing the world, if you will, over the next, I don't know what the right time period is, the next five years, five to ten years, as the technology really matures and becomes more accessible in terms of cost and the resources you need to run it? How is this going to change day-to-day life across people and industries going forward?
34:01
Completely. I think, as I shared earlier, our technology is advancing fast, and that's why nowadays we don't necessarily need to type to receive some of our information; we can talk to it, sometimes we can make a phone call to do it. All of this is the necessary progress toward a future where, down the road, whenever and wherever you go, there could be that human-like AI agent that can connect with you, so that you feel, hey, I'm being taken good care of. Right. And many people don't prefer to read long paragraphs of answers, or to have to guess: hey, is that person listening to me on the phone? I can't see them, and I can't understand or guess whether they are paying attention to me. Right. So that's where that physical AI appearance is very, very important, for people to have the kind of connection where they will also pay attention to it. Right?
34:37
Right.
36:08
Yeah. In my vision, that will be part of our everyday life, and I hope our entire industry moves as fast as possible to bring that to reality.
36:08
Jia, for listeners who want to learn more about LiveX, about any of your activations or ongoing work, where are the best places online? Since we don't all have personalized holograms guiding us just yet, where can listeners go online to learn more? Company website, social media channels? Where would you direct them?
36:25
Yeah, come visit LiveX AI. We have a lot of information about the different industries and the different hardware and software that is available; see it online and experience it. And also follow us on LinkedIn; we share a lot of up-to-date information there about some of the activities. We will have it at GTC as well.
36:47
Perfect.
37:19
And if you are in the Bay Area, come check out our activations at the different fan zones, concerts and the airport. Experience in the real world how it can help you.
37:19
Perfect. Jia Li, this was a really entertaining and insightful conversation. Thank you again for taking the time, and all the best on everything you and LiveX are doing with this just incredible technology that, as you said, you really need to experience in person to get the full effect. But all the best to you, and thank you again for coming on the pod.
37:36
Thank you, and I look forward to meeting many of you at GTC as well as during Super Bowl week.
37:57