AGI is coming in 2025 or at least that’s what OpenAI’s Sam Altman thinks. Plus, huge announcements from NVIDIA at CES & hands on with Google’s VEO 2. AND OpenAI’s Operator (aka their AI Agents) might come soon, DeekSeek V3 is pretty darn...
AGI is coming in 2025 or at least that’s what OpenAI’s Sam Altman thinks. Plus, huge announcements from NVIDIA at CES & hands on with Google’s VEO 2.
AND OpenAI’s Operator (aka their AI Agents) might come soon, DeekSeek V3 is pretty darn good, Meta makes a big mistake with their AI personalities, Minimax’s new subject reference tool, METAGENE-1 might help us stave off disease, and all the robot vaccum news you could ever want.
Oh, and Kevin is sick. BUT HE’S GOING TO BE FINE.
Join the discord: https://discord.gg/muD2TYgC8f
Join our Patreon: https://www.patreon.com/AIForHumansShow
AI For Humans Newsletter: https://aiforhumans.beehiiv.com/
Follow us for more on X @AIForHumansShow
Join our TikTok @aiforhumansshow
To book us for speaking, please visit our website: https://www.aiforhumans.show/
// SHOW LINKS //
Sam Altman Blog Post:
https://blog.samaltman.com/reflections
Head of OAI “Mission Alignment” warns to take AGI seriously: https://x.com/jachiam0/status/1875790261722477025
OpenAI Agents Launching This Month?
https://www.theinformation.com/articles/why-openai-is-taking-so-long-to-launch-agents?rc=c3oojq
Satya Nadella says AI scaling laws are “Moore’s Law at work again”
https://x.com/tsarnick/status/1876738332798951592
Derek Thompson Plain English
Digits: $3k Supercomputer
https://www.theverge.com/2025/1/6/24337530/nvidia-ces-digits-super-computer-ai
New GFX Cards including 2k 5090
https://www.theverge.com/2025/1/6/24337396/nvidia-rtx-5080-5090-5070-ti-5070-price-release-date
Cosmos World nVidia World Model
https://x.com/rowancheung/status/1876565946124341635
DeepSeek 3 Crushes Open Source Benchmarks
Oopsie File: Meta Deletes AI Characters After Backlash
https://www.cnn.com/2025/01/03/business/meta-ai-accounts-instagram-facebook/index.html
Minimax Subject Reference
https://x.com/madpencil_/status/1876289286783615473
Famous Science Rappers
https://youtu.be/B56rwm2sn7w?si=hD1ankVpWvALHAN5
Science Corner: METAGENE-1
https://x.com/PrimeIntellect/status/1876314809798729829
HeyGen Works With Sora -- VERY Good LipSync
https://x.com/joshua_xu_/status/1876707348686995605
GameStop of Thrones
https://www.reddit.com/r/aivideo/comments/1htvzzc/gamestop_of_thrones/
Simulation Clicker (not sure this is AI as much)
https://x.com/nealagarwal/status/1876292865929683020
Torso 2 In the Kitchen
https://x.com/clonerobotics/status/1876732633771548673
Roborock's Saros Z70
https://x.com/rowancheung/status/1876565471887085772
Halliday Smart Glasses
https://x.com/Halliday_Global/status/1871571904194371863
[00:00:00] It's 2025 everyone, and we might actually get AGI. Well, that's what hype master Sam Altman wants you to believe. Plus, are OpenAI's agents on the way, Kevin? Well, that's what hype master Sam Altman wants you to believe, sure. And NVIDIA's new 3, 000 mini supercomputer. Plus, Jensen Wong has a new leather jacket.
Okay, yeah, sure, if we want to believe that. Legendary hype master, open AI founder, Sam Altman. Sure, that's fine. Kevin, Kevin, are you okay? Are you okay? I am so sick and I'm at CES. It's AI for flumens. It's AI for flumens.
Gavin Purcell: Is 2025 everybody we are back in Sam Altman has dropped another banner of a blog post in which he is alluding to the idea that not only are we going to get a GI in 2025 but Kevin. Open AI has turned its eye towards super intelligence. Now we should [00:01:00] talk a little bit about what this means. It is just a blog post, but there has been a lot of people at open AI kind of behind the curtain who have been talking about this idea. And obviously, before the end of last year, we saw the announcement of 03, the newest version of the reasoning model. And I think this is actually a pretty big deal because the signaling is getting a little louder. I will also very quickly mention that the, , head of mission alIgnment for open AI, which I guess is their version of their safety team.
Gavin Purcell: Now, Joshua Akim, Has a pretty interesting tweet storm, which starts with the world isn't grappling enough with the seriousness of AI and how it will upend and or negate a lot of the assumptions many seemingly robust equilibria are based on. So Kevin, explain that tweet to me in less
Gavin Purcell: than 10 seconds.
kevin: no, I won't. I'll read a quote from Sam Altman's reflection, a blog post where he says, we are now confident we know how to build AGI as we have traditionally understood it. Then to your point earlier, we believe that in 2025, we may see the first [00:02:00] AI agents joined the workforce and materially changed the output of companies.
Gavin Purcell: And to that point, , there's a new story in the information that talks about opening eyes, launching agents this month.
Gavin Purcell: Now, something interesting around this story was there's a little bit of fear around the idea of Prompt injection, which we know means that injecting kind of malicious prompts into AI. And if you can imagine a world where agents go out there and do things for you, malicious prompting could be a big thing when it comes to those sorts of scenarios. I think it's going to be very interesting to see if these agents are actually useful. We saw Anthropic drop computer use, which is a version of agent to computing a couple months ago. And while we saw some cool demos, I haven't seen a lot of people doing amazing stuff with it yet. I'm not sure Kev, if this is like the future or if this is a kind of a stopping point till we get to what the future is going to be.
kevin: Yeah. I think it'll get us, it'll get us closer. I'll ask you how many times have you been building something or [00:03:00] working with AI and then you find yourself almost like the. The computer's assistant, where you're just connecting one output to the next input or chaining one prompt to the next prompt, trying to orchestrate something.
kevin: It, the machines are perfectly capable of building the product specification that you want, outlining the novel that you want to create. Writing interesting looking cinematic prompts for an image generation software. It's totally capable of doing all these things. It's the orchestration layer on top of all that, where they tend to suffer.
kevin: And if you look at the Oh one release or the promise of Oh three, it could be smart enough To reason and orchestrate if it's given power over the other agents and its workflow to say, you go out and do this, you go out and do that. Okay, great. I'm going to take your two reports or results and jam them into this other thing.
kevin: Like I think it could get us quite a bit of the way there.
Gavin Purcell: Are you talking about that? Agents are the Reese's peanut butter cups of AI and that they
Gavin Purcell: take chocolate and peanut butter and make something delicious out of it. Is that a good [00:04:00] metaphor?
kevin: I like that. Especially if it's a Reese's peanut butter cup, like McFlurry. Because it would mean that half the time they don't work. They are out of order. I'm sorry. There's nothing we can
Gavin Purcell: the machine is not
Gavin Purcell: working. The machine is down.
kevin: Yes.
Gavin Purcell: Anyway, we're looking for a pretty exciting January, especially if these agents come out. , and the other thing that's been going on obviously is CES. Kevin's at CES right now. And Satya Nadella Microsoft's head came out and said a little bit more about these agents and also, test time compute and how it's going to grow over the next year.
Gavin Purcell: Take a listen to this.
kevin: The scaling laws that are powering AI and pre training in particular. It's really Moore's law at work again. And it started in 2010. With DNNs and then obviously the GPUs they inflected again, perhaps with transformers just because of the efficiency of data parallelism with transformers and what was happening perhaps of doubling of capacity every 18 months, started to double every six [00:05:00] months.
kevin: That's really what the scaling laws were. And by the way, there's all this debate, what's happening with scaling laws for pre training. Will they continue? Have we hit the wall? We fundamentally believe the scaling laws of, absolutely still great and will work and continue to work, but they do become harder,
kevin: Gavin, you want to explain that all to me in less than three and a half seconds, please?
Gavin Purcell: I can do this one. So basically, Satya is this is another hype master general voicey post, but he's basically saying that the scaling laws of AI, meaning how much how I can get better over time are continuing to scale, meaning what we've seen the pastor scaling. Now what people said at the end of last year was that pre training scaling, which means that the idea that these very large models that they train on first. That is not scaling, but test time compute 0103, those sorts of things, which look at current things and then reason around them is scaling. So in general, this is just another CEO out there saying that the AI growth will continue. Of course, as always, you have to take this with a grain of salt. [00:06:00] Microsoft has a massive investment in opening.
Gavin Purcell: I, this is what they see as the future. I just heard a great episode of Plain English, Derek Thompson's podcast the other day where they talked about this growth and spend in A. I. And obviously this is going to be one of the big stories of 2025 too, is the literal hundreds of billions of dollars that are being poured into A.
Gavin Purcell: I. technology without some sort of like real revenue stream coming out of it. Microsoft, Salesforce, all these companies are betting that AI agents are more than likely going to be that revenue stream. But Kevin, speaking of revenue stream, there's a very funny tweet that Sam Altman himself put out where he was like, surprised that people were using Oh, one pro as much and supposedly said they thought they were going to earn money on it, but actually are not.
Gavin Purcell: They're losing money because people are using it so much. A lot of people have said, no, you know what you were charging for this, whatever, but like this stuff is not cheap, right? It's not cheap to use. Oh,
kevin: I'm wondering how much of that is just people justifying their monthly subscription. Like I need to use this. I have to run as [00:07:00] many queries through this. And I wonder how many of those queries actually need to be run by the highest end models. I use the Oh one release the basic Oh, one release has been good enough to unstick any coding issues that I'm having, but this harkens back to the agent stuff from earlier, but cursor is an app that I talk about a lot on the show and a lot of people use it.
kevin: Their latest update has agents built into it. And this is a tiny little. Inching into this world, but when I give it a task, I can see the way it's breaking the task down, reviewing its own code, trying to iterate through the problems and fix it.
kevin: And when it works, it does feel magical. So I just not to get back to the agent discussion, but I just, that plus a super high, powerful Oh three. That's the director at the whole thing. I get, I don't know. I get excited for that.
Gavin Purcell: 0405, which supposedly the rumor is that are coming
Gavin Purcell: out this year. That's how fast they think the O's are going to happen. . They're somehow like rolling the O's up like we got to keep somebody in our audiences to come up with a version or a name for how the O's will climb or. The big O. Maybe not the big O. [00:08:00] Maybe not the big O. Let's stay away from the big O. Okay, we're gonna keep going forward. Speaking of exciting things, Kevin, the most important thing you could turn your agent on, I feel like, is liking this video right now. If we had an agent that liked and subscribed for
Gavin Purcell: us, can you imagine how many more subscribers on YouTube we'd have?
kevin: if I had an agent that would just return my phone calls or emails, I would be excited, but in the world of artificial intelligence, sure. Someone make a bot, please. Just let it, no, actually that's against terms of service. you, Dear human that's listening, you, go ahead and manually old school analog steampunk, if you will, manipulate your mouse and keyboard, click a click a subscribe, it costs you nothing but it helps us out, it gets us seen in the old algorithm.
Gavin Purcell: Oh, and over the break, we had an amazing little tweet from a guy named Andrew Wilkinson, who's a fan of our show, who runs a pretty big company up in Victoria, BC. So first of all, thank you, Andrew, for that. We know we saw it and we really appreciate it, but that also helps us.
Gavin Purcell: If you ever want to share our show somewhere, that's great. That really brings more people into the show. In fact, last week's episode, I think got listened to. By far, [00:09:00] not last week, whatever the two weeks ago, we did take a week off, got listened to by far more than almost any episode to date. So thank you so much.
Gavin Purcell: Let's get back to CES stuff, Kevin. Let's talk about Papa Jensen Wong. And I don't mean that I don't want to just get into his, I don't know what happened fashion wise with Jensen this time. He maybe was in Vegas, so he went a little bit crazier with his leather jacket this time. Do you have any comment on that?
Gavin Purcell: I know you're a leather jacket aficionado. Anyway,
kevin: it's a double knit it's a wool crotch,
Gavin Purcell: honestly, what's so interesting about CES now is that the NVIDIA keynote feels like the biggest deal. Like it's not really gadgets as much, but like this is the big deal keynote. And they dropped quite a bit of news in this keynote. First and foremost, Jensen does his long speech about different things, and I thought the interesting thing he talked about in the beginning of it was the idea that we're going to be moving into this space where robotics and self driving cars are the next generation of AI in the real world.
Gavin Purcell: And you and I both know we've been in Waymo's. We know that Waymo's are starting to [00:10:00] take over the world. Robotics is the thing that they keep hammering on. We're going to talk about a very creepy robot later, but humanoid robots feel like they are having a moment a little bit. Obviously, Jensen is pumping his own bags again.
Gavin Purcell: He knows that every single thing needs to have chips in it to function. The coolest thing, though, he announced, I thought, was this 3, 000 mini supercomputer that they're calling Digit. Did you see this?
kevin: Yeah, it looks like you're going to be able to run, GPT 4. 0 quality or level models locally in your home with this little bot and all your devices can connect to it. And it, it's a supercomputer for three grand.
This here's the amazing thing. This is an AI supercomputer. It runs the entire NVIDIA AI stack. All of NVIDIA software runs on this
Gavin Purcell: Yeah, which, and I saw somebody say that if you string two of these together, and they're not very big by the way, they are like about the size of a, smaller than a shoe box,
kevin: They look like a Mac Mini, almost.
Gavin Purcell: Yeah, exactly. And if you string to those [00:11:00] together, you can like legitimately run the 70 billion local models, which I think ultimately will be a big deal as we want to get away and do stuff that we want to do with these things.
Gavin Purcell: The other thing obviously that came out of this was the new graphics cards. So if you're out there, you're a video gamer, you've heard this already, but the 5090 is the new 2000 card that is going to be using AI for a lot of its upscaling.
Gavin Purcell: AI is basically being integrated into the same way that shaders were.
Gavin Purcell: And this is me just talking out of my ass. But now AI is the buzzword within the graphics card industry as well. Yeah,
kevin: when you look at the benchmarks, it shows this 5090 and even the 5080 card just blowing previous gen cards out of the water, but there's, it's almost like the benchmarks on some of the graphics are split in half, and if you look at The sides that they're really exploding.
kevin: It's because AI is generating two to sometimes five additional frames of information for every one frame that's naturally being rendered by the system. And [00:12:00] so does it look like your game is running at, 300 frames a second? It may look like that to you, but they're not real frames. They're these synthetic frames being generated and output.
kevin: By the AI engine. And so is it fair to benchmark these, AI versions over the regular versions? Are we really seeing that much of a performance enhancement? I don't actually know. AI
Gavin Purcell: did say something about AI and video games. Play that clip there.
kevin: is going to re, re invigorate the video game industry. On the one hand for developers, it's going to reduce the cost of creating the content. On the other hand, all of the characters that are in the games are going to be smart characters in the future. So every time you interact with them, they're going to be interacting with you in a much more intelligent way.
kevin: And so the games are going to be more interesting. The characters are going to be more interesting. The content development cost is going to decline and that's going to be really great for the industry.
Gavin Purcell: I don't know if I completely agree with that statement. I saw. Actually, I saw an interesting video. I think the [00:13:00] New York Times put it out, which was talking about why some triple A games are flopping. and the big thing was like, they're not as much fun. And I think that the idea is maybe this lower cost will bring more fun video games to the forefront and we'll get like smaller teams being able to do stuff.
Gavin Purcell: That's a little bit more triple A and quality, but edgier and choice
kevin: That's all very exciting, but you hear Jensen say this. And then I don't know if you saw the Nvidia project R2X video, Gavin.
Gavin Purcell: I did.
Hey R2 X, how are you? I'm doing great, thanks for asking. What can you do for me on my PC?
kevin: Hey, R2X, how are you? You did a pretty convincing R2X head wobble. Oh, that
Gavin Purcell: I know it's
Gavin Purcell: pretty. This is So,
Gavin Purcell: I saw somebody referred to it as clippy AI.
kevin: it's Clippy meets Bonsai Buddy, but it is yeah, here's another. Introducing Project R2X, an RTX powered digital human interface for developers and enthusiasts building AI agents. Okay, so now here it is guiding you, assisting you with an app. How do I use generative AI to replace the jacket in this photo?
kevin: [00:14:00] In Photoshop, use the generative fill feature. First, select the jacket with the selection tool. Then, click on generative fill in the context And you have the power of 4 billion supercomputing clusters. You have all of the shiny jackets a growing boy could need. And this is what your Project R2X sounds like.
kevin: It looks uncannily odd. , it looks like, a second life avatar from like the early aughts. I don't know. It just blows my mind that's their project.
Gavin Purcell: Here's the thing. NVIDIA has a lot of things going on, Kevin.
Gavin Purcell: And that is probably, can you imagine being the product manager for that thing? And you're like, Oh boy,
Gavin Purcell: we got to get this out at CES
kevin: And it's probably like a person too. I bet this was like a jam within Nvidia. They were like, I look what I did for four hours. And now they're like, Oh, announce it. Why not?
kevin: I think the more impactful thing, as you talked about robotics earlier, was their Cosmos world models.
The next frontier of A. I. Is physical A. I. Model performance is directly related to data availability. But physical world data is costly to [00:15:00] capture, curate, and label. NVIDIA Cosmos is a world foundation model development platform to advance physical AI.
kevin: So Cosmos in Jensen's own words, once they want developers to use Cosmos to basically generate worlds and those worlds will allow developers to test and validate their AI models, , and their machines and their robotics and all this stuff, Even across multiple sensor views, they want Cosmos to power the footage in the world.
kevin: That's going to be used to test. So what does that mean? That means if you are aiming for self driving cars, you can generate, , all the cameras, the LIDAR sensor, the acceleration and wheel state of the vehicle. If the user were to interrupt, you can generate all this data. In real time using their Cosmos engine and you can account for different, weather types or time of day or edge cases.
kevin: And one of their examples is like a bear on the side of the road that looks like a Skyrim character growling at one of the lenses. But this applies to cloth simulation. This applies to material simulation. This applies to [00:16:00] warehouse environments and open world environments and stuff. So basically they want Cosmos to be the, , Sort of world engine.
kevin: So that way, when you are working on your bipedal robot or your Roomba that can pick socks up and move them out of the way, which we'll get to later, , Cosmos is the thing that's powering all that.
Gavin Purcell: Yeah, it's pretty cool. NVIDIA, like again, they came to play and they really are driving the whole AI space forward. , okay, we should move on and get a couple quick hitters. A deep seek version three came out last week and people are shocked at how good it is. This is the Chinese open source model that is now competing with Lama 3.
Gavin Purcell: 1 and a bunch of other things in terms of quality level. Kevin, it's fast. It's very interesting. It sometimes thinks that it is chat GPT,
Gavin Purcell: which is
kevin: I wonder why
Gavin Purcell: shows that because it shows that my maybe they've got some interesting training data. But in China, as we've said in the show many times, the laws are not the same in terms of IP protection.
Gavin Purcell: I think this is a pretty big deal,
kevin: it like at a fire sale. [00:17:00] Basically they've cut the price of the token. So it's really cheap to develop for. We'll see if it can retain the quality and the performance for price once it goes back up or if things will be slowed down at a cheaper tier. But, this is.
kevin: Something we also signaled a long time ago is that different models developed by different countries and nations with different politics will some have different censorship built in or not have censorship built in. I think we can assume based off of its predilection to confuse itself with chat GPT.
kevin: That, , maybe some license agreements were violated in the road to creating this model. I'm not going to go there, but I will say if you ask it for information about political issues or whatever, it just straight up will not answer. And so we have to keep that in mind as these new models come out, we have to now start really taking into consideration, where is that model coming from?
kevin: What biases might a model have, that you might not even be aware of as the user.
Gavin Purcell: It's funny you say that because we'll transition now to our next story, which is about Meta's strange AI avatars that they have now booted off their system. There's a pretty big story this [00:18:00] week where somebody came out and there was a a black woman who was LGBTQ what, how do I want to say this?
Gavin Purcell: Like a
kevin: It doesn't matter. You can't offend her. She's not real. That's the thing that is just I know. No, I get it. I understand it.
Gavin Purcell: I want to make sure that?
Gavin Purcell: like I get the wording right, but you're absolutely right. This is the problem. Yes, this is the problem is that basically they created an AI and again, it's not like there's a black woman who's an LGBTQ affiliated person out there who says I created it. This was something that was an AI that was created and they got In huge trouble for it, which they should have, right? Because this is you're creating an AI that's specifically designed to connect with a group of people.
Gavin Purcell: This is just a very weird place that we are entering right now where you have, AIs that are created around personalities or types of people, which is kind of stereotypes, right? This is where you run into some big trouble, I think, if it's not
Gavin Purcell: based on a real person originally.
kevin: The bio here was proud black queer mama of two and truth teller. [00:19:00] You're realist source for life's ups and downs. Let's chat available in the U S and. One of the posts which really drew the ire of the Instagram community says, kicking off the new year in service of our community, leading this season's coat drive was an honor, especially because it provided my little ones a tangible example of helping others.
kevin: And the comments are, yeah, but none of this happened. , two things spring to mind. Yeah, of course. , people talk about like virtue signaling or whatever.
kevin: It's is this an example of that? , it's a bizarre take for an AI to hallucinate that it's giving back and doing good when nobody is benefiting from these, Fake coats getting non donated. So that's even more upsetting for those that are out there trying to get awareness for real things that they're actually doing in their community.
kevin: I also want to point out that don't want to put a percentage on it cause it'll feel hyperbolic, but I'm willing to bet a large portion of the internet that you and I are all interacting with right now. It's exactly this. It is fake accounts doing fake engagement and fake outreach and rage baiting or virtue signaling, whatever labels you [00:20:00] want to put on it.
kevin: This is the majority of the net now. It's just that they put a label on their cookie jar that they got caught with their hand and they put made by AI on it. And that one was obvious, but had they not put that there, I don't know if it would have been noticed or flagged.
Gavin Purcell: Metta has taken these down now, and this is, part of their kind of AI personality thing that launched, I think, originally with the Tom Brady and all the different kind of celebrity personalities. These were the separate ones. Metta has, is doubling down on the idea of AI content. They really believe that AI content that some might refer to as AI slop, because it's probably driving incentives. Insane page views on Meta right now. We know Shrimp Jesus did that we know lots of those Facebook.
kevin: Yeah, it's a bizarre one. I don't know what success is here. If you listen to the Reddit comments and threads, it'll be that this was just a test run and they had to get their feedback. And now that they know the next batch won't say made by AI probably, and there'll be tuned a little different, but yeah, I really don't, I don't know what the end game is here, but this is, it is concerning, but don't worry.
kevin: Cause I'm sure all the Facebook fast fact checkers will definitely hit it. [00:21:00] Don't worry, all those fact checkers that they're holding on to, They'll definitely flag all the content.
Gavin Purcell: That's exactly right. All right, we're moving on here. Minimax, one of our favorite video models has an update this week called their subject reference. And this allows you to track faces across multiple video generations. This is a big thing that you're going to see across a video this year, which is the idea of being able to keep character consistency in your videos. One of the guys in our discord, Kai who is an amazing AI experimenter created a very funny rap video with some of the biggest scientists of all time. So you can see, we'll put it up here, but like Einstein, Neil deGrasse Tyson, rapping in kind of street clothes and doing all sorts of stuff.
Gavin Purcell: It made me laugh quite a bit to watch. So it's a very fun kind of way to look at what you can do with minimax subject reference tool.
kevin: You can see where everybody is going to as one feature comes out one place and the next three models have it, but Minimax is just a fantastic video model, so it's nice to see it there.
Gavin Purcell: And we're going to talk about VO2 later. I got access to VO2, but one of the things that [00:22:00] Minimax lets you do, like I did here, is use real people's faces, which because, again, is a Chinese model, they don't have as many restrictions.
Gavin Purcell: VO2 does not, nor does Runway or other things like that. But there are many cool things they can do as well, too. \
kevin: Yeah, but again, they deep dive through your poop.
Gavin Purcell: No, Kevin, they cannot. Tell me, what candy
kevin: Were you ready to transition to this, Gavin? Meta, MetaGene1?
Gavin Purcell: that was going to be the transition, but here we are.
kevin: I'll elegantly and expertly insert myself into any conversation and transition it to fecal matters. In collaboration with researchers from USC, MetaGene1 has been open sourced. It is a state of the art 7 billiard parameter metagenomic foundation model.
kevin: They took samples of wastewater and Globally and analyzed it, looking for pathogens, looking for DNA sequences, RNA sequences that might lead to pathogens that would basically create a global pandemic or even the next strain of the flu, for example. And so they have a model that has pathogen [00:23:00] detection.
kevin: It has a learned representation of what these things look like in DNA anomaly detection and species classification. There's a bunch of stuff there that. Seems really rich in detail, but they open sourced it. So if you've got your own wastewater process facility, or if you're deep into like a urine fan, tick tock rabbit holes, I don't know if you've seen those guys,
Gavin Purcell: I, no, I don't want to hear anymore. You can keep that right to yourself. You keep that in
kevin: there's a big community that believes that by. Drinking your own urine or connecting it to fans so that it misses it out into
kevin: your room or where
Gavin Purcell: that's a thing.
Gavin Purcell: Oh my god, I'm
Gavin Purcell: so sorry people, you had to
Gavin Purcell: hear about that, because
kevin: And if I could travel with my pee fan, I probably wouldn't be so sick in Vegas right
Gavin Purcell: that what they call it a P Fan?
kevin: No, they call it something worse, but
Gavin Purcell: Urine Flow. It's the Urine Flow 2000.
kevin: That's an Eddie Vedder
Gavin Purcell: Flow. We're a little crazy here. Okay, so actually, this [00:24:00] is really cool.
Gavin Purcell: There's a bigger story here, which is why I think it's important to talk about this, which is the idea of DNA LLMs in some form or another, right? When you talk about genetic Uh, AI. And this is both scary and super crazy interesting in that DNA is essentially a series of codes, which is like programming in a lot of ways, just like you might program something by asking an LM to do it. There's many people out there who are trying to crack through this kind of AI plus science world, right? And I think what this is a step in the direction that we are going to get these kind of LLM like things. That are specifically designed around DNA and other things like that, which is sounds really frightening because you think that's the I love Dr.
Gavin Purcell: Moreau and we're going to get like shout out, Dr. Octagon, half shark, half alligator, half man. But like the idea of this is like ultimately you could do things like cure cancer because you could find the right DNA sort of sequence to be able to pull that off or
Gavin Purcell: you could help people eliminate genetic diseases, all sorts of [00:25:00] stuff like that.
Gavin Purcell: So this is just like
kevin: Yeah. What if you could find out at the age of 25 or something like that, that you have all of the markers for, or like an early onset dementia or something like
kevin: that, and you could plan your life accordingly with that info and, try to negate any of the effects.
Gavin Purcell: Or, ultimately, maybe have some sort of cocktail that's created that you drink that does some sort of genetic manipulation of you. Now, I just saw Brian Johnson, the guy, the, who's trying to live forever, the
kevin: Yeah. Don't die
kevin: guy. Yeah.
Gavin Purcell: yeah.
Gavin Purcell: The documentary is very fascinating. He's a kind of a weirdo, but like definitely somebody pushing the kind of limits of this stuff he'd in fact, just did like in that video, a genetic treatment to increase his muscle tone and supposedly gave him 7 percent more muscle mass, Kevin, that's what he said.
Gavin Purcell: So there you go. The university, those videos of the rats that got jacked,
kevin: Oh yeah. Yeah. I love those.
Gavin Purcell: Jacked rats. Yeah. They're pretty cool. we're gonna go around, just look at some of our favorite things on the internet.
Gavin Purcell: It's time for AI, see what you did there. [00:26:00] Alright, so Kevin, did you see this? Heygen, one of our favorite lip sync tools, showed what their video looks like up res'd with Sora. Do you wanna play this real quick, and we'll kinda listen to it, and then we can talk about it?
kevin: Living rooms are more than just spaces. They're reflections of who we are. Which is why they should feel both bright and cozy.
Gavin Purcell: Yes.
Gavin Purcell: Okay. That's enough. We don't need to hear any more of that. We know enough about living rooms. They should be bright and cozy.
kevin: Isn't that what this tech is all about? Is how to make your living room brighter and more cozy?
Gavin Purcell: if you're not watching this on video, what we just saw was a very good looking lip sync, very high quality of a digital avatar. And what they've basically done here is use Sora to up res what they do, which is what a lot of people have said.
Gavin Purcell: Sora is great for taking an output from something else than up resing it.
kevin: People were doing that with [00:27:00] Sora with other video models as well. They would get the result out, like from a runway generation and then put it in through Sora and it would refine it or add details and crisp up the textures and maybe change the lighting a little bit.
kevin: I use Haygen each and every day for one of the projects that I work on. So I'm very familiar with the lip syncing prowess and it's shortcomings that happened with certain models and Haygen Really, they're iterating each and every day. It seems there's every month I get like a new email about some crazy new tech that they have.
kevin: But this one is, this one's really good. This one seems to be really good. I'd like to see it in a dynamic scene. Right now, all of the examples, at least the ones that I've seen have the avatars in front of a basic sort of gradient backdrop So I want to see how that gets affected.
kevin: But one of the ones that's interesting to me was watching the this sort of neck and collar muscles, musculature change around one of the presenters and I have to watch it closer to make sure that it's. It looks natural, right? Yeah. You want to make sure that it's matching up and it's not randomly flexing at weird times, but yeah, exactly.
kevin: Like you're trying to breathe on Mars. [00:28:00] Yeah. But that is really good. Cause a lot of the apps that we use and this guy have like Hedra even DID in the past. There's a bunch of apps that really they focus on usually the mouth and the lips. Sometimes you get some of the eyebrow in there, but usually that's it.
kevin: And so here to have. What looks like natural response from the presenters as they're moving. I don't know if that's. Sora is just enhancing what is there from the normal thing, or if it's adding those details, but it looks really good
Gavin Purcell: another cool thing I saw was this very funny video. Kevin is probably using many Max's character totally talked about for us called GameStop of Thrones. This is just a dumb, fun thing that somebody did out there where they basically put all the Game of Thrones characters into a GameStop as employees. GameStop has a very special place in my heart because I grew up going to either that or egghead.
kevin: Yeah, this is great. Just an example of what you can do when you can take take a source character and say, I want this face in this scene, go make it happen. And look, I know we struck it out because it's not AI, but I do, we do have to shout it out.
kevin: It's so good. Did you beat it?
Gavin Purcell: I didn't beat it, but I think it I [00:29:00] might have
Gavin Purcell: some AI involved.
Gavin Purcell: So go
kevin: it. Get on it. stimulation clicker by Neil Agarwal. If you go to Neil dot fun, A bunch of their games and things that they've made, but, and you should try them all cause they're all great.
kevin: But stimulation clicker is like an old cookie clicker game where the premises you start by manually clicking a single thing to earn points or credits or cookies or whatever. And then you use those points, credits, and cookies to buy new things that help you generate more clicks. Faster, better, et cetera.
kevin: This is like the internet dank Lord meme version of a cookie clicker. It has DVD screens flying around. It's got the Duolingo owl. It's got hydraulic press videos.
kevin: There's lo fi beats. Yeah. There's a subway runners. It's like everything, even, um, Ludwig, who's a popular streamer, a friend of mine.
kevin: He pops up as like doing commentary on your. Stimulation. Overload. And it's really great. It can be beat. I it's worth playing through. You
kevin: can unlock some
kevin: really
Gavin Purcell: this while you were sick even that's amazing. It's the perfect kind of sick thing to do.
kevin: It was a great I'm sick [00:30:00] in bed thing to do,
Gavin Purcell: okay. Our next thing, Kevin is a photo, a series of photos that just like I saw, and it scared the absolute
kevin: I thought these were polls from the Brian Johnson documentary. I thought this was Brian from Don't Die prepping his favorite meals in the kitchen.
Gavin Purcell: it looks like that. I also thought it was fake. I thought it was like some sort of AI footage, but no. These are pictures from a company named Clone Robotics. And we've talked about this company on the show before. I think it's like this very like humanoid looking almost like a musculature put on and human and robots, but these particular pictures, which they just titled torso two in the kitchen, which feels to me like a sequel to torso one was just this scary movie and torso two in the kitchen is Oh my God, he's got. He's got kitchen knives. He's gonna take you on, but this if you look at these photos, it almost looks like a Italian uncle who's been stripped of his skin and put a metal face on and he's like cooking stuff in the kitchen, but it, even like the creepiness of like the finger on the [00:31:00] knife
Gavin Purcell: I'm like, Y'all, this is the scariest version of a humanoid robot that could ever exist. So to all humanoid robot makers, let's make them a little more like WALL E and a little less like the bodies exhibit, please. If we can.
kevin: I think it's cool. It's got like actuated muscles and that's the way that this one works. It's not like it, the movements tend to be a little more lifelike and natural, but this is, I'm sure they simulated this scene in Cosmos and and put it into the old robot. It does look like Uncle Vito covered himself in Crisco and he's angry at Christmas and he doesn't want anybody to talk to him.
kevin: So he put on his
Gavin Purcell: chop, chop, chop! Yes, I tried to make my special sauce and now nobody wants it. What's wrong with me? That's Uncle Vito. Okay we gotta get to what we do with AI. And Kevin you're at CES,
Gavin Purcell: so do you want to chat through some of the stuff that you've seen there?
Gavin Purcell: Because it really you're there for work, but also
Gavin Purcell: you've been around and seen a few things.
kevin: So I'm here doing work with a Telly. They're a dual screen, shout out Telly, dual screen, smart television. They would probably kill me if I distilled it that basically right now, but [00:32:00] that's what I got in me. And we, we're using AI to do everything from content delivery to some really cool features that I don't want to break here just yet, cause I don't know if I'm allowed to, but putting AI in the things
kevin: there are robotic vacuums, putting AI into their stuff. There are smart glasses putting it like everybody is infusing things with AI. I'm sure with some products, it is a gimmick. It is like a, Oh, let's just slap a on it. Let's rename exactly what we're going to do, but say it's AI.
kevin: So it feels like it's bigger and better, but the promise is becoming very real. It's very much becoming reality. Being able to click and drag intelligence into your software suite or into your hardware device is very real and it is very powerful and it's not foolproof and it's certainly not as simple as a click and a drag, but it is very real.
kevin: And two things that I just wanted to shout out one that got a lot of attention, the robotic vacuum, the Robo rocks, Soros Z 70.
Gavin Purcell: With that little arm that pops out? Yeah.
kevin: and it's not the, it's not the only one here that has a robotic grabber [00:33:00] arm, but you
kevin: watch this
Gavin Purcell: I ask a question about that? Can I,
Gavin Purcell: ask a question about
Gavin Purcell: this? Because my problem with my Roomba, which we've now decommissioned, is
kevin: Oh
Gavin Purcell: these videos, but like, it ran over one time dog poop in our
kevin: Yep. Yep.
Gavin Purcell: my big question is, can that robot arm recognize dog poop? Because that, it took
kevin: It actually, and it's worse that they don't know how to solve it. It juggles it. If it detects it, it picks it up and it's
Gavin Purcell: Ooh
kevin: Yeah, I don't know what the training data was on it, but it just, it will make even more of a mess. So they're still working on that. But for socks, not a problem. , a little arm comes out and it goes and it picks up the sock and it goes over and drops in a little bin. I'm like it's cute, but it is an example of like advanced robotics. In the home now, right? This thing is
Gavin Purcell: that does solve a
kevin: manipulation, and it's solving a problem. there was another robot vacuum that could even go up small shallow stairs. The wheels on it would press down and lift it up and it could drive up over them. So it's ah, just robotics is getting very crazy and wild and disrupted.
kevin: And then I wanted to shout out the holiday smart glasses. [00:34:00] I haven't used
Gavin Purcell: Oh, yeah.
kevin: but they're, they look sneaky, good. And we'll see when reviewers get their hands on them or their faces on them, but it is a set of glasses that has a completely transparent screen built into it.
kevin: Real time voice recognition, translation, note taking and summaries. One of the examples that they use is a, someone in a meeting who's checking out, they hit a button, it's automatically transcribing what everybody is saying. And then when a question is asked, it automatically suggests a possible answer that they could give. But are we ready to be driven around by the machines on this level?
kevin: I maybe some are gavin, you and I were at something fairly recently where someone walked up to us and they had an AI wearable on
kevin: and I instinctively recoiled and backed up. Yeah. because I'm like wait you literally walked into a conversation that other people were having.
kevin: Who knows if it was sensitive or secret or whatever, or someone's just sharing personal information. If you walk up and it's being recorded, I don't know that I am ready for the world where everybody has.
Gavin Purcell: You could be sharing your urine drinking habits, [00:35:00] and then
kevin: I think the world. needs to know about that. We actually need to broadcast that more.
Gavin Purcell: Okay. Fair enough. Yeah, I see. Yes. Very cool. I
kevin: The Piss Mist community needs it, Gavin.
Gavin Purcell: Fine. Fine. I got access into VO two, which not a lot of people have access to yet. Shout out to Logan Kilpatrick at Google who helped us get in there. And Kevin, I have to say I am freaking impressed by VO2. So if you're not familiar, VO2, we talked about on the show, is Google's new AI video model. And we never really saw VO1, right? VO1 was just behind the scenes. But VO2 is now out to a limited amount of people. And it's a little bit weird in that it's text to video and it's image to text to video. You can't upload images, which I think is one of their ways around IP generating things, so that you're only able to do image to video with images that you generate on their site, and Kevin, so my experience with this was pretty cool and I have to say, I It's very good at realistic video. The first couple, I wanted to see if [00:36:00] we could do really weird things, right? Cause one of the things that kind of sucks sometimes with Sora or other video generation things, they can't do strange generations. So I asked it to give me a realistic video of a Man or person in a bird suit eating a meal outside in a grassy place.
Gavin Purcell: And these are pretty good.
kevin: It's interesting because I think the background could pass for realistic, probably. But because of the character looks like a cartoon Pixar ish character, but the meal on the plate looks real, so it's this mashup of styles, but Overall, it looks like I'm watching a clip from a Pixar movie.
Gavin Purcell: These are all first generations. Like I, these are like one shot to this thing. I didn't try to do these multiple times. I try to do a hot dog city promo, I didn't, nothing really gets hot dog city that well, but this is a single shot hot dog city definitely looks like a Pixar movie again, like this
Gavin Purcell: feels like an animated movie.
kevin: The amount of choices
Gavin Purcell: of hot
kevin: to make, right. Cause I'm sure you, yeah. Cause I'm sure you said like a bunch of hot dogs walking down the street. It's okay what are their arms and legs? Are they human arms and legs [00:37:00] or are they meat? And how do they look like the, all of the choices that this thing had to make to generate this, and it still looks fairly coherent and believable, including the plucky little sidekick dog that looks like
Gavin Purcell: Yeah,
kevin: slash. It's got a second set of arms coming out of its body. That it's like waving and gesticulating with, but who cares? It's adorable. Yeah, I, yeah, that's, this is very interesting.
Gavin Purcell: And then I wanted to do a prompt that I did with Sora when Sora first came available to everybody, which was skateboarders on the moon, 90s style, basically. It's a pretty straightforward prompt. And one of the things that VO does much better than Sora is motion, right?
Gavin Purcell: So you can tell, at least in these skateboarding tricks. They're able to do kickflips and you can see stuff. It's a lot better than what Sora does, which is very funky motion.
Gavin Purcell: One thing I will say I went back again because all of these look like they're on the moon per se, but actually there's not moon gravity. I was hoping that it was going to give me like, like low gravity kickflips. And I even went back in and said like low gravity and it
kevin: I was gonna ask if you specified, yeah.
Gavin Purcell: because physics are tricky, [00:38:00] right? But these are much better physics than Sora had. And we know that VO is based on Google's kind of larger world simulation idea. In fact, they're hiring people to bring in to work on their world sim stuff.
kevin: just for my mild take of the day, we had the technology to fake the moon landing back in, what, the 60s? Why can't we do better now, Google? Like, why can't these skateboarders look like they're actually in low
Gavin Purcell: There's so much fun you can do with this, but Kevin has a
kevin: I don't even see our flat Earth in the background.
Gavin Purcell: But really, honestly, Kevin, the thing that it does best, I think, is realistic footage with things that would feel like you could see in the real world. They've clearly trained this on specific videos because you can see streamers late, like that looks like streamers.
Gavin Purcell: You can see people talking. I saw one video that looked like it almost was the Rick trained on Rick roll recently. . So I wanted to create a fake commercial for what I came up with. This idea was I was going to create a commercial for big Mayo, the mayonnaise lobby, people are not using enough Mayo out
kevin: No, this is we did Mayo's [00:39:00] ready for it's got milk moment.
Gavin Purcell: that's right. That's exactly what this is.
kevin: Oh,
Gavin Purcell: This
Gavin Purcell: is a 32nd spot. Called I haven't, you can help me name it at some point. Once you see it, you'll come out of this. I'll play this right now.
kevin: Mayonnaise. Perfect compliment to any sandwich. Stop spreading it. Shaken. Stirred so, thanks, partner for life's every need, spread and joy, one jar at a time. Oh, no
kevin: notes. That was beautiful.
Gavin Purcell: again, what's interesting about AI video is, and in that there were about 20 different generations, all of which I use the photorealistic tab, but some of the videos that I use that are in there, the physics aren't perfect, but in other
kevin: Oh no, at a cursory glance, this all looks fine. Just, and for those that are getting the audio only this thing begins with A totally natural Mayo [00:40:00] spread on some ham on a little sandwich. Then it gets to Mayo on eggs, then a dollop of Mayo on a stack of pancakes, and then it's bacon. It drops as like a Mayo float on a like a nice alcoholic beverage.
kevin: Then it goes for like window cleaning, skincare teeth brushing. And it ends with the matri Mayo, matri matrimonial Mayo. It's like a saying I do, it's a beautiful bride to be accepting a spoonful of mayonnaise, and then the beautiful shot of the big Mayo jar.
kevin: Uh, on the countertop. So it's, and it nails, I think it nails it.
kevin: You, you
Gavin Purcell: Yeah, and it's, and by the way, so one interesting, that shot of the bacon was so fascinating. I was in the middle of it. I was like, whoa, this is so cool that it looks this realistic. I showed it to both my wife and my daughter's can you believe this video? This person put
kevin: They put mayo
Gavin Purcell: on
Gavin Purcell: their bacon. They both were like, oh, that's so gross. I said, I just made that and they're like, what? They were shocked about the fact and this is where VO two is getting us right? Because. Again, there's all these people out there who are going to make the GameStop of Thrones, and they're going to make stuff that feels like ridiculous. This is an [00:41:00] ad, basically, and it's not hard to make this. I think I spent probably all in, four hours making this thing, between just generating the images, writing a script, and then putting the words into Eleven Labs and getting a track from Suno. But this is the world that we're entering.
Gavin Purcell: This is not hard to do, and it does look very real. So it's not out to everybody yet, but I think, when this does come out, it's going to be a really interesting thing. The one last thing I'll say about it is, Google has talked about the idea that they're trying to create this kind of larger suite of stuff that can go within one place.
Gavin Purcell: And we know that open eyes charging 200 for one pro, which gives you relatively unlimited access to Sora. I think there's a world there since rumors out there where Google is going to try to keep this price around 20 bucks and if for 20 bucks you get full access to VO two in some way that is going to be a very hard thing to turn down
kevin: that's an awesome video. I love that you got access. I can't wait to get my hands on it as well. That's really exciting. Also shout out, by the way, [00:42:00] completely unrelated, but Grok 3 is supposedly coming
kevin: in like the next week or two. And I don't. I don't like it, Gavin, but I have been using Grok a lot more now because they finally integrated it into the feed.
kevin: And it was something that you and I had discussed eons ago, which is if your whole competitive advantage is access to Twitter or the X fire hose and data and blah, blah, blah. Yeah. Contextualize things for me. And there was a post the other day, someone was talking about Oh, it was an image of a financial chart of some sort.
kevin: I don't exactly remember what it was, but the tweet just said this certainly isn't good and it had a ton of likes and whatever else. And I don't know what this is about. And I don't want to go through it. Cause all the replies were like, Oh my God, we're doomed. I was like, what is this? I clicked on the grok or the X AI analysis button.
kevin: And it very quickly. Analyze the image and said, the poster is saying that this is not good in referring to an image and then broke down what the image was and then gave me several with sources cited outside of Twitter inline thoughts on what it could be or why that analysis might [00:43:00] be
Gavin Purcell: Is this where you learned about pee spraying? Is that what this is all about?
kevin: Piss missed.
Gavin Purcell: All right, everybody. We had a great week. We'll see y'all next week. Thanks so much for joining us. Bye everyone