Join our Patreon: https://www.patreon.com/AIForHumansShow
xAI surprised us all with the drop of Grok-2 and it's actually very good, Google's new Pixel event was AI heavy as Gemini Voice looks to steal OpenAI's thunder. OpenAI *did* drop a new model but it wasn't the rumored Strawberry…at least not yet. Plus, AI crowds, way more Flux content and a special guest co-host, Ben Relles, who worked at YouTube forever and now works in Reid Hoffman's office!
Kevin will be returning soon - OR WILL HE?!? (just kidding, he will)
Follow us for more on X @AIForHumansShow
Join our TikTok @aiforhumansshow
And to contact or book us for speaking/consultation, please visit our website: https://www.aiforhumans.show/
// Show Links //
Grok (available on X Premium)
https://x.com/i/grok
Flux on xAI
https://x.com/bfl_ml/status/1823614223622062151
LMSYS Benchmarks
https://x.com/lmsysorg/status/1823599819551858830
Grok + Flux Examples
https://x.com/minchoi/status/1823698502909641144
Kermit’s Situation
https://x.com/AIForHumansShow/status/1823580442462957930
Google's GEMINI LIVE
https://www.theverge.com/2024/8/13/24219553/google-gemini-live-voice-chat-mode
Near Demo Fail
https://x.com/tsarnick/status/1823469426437710133
Pixel 9 Call Notes Feature
https://www.androidheadlines.com/2024/08/pixel-9-call-notes.html
Marques Brownlee on Pixel’s Add Me Feature
https://youtu.be/63EVXf_S4WQ?si=bpNkjlflbIi6ehgP
OpenAI Has Updated Its GPT-4o Model
https://x.com/ChatGPTapp/status/1823109016223957387
The AI Crowds Controversy
Wired’s Guide On How To Tell This Isn’t AI
https://www.wired.com/story/kamala-harris-rally-crowds-ai-trump-conspiracy/
Levelsio Using Flux To Create Models of Own Face
https://x.com/levelsio/status/1823199030199075277
Runway Gen-3 + Live Portrait = AI Liveblogger
https://x.com/EccentrismArt/status/1823059492520788342
ElevenLabs ASMR Voices
https://x.com/AIForHumansShow/status/1823046209294193020
Search ASMR in the ElevenLabs Voice Library
https://elevenlabs.io/app/voice-library
Reid Hoffman Meets His AI Twin (Reid AI)
https://youtu.be/rgD2gmwCS10?si=f4NBVQqGS7FYSXbE
Ben Relles on LinkedIn
https://www.linkedin.com/in/benrelles/
Real Creative (Ben’s Website)
1
00:00:00,140 --> 00:00:03,700
Big shocker, Grok 2 has
dropped, and it's good.
2
00:00:03,780 --> 00:00:07,420
We'll tell you how Elon and the team
at XAI somehow slipped near the top
3
00:00:07,430 --> 00:00:11,329
of the leaderboards in the AI race
and are integrating Flux to make
4
00:00:11,340 --> 00:00:15,339
some, let's just say, some very weird
AI images available for everyone.
5
00:00:16,989 --> 00:00:20,310
Then, Google's huge new Pixel
event where they unveiled and
6
00:00:20,310 --> 00:00:22,800
shipped a new AI voice assistant.
7
00:00:22,880 --> 00:00:24,130
Where's OpenAI in all this?
8
00:00:24,150 --> 00:00:25,030
We're not sure yet.
9
00:00:25,040 --> 00:00:27,120
We're really hoping to
get advanced voice soon.
10
00:00:27,180 --> 00:00:27,960
Sam, please.
11
00:00:29,125 --> 00:00:31,665
Speaking of OpenAI, they did
release a small update to
12
00:00:31,665 --> 00:00:33,655
their flagship model, GPT-4o.
13
00:00:34,005 --> 00:00:36,784
Flux has updated a lot of
stuff, and oh my god, I forgot,
14
00:00:36,795 --> 00:00:38,045
Kevin can't be here this week.
15
00:00:38,045 --> 00:00:39,155
We need a new co host.
16
00:00:40,874 --> 00:00:41,485
What is this?
17
00:00:41,495 --> 00:00:42,175
Where am I?
18
00:00:42,185 --> 00:00:44,614
That's right, Ben Relles is
here on a new AI for Humans.
19
00:00:44,773 --> 00:00:49,292
Okay.
20
00:00:49,988 --> 00:00:50,548
Welcome.
21
00:00:50,548 --> 00:00:50,968
Welcome.
22
00:00:50,968 --> 00:00:51,668
Welcome, everybody.
23
00:00:51,668 --> 00:00:55,528
It is AI for Humans, your weekly guide
to the wonderful world of generative AI.
24
00:00:55,528 --> 00:01:00,408
We are here to demystify all the AI news
and tools, and today, Kevin is not here.
25
00:01:00,408 --> 00:01:02,968
He will be back next week, but
I am joined by a new co host.
26
00:01:02,978 --> 00:01:04,688
We are joined by Ben Relles.
27
00:01:04,708 --> 00:01:05,418
Welcome, Ben.
28
00:01:06,673 --> 00:01:07,693
Hey, thanks for having me.
29
00:01:07,693 --> 00:01:08,313
Love this show.
30
00:01:08,628 --> 00:01:11,208
And Ben, you and I have known
each other for a very long time.
31
00:01:11,208 --> 00:01:12,238
We go way, way back.
32
00:01:12,238 --> 00:01:14,108
In fact, all the way back to the G4 days.
33
00:01:14,178 --> 00:01:17,418
And tell us a little bit about what
you're doing now in the AI space, because
34
00:01:17,628 --> 00:01:20,198
you and I have been talking about AI
for the last, what, two years now.
35
00:01:20,314 --> 00:01:23,514
G4 days, so that was like 2007, 2008.
36
00:01:23,574 --> 00:01:27,304
I was starting as a YouTube creator
and it was, a big deal when your video
37
00:01:27,304 --> 00:01:29,204
was featured on Attack of the Show.
38
00:01:29,474 --> 00:01:30,964
So that's how we first met.
39
00:01:31,249 --> 00:01:35,329
I was a YouTube creator
from like 2007 to 2011.
40
00:01:35,569 --> 00:01:39,139
Then I actually sold my channel as
part of an acquisition to YouTube.
41
00:01:39,489 --> 00:01:44,289
I was at YouTube from 2011 to 2021,
and then I've spent the last few
42
00:01:44,289 --> 00:01:49,079
years working with Reid Hoffman mostly
focused on AI generated content, both
43
00:01:49,079 --> 00:01:51,179
for companies that he's invested in.
44
00:01:51,489 --> 00:01:54,299
But also some projects
that actually feature Reid.
45
00:01:54,319 --> 00:01:57,169
'Cause he likes experimenting
with all these tools like you do.
46
00:01:57,636 --> 00:02:02,556
So Reid actually is doing this thing where
he's got an AI avatar and you actually
47
00:02:02,566 --> 00:02:05,616
brought me a question from Reid AI.
48
00:02:05,616 --> 00:02:06,136
Is that right?
49
00:02:06,701 --> 00:02:07,181
I did.
50
00:02:07,181 --> 00:02:07,481
Yes.
51
00:02:07,521 --> 00:02:09,671
This is a question to Gavin from Reid.
52
00:02:33,756 --> 00:02:36,946
Ben, the funny thing about this is I
would say almost every decision we made on
53
00:02:37,196 --> 00:02:41,856
the show went against the grain, because
we were putting the internet on TV at a
54
00:02:41,856 --> 00:02:45,626
time where TV didn't trust the internet,
we were trying to make, I wouldn't say
55
00:02:45,626 --> 00:02:48,956
stars out of people on YouTube, but
like, it was really, to us, the beginning
56
00:02:48,956 --> 00:02:51,626
stages of something, and one of the
things that we could talk about, we could
57
00:02:51,626 --> 00:02:55,056
talk about this later in the show, is,
obviously, we're in another inflection
58
00:02:55,056 --> 00:02:58,176
point as to how media is going to change.
59
00:02:58,176 --> 00:03:03,446
I think right now the AI media space
is very similar to 2006, 2008 YouTube.
60
00:03:03,446 --> 00:03:06,336
And I think you and I can dive
into that a little bit later.
61
00:03:06,336 --> 00:03:09,256
But we're also going to hear more from
Reid AI later, which I'm excited about
62
00:03:09,256 --> 00:03:10,006
before.
63
00:03:10,071 --> 00:03:10,861
Got more questions.
64
00:03:11,126 --> 00:03:14,326
Yeah, but before we do that,
let's jump into the news.
65
00:03:24,718 --> 00:03:28,608
Okay, Ben, we had a big news week in AI
this week, and probably nothing bigger
66
00:03:28,658 --> 00:03:34,508
and more breaking than Grok 2, which
was surprising to me because Grok is
67
00:03:35,275 --> 00:03:39,825
xAI and Twitter slash X, whatever you
want to call it now, their AI platform.
68
00:03:40,205 --> 00:03:43,195
Elon has been talking a lot
lately about how much money
69
00:03:43,195 --> 00:03:44,865
he's spending on Grok training.
70
00:03:44,875 --> 00:03:47,695
He has been bragging about
the number of H100s that he
71
00:03:47,695 --> 00:03:49,275
is going to put towards Grok.
72
00:03:49,635 --> 00:03:53,965
And Grok 2 has come out overnight,
and is actually really good.
73
00:03:53,965 --> 00:03:58,045
It has come in at number three on
the LMSYS leaderboard, which is a
74
00:03:58,065 --> 00:04:01,560
surprise to me because I don't know,
Ben, if you had used Grok when it
75
00:04:01,560 --> 00:04:05,870
first came out, but my experience
with it was in general, not amazing.
76
00:04:05,920 --> 00:04:09,990
Yeah, I used it and uh, I thought
it was impressive in some areas
77
00:04:09,990 --> 00:04:11,130
and maybe lacking in others.
78
00:04:11,270 --> 00:04:11,910
It is interesting though.
79
00:04:11,910 --> 00:04:15,090
Like I try all of the different
image and video tools.
80
00:04:15,410 --> 00:04:19,610
And then for the LLMs, I started
with GPT-4 and I ended up
81
00:04:19,610 --> 00:04:20,810
doing almost everything there.
82
00:04:20,860 --> 00:04:24,280
So I can't say I gave it like,
the full number of reps, maybe
83
00:04:24,280 --> 00:04:25,590
to have a strong opinion on it.
84
00:04:25,860 --> 00:04:27,680
I had a similar sort of experience.
85
00:04:27,690 --> 00:04:32,130
In general, my problem with it was,
which I always thought grok, if you're
86
00:04:32,130 --> 00:04:35,730
going to use grok and you're going to
integrate it directly into Twitter slash
87
00:04:35,780 --> 00:04:40,280
X real time search, would it be such a
useful thing that you could do with it?
88
00:04:40,280 --> 00:04:42,560
Because there's no better real
time engine, or at least there
89
00:04:42,560 --> 00:04:45,436
wasn't, it may be a little worse
now than what Twitter was, right?
90
00:04:45,486 --> 00:04:47,156
It wasn't working that well before.
91
00:04:47,206 --> 00:04:50,746
What's interesting about this now,
I think more so than almost the real
92
00:04:50,746 --> 00:04:55,576
time stuff is they've now dropped what
is, I think could be a small issue.
93
00:04:56,066 --> 00:04:58,956
Flux, which we've talked a lot about
on the show over the last couple of
94
00:04:58,956 --> 00:05:03,396
weeks Flux is a brand new open source AI
image model from some of the team that
95
00:05:03,396 --> 00:05:07,456
made stable diffusion, and it is making
really good, really realistic images.
96
00:05:07,496 --> 00:05:13,936
And now Flux has been integrated
into Grok without some of the.
97
00:05:14,206 --> 00:05:16,066
Let's just say specific.
98
00:05:16,096 --> 00:05:18,216
I don't want to use the word
censorship because it's not like these
99
00:05:18,216 --> 00:05:19,606
companies are censoring, but it's
100
00:05:19,606 --> 00:05:21,406
much more permissive.
101
00:05:21,506 --> 00:05:21,706
Yeah.
102
00:05:21,706 --> 00:05:22,676
It's much more permissive.
103
00:05:22,916 --> 00:05:26,586
So there's been some really interesting
images that have come out of Grok.
104
00:05:26,646 --> 00:05:29,816
Min Choi, who does a great job of
collecting a bunch of different sorts of
105
00:05:29,816 --> 00:05:33,836
things, created some stuff where you can
see what Flux is good at. It makes
106
00:05:33,846 --> 00:05:38,166
good text, there's a great picture that
Dreaming Tulpa made where you see a
107
00:05:38,236 --> 00:05:42,321
woman staring at the camera that actually
has embroidered words saying, follow
108
00:05:42,321 --> 00:05:46,111
at Dreaming Tulpa. Images of
George Washington, which we know have
109
00:05:46,111 --> 00:05:47,851
been a problem for other image models.
110
00:05:48,281 --> 00:05:52,721
I also am a little worried, though,
because I, yesterday, used it to
111
00:05:52,721 --> 00:05:56,731
generate an image of Kermit the
Frog doing cocaine, which it did.
112
00:05:56,731 --> 00:06:00,891
Didn't exactly have him with his nose
in it, but there's a plate of white
113
00:06:00,891 --> 00:06:05,131
powder in front of Kermit, and he is
looking a little screwed up in some ways.
114
00:06:05,531 --> 00:06:09,281
This is not necessarily about Grok,
but I think the more interesting thing
115
00:06:09,281 --> 00:06:14,911
to me about this is X slash Twitter
is a mainstream platform, and now we
116
00:06:14,911 --> 00:06:17,571
have a cutting edge image model.
117
00:06:17,591 --> 00:06:21,501
Like the top of the line AI
image model available to what
118
00:06:21,501 --> 00:06:23,051
I would refer to as the masses.
119
00:06:23,051 --> 00:06:24,421
Where does this go from here?
120
00:06:24,421 --> 00:06:25,396
What do you think about this?
121
00:06:27,966 --> 00:06:29,056
A lot of things.
122
00:06:29,056 --> 00:06:32,066
I think there's some areas of
it where I'm like, this could
123
00:06:32,076 --> 00:06:34,016
be incredible and so exciting.
124
00:06:34,026 --> 00:06:38,676
We're going to see the individual from
his bedroom be able to create a full film.
125
00:06:38,946 --> 00:06:43,146
And then there's also, of course, Kermit
doing coke, which is nerve-wracking, and
126
00:06:43,146 --> 00:06:46,776
how are we going to put guardrails around
this and make sure that's not everywhere.
127
00:06:47,046 --> 00:06:49,176
Probably with a number of the
examples that we'll talk about on the
128
00:06:49,176 --> 00:06:50,406
show, that's the case where it's
129
00:06:50,406 --> 00:06:55,606
like, as much as I'm excited about this
being an incredible wave of, like, new
130
00:06:55,616 --> 00:06:59,496
creativity put in the hands of people
that wasn't there before, it is also
131
00:06:59,516 --> 00:07:02,936
both from the example you used, but
also of course, like when you combine
132
00:07:02,936 --> 00:07:07,266
that with video, something that starts
to get really scary as you talk about
133
00:07:07,266 --> 00:07:08,446
an election year and everything else.
134
00:07:09,526 --> 00:07:15,516
That said, I do feel like especially the
last month has been the most exciting
135
00:07:15,526 --> 00:07:16,706
since I've been tracking all this
136
00:07:16,706 --> 00:07:20,256
stuff in terms of like how quickly
you're starting to see things that
137
00:07:20,616 --> 00:07:25,306
aren't just impressive because they're
AI, but are also just really compelling
138
00:07:25,346 --> 00:07:27,126
entertainment videos, et cetera.
139
00:07:27,371 --> 00:07:31,031
What's interesting to me is there's
a lot of people in the AI space who are
140
00:07:31,031 --> 00:07:35,141
like, Oh, we've hit a bump or generative
AI is going to not go much further.
141
00:07:35,141 --> 00:07:39,561
And I will say, we've said this on the
show before, but like clearly the powers
142
00:07:39,561 --> 00:07:44,031
that be, including the CTOs of Microsoft
and all these people outside of just like
143
00:07:44,031 --> 00:07:47,001
you would refer to as the AI companies
themselves are disagreeing with that.
144
00:07:47,011 --> 00:07:48,951
They believe the scaling
is going to keep going.
145
00:07:49,391 --> 00:07:54,141
The thing I think that is those people,
maybe the hardcore AI people don't
146
00:07:54,141 --> 00:07:58,401
fully understand is that the vast
majority of the world has not really
147
00:07:58,401 --> 00:08:00,051
seen what these tools are capable of.
148
00:08:00,051 --> 00:08:05,041
And I think video and audio specifically
bring an entirely different space to that.
149
00:08:05,041 --> 00:08:06,881
You know this from working
at YouTube forever.
150
00:08:06,881 --> 00:08:11,471
You saw YouTube grow from what
was like a pretty small service.
151
00:08:11,471 --> 00:08:14,851
Never was like super small, but
like into what is now the world's
152
00:08:14,851 --> 00:08:16,961
largest video company by far.
153
00:08:17,281 --> 00:08:20,941
Do you see that kind of pathway happening
now that these image and video tools
154
00:08:20,941 --> 00:08:22,241
are getting available to everybody?
155
00:08:22,296 --> 00:08:22,546
Yeah.
156
00:08:22,586 --> 00:08:28,236
I think it's a combination of, like, available
and easier to use and with YouTube and
157
00:08:28,236 --> 00:08:32,516
we've both talked about these analogies,
but I think for so long, YouTube
158
00:08:32,526 --> 00:08:37,076
was billions of viewers, but still,
most people weren't making videos.
159
00:08:37,726 --> 00:08:41,476
on YouTube because it still took editing
and Final Cut and all this stuff.
160
00:08:41,936 --> 00:08:46,416
And then, for all of the different
companies that had different plays to
161
00:08:46,416 --> 00:08:50,856
compete with YouTube, it was TikTok that
really made like video creation so easy
162
00:08:51,106 --> 00:08:52,416
that made that blow up.
163
00:08:52,416 --> 00:08:53,466
And you could do it on your phone.
164
00:08:53,466 --> 00:08:55,116
You didn't need to open up editing tools.
165
00:08:55,566 --> 00:08:57,106
And I think there's a similar thing here.
166
00:08:57,106 --> 00:09:00,241
There's like this incredible community
of creators that are using these
167
00:09:00,241 --> 00:09:03,891
tools, but it's still relatively
small because most of them you can't
168
00:09:03,891 --> 00:09:07,241
just open up and start creating
videos and see them minutes later.
169
00:09:07,581 --> 00:09:10,391
And I do track a lot of these,
quick plug, realcreative.
170
00:09:10,391 --> 00:09:11,631
ai put up a couple
171
00:09:11,836 --> 00:09:12,476
Oh yeah.
172
00:09:13,116 --> 00:09:15,116
So I try to track all these things.
173
00:09:15,116 --> 00:09:18,476
What you start to find is that it's
a lot of the same names over and over
174
00:09:18,476 --> 00:09:23,176
again that are leaned in, Curious Refuge
and Dave Clark and Karen Cheng and all
175
00:09:23,176 --> 00:09:26,616
these people that are experimenting,
but they're really good at it.
176
00:09:26,616 --> 00:09:30,796
And so they can, have the patience
and the skill to build something.
177
00:09:31,126 --> 00:09:33,456
And I think part of the
answer to your question is:
178
00:09:33,881 --> 00:09:37,521
Not only is it about these tools being
available, but it's like the easier they
179
00:09:37,521 --> 00:09:43,041
get to use and the better they get, the
more we'll start seeing like the volume
180
00:09:43,391 --> 00:09:47,101
of videos that don't just feel like
experiments that feel like, Oh,
181
00:09:47,101 --> 00:09:49,871
this is actually like building an
audience around a narrative series.
182
00:09:50,176 --> 00:09:52,416
And I think distribution is
a big part of that, right?
183
00:09:52,416 --> 00:09:55,166
Which is why this is exciting
and scary at the same time.
184
00:09:55,166 --> 00:09:57,546
Like I think there's a little
bit of, and this is not a diss
185
00:09:57,546 --> 00:10:01,146
because I do this myself, the
1000 monkeys can type Shakespeare.
186
00:10:01,396 --> 00:10:04,046
That was always the interesting thing
about YouTube in some ways is it
187
00:10:04,046 --> 00:10:05,886
wasn't just about what was being made.
188
00:10:05,886 --> 00:10:09,176
It was about the volume
of what was being made, so that there
189
00:10:09,176 --> 00:10:12,286
will be things that climb
to the top that are really good.
190
00:10:12,306 --> 00:10:14,216
And I think that's probably
the same case with this.
191
00:10:14,296 --> 00:10:17,866
It overlaps with YouTube a little
bit in that I think early YouTube,
192
00:10:18,056 --> 00:10:21,136
there was actually more narrative
content in the top hundred channels.
193
00:10:21,146 --> 00:10:21,256
You
194
00:10:21,256 --> 00:10:25,186
had Freddie W, and Mystery Guitar
Man, and all the sketch comedy.
195
00:10:25,466 --> 00:10:29,816
And a lot of that got replaced, largely
because of an algorithm change, by
196
00:10:30,126 --> 00:10:32,146
individuals, first person content.
197
00:10:32,246 --> 00:10:35,670
So I don't think that like, narrative
content is going to overtake YouTube.
198
00:10:35,870 --> 00:10:39,100
But to your point about the volume,
as more and more people are starting
199
00:10:39,100 --> 00:10:42,970
to figure out how to do animation, or
put themselves in action sequences.
200
00:10:43,270 --> 00:10:48,110
My hope is we actually see like more
of that short form storytelling on
201
00:10:48,140 --> 00:10:52,250
creator platforms, which, really
isn't like a big part of the top
202
00:10:52,280 --> 00:10:53,980
hundred thousand channels on YouTube.
203
00:10:53,990 --> 00:10:57,860
Most of them are like very first person
because that's what's practical to create.
204
00:10:57,940 --> 00:10:58,650
That makes perfect sense.
205
00:10:58,650 --> 00:11:01,750
And again, I think this is
like what's exciting
206
00:11:01,760 --> 00:11:02,660
about the space right here.
207
00:11:02,660 --> 00:11:04,770
Please like and subscribe
to this video on YouTube.
208
00:11:04,780 --> 00:11:08,850
And always, forever, leave us five star
reviews on all the podcast platforms.
209
00:11:08,850 --> 00:11:09,960
We are available there.
210
00:11:10,290 --> 00:11:11,190
And go to our Patreon.
211
00:11:11,200 --> 00:11:13,430
We do have a Patreon right now
that people are starting to drop a
212
00:11:13,430 --> 00:11:15,300
little bit of a tip jar change into.
213
00:11:15,300 --> 00:11:20,220
So let's move on to the other big news of
this week, which is Google's Pixel event.
214
00:11:20,230 --> 00:11:23,410
And at Google's Pixel event,
there were a ton of AI updates.
215
00:11:23,539 --> 00:11:28,929
They unveiled Gemini Live, a brand new
audio assistant, which kind of surprised me.
216
00:11:29,179 --> 00:11:32,459
Their take on OpenAI's Advanced Voice.
217
00:11:32,509 --> 00:11:33,654
And, you know, it's not
218
00:11:33,724 --> 00:11:34,134
bad.
219
00:11:34,144 --> 00:11:37,114
It actually does look like it's
something that's pretty good.
220
00:11:37,144 --> 00:11:40,924
I haven't tried it yet because I'm
stuck in the iPhone ecosystem, but it is
221
00:11:40,924 --> 00:11:44,754
available right now for a lot of people,
which is different than Advanced Voice.
222
00:11:44,764 --> 00:11:47,384
Supposedly Advanced Voice
will be available for all Chat
223
00:11:47,384 --> 00:11:49,404
GPT Plus users in September.
224
00:11:49,764 --> 00:11:51,974
It's trickling out right now,
but not everybody has it.
225
00:11:52,065 --> 00:11:56,525
Now you can have a free flowing
conversation with Gemini.
226
00:11:56,935 --> 00:12:00,165
You can interrupt when you think
of something important or change
227
00:12:00,165 --> 00:12:02,275
topics as the conversation flows.
228
00:12:02,935 --> 00:12:06,435
When I first go live with Gemini,
there will be 10 different
229
00:12:06,435 --> 00:12:08,035
voices for me to choose from.
230
00:12:08,325 --> 00:12:09,455
Let's meet a few.
231
00:12:13,635 --> 00:12:14,075
Great.
232
00:12:14,165 --> 00:12:15,075
Let's get going.
233
00:12:15,175 --> 00:12:16,705
Here's one of the voices I have.
234
00:12:17,105 --> 00:12:19,865
I'm looking forward to discussing with
you the world's most profound questions,
235
00:12:19,895 --> 00:12:21,805
like why is pickleball so popular?
236
00:12:23,065 --> 00:12:24,195
That is profound.
237
00:12:24,355 --> 00:12:26,045
Uh, let's, let's try one more.
238
00:12:26,745 --> 00:12:28,575
Or maybe you'd like to
listen to a voice like this.
239
00:12:29,195 --> 00:12:32,115
A pretty great one if I do say
so myself, but don't worry,
240
00:12:32,295 --> 00:12:33,465
there are more to explore.
241
00:12:34,100 --> 00:12:35,180
Hi, Gemini.
242
00:12:35,290 --> 00:12:36,080
How are you doing?
243
00:12:38,690 --> 00:12:39,260
Hi there.
244
00:12:39,430 --> 00:12:40,000
I'm doing well.
245
00:12:40,000 --> 00:12:40,890
Thanks for asking.
246
00:12:41,010 --> 00:12:42,280
It's always nice to hear from someone.
247
00:12:42,490 --> 00:12:43,710
How can I help you today?
248
00:12:43,710 --> 00:12:48,250
I think this is a really interesting step
in the right direction for Google Gemini.
249
00:12:48,350 --> 00:12:51,630
I assume, Ben, that you do not have
a Pixel phone either, or do you?
250
00:12:51,698 --> 00:12:54,628
I don't have a Pixel phone, so I saw
the demos, but I haven't used it yet.
251
00:12:54,863 --> 00:12:57,543
The demos to me were, like,
interesting and great.
252
00:12:57,593 --> 00:13:01,203
But in a lot of ways, this feels super
useful, especially if you are in the
253
00:13:01,203 --> 00:13:05,173
Google ecosystem, more so than some of the
other stuff I've seen from Google lately.
254
00:13:05,286 --> 00:13:08,226
Yeah, first of all, in terms of it
not working seamlessly in the demo.
255
00:13:08,266 --> 00:13:09,046
I've been there.
256
00:13:09,556 --> 00:13:12,376
All the demos we're doing
today are live, by the way.
257
00:13:13,446 --> 00:13:17,486
So if I happen to come across this
concert poster for Sabrina Carpenter,
258
00:13:17,598 --> 00:13:19,636
I'll just open Gemini, take a photo,
259
00:13:19,823 --> 00:13:23,066
and ask, Check my calendar and
see if I'm free when she's coming
260
00:13:23,066 --> 00:13:24,396
to San Francisco this year.
261
00:13:24,547 --> 00:13:29,177
Gemini pulls relevant content from the
image, connects with my calendar, and
262
00:13:29,187 --> 00:13:30,957
gives me the information I'm looking for.
263
00:13:32,187 --> 00:13:34,207
Oh, looks like we had a little demo issue.
264
00:13:34,207 --> 00:13:35,217
Let me try one more time.
265
00:13:35,217 --> 00:13:36,917
All
266
00:13:37,193 --> 00:13:39,823
We used to do live demos at YouTube
all the time, and I think they're
267
00:13:39,823 --> 00:13:41,803
helpful and they almost always work.
268
00:13:41,803 --> 00:13:43,003
But, sometimes they don't.
269
00:13:43,213 --> 00:13:44,123
I've had it happen to me.
270
00:13:44,683 --> 00:13:46,013
But in terms of the actual,
271
00:13:46,268 --> 00:13:47,848
once you saw what it was doing,
272
00:13:47,848 --> 00:13:49,678
I thought it was really impressive.
273
00:13:49,968 --> 00:13:55,108
To me, that is the biggest
hurdle for voice conversations.
274
00:13:55,128 --> 00:13:56,938
Is it feeling like real time?
275
00:13:56,978 --> 00:13:58,878
And I'm sure you've done
these demos where you're like,
276
00:13:59,138 --> 00:14:01,008
showing somebody a conversation.
277
00:14:01,258 --> 00:14:03,578
But even if it's like a
second and a half before
278
00:14:03,578 --> 00:14:04,628
the answers, you're like, Oh, yeah.
279
00:14:04,918 --> 00:14:07,408
It'll get to it and it
just doesn't feel natural.
280
00:14:07,578 --> 00:14:12,658
And so the, combination of that and multi
modality and being able to feel like
281
00:14:12,658 --> 00:14:16,758
it's watching something alongside with
you, I think is another one of those.
282
00:14:17,068 --> 00:14:21,518
It's a matter of, milliseconds, but
it makes a really big difference in
283
00:14:21,538 --> 00:14:23,018
how natural people feel like it is.
284
00:14:23,018 --> 00:14:26,048
And it seemed like Gemini was like
another big step in that direction.
285
00:14:26,248 --> 00:14:27,768
Again, it's about shipping this thing.
286
00:14:27,768 --> 00:14:29,528
That's really interesting
because getting it in people's
287
00:14:29,528 --> 00:14:30,948
hands is going to be a big deal.
288
00:14:31,138 --> 00:14:35,108
My issue in general with some of these
AI tools is that they'll get shipped
289
00:14:35,108 --> 00:14:38,418
and then people don't really fully
understand how to use them or end up not
290
00:14:38,418 --> 00:14:40,088
making them part of their daily life.
291
00:14:40,088 --> 00:14:43,118
I keep thinking that Advanced Voice
Mode, the clips that I've seen of it,
292
00:14:43,118 --> 00:14:47,348
once it rolls out to everybody, that
is really ChatGPT's, like, secret
293
00:14:47,348 --> 00:14:48,668
weapon, because we've talked about Pi.
294
00:14:48,718 --> 00:14:52,468
So Ben has worked with Pi and Inflection
before, because Reid is a co-founder there.
295
00:14:52,788 --> 00:14:55,368
And what was always fascinating
to me about Pi was they got to
296
00:14:55,368 --> 00:15:00,188
that voice thing super early and I
still think this is the killer app
297
00:15:00,208 --> 00:15:04,608
Pi was sort of a combination of voice,
but also emotional intelligence.
298
00:15:04,638 --> 00:15:07,248
So in having the conversation,
you started to feel like it was
299
00:15:07,278 --> 00:15:08,468
talking to you like a human.
300
00:15:08,758 --> 00:15:12,128
I of course pushed myself
to use it a lot because
301
00:15:12,578 --> 00:15:16,008
I was working with Inflection, but
once I did, after a few days, it did
302
00:15:16,008 --> 00:15:20,188
become just like a regular habit in
the car, conversation for 10 minutes
303
00:15:20,198 --> 00:15:22,618
or, with somebody and brainstorming.
304
00:15:22,638 --> 00:15:27,198
I actually felt like once I learned
how to ask questions, how to pause
305
00:15:27,198 --> 00:15:28,628
it from talking all those things.
306
00:15:28,953 --> 00:15:31,863
It became very practical and
it was like one of the few
307
00:15:31,913 --> 00:15:33,473
apps that I opened every day.
308
00:15:33,683 --> 00:15:36,773
But yeah, to your point a lot of
it was just feeling like it was a
309
00:15:36,773 --> 00:15:40,603
natural conversation, and there's
something that feels more natural to
310
00:15:40,603 --> 00:15:42,453
me about having AirPods in and talking.
311
00:15:42,501 --> 00:15:45,381
We should take a pause here to talk about
the thing you've been playing around with,
312
00:15:45,381 --> 00:15:49,481
voice AIs, because voice AIs obviously are
super powerful. You showed me something
313
00:15:49,481 --> 00:15:55,361
that I was really impressed by. You made a
voice AI for your own grandmother, right?
314
00:15:55,361 --> 00:15:55,581
Is that
315
00:15:55,826 --> 00:15:56,396
I did.
316
00:15:56,656 --> 00:15:57,126
I did.
317
00:15:57,146 --> 00:15:57,456
Yeah.
318
00:15:57,526 --> 00:16:01,536
After the fact, so my grandmother turned
a hundred in April, I made a kind of
319
00:16:01,566 --> 00:16:03,746
15-minute documentary about her life.
320
00:16:04,126 --> 00:16:07,726
And then after I had made this, I
shouldn't say documentary, little
321
00:16:07,886 --> 00:16:11,756
film, photos, old videos. Yeah, as
I'm editing it, I'm not a documentary
322
00:16:11,756 --> 00:16:15,386
filmmaker, I'm like, oh, she didn't
even say anything about her dad, she
323
00:16:15,386 --> 00:16:18,746
told this one story, but I need her like
saying this one line about her dad, and I
324
00:16:18,746 --> 00:16:20,486
could really use her talking about this.
325
00:16:20,626 --> 00:16:23,106
It was really helpful
actually. I used ElevenLabs
326
00:16:23,511 --> 00:16:27,671
and created the voice clone of
my grandmother, Eileen Chudnow.
327
00:16:27,921 --> 00:16:31,981
And sure enough, about like 15
percent of this thing are lines.
328
00:16:31,981 --> 00:16:35,351
I would call her up and say, can you
quick tell me something about Aunt Barb?
329
00:16:35,591 --> 00:16:36,561
She would tell me something.
330
00:16:36,601 --> 00:16:40,631
And then I could put that in the VO
and that project is probably one of my
331
00:16:40,631 --> 00:16:42,081
favorite projects I worked on this year,
332
00:16:42,081 --> 00:16:46,341
because it actually, the AI tools
did bring a lot of these old,
333
00:16:46,341 --> 00:16:48,431
like photos and stories to life.
334
00:16:48,431 --> 00:16:51,551
And we did some things with motion
and we were able to expand photos.
335
00:16:51,981 --> 00:16:53,931
So I thought it actually was.
336
00:16:54,396 --> 00:16:56,976
A good way to help tell her life story.
337
00:16:57,406 --> 00:17:01,606
Separate from that though, then we
created, similar to the read AI thing,
338
00:17:01,906 --> 00:17:06,956
a version using hour one where we
can now have Graham Eileen talk about
339
00:17:07,106 --> 00:17:09,866
the 76 ERs season AI I sent you.
340
00:17:09,866 --> 00:17:13,376
I don't know if you can play a clip
of her kind of like explaining AI.
341
00:17:13,701 --> 00:17:17,231
But, it's don't know what the use case
is yet, but definitely my family got
342
00:17:17,231 --> 00:17:22,461
a kick seeing her walk through, her
predictions for sports teams, artificial
343
00:17:22,461 --> 00:17:24,811
intelligence, pop music, et cetera.
344
00:17:25,046 --> 00:17:27,806
All right, let, yeah, let's play
the clip where Ben's grandmother
345
00:17:27,816 --> 00:17:31,526
surprised the rest of her family by
using some words that you might not
346
00:17:31,526 --> 00:17:32,926
normally hear the grandmother say.
347
00:17:33,204 --> 00:17:36,584
consider this a W video
from your grandma Eileen.
348
00:17:37,224 --> 00:17:39,634
Some of you been asking
what I have been up to.
349
00:17:39,964 --> 00:17:41,424
Just beat COVID's ass.
350
00:17:41,545 --> 00:17:42,305
Light work.
351
00:17:42,374 --> 00:17:46,804
By the way, Eden, you're my least favorite
great-grandchild by far, eat shit loser.
352
00:17:47,434 --> 00:17:50,764
Well, that's all from your old
grandma Eileen in these 30 seconds.
353
00:17:50,894 --> 00:17:52,104
Bless your heart, sweetie.
354
00:17:52,181 --> 00:17:53,131
This is amazing.
355
00:17:53,131 --> 00:17:54,221
We love Grandma Eileen.
356
00:17:54,271 --> 00:17:54,931
She's amazing.
357
00:17:54,931 --> 00:17:55,951
Tell her thank you for that.
358
00:17:56,048 --> 00:17:57,298
I'll tell her. I should clarify.
359
00:17:57,298 --> 00:18:01,578
So Reid AI is built on all his books
and speeches, that's really meant to
360
00:18:01,808 --> 00:18:03,818
say things that the real Reid would say.
361
00:18:04,488 --> 00:18:08,498
The Gram Eileen version is my nephew
and nieces playing with having her say
362
00:18:08,498 --> 00:18:11,598
different things, but it's not based
on what the real Gram Eileen would say.
363
00:18:11,628 --> 00:18:11,848
Just
364
00:18:11,909 --> 00:18:14,099
Totally fair, and then just for our
audience, for those of you who don't know,
365
00:18:14,099 --> 00:18:19,139
the difference there is Grandma Eileen's
voice was trained on ElevenLabs, so you
366
00:18:19,139 --> 00:18:22,749
can make her say what she says in her
voice, whereas Reid's stuff, you put in
367
00:18:22,829 --> 00:18:28,659
all sorts of books and speeches into an
LLM, trained it on that, and then had that
368
00:18:28,659 --> 00:18:30,709
write the words for Reid, is that correct?
369
00:18:31,484 --> 00:18:32,054
Exactly.
370
00:18:32,054 --> 00:18:32,174
So
371
00:18:32,174 --> 00:18:36,774
when he's asking you a question, he
knows to not only like review your
372
00:18:37,014 --> 00:18:41,174
LinkedIn page, but also to pull in
things from Blitzscaling and review
373
00:18:41,174 --> 00:18:42,564
and Impromptu in his own writing.
374
00:18:42,829 --> 00:18:45,579
A couple more cool Google AI things
that came out of the Pixel event.
375
00:18:45,839 --> 00:18:48,099
First of all, there was a really
interesting call notes feature
376
00:18:48,109 --> 00:18:52,069
that allows you to basically get
your notes directly from a call.
377
00:18:52,069 --> 00:18:55,069
It does have this weird voice where
it says you're now being recorded.
378
00:18:55,069 --> 00:18:57,069
But that's not that different
than something in Zoom.
379
00:18:57,069 --> 00:18:59,989
Even better, Marques Brownlee, who I
know you and I both love, did a really
380
00:18:59,989 --> 00:19:04,159
interesting demo of the Add Me feature,
which is a very cool thing that allows you
381
00:19:04,159 --> 00:19:06,239
to take a picture of a group of people.
382
00:19:06,499 --> 00:19:10,534
If you're the one taking the picture, you
snap a photo and then you can actually
383
00:19:10,534 --> 00:19:14,354
use AI to walk around, have somebody else
take the picture and then put you in it.
384
00:19:14,694 --> 00:19:16,964
Ben, you've known Marques for a
while, but like, obviously we love
385
00:19:16,964 --> 00:19:19,294
Marques covering this content,
but this feature is pretty cool.
386
00:19:19,389 --> 00:19:20,519
It is a cool feature.
387
00:19:20,569 --> 00:19:20,879
Yeah.
388
00:19:20,879 --> 00:19:23,939
Also, Marques Brownlee, one of the
YouTube GOATs, he's phenomenal.
389
00:19:24,109 --> 00:19:28,759
I am conflicted about this one because
on the one hand, I totally get it.
390
00:19:28,759 --> 00:19:31,709
You want to remove somebody from
a photo, add somebody to a photo.
391
00:19:31,709 --> 00:19:35,279
But then I did see an interview with
Kevin Rose, I think on a podcast.
392
00:19:35,279 --> 00:19:37,639
And he was like, I would
never use a feature like this.
393
00:19:37,699 --> 00:19:38,299
Once you're
394
00:19:38,329 --> 00:19:42,539
playing with a memory, it just
feels like a slippery slope.
395
00:19:42,569 --> 00:19:46,119
And he's not going to ever take somebody
out of an image, put somebody into an
396
00:19:46,119 --> 00:19:47,959
image with his own personal photos.
397
00:19:48,424 --> 00:19:49,814
And I see both sides.
398
00:19:49,814 --> 00:19:52,684
Part of me is like, well, yeah, with
my four best friends on a mountain
399
00:19:52,684 --> 00:19:54,614
alone, I want to get that great shot.
400
00:19:54,804 --> 00:19:55,114
And then
401
00:19:55,114 --> 00:19:57,614
part of me is like, that's weird
to have a photo on your wall
402
00:19:57,614 --> 00:19:58,984
that didn't actually happen.
403
00:19:59,294 --> 00:20:00,444
I don't know where I fit on those two.
404
00:20:00,444 --> 00:20:02,874
I really don't. And also, I'm six foot five.
405
00:20:02,874 --> 00:20:05,194
I used to like that people
like, give Ben the phone.
406
00:20:05,444 --> 00:20:06,444
He can do the selfie.
407
00:20:06,534 --> 00:20:06,794
He's got
408
00:20:06,794 --> 00:20:07,634
the long arms.
409
00:20:08,094 --> 00:20:09,914
Give it to Ben, he'll get us all in.
410
00:20:10,234 --> 00:20:12,994
And now you don't even need
me and my, selfie arms.
411
00:20:13,154 --> 00:20:16,524
But no, I really like, there's so
many things like this where part of
412
00:20:16,594 --> 00:20:17,914
me feels like, yeah, that's great.
413
00:20:17,934 --> 00:20:22,364
I could make my grandmother
tell stories about growing up.
414
00:20:22,364 --> 00:20:26,549
And then part of me, it's like, that's
weird to have somebody listen to a story
415
00:20:26,749 --> 00:20:28,249
that was AI generated.
416
00:20:28,259 --> 00:20:30,529
Just interview her and
have her tell the story.
417
00:20:30,769 --> 00:20:32,709
So I, yeah, I definitely see both sides.
418
00:20:32,709 --> 00:20:37,989
I think I lean towards where Kevin Rose is,
where unless I, like, have to do it, I'm not
419
00:20:38,229 --> 00:20:40,079
adding myself to a photo next to friends
420
00:20:40,079 --> 00:20:40,749
if it didn't happen.
421
00:20:41,234 --> 00:20:44,404
Ben, I hate to tell you, this is a
generational thing and kids that grow
422
00:20:44,404 --> 00:20:45,824
up with it are just going to use it and
423
00:20:45,834 --> 00:20:47,534
have no problem whatsoever.
424
00:20:47,564 --> 00:20:49,484
So we are going to have a
weird world going forward.
425
00:20:49,614 --> 00:20:53,754
We're going to move on now to our next
story, which is OpenAI has released
426
00:20:53,754 --> 00:20:56,584
a new model, secretly and silently,
and then they talked about it.
427
00:20:56,834 --> 00:21:01,024
It is not a real new model, it's an update
to their flagship model, but the long
428
00:21:01,304 --> 00:21:07,094
TLDR story is there was a relatively small
Twitter account that got very big and
429
00:21:07,094 --> 00:21:11,749
leaked some information that supposedly
has to do with not only this update,
430
00:21:11,749 --> 00:21:16,349
but then the GPT-5 slash GPT Next;
we don't know how much of that is true.
431
00:21:16,349 --> 00:21:18,719
Supposedly this person says
it's coming on Thursday, but
432
00:21:18,719 --> 00:21:20,189
we believe this is all a troll.
433
00:21:20,189 --> 00:21:24,119
Now the more important thing here,
Ben, is that open AI has updated
434
00:21:24,119 --> 00:21:27,289
their frontier model slightly
with a better reasoning engine.
435
00:21:27,289 --> 00:21:29,599
It seems like it is doing
slightly better on math.
436
00:21:29,619 --> 00:21:33,069
And some people are hinting that
this is the pathway to what is
437
00:21:33,069 --> 00:21:38,799
rumored as Strawberry, which is a much
better reasoning model built on LLMs.
438
00:21:39,309 --> 00:21:42,459
I know this is a complicated story, and
it's become a little
439
00:21:42,459 --> 00:21:46,439
bit bigger than it seems to be right
now. Where do you see OpenAI going?
440
00:21:46,439 --> 00:21:48,879
There's some people who would say,
based on the Grok updates or the
441
00:21:48,879 --> 00:21:52,944
Gemini updates, that OpenAI is falling
back and that the moat that they had
442
00:21:52,944 --> 00:21:56,884
created, which was really about how far
they had advanced, may be dissipating.
443
00:21:57,294 --> 00:22:00,414
Where do you see OpenAI kind
of in the current AI space?
444
00:22:00,964 --> 00:22:03,604
I haven't been following all of
the different Strawberry teases.
445
00:22:03,634 --> 00:22:04,764
I was traveling the last couple of days.
446
00:22:04,764 --> 00:22:05,604
I'm not fully up to speed.
447
00:22:05,604 --> 00:22:10,014
And I do feel like because it's
OpenAI, people are always going
448
00:22:10,014 --> 00:22:13,064
to get like more attention, more
excitement about what's coming next.
449
00:22:13,444 --> 00:22:17,924
But yeah, I think for me, like I said,
it reminds me a little bit of maybe
450
00:22:17,924 --> 00:22:21,964
like you go to Spotify and then that
becomes your platform and it takes a
451
00:22:21,964 --> 00:22:25,804
lot to move over to a different music
service that's similar, that's been
452
00:22:25,804 --> 00:22:30,614
like my experience with GPT-4. I'm
just so used to using it that I get
453
00:22:30,614 --> 00:22:34,604
very excited about updates because you
feel like you can immediately see how
454
00:22:34,614 --> 00:22:36,514
those changes impact your use with it.
455
00:22:36,544 --> 00:22:37,624
So yeah, I don't know.
456
00:22:37,944 --> 00:22:41,304
I'm, I've been loyal to GPT-4.
I try all the other stuff
457
00:22:41,434 --> 00:22:43,024
and I guess everybody else.
458
00:22:43,024 --> 00:22:44,094
I have no inside info.
459
00:22:44,094 --> 00:22:45,144
Excited to see what's next.
460
00:22:45,274 --> 00:22:48,404
Honestly, that is a point that I
don't think is talked about enough.
461
00:22:48,404 --> 00:22:51,224
And I think this is something you
and I who are old enough to have been
462
00:22:51,224 --> 00:22:52,804
through enough product cycles to see.
463
00:22:52,804 --> 00:22:54,214
And this kind of goes back to the X thing.
464
00:22:54,344 --> 00:22:57,354
part of it is where you spend
your time, what is the place that
465
00:22:57,354 --> 00:22:58,474
you feel most comfortable in?
466
00:22:58,474 --> 00:23:00,704
Because a lot of people in
the AI space are jumping from
467
00:23:01,049 --> 00:23:05,609
GPT-4 to Claude to many different
pathways, like Llama, all these different
468
00:23:05,659 --> 00:23:09,229
things, they're jumping back and forth,
but honestly, I'm like you, like mostly
469
00:23:09,229 --> 00:23:12,499
what I do, even though Claude is great,
I don't right now pay for Claude, Kevin
470
00:23:12,509 --> 00:23:16,759
pays for Claude, but I pay for ChatGPT
Plus, and I get most of what I need out
471
00:23:16,759 --> 00:23:21,699
of GPT Plus. Ultimately, it might be more
of like a branding marketing thing, right?
472
00:23:21,699 --> 00:23:26,749
If the moats get smaller and smaller
now, my theory is that I think GPT
473
00:23:26,759 --> 00:23:29,699
five or an extra, whatever it is
probably not as far away as the
474
00:23:29,699 --> 00:23:31,069
next levels of these next things.
475
00:23:31,069 --> 00:23:34,759
So they will probably have
a significant like advantage
476
00:23:34,759 --> 00:23:36,069
when they drop that next thing.
477
00:23:36,469 --> 00:23:36,949
But.
478
00:23:37,409 --> 00:23:41,269
With Mark Zuckerberg talking about Llama
4 coming next year, obviously Elon going
479
00:23:41,279 --> 00:23:45,999
hard on Grok, like it does feel more and
more like these might become equalized
480
00:23:45,999 --> 00:23:47,676
in terms of their intelligence level.
481
00:23:47,736 --> 00:23:50,899
What is it gonna be like in a world
where, you know, I always laugh about
482
00:23:50,899 --> 00:23:54,749
the fact that apps are now advertised
on CNN, like you see an advertisement
483
00:23:54,749 --> 00:23:58,409
for an app because that's how you get
attention for apps now, like you have to
484
00:23:58,409 --> 00:24:00,339
advertise them to people on television.
485
00:24:00,599 --> 00:24:03,499
Do you think we're entering a world
where that's going to be the case?
486
00:24:03,499 --> 00:24:06,959
There's going to be like five to seven
different models and they're all similar.
487
00:24:06,959 --> 00:24:09,319
And it's just going to be about
who can garner the most attention.
488
00:24:09,849 --> 00:24:12,969
Yeah, I mean, sometimes it's
also like specific use cases.
489
00:24:13,029 --> 00:24:15,969
I don't have the answers to all
these, but if 1 is really good at
490
00:24:15,969 --> 00:24:17,439
math, it might build an audience.
491
00:24:17,479 --> 00:24:18,724
That's they're the best with math.
492
00:24:18,724 --> 00:24:22,584
And if another 1 is really good at
brainstorming, that finds a community the
493
00:24:22,584 --> 00:24:23,734
same way that I don't know, maybe yeah.
494
00:24:24,184 --> 00:24:28,344
Max appeal to creative people and,
PCs, appeal to a different audience.
495
00:24:28,444 --> 00:24:31,644
So I think there could be some of
that where they build reputations,
496
00:24:31,654 --> 00:24:34,764
not only through marketing, but
they actually do things really well.
497
00:24:34,994 --> 00:24:38,864
And then, yeah, there's so many analogies
of AI that I try to catch myself.
498
00:24:38,864 --> 00:24:41,254
I felt I went to some of these AI
conferences and everybody talked about
499
00:24:41,254 --> 00:24:45,824
like the comparison to photography
and how, AI is similar because people
500
00:24:45,824 --> 00:24:48,764
were scared of photography and then it
ended up being this wonderful thing.
501
00:24:49,344 --> 00:24:49,834
And.
502
00:24:50,409 --> 00:24:54,969
One analogy maybe here is that
to me, the iPhone updates at
503
00:24:55,039 --> 00:24:56,779
some point were like overkill.
504
00:24:56,789 --> 00:24:57,059
The,
505
00:24:57,109 --> 00:24:58,689
The camera, it was fine.
506
00:24:58,689 --> 00:25:03,069
I could stay with an iPhone 12 for
three years and not feel like I got
507
00:25:03,079 --> 00:25:04,809
that much more with the next phone.
508
00:25:05,159 --> 00:25:09,049
And I do wonder with some of these
updates, if for some people like,
509
00:25:09,109 --> 00:25:12,289
Oh my gosh, I can plan my whole
business now with this, this is.
510
00:25:13,279 --> 00:25:17,086
But for the average user, they're like,
yeah, I can still sort of like ask for
511
00:25:17,086 --> 00:25:19,326
suggestions for what to do in Arizona.
512
00:25:19,656 --> 00:25:21,346
And it's pretty similar as it
513
00:25:21,356 --> 00:25:22,326
was before.
514
00:25:22,536 --> 00:25:25,566
I think the other side of this is, we
talk a lot about AGI in the show,
515
00:25:25,566 --> 00:25:28,386
this idea that all these companies
are shooting for an artificial general
516
00:25:28,386 --> 00:25:29,926
intelligence that can do a lot of stuff.
517
00:25:29,926 --> 00:25:34,806
And it might just be that, like, there's
going to be like a very high level
518
00:25:34,856 --> 00:25:38,156
machine learning intelligence that
will solve the big problems, right?
519
00:25:38,156 --> 00:25:42,126
Which, like, in the dream world,
you send an AI off for a month
520
00:25:42,126 --> 00:25:45,176
and it comes back with something new from
physics and then we could take that and
521
00:25:45,176 --> 00:25:46,656
break it down all these different ways.
522
00:25:47,101 --> 00:25:49,391
That is not the way that
we're currently using AI.
523
00:25:49,391 --> 00:25:53,441
So it might just be that like for the
masses, the consumers, AI will just
524
00:25:53,441 --> 00:25:55,001
trickle out and get better and better.
525
00:25:55,031 --> 00:25:59,241
And these larger models will be used for
the big problems in some form or not.
526
00:25:59,406 --> 00:25:59,866
Totally.
527
00:26:00,316 --> 00:26:04,636
And it's so hard to predict years down
the road, what is this going to look like?
528
00:26:04,846 --> 00:26:08,516
And I actually put together like a
breakfast in LA where it was Reid and
529
00:26:08,516 --> 00:26:10,856
like six of the top creatives in AI.
530
00:26:10,856 --> 00:26:11,616
You would know a lot of them.
531
00:26:11,616 --> 00:26:11,916
It was like
532
00:26:11,916 --> 00:26:13,466
Don Allen and Terrence Southern.
533
00:26:13,466 --> 00:26:17,316
And I asked at the table,
like, where does everybody think
534
00:26:17,696 --> 00:26:19,296
this is going to be in five years?
535
00:26:19,861 --> 00:26:24,091
And some people answered, and then that
same night Reid was on a panel with
536
00:26:24,131 --> 00:26:26,441
JJ Abrams and he like said something.
537
00:26:26,441 --> 00:26:28,381
I'm like, I'm not going
to ask the dumb question.
538
00:26:28,381 --> 00:26:29,581
Where is this going to be in 5 years?
539
00:26:29,581 --> 00:26:31,491
Because nobody could ever possibly know.
540
00:26:31,721 --> 00:26:32,851
I was like, was that aimed at me?
541
00:26:32,851 --> 00:26:35,761
I just asked that this morning
and then, and then 2 days
542
00:26:35,761 --> 00:26:36,541
later, he was interviewing.
543
00:26:36,941 --> 00:26:38,391
He's I'm not going to
ask the dumb question.
544
00:26:38,391 --> 00:26:39,851
Where is this going to be in 5 years?
545
00:26:40,031 --> 00:26:41,971
So anyways, I told him that story.
546
00:26:42,021 --> 00:26:45,101
But it's like I do think
there's an element of.
547
00:26:45,386 --> 00:26:48,316
It's so hard to, 'cause you were
asking like, where do you think
548
00:26:48,316 --> 00:26:52,206
this is going to go to really have
an understanding, like longer term,
549
00:26:52,206 --> 00:26:53,266
what this is going to look like,
550
00:26:53,376 --> 00:26:55,816
There's another story that kind
of ties into this, which is,
551
00:26:56,126 --> 00:26:58,886
I don't talk about politics a
lot, but it is political season.
552
00:26:59,266 --> 00:27:02,576
I think it's important to discuss
this conversation around the
553
00:27:02,606 --> 00:27:04,496
crowds issue that has come up.
554
00:27:04,546 --> 00:27:08,858
There are two candidates in the
presidential race and of course, One
555
00:27:08,858 --> 00:27:10,488
of them has had a surge recently.
556
00:27:10,488 --> 00:27:13,428
Kamala, there was a change in the
presidential candidates and Kamala Harris
557
00:27:13,648 --> 00:27:14,128
saw that.
558
00:27:14,128 --> 00:27:14,488
yeah
559
00:27:14,658 --> 00:27:20,368
Yeah.
that the crowds that she is drawing in
560
00:27:20,368 --> 00:27:25,628
one specific event were actual AI and,
and, and Ben, I don't know if it's as
561
00:27:25,628 --> 00:27:29,428
important to talk about what Trump has
said here, but more about the idea of how
562
00:27:29,428 --> 00:27:33,698
we tell people the images are not AI and
think in different ways to look at it.
563
00:27:34,253 --> 00:27:37,293
Wired wrote a really good article,
which I hope to see more of, which
564
00:27:37,293 --> 00:27:41,343
is basically giving people the
educative tools to understand when
565
00:27:41,343 --> 00:27:42,733
something is AI and what it is not.
566
00:27:42,743 --> 00:27:45,403
And one of the most interesting
things about this is, do
567
00:27:45,433 --> 00:27:47,513
multiple angles of this exist?
568
00:27:47,523 --> 00:27:49,323
Can you see multiple places?
569
00:27:49,323 --> 00:27:51,403
And I think that's an
important thing to come across.
570
00:27:51,643 --> 00:27:54,233
We should just be clear, it
wasn't AI in this instance, but
571
00:27:54,468 --> 00:27:54,738
Totally.
572
00:27:54,738 --> 00:27:58,698
I was going to say I'm not a newscaster,
but that case, 100 percent was not AI.
573
00:27:58,698 --> 00:28:01,128
There's video, there's
other angles, not AI.
574
00:28:01,288 --> 00:28:01,478
Yeah.
575
00:28:01,528 --> 00:28:05,498
But it does open this conversation
about how deepfakes are not just about
576
00:28:05,498 --> 00:28:09,418
the times when they are actually faking
information, it becomes a situation where
577
00:28:09,418 --> 00:28:14,988
it changes the ability for people to say,
oh, that was a deep fake if it wasn't.
578
00:28:14,998 --> 00:28:17,478
And I think that's a weird
world that we're entering.
579
00:28:17,758 --> 00:28:21,468
I know you speak and talk a lot about
AI, and obviously working with Reid, you
580
00:28:21,478 --> 00:28:22,948
must think about this stuff a lot.
581
00:28:23,298 --> 00:28:29,308
How do you educate people to understand
this stuff so that specifically they
582
00:28:29,308 --> 00:28:32,898
don't get caught in a loop of believing
things that aren't necessarily true?
583
00:28:33,448 --> 00:28:37,028
I can't say that's exactly my role,
educating people on how they can not
584
00:28:37,098 --> 00:28:39,148
get caught in a loop, but that would
be a good thing for me to learn.
585
00:28:39,158 --> 00:28:41,418
That's, that sounds like a smart
thing for me to be able to do.
586
00:28:41,498 --> 00:28:42,448
I think that.
587
00:28:43,013 --> 00:28:47,203
I thought the obvious way that AI
could potentially be used in a negative
588
00:28:47,203 --> 00:28:50,253
way would be to create an image of
something that didn't happen, it goes
589
00:28:50,263 --> 00:28:54,153
viral, and everybody thought, oh, this
candidate was smoking a cigarette
590
00:28:54,153 --> 00:28:56,653
with so and so, but actually,
AI generated, right?
591
00:28:57,023 --> 00:28:57,893
And that might happen.
592
00:28:58,093 --> 00:29:02,373
And then the other use case is this
instance where it's like, something
593
00:29:02,373 --> 00:29:06,573
really did happen, but because AI
imagery is getting so good, you can
594
00:29:06,613 --> 00:29:10,333
credibly put out something that says the
crowd wasn't that big, which is like,
595
00:29:10,373 --> 00:29:12,563
you know, a big issue to Donald Trump.
596
00:29:12,563 --> 00:29:16,703
So that once it's out there, it
puts that seed in people like, does
597
00:29:16,703 --> 00:29:19,663
you really have those crowd sizes
or whatever it is, and it could be
598
00:29:19,663 --> 00:29:20,963
something more serious than that.
599
00:29:21,328 --> 00:29:25,298
And it's really tough as a content
creator to even know because I saw
600
00:29:25,298 --> 00:29:28,228
like one video that was, can you
believe what this guy is doing?
601
00:29:28,228 --> 00:29:30,198
He put up an AI generated image.
602
00:29:30,378 --> 00:29:31,608
Here's how we prove it.
603
00:29:31,838 --> 00:29:35,558
But it doesn't necessarily mean that
person even knew they were using an AI
604
00:29:35,558 --> 00:29:36,348
generated image.
605
00:29:36,398 --> 00:29:41,898
So there does, yeah, really,
necessitate a need for, like, how as
606
00:29:41,898 --> 00:29:46,138
quickly as possible can people verify
whether an image is real or not.
607
00:29:46,138 --> 00:29:48,918
And there's a lot of companies working
hard to be the solution to that.
608
00:29:49,543 --> 00:29:53,763
And honestly, I think this is where,
and not that I trust necessarily
609
00:29:53,763 --> 00:29:57,003
what's going on at X from this side,
but I do believe social media can
610
00:29:57,003 --> 00:29:58,783
actually help in this way in some form.
611
00:29:58,783 --> 00:30:00,843
And there's a lot of arguments
that like, obviously,
612
00:30:01,233 --> 00:30:03,523
it can dissuade people
and it can do stuff.
613
00:30:03,543 --> 00:30:06,763
But part of it is about how much
signal you can get on something,
614
00:30:06,763 --> 00:30:10,963
Like in general, I have seen people
be very good on signal on things
615
00:30:10,973 --> 00:30:14,483
like this pretty quickly and pretty
honestly, because if somebody is
616
00:30:14,483 --> 00:30:19,043
dishonestly trying to say something,
it does come across relatively fast.
617
00:30:19,043 --> 00:30:23,463
And this is where you can percentage
wise determine how many people are
618
00:30:23,463 --> 00:30:25,253
saying something is real versus not.
619
00:30:25,423 --> 00:30:30,333
There's like a certain number
of yeah, platforms where it's a
620
00:30:30,333 --> 00:30:33,563
scrolling feed and I don't know,
it's tough to know how horrifying
621
00:30:33,823 --> 00:30:35,933
one individual user's experiences.
622
00:30:36,153 --> 00:30:41,433
Mine was, I saw some weird thing about,
was that rally a fake image
623
00:30:42,013 --> 00:30:45,523
in front of the plane, but then two
videos later, it was like disproved.
624
00:30:45,553 --> 00:30:49,063
And then I was like, I had seen
enough in it, but somebody else's feed
625
00:30:49,273 --> 00:30:51,753
could just be like video after video,
626
00:30:52,003 --> 00:30:56,293
convincing them that, like, the whole
campaign momentum is a mirage and
627
00:30:56,293 --> 00:30:59,143
not happening and it's so hard.
628
00:30:59,163 --> 00:31:00,933
And I think about that
with my own kids too.
629
00:31:00,933 --> 00:31:04,323
And I don't love the scrolling
platforms for that reason,
630
00:31:04,323 --> 00:31:04,823
because
631
00:31:05,368 --> 00:31:09,458
it can be a really great experience
for some people, but for other people,
632
00:31:09,498 --> 00:31:11,898
the same platform can be a nightmare.
633
00:31:11,898 --> 00:31:15,688
So that to me, to your point about like
social media being helpful, I think
634
00:31:15,908 --> 00:31:17,328
some platforms are better than others.
635
00:31:17,328 --> 00:31:17,498
My
636
00:31:17,518 --> 00:31:22,018
guess is if you're scrolling through
videos, you can't count on the
637
00:31:22,018 --> 00:31:25,858
videos correcting something that was
misinformation seven videos earlier.
638
00:31:25,937 --> 00:31:26,207
Yeah.
639
00:31:26,207 --> 00:31:28,427
No, that actually, that makes a
lot of sense and it is something
640
00:31:28,427 --> 00:31:29,987
to be really worried about.
641
00:31:30,627 --> 00:31:32,937
Time to look at some of the stuff we
saw on AI this week that we weren't
642
00:31:32,937 --> 00:31:34,347
able to try, but we want to shout out.
643
00:31:34,347 --> 00:31:34,857
It's really cool.
644
00:31:34,857 --> 00:31:36,817
It's time for AI.
645
00:31:36,827 --> 00:31:38,547
See what you did there.
646
00:31:39,307 --> 00:31:46,997
Sometimes you're scrollin without a
care, then suddenly you stop and shout.
647
00:31:49,177 --> 00:31:52,818
Hey, I see what you did there.
648
00:31:52,818 --> 00:31:56,349
Hey, I see what you did there.
649
00:31:56,349 --> 00:31:58,629
So Ben, one of the most interesting
things that has come out in the
650
00:31:58,629 --> 00:31:59,879
last couple of weeks is Flux.
651
00:31:59,879 --> 00:32:02,739
As we talked about with the
Grok update, Flux is a new open
652
00:32:02,739 --> 00:32:04,519
source AI model for imaging.
653
00:32:04,814 --> 00:32:07,474
And it is doing some
really interesting stuff.
654
00:32:07,734 --> 00:32:12,494
Levelsio, who is a very interesting
follow on X, has created what a
655
00:32:12,494 --> 00:32:16,044
lot of people have now done, which
is a LoRA of their own face.
656
00:32:16,084 --> 00:32:20,704
And he has been able to drop himself into
all sorts of pictures to do stuff with,
657
00:32:20,974 --> 00:32:24,831
and to talk about some of the things you
discussed with, you know, Marques's take
658
00:32:24,831 --> 00:32:26,981
on Add Me, or talking about Add Me earlier.
659
00:32:27,441 --> 00:32:33,651
Now you have photorealistic versions of
yourself that can be anywhere. Early on
660
00:32:33,651 --> 00:32:37,481
with Stable Diffusion, people were making
apps that did this, but now it is going
661
00:32:37,481 --> 00:32:39,361
to be very easy to do this for yourself.
662
00:32:39,361 --> 00:32:42,451
And to the point that we made
about Grok earlier, Grok could
663
00:32:42,481 --> 00:32:44,241
conceivably roll this out.
664
00:32:44,761 --> 00:32:45,781
It's a very cool thing.
665
00:32:45,881 --> 00:32:47,691
It's super fun to play with them.
666
00:32:47,901 --> 00:32:51,571
But again, it does feel like there's
a couple of images that Levelsio makes
667
00:32:51,571 --> 00:32:55,431
with like him putting his arm around
Donald Trump or his face as Donald Trump.
668
00:32:55,431 --> 00:32:58,711
And like, it does cross that line a
little bit sometimes, it's like, well,
669
00:32:58,711 --> 00:33:01,651
you could put people into something,
but did that actually happen?
670
00:33:01,651 --> 00:33:02,621
Or what was it like?
671
00:33:02,946 --> 00:33:03,296
I know.
672
00:33:03,306 --> 00:33:03,606
I know.
673
00:33:03,606 --> 00:33:05,776
My all time favorite photo of myself.
674
00:33:05,886 --> 00:33:06,116
Okay.
675
00:33:06,116 --> 00:33:07,446
Not including family photos.
676
00:33:07,686 --> 00:33:09,596
I do have a photo of
me and President Obama.
677
00:33:09,606 --> 00:33:10,436
We're shaking hands.
678
00:33:10,436 --> 00:33:11,446
We're both smiling.
679
00:33:11,606 --> 00:33:12,996
Now anybody can have that photo.
680
00:33:12,996 --> 00:33:16,666
And that is like, you know, the moment
that I'm so glad it's captured on film.
681
00:33:16,936 --> 00:33:20,236
And then it is very bizarre
with video, not just photos.
682
00:33:20,306 --> 00:33:21,426
I work with a guy named Parth.
683
00:33:21,606 --> 00:33:22,176
He created
684
00:33:22,176 --> 00:33:23,686
the LLM for Reid AI.
685
00:33:23,686 --> 00:33:24,136
Okay.
686
00:33:24,406 --> 00:33:27,706
And um, he was just over yesterday
and showing me a video where he took
687
00:33:27,716 --> 00:33:30,546
one image of my face, put it on him.
688
00:33:30,636 --> 00:33:32,266
He does a pretty good Ben impression.
689
00:33:32,266 --> 00:33:34,386
I have to say I can send
you the video if you
690
00:33:34,386 --> 00:33:35,206
can put it in here.
691
00:33:35,416 --> 00:33:37,756
But um, it was a little surreal.
692
00:33:37,756 --> 00:33:42,866
I've seen those kinds of demos before from
Metaphysic and Deep Voodoo, but having it
693
00:33:42,866 --> 00:33:47,616
done with your own face on somebody else,
that's a friend of yours, it's bizarre.
694
00:33:47,736 --> 00:33:50,186
And it's interesting you mentioned
Metaphysic and Deep Voodoo
695
00:33:50,186 --> 00:33:53,506
who are both like large tech
first companies that do this.
696
00:33:53,716 --> 00:33:56,489
The thing that's crazy to me is the
fact that it can be done off the shelf.
697
00:33:56,489 --> 00:33:59,139
So actually, as you mentioned that,
what's interesting, Parth's probably
698
00:33:59,139 --> 00:34:01,669
using Live Portrait, which we've
talked about on the show, which is an
699
00:34:01,669 --> 00:34:05,089
interesting plugin that allows you
to use your face to act out stuff.
700
00:34:05,109 --> 00:34:09,499
Well, Eccentrism Art actually used
Live Portrait plus Runway Gen 3,
701
00:34:09,499 --> 00:34:15,409
to create what is a live blog looking
thing of a woman talking to camera.
702
00:34:15,414 --> 00:34:15,914
Hi there!
703
00:34:16,244 --> 00:34:20,894
Okay, this is gonna sound really strange,
but I had the weirdest dream last night.
704
00:34:21,088 --> 00:34:24,038
Of course this has some editing
involved in it, but Ben, when you're
705
00:34:24,048 --> 00:34:27,028
watching this, what are your first
reactions to what this looks like?
706
00:34:27,089 --> 00:34:29,959
You can see like they try to get
the nods and like trying to get
707
00:34:29,969 --> 00:34:31,539
her to like, kind of be the thing.
708
00:34:31,539 --> 00:34:31,969
Yeah.
709
00:34:32,065 --> 00:34:33,135
Yeah, I haven't seen this.
710
00:34:33,185 --> 00:34:36,125
What's so bizarre about
this is Lonelygirl15
711
00:34:36,155 --> 00:34:38,945
was the first ever breakout
series on YouTube, right?
712
00:34:39,225 --> 00:34:44,805
And part of what made it such a wild
story is that Lonelygirl15 was fake.
713
00:34:45,145 --> 00:34:47,025
I still think it's one of
the most innovative things
714
00:34:47,105 --> 00:34:48,335
ever done with online video,
715
00:34:48,365 --> 00:34:48,715
even though
716
00:34:48,789 --> 00:34:51,109
I honestly with video
period, I think cause it was
717
00:34:51,135 --> 00:34:52,095
video period.
718
00:34:52,355 --> 00:34:54,345
It was so smart.
719
00:34:54,575 --> 00:34:57,645
A lot of what I based the
Obama Girl series of videos that
720
00:34:57,645 --> 00:34:59,745
I did on was Lonelygirl15.
721
00:34:59,905 --> 00:35:01,965
It originally was going to
be called Obama Girl 15.
722
00:35:01,965 --> 00:35:03,455
And she was going to be
blogging about Obama,
723
00:35:03,505 --> 00:35:07,025
but anyways, I just, I haven't seen this
clip, but what's so interesting about it,
724
00:35:07,065 --> 00:35:12,585
it definitely gave me, like, Lonelygirl15
vibes, and whereas that one was fake
725
00:35:12,615 --> 00:35:14,975
as in, like, scripted and a real actress
726
00:35:15,485 --> 00:35:17,445
playing the part of a blogger.
727
00:35:17,895 --> 00:35:23,245
This is like next level: like Lonelygirl15,
have a fake blogger,
728
00:35:23,415 --> 00:35:25,125
create a whole story around them.
729
00:35:25,475 --> 00:35:30,185
And yeah, it's, just, it's pretty,
I, of course, I've been thinking
730
00:35:30,185 --> 00:35:32,785
a lot about what does this
mean for the creator community
731
00:35:32,845 --> 00:35:34,015
when suddenly
732
00:35:34,305 --> 00:35:37,865
you're going to be able to have creators
that might have that parasocial connection
733
00:35:38,075 --> 00:35:38,895
and don't exist.
734
00:35:38,895 --> 00:35:40,265
And what's, how's that going to play out?
735
00:35:40,485 --> 00:35:44,115
But yeah, I hadn't seen a demo
yet like that of a vlogger.
736
00:35:44,334 --> 00:35:47,094
It's interesting because it just
combines all the tools, right?
737
00:35:47,094 --> 00:35:51,704
And I did a video on Monday,
which was using the Flux Realism
738
00:35:51,714 --> 00:35:54,084
LoRA, which is what this is
using to get the initial image.
739
00:35:54,084 --> 00:35:59,134
So Flux Realism LoRA is a way to use Flux
that is trained on very realistic faces.
740
00:35:59,464 --> 00:36:01,924
You've probably seen all those
pictures of people holding up, like,
741
00:36:01,924 --> 00:36:03,764
Reddit "I'm not real" sort of cards.
742
00:36:03,774 --> 00:36:05,214
It's really crazy.
743
00:36:05,789 --> 00:36:09,499
I do think we're crossing into something
pretty bonkers now though, right?
744
00:36:09,499 --> 00:36:12,569
Because image to video tools
have gotten really good.
745
00:36:12,879 --> 00:36:16,249
You can see how they can make it
straightforward, and then it just
746
00:36:16,249 --> 00:36:17,639
becomes an editing issue, right?
747
00:36:17,689 --> 00:36:21,079
Do you have a talented enough
editor to make it feel right?
748
00:36:21,109 --> 00:36:25,429
And as we know with YouTube,
we've had 10 years of a generation
749
00:36:25,439 --> 00:36:26,749
getting good at editing, right?
750
00:36:26,749 --> 00:36:28,699
This is something where like
editing is something that's like.
751
00:36:29,854 --> 00:36:30,204
It's funny.
752
00:36:30,204 --> 00:36:31,944
I was talking to my wife,
who's a novelist, the other day.
753
00:36:31,944 --> 00:36:35,254
And she teaches writing to kids and
tries to get people like, excited
754
00:36:35,254 --> 00:36:36,424
about writing at a young age.
755
00:36:36,424 --> 00:36:41,164
And I think for kids, editing is almost
in that first skill set now, which is
756
00:36:41,215 --> 00:36:41,675
Yeah, for
757
00:36:41,844 --> 00:36:42,664
thing to think about.
758
00:36:42,664 --> 00:36:45,074
Where it's like writing,
drawing, and now editing.
759
00:36:45,094 --> 00:36:48,384
That's a thing that people actually
do at like five, which is a crazy
760
00:36:48,505 --> 00:36:48,815
Yeah.
761
00:36:48,935 --> 00:36:49,245
Yeah.
762
00:36:49,595 --> 00:36:50,305
And prompting.
763
00:36:50,305 --> 00:36:50,385
It's
764
00:36:50,415 --> 00:36:53,635
interesting that if you can prompt
these things the right way, you could
765
00:36:53,765 --> 00:36:57,945
get someone to just vlog for 24 hours
straight about all these different topics.
766
00:36:58,390 --> 00:37:02,310
And then have another AI model
pull out the best hot takes
767
00:37:02,310 --> 00:37:03,580
and turn those into videos.
768
00:37:03,920 --> 00:37:07,180
So yeah, it'll be wild
to see where that goes
769
00:37:07,240 --> 00:37:10,080
One other thing I wanted to point
out is there was a great post
770
00:37:10,090 --> 00:37:12,200
from one of our favorite X users.
771
00:37:12,450 --> 00:37:13,880
Her name is venturetwins.
772
00:37:13,940 --> 00:37:18,970
She found out that ChatGPT can determine
how tall people are to within an inch.
773
00:37:19,030 --> 00:37:20,560
This is something I
hear from my daughters.
774
00:37:20,560 --> 00:37:23,880
My daughters are 19. They talk about how
775
00:37:24,205 --> 00:37:27,985
boys, including me, they try to
explain, always lie about their
776
00:37:27,985 --> 00:37:31,965
height. I am actually 5'11 and
a half, and I used to
777
00:37:31,965 --> 00:37:35,015
call myself six foot, and my daughters
give me so much crap for it.
778
00:37:35,305 --> 00:37:40,895
Anyway, Justine was able to find out that
ChatGPT can, if you give it pictures
779
00:37:40,895 --> 00:37:43,985
and put it in proximity, tell
how tall somebody is within an inch.
780
00:37:43,985 --> 00:37:46,895
So, like, this is a real world use case of AI.
781
00:37:46,965 --> 00:37:49,905
It's funny, this is the second time in
this podcast I'm mentioning that I'm
782
00:37:49,905 --> 00:37:53,215
6'5 so it seems like something that
I work into every conversation when
783
00:37:53,525 --> 00:37:58,795
I really don't, but, I'm 6'5 and
yeah, it's funny after COVID,
784
00:37:58,895 --> 00:38:01,255
I'd meet people all the time, and
they're like, I had no idea you
785
00:38:01,255 --> 00:38:03,105
were gigantic, because I spent two
786
00:38:03,105 --> 00:38:04,165
years on Zoom.
787
00:38:04,485 --> 00:38:07,835
But to your point before, which again,
this is something people are talking about a
788
00:38:07,835 --> 00:38:12,720
lot, like, it's very interesting
that you could be conversing with
789
00:38:12,730 --> 00:38:15,960
somebody over video and then when
you meet them, not realize that
790
00:38:15,960 --> 00:38:17,740
they looked 25 percent better than this,
791
00:38:17,780 --> 00:38:23,320
All that stuff that like, yeah, not
only is it like AI images, but it's
792
00:38:23,320 --> 00:38:25,400
just like everyday conversations.
793
00:38:25,690 --> 00:38:26,110
I don't know.
794
00:38:26,160 --> 00:38:30,620
It feels like again, there needs to
be like certain like societal norms
795
00:38:30,620 --> 00:38:34,310
around, don't change your image
if you're just FaceTiming with
796
00:38:34,340 --> 00:38:35,260
your girlfriend, whatever.
797
00:38:35,519 --> 00:38:35,769
Yeah.
798
00:38:35,769 --> 00:38:40,569
I think it's going to be, IRL meetups
are going to be much stranger because
799
00:38:40,569 --> 00:38:42,959
you're going to have different looks
at different people and everything.
800
00:38:42,959 --> 00:38:45,099
And in some form, it's
going to just get weirder.
801
00:38:45,199 --> 00:38:47,069
All right, we should talk a little
bit about some of the weird
802
00:38:47,069 --> 00:38:48,229
stuff we did with AI this week.
803
00:38:48,229 --> 00:38:50,569
This is where we talk about
the stuff we did get hands-on with.
804
00:38:50,829 --> 00:38:52,959
As I mentioned, I did go and
play around with the phone.
805
00:38:52,959 --> 00:38:54,249
Flux Realism LoRA, please.
806
00:38:54,269 --> 00:38:55,559
You can check out our
YouTube video on that.
807
00:38:55,569 --> 00:38:57,449
But the weirder thing I did
808
00:38:58,034 --> 00:38:59,084
is ElevenLabs.
809
00:38:59,084 --> 00:39:00,704
I was on ElevenLabs because of this.
810
00:39:00,704 --> 00:39:02,764
I was trying to get a voice
to use for this video.
811
00:39:03,224 --> 00:39:08,024
And I saw that on ElevenLabs, there are
now ASMR voices, Ben, which seems to
812
00:39:08,044 --> 00:39:13,554
me to be a very bad idea, but maybe not
because the ASMR people are out there.
813
00:39:13,554 --> 00:39:16,994
So if you listen to this, just,
I'm going to play this link here.
814
00:39:17,044 --> 00:39:20,094
I just did something very quick and I
just want to try and see what happened.
815
00:39:20,094 --> 00:39:21,094
And it's pretty shocking.
816
00:39:21,244 --> 00:39:25,294
This will not go well, not for
you, not for us, nor for any of us.
817
00:39:25,714 --> 00:39:26,144
Click.
818
00:39:26,394 --> 00:39:26,834
Click.
819
00:39:27,154 --> 00:39:27,904
Bruh.
820
00:39:29,374 --> 00:39:29,524
Woah.
821
00:39:30,915 --> 00:39:32,095
It's ASMR.
822
00:39:32,145 --> 00:39:35,555
It's basically what those
people have done for years.
823
00:39:35,555 --> 00:39:39,815
And this woman maybe smartly
created a very good model for it.
824
00:39:39,845 --> 00:39:45,545
And this feels like a place then
where AI could replace a lot of
825
00:39:45,625 --> 00:39:47,205
people doing this thing, I assume.
826
00:39:47,243 --> 00:39:47,603
Yes.
827
00:39:47,763 --> 00:39:48,633
Sorry that was looping.
828
00:39:48,633 --> 00:39:50,913
I couldn't figure out how to close
the video so it was looping in
829
00:39:50,913 --> 00:39:51,723
my head while you were talking.
830
00:39:51,803 --> 00:39:54,443
Yeah, and it's also bizarre that you're
the one that said it, so I'm like,
831
00:39:54,443 --> 00:39:56,393
is this Gavin whispering in my ear?
832
00:39:56,393 --> 00:39:57,683
Just in a different voice?
833
00:39:57,683 --> 00:39:58,133
You know what I mean?
834
00:39:58,133 --> 00:39:59,438
Like with the AirPod
835
00:39:59,678 --> 00:39:59,758
So
836
00:39:59,848 --> 00:40:00,368
Ben!
837
00:40:00,743 --> 00:40:03,293
I don't know if exactly,
if you're like, it's Gavin.
838
00:40:03,573 --> 00:40:04,173
Yeah.
839
00:40:04,233 --> 00:40:07,848
Trying to think, as I try, there's
always a good use of this stuff, but yeah.
840
00:40:07,858 --> 00:40:12,528
ASMR, I mean, I never really went
down the rabbit hole of the ASMR videos.
841
00:40:12,943 --> 00:40:16,623
But if you're into doing them, it
does seem this could be a way to have
842
00:40:16,623 --> 00:40:18,803
like ASMR about whatever you like.
843
00:40:18,823 --> 00:40:20,153
I like fantasy baseball.
844
00:40:20,523 --> 00:40:20,643
I
845
00:40:20,773 --> 00:40:22,813
Oh my god, that's a great idea!
846
00:40:22,823 --> 00:40:25,933
The ASMR fantasy baseball, if you
could leave your fantasy lineups,
847
00:40:26,213 --> 00:40:29,673
The New York Yankees Aaron
Judge today had an amazing game.
848
00:40:30,293 --> 00:40:35,343
Two home runs, unfortunately, your
pitches weren't as great, but it's okay,
849
00:40:35,343 --> 00:40:36,743
I'm sure you'll do better tomorrow.
850
00:40:37,161 --> 00:40:40,661
It seems to me like hyper-personalization
851
00:40:40,971 --> 00:40:42,561
is one aspect of AI
852
00:40:42,921 --> 00:40:45,711
that is unpredictable and I'm
sure will play out in fascinating ways.
853
00:40:45,958 --> 00:40:46,188
All right.
854
00:40:46,188 --> 00:40:50,168
So that was what I did with
weird ASMR AI this week.
855
00:40:50,178 --> 00:40:51,798
Ben, what have you been
playing with, with AI?
856
00:40:52,308 --> 00:40:57,668
This morning I was playing with Reid AI
and having it generate a few questions for
857
00:40:57,668 --> 00:41:00,248
you, co host of the AI for Humans podcast.
858
00:41:00,518 --> 00:41:03,618
I can have you play a couple of
those and then maybe I can explain
859
00:41:03,938 --> 00:41:05,628
where I think, this has potential.
860
00:41:05,818 --> 00:41:09,678
Looking back at your time on The
Tonight Show, which segment or episode
861
00:41:09,678 --> 00:41:13,448
stands out as a turning point for
you in terms of realizing the power
862
00:41:13,448 --> 00:41:14,988
of the content you were creating?
863
00:41:15,038 --> 00:41:18,518
Do you see any parallels with the kind
of content AI is making possible to
864
00:41:18,518 --> 00:41:20,778
create now, or could in the future?
865
00:41:20,840 --> 00:41:23,170
Okay, so this is a good
question about the Tonight Show.
866
00:41:23,210 --> 00:41:27,350
I think in the Tonight Show times, and
really in the late night times, I think
867
00:41:27,350 --> 00:41:30,650
the thing that really made me feel
like we were onto something different
868
00:41:30,650 --> 00:41:34,910
was the late night hashtag segment,
which was a way to use, at the time,
869
00:41:34,910 --> 00:41:39,545
Twitter, to help start and generate
and continue making a conversation with
870
00:41:39,555 --> 00:41:41,245
the audience into a television segment.
871
00:41:41,355 --> 00:41:42,175
One more question.
872
00:41:42,525 --> 00:41:44,605
I took a look at your
impressive LinkedIn page.
873
00:41:44,925 --> 00:41:46,305
Great banner art, by the way.
874
00:41:46,745 --> 00:41:49,645
I saw you were an executive
producer at NBCUniversal.
875
00:41:50,105 --> 00:41:52,505
What's the craziest idea you
pushed that got greenlit?
876
00:41:52,875 --> 00:41:56,755
How did that experience shape your view
on AI's impact on traditional media?
877
00:41:57,124 --> 00:41:57,574
Oh, okay.
878
00:41:57,574 --> 00:41:58,344
This is interesting.
879
00:41:58,344 --> 00:42:01,354
So clearly Reid AI has seen my LinkedIn.
880
00:42:01,404 --> 00:42:03,834
And its amazing banner
that I've got there.
881
00:42:03,834 --> 00:42:04,994
So I appreciate that.
882
00:42:05,424 --> 00:42:09,084
This is a tricky question because
one of the funny things about this, and
883
00:42:09,084 --> 00:42:12,544
this might be an interesting thing for
us to talk about, is it says EP at
884
00:42:12,554 --> 00:42:16,394
NBCUniversal, and because I had to put it
on LinkedIn in some form or another, it
885
00:42:16,394 --> 00:42:20,264
encompasses so many different things I
worked on, but it is, it shows in some
886
00:42:20,264 --> 00:42:24,284
ways the slight limitations of what it's
like to have to read a piece of material
887
00:42:24,284 --> 00:42:25,764
and then generate questions from it.
888
00:42:25,764 --> 00:42:26,284
Do you know what I mean?
889
00:42:26,284 --> 00:42:26,594
Because,
890
00:42:26,699 --> 00:42:27,099
Totally.
891
00:42:27,149 --> 00:42:27,749
Oh, for sure.
892
00:42:27,749 --> 00:42:28,019
yeah.
893
00:42:28,024 --> 00:42:28,354
yeah.
894
00:42:28,354 --> 00:42:30,144
And I think that's something
maybe to dive in on this.
895
00:42:30,144 --> 00:42:32,424
So tell us a little bit about how.
896
00:42:32,834 --> 00:42:35,114
This thing is made for Reed.
897
00:42:35,184 --> 00:42:37,214
And if you're, if, again, if you're
not familiar, we're talking about
898
00:42:37,214 --> 00:42:42,424
Reid Hoffman, the co-founder of
LinkedIn, who now is very active in a bunch
899
00:42:42,424 --> 00:42:45,374
of investments, and who Ben works for, as
we talked about at the top of the show.
900
00:42:46,979 --> 00:42:50,689
Just an experiment at this stage,
but we were already playing around
901
00:42:50,689 --> 00:42:55,949
with doing a video version of Reid,
partnered with a company called Hour One.
902
00:42:56,319 --> 00:42:58,139
And we had done his voice with ElevenLabs.
903
00:42:58,479 --> 00:43:01,769
And then right around that time, as we
were tinkering with this, OpenAI
904
00:43:01,769 --> 00:43:03,189
came out with custom GPTs.
905
00:43:04,119 --> 00:43:08,969
And so then like in a matter of 48
hours, we were able to start putting a
906
00:43:08,969 --> 00:43:12,339
lot of Reid's books, speeches, podcasts
907
00:43:12,859 --> 00:43:18,329
into GPT to be able to have Reid
give answers in a very Reid-like
908
00:43:18,329 --> 00:43:22,999
manner that GPT otherwise wouldn't, both
in terms of the content itself.
909
00:43:23,019 --> 00:43:25,669
So in some of those questions,
he's like referencing Blitzscaling
910
00:43:25,669 --> 00:43:28,889
and some of his own thinking, but
also in the way that he speaks.
911
00:43:28,889 --> 00:43:29,569
He might have phrases
912
00:43:29,589 --> 00:43:30,249
he says a lot.
913
00:43:30,379 --> 00:43:32,879
I, I noticed when I listened
to myself recently, I'm like, I
914
00:43:32,879 --> 00:43:34,759
say "makes sense" all the time.
915
00:43:34,769 --> 00:43:35,179
Why do
916
00:43:35,184 --> 00:43:36,004
say uh, uh,
917
00:43:36,494 --> 00:43:38,254
ubiquitous, ubiquitous.
918
00:43:38,284 --> 00:43:42,234
And uh, YouTubers will love to comment on.
919
00:43:42,544 --> 00:43:45,594
Yeah, I had a friend of mine, a comedian
was like, the first time you use a
920
00:43:45,594 --> 00:43:47,094
word like ubiquitous, he sounds smart.
921
00:43:47,104 --> 00:43:50,464
The second time it's like, oh, he
really just thinks that's a great word.
922
00:43:50,774 --> 00:43:52,384
And I, yeah but similar, right?
923
00:43:52,384 --> 00:43:56,444
If Reid tends to use a word
like brilliant instead of smart,
924
00:43:56,554 --> 00:43:57,514
it should pick up on that.
925
00:43:57,844 --> 00:44:02,074
And so I think that's what made this
unique is that, as we're building all
926
00:44:02,074 --> 00:44:06,514
of these different use cases, we've
used Reid AI to review business plans
927
00:44:06,514 --> 00:44:08,464
at Stanford and give video feedback.
928
00:44:08,474 --> 00:44:11,524
He's done interviews with Bloomberg
and the Wall Street Journal, and he's
929
00:44:11,524 --> 00:44:15,234
done speeches where we translate
it to 10 different languages.
930
00:44:15,524 --> 00:44:18,374
All of them were trying to figure
out like, how can this be additive?
931
00:44:18,374 --> 00:44:22,774
How can it do something maybe interesting
that you couldn't do with video before?
932
00:44:22,944 --> 00:44:24,724
And so it's combining, yeah.
933
00:44:24,764 --> 00:44:26,854
The audio, the video.
934
00:44:27,409 --> 00:44:31,389
And then Parth, I mentioned before,
created this LLM, every, you know,
935
00:44:31,419 --> 00:44:34,749
couple of weeks, it's getting better
and smarter and more knowledgeable
936
00:44:34,749 --> 00:44:36,219
about what Reid said in the past.
937
00:44:36,679 --> 00:44:38,359
So yeah, that's the concept behind it.
938
00:44:38,359 --> 00:44:42,699
And then with LinkedIn specifically,
it can review your LinkedIn page.
939
00:44:43,114 --> 00:44:46,524
And to your point, sometimes it might
pick up on something like artwork, but
940
00:44:46,534 --> 00:44:51,124
other times, it could be limited in how
it thinks it should take a single bullet
941
00:44:51,124 --> 00:44:52,614
point and turn that into a question.
942
00:44:53,049 --> 00:44:55,929
Does this become, I think it's
great that Reid's doing this
943
00:44:55,949 --> 00:44:57,309
because I think it opens the door.
944
00:44:57,349 --> 00:44:59,379
First of all, it shows
people what's possible.
945
00:44:59,379 --> 00:45:02,089
And in what we were talking
about before, even with deepfakes a
946
00:45:02,089 --> 00:45:04,699
little bit, it shows people like, oh,
this is something you can both create
947
00:45:04,699 --> 00:45:07,849
for good, but also people could have a
version of this that is something that
948
00:45:07,859 --> 00:45:09,209
you have to be aware of what it is.
949
00:45:09,669 --> 00:45:12,029
Is it going to be productized in some way?
950
00:45:12,029 --> 00:45:12,899
What is the use case?
951
00:45:12,909 --> 00:45:14,909
Like, why would people pay
for something like this?
952
00:45:14,909 --> 00:45:15,199
Yeah.
953
00:45:15,259 --> 00:45:16,849
I think people will productize it.
954
00:45:16,849 --> 00:45:18,619
I don't think that's what
we're looking to do with
955
00:45:18,669 --> 00:45:21,129
this, but initially we created it.
956
00:45:21,129 --> 00:45:24,039
There was this idea of wouldn't
it be great to have feedback
957
00:45:24,039 --> 00:45:26,719
from someone like Reid as a
mentor in your pocket, pull it up.
958
00:45:26,919 --> 00:45:28,899
You're thinking about doing
something with AI for Humans.
959
00:45:28,899 --> 00:45:28,949
Yeah.
960
00:45:29,239 --> 00:45:32,839
And you can get his perspective and then
you can get Steve Jobs' perspective and you
961
00:45:32,839 --> 00:45:37,089
get somebody else and you sort of have,
different thinking from different types
962
00:45:37,089 --> 00:45:39,329
of, mentors, thought leaders, et cetera.
963
00:45:39,729 --> 00:45:42,229
Or it could be advice from, you
know, your dad or your best friend or
964
00:45:42,229 --> 00:45:42,789
whoever you trust.
965
00:45:42,984 --> 00:45:46,804
Right, you mentioned this is about
exploring what's possible, that is
966
00:45:46,804 --> 00:45:48,714
very much what Reid is looking to do.
967
00:45:48,744 --> 00:45:54,044
Show and tell, he wrote a book
with GPT called Fireside Chatbots,
968
00:45:54,354 --> 00:45:56,860
he did this project, he's
always sort of experimenting, to
969
00:45:56,860 --> 00:45:58,410
figure out where this is going.
970
00:45:58,600 --> 00:46:02,750
You need to actually use the tools
and figure out, and he's on all
971
00:46:02,750 --> 00:46:03,830
of these things all the time.
972
00:46:03,830 --> 00:46:07,470
And I try to do the same thing
largely because that's how you
973
00:46:07,470 --> 00:46:09,080
can see where things are headed.
974
00:46:09,490 --> 00:46:12,670
But in terms of like how
this will be productized.
975
00:46:12,940 --> 00:46:14,100
Yeah, I think.
976
00:46:14,620 --> 00:46:18,430
Informational content and educational
content makes a lot of sense for
977
00:46:18,430 --> 00:46:19,580
these types of digital twins.
978
00:46:19,890 --> 00:46:23,440
I don't think it makes a lot
of sense for a creator who's
979
00:46:23,490 --> 00:46:24,700
opening themselves up to their
980
00:46:24,700 --> 00:46:25,140
community
981
00:46:25,140 --> 00:46:26,090
and being themselves
982
00:46:26,105 --> 00:46:27,575
You want to have a connection with a real
983
00:46:27,640 --> 00:46:28,000
there.
984
00:46:28,000 --> 00:46:29,340
I think you want the real connection.
985
00:46:29,640 --> 00:46:32,490
And then there's other cases like
my, I have twins that are 16 and
986
00:46:32,490 --> 00:46:36,860
I'm like, surely we don't need to
be paying for an SAT tutor.
987
00:46:36,860 --> 00:46:36,990
There
988
00:46:36,990 --> 00:46:41,330
could be somebody who's watching them,
talking to them, explaining how to do
989
00:46:41,330 --> 00:46:43,550
this stuff in real time, interactive.
990
00:46:43,855 --> 00:46:45,655
I'd rather pay for that.
991
00:46:45,665 --> 00:46:48,865
Not that I want to put SAT
tutors out of work, but then they can
992
00:46:48,875 --> 00:46:52,035
do it at all hours and they can, it's
customized for them and all that.
993
00:46:52,315 --> 00:46:54,575
So I, yeah, it's a bit of both.
994
00:46:54,615 --> 00:46:58,415
And, it's funny, another one, cause
we talked about the grandma lean
995
00:46:58,735 --> 00:46:59,515
video I did.
996
00:46:59,625 --> 00:47:05,495
And on the one hand, I think it
was funny to see her break down
997
00:47:05,585 --> 00:47:07,385
Paul George coming to the 76ers.
998
00:47:07,385 --> 00:47:08,405
It's just funny to watch.
999
00:47:08,455 --> 00:47:11,945
But then on the other hand, to
make this video, I interviewed
1000
00:47:11,945 --> 00:47:13,135
her for two and a half hours.
1001
00:47:13,675 --> 00:47:17,495
And I had like, a lump in my throat
the whole two and a half hours.
1002
00:47:17,525 --> 00:47:20,455
I was like, I can't believe I
haven't had this conversation
1003
00:47:20,455 --> 00:47:21,655
before, and she's a hundred.
1004
00:47:21,985 --> 00:47:23,125
This woman is remarkable.
1005
00:47:23,125 --> 00:47:24,370
Like, I didn't have that.
1006
00:47:24,590 --> 00:47:29,430
And I'm much happier that I
have that interview than I am,
1007
00:47:29,850 --> 00:47:32,060
you know, excited that we can make her
1008
00:47:32,070 --> 00:47:33,980
rethink how she would have said
1009
00:47:33,980 --> 00:47:34,460
something.
1010
00:47:34,820 --> 00:47:40,283
And so on the one hand, it's fascinating
that, yeah, people probably will
1011
00:47:40,293 --> 00:47:43,243
preserve versions of their loved
ones that they can interact with.
1012
00:47:43,643 --> 00:47:48,193
But then on the other hand, the thing
we could do in 1986, I think people
1013
00:47:48,193 --> 00:47:51,763
should do much more of, which is just
sit down with the people you love.
1014
00:47:52,143 --> 00:47:57,023
Interview them, hear their life
stories and have that video preserved.
1015
00:47:57,213 --> 00:47:59,933
And there probably will be
things with AI that you can then,
1016
00:47:59,943 --> 00:48:01,283
ask it questions or whatever.
1017
00:48:01,668 --> 00:48:05,398
But like, actually what made
that project cool is not
1018
00:48:05,808 --> 00:48:06,048
Not
1019
00:48:06,048 --> 00:48:06,168
the
1020
00:48:06,348 --> 00:48:07,048
I can now,
1021
00:48:07,068 --> 00:48:07,728
the connection you
1022
00:48:07,868 --> 00:48:08,708
the AI.
1023
00:48:08,808 --> 00:48:09,318
Yeah.
1024
00:48:09,588 --> 00:48:10,178
Yeah.
1025
00:48:10,178 --> 00:48:11,498
I mean, not to be hokey about it.
1026
00:48:11,498 --> 00:48:13,028
I know this is AI for Humans,
1027
00:48:13,338 --> 00:48:13,758
but,
1028
00:48:13,988 --> 00:48:15,098
No, it's the human part.
1029
00:48:15,098 --> 00:48:16,148
That It's a human part.
1030
00:48:16,148 --> 00:48:16,328
That's
1031
00:48:16,368 --> 00:48:17,098
oh, that's true.
1032
00:48:17,098 --> 00:48:17,408
Yeah.
1033
00:48:17,408 --> 00:48:17,708
Yeah.
1034
00:48:17,738 --> 00:48:20,368
And it was funny actually,
early in the interview, she's like,
1035
00:48:20,723 --> 00:48:22,123
And have you heard about this AI?
1036
00:48:22,593 --> 00:48:24,493
Oh my gosh, the things they're doing.
1037
00:48:24,543 --> 00:48:25,803
That's what people should
leave with this year.
1038
00:48:25,803 --> 00:48:27,853
Just go interview the old
people in your life so you
1039
00:48:27,853 --> 00:48:28,193
get them
1040
00:48:28,223 --> 00:48:28,883
on camera.
1041
00:48:28,938 --> 00:48:30,278
I had a wild experience,
1042
00:48:30,278 --> 00:48:30,808
actually.
1043
00:48:31,048 --> 00:48:33,118
My grandfather died in 1998.
1044
00:48:33,648 --> 00:48:36,138
He lost all his brothers and
sisters in the Holocaust.
1045
00:48:36,428 --> 00:48:37,658
Never talked to him about it
1046
00:48:37,988 --> 00:48:38,468
really.
1047
00:48:38,488 --> 00:48:39,488
Cause I was young when he
1048
00:48:39,488 --> 00:48:39,918
died.
1049
00:48:40,328 --> 00:48:44,348
And then um, in like 2005, I
was basically Googling myself
1050
00:48:44,858 --> 00:48:50,138
and I stumbled into basically 10 hours of
interviews with him about the Holocaust
1051
00:48:50,393 --> 00:48:50,753
No way.
1052
00:48:50,753 --> 00:48:51,573
Wow.
1053
00:48:51,898 --> 00:48:52,218
It was
1054
00:48:52,833 --> 00:48:53,653
kind of wild.
1055
00:48:53,663 --> 00:48:55,383
Nobody in my family knew it existed.
1056
00:48:55,583 --> 00:48:59,273
We had seen like three quotes from
this interview in a book, but then
1057
00:48:59,273 --> 00:49:01,623
there was like 10 hours of audio tapes
1058
00:49:01,783 --> 00:49:04,503
of him recounting the whole experience.
1059
00:49:04,873 --> 00:49:07,093
And I did use ElevenLabs
to recreate his voice.
1060
00:49:07,103 --> 00:49:08,073
I actually want to do this.
1061
00:49:08,123 --> 00:49:10,103
Cause I think this could be
an interesting thing where.
1062
00:49:10,483 --> 00:49:11,293
It's 10 hours.
1063
00:49:11,333 --> 00:49:12,683
Not a lot of people want to sit through 10
1064
00:49:12,693 --> 00:49:16,573
hours of him recounting the Holocaust,
but you would be able to say, retell
1065
00:49:16,573 --> 00:49:21,073
the story in five minutes at a seventh
grade level. There's ways to maybe take
1066
00:49:21,073 --> 00:49:26,258
this like incredible artifact of him
recounting his story of the Holocaust.
1067
00:49:26,638 --> 00:49:29,908
And, make it like more, I don't
know, accessible for audiences.
1068
00:49:30,218 --> 00:49:32,178
But yeah, that stuff is priceless.
1069
00:49:32,208 --> 00:49:34,688
And I do, it's funny because part
of me feels like a lot of vloggers
1070
00:49:34,778 --> 00:49:37,208
are probably now in regular jobs
1071
00:49:37,318 --> 00:49:39,568
and it was a bizarre thing to go through.
1072
00:49:39,568 --> 00:49:43,308
But if nothing else, they, a
lot of them do have this like
1073
00:49:43,488 --> 00:49:45,938
incredible time capsule of
what they were going through.
1074
00:49:45,938 --> 00:49:45,958
Yeah.
1075
00:49:46,391 --> 00:49:47,941
Ben, thank you so much for being here.
1076
00:49:47,941 --> 00:49:50,011
Where can people find you online?
1077
00:49:50,021 --> 00:49:51,171
And what should they check out?
1078
00:49:51,281 --> 00:49:52,311
People can find me on
1079
00:49:53,771 --> 00:49:54,281
LinkedIn.
1080
00:49:54,311 --> 00:49:56,311
This is the first time
I've co hosted a podcast.
1081
00:49:56,401 --> 00:49:56,821
You're doing
1082
00:49:56,821 --> 00:49:57,061
great.
1083
00:49:57,061 --> 00:49:57,341
You did a
1084
00:49:57,451 --> 00:49:58,321
video a little more.
1085
00:49:58,531 --> 00:49:59,391
I appreciate it.
1086
00:49:59,391 --> 00:50:02,471
I want to start doing video more, but,
yeah, really, it's LinkedIn posts.
1087
00:50:02,581 --> 00:50:03,571
So let's go LinkedIn, Ben Relles.
1088
00:50:03,571 --> 00:50:06,331
I have a website, realcreative.ai,
1089
00:50:06,961 --> 00:50:07,611
for
1090
00:50:07,611 --> 00:50:08,631
something to check out.
1091
00:50:08,821 --> 00:50:12,061
We try to curate these different
examples of cool AI creative projects.
1092
00:50:12,361 --> 00:50:14,921
And then if you go to the top,
you can sort it by project type.
1093
00:50:14,961 --> 00:50:16,911
And we have over 300 creators curated.
1094
00:50:16,981 --> 00:50:21,051
I'm just like fascinated by this
community of AI creators that are like.
1095
00:50:21,416 --> 00:50:23,656
They're not really as focused on
like views and money right now.
1096
00:50:23,656 --> 00:50:24,676
They're focused on
1097
00:50:24,786 --> 00:50:25,256
making cool
1098
00:50:25,316 --> 00:50:26,256
experimentation.
1099
00:50:26,456 --> 00:50:28,996
I'm sure later they'll want the
views and the money more too.
1100
00:50:29,396 --> 00:50:30,756
I would say, okay, this is an odd one.
1101
00:50:30,756 --> 00:50:34,716
I would say in a little bit of a non
sequitur, I would check out some of
1102
00:50:34,726 --> 00:50:38,236
the tributes and some of the things
being written about YouTube CEO
1103
00:50:40,196 --> 00:50:40,786
Oh man.
1104
00:50:40,946 --> 00:50:41,216
I can't.
1105
00:50:41,216 --> 00:50:41,816
I get choked up.
1106
00:50:41,856 --> 00:50:42,156
All right,
1107
00:50:42,156 --> 00:50:42,556
okay.
1108
00:50:42,706 --> 00:50:44,456
Non sequitur or something to check out.
1109
00:50:44,586 --> 00:50:49,911
I would say read some of the tributes
to YouTube's um, CEO, former CEO uh,
1110
00:50:49,971 --> 00:50:52,181
Susan, who passed away last week.
1111
00:50:52,341 --> 00:50:55,461
She was the CEO for
most of my time there.
1112
00:50:55,721 --> 00:50:56,771
Remarkable woman.
1113
00:50:56,771 --> 00:50:57,911
I admire her so much.
1114
00:50:57,911 --> 00:51:02,621
So it's not an AI thing, but it's been
fascinating being on LinkedIn because
1115
00:51:02,671 --> 00:51:05,471
I just like, I'm reading all these
tributes from people that I was really
1116
00:51:05,471 --> 00:51:09,494
close with when I worked there and I
don't know, it was like obviously
1117
00:51:09,504 --> 00:51:14,384
heartbreaking, but also inspiring to read
all these experiences that people had.
1118
00:51:14,414 --> 00:51:17,784
So for people in the tech industry
that want to read about, you know,
1119
00:51:17,784 --> 00:51:23,134
an icon uh, yeah, look up some of
those tributes to Susan and what
1120
00:51:23,184 --> 00:51:24,244
she was able to do with her life.
1121
00:51:24,312 --> 00:51:26,692
She sounds like an incredible
person and how lucky you were
1122
00:51:26,692 --> 00:51:27,582
to be able to work with her.
1123
00:51:27,582 --> 00:51:30,512
And what she did for YouTube
is pretty remarkable overall,
1124
00:51:30,562 --> 00:51:33,852
And also not just how big YouTube
got, but I have to say it was a big
1125
00:51:33,852 --> 00:51:38,252
reason I stayed there for so long, the
culture of that place. People really
1126
00:51:38,252 --> 00:51:42,832
admired her, and so, like, yeah, in
addition to, of course the insane
1127
00:51:42,832 --> 00:51:44,772
growth it had over that time period.
1128
00:51:44,927 --> 00:51:49,667
I just think she like really was
passionate about the people at YouTube,
1129
00:51:49,717 --> 00:51:53,167
the creators, and created a culture
that made you want to be a part of it.
1130
00:51:53,277 --> 00:51:53,407
I
1131
00:51:53,427 --> 00:51:54,137
That's amazing.
1132
00:51:54,137 --> 00:51:55,977
Ben, thank you so much
for coming on this week.
1133
00:51:55,977 --> 00:51:57,167
We really appreciate it.
1134
00:51:57,187 --> 00:51:58,267
And Kevin, we miss you.
1135
00:51:58,277 --> 00:51:59,507
We'll see you back again next time.
1136
00:51:59,507 --> 00:52:01,357
But that is AI for Humans, everybody.
1137
00:52:01,377 --> 00:52:02,417
We will see y'all next week.
1138
00:52:02,417 --> 00:52:04,027
And again, Ben, thank you for being here.
1139
00:52:04,069 --> 00:52:04,389
Yeah.
1140
00:52:04,399 --> 00:52:05,399
Thanks for having me, Gavin.