1
00:02:16,560 --> 00:02:19,560
Thank you.
2
00:02:23,400 --> 00:02:25,800
Oh my goodness!
3
00:02:25,800 --> 00:02:28,200
What are those?
4
00:02:28,200 --> 00:02:31,200
That will be the end
of my terrible Spanish
5
00:02:31,240 --> 00:02:34,360
for the evening,
or possibly we might insert
6
00:02:34,360 --> 00:02:37,360
a little in as we go.
7
00:02:37,360 --> 00:02:41,760
I have been looking at some of the posters
8
00:02:42,080 --> 00:02:45,080
for the festival up around campus,
9
00:02:45,120 --> 00:02:49,120
and also looking at you here and thinking
10
00:02:49,560 --> 00:02:52,680
I might do a little switcheroo on my talk.
11
00:02:54,120 --> 00:02:56,640
It is listed as a talk about
12
00:02:56,640 --> 00:03:00,680
using artificial intelligence
to improve collective intelligence.
13
00:03:01,080 --> 00:03:05,280
How can we all be smarter together?
And I'll talk about that,
14
00:03:06,960 --> 00:03:09,320
but I actually thought
15
00:03:09,320 --> 00:03:12,240
just the tagline
16
00:03:12,240 --> 00:03:15,240
of the conference, Artificial Intelligence
17
00:03:15,640 --> 00:03:18,320
and other intelligence
18
00:03:18,320 --> 00:03:21,240
and trying to understand
19
00:03:21,240 --> 00:03:24,240
the difference between how we think
20
00:03:24,720 --> 00:03:27,160
and how the current
21
00:03:27,160 --> 00:03:29,480
bleeding edge
22
00:03:29,480 --> 00:03:33,240
of artificial intelligence
thinks would actually be really useful.
23
00:03:33,920 --> 00:03:38,480
Having said all of that,
anyone who's here because they want a deep
24
00:03:38,480 --> 00:03:42,000
technical dive into machine
learning and artificial intelligence,
25
00:03:42,400 --> 00:03:45,600
please stay around
and come talk to me afterwards.
26
00:03:45,920 --> 00:03:50,000
I would be thrilled to answer
any questions about anything I talk about,
27
00:03:50,400 --> 00:03:53,760
or if you happen to have the misfortune
of knowing my research, or me
28
00:03:53,760 --> 00:03:55,800
for that matter. Anything else.
29
00:03:56,800 --> 00:03:59,280
But instead, I wanted
30
00:03:59,280 --> 00:04:02,280
to start
with something different for a minute.
31
00:04:02,280 --> 00:04:06,240
It won't even feel like we're going
to talk about artificial intelligence,
32
00:04:07,480 --> 00:04:10,480
because I think
the most important question
33
00:04:10,800 --> 00:04:13,800
I can ask you
34
00:04:14,400 --> 00:04:15,960
that's relevant
35
00:04:15,960 --> 00:04:20,240
to how AI might affect all of our lives
36
00:04:21,320 --> 00:04:24,320
is cut out.
37
00:04:26,120 --> 00:04:26,920
How are you doing?
38
00:04:26,920 --> 00:04:29,200
Are you happy?
39
00:04:29,200 --> 00:04:31,920
And maybe even more importantly,
40
00:04:31,920 --> 00:04:34,400
if you weren't,
41
00:04:34,400 --> 00:04:37,400
what would you do about it?
42
00:04:37,440 --> 00:04:40,080
And the reason
why this is such an important question,
43
00:04:40,080 --> 00:04:44,880
Such a fascinating question,
is because research
44
00:04:45,360 --> 00:04:48,360
about intelligence and happiness
45
00:04:48,920 --> 00:04:51,920
has this rather astounding finding.
46
00:04:53,080 --> 00:04:56,080
Intelligence doesn't predict happiness.
47
00:04:56,400 --> 00:04:59,400
We could look at all of us in this room
48
00:04:59,720 --> 00:05:02,600
here at a grand university.
49
00:05:02,600 --> 00:05:05,520
I'm sure some of us are quite clever.
50
00:05:05,520 --> 00:05:08,000
If we looked across
all of the different people
51
00:05:09,040 --> 00:05:09,360
and we
52
00:05:09,360 --> 00:05:12,880
went and measured all the different ways
53
00:05:13,240 --> 00:05:17,040
people measure intelligence
in the academic world.
54
00:05:18,240 --> 00:05:20,880
IQ, we don't use so much anymore.
55
00:05:20,880 --> 00:05:23,880
But there's a modern version of it
called G
56
00:05:24,240 --> 00:05:26,200
or working memory span.
57
00:05:26,200 --> 00:05:29,760
How many things can you keep in mind
at any given moment?
58
00:05:29,920 --> 00:05:32,720
How many different things
can you attend to?
59
00:05:32,720 --> 00:05:35,280
So all these different ways to measure
60
00:05:35,280 --> 00:05:38,280
our raw cognitive power.
61
00:05:39,000 --> 00:05:41,280
And none of them
62
00:05:41,280 --> 00:05:44,280
correlate in any way with happiness.
63
00:05:45,000 --> 00:05:47,920
What a strange finding.
64
00:05:47,920 --> 00:05:50,240
There are many things that do.
65
00:05:50,240 --> 00:05:53,240
People
that are more resilient in their lives
66
00:05:53,360 --> 00:05:55,560
tend to be happier.
67
00:05:55,560 --> 00:05:59,320
People that have better communication
skills tend to be happier.
68
00:06:00,400 --> 00:06:02,960
People that have a strong sense of purpose
in life,
69
00:06:02,960 --> 00:06:07,920
meaningfulness, are substantially happier
than those that don't.
70
00:06:09,560 --> 00:06:11,080
Why not intelligence?
71
00:06:11,080 --> 00:06:14,080
Every one of those things
I just mentioned: purpose,
72
00:06:14,120 --> 00:06:17,120
resilience, communication skills.
73
00:06:17,280 --> 00:06:20,280
They're predictive of a wide range of life
outcomes.
74
00:06:20,520 --> 00:06:24,840
How much money you earn in your lifetime,
how far you'll go in your education.
75
00:06:26,320 --> 00:06:29,320
But unlike intelligence,
76
00:06:29,520 --> 00:06:32,520
which is a strong predictor of both of those,
77
00:06:32,880 --> 00:06:35,360
they also predict happiness.
78
00:06:35,360 --> 00:06:37,840
What's going on there?
79
00:06:37,840 --> 00:06:41,440
Well, maybe I'll never be happy
because it's only intelligence.
80
00:06:43,200 --> 00:06:46,880
Or maybe we're measuring intelligence
81
00:06:47,240 --> 00:06:49,080
wrong.
82
00:06:49,080 --> 00:06:53,160
Maybe our own understanding of our own
intelligence
83
00:06:53,840 --> 00:06:56,840
has been profoundly misinformed,
84
00:06:57,280 --> 00:07:01,160
and it's affecting our understanding
of this new form of intelligence
85
00:07:01,920 --> 00:07:04,240
that mad scientists like
86
00:07:04,240 --> 00:07:07,240
me are bringing into the world.
87
00:07:07,240 --> 00:07:10,400
So let's
think about how we measure intelligence,
88
00:07:11,000 --> 00:07:14,000
how people are publishing papers
doing this
89
00:07:14,240 --> 00:07:17,120
with new artificial intelligence
like large language
90
00:07:17,120 --> 00:07:21,360
models: GPT, ChatGPT, Claude, Gemini,
91
00:07:21,360 --> 00:07:24,360
all these others,
but also with our own children.
92
00:07:25,800 --> 00:07:27,720
What do we do?
93
00:07:27,720 --> 00:07:29,880
We give them a test.
94
00:07:29,880 --> 00:07:32,880
And a test has right and wrong answers.
95
00:07:34,040 --> 00:07:37,040
It could be a fairly simple test.
96
00:07:37,080 --> 00:07:41,160
It could be a test specifically designed
to be invariant to language
97
00:07:41,160 --> 00:07:44,880
skills or cultural bias,
like Raven's Progressive Matrices.
98
00:07:45,680 --> 00:07:47,960
But still, these are tests
99
00:07:47,960 --> 00:07:50,960
that have explicitly right
and wrong answers.
100
00:07:51,360 --> 00:07:53,960
And the more right answers
you could give, the smarter you are.
101
00:07:56,760 --> 00:07:57,880
Interestingly, we might
102
00:07:57,880 --> 00:08:01,640
call this kind of a test,
this kind of a problem,
103
00:08:02,160 --> 00:08:05,160
a well-posed problem.
104
00:08:05,480 --> 00:08:08,920
A well-posed question is a question
105
00:08:08,920 --> 00:08:11,920
that has an explicitly right answer.
106
00:08:12,600 --> 00:08:16,560
And so if we're measuring people
and how well they can answer
107
00:08:17,080 --> 00:08:21,360
questions that have right answers,
then we're measuring how
108
00:08:21,360 --> 00:08:24,880
well they can answer well posed questions,
well posed
109
00:08:25,080 --> 00:08:28,080
problems.
110
00:08:28,240 --> 00:08:33,120
In 25 years of my career,
111
00:08:34,320 --> 00:08:37,320
I've never worked on a well-posed problem.
112
00:08:38,200 --> 00:08:40,080
Overwhelmingly,
113
00:08:40,080 --> 00:08:45,080
the problems that are of interest
to society, to America,
114
00:08:45,080 --> 00:08:50,680
to Mexico, to us in our individual
lives are ill posed problems.
115
00:08:50,920 --> 00:08:54,000
Problems that don't have defined
116
00:08:54,000 --> 00:08:57,000
right answers.
117
00:08:57,600 --> 00:08:59,800
If you weren't happy,
118
00:08:59,800 --> 00:09:02,680
what would you do to change that?
119
00:09:02,680 --> 00:09:05,280
That doesn't have a right answer.
120
00:09:05,280 --> 00:09:07,080
It's a fact.
121
00:09:07,080 --> 00:09:09,480
It's been well-studied,
and it would be different
122
00:09:09,480 --> 00:09:11,880
for almost everyone in this room.
123
00:09:11,880 --> 00:09:13,840
You would need to know yourself.
124
00:09:13,840 --> 00:09:15,840
You'd need to know your context.
125
00:09:15,840 --> 00:09:20,640
You'd have to change everything
you plan to do in response
126
00:09:21,400 --> 00:09:24,520
to a very complicated world.
127
00:09:26,920 --> 00:09:29,920
Why does all of this matter?
128
00:09:30,280 --> 00:09:32,520
Because of artificial intelligence
129
00:09:32,520 --> 00:09:35,440
and the way that we have built it to date.
130
00:09:35,440 --> 00:09:38,440
It's fundamentally designed to solve
131
00:09:38,760 --> 00:09:41,760
well-posed questions
132
00:09:41,760 --> 00:09:44,080
by its very nature.
133
00:09:44,080 --> 00:09:47,280
Let's do a little walkthrough.
134
00:09:47,280 --> 00:09:51,160
I think you've gone through
two weeks of festival, but just in case
135
00:09:51,160 --> 00:09:55,720
anyone wants a quick tutorial
on artificial intelligence,
136
00:09:57,120 --> 00:09:59,640
as we talked about earlier today,
137
00:09:59,640 --> 00:10:02,640
it's any autonomous system
138
00:10:02,920 --> 00:10:05,920
that can answer questions
under uncertainty.
139
00:10:07,520 --> 00:10:09,040
So uncertainty doesn't necessarily
140
00:10:09,040 --> 00:10:12,040
mean that a problem is ill posed.
141
00:10:12,080 --> 00:10:14,280
It could mean there's
142
00:10:14,280 --> 00:10:18,640
this little blob on an X-ray. Cancer,
143
00:10:19,280 --> 00:10:22,280
or just an artifact of the machine?
144
00:10:23,760 --> 00:10:26,760
It could mean,
145
00:10:26,760 --> 00:10:28,440
in this contract,
146
00:10:28,440 --> 00:10:32,920
how many flaws are there
that an opposing lawyer could exploit?
147
00:10:34,320 --> 00:10:37,800
These, on some level,
are well posed questions
148
00:10:38,280 --> 00:10:41,280
in that
they have explicit answers to them.
149
00:10:41,360 --> 00:10:44,760
You might need an advanced law degree
to read that contract
150
00:10:44,760 --> 00:10:46,040
and see where the flaws are.
151
00:10:48,240 --> 00:10:49,920
On the other hand,
152
00:10:49,920 --> 00:10:53,280
you may not be able to describe,
as most doctors can,
153
00:10:53,560 --> 00:10:56,040
what about the blob
154
00:10:56,040 --> 00:10:58,920
makes it look like cancer to you
155
00:10:58,920 --> 00:11:01,920
versus this other blob?
156
00:11:03,640 --> 00:11:09,440
But we can take lots of images
of cancerous
157
00:11:10,120 --> 00:11:14,400
X-rays and lots of images of X-rays
that don't have cancer in them.
158
00:11:14,760 --> 00:11:18,280
And we can label one as cancer
and the other as not,
159
00:11:18,480 --> 00:11:21,320
and give them to an algorithm
160
00:11:21,320 --> 00:11:24,960
and have it learn and learn
and learn to distinguish between them.
161
00:11:26,160 --> 00:11:30,480
And in that sense, still fundamentally,
this is a well posed question.
162
00:11:31,560 --> 00:11:33,640
We have all the data we need.
163
00:11:33,640 --> 00:11:35,760
We have all the right answers.
164
00:11:35,760 --> 00:11:38,720
We even know the question
that we're asking
165
00:11:38,720 --> 00:11:41,720
is this X-ray indicative of cancer?
166
00:11:43,480 --> 00:11:46,560
That's a serious skill to have.
167
00:11:48,560 --> 00:11:49,560
But now we live in a
168
00:11:49,560 --> 00:11:52,680
world where machines can do that skill,
169
00:11:53,520 --> 00:11:56,160
increasingly competitive
170
00:11:56,160 --> 00:11:59,160
with us, and beyond competitive.
171
00:11:59,280 --> 00:12:02,280
In a recently published paper, a group
172
00:12:03,320 --> 00:12:05,120
used large language models.
173
00:12:05,120 --> 00:12:06,920
It didn't matter which one.
174
00:12:06,920 --> 00:12:10,240
So these are massive AIs,
175
00:12:11,480 --> 00:12:14,480
some of the biggest
that have ever existed.
176
00:12:15,000 --> 00:12:19,160
Trillions
and trillions of individual equations
177
00:12:19,160 --> 00:12:22,840
getting computed to produce a next word.
178
00:12:23,680 --> 00:12:27,640
You give it some words
and it predicts the next word.
179
00:12:28,120 --> 00:12:31,120
It's the world's largest autocomplete.
180
00:12:31,920 --> 00:12:36,560
And while it may feel like I'm selling
GPT short, I'm not.
181
00:12:36,600 --> 00:12:40,920
What's astonishing about
it is what that simple mechanism
182
00:12:40,920 --> 00:12:44,200
can do just by producing the next word.
183
00:12:44,880 --> 00:12:47,880
It, for example, can read a contract,
184
00:12:48,840 --> 00:12:51,760
find all of the legal flaws
185
00:12:51,760 --> 00:12:54,800
comparably
to a working professional lawyer,
186
00:12:56,280 --> 00:12:58,240
and instead of taking two
187
00:12:58,240 --> 00:13:01,560
hours, it takes 30 seconds.
188
00:13:02,840 --> 00:13:03,240
If you work
189
00:13:03,240 --> 00:13:06,240
out the math on that
as published in this paper,
190
00:13:07,080 --> 00:13:10,840
then, at similar effectiveness
191
00:13:11,520 --> 00:13:14,520
for identifying problems in contracts,
192
00:13:14,920 --> 00:13:19,520
the AI is 99.7%,
193
00:13:19,560 --> 00:13:24,160
no, 99.97% less expensive
194
00:13:24,520 --> 00:13:27,520
than a human lawyer.
195
00:13:27,520 --> 00:13:30,520
That sounds scary.
196
00:13:31,000 --> 00:13:31,560
Scary
197
00:13:31,560 --> 00:13:33,880
if you're in law school.
198
00:13:33,880 --> 00:13:36,120
Scary if you're in med school.
199
00:13:36,120 --> 00:13:38,520
And by the way, that wasn't just GPT
200
00:13:38,520 --> 00:13:39,960
that could do it.
201
00:13:39,960 --> 00:13:42,360
Every one of the large language models,
202
00:13:42,360 --> 00:13:47,400
including Llama, the version that's free
and open source out in the world,
203
00:13:47,640 --> 00:13:50,640
could also beat the human lawyers.
204
00:13:52,840 --> 00:13:55,840
So is this the end of the game?
205
00:13:56,120 --> 00:13:58,440
Well, this is why I wanted to talk today
206
00:13:58,440 --> 00:14:02,200
about the differences
between human and machine intelligence.
207
00:14:04,080 --> 00:14:06,680
It's not just about
208
00:14:06,680 --> 00:14:10,080
whether you can answer
well posed questions.
209
00:14:12,120 --> 00:14:15,360
It's about the messy
uncertainty of the real world.
210
00:14:16,440 --> 00:14:17,960
It turns out that's what's
211
00:14:17,960 --> 00:14:20,960
fascinating about us.
212
00:14:22,080 --> 00:14:24,240
So let me take a little detour.
213
00:14:24,240 --> 00:14:27,480
Now that I've framed
where I want to take this,
214
00:14:29,320 --> 00:14:31,280
and say
215
00:14:31,280 --> 00:14:34,280
I love working
with artificial intelligence.
216
00:14:34,600 --> 00:14:39,280
I built my first model a very long time ago.
217
00:14:39,320 --> 00:14:42,320
At least in the world of AI, 25 years ago.
218
00:14:42,680 --> 00:14:45,320
25 years ago, I was an undergrad.
219
00:14:45,320 --> 00:14:48,000
I thought I was going to stick wires
into cat brains
220
00:14:48,000 --> 00:14:51,400
and be a wet neuroscientist
and understand how the brain works.
221
00:14:51,920 --> 00:14:56,160
And I took my first and only programming
course, and the professor in that course
222
00:14:56,400 --> 00:15:00,880
recommended me to work at this place
called the Machine Perception Lab,
223
00:15:01,680 --> 00:15:04,880
where we were building an AI
that would analyze people's
224
00:15:04,880 --> 00:15:08,720
faces in real time
and tell whether or not they were lying.
225
00:15:10,200 --> 00:15:11,400
If you were curious,
226
00:15:11,400 --> 00:15:14,400
we've been given
a bunch of money by the CIA.
227
00:15:15,360 --> 00:15:19,080
So sometimes that's
what happens in your life:
228
00:15:20,000 --> 00:15:23,000
a bunch of spies give you money to make
229
00:15:23,280 --> 00:15:26,280
superintelligent machines that can tell
if people are lying.
230
00:15:26,880 --> 00:15:30,120
But we were just doing basic
science research.
231
00:15:30,800 --> 00:15:33,040
What could we actually do?
232
00:15:33,040 --> 00:15:35,200
And it was amazing.
233
00:15:35,200 --> 00:15:38,400
I could build a machine that could tell
if someone was smiling or not.
234
00:15:39,880 --> 00:15:42,720
And I thought, what can I do with this?
235
00:15:42,720 --> 00:15:45,480
How could you make a difference
in the world with something like this?
236
00:15:45,480 --> 00:15:48,360
Even if the original money came
from an organization
237
00:15:48,360 --> 00:15:51,360
that not everybody in the world loves,
and for very good reason.
238
00:15:52,560 --> 00:15:56,320
Later in my career,
I had the chance to take what
239
00:15:56,320 --> 00:16:00,120
I learned building
those original models, and
240
00:16:00,120 --> 00:16:03,360
I built a system for Google Glass.
241
00:16:04,680 --> 00:16:06,040
It could find
242
00:16:06,040 --> 00:16:09,040
someone, read their facial expression,
243
00:16:09,320 --> 00:16:12,840
write the expression in the little heads
up display.
244
00:16:12,840 --> 00:16:15,840
Now, there's all sorts of terrible things
you could do with this.
245
00:16:16,200 --> 00:16:18,520
Many bankers have asked me,
246
00:16:18,520 --> 00:16:21,360
could we get this for all of our tellers?
247
00:16:21,360 --> 00:16:25,120
And then whenever anyone comes
in, it would just float everybody's
248
00:16:25,120 --> 00:16:28,880
credit score above their head,
so I'd know whether or not
249
00:16:28,880 --> 00:16:32,360
they would qualify for a loan before
I've ever even talked
250
00:16:32,360 --> 00:16:35,360
to them.
251
00:16:35,520 --> 00:16:37,920
Suffice to say,
I'm not interested in that.
252
00:16:37,920 --> 00:16:40,920
We built this for autistic children.
253
00:16:41,480 --> 00:16:45,560
It turned out that when an autistic kid
had the chance to talk face
254
00:16:45,560 --> 00:16:48,560
to face in a natural social interaction,
255
00:16:49,320 --> 00:16:52,320
but they were told
what the expression was,
256
00:16:52,320 --> 00:16:55,320
something they don't get for free.
257
00:16:56,040 --> 00:16:57,400
Autism is complex.
258
00:16:57,400 --> 00:17:02,000
Some kids are better at these than others,
but at its heart, autistic children
259
00:17:03,480 --> 00:17:04,200
often have
260
00:17:04,200 --> 00:17:07,200
incredibly hard times
reading facial expressions.
261
00:17:08,040 --> 00:17:10,320
So if you could build this thing
262
00:17:10,320 --> 00:17:13,320
and while they're interacting
with the people around them,
263
00:17:14,160 --> 00:17:17,160
they could read their facial expression
264
00:17:17,320 --> 00:17:19,800
or get a signal to teach them how.
265
00:17:19,800 --> 00:17:21,600
And that was the amazing thing.
266
00:17:21,600 --> 00:17:25,280
Not only did these kids learn
how to read facial expressions
267
00:17:25,760 --> 00:17:29,160
better than the standard,
which at the time was, and still is today,
268
00:17:29,160 --> 00:17:33,600
cartoon faces on flashcards:
you just show them a face.
269
00:17:33,600 --> 00:17:34,320
This is happy.
270
00:17:34,320 --> 00:17:35,520
This is sad.
271
00:17:35,520 --> 00:17:38,440
Now they were interacting
with an actual person
272
00:17:38,440 --> 00:17:40,720
that was smiling or frowning.
273
00:17:40,720 --> 00:17:43,200
And they learned even better.
274
00:17:43,200 --> 00:17:46,160
And not only that, they learned
275
00:17:46,160 --> 00:17:49,160
theory of mind, empathy
276
00:17:49,320 --> 00:17:51,560
Turns out it's really hard to learn why
277
00:17:51,560 --> 00:17:55,680
other people do what they do
if they're all speaking a secret language
278
00:17:55,920 --> 00:17:56,720
you do not.
279
00:17:57,880 --> 00:18:01,720
So I got to take artificial intelligence
280
00:18:02,760 --> 00:18:05,880
and help these children understand
281
00:18:06,720 --> 00:18:09,560
what makes other people happy and sad.
282
00:18:09,560 --> 00:18:13,480
Again, fundamentally, a human experience.
283
00:18:14,120 --> 00:18:17,760
Later, I took the work
I had done doing facial modeling,
284
00:18:18,240 --> 00:18:21,240
and did something even bigger.
285
00:18:22,400 --> 00:18:24,480
I built a game,
286
00:18:24,480 --> 00:18:27,080
a really terrible game,
287
00:18:27,080 --> 00:18:30,080
of which I would be so ashamed.
288
00:18:30,320 --> 00:18:32,840
But you need to wait
for the end of the story.
289
00:18:32,840 --> 00:18:35,840
This game was called Sexy Face
290
00:18:35,920 --> 00:18:37,680
and it was online.
291
00:18:37,680 --> 00:18:40,680
And if you went on to the Sexy Face
website,
292
00:18:41,040 --> 00:18:45,000
our promise was: pick out the faces
you think are sexy.
293
00:18:45,440 --> 00:18:49,480
It would just pick random people
out of Facebook and Flickr
294
00:18:49,480 --> 00:18:50,800
that dates it a little bit,
295
00:18:50,800 --> 00:18:53,800
for those of you
who are old enough to remember that,
296
00:18:54,600 --> 00:18:56,960
it would just pick out these random faces
and put them in.
297
00:18:56,960 --> 00:18:59,960
Then you pick the faces you think are sexy
298
00:19:00,160 --> 00:19:03,720
and for free we'll find everyone
on Facebook that you think is sexy.
299
00:19:04,560 --> 00:19:08,480
And for $5, we'll find everyone
that thinks you're sexy.
300
00:19:10,720 --> 00:19:12,880
Now, if
301
00:19:12,880 --> 00:19:16,080
I am a horrible person,
and that's still a reasonable thing
302
00:19:16,080 --> 00:19:19,080
to be wondering about, it's
not because of that.
303
00:19:19,200 --> 00:19:21,840
After
you played a couple of rounds, it then
304
00:19:21,840 --> 00:19:24,840
confessed: actually, we're a Trojan,
305
00:19:25,280 --> 00:19:28,280
which means we're a secret game.
306
00:19:30,240 --> 00:19:33,800
We're an AI,
and this is a mind-reading game.
307
00:19:34,120 --> 00:19:35,880
So not just sexy faces.
308
00:19:35,880 --> 00:19:39,600
Any face. Come up with any face category
309
00:19:39,600 --> 00:19:42,600
you can think of, and we'll guess it.
310
00:19:42,840 --> 00:19:46,160
And so you think, I don't know, Southeast
Asian
311
00:19:46,920 --> 00:19:51,560
with a wispy mustache and a sense of ennui.
312
00:19:52,960 --> 00:19:55,480
The amazing thing is
313
00:19:55,480 --> 00:19:57,520
that our network
314
00:19:57,520 --> 00:20:01,320
actually learned
how to distinguish these faces.
315
00:20:02,000 --> 00:20:05,000
It could really find a face
316
00:20:05,440 --> 00:20:08,200
based on someone's hidden idea
317
00:20:08,200 --> 00:20:11,200
of what it is.
318
00:20:11,320 --> 00:20:12,160
And it turns out
319
00:20:12,160 --> 00:20:15,200
even this game
wasn't the end of the story.
320
00:20:16,560 --> 00:20:19,560
The end of the story is a book.
321
00:20:19,560 --> 00:20:22,560
It's a book
with a million photographs in it.
322
00:20:22,640 --> 00:20:27,600
It is a book that is likely filling up
with photographs right now as we speak.
323
00:20:28,480 --> 00:20:29,800
It is the U.N.
324
00:20:29,800 --> 00:20:32,800
Refugee Program's book of orphan refugees.
325
00:20:33,360 --> 00:20:36,520
And when we did this project
now over ten years ago,
326
00:20:37,080 --> 00:20:40,200
it had a million photographs in it of kids
327
00:20:40,880 --> 00:20:45,000
in refugee camps, all around the world,
right there and then.
328
00:20:45,600 --> 00:20:48,600
So if you were fearing the worst
329
00:20:49,000 --> 00:20:51,520
and you hadn't heard
from your sister's family,
330
00:20:51,520 --> 00:20:55,800
and you drove all the way outside
your country to a refugee camp nearby
331
00:20:56,880 --> 00:20:59,880
and you go to them,
332
00:20:59,880 --> 00:21:03,120
they would give you this book
and you'd start leafing through it,
333
00:21:04,160 --> 00:21:09,000
hoping you don't find your niece,
hoping you do find your niece.
334
00:21:10,680 --> 00:21:14,520
We, we took
335
00:21:14,880 --> 00:21:17,840
the AI that was trained by people playing
336
00:21:17,840 --> 00:21:20,840
our very stupid game,
337
00:21:21,160 --> 00:21:25,680
and we used it to further train
338
00:21:25,680 --> 00:21:28,800
an AI
on the faces of all of these children.
339
00:21:29,680 --> 00:21:33,120
And we built a system
that could run on the little tablet,
340
00:21:34,480 --> 00:21:36,240
and it would pull 18
341
00:21:36,240 --> 00:21:40,800
random kids out of the book
and would show their faces,
342
00:21:41,440 --> 00:21:44,440
and you'd pick
the ones that look more like your niece
343
00:21:45,840 --> 00:21:49,000
and then it would pull up
344
00:21:49,000 --> 00:21:52,000
a next set of faces
based on your past responses.
345
00:21:52,560 --> 00:21:55,480
And in about 3 to 5 minutes,
if your niece is in a camp
346
00:21:55,480 --> 00:21:58,480
anywhere in the world, you found her.
347
00:21:58,960 --> 00:22:01,960
I love working with AI,
348
00:22:02,240 --> 00:22:05,040
not because I love geeking out
349
00:22:05,040 --> 00:22:08,040
about the mathematics
of artificial intelligence,
350
00:22:08,360 --> 00:22:10,400
although there
are moments when that is fun,
351
00:22:11,920 --> 00:22:13,080
not because I
352
00:22:13,080 --> 00:22:18,080
have gnarly data problems
or because I'm a fabulous engineer. I am not.
353
00:22:18,080 --> 00:22:19,240
I am a scientist.
354
00:22:19,240 --> 00:22:24,560
I have never written a line of code
that didn't involve 32 curse words
355
00:22:24,560 --> 00:22:27,560
for every single function word
I put into the code.
356
00:22:28,560 --> 00:22:31,560
I love it because of what it can do.
357
00:22:31,800 --> 00:22:35,680
What we can choose to do with it.
358
00:22:37,320 --> 00:22:40,440
Yeah, we built the
AI that the CIA wanted to use to tell
359
00:22:40,440 --> 00:22:42,320
if people were lying.
360
00:22:42,320 --> 00:22:44,680
But we also built systems to help
361
00:22:44,680 --> 00:22:47,680
autistic children
learn how to read facial expressions
362
00:22:47,760 --> 00:22:51,120
and to reunite orphaned refugees
with their extended family
363
00:22:51,120 --> 00:22:54,120
through a program called Refugees United.
364
00:22:55,040 --> 00:22:59,240
That's what got me excited
about working in this space.
365
00:22:59,240 --> 00:23:05,040
And over the years, I founded companies
and I've published research
366
00:23:05,040 --> 00:23:09,160
in theoretical neuroscience
related to our brains
367
00:23:09,160 --> 00:23:12,200
and artificial intelligence, but
368
00:23:13,560 --> 00:23:16,120
still, what always really drove me along
369
00:23:16,120 --> 00:23:19,120
was the idea that you could change a life
370
00:23:20,040 --> 00:23:23,040
and that that would be valuable.
371
00:23:23,360 --> 00:23:26,400
One of my current companies has the first
372
00:23:26,760 --> 00:23:29,760
ever biological test
for postpartum depression,
373
00:23:30,920 --> 00:23:33,000
and it's predictive.
374
00:23:33,000 --> 00:23:36,280
So before a mom ever even gives birth,
375
00:23:37,360 --> 00:23:40,400
before she ever experiences
the first symptoms,
376
00:23:41,480 --> 00:23:43,800
we can actually see
377
00:23:43,800 --> 00:23:47,960
in her epigenetics the biological signal
378
00:23:48,400 --> 00:23:52,120
for postpartum depression
and begin treatment before it affects you.
379
00:23:52,320 --> 00:23:56,920
This is the most common complication
in pregnancy in the United States.
380
00:23:57,680 --> 00:24:01,880
The single most common reason
women die in childbirth
381
00:24:01,880 --> 00:24:05,040
is related to complications in postpartum
382
00:24:05,040 --> 00:24:08,040
depression, including suicide.
383
00:24:09,360 --> 00:24:10,280
And we can predict it.
384
00:24:10,280 --> 00:24:13,280
And it would be impossible to do this
without AI,
385
00:24:13,680 --> 00:24:17,840
in this case, a deep neural network
that analyzes all the epigenetic data
386
00:24:17,840 --> 00:24:21,400
and makes the predictions
that we need to be able to make.
387
00:24:21,880 --> 00:24:23,720
But we didn't stop there.
388
00:24:23,720 --> 00:24:28,640
The large language models, ChatGPT
and others, we built a version of that.
389
00:24:28,800 --> 00:24:32,520
They can also just talk with moms,
talk with them before
390
00:24:32,520 --> 00:24:36,520
they're even pregnant,
just talk with women.
391
00:24:37,200 --> 00:24:41,040
And in those conversations,
we have a completely separate
392
00:24:41,040 --> 00:24:45,160
AI that's analyzing
their speech patterns and,
393
00:24:45,520 --> 00:24:51,360
turns out, can also independently predict
depression and suicide risk.
394
00:24:52,720 --> 00:24:54,720
Being able to work on projects like these
395
00:24:54,720 --> 00:24:58,280
and bring them out to
the world is immensely powerful.
396
00:24:59,240 --> 00:25:02,760
But you'll notice that in none of these cases
did I build
397
00:25:02,760 --> 00:25:06,520
something that just kind of got
turned loose on the world.
398
00:25:09,040 --> 00:25:10,640
And there's a reason for that,
399
00:25:10,640 --> 00:25:13,240
and it's a reason
that it's become very salient
400
00:25:13,240 --> 00:25:16,800
in the world of generative AI, which,
401
00:25:17,520 --> 00:25:21,240
if there are any nerds in the room,
is a very frustrating term for me,
402
00:25:21,240 --> 00:25:26,280
because generative modeling already
means something different mathematically.
403
00:25:26,640 --> 00:25:30,160
And so now for some reason,
we're calling this kind of AI
404
00:25:30,160 --> 00:25:33,360
the exact same word
that isn't mathematically applicable.
405
00:25:33,360 --> 00:25:37,200
It drives a nerd like me up the wall.
406
00:25:37,440 --> 00:25:39,760
But I think we all know
what I'm talking about.
407
00:25:39,760 --> 00:25:42,760
The chatbots that you can now
talk to,
408
00:25:43,080 --> 00:25:46,840
the models
that can create an image out of nothing.
409
00:25:48,000 --> 00:25:49,960
And they're amazing.
410
00:25:49,960 --> 00:25:52,440
They are huge breakthrough advances.
411
00:25:52,440 --> 00:25:55,360
I will not sell them short.
412
00:25:55,360 --> 00:25:56,400
GPT and Gemini
413
00:25:56,400 --> 00:25:59,400
truly can surpass
414
00:26:00,240 --> 00:26:03,480
human medical students
in answering medical questions
415
00:26:04,400 --> 00:26:07,400
and most average doctors as well.
416
00:26:08,080 --> 00:26:10,680
And wouldn't
it be amazing for all of those people
417
00:26:10,680 --> 00:26:13,800
that don't have access
to an immediate doctor to be able
418
00:26:13,800 --> 00:26:16,800
to ask questions and get answers?
419
00:26:18,520 --> 00:26:20,480
But here's the flip side.
420
00:26:20,480 --> 00:26:23,480
Now that we've come full circle
421
00:26:23,520 --> 00:26:27,240
and why I started this story
422
00:26:28,320 --> 00:26:30,240
with this question that didn't
423
00:26:30,240 --> 00:26:33,240
seem to have anything to do
with artificial intelligence.
424
00:26:33,480 --> 00:26:37,080
Are you happy and how do you become happy?
425
00:26:38,920 --> 00:26:42,720
Because we might do one of two things
right now.
426
00:26:43,600 --> 00:26:45,760
We might look at AI being able to do
427
00:26:45,760 --> 00:26:50,000
all of these amazing things and give up.
428
00:26:51,960 --> 00:26:53,280
Just leave it
429
00:26:53,280 --> 00:26:57,960
to jerks like Vivian to invent things,
430
00:26:57,960 --> 00:27:01,200
and they'll do all the work and solve
all the problems.
431
00:27:01,840 --> 00:27:06,000
You definitely shouldn't trust me
to do anything, but that is an option.
432
00:27:06,000 --> 00:27:07,760
A choice that could be made.
433
00:27:07,760 --> 00:27:08,520
Another.
434
00:27:08,520 --> 00:27:14,160
And let me promise you,
there are a hundred thousand startups
435
00:27:15,320 --> 00:27:18,400
trying to replace every job
436
00:27:19,000 --> 00:27:22,080
that universities are currently training
students to do.
437
00:27:24,200 --> 00:27:27,120
It's important to understand that.
438
00:27:27,120 --> 00:27:30,560
But the other big problem
439
00:27:31,680 --> 00:27:35,040
is that we start using AI
440
00:27:35,680 --> 00:27:40,280
now that it's no longer just the domain
of mad scientists like me.
441
00:27:40,880 --> 00:27:44,160
Now that it is something
all of us can interact with.
442
00:27:44,880 --> 00:27:47,880
All of us can talk to.
443
00:27:47,920 --> 00:27:50,920
Now, what we're wondering is,
444
00:27:52,360 --> 00:27:55,880
are we going to keep trying
as hard as we should be trying?
445
00:27:56,840 --> 00:27:57,720
Are we willing to
446
00:27:57,720 --> 00:28:02,000
put in the effort in our lives
when we don't have to,
447
00:28:02,600 --> 00:28:05,600
and something else
can put the effort in for us?
448
00:28:05,600 --> 00:28:09,480
So let's go back to this idea
that I brought up earlier
449
00:28:10,400 --> 00:28:13,440
that there are
ill posed problems in the world,
450
00:28:13,960 --> 00:28:16,680
and there are well
posed problems in the world,
451
00:28:16,680 --> 00:28:19,600
and that artificial intelligence
452
00:28:19,600 --> 00:28:22,600
is quickly surpassing human beings
453
00:28:22,760 --> 00:28:25,760
at solving well posed problems
454
00:28:26,760 --> 00:28:28,960
and wonder,
455
00:28:28,960 --> 00:28:31,680
why is that still
456
00:28:31,680 --> 00:28:34,680
how we're educating our children?
457
00:28:35,360 --> 00:28:38,360
Why are we raising generations of kids
458
00:28:39,000 --> 00:28:42,000
to answer routine questions
459
00:28:42,960 --> 00:28:45,960
when all of us in this room,
460
00:28:46,000 --> 00:28:48,640
in our pocket,
have the answers to those questions
461
00:28:48,640 --> 00:28:51,640
for free?
462
00:28:52,120 --> 00:28:54,080
Instead,
463
00:28:54,080 --> 00:28:57,080
we should be building
kids for the unknown.
464
00:28:58,440 --> 00:28:59,640
We should be building them
465
00:28:59,640 --> 00:29:03,200
to solve ill posed questions.
466
00:29:04,400 --> 00:29:07,400
We need a generation of explorers
467
00:29:08,760 --> 00:29:11,760
because the one thing, at least for now,
468
00:29:11,760 --> 00:29:14,360
that natural intelligence does
469
00:29:14,360 --> 00:29:17,360
and artificial intelligence does not,
470
00:29:17,400 --> 00:29:20,400
is explore the unknown.
471
00:29:20,400 --> 00:29:22,120
Now, to be clear, I use
472
00:29:22,120 --> 00:29:25,440
AI regularly to explore the unknown.
473
00:29:25,640 --> 00:29:28,640
I actually had a question.
474
00:29:28,840 --> 00:29:31,080
This may be a question of zero
475
00:29:31,080 --> 00:29:34,560
interest to the majority of you,
but in a way, that's part of the point.
476
00:29:35,240 --> 00:29:40,120
What is the confidence interval
of the product of two regression model
477
00:29:40,120 --> 00:29:43,120
parameters?
478
00:29:43,200 --> 00:29:44,040
It's obviously
479
00:29:44,040 --> 00:29:47,040
not a question that pops up
in most of our day to day lives.
480
00:29:47,560 --> 00:29:48,440
It's not a question
481
00:29:48,440 --> 00:29:52,440
that actually even pops up in the day
to day lives of most scientists.
482
00:29:52,880 --> 00:29:56,680
Even statisticians
might have to think hard about,
483
00:29:56,680 --> 00:30:01,080
well, gosh, what is the equation
to compute the confidence interval?
484
00:30:01,760 --> 00:30:06,040
Well, fortunately we have GPT and Gemini.
485
00:30:06,040 --> 00:30:08,440
So I asked them, and they were wrong.
486
00:30:10,640 --> 00:30:11,240
If you want to know
487
00:30:11,240 --> 00:30:16,000
one simple takeaway lesson
for how you should interact
488
00:30:16,560 --> 00:30:19,560
with large language models like GPT,
489
00:30:19,800 --> 00:30:22,080
they know everything
490
00:30:22,080 --> 00:30:25,080
and they understand nothing.
491
00:30:27,120 --> 00:30:30,680
GPT knows so much more than I do,
492
00:30:32,080 --> 00:30:36,440
despite how much study I have,
despite my fancy degrees,
493
00:30:36,720 --> 00:30:41,600
despite my years of research,
it just knows more than me.
494
00:30:42,200 --> 00:30:45,240
It's encoded
in those trillions of parameters
495
00:30:45,360 --> 00:30:48,320
that are predicting the next word,
496
00:30:48,320 --> 00:30:51,560
but it doesn't understand
anything that it knows.
497
00:30:52,920 --> 00:30:54,520
That isn't a knock on it.
498
00:30:54,520 --> 00:30:57,520
What it can do is amazing.
499
00:30:58,440 --> 00:31:01,080
But when I asked it
500
00:31:01,080 --> 00:31:05,520
for the answer to this problem,
which, by the way, is in fact a well
501
00:31:05,520 --> 00:31:08,960
posed problem, just a very esoteric one,
502
00:31:08,960 --> 00:31:12,040
a very weird and unusual one.
503
00:31:12,280 --> 00:31:13,640
It gave me the wrong answer.
504
00:31:13,640 --> 00:31:16,640
It gave me the answer to another,
more common problem.
505
00:31:16,960 --> 00:31:20,600
It's another thing to understand
about large language models
506
00:31:20,600 --> 00:31:22,080
and generative image models.
507
00:31:23,320 --> 00:31:25,400
They are
508
00:31:25,400 --> 00:31:29,440
the best of the average of the internet.
509
00:31:31,160 --> 00:31:34,920
Turns out, the average of the internet
doesn't typically have to know
510
00:31:34,920 --> 00:31:38,800
what the confidence interval
of the product of two model
511
00:31:38,800 --> 00:31:41,960
parameters is,
and so its first answer was wrong.
512
00:31:42,680 --> 00:31:46,680
It answered a much simpler question,
but I didn't give up on it.
513
00:31:47,320 --> 00:31:49,440
I stuck with it
514
00:31:49,440 --> 00:31:51,520
and we talked it out.
515
00:31:51,520 --> 00:31:55,360
Now I didn't talk it out
like it was a person,
516
00:31:55,640 --> 00:31:59,920
but at the same time I did engage with it
the way I might engage with
517
00:31:59,920 --> 00:32:04,560
one of my graduate students, my graduate
students, whatever they're studying,
518
00:32:05,240 --> 00:32:08,240
they actually know it better than me.
519
00:32:08,400 --> 00:32:09,560
They're invested in it.
520
00:32:09,560 --> 00:32:12,840
I talked to them an hour
a week about this issue.
521
00:32:13,440 --> 00:32:15,800
They're fully invested in it.
522
00:32:15,800 --> 00:32:17,520
This is their dissertation.
523
00:32:18,720 --> 00:32:21,120
This is
their ticket to whatever is coming next.
524
00:32:21,120 --> 00:32:24,120
And they know it so much more than I do,
525
00:32:24,600 --> 00:32:26,680
but they don't understand it.
526
00:32:26,680 --> 00:32:29,120
That's what
I'm bringing to that relationship:
527
00:32:29,120 --> 00:32:32,720
an understanding of the implications
of what they're discovering
528
00:32:32,920 --> 00:32:36,040
and trying to guide them
towards a similar understanding.
529
00:32:36,720 --> 00:32:39,240
When I interact with GPT
530
00:32:39,240 --> 00:32:42,240
or Gemini, I treat it the same way.
531
00:32:42,720 --> 00:32:45,720
It knows everything and understands
nothing.
532
00:32:45,800 --> 00:32:49,400
And so I said, no, you've
misunderstood my problem.
533
00:32:49,400 --> 00:32:50,400
You've actually given me
534
00:32:50,400 --> 00:32:53,400
the confidence interval
to the product of two random variables.
535
00:32:53,520 --> 00:32:57,120
I know this is super exciting to everyone,
but that's what it did.
536
00:32:57,520 --> 00:33:01,800
And so I slowly walked it
through the problem.
537
00:33:02,200 --> 00:33:05,200
I didn't know the equation
538
00:33:05,520 --> 00:33:07,760
that I was looking for.
539
00:33:07,760 --> 00:33:10,160
Interestingly, neither did it,
540
00:33:11,400 --> 00:33:11,880
but about
541
00:33:11,880 --> 00:33:17,480
half an hour later we had the actual
equation written out and correct
542
00:33:18,320 --> 00:33:22,120
and the code written out in the programming
language Python
543
00:33:22,120 --> 00:33:25,080
to execute that on my data,
544
00:33:25,080 --> 00:33:28,080
something I couldn't have done by myself
and nor could it.
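(For the statistics nerds: the talk doesn't show the actual equation or code from that session, so here is a minimal sketch of one standard way to get such an interval, assuming the first-order delta-method approximation for the variance of a product of two estimated regression coefficients. The numbers at the bottom are purely illustrative.)

```python
# Minimal sketch (an assumption about the approach, not the talk's actual code):
# delta-method confidence interval for the product of two regression coefficients a*b.
#   Var(a*b) ≈ b^2 * Var(a) + a^2 * Var(b) + 2*a*b * Cov(a, b)
import numpy as np
from scipy import stats

def product_ci(a, b, var_a, var_b, cov_ab, alpha=0.05):
    """CI for a*b given the two estimates, their variances, and their covariance."""
    point = a * b
    var = b**2 * var_a + a**2 * var_b + 2 * a * b * cov_ab
    z = stats.norm.ppf(1 - alpha / 2)
    half_width = z * np.sqrt(var)
    return point - half_width, point + half_width

# Illustrative numbers only.
print(product_ci(a=0.8, b=1.5, var_a=0.04, var_b=0.09, cov_ab=0.01))
```

A bootstrap over the fitted model would be another reasonable route to the same interval; the delta method is just the closed-form version.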
545
00:33:29,560 --> 00:33:33,120
That's what excites me about AI. It isn't
546
00:33:33,640 --> 00:33:36,640
what it can go do by itself.
547
00:33:37,040 --> 00:33:41,720
I'm too busy to write emails
to all of my business acquaintances.
548
00:33:41,880 --> 00:33:44,280
I don't have the time to write an essay.
549
00:33:44,280 --> 00:33:48,520
I don't want to have to think about
the art I'm putting in my presentation.
550
00:33:48,560 --> 00:33:51,000
Let AI do it for you.
551
00:33:51,000 --> 00:33:54,520
No, please don't make that choice.
552
00:33:55,800 --> 00:33:58,800
Think about what you could do better
553
00:33:59,040 --> 00:34:02,040
using AI as a tool.
554
00:34:02,160 --> 00:34:04,440
It's not a magic wand.
555
00:34:04,440 --> 00:34:07,080
It is also not intelligent the way we are intelligent.
556
00:34:07,080 --> 00:34:12,280
Again, it's not a knock.
It's astonishing what it can do,
557
00:34:13,080 --> 00:34:17,320
but it doesn't understand the world.
558
00:34:17,800 --> 00:34:19,640
That's your job.
559
00:34:19,640 --> 00:34:21,120
That's our job.
560
00:34:22,480 --> 00:34:23,400
Instead of building a
561
00:34:23,400 --> 00:34:27,000
generation of students
ready to answer tough questions
562
00:34:27,440 --> 00:34:30,200
that this can do so much more easily,
563
00:34:30,200 --> 00:34:33,040
we should build a generation of students
564
00:34:33,040 --> 00:34:36,200
that know how to find good questions.
565
00:34:37,440 --> 00:34:39,480
A generation of students
566
00:34:39,480 --> 00:34:43,440
that when their first try doesn't work,
they don't give up.
567
00:34:44,120 --> 00:34:46,560
They keep trying
because they have a belief
568
00:34:46,560 --> 00:34:49,560
that their hard work
will actually pay off.
569
00:34:49,560 --> 00:34:52,840
None of that applies
to artificial intelligence today.
570
00:34:53,520 --> 00:34:57,000
This little interlude is just going to be
for the nerds in the room.
571
00:34:57,840 --> 00:35:01,240
There is a famous difference in human
learning
572
00:35:02,000 --> 00:35:07,160
that is identified in both neuroscience
and psychology, called model-free
573
00:35:07,160 --> 00:35:12,200
and model-based learning. And model-free
is basically just statistics.
574
00:35:13,400 --> 00:35:15,480
And we in our brains
575
00:35:15,480 --> 00:35:18,600
definitely do a huge amount of model free
learning.
576
00:35:18,880 --> 00:35:22,000
And essentially,
that's what most of AI is.
577
00:35:23,120 --> 00:35:25,640
It's just a bunch of raw
578
00:35:25,640 --> 00:35:29,000
correlations
churned and churned and churned.
579
00:35:29,240 --> 00:35:34,160
Interestingly, on masses of data
that no human could ever interact with.
580
00:35:35,280 --> 00:35:38,360
There's a whole other class
581
00:35:38,360 --> 00:35:41,360
of artificial intelligence
called reinforcement learning models,
582
00:35:42,000 --> 00:35:44,520
the most famous of which would be AlphaGo,
583
00:35:44,520 --> 00:35:48,240
the AI that beat the world's
best human go players.
584
00:35:48,520 --> 00:35:51,520
And since then,
585
00:35:51,920 --> 00:35:56,520
DeepMind, the Google subsidiary,
has gone on to use AlphaGo
586
00:35:56,520 --> 00:35:59,280
to create AlphaFold,
587
00:35:59,280 --> 00:36:02,280
and to create,
588
00:36:02,360 --> 00:36:05,160
the new system
that can discover all these proteins.
589
00:36:05,160 --> 00:36:07,880
But in all of these cases,
590
00:36:07,880 --> 00:36:10,880
it's just blindly exploring.
591
00:36:11,640 --> 00:36:14,040
In a single night,
592
00:36:14,040 --> 00:36:18,880
the reinforcement learning system
from AlphaGo can play something
593
00:36:18,880 --> 00:36:22,720
on the order of 180
594
00:36:22,720 --> 00:36:26,280
years of human experience
595
00:36:26,520 --> 00:36:29,520
of Go in a single night.
596
00:36:29,720 --> 00:36:32,240
It plays so many games against itself.
597
00:36:32,240 --> 00:36:35,360
It's as if a human being played it,
598
00:36:35,840 --> 00:36:39,800
you know, only breaking
to sleep for 186 years.
599
00:36:40,800 --> 00:36:41,160
It's not so
600
00:36:41,160 --> 00:36:44,160
surprising that it does better than we do,
601
00:36:44,640 --> 00:36:47,520
but does it truly understand Go?
602
00:36:47,520 --> 00:36:50,520
I mean, it would keep playing go
if the game was back or at
603
00:36:51,440 --> 00:36:54,440
it would keep playing go
if the house was burning down.
604
00:36:56,440 --> 00:36:58,720
It turns numbers into numbers.
605
00:36:58,720 --> 00:37:01,000
That's a powerful tool.
606
00:37:01,000 --> 00:37:04,000
But it's not the ability
to explore the unknown.
607
00:37:04,240 --> 00:37:07,240
Let me ratchet this up another level.
608
00:37:07,440 --> 00:37:12,600
A lot of research has looked
at human workers
609
00:37:12,720 --> 00:37:16,400
using large language models
to speed up their work.
610
00:37:16,920 --> 00:37:22,440
Published papers,
some of them by OpenAI itself, others
611
00:37:22,440 --> 00:37:26,680
by notable economists at Harvard
and Stanford and beyond.
612
00:37:27,360 --> 00:37:30,360
So, for example,
613
00:37:30,680 --> 00:37:33,000
let's start at what economists
614
00:37:33,000 --> 00:37:36,000
would see
as the low end of the skills ladder.
615
00:37:36,400 --> 00:37:37,920
Call centers.
616
00:37:37,920 --> 00:37:39,160
You're working in a call center.
617
00:37:39,160 --> 00:37:42,920
Someone calls up, you take their call,
you essentially have a script,
618
00:37:42,920 --> 00:37:45,920
and you just run through the script
to try and answer their questions.
619
00:37:46,720 --> 00:37:49,560
And what they found was when call center
workers
620
00:37:49,560 --> 00:37:52,800
had a large language model
helping them answer the questions,
621
00:37:53,560 --> 00:37:56,960
they were 40% better and 40%
more productive.
622
00:37:56,960 --> 00:38:02,360
They answered 40% more calls
and had fewer referrals to their manager.
623
00:38:02,960 --> 00:38:03,600
They were better.
624
00:38:05,240 --> 00:38:07,560
Now let's take a step further
625
00:38:07,560 --> 00:38:11,080
and look at programmers.
626
00:38:11,080 --> 00:38:12,440
Maybe I'm not being fair.
627
00:38:12,440 --> 00:38:15,160
Many people go to university
to learn how to program.
628
00:38:15,160 --> 00:38:17,640
Many people learn
how to program completely on their own.
629
00:38:17,640 --> 00:38:19,160
It's a little unclear.
630
00:38:19,160 --> 00:38:21,800
This was certainly
the skill of the future.
631
00:38:21,800 --> 00:38:24,800
Not five years ago,
632
00:38:24,960 --> 00:38:27,960
certainly ten years ago.
633
00:38:28,080 --> 00:38:30,920
Every policy paper in the world
634
00:38:30,920 --> 00:38:35,520
from the UN, from the World
Economic Forum, the White House, McKinsey,
635
00:38:35,960 --> 00:38:38,880
every major country had a policy paper
636
00:38:38,880 --> 00:38:41,960
and it said, we need to teach every kid
how to program.
637
00:38:42,960 --> 00:38:44,640
Well guess what?
638
00:38:44,640 --> 00:38:47,720
We live in a world in which AI can program.
639
00:38:49,280 --> 00:38:54,640
And I could have told you that ten years
ago, and I did.
640
00:38:54,640 --> 00:38:58,800
I didn't tell you, but I told everyone
writing these policy papers.
641
00:38:59,520 --> 00:39:01,200
What a crazy thing.
642
00:39:01,200 --> 00:39:04,200
Imagine you had asked me,
643
00:39:05,560 --> 00:39:07,040
what is
644
00:39:07,040 --> 00:39:09,960
the world financial markets
going to do for the next 30 years?
645
00:39:09,960 --> 00:39:12,200
And I said, don't worry about it.
646
00:39:12,200 --> 00:39:14,880
Simply take all of your money.
647
00:39:14,880 --> 00:39:17,080
Buy nothing
648
00:39:17,080 --> 00:39:20,080
but this one stock
and then never change it.
649
00:39:20,160 --> 00:39:21,640
The logic is about the same.
650
00:39:21,640 --> 00:39:26,160
Teach everyone in the world how to program
because that will be a protected skill.
651
00:39:26,680 --> 00:39:30,000
Programming is routine knowledge.
652
00:39:30,920 --> 00:39:33,920
It's a solution to well-posed problems.
653
00:39:34,560 --> 00:39:38,640
People can do creative things
with programming, but the actual core
654
00:39:38,640 --> 00:39:43,600
of programming is routine,
and AI can do that and do it very well.
655
00:39:43,880 --> 00:39:47,400
So programmers using large language models
656
00:39:47,400 --> 00:39:50,400
to help them program are, again,
about 40% faster
657
00:39:50,520 --> 00:39:53,520
with fewer errors.
658
00:39:54,800 --> 00:39:55,520
And now I will step
659
00:39:55,520 --> 00:39:58,760
up: professional writers
with university educations.
660
00:39:59,560 --> 00:40:03,000
Again, they're faster
and they produce better work.
661
00:40:04,120 --> 00:40:07,360
Boston Consulting Group consultants
662
00:40:08,520 --> 00:40:11,600
with very fancy degrees
from elite universities.
663
00:40:12,640 --> 00:40:15,920
They're asked to analyze
spreadsheets of data
664
00:40:17,080 --> 00:40:20,280
and produce a report, a written report,
665
00:40:20,360 --> 00:40:23,320
and then a set of slides
for a presentation.
666
00:40:23,320 --> 00:40:25,840
And some of them are given GPT
667
00:40:25,840 --> 00:40:28,560
to use and some of them aren't.
668
00:40:28,560 --> 00:40:30,920
And unsurprisingly,
669
00:40:30,920 --> 00:40:35,640
the ones using GPT are faster and produce
higher quality work
670
00:40:36,080 --> 00:40:39,080
just the same as all the other workers.
671
00:40:40,400 --> 00:40:43,040
Well, but there are two
672
00:40:43,040 --> 00:40:46,400
incredibly important missing parts
to this story.
673
00:40:46,880 --> 00:40:49,880
One: in every one of these stories,
674
00:40:50,640 --> 00:40:53,640
it wasn't
everyone that received the boost.
675
00:40:54,080 --> 00:40:57,600
Only the least experienced and lowest
skilled workers.
676
00:40:58,920 --> 00:41:03,400
Interestingly, at every skill level,
the people that already knew
677
00:41:03,400 --> 00:41:06,640
how to do the job
didn't actually get a benefit from using AI.
678
00:41:07,720 --> 00:41:10,240
That's really interesting,
679
00:41:10,240 --> 00:41:13,400
particularly
if I add in this additional twist
680
00:41:13,680 --> 00:41:18,960
from the world of education where
we've been studying using AI for decades.
681
00:41:20,840 --> 00:41:21,200
If an
682
00:41:21,200 --> 00:41:24,760
AI tutor ever gives the students
the answer,
683
00:41:25,360 --> 00:41:28,360
they never learn anything.
684
00:41:29,440 --> 00:41:30,760
Wait a minute.
685
00:41:30,760 --> 00:41:33,760
Isn't that exactly what's happening
with these workers?
686
00:41:34,600 --> 00:41:36,560
The new career workers
687
00:41:36,560 --> 00:41:39,600
are starting their jobs and the
AI is telling them what to do.
688
00:41:40,560 --> 00:41:44,080
I promise you,
they will never learn how to do their job
689
00:41:44,080 --> 00:41:45,320
any better than they're doing it
690
00:41:45,320 --> 00:41:48,320
the day they show up.
691
00:41:48,320 --> 00:41:52,680
The idea
that we should simply offload our effort
692
00:41:52,680 --> 00:41:57,200
and our work to
AI robs us of the chance to grow.
693
00:41:58,920 --> 00:42:01,920
But there's even a bigger issue,
694
00:42:02,200 --> 00:42:06,280
which is in the Boston Consulting Study.
695
00:42:07,640 --> 00:42:10,440
There was one additional test
696
00:42:10,440 --> 00:42:12,440
because their job, their job
697
00:42:12,440 --> 00:42:17,240
doesn't end with analyze the spreadsheet,
which is a well-posed problem,
698
00:42:17,760 --> 00:42:20,760
and write a report
which shouldn't be a well-posed problem.
699
00:42:20,800 --> 00:42:24,040
But I don't know how many of you
have done a business report.
700
00:42:25,200 --> 00:42:27,360
You might as well let an AI write it.
701
00:42:27,360 --> 00:42:30,360
They read exactly the same every time.
702
00:42:32,400 --> 00:42:34,920
So what came after
703
00:42:34,920 --> 00:42:39,880
that is they then had to say,
what should their client
704
00:42:39,880 --> 00:42:42,880
do about the analysis that they just ran?
705
00:42:44,440 --> 00:42:47,720
And so now we've moved from well-posed
to ill-posed.
706
00:42:48,800 --> 00:42:52,480
We moved into a world
in which there isn't a right answer.
707
00:42:53,400 --> 00:42:56,280
And in fact, very cleverly,
the researchers set it up
708
00:42:56,280 --> 00:42:59,280
so that GPT would give them
the wrong answer.
709
00:43:00,920 --> 00:43:02,400
And the people using the
710
00:43:02,400 --> 00:43:05,440
AI did worse than the people that weren't
711
00:43:06,520 --> 00:43:09,520
when it became a creative task.
712
00:43:09,720 --> 00:43:12,720
That should scare all of us.
713
00:43:13,800 --> 00:43:16,120
We are uniquely good
714
00:43:16,120 --> 00:43:19,120
for now at exploring the unknown.
715
00:43:19,320 --> 00:43:22,080
But these highly educated,
716
00:43:22,080 --> 00:43:25,080
highly motivated workers,
717
00:43:25,360 --> 00:43:27,600
when given an AI tool,
718
00:43:27,600 --> 00:43:30,600
stopped exploring
and just did what it told them to.
719
00:43:32,440 --> 00:43:35,440
So how do we reverse this trend?
720
00:43:36,040 --> 00:43:39,320
How do we get the benefits
of artificial intelligence?
721
00:43:40,000 --> 00:43:44,440
And again, I can just braggingly
describe my own work:
722
00:43:45,000 --> 00:43:48,000
autistic children that can learn
how to read facial expressions
723
00:43:48,800 --> 00:43:52,120
or refugees
reunited with their extended family.
724
00:43:52,360 --> 00:43:55,680
I built the first ever system that could,
725
00:43:56,200 --> 00:43:58,720
predict bipolar episodes,
726
00:43:58,720 --> 00:44:01,040
manic episodes in bipolar sufferers.
727
00:44:01,040 --> 00:44:03,000
My company,
728
00:44:03,000 --> 00:44:05,520
has the first test ever
for manic depression.
729
00:44:05,520 --> 00:44:10,400
When my son in 2011
was diagnosed with type one diabetes,
730
00:44:11,120 --> 00:44:14,120
I hacked all of his medical devices.
731
00:44:14,640 --> 00:44:15,160
Broke,
732
00:44:15,160 --> 00:44:17,080
as it turns out, all sorts of U.S.
733
00:44:17,080 --> 00:44:18,640
federal laws.
734
00:44:18,640 --> 00:44:22,920
And I developed the first ever
AI for diabetes just for my son.
735
00:44:23,680 --> 00:44:26,520
It sent updates to my Google Glass, which,
believe it or not,
736
00:44:26,520 --> 00:44:29,520
I wore to a party at the White House.
737
00:44:29,560 --> 00:44:34,080
There is literally a Secret Service policy
738
00:44:34,080 --> 00:44:38,000
at the White House naming me, saying,
739
00:44:38,000 --> 00:44:42,160
you can't wear live computers
on your head in the White House anymore.
740
00:44:43,680 --> 00:44:47,440
But I did get to meet President Obama
because of that.
741
00:44:47,440 --> 00:44:51,200
And after he met me and walked
742
00:44:51,200 --> 00:44:54,200
up, he actually walked up to me,
looked straight at me,
743
00:44:55,000 --> 00:44:58,000
and just shook his head
like I was such a disappointment.
744
00:44:58,120 --> 00:44:59,680
What are you wearing?
745
00:44:59,680 --> 00:45:03,120
A fluorescent blue computer on your face
at my fancy party?
746
00:45:04,040 --> 00:45:07,040
But then I said,
747
00:45:07,680 --> 00:45:10,680
okay, Google, take a picture.
748
00:45:11,400 --> 00:45:14,040
So that was my voice
activated Google Glass.
749
00:45:14,040 --> 00:45:16,040
It took a picture of the president.
750
00:45:16,040 --> 00:45:18,840
Google has confirmed
it is the only presidential portrait
751
00:45:18,840 --> 00:45:20,600
ever taken by Google Glass.
752
00:45:20,600 --> 00:45:23,600
It is a terrible picture.
753
00:45:23,760 --> 00:45:26,760
It did not improve
his opinion of me, by the way.
754
00:45:27,800 --> 00:45:30,800
But then he said,
755
00:45:31,200 --> 00:45:34,200
why are you wearing this silly thing?
756
00:45:34,560 --> 00:45:39,560
And I said,
because I built an artificial intelligence
757
00:45:40,680 --> 00:45:42,680
that's monitoring my son's diabetes.
758
00:45:42,680 --> 00:45:44,760
He's back in San Francisco.
759
00:45:44,760 --> 00:45:47,760
I'm on the other side of the continent
in Washington, D.C.
760
00:45:48,320 --> 00:45:51,480
I just got a text message from my AI
761
00:45:52,200 --> 00:45:55,760
telling me my son's blood
glucose levels would go dangerously low.
762
00:45:56,120 --> 00:45:59,480
So I sent a message back to my sister
who was watching after him.
763
00:46:00,000 --> 00:46:03,000
She gave him a cracker and his blood
glucose levels didn't go low.
764
00:46:04,200 --> 00:46:07,200
I built myself a superpower,
765
00:46:07,200 --> 00:46:10,800
something that every parent
should have the chance to build.
766
00:46:11,960 --> 00:46:15,320
AI truly can do amazing things.
767
00:46:15,600 --> 00:46:18,600
We should have it in our lives,
768
00:46:18,840 --> 00:46:21,840
but we shouldn't use it to substitute
769
00:46:21,880 --> 00:46:24,880
for the things
we can already do for ourselves.
770
00:46:27,640 --> 00:46:31,080
When I use artificial intelligence today.
771
00:46:33,200 --> 00:46:36,360
I use it in a peculiar way.
772
00:46:38,160 --> 00:46:41,040
Now, I'm not talking about the models
I built for my research.
773
00:46:41,040 --> 00:46:45,040
All of those I'm building by hand,
and I'm trying to solve specific problems
774
00:46:45,040 --> 00:46:48,240
and explore things
I don't understand about the world.
775
00:46:48,760 --> 00:46:52,840
But when I use it as a regular person,
776
00:46:54,600 --> 00:46:56,400
I am, for example,
777
00:46:56,400 --> 00:46:59,440
using it while writing a book
I'm working on right now.
778
00:47:00,200 --> 00:47:03,840
The worst thing I could do is say GPT,
779
00:47:04,680 --> 00:47:07,080
write the next chapter of my book.
780
00:47:07,080 --> 00:47:08,920
I have actually tried.
781
00:47:08,920 --> 00:47:11,440
It writes garbage.
782
00:47:11,440 --> 00:47:13,680
It is a terrible writer.
783
00:47:13,680 --> 00:47:15,320
Now, it gets the punctuation right.
784
00:47:15,320 --> 00:47:18,280
Actually, I wish my students' spelling
785
00:47:18,280 --> 00:47:21,280
and punctuation were as good as GPT-3's.
786
00:47:21,360 --> 00:47:24,360
Human beings
turn out to be terrible at that,
787
00:47:24,680 --> 00:47:27,680
but it's so banal.
788
00:47:28,200 --> 00:47:30,920
It's so substanceless.
789
00:47:30,920 --> 00:47:36,240
Instead, I get to do something different
when I engage with it.
790
00:47:37,200 --> 00:47:40,200
I write my own,
791
00:47:40,680 --> 00:47:41,720
my own article.
792
00:47:41,720 --> 00:47:42,920
My own essay.
793
00:47:42,920 --> 00:47:44,640
Of course I do.
794
00:47:44,640 --> 00:47:49,320
In a world where all of the routine
knowledge is available
795
00:47:49,320 --> 00:47:51,840
for free in your pocket.
796
00:47:51,840 --> 00:47:54,040
What's truly left for us
797
00:47:54,040 --> 00:47:56,720
is our unique voice,
798
00:47:58,240 --> 00:47:59,440
the way we see the
799
00:47:59,440 --> 00:48:03,080
world differently than anyone else.
800
00:48:03,480 --> 00:48:06,480
Literally anyone else.
801
00:48:07,320 --> 00:48:09,960
And it is your resilience
802
00:48:09,960 --> 00:48:12,960
and your,
803
00:48:13,160 --> 00:48:16,160
courage to use that voice
804
00:48:16,440 --> 00:48:19,440
that is your value
to the people around you.
805
00:48:20,040 --> 00:48:22,800
So how do I use it?
806
00:48:22,800 --> 00:48:25,800
In my case, Gemini, Google's version,
807
00:48:26,280 --> 00:48:31,040
I say, Gemini, you are my worst enemy.
808
00:48:32,400 --> 00:48:35,800
You have been hounding me
for my entire career,
809
00:48:35,800 --> 00:48:38,880
finding every flaw
in everything I've ever written.
810
00:48:39,960 --> 00:48:42,600
You were the reason
I don't have a Nobel Prize.
811
00:48:42,600 --> 00:48:45,600
Because you keep pointing out
everything I've ever done that's wrong.
812
00:48:45,600 --> 00:48:48,600
Let's just pretend that that's true.
813
00:48:49,320 --> 00:48:54,120
Now read my next essay and find for me
814
00:48:54,680 --> 00:48:57,400
all of the errors, all of the things
815
00:48:57,400 --> 00:49:00,400
you might criticize.
816
00:49:00,440 --> 00:49:02,680
I'm not speeding up my writing.
817
00:49:02,680 --> 00:49:04,760
I'm slowing it down.
818
00:49:04,760 --> 00:49:07,760
I already went through
and wrote the whole thing
819
00:49:07,960 --> 00:49:10,800
in a way that I was happy with,
820
00:49:10,800 --> 00:49:13,920
and now I have turned the AI
821
00:49:13,920 --> 00:49:17,560
against me to tell me all of the mistakes
I made,
822
00:49:17,880 --> 00:49:21,840
all the flaws that exist, all of the ways
people might disagree with me.
823
00:49:22,440 --> 00:49:25,320
Now, I don't have to believe everything
it says.
824
00:49:25,320 --> 00:49:27,840
I work with it like it's my grad student.
825
00:49:27,840 --> 00:49:29,440
It knows everything.
826
00:49:29,440 --> 00:49:32,000
It doesn't understand everything.
827
00:49:32,000 --> 00:49:34,800
And so I take some of what it says.
828
00:49:34,800 --> 00:49:37,800
I ignore some of what it says
because I don't think it understood.
829
00:49:38,280 --> 00:49:41,040
And it's what we call productive friction.
830
00:49:42,480 --> 00:49:46,080
Rather than using AI to make my life
easier.
831
00:49:47,040 --> 00:49:50,760
I use AI to make myself better.
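A minimal sketch of this adversarial-critic prompting pattern, assuming a generic chat-style interface; the persona wording, the CritiqueRequest class, and build_critique_prompt are illustrative placeholders rather than the speaker's actual Gemini setup:

```python
# Illustrative only: ask an LLM to critique a finished draft rather than
# write it. No real API is called here; pass the messages to whichever
# chat model you use (Gemini, GPT, a local model, ...).
from dataclasses import dataclass

CRITIC_PERSONA = (
    "You are my harshest critic. You have spent a career finding every "
    "flaw in everything I have written. Read the essay below and list "
    "every error, weak argument, and point a reader might disagree with. "
    "Do not rewrite it and do not praise it."
)

@dataclass
class CritiqueRequest:
    draft: str  # an essay the author already finished and is happy with

def build_critique_prompt(req: CritiqueRequest) -> list[dict]:
    # The draft stays entirely human-written; the model only pushes back.
    return [
        {"role": "system", "content": CRITIC_PERSONA},
        {"role": "user", "content": req.draft},
    ]

messages = build_critique_prompt(CritiqueRequest(draft="...my finished essay..."))
# You then accept or reject each criticism yourself; the "productive
# friction" step happens in your head, not in the model.
```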
832
00:49:52,560 --> 00:49:54,440
That's what we should all be doing.
833
00:49:54,440 --> 00:49:57,440
Whether you are interested in building
AI yourself,
834
00:49:58,320 --> 00:50:00,640
or you are interested simply
835
00:50:00,640 --> 00:50:03,640
in making use of it
for your everyday life.
836
00:50:03,760 --> 00:50:07,640
Ask yourself this very specific question
837
00:50:08,960 --> 00:50:10,920
first.
838
00:50:10,920 --> 00:50:13,920
Am I better when I'm using it?
839
00:50:14,920 --> 00:50:17,720
And then think not just AI,
but a lot of technologies.
840
00:50:17,720 --> 00:50:20,960
We might admit we're not really
a better person when we use it.
841
00:50:22,600 --> 00:50:25,280
Feel free to interpret
better person as you want.
842
00:50:25,280 --> 00:50:27,960
But for me, it means
843
00:50:27,960 --> 00:50:31,440
when I'm not so long from now, 65,
844
00:50:33,000 --> 00:50:34,520
will I still be alive?
845
00:50:34,520 --> 00:50:35,760
Will I have made lots of money?
846
00:50:35,760 --> 00:50:37,760
Will I have lots of friends?
847
00:50:37,760 --> 00:50:39,720
Will I be happy?
848
00:50:39,720 --> 00:50:42,720
Things
we might all agree make for a better life.
849
00:50:42,880 --> 00:50:46,960
Does this technology, when I'm using it,
give me a better life?
850
00:50:47,320 --> 00:50:49,080
But here's the second question.
851
00:50:49,080 --> 00:50:50,800
And to me it's just as important.
852
00:50:51,960 --> 00:50:54,320
Am I a better person than when I started?
853
00:50:54,320 --> 00:50:57,320
When I turn it off again?
854
00:50:58,200 --> 00:51:00,120
And even amazing
855
00:51:00,120 --> 00:51:04,000
useful technologies
frequently fail that second test.
856
00:51:05,320 --> 00:51:10,280
And I have a whole story
I can share about Google Maps
857
00:51:10,280 --> 00:51:14,320
and automated navigation,
and why it's an astonishing technology
858
00:51:15,440 --> 00:51:18,440
that is increasing dementia,
859
00:51:19,320 --> 00:51:22,320
in everyone around the world.
860
00:51:22,760 --> 00:51:26,160
But I've run out of time
and so you'll have to ask me about it
861
00:51:26,160 --> 00:51:29,160
later
or come chat with me after this event.
862
00:51:29,160 --> 00:51:30,880
It's been a lot of fun.
863
00:51:30,880 --> 00:51:36,080
I guess I never got back around to talking
about collective intelligence, but,
864
00:51:36,600 --> 00:51:39,600
So I'm gonna do this one last little plug.
865
00:51:39,760 --> 00:51:42,480
When the pandemic started,
866
00:51:42,480 --> 00:51:45,320
I have a very strange life.
867
00:51:45,320 --> 00:51:48,320
In addition to my companies
and my academic work.
868
00:51:48,600 --> 00:51:51,320
My real day
job is running a philanthropic lab.
869
00:51:52,320 --> 00:51:55,320
People bring me challenging problems.
870
00:51:55,600 --> 00:51:58,600
And if I think my team
and I can make a meaningful difference,
871
00:51:58,760 --> 00:52:00,400
I pay for everything.
872
00:52:00,400 --> 00:52:03,200
And whatever we invent, we give away.
873
00:52:03,200 --> 00:52:08,160
So it's the stupidest startup idea
anyone's ever come up with.
874
00:52:08,600 --> 00:52:10,680
Endless demand, zero revenue.
875
00:52:10,680 --> 00:52:13,680
But it's also the best job
in the entire world.
876
00:52:13,920 --> 00:52:17,760
And so people write me and they say,
my daughter has a life threatening
877
00:52:17,760 --> 00:52:20,600
egg allergy. Please save her life.
878
00:52:20,600 --> 00:52:23,880
Or they say, our company has bias
in its promotion process.
879
00:52:24,080 --> 00:52:26,800
Help us figure out where it's coming from.
880
00:52:26,800 --> 00:52:30,160
Our country's education system
is failing our students, but we're doing
881
00:52:30,160 --> 00:52:33,160
everything we're supposed to do,
and it still isn't helping.
882
00:52:33,880 --> 00:52:38,760
And again, if I think
I have my unique voice not because
883
00:52:38,760 --> 00:52:41,960
I'm smarter than everyone else,
but because I'm different.
884
00:52:42,840 --> 00:52:45,840
I have become very comfortable
with being different.
885
00:52:45,840 --> 00:52:49,760
Does my unique voice and insight
have something to say about this problem?
886
00:52:50,040 --> 00:52:52,560
And if so, we just do it.
887
00:52:52,560 --> 00:52:56,160
So at the beginning of the pandemic,
I got almost
888
00:52:56,160 --> 00:52:59,240
the identical question
from two separate people.
889
00:52:59,240 --> 00:53:00,360
Except they weren't people.
890
00:53:00,360 --> 00:53:02,760
They were giant multinational companies.
891
00:53:02,760 --> 00:53:06,960
Facebook and Amazon called me up and said,
we just sent everyone home
892
00:53:06,960 --> 00:53:08,040
and we don't know what to do.
893
00:53:09,240 --> 00:53:09,800
And what we
894
00:53:09,800 --> 00:53:12,920
particularly don't know how to do
is how to be inclusive
895
00:53:13,920 --> 00:53:16,840
and how to innovate through a camera.
896
00:53:16,840 --> 00:53:19,840
If everyone's life
is going through a camera
897
00:53:20,200 --> 00:53:23,880
and we never see one another,
and we never have time to talk
898
00:53:24,400 --> 00:53:26,920
at the,
899
00:53:26,920 --> 00:53:29,920
over beers or at the watercooler,
900
00:53:31,720 --> 00:53:34,800
where does that collective intelligence
901
00:53:34,800 --> 00:53:37,800
and the brilliance of people come from?
902
00:53:38,160 --> 00:53:41,120
And because I'm weird,
I thought that sounds like a reasonable
903
00:53:41,120 --> 00:53:42,200
philanthropic project.
904
00:53:42,200 --> 00:53:45,280
I should help two of the largest,
richest companies in the world
905
00:53:45,280 --> 00:53:46,440
answer that question.
906
00:53:46,440 --> 00:53:49,440
But as long as they're fine with me
telling everyone what I found.
907
00:53:49,800 --> 00:53:54,520
Then we dove in
and I got to see everybody's life
908
00:53:55,120 --> 00:53:56,840
going through those cameras.
909
00:53:56,840 --> 00:53:59,840
All of our interactions around documents.
910
00:54:00,000 --> 00:54:04,200
And we discovered that the smartest teams
had three things in common.
911
00:54:04,640 --> 00:54:07,200
They were relatively small, 3 to 5 people.
912
00:54:08,160 --> 00:54:10,800
They had very flat hierarchies.
913
00:54:10,800 --> 00:54:15,200
There might be a Nobel Prize
winner and an undergraduate intern.
914
00:54:15,200 --> 00:54:18,760
There might be the SVP of engineering
915
00:54:18,760 --> 00:54:22,520
and, someone who's just there
for the summer.
916
00:54:22,560 --> 00:54:25,480
It didn't matter
when they were collaborating.
917
00:54:25,480 --> 00:54:29,360
It all disappeared
and everyone took equal turns.
918
00:54:30,480 --> 00:54:33,360
Again, it is you, your unique voice
919
00:54:33,360 --> 00:54:36,360
that has value to this world.
920
00:54:36,440 --> 00:54:39,280
And the last ingredient
921
00:54:39,280 --> 00:54:42,280
is that those teams were diverse.
922
00:54:43,320 --> 00:54:46,880
So I truly am now finally out of time.
923
00:54:46,880 --> 00:54:50,480
But I will simply say this
sounded like a cool thing to build
924
00:54:51,240 --> 00:54:53,760
an artificial intelligence model for,
925
00:54:53,760 --> 00:54:56,520
not one that people would interact with,
926
00:54:56,520 --> 00:54:59,080
but one that we called the matchmaker
927
00:54:59,080 --> 00:55:02,080
and would look through
a social network of people
928
00:55:02,560 --> 00:55:07,400
and find all the matches to make
and break.
929
00:55:08,040 --> 00:55:11,040
Just as importantly,
breaking social connections
930
00:55:11,520 --> 00:55:14,640
to maximize
the collective intelligence of teams.
931
00:55:15,040 --> 00:55:17,400
And what we found is you could double
932
00:55:18,400 --> 00:55:21,840
the innovative output of a team of people
933
00:55:22,080 --> 00:55:25,440
by just shifting around those connections
934
00:55:25,440 --> 00:55:29,120
dynamically, using our AI system.
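A toy sketch of the kind of regrouping such a "matchmaker" could perform, assuming a made-up scoring rule that rewards small, diverse teams; the people dictionary, team_score, and best_split below are illustrative stand-ins, not the actual system:

```python
# Toy illustration only: greedily regroup six people into two small,
# diverse teams. The scoring rule is a made-up stand-in for whatever
# collective-intelligence measure the real "matchmaker" optimized.
from itertools import combinations

people = {
    "Ana":    {"dept": "engineering", "level": "senior"},
    "Bo":     {"dept": "design",      "level": "intern"},
    "Chen":   {"dept": "research",    "level": "senior"},
    "Dana":   {"dept": "sales",       "level": "junior"},
    "Eli":    {"dept": "engineering", "level": "junior"},
    "Fatima": {"dept": "research",    "level": "intern"},
}

def team_score(team: tuple) -> float:
    # Reward diverse backgrounds and mixed seniority (flat hierarchy),
    # and penalize teams outside the 3-to-5-person range.
    depts = {people[p]["dept"] for p in team}
    levels = {people[p]["level"] for p in team}
    size_penalty = 0.0 if 3 <= len(team) <= 5 else -1.0
    return len(depts) + 0.5 * len(levels) + size_penalty

def best_split(names: list) -> list:
    # Brute force over 3-person teams for this tiny example; a real
    # system would need a smarter, dynamic optimizer over a large graph.
    best, best_val = None, float("-inf")
    for first in combinations(names, 3):
        second = tuple(n for n in names if n not in first)
        val = team_score(first) + team_score(second)
        if val > best_val:
            best, best_val = [first, second], val
    return best

print(best_split(list(people)))
```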
935
00:55:29,120 --> 00:55:33,000
And again, it was so exciting
because it's another look
936
00:55:33,640 --> 00:55:36,320
not at how AI can replace us,
937
00:55:36,320 --> 00:55:39,160
but how it can make us better
938
00:55:39,160 --> 00:55:42,120
in this case, not better as individuals,
939
00:55:42,120 --> 00:55:44,720
but by finding the community
940
00:55:44,720 --> 00:55:49,560
with which we shared just enough
trust and enough difference
941
00:55:50,080 --> 00:55:53,520
for the sparks to produce
something truly amazing.
942
00:55:54,120 --> 00:55:56,160
So I'm going to stop here.
943
00:55:56,160 --> 00:56:01,560
But again, I'll say, for anyone that's foolish
enough to want to stay around
944
00:56:01,560 --> 00:56:06,440
and ask me questions, I am happy to spend
as much time here afterwards as I can.
945
00:56:06,960 --> 00:56:09,040
Thank you all so very much.
946
00:56:09,040 --> 00:56:12,040
Let's get it.
947
00:56:15,080 --> 00:56:18,080
Thank you very much.
948
00:56:20,440 --> 00:56:23,440
Okay. The.
949
00:56:24,200 --> 00:56:24,920
First video.
950
00:56:24,920 --> 00:56:26,080
Now is.
951
00:58:55,200 --> 00:58:58,200
No more.
952
00:58:59,560 --> 00:59:00,000
Thank you.
953
00:59:00,000 --> 00:59:01,200
Thank you very much.
954
00:59:01,200 --> 00:59:02,600
got all those questions.
955
00:59:02,600 --> 00:59:05,880
So this is easy for you to answer?
956
00:59:08,160 --> 00:59:11,640
the question
in English or Spanish and, okay.
957
00:59:11,640 --> 00:59:14,640
Got translated.
958
00:59:14,840 --> 00:59:16,200
Thank you very much.
959
00:59:16,200 --> 00:59:18,520
It has been an amazing talk.
960
00:59:18,520 --> 00:59:24,360
I think, the use that you see here,
the research, the AI,
961
00:59:24,920 --> 00:59:27,920
the children we saw, you know,
962
00:59:29,040 --> 00:59:29,880
it's amazing.
963
00:59:29,880 --> 00:59:31,840
So I want to
964
00:59:31,840 --> 00:59:34,840
work with children all week away
today. You.
965
00:59:35,160 --> 00:59:36,080
Yeah.
966
00:59:36,080 --> 00:59:39,240
With children, with ADHD
967
00:59:40,520 --> 00:59:41,280
paper.
968
00:59:41,280 --> 00:59:43,080
I can.
969
00:59:43,080 --> 00:59:46,080
Yeah. So,
970
00:59:48,200 --> 00:59:49,560
I think a whole lot of people
971
00:59:49,560 --> 00:59:54,520
like myself have had, diagnoses of, ADHD.
972
00:59:54,520 --> 00:59:58,680
And interestingly,
the comorbidity, the genetic underpinnings
973
00:59:58,680 --> 01:00:02,080
of many of these sort of track together.
974
01:00:02,080 --> 01:00:04,760
So ADHD,
975
01:00:04,760 --> 01:00:08,840
we certainly don't think of it
as being as challenging as autism
976
01:00:08,840 --> 01:00:11,840
or even things like bipolar
or schizophrenia, but
977
01:00:12,440 --> 01:00:17,400
you see similarities genetically
and where it's emerging from.
978
01:00:17,800 --> 01:00:19,920
So, yes.
979
01:00:19,920 --> 01:00:24,480
one of the things I'm looking at, more
in the neurotechnology side of my life,
980
01:00:24,840 --> 01:00:28,280
one of my companies has developed a,
981
01:00:29,640 --> 01:00:32,640
neuromodulator treatment for Alzheimer's.
982
01:00:33,760 --> 01:00:37,080
so the idea is we stimulate the brain
983
01:00:37,400 --> 01:00:40,400
in a way which,
984
01:00:40,520 --> 01:00:44,640
tricks it into thinking it's working very,
very hard.
985
01:00:45,080 --> 01:00:48,080
And this is good for Alzheimer's
because it
986
01:00:48,080 --> 01:00:51,840
it causes the brain,
particularly support structures
987
01:00:51,840 --> 01:00:54,840
in the brain,
glial cells and astrocytes to help clean
988
01:00:54,840 --> 01:00:57,840
up accumulated,
989
01:00:58,280 --> 01:01:01,280
byproducts of, of neural activity.
990
01:01:01,520 --> 01:01:05,160
And that clearly, is helping,
991
01:01:05,880 --> 01:01:08,160
with Alzheimer's.
992
01:01:08,160 --> 01:01:10,680
But it turns out this idea
993
01:01:10,680 --> 01:01:15,000
that you could stimulate the brain
and you can stimulate it.
994
01:01:15,000 --> 01:01:17,160
We're using light.
995
01:01:17,160 --> 01:01:21,720
So stroboscopic light at 40Hz, 40 times
per second.
996
01:01:21,760 --> 01:01:24,760
the light is flickering, or at eight hertz.
997
01:01:25,760 --> 01:01:28,480
or you can use
electrical stimulation, alternating
998
01:01:28,480 --> 01:01:33,600
current, transcranial stimulation
and some other methodologies.
999
01:01:33,960 --> 01:01:36,800
You can cause these,
1000
01:01:36,800 --> 01:01:40,480
this synchronized activity
to, speed up in the brain.
1001
01:01:41,160 --> 01:01:44,080
And I've looked at traumatic brain
1002
01:01:44,080 --> 01:01:47,800
injury in kids
recovering working memory function.
1003
01:01:48,720 --> 01:01:51,600
there's research in autism looking
1004
01:01:51,600 --> 01:01:55,920
at enhancing functioning there
and intracortical connectivity.
1005
01:01:56,160 --> 01:01:58,440
So this seems to be one of the challenges
with autism.
1006
01:01:58,440 --> 01:02:03,120
is that one part of the brain
isn't talking to other wide-ranging
1007
01:02:03,120 --> 01:02:07,160
areas as well as it might in people
who are neurotypical. And
1008
01:02:08,600 --> 01:02:11,240
certain kinds
of stimulation seem to help with that.
1009
01:02:11,240 --> 01:02:14,480
This is all very early in ADHD.
1010
01:02:14,480 --> 01:02:18,320
It's less clear
exactly what the mechanism is,
1011
01:02:18,760 --> 01:02:22,320
but it does seem that that 40Hz
1012
01:02:22,320 --> 01:02:25,320
stimulation seems to help.
1013
01:02:25,800 --> 01:02:28,000
But it's unclear now.
1014
01:02:28,000 --> 01:02:30,720
I don't want to totally nerd out
in front of everyone else,
1015
01:02:30,720 --> 01:02:33,720
but one of the issues with ADHD
might simply be that
1016
01:02:34,840 --> 01:02:39,040
the resting state of your brain
and another state.
1017
01:02:39,040 --> 01:02:42,880
So what we call default mode
network versus
1018
01:02:43,360 --> 01:02:46,360
executive control network, that those
1019
01:02:46,640 --> 01:02:51,840
those two states of your brain are very
distant and people that have difficulty,
1020
01:02:51,920 --> 01:02:55,440
different kinds of difficulties,
for example, inhibitory control,
1021
01:02:55,760 --> 01:02:59,400
people that commit crimes a lot, it looks like
these states are very different.
1022
01:02:59,400 --> 01:03:03,440
It's hard for them to shift
to a control state
1023
01:03:03,600 --> 01:03:06,240
when they suddenly have an impulse
to do a thing.
1024
01:03:06,240 --> 01:03:08,920
But what it seems like maybe with ADHD
1025
01:03:08,920 --> 01:03:11,920
is that the saliency,
the sort of sensory states
1026
01:03:12,520 --> 01:03:16,680
they're just very active
and very close to the default state.
1027
01:03:16,680 --> 01:03:19,800
And so as soon as you see something,
it kind of takes over and takes
1028
01:03:19,800 --> 01:03:20,800
you into a different place.
1029
01:03:22,600 --> 01:03:24,560
so all I can really say is
1030
01:03:24,560 --> 01:03:29,160
there is some exciting new work
using neuromodulatory triggers.
1031
01:03:29,520 --> 01:03:32,520
And there's generic
artificial intelligence work
1032
01:03:32,920 --> 01:03:36,720
from my lab and others looking at,
1033
01:03:37,840 --> 01:03:40,760
what sort of lived experiences,
1034
01:03:40,760 --> 01:03:44,160
what sort of educational interventions
are effective across different kids?
1035
01:03:44,520 --> 01:03:49,680
So years ago we released a system
called muse, which helps new parents
1036
01:03:50,240 --> 01:03:53,240
with young children
for social emotional development.
1037
01:03:54,560 --> 01:03:57,800
and we were super excited
and had all sorts of great results,
1038
01:03:58,200 --> 01:04:03,760
but it was made not for average kids,
because the whole point of the AI in
1039
01:04:03,760 --> 01:04:06,880
our system
was to figure out who this child is
1040
01:04:07,560 --> 01:04:12,000
and what experiences are right for them,
not for someone else.
1041
01:04:12,960 --> 01:04:16,200
But now we're collaborating
with a group in Vancouver
1042
01:04:16,200 --> 01:04:20,600
that specifically wants to take
this explicitly to non-neurotypical kids.
1043
01:04:20,840 --> 01:04:22,800
So ADHD and
1044
01:04:23,800 --> 01:04:25,680
other, let's
1045
01:04:25,680 --> 01:04:29,440
call them non-clinical-level
learning challenges.
1046
01:04:29,440 --> 01:04:34,600
And I think there's a huge space
of just being able to identify
1047
01:04:34,800 --> 01:04:37,800
which behavioral therapies
will actually be effective,
1048
01:04:38,120 --> 01:04:42,480
as well as making space
for doing noninvasive
1049
01:04:43,560 --> 01:04:45,720
therapies that could also help as well.
1050
01:04:45,720 --> 01:04:49,200
And, you know, I don't want to rule out
the possibility of primary
1051
01:04:49,880 --> 01:04:51,640
treatment, drug treatments helping.
1052
01:04:51,640 --> 01:04:57,120
But I think the first step
in most of these, the way I think of it
1053
01:04:57,120 --> 01:05:00,120
as a computational neuroscientist, is
1054
01:05:00,360 --> 01:05:03,360
I want to get these different states.
1055
01:05:03,360 --> 01:05:05,560
I want to get the default state closer
to the one
1056
01:05:05,560 --> 01:05:08,560
we want and farther
from the one that's causing the problem.
1057
01:05:09,040 --> 01:05:12,040
And that's a very new way of thinking
about this sort of a problem.
1058
01:05:12,840 --> 01:05:17,080
and if you ever want to talk more,
be happy to follow up on it.
1059
01:05:17,080 --> 01:05:18,920
But there is work in this space.
1060
01:05:18,920 --> 01:05:21,520
I have a
my daughter has been diagnosed with,
1061
01:05:23,880 --> 01:05:26,800
we work with,
1062
01:05:26,800 --> 01:05:28,680
I wish, you know.
1063
01:05:28,680 --> 01:05:33,200
People are genuinely working hard on this,
but I will say, having a kid
1064
01:05:33,200 --> 01:05:37,720
with diabetes, what I have been told
is for the last 50 years,
1065
01:05:37,720 --> 01:05:41,120
people have been saying, as soon as,
you know, when my kid got,
1066
01:05:42,240 --> 01:05:45,760
diagnosed, they told me
a cure was right around the corner.
1067
01:05:45,760 --> 01:05:48,120
And for 50 years we've been waiting.
1068
01:05:48,120 --> 01:05:50,200
So I don't want to over promise.
1069
01:05:50,200 --> 01:05:52,320
But we are making progress.
1070
01:05:52,320 --> 01:05:55,320
Yeah.
1071
01:06:00,760 --> 01:06:09,400
And, My vision for this
1072
01:06:09,600 --> 01:06:12,600
as we kind of,
1073
01:06:13,920 --> 01:06:15,280
as I think please remember
1074
01:06:15,280 --> 01:06:18,280
in the speech you mentioned.
1075
01:06:20,600 --> 01:06:25,160
and so how can you use, how we teach,
1076
01:06:25,800 --> 01:06:30,000
using the use to make people, a little.
1077
01:06:34,680 --> 01:06:36,400
So, hopefully
1078
01:06:36,400 --> 01:06:39,520
we can get you to use 3 million.
1079
01:06:41,840 --> 01:06:44,320
we have been involved in,
1080
01:06:46,400 --> 01:06:49,400
in these issues from,
1081
01:06:51,200 --> 01:06:54,840
the war and in the post-World War II.
1082
01:06:56,200 --> 01:06:58,080
So we have,
1083
01:06:58,080 --> 01:07:01,480
approaches
that people can use about people
1084
01:07:01,520 --> 01:07:04,520
that take place within the world.
1085
01:07:04,680 --> 01:07:06,800
We use in this
1086
01:07:06,800 --> 01:07:09,800
been used in school. So,
1087
01:07:10,560 --> 01:07:13,560
simplification and this the
1088
01:07:14,120 --> 01:07:17,120
how can we go from, in order to world.
1089
01:07:21,600 --> 01:07:23,640
There is no doubt
1090
01:07:23,640 --> 01:07:27,480
that two of the biggest consumers
of artificial intelligence,
1091
01:07:27,680 --> 01:07:31,360
though they will never admit it
publicly, are the U.S
1092
01:07:31,360 --> 01:07:34,360
and the Chinese military.
1093
01:07:35,800 --> 01:07:38,800
but even aside from that,
1094
01:07:39,920 --> 01:07:42,960
OpenAI and Microsoft, Google,
1095
01:07:44,240 --> 01:07:47,400
Baidu in China. So two countries
1096
01:07:48,400 --> 01:07:50,960
and maybe six companies
1097
01:07:50,960 --> 01:07:54,800
effectively control the artificial
intelligence infrastructure of the planet.
1098
01:07:55,120 --> 01:07:55,920
It's not that there aren't
1099
01:07:55,920 --> 01:07:58,920
other people doing interesting things
with AI in the world,
1100
01:08:00,120 --> 01:08:03,080
but if you're going to use
a large language model,
1101
01:08:03,080 --> 01:08:08,080
there are only so many
that actually exist, and all of them
1102
01:08:08,080 --> 01:08:11,920
are trained on the infrastructure
of either Amazon, Google or Baidu.
1103
01:08:12,600 --> 01:08:16,200
So, or the internal infrastructure
of a company like Facebook.
1104
01:08:16,800 --> 01:08:19,120
So that's problematic.
1105
01:08:21,720 --> 01:08:24,720
I have two answers to your question.
1106
01:08:24,800 --> 01:08:27,120
One is an essay.
1107
01:08:27,120 --> 01:08:28,680
I just wrote, an op ed
1108
01:08:28,680 --> 01:08:32,160
that was published in the Financial Times
of the United Kingdom,
1109
01:08:33,160 --> 01:08:36,160
and it had this very provocative opening
1110
01:08:37,280 --> 01:08:41,360
As it is being used today,
artificial intelligence
1111
01:08:41,640 --> 01:08:44,560
is fundamentally incompatible
1112
01:08:44,560 --> 01:08:47,560
with any modern notion of civil rights.
1113
01:08:48,600 --> 01:08:50,520
The idea that you might be
1114
01:08:50,520 --> 01:08:53,680
denied a job or a loan,
1115
01:08:54,480 --> 01:08:56,880
or that you might receive
1116
01:08:56,880 --> 01:08:59,880
additional scrutiny from the police,
1117
01:09:01,200 --> 01:09:03,960
you have no control over that.
1118
01:09:03,960 --> 01:09:07,920
And I know this
because some of those were my companies.
1119
01:09:08,680 --> 01:09:14,360
One of my most successful companies
was using AI in hiring. My customer
1120
01:09:14,360 --> 01:09:15,560
wasn't you.
1121
01:09:15,560 --> 01:09:18,600
My customer was Google
and Citibank and Nestlé.
1122
01:09:19,600 --> 01:09:21,800
So we built an AI
1123
01:09:21,800 --> 01:09:24,480
that figured out who they should hire.
1124
01:09:24,480 --> 01:09:28,600
We didn't
build an AI to tell them to hire you.
1125
01:09:29,560 --> 01:09:32,640
So, and
my new company's working in health,
1126
01:09:32,640 --> 01:09:35,680
working on Alzheimer's,
working on postpartum depression and more.
1127
01:09:37,640 --> 01:09:41,920
let's say you go to a different doctor
because you want another diagnosis,
1128
01:09:41,920 --> 01:09:45,440
but they're just running the exact same
AI algorithm,
1129
01:09:46,200 --> 01:09:48,480
so you get the exact same diagnosis.
1130
01:09:48,480 --> 01:09:51,280
And probably if you're in America,
1131
01:09:51,280 --> 01:09:54,280
your insurance company
is not going to pay for it anyways.
1132
01:09:54,840 --> 01:09:57,440
So what are we going to do?
1133
01:09:57,440 --> 01:10:01,400
So the end of that essay, my argument was
1134
01:10:02,040 --> 01:10:05,040
just as we have a right to a doctor
1135
01:10:05,360 --> 01:10:09,440
or a lawyer acting solely in our interest,
1136
01:10:10,880 --> 01:10:14,080
access to AI acting only for us
1137
01:10:14,560 --> 01:10:17,080
must be a civil right itself.
1138
01:10:17,080 --> 01:10:20,080
Otherwise
civil rights just doesn't make sense.
1139
01:10:20,200 --> 01:10:23,680
If groups that you might be concerned
about that
1140
01:10:23,680 --> 01:10:28,560
I'm concerned about
are making decisions in 50 milliseconds
1141
01:10:29,760 --> 01:10:32,760
in a server farm in Wyoming,
1142
01:10:32,800 --> 01:10:35,800
and you never even knew
that decision was being made,
1143
01:10:37,000 --> 01:10:40,000
how would you be able
to take action against it?
1144
01:10:40,200 --> 01:10:43,480
So in that spirit,
I've tried to do something different
1145
01:10:43,480 --> 01:10:46,280
with my company, Dionysus Health, that's
1146
01:10:46,280 --> 01:10:49,600
working on postpartum
depression epigenetics.
1147
01:10:50,240 --> 01:10:53,200
I made a promise
before we ever founded the company.
1148
01:10:53,200 --> 01:10:56,200
At the same time,
we would found an independent
1149
01:10:56,640 --> 01:10:59,640
nonprofit, what's called the Data Trust.
1150
01:11:00,600 --> 01:11:04,160
And it, not my company,
1151
01:11:04,320 --> 01:11:08,120
would own all of the data
and run all of the algorithms.
1152
01:11:08,840 --> 01:11:13,000
And its sole reason
for existing, in fancy language,
1153
01:11:13,000 --> 01:11:18,040
its sole fiduciary responsibility, is
the mothers from whom that data is coming.
1154
01:11:19,240 --> 01:11:22,240
So legally,
we've created a legal framework
1155
01:11:22,680 --> 01:11:26,080
in which our AI's sole purpose
1156
01:11:26,640 --> 01:11:29,640
is to help the health
outcomes of those moms.
1157
01:11:29,880 --> 01:11:33,000
I happen to believe it's
actually a good business decision.
1158
01:11:33,840 --> 01:11:36,840
If we stop arguing over who owns the data
1159
01:11:37,120 --> 01:11:40,640
and instead thought, does my AI save lives?
1160
01:11:41,360 --> 01:11:44,360
Is it actually making people's lives
better?
1161
01:11:44,520 --> 01:11:47,520
Is my business
based on having the best product,
1162
01:11:48,000 --> 01:11:50,520
or do I have some unique data
1163
01:11:50,520 --> 01:11:54,000
set that no one else has that
I stole from someone else?
1164
01:11:54,000 --> 01:11:55,400
Anyways?
1165
01:11:55,400 --> 01:11:58,320
Well, if we could stop arguing about that,
1166
01:11:58,320 --> 01:12:02,680
we could build systems
that were genuinely meant to help people.
1167
01:12:03,240 --> 01:12:06,680
Now I realize I'm being a little
idealistic about it here, but
1168
01:12:07,840 --> 01:12:08,840
part of what I wanted to
1169
01:12:08,840 --> 01:12:13,880
just show is someone could found a company
based on artificial intelligence
1170
01:12:14,720 --> 01:12:17,520
and make an explicit commitment
1171
01:12:17,520 --> 01:12:22,200
that it would only serve the interests
of the people that we were trying to help.
1172
01:12:23,160 --> 01:12:25,040
We're going to make lots of money.
1173
01:12:25,040 --> 01:12:27,720
We have the only test in the world,
and it's a test
1174
01:12:27,720 --> 01:12:30,720
for a condition that saves lives.
1175
01:12:31,320 --> 01:12:34,080
so if we can't make a business
out of that,
1176
01:12:34,080 --> 01:12:36,440
we should stop pretending to be business
people.
1177
01:12:36,440 --> 01:12:39,080
I don't need to also sell people's data.
1178
01:12:39,080 --> 01:12:39,760
I don't need to.
1179
01:12:39,760 --> 01:12:42,520
Also abuse the data
and trick them into doing things.
1180
01:12:42,520 --> 01:12:45,520
So this is this is hard.
1181
01:12:46,120 --> 01:12:49,680
This knowledge of how to build an
AI is out there in the world,
1182
01:12:50,880 --> 01:12:54,400
and the ability to do it
at massive scale is controlled
1183
01:12:54,400 --> 01:12:59,120
by a very small number
of organizations and countries.
1184
01:13:00,240 --> 01:13:02,360
What we
1185
01:13:02,360 --> 01:13:06,360
in Mexico,
the United States, and around the world,
1186
01:13:07,120 --> 01:13:10,120
all of the rest of us, are going to decide
1187
01:13:10,680 --> 01:13:14,040
is how this
AI is going to affect our lives.
1188
01:13:15,440 --> 01:13:19,360
And I'm not saying this again,
as someone that is reflexively against AI,
1189
01:13:19,760 --> 01:13:22,600
it's amazing what it can do.
1190
01:13:22,600 --> 01:13:26,400
We have a
right to its benefits in our lives,
1191
01:13:27,680 --> 01:13:30,680
but we also have a right
1192
01:13:30,840 --> 01:13:33,360
to decide what those benefits
1193
01:13:33,360 --> 01:13:37,080
should be for ourselves
and the trust to know
1194
01:13:37,080 --> 01:13:40,560
that that is happening
in the way we want it to happen.
1195
01:13:41,120 --> 01:13:44,120
So the simple truth is,
1196
01:13:44,160 --> 01:13:46,800
people are going to abuse this stuff.
1197
01:13:46,800 --> 01:13:48,720
What we need to decide
1198
01:13:48,720 --> 01:13:51,800
is what are the rules
we want to put in place
1199
01:13:52,680 --> 01:13:56,800
that bring the most benefits while
minimizing the harms as much as possible.
1200
01:13:57,720 --> 01:14:00,000
And unfortunately, right now
1201
01:14:00,000 --> 01:14:04,240
we live in a world where people like
Elon Musk and Sam Altman,
1202
01:14:06,720 --> 01:14:08,800
on one day are warning about
1203
01:14:08,800 --> 01:14:12,280
the evils of AI as they
themselves are building them,
1204
01:14:13,120 --> 01:14:16,120
which should make us very suspicious.
1205
01:14:18,360 --> 01:14:20,880
when Sam Altman
1206
01:14:20,880 --> 01:14:22,800
actually just before my op ed
1207
01:14:22,800 --> 01:14:25,800
came out, Sam Altman
1208
01:14:26,400 --> 01:14:31,480
in an interview said,
you know, will judges
1209
01:14:31,520 --> 01:14:36,040
be able to subpoena your AI,
your personal assistant,
1210
01:14:37,080 --> 01:14:40,240
so that they would be able to use
that information against you in court?
1211
01:14:41,480 --> 01:14:46,720
And that is a legitimate issue
also addressed in my op ed.
1212
01:14:46,720 --> 01:14:51,520
What I note he's not pointing out
is he already has all that information
1213
01:14:51,800 --> 01:14:54,520
and is happily using it against us.
1214
01:14:54,520 --> 01:14:58,200
It's a little bit like the snake
from The Jungle Book warning
1215
01:14:58,520 --> 01:15:01,200
the little kid about the tiger.
1216
01:15:01,200 --> 01:15:04,200
I'm worried about both of them.
1217
01:15:05,240 --> 01:15:09,000
we're the ones that need the protections,
not Sam Altman.
1218
01:15:09,680 --> 01:15:11,080
and certainly not the US government.
1219
01:15:11,080 --> 01:15:14,080
So your concerns are well-founded,
1220
01:15:14,720 --> 01:15:17,880
but there are people
trying to build these things in the world
1221
01:15:18,440 --> 01:15:21,400
on principles
other than how rich can we get
1222
01:15:21,400 --> 01:15:23,120
and how much of
the world we can take over?
1223
01:15:24,240 --> 01:15:26,160
but in the end,
1224
01:15:26,160 --> 01:15:29,160
it's our decisions that matter.
1225
01:15:29,880 --> 01:15:32,880
it's our ability to,
1226
01:15:34,640 --> 01:15:37,160
to collectively decide
1227
01:15:37,160 --> 01:15:39,480
what we want these boundaries to be.
1228
01:15:39,480 --> 01:15:41,080
It shouldn't be me.
1229
01:15:41,080 --> 01:15:44,720
Just because I build them
first doesn't mean I get to decide.
1230
01:15:46,280 --> 01:15:49,280
I want this to not be treated
like it's science fiction.
1231
01:15:49,320 --> 01:15:52,160
I think we feel that way
maybe today about AI.
1232
01:15:52,160 --> 01:15:55,160
But I'll also say about neuro
technologies.
1233
01:15:55,200 --> 01:15:58,200
Elon Musk
also has a neurotechnology company.
1234
01:15:58,320 --> 01:16:01,640
And trust me,
if you didn't like having Facebook
1235
01:16:01,640 --> 01:16:05,120
knowing everything about you,
just wait until they get inside
1236
01:16:05,440 --> 01:16:08,040
your head, literally.
1237
01:16:08,040 --> 01:16:12,480
Let's take these issues
genuinely seriously now.
1238
01:16:12,840 --> 01:16:17,040
And ultimately, I think the best way to do
it is to empower individual people
1239
01:16:17,360 --> 01:16:20,960
to be able to truly make decisions
about their own data
1240
01:16:21,240 --> 01:16:24,200
and their own interactions
with artificial intelligence,
1241
01:16:24,200 --> 01:16:27,200
rather than broad guidelines.
1242
01:16:27,280 --> 01:16:29,640
Or on the other side, just
1243
01:16:30,720 --> 01:16:33,720
trust me, I'm here to save the world.
1244
01:16:33,960 --> 01:16:36,360
so I wish I had a more concrete
1245
01:16:36,360 --> 01:16:40,320
answer for you, but all I can say
is what I personally am doing,
1246
01:16:40,320 --> 01:16:45,080
and I'm hoping that we'll be able
to prove something with our new trust.
1247
01:16:45,720 --> 01:16:48,720
So there were a number of hands up.
1248
01:16:54,040 --> 01:16:54,480
Okay.
1249
01:16:54,480 --> 01:16:56,480
And get first.
1250
01:16:56,480 --> 01:16:59,480
thank you for an interesting talk.
1251
01:17:00,000 --> 01:17:03,480
well, I have a question for AI since,
1252
01:17:05,040 --> 01:17:08,040
I've been doing these,
1253
01:17:08,240 --> 01:17:10,120
tools or
1254
01:17:10,120 --> 01:17:12,720
make better some of them.
1255
01:17:12,720 --> 01:17:15,680
And really, changed my life.
1256
01:17:15,680 --> 01:17:19,600
I guess some need for,
to build up my state of the brain.
1257
01:17:20,000 --> 01:17:23,240
and, I and some other to do
1258
01:17:23,240 --> 01:17:26,960
and make my life easier.
1259
01:17:26,960 --> 01:17:29,480
And in number three.
1260
01:17:29,480 --> 01:17:34,120
But those two are only useful for me for
a moment.
1261
01:17:34,120 --> 01:17:36,080
And I think there
1262
01:17:37,160 --> 01:17:38,200
might be many
1263
01:17:38,200 --> 01:17:41,640
people doing the same
as I'm doing at this moment.
1264
01:17:42,200 --> 01:17:46,480
But we have the resources
that we must have.
1265
01:17:47,760 --> 01:17:50,280
And so I
1266
01:17:50,280 --> 01:17:53,200
would like to help other people
1267
01:17:53,200 --> 01:17:58,200
and to have these tools
to make our lives better.
1268
01:17:58,200 --> 01:18:01,520
But my question for you is:
1269
01:18:03,600 --> 01:18:05,880
What or where
1270
01:18:05,880 --> 01:18:10,240
should I start if I want to?
1271
01:18:10,560 --> 01:18:14,120
You have any and the air.
1272
01:18:15,720 --> 01:18:17,800
What should be my first step?
1273
01:18:17,800 --> 01:18:20,800
Yes. Where can I go?
1274
01:18:21,640 --> 01:18:22,960
yeah.
1275
01:18:22,960 --> 01:18:25,080
So there are different ways
to answer that question.
1276
01:18:25,080 --> 01:18:29,800
One is whether you specifically
want to have an impact in the AI world.
1277
01:18:29,800 --> 01:18:31,200
And even then, we could break it up.
1278
01:18:31,200 --> 01:18:35,560
Do you mean the bleeding edge
mathematical researchers
1279
01:18:35,560 --> 01:18:38,600
that are looking at convergence proofs
and complexity
1280
01:18:38,600 --> 01:18:42,880
theorems, around
what's possible with given structures?
1281
01:18:44,160 --> 01:18:46,400
or, you know, the build
1282
01:18:46,400 --> 01:18:49,800
community that's really going out
and trying to make something.
1283
01:18:50,120 --> 01:18:53,400
And even then I would say that
there's a whole other space
1284
01:18:53,520 --> 01:18:56,760
which is just going and building things,
1285
01:18:58,800 --> 01:19:00,400
that, that they happen to use.
1286
01:19:00,400 --> 01:19:05,400
AI is secondary, that they solve
a problem of interest is primary.
1287
01:19:06,920 --> 01:19:09,200
so that's really what
that latter group was,
1288
01:19:09,200 --> 01:19:12,200
what the experience was for me.
1289
01:19:12,360 --> 01:19:17,360
I didn't plan on studying,
machine learning.
1290
01:19:18,200 --> 01:19:20,840
I couldn't have.
I mean, I was a big science
1291
01:19:20,840 --> 01:19:23,920
fiction nerd as a kid,
so of course, I'd read.
1292
01:19:24,760 --> 01:19:27,000
I don't know what the first thing was;
1293
01:19:27,000 --> 01:19:30,360
there is an AI in an episode of Star Trek.
1294
01:19:30,600 --> 01:19:33,360
Maybe that was the very first
AI I'd ever seen.
1295
01:19:33,360 --> 01:19:35,520
There's the AI in 2001,
1296
01:19:35,520 --> 01:19:38,680
but as a very little kid, I
probably wouldn't have watched that movie.
1297
01:19:39,600 --> 01:19:44,640
but that was my and most people's
introduction to artificial intelligence.
1298
01:19:45,600 --> 01:19:47,400
For me, it's interesting
1299
01:19:47,400 --> 01:19:51,560
and noteworthy
that AI was a secondary thing.
1300
01:19:51,840 --> 01:19:56,280
I was studying brains and people,
and then I got introduced
1301
01:19:56,280 --> 01:19:59,880
to a specific subfield,
theoretical neuroscience.
1302
01:20:01,440 --> 01:20:04,440
where kind of like theoretical physics,
1303
01:20:04,640 --> 01:20:07,720
we would use mathematical principles
to study the brain.
1304
01:20:08,400 --> 01:20:11,720
And it turns out
if that's where you're starting from,
1305
01:20:12,240 --> 01:20:15,360
then you very quickly become a machine
learning researcher
1306
01:20:15,880 --> 01:20:20,840
because it's mathematically principled
and it looks like the brain
1307
01:20:20,840 --> 01:20:23,840
does something
at least roughly similar to that
1308
01:20:24,600 --> 01:20:26,880
usually, especially for early vision
1309
01:20:26,880 --> 01:20:29,880
and early audition.
1310
01:20:29,880 --> 01:20:32,880
so then for me, AI was always a tool.
1311
01:20:33,440 --> 01:20:36,760
It was always a way to understand
something else.
1312
01:20:37,560 --> 01:20:42,920
So that was, and is, how I have always
engaged with it in this way.
1313
01:20:42,920 --> 01:20:46,360
It's kind of funny because then
I get introduced as being an AI expert.
1314
01:20:46,360 --> 01:20:48,240
Well, let's be fair.
1315
01:20:48,240 --> 01:20:51,800
There are people that spend
every minute of their life
1316
01:20:51,920 --> 01:20:57,120
working on the next LLM
or the next generative model.
1317
01:20:57,120 --> 01:20:58,200
Actually, the guy who came up
1318
01:20:58,200 --> 01:21:01,560
with the original diffusion
model is a former lab mate of mine,
1319
01:21:02,040 --> 01:21:06,080
He is the original first author
on the very first paper, a brilliant guy
1320
01:21:06,280 --> 01:21:10,000
who then wrote a whole essay
apologizing for it later,
1321
01:21:11,280 --> 01:21:11,760
because of
1322
01:21:11,760 --> 01:21:14,760
how he didn't like how it's being used.
1323
01:21:15,520 --> 01:21:16,800
and, you know,
1324
01:21:16,800 --> 01:21:20,360
he put a lot of work, for example,
into using AI modeling in education.
1325
01:21:20,360 --> 01:21:23,880
That was something he thought
was worthwhile, whereas putting artists
1326
01:21:23,880 --> 01:21:26,880
out of business
didn't feel like a great job to him.
1327
01:21:28,880 --> 01:21:31,640
so there are all these different ways
you could go about doing it.
1328
01:21:31,640 --> 01:21:34,240
And I'm not saying my way
is the right way.
1329
01:21:34,240 --> 01:21:37,200
The right way for you
is kind of where you want to take it.
1330
01:21:37,200 --> 01:21:40,680
There are communities out there
kind of around the world,
1331
01:21:42,360 --> 01:21:45,360
that are in these build spaces,
1332
01:21:45,760 --> 01:21:48,480
I'm sure, for example,
that there are Reddit communities
1333
01:21:48,480 --> 01:21:51,760
working on stuff like this,
although you couldn't get me
1334
01:21:51,760 --> 01:21:54,840
to spend 30 seconds on Reddit for anything ever.
1335
01:21:58,040 --> 01:22:00,640
So, yeah,
1336
01:22:00,640 --> 01:22:03,640
you know, there's a lot, at UC Berkeley,
1337
01:22:04,360 --> 01:22:07,360
there is a really vibrant community,
1338
01:22:09,040 --> 01:22:10,920
hackers that just get together.
1339
01:22:10,920 --> 01:22:15,360
So I'm the chair of the Neurotic Collider
Lab at UC Berkeley,
1340
01:22:15,840 --> 01:22:18,920
the chair of the external board, and,
1341
01:22:20,520 --> 01:22:24,040
What we do is just students
from across the university,
1342
01:22:24,040 --> 01:22:26,320
the neuroscience Institute,
the School of Engineering,
1343
01:22:26,320 --> 01:22:28,680
even the business school,
they all come together,
1344
01:22:28,680 --> 01:22:31,600
they form up a team
and they try and build a product.
1345
01:22:31,600 --> 01:22:35,680
And then, you know, we take it from there
and it's really cool.
1346
01:22:36,240 --> 01:22:38,840
Most of what they come up with
doesn't go anywhere.
1347
01:22:38,840 --> 01:22:43,520
Some of it's genuinely interesting
and a whole lot of it uses AI.
1348
01:22:43,520 --> 01:22:46,600
I mean, you can't do modern day
neuroscience
1349
01:22:46,600 --> 01:22:49,600
without machine learning.
1350
01:22:49,720 --> 01:22:53,720
and but they're in the computer
science department at Cal.
1351
01:22:53,720 --> 01:22:57,680
We have a whole new school of data
and society,
1352
01:22:58,360 --> 01:22:59,560
where people are doing stuff like this.
1353
01:22:59,560 --> 01:23:03,280
I'm not pitching you on Cal,
but I bet there's a hacker
1354
01:23:03,280 --> 01:23:06,960
culture around here that's doing that
same sort of thing of building
1355
01:23:06,960 --> 01:23:09,960
and trying out ideas
and seeing what connects.
1356
01:23:10,760 --> 01:23:14,360
You don't happen to be down the street
from,
1357
01:23:15,240 --> 01:23:19,640
you know, a $1 billion venture fund
that could look at it.
1358
01:23:19,640 --> 01:23:24,720
But, I know there's some money
in this town and there's probably some VCs
1359
01:23:24,720 --> 01:23:27,120
that are playing around here
looking for interesting ideas.
1360
01:23:28,280 --> 01:23:28,760
I will
1361
01:23:28,760 --> 01:23:31,760
simply say, whatever approach you take,
1362
01:23:32,240 --> 01:23:35,240
put the problem first.
1363
01:23:35,880 --> 01:23:39,120
AI is super cool, but again, still a tool.
1364
01:23:40,040 --> 01:23:42,240
Overwhelmingly,
1365
01:23:42,240 --> 01:23:45,320
the hardest thing is finding
the right question.
1366
01:23:46,440 --> 01:23:49,920
When Amazon tried to hire me
to be their chief scientist,
1367
01:23:49,920 --> 01:23:54,360
one of the projects I would have
run is a deep neural network
1368
01:23:54,560 --> 01:23:57,000
that would help speed up
the hiring process,
1369
01:23:57,000 --> 01:24:00,320
that would filter out
who they ought to bring in for interviews.
1370
01:24:00,840 --> 01:24:03,200
And I saw what they were doing with it,
1371
01:24:03,200 --> 01:24:06,880
and I told them at the time,
this isn't going to work.
1372
01:24:07,360 --> 01:24:11,960
And then it actually came out publicly
1373
01:24:12,400 --> 01:24:15,840
that it just would not hire women,
1374
01:24:16,680 --> 01:24:19,600
which is exactly what I expected.
1375
01:24:19,600 --> 01:24:21,160
but they're not necessarily bad people.
1376
01:24:21,160 --> 01:24:23,160
So they did
everything you're supposed to do.
1377
01:24:23,160 --> 01:24:26,680
They took out all the women's colleges
and all the names of everyone off the.
1378
01:24:26,720 --> 01:24:30,000
They took out everything
that could trigger a human being.
1379
01:24:30,000 --> 01:24:32,840
So you couldn't see that
this job applicant was a woman.
1380
01:24:32,840 --> 01:24:36,880
And the AI did something amazing,
something no human could do.
1381
01:24:37,200 --> 01:24:39,120
It figured out
who was a woman and wouldn't hire them.
1382
01:24:40,120 --> 01:24:43,040
So why did the whole system fail?
1383
01:24:43,040 --> 01:24:45,480
Because Amazon hired
1384
01:24:45,480 --> 01:24:48,480
a bunch of brilliant young people, largely men.
1385
01:24:48,560 --> 01:24:51,560
That doesn't make them bad,
but it just was the case.
1386
01:24:52,080 --> 01:24:56,840
And for the seven years
prior to that, working on their PhDs,
1387
01:24:58,000 --> 01:25:01,280
their advisor had said,
all right, we're going to build an AI,
1388
01:25:01,440 --> 01:25:05,240
and it's going to identify
all of the dog breeds in photographs,
1389
01:25:05,520 --> 01:25:08,520
which is an actual competition
that takes place every year.
1390
01:25:09,480 --> 01:25:12,560
and here is all the training data.
1391
01:25:12,880 --> 01:25:15,240
Here's all of the right answers.
1392
01:25:15,240 --> 01:25:18,400
And the question,
is there a dog in this picture?
1393
01:25:18,400 --> 01:25:21,320
And if there is, what breed is it?
1394
01:25:21,320 --> 01:25:22,120
Never.
1395
01:25:22,120 --> 01:25:25,120
That's a perfect example of a
well-posed question.
1396
01:25:25,920 --> 01:25:28,920
Amazon was confronting
an ill-posed question:
1397
01:25:29,280 --> 01:25:32,280
how could an AI improve our hiring?
1398
01:25:32,880 --> 01:25:35,880
And the first thing you have to do there
is figure out what the right question is.
1399
01:25:36,840 --> 01:25:38,680
What Amazon chose was,
1400
01:25:40,080 --> 01:25:41,560
let's build an AI
1401
01:25:41,560 --> 01:25:44,800
and train it on who got a promotion
in their first year at Amazon.
1402
01:25:45,920 --> 01:25:47,720
Well, guess what?
1403
01:25:47,720 --> 01:25:50,280
The single biggest correlate of
1404
01:25:50,280 --> 01:25:53,280
getting a promotion
your first year at Amazon is being male.
1405
01:25:53,600 --> 01:25:56,440
So that's what the
AI learned, no matter what they did,
1406
01:25:56,440 --> 01:25:59,440
no matter how they tried to retrain it,
that's all it would learn
1407
01:25:59,760 --> 01:26:02,520
because it doesn't know
what makes a great employee.
1408
01:26:02,520 --> 01:26:05,040
It just knows the correlations.
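A synthetic, illustrative-only sketch of that failure mode: a model trained against a biased proxy label ("promoted in year one") rediscovers gender from an innocuous-looking feature even after gender is removed from the inputs. Every name and number here (make_resume, years_rugby, the promotion rates) is invented for the sketch:

```python
# Synthetic demo only: a biased proxy label lets even a crude model sort
# applicants by gender using a feature that merely correlates with it.
import random

random.seed(0)

def make_resume(is_male: bool) -> dict:
    # "years_rugby" stands in for any feature that happens to correlate
    # with gender in the historical data (word choice, clubs, etc.).
    return {
        "years_rugby": random.gauss(2.0 if is_male else 0.3, 0.5),
        "promoted": random.random() < (0.30 if is_male else 0.12),  # biased label
    }

data = [make_resume(is_male=(i % 2 == 0)) for i in range(10_000)]

# Even a "model" as crude as a single threshold on the proxy feature
# ends up splitting applicants almost exactly along gender lines.
threshold = sorted(r["years_rugby"] for r in data)[len(data) // 2]

def predict_promoted(r: dict) -> bool:
    return r["years_rugby"] > threshold

flagged = [r for r in data if predict_promoted(r)]
rest = [r for r in data if not predict_promoted(r)]
print(sum(r["promoted"] for r in flagged) / len(flagged))  # roughly 0.30, mostly men
print(sum(r["promoted"] for r in rest) / len(rest))        # roughly 0.12, mostly women
```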
1409
01:26:05,040 --> 01:26:06,240
Same as GPT.
1410
01:26:06,240 --> 01:26:08,360
It doesn't know language.
1411
01:26:08,360 --> 01:26:10,160
It knows the correlations. It produces
1412
01:26:10,160 --> 01:26:11,840
the next best word.
1413
01:26:11,840 --> 01:26:14,440
Both of them did amazing things, one
1414
01:26:14,440 --> 01:26:17,440
dramatically wrong,
but still kind of amazing.
1415
01:26:17,600 --> 01:26:20,600
And the other,
1416
01:26:20,880 --> 01:26:22,480
it produces errors.
1417
01:26:22,480 --> 01:26:26,360
Again, if we misunderstand
a large language model
1418
01:26:26,360 --> 01:26:29,360
as actually a reasoning machine, that's
1419
01:26:29,720 --> 01:26:32,720
really, truly thinking things out,
1420
01:26:32,880 --> 01:26:35,880
then you're mystified as to why
1421
01:26:35,880 --> 01:26:39,480
it just makes things up every now and then
and completely confabulates.
1422
01:26:39,880 --> 01:26:41,600
So when you're working on a problem,
1423
01:26:42,880 --> 01:26:45,880
start question first.
1424
01:26:46,480 --> 01:26:50,760
What is the problem that I want to solve
so much
1425
01:26:50,760 --> 01:26:54,440
that I'm willing to put time into it,
even if no one ever knows that I did.
1426
01:26:55,080 --> 01:26:59,520
And if this is a problem,
why is it a problem?
1427
01:26:59,640 --> 01:27:01,840
Why haven't we figured it out?
1428
01:27:01,840 --> 01:27:05,520
Because if it's still a problem today,
that almost certainly means
1429
01:27:05,760 --> 01:27:08,760
no one is thinking about it correctly.
1430
01:27:09,360 --> 01:27:12,880
So now you have a place
1431
01:27:13,120 --> 01:27:17,160
where you can build some really cool
AI to do something interesting.
1432
01:27:18,080 --> 01:27:20,880
Only once you've reached that point.
1433
01:27:20,880 --> 01:27:23,320
So I wish I had a recommendation
for a bunch of really
1434
01:27:23,320 --> 01:27:26,520
cool communities
for you to connect with here in Mexico.
1435
01:27:26,520 --> 01:27:30,120
or around the world. I have the social skills
of an autistic badger.
1436
01:27:30,400 --> 01:27:34,720
I fake it incredibly well on a stage,
but I enjoy talking
1437
01:27:34,760 --> 01:27:38,520
to random people about as much as I enjoy
poking out my eyes.
1438
01:27:38,520 --> 01:27:41,840
So I love going into my lab
1439
01:27:42,360 --> 01:27:45,360
all by myself and building something new.
1440
01:27:47,240 --> 01:27:50,240
and then I listen to the world
1441
01:27:50,600 --> 01:27:55,120
and see if it actually works
and learn from that.
1442
01:27:55,800 --> 01:27:58,160
And then people come to me
1443
01:27:58,160 --> 01:28:01,160
and talk to me
about that thing that I built
1444
01:28:01,320 --> 01:28:04,800
and how it worked,
and maybe a way to do it better.
1445
01:28:05,360 --> 01:28:09,040
So maybe what I am also suggesting is:
1446
01:28:09,440 --> 01:28:12,240
don't just build a thing,
1447
01:28:12,240 --> 01:28:14,520
release it into the world.
1448
01:28:14,520 --> 01:28:17,520
There's
nothing like the learning experience
1449
01:28:17,520 --> 01:28:22,320
of releasing something
and seeing it getting used or not,
1450
01:28:22,320 --> 01:28:26,160
and then studying why it worked or didn't
1451
01:28:26,480 --> 01:28:29,200
and trying again and again.
1452
01:28:29,200 --> 01:28:32,280
Because it takes a long time
to get good at this stuff.
1453
01:28:32,520 --> 01:28:37,120
The solving of the problem stuff,
the AI stuff also takes a long time,
1454
01:28:37,520 --> 01:28:40,680
but it will inevitably connect you
1455
01:28:40,680 --> 01:28:44,520
into community, which I can say as someone
1456
01:28:46,120 --> 01:28:48,280
genuinely on the spectrum.
1457
01:28:48,280 --> 01:28:50,880
And yet I get to travel around the world.
1458
01:28:50,880 --> 01:28:54,040
World leaders ask my opinions about things
1459
01:28:54,360 --> 01:28:57,360
that should scare all of us.
1460
01:28:58,560 --> 01:29:00,080
and yet it happens.
1461
01:29:00,080 --> 01:29:03,640
Despite my total lack of social skills,
despite the fact that I don't network
1462
01:29:03,640 --> 01:29:07,320
with anyone because I build things
and I put them out into the world,
1463
01:29:07,520 --> 01:29:10,520
people come to me and
1464
01:29:11,320 --> 01:29:14,320
you should give that a try.
1465
01:29:22,480 --> 01:29:24,960
I have, I have for sure.
1466
01:29:24,960 --> 01:29:27,720
One is did you see the as
1467
01:29:27,720 --> 01:29:30,840
can the way of making decisions.
1468
01:29:32,000 --> 01:29:35,960
This is about you stand by money
on that one thing that weighs
1469
01:29:36,480 --> 01:29:39,480
you building up confidence.
1470
01:29:40,400 --> 01:29:42,320
the second question scared me.
1471
01:29:42,320 --> 01:29:45,320
So I just want to know,
1472
01:29:45,360 --> 01:29:48,360
the first time I ever wonder,
what would you recommend me?
1473
01:29:49,320 --> 01:29:51,440
you know, paper
1474
01:29:51,440 --> 01:29:54,640
towels, those things that you love, you so
1475
01:29:56,560 --> 01:29:59,080
you actually think. Go.
1476
01:29:59,080 --> 01:30:02,080
Okay.
1477
01:30:03,240 --> 01:30:06,000
So to the first one,
1478
01:30:06,000 --> 01:30:10,680
I think that the problem
of automated navigation
1479
01:30:10,680 --> 01:30:14,840
systems, paradoxically,
sometimes making traffic worse
1480
01:30:15,360 --> 01:30:19,520
is exactly some of the issue with AI
1481
01:30:20,160 --> 01:30:23,760
that as it gets built in a way,
1482
01:30:23,760 --> 01:30:27,480
I'm actually going to turn it around
and say it's a problem with us.
1483
01:30:27,960 --> 01:30:30,960
We thought, oh God, if only I knew
1484
01:30:31,360 --> 01:30:34,240
that secret fastest route
1485
01:30:34,240 --> 01:30:37,440
that only the locals knew
then I could get around.
1486
01:30:37,440 --> 01:30:40,520
But then if you make that available
to everybody,
1487
01:30:41,120 --> 01:30:44,120
things go awry.
1488
01:30:44,600 --> 01:30:45,920
and often people
1489
01:30:45,920 --> 01:30:50,040
building these systems
don't really think through
1490
01:30:50,320 --> 01:30:54,320
the chain of causation
that runs out from there.
1491
01:30:54,320 --> 01:30:59,160
So my comments earlier, for example,
about inexperienced early career workers
1492
01:30:59,680 --> 01:31:04,440
being harmed by the AI, the very AI that's
helping them do their job
1493
01:31:05,000 --> 01:31:07,800
because it's robbing them of the chance
to learn
1494
01:31:07,800 --> 01:31:10,920
how to do their job,
I think is a very similar phenomenon.
1495
01:31:11,400 --> 01:31:14,280
All of these unintended
negative consequences.
1496
01:31:15,360 --> 01:31:15,720
it's one
1497
01:31:15,720 --> 01:31:19,480
of the
reasons why, while I am not reflexively
1498
01:31:19,480 --> 01:31:23,760
supportive of the idea
that we should sandbox all new AI.
1499
01:31:23,760 --> 01:31:26,760
So this is an idea much like drug discovery,
1500
01:31:26,760 --> 01:31:30,560
where we should test it extensively
before it gets released into the wild.
1501
01:31:31,400 --> 01:31:35,520
One of the reasons I don't believe in
it is just, how are you going to stop
1502
01:31:36,240 --> 01:31:39,000
everyone at this university,
much less everyone in the world,
1503
01:31:39,000 --> 01:31:40,680
from building AI?
1504
01:31:40,680 --> 01:31:43,680
There's no realistic
constraint on it.
1505
01:31:44,680 --> 01:31:46,400
But it has implications.
1506
01:31:46,400 --> 01:31:50,320
And I alluded to one earlier, Google Maps
1507
01:31:51,000 --> 01:31:54,000
causing,
1508
01:31:54,720 --> 01:31:56,120
cognitive decline.
1509
01:31:56,120 --> 01:31:59,880
So one of the things we know
is good for long term
1510
01:31:59,880 --> 01:32:02,880
cognitive
health is navigating through space.
1511
01:32:03,320 --> 01:32:06,080
Well, if you're using an AI to tell you
1512
01:32:06,080 --> 01:32:09,360
when to make your turns,
then you're not doing that anymore.
1513
01:32:10,320 --> 01:32:12,360
So here's how I use Google Maps.
1514
01:32:12,360 --> 01:32:15,360
So I'm not saying Google Maps is bad.
1515
01:32:16,560 --> 01:32:19,760
I get to be a native everywhere
1516
01:32:19,760 --> 01:32:22,760
I go, in every city in the world.
1517
01:32:23,800 --> 01:32:27,480
I can open up my phone and it can tell me
how to get to where I want to go.
1518
01:32:27,600 --> 01:32:30,600
Not always the best route,
but a good enough route.
1519
01:32:30,800 --> 01:32:33,640
And the way I use Google Maps
is I open it up.
1520
01:32:33,640 --> 01:32:37,720
I look at its route, the way
I use it at home when I'm in a new city.
1521
01:32:38,240 --> 01:32:41,360
I don't do terrible things to myself.
1522
01:32:41,520 --> 01:32:44,240
When I'm at home, I open it up.
1523
01:32:44,240 --> 01:32:47,240
I check to make sure that there isn't
a traffic jam or something,
1524
01:32:47,720 --> 01:32:50,800
and then I try and beat Google Maps.
1525
01:32:52,040 --> 01:32:56,000
It's given me its route
and how long it thinks I will take.
1526
01:32:56,480 --> 01:33:00,600
And then I think,
what do I know about Berkeley, California?
1527
01:33:00,600 --> 01:33:02,000
What do I know about San Francisco?
1528
01:33:02,000 --> 01:33:04,000
What do I know about LA?
1529
01:33:04,000 --> 01:33:07,920
That I can beat Google Maps
without cheating, without speeding
1530
01:33:07,920 --> 01:33:09,240
and driving like a crazy person?
1531
01:33:10,280 --> 01:33:11,880
Then I have to think about it.
1532
01:33:11,880 --> 01:33:15,960
I got the benefits of the
AI telling me how traffic is
1533
01:33:16,160 --> 01:33:20,080
and what the best route would be,
but I'm still thinking
1534
01:33:20,080 --> 01:33:23,080
deeply about the problem
and reaping those benefits.
1535
01:33:23,080 --> 01:33:25,120
So that's how I.
1536
01:33:25,120 --> 01:33:27,640
I wish the people building these systems
1537
01:33:27,640 --> 01:33:30,920
thought through them
as much as I'm trying to.
1538
01:33:30,920 --> 01:33:34,800
The overwhelming ethic
in AI development right now and technology
1539
01:33:34,800 --> 01:33:37,800
development writ
large is make life easier.
1540
01:33:39,000 --> 01:33:42,000
I hope I convinced someone here today
1541
01:33:42,440 --> 01:33:46,200
that we should actually build technology
to make life
1542
01:33:46,200 --> 01:33:49,200
more challenging in a productive way.
1543
01:33:50,400 --> 01:33:53,400
and that's what I'd love
to see out of Waze and Google Maps.
1544
01:33:53,760 --> 01:33:57,480
and then the other one is,
1545
01:33:59,160 --> 01:34:02,160
I mean, the dirty truth is, yes,
1546
01:34:02,440 --> 01:34:05,640
because most of the AI working in
1547
01:34:05,640 --> 01:34:08,800
hiring today is bad.
1548
01:34:09,800 --> 01:34:14,080
It's lazy,
it's unscientific, it's unproven,
1549
01:34:14,760 --> 01:34:18,160
and it's largely
just looking for keywords on your resume.
1550
01:34:18,800 --> 01:34:20,120
Absolutely.
1551
01:34:20,120 --> 01:34:22,440
You could have GPT
1552
01:34:22,440 --> 01:34:26,200
all by itself, read your resume
and help you optimize it.
1553
01:34:27,080 --> 01:34:30,000
Now, the very things
that it would optimize
1554
01:34:30,000 --> 01:34:33,000
are the things we shouldn't
be using in hiring.
1555
01:34:33,480 --> 01:34:36,120
What does the recruiter do
when they look at a resume?
1556
01:34:36,120 --> 01:34:37,600
They give you five seconds.
1557
01:34:37,600 --> 01:34:40,600
They look at your name,
your school, and your last job.
1558
01:34:42,480 --> 01:34:46,880
and it turns out
once I know almost anything about you,
1559
01:34:46,920 --> 01:34:49,920
none of those are meaningfully
predictive anymore.
1560
01:34:50,240 --> 01:34:54,120
So I can be very realistic
1561
01:34:54,120 --> 01:34:58,320
about it and say, absolutely,
Gemini and GPT could help you
1562
01:34:58,840 --> 01:35:02,600
revise your resume,
make it more likely to get hired.
1563
01:35:02,960 --> 01:35:05,760
But what it's going to do is implant,
1564
01:35:06,880 --> 01:35:09,880
aspects that recruiters respond to
1565
01:35:10,480 --> 01:35:12,600
that are not, in fact, predictive
1566
01:35:12,600 --> 01:35:15,600
of whether
you're actually good at the job.
1567
01:35:15,840 --> 01:35:18,840
so feel free,
1568
01:35:19,080 --> 01:35:21,720
game the system.
1569
01:35:21,720 --> 01:35:25,440
But in reality, here's
what I tell all of my students
1570
01:35:26,160 --> 01:35:28,800
don't get a job with a resume.
1571
01:35:28,800 --> 01:35:30,760
Get them to come to you,
1572
01:35:30,760 --> 01:35:33,440
by building something.
1573
01:35:33,440 --> 01:35:38,640
In fact, this may seem a little off topic,
but when we look at the wage
1574
01:35:38,640 --> 01:35:43,320
gap between men and women,
when you turn that around
1575
01:35:43,760 --> 01:35:47,880
and you look at employees
that are actively recruited,
1576
01:35:48,600 --> 01:35:51,600
the wage gap quickly disappears.
1577
01:35:52,000 --> 01:35:55,000
When I'm not hiring a generic person
to fill a job,
1578
01:35:55,320 --> 01:35:57,960
I'm hiring you.
1579
01:35:57,960 --> 01:35:59,960
Amazon wanted me to be
1580
01:35:59,960 --> 01:36:02,960
their chief scientist for people.
1581
01:36:03,160 --> 01:36:08,720
that is the only way
I tell my students to get a job.
1582
01:36:09,840 --> 01:36:11,280
now, I also want to be realistic.
1583
01:36:11,280 --> 01:36:14,640
Most people don't have the privilege
of doing it that way, but,
1584
01:36:15,840 --> 01:36:17,800
I strongly recommend
1585
01:36:17,800 --> 01:36:20,800
that you don't march to the front door
with a resume.
1586
01:36:21,120 --> 01:36:23,800
You come in the back door
1587
01:36:23,800 --> 01:36:26,800
because people there wanted you to be
there.
1588
01:36:26,840 --> 01:36:32,120
It just doesn't compare in terms of
the career outcomes that emerge from it.
1589
01:36:32,360 --> 01:36:36,200
But genuinely,
you are right in speculating, GPT
1590
01:36:36,320 --> 01:36:39,320
actually does a pretty spectacular job
in polishing resumes.
1591
01:36:43,120 --> 01:40:14,880
[Off-mic exchange in the room, largely inaudible and garbled in the
transcription: an audience member asks a question about communities
that are not plugged into the global data stream and the inequality
that creates. The speaker asks for parts of it to be repeated.]
1645
01:40:24,000 --> 01:40:26,280
Yeah.
1646
01:40:26,280 --> 01:40:27,640
that's an excellent question.
1647
01:40:27,640 --> 01:40:31,760
And it's interesting because you frame it
1648
01:40:31,760 --> 01:40:35,040
well in terms of,
1649
01:40:37,200 --> 01:40:40,200
how many communities,
1650
01:40:42,320 --> 01:40:43,800
as you say, such as, well, Oaxaca.
1651
01:40:43,800 --> 01:40:45,960
I do a lot of work in South Africa.
1652
01:40:45,960 --> 01:40:50,080
or in India,
where there are just parts of the country
1653
01:40:50,080 --> 01:40:54,160
that are plugged into the global data
stream
1654
01:40:54,160 --> 01:40:57,160
and parts that aren't,
1655
01:40:58,200 --> 01:41:01,200
and so
1656
01:41:01,800 --> 01:41:02,680
I think you're right.
1657
01:41:02,680 --> 01:41:08,160
It's not just an issue of inequality,
but also that there's
1658
01:41:08,160 --> 01:41:12,320
this privileged position of being able
to hold back sets of information.
1659
01:41:12,480 --> 01:41:17,760
It's interesting that the flip side
also happens with, information
1660
01:41:17,760 --> 01:41:21,840
that might otherwise be available
being expunged from the record.
1661
01:41:23,280 --> 01:41:25,680
I've visited China many times.
1662
01:41:25,680 --> 01:41:29,600
I think it's an amazing place,
but they've made the decision
1663
01:41:29,600 --> 01:41:32,400
that certain kinds of information
absolutely will never enter
1664
01:41:32,400 --> 01:41:35,400
their data sphere, information
that ought to be there.
1665
01:41:35,720 --> 01:41:39,960
And so, in a sense, there's this complex
1666
01:41:40,680 --> 01:41:44,880
really of the data
that's easy to get, the surface
1667
01:41:44,880 --> 01:41:48,120
level data, the data
that people see as economically valuable,
1668
01:41:49,560 --> 01:41:53,280
you know, OpenAI was supposed to be
a nonprofit helping the world.
1669
01:41:53,760 --> 01:41:58,560
It very quickly and demonstrably changed
into something wildly different.
1670
01:41:58,560 --> 01:42:02,160
As soon as billions of dollars
started flooding in from Microsoft.
1671
01:42:03,080 --> 01:42:06,040
So I,
1672
01:42:06,040 --> 01:42:09,040
as I was saying earlier,
when I was talking about data trusts,
1673
01:42:10,000 --> 01:42:13,360
I'm a firm believer in the idea
1674
01:42:13,360 --> 01:42:16,800
that these systems
should be working for us.
1675
01:42:18,880 --> 01:42:21,120
there's plenty of wonderful
1676
01:42:21,120 --> 01:42:24,120
business opportunities
for people building things.
1677
01:42:24,440 --> 01:42:28,320
But they also don't get to own me.
1678
01:42:28,840 --> 01:42:34,960
They don't get to turn my
books into knowledge that is available
1679
01:42:35,240 --> 01:42:40,320
not from me, but from them.
1680
01:42:40,880 --> 01:42:44,440
And there's this interesting question.
1681
01:42:44,440 --> 01:42:49,240
Also, I think one of the concerns is that,
1682
01:42:50,400 --> 01:42:53,400
you know, as people access these systems
1683
01:42:53,880 --> 01:42:57,040
and different language
versions of large models
1684
01:42:57,040 --> 01:43:00,960
will be somewhat different, particularly
1685
01:43:00,960 --> 01:43:03,960
in, you know, say,
1686
01:43:04,680 --> 01:43:07,520
Mandarin Chinese versus English,
1687
01:43:07,520 --> 01:43:11,560
but there's also a lot of underlying
model structure.
1688
01:43:12,000 --> 01:43:15,000
So there's a huge potential homogenization
1689
01:43:15,400 --> 01:43:18,040
that AI will facilitate
throughout the world.
1690
01:43:18,040 --> 01:43:23,240
And that's deeply problematic,
because we have this amazing tool
1691
01:43:23,240 --> 01:43:26,240
that could help us
to all think differently.
1692
01:43:26,280 --> 01:43:29,920
And it seems much more likely
that in the near term at least,
1693
01:43:29,920 --> 01:43:33,040
it will cause us to all think
more homogeneously.
1694
01:43:35,000 --> 01:43:36,320
even if we're speaking different
1695
01:43:36,320 --> 01:43:39,360
languages, I suspect it will induce people
1696
01:43:39,360 --> 01:43:42,960
to use similar patterns of thought
and similar pieces of information.
1697
01:43:43,240 --> 01:43:46,240
Some of which may not even be true.
1698
01:43:46,800 --> 01:43:48,760
in that sense,
1699
01:43:48,760 --> 01:43:54,200
if we weren't talking about
the private property of OpenAI or Google,
1700
01:43:54,720 --> 01:43:58,640
then we might feel like there's
a real value in being able
1701
01:43:58,640 --> 01:44:03,000
to have a broader
range of knowledge available in the world.
1702
01:44:04,320 --> 01:44:06,040
I would love to know
1703
01:44:06,040 --> 01:44:09,160
if someone living,
1704
01:44:09,840 --> 01:44:12,720
you know, much as their ancestors
1705
01:44:12,720 --> 01:44:15,720
did in the mountains of Borneo,
1706
01:44:16,080 --> 01:44:21,280
knows something that could make my life better,
but I just don't have access to it.
1707
01:44:21,400 --> 01:44:24,800
It would be wonderful
if somehow I could find that.
1708
01:44:26,360 --> 01:44:30,640
but I think you're on to a pretty profound
1709
01:44:31,880 --> 01:44:33,480
challenge.
1710
01:44:33,480 --> 01:44:36,480
In these spaces, most of the AI I've built
1711
01:44:38,120 --> 01:44:41,560
has been about solving a specific problem.
1712
01:44:42,080 --> 01:44:45,320
A specific model solves
a specific problem.
1713
01:44:45,440 --> 01:44:48,520
Even in the domains
in which I use LLMs,
1714
01:44:49,680 --> 01:44:52,680
large language models, in my work,
1715
01:44:53,160 --> 01:44:55,840
they're more like an interface.
1716
01:44:55,840 --> 01:44:59,120
You can talk to them so that our suicide
1717
01:44:59,840 --> 01:45:03,840
risk prevention
model could act, or you could talk to them
1718
01:45:04,080 --> 01:45:08,040
to help us identify
and diagnose postpartum depression. But
1719
01:45:09,240 --> 01:45:09,920
our actual
1720
01:45:09,920 --> 01:45:12,920
diagnoses
aren't through the large language model.
1721
01:45:13,320 --> 01:45:15,160
They are through a specialized model.
1722
01:45:15,160 --> 01:45:18,160
And what we find is we have to develop, in
1723
01:45:18,160 --> 01:45:22,520
many cases, a different model
for every group of people
1724
01:45:22,520 --> 01:45:25,880
that we're working with,
even within one notional community.
1725
01:45:25,880 --> 01:45:29,720
So even if we're just talking
about Mexico, absolutely,
1726
01:45:29,720 --> 01:45:33,120
we would have different models
in the North in the South,
1727
01:45:33,120 --> 01:45:36,120
different models
for the cities versus rural areas.
1728
01:45:36,920 --> 01:45:37,680
Different models,
1729
01:45:37,680 --> 01:45:41,360
clearly different models for Spanish-speaking
and non-Spanish-speaking
1730
01:45:41,760 --> 01:45:43,520
Mexicans,
1731
01:45:43,520 --> 01:45:45,960
all of which have to be there.
1732
01:45:45,960 --> 01:45:49,040
when we build models
for our education system,
1733
01:45:49,800 --> 01:45:52,520
in the States, all originally in English,
1734
01:45:52,520 --> 01:45:58,280
we quickly found that the way it spoke to
people had to be distinctly different
1735
01:45:58,440 --> 01:46:01,760
for different populations,
or they just wouldn't engage with it.
1736
01:46:03,600 --> 01:46:05,400
so again and again, in
1737
01:46:05,400 --> 01:46:09,440
most of my work, we're looking at
1738
01:46:10,240 --> 01:46:13,240
trying
to, well, describe a specific problem
1739
01:46:13,240 --> 01:46:18,040
and building a specific machine
learning or artificial intelligence tool
1740
01:46:18,440 --> 01:46:21,120
that could be used in that space,
often used
1741
01:46:21,120 --> 01:46:24,120
by a person,
1742
01:46:24,360 --> 01:46:27,440
say being used by a doctor or nurse
or teacher,
1743
01:46:27,760 --> 01:46:30,760
sometimes working completely on its own.
1744
01:46:32,120 --> 01:46:34,080
and in that sense,
I think you're identifying
1745
01:46:34,080 --> 01:46:37,440
some of the challenges with existing
1746
01:46:37,440 --> 01:46:40,440
large language models. And again,
1747
01:46:40,920 --> 01:46:43,840
it doesn't have to be this way,
1748
01:46:43,840 --> 01:46:46,680
but it is the way that it currently is
1749
01:46:46,680 --> 01:46:50,520
that they tend to just suck everything in
1750
01:46:51,240 --> 01:46:54,560
and make very little distinction
within them
1751
01:46:54,560 --> 01:46:56,760
that does not exist
within the corpus of language
1752
01:46:56,760 --> 01:46:59,200
they've been trained on,
or the corpus of images
1753
01:46:59,200 --> 01:47:02,080
a diffusion model has been trained on.
1754
01:47:02,080 --> 01:47:05,080
And it would actually be amazing
1755
01:47:05,360 --> 01:47:08,360
if these systems
actually understood these differences.
1756
01:47:08,800 --> 01:47:13,560
So one of the emerging areas of, research,
1757
01:47:14,840 --> 01:47:18,240
is developing causal modeling in AI.
1758
01:47:18,840 --> 01:47:22,680
And this takes different forms.
1759
01:47:22,680 --> 01:47:24,240
Probably the most common right now
1760
01:47:24,240 --> 01:47:27,880
is less exciting to me,
but is very exciting in places like OpenAI
1761
01:47:27,880 --> 01:47:31,960
and Google: combining reinforcement
learning models with LLMs.
1762
01:47:32,760 --> 01:47:35,880
You stick them together
and you create an independent agent
1763
01:47:36,080 --> 01:47:39,480
that can go out and collect its own data
about the world.
1764
01:47:40,840 --> 01:47:42,880
But that runs into
some of the very problems
1765
01:47:42,880 --> 01:47:46,760
you're talking about in terms of, well,
maybe people didn't want to share that
1766
01:47:46,760 --> 01:47:50,320
data with an agent, or they didn't know
they were interacting with an agent.
1767
01:47:50,640 --> 01:47:55,040
And also, to be fair to what you were saying,
in some places in the world,
1768
01:47:55,040 --> 01:47:58,880
how would such an agent actually
get access to that information physically?
1769
01:47:59,040 --> 01:48:01,920
How could it be present to get information
1770
01:48:01,920 --> 01:48:05,320
in a place
that isn't connected to the net?
1771
01:48:06,720 --> 01:48:09,720
And again, I, as much as
1772
01:48:10,120 --> 01:48:14,400
anyone, have a say in this,
but no more than anyone
1773
01:48:14,400 --> 01:48:17,400
should
I have the say of how these things happen.
1774
01:48:17,640 --> 01:48:23,160
So I love questions like that
because it brings in a very different perspective.
1775
01:48:23,280 --> 01:48:26,560
Maybe there are parts of ourselves
in our lives
1776
01:48:26,560 --> 01:48:29,560
that we want to preserve.
1777
01:48:30,840 --> 01:48:32,880
and preserve in a way
1778
01:48:32,880 --> 01:48:37,200
that isn't simply just more data points
1779
01:48:37,200 --> 01:48:40,200
in some giant company's database.
1780
01:48:40,520 --> 01:48:43,600
But I also, in my heart, feel like,
1781
01:48:45,120 --> 01:48:49,080
what are the ways that we can share
that my sister and brother-in-law
1782
01:48:49,080 --> 01:48:53,360
go to Oaxaca every year and hike around
1783
01:48:53,360 --> 01:48:57,600
and spend time in villages,
and they absolutely love it.
1784
01:49:00,280 --> 01:49:03,960
but, not everyone
1785
01:49:05,080 --> 01:49:05,520
just does.
1786
01:49:05,520 --> 01:49:07,280
Not everyone wants to share their data
with the world.
1787
01:49:07,280 --> 01:49:11,320
Not everyone can travel to other parts
of the world to experience it.
1788
01:49:11,320 --> 01:49:12,800
And finding that
1789
01:49:14,480 --> 01:49:17,960
not even a dividing line, maybe something
messy and fuzzy
1790
01:49:17,960 --> 01:49:23,640
between what we keep for ourselves
and what we share with the world,
1791
01:49:24,320 --> 01:49:27,520
and how we want to make certain
that that's respected.
1792
01:49:27,840 --> 01:49:30,320
And I think that's one of the big things.
1793
01:49:30,320 --> 01:49:33,320
Are those distinctions being respected?
1794
01:49:33,480 --> 01:49:35,840
And I don't happen to think
1795
01:49:35,840 --> 01:49:39,160
that Sam Altman is a bad guy.
1796
01:49:39,200 --> 01:49:43,040
I don't happen to think he's a great guy,
but I don't happen to think
1797
01:49:43,040 --> 01:49:47,000
he's a bad guy, or that his opinion alone
should decide these questions.
1798
01:49:47,400 --> 01:49:48,480
These are things
1799
01:49:48,480 --> 01:49:51,800
we collectively should decide,
and that just isn't happening right now.
1800
01:49:52,320 --> 01:49:55,320
So that is unfortunately the best
1801
01:49:55,680 --> 01:49:58,680
flailing answer I can offer to you.
1802
01:50:05,440 --> 01:50:06,200
In the back.
1803
01:50:06,200 --> 01:50:09,200
Okay, great.
1804
01:50:34,240 --> 01:50:37,240
Thank.
1805
01:50:37,320 --> 01:50:39,720
Hey. Thank you again for
1806
01:50:39,720 --> 01:50:43,400
the insightful talk,
for sharing your thoughts on this.
1807
01:50:44,640 --> 01:52:18,720
[Question continues, largely inaudible: the questioner explains that
they are being encouraged to use these tools at work and asks whether
the speaker can recommend courses or resources for learning to use AI.]
1832
01:52:20,640 --> 01:52:21,280
so I don't have
1833
01:52:21,280 --> 01:52:24,360
specific courses to recommend, but,
1834
01:52:25,600 --> 01:52:29,440
I will say, though
admittedly not what you're asking for,
1835
01:52:29,960 --> 01:52:32,960
that what you're already doing,
just experimenting with it.
1836
01:52:33,120 --> 01:52:37,560
again, I'm really encouraging people
to engage and explore.
1837
01:52:39,080 --> 01:52:42,560
having said that,
I think some of the best sources
1838
01:52:42,560 --> 01:52:47,120
for interesting insights
about these sorts of technologies:
1839
01:52:47,120 --> 01:52:50,120
I highly recommend MIT Technology Review.
1840
01:52:51,280 --> 01:52:54,600
It's a good source,
and they have published some articles
1841
01:52:54,600 --> 01:52:57,600
specifically
about using large language models and,
1842
01:52:57,840 --> 01:53:00,120
and thoughts
and recommendations around that.
1843
01:53:00,120 --> 01:53:01,680
I have a newsletter.
1844
01:53:01,680 --> 01:53:03,200
You're welcome to sign up to it.
1845
01:53:03,200 --> 01:53:08,880
I don't specialize in talking about LLMs,
but I certainly have included a couple
1846
01:53:08,880 --> 01:53:14,720
of specific, episodes of that newsletter
in which I went through people using it
1847
01:53:14,800 --> 01:53:18,480
for education purposes, and
1848
01:53:19,600 --> 01:53:21,800
even for financial modeling purposes
1849
01:53:21,800 --> 01:53:25,120
and recommendations around
how to get the most out of it.
1850
01:53:26,000 --> 01:53:29,280
and it turns out
there are actually a fair number of
1851
01:53:30,120 --> 01:53:34,040
"how to write with GPT" videos on YouTube
right now.
1852
01:53:35,240 --> 01:53:38,640
I wouldn't recommend them
because I think they're very much
1853
01:53:38,640 --> 01:53:43,000
in the vein of: it's going to speed it
all up and make it so fast.
1854
01:53:43,000 --> 01:53:45,080
Let's write a book in a day.
1855
01:53:45,080 --> 01:53:48,800
And that's taking the exact
wrong lesson out of this.
1856
01:53:49,280 --> 01:53:52,080
So steer away from those sorts of things.
1857
01:53:52,080 --> 01:53:55,160
Feel free to look at it; it's
all free on my site.
1858
01:53:55,160 --> 01:53:55,920
Feel free to look
1859
01:53:55,920 --> 01:53:58,920
and see some of the recommendations
I've had in my past newsletters,
1860
01:53:59,280 --> 01:54:02,280
and check out Technology Review.
1861
01:54:02,600 --> 01:54:05,600
I'm not an MIT grad.
1862
01:54:05,600 --> 01:54:08,600
I decided to go to a place
where people
1863
01:54:09,160 --> 01:54:12,160
collaborate with each other instead of
stabbing each other in the back.
1864
01:54:12,160 --> 01:54:15,160
But it's still an amazing place
with brilliant people, and
1865
01:54:15,280 --> 01:54:16,520
it's a good source of info.
1866
01:54:19,320 --> 01:54:21,360
Thank you.
1867
01:54:21,360 --> 01:54:24,080
Thank you very much.
1868
01:54:24,080 --> 01:54:41,440
[Closing remarks from the host, largely inaudible, thanking the
speaker and the audience.]
1873
01:54:41,440 --> 01:54:41,840
Thank you.
1874
01:54:41,840 --> 01:54:44,840
Thank you very much for your time.
1875
01:54:46,240 --> 01:54:46,440
Thanks.