0
00:00:00,000 --> 00:00:03,976
[MUSIC PLAYING]
1
00:00:03,976 --> 00:01:18,530
[MUSIC PLAYING]
2
00:01:18,530 --> 00:01:19,580
DAVID J. MALAN: All right.
3
00:01:19,580 --> 00:01:20,876
AUDIENCE: All right.
4
00:01:20,876 --> 00:01:24,440
DAVID J. MALAN: [CHUCKLES] This is CS50.
5
00:01:24,440 --> 00:01:27,140
And this is week 10, our very last together.
6
00:01:27,140 --> 00:01:29,750
And before we dive in today, I just wanted
7
00:01:29,750 --> 00:01:33,877
to acknowledge how much work we know this course is for everyone.
8
00:01:33,877 --> 00:01:35,960
We know there's still a tad bit of work remaining.
9
00:01:35,960 --> 00:01:39,168
But we do hope ultimately, that you're really proud of what you've pulled off
10
00:01:39,168 --> 00:01:40,572
over the past few months.
11
00:01:40,572 --> 00:01:43,280
And indeed the final project, whatever it is you end up building,
12
00:01:43,280 --> 00:01:45,320
really is meant to be this capstone, where
13
00:01:45,320 --> 00:01:46,890
you're finally standing on your own.
14
00:01:46,890 --> 00:01:50,120
There's no distribution code, there's not really a specification, and really
15
00:01:50,120 --> 00:01:53,810
just an opportunity to take all this knowledge out now for a spin.
16
00:01:53,810 --> 00:01:56,150
And we do hope it serves you well longer term.
17
00:01:56,150 --> 00:02:00,140
Before we dive in too, just wanted to offer a number of thanks to so much
18
00:02:00,140 --> 00:02:02,690
of the team that helps out behind the scenes,
19
00:02:02,690 --> 00:02:06,110
in particular, the Memorial Hall team, our hosts here, who
20
00:02:06,110 --> 00:02:09,530
make all of the space and the activities behind the scenes possible,
21
00:02:09,530 --> 00:02:12,710
the education support services team, who helps with audio and video
22
00:02:12,710 --> 00:02:16,640
and more, and then especially CS50's own team, all here in the darkness
23
00:02:16,640 --> 00:02:18,950
helping out in front of the camera, behind the camera.
24
00:02:18,950 --> 00:02:22,625
If we could, a huge round of applause for everyone that makes this possible.
25
00:02:22,625 --> 00:02:24,932
[APPLAUSE AND CHEERS]
26
00:02:24,932 --> 00:02:31,220
27
00:02:31,220 --> 00:02:35,780
You might have noticed that these have been unusual times,
28
00:02:35,780 --> 00:02:38,480
and we've had some unusual guests in the front of the room here,
29
00:02:38,480 --> 00:02:41,893
since we weren't sure what to expect early on as to just what protocols
30
00:02:41,893 --> 00:02:42,685
would be on campus.
31
00:02:42,685 --> 00:02:45,260
And so we have, of course, all of these plush figures
32
00:02:45,260 --> 00:02:48,510
behind the scenes who have been helping out behind the camera,
33
00:02:48,510 --> 00:02:50,638
behind the monitors, and so forth.
34
00:02:50,638 --> 00:02:52,430
And what many of you'll see, if you've been
35
00:02:52,430 --> 00:02:55,405
watching right now or in the future of these videos online,
36
00:02:55,405 --> 00:02:57,530
you'll see a lot of backs of heads, so that there's
37
00:02:57,530 --> 00:03:00,810
a little bit of character to some of the shots that we have here.
38
00:03:00,810 --> 00:03:03,650
But this is actually borne of an inspiration that
39
00:03:03,650 --> 00:03:07,730
comes from who will be ultimately today's special guest, Jennifer 8.
40
00:03:07,730 --> 00:03:10,730
Lee, in fact, whom we'll meet in just a little bit
41
00:03:10,730 --> 00:03:13,730
was ultimately the good friend of the course
42
00:03:13,730 --> 00:03:17,780
that inspired this tradition of using puppetry in some form in the class
43
00:03:17,780 --> 00:03:18,530
here.
44
00:03:18,530 --> 00:03:22,070
What I see down below is a shot like this here.
45
00:03:22,070 --> 00:03:26,030
And funny enough, it seems that with machine learning, what it is nowadays,
46
00:03:26,030 --> 00:03:29,690
artificial intelligence, so to speak, on social media and the like--
47
00:03:29,690 --> 00:03:34,340
Literally, no joke I pulled up Twitter earlier today and among my suggestions
48
00:03:34,340 --> 00:03:39,740
for whom I should follow now were literally the suggestions here.
49
00:03:39,740 --> 00:03:42,200
This is perhaps not surprising though, because some weeks
50
00:03:42,200 --> 00:03:45,807
back I actually started following Count von Count, whom
51
00:03:45,807 --> 00:03:47,390
you might remember from Sesame Street.
52
00:03:47,390 --> 00:03:51,800
If you're not following him already, this is an amazing Count to follow,
53
00:03:51,800 --> 00:03:53,405
an actual count to follow.
54
00:03:53,405 --> 00:03:55,405
And it's actually an amazing use of programming.
55
00:03:55,405 --> 00:03:58,160
So this account joined in April of 2012.
56
00:03:58,160 --> 00:04:02,480
It's got 198,000 followers as of today.
57
00:04:02,480 --> 00:04:05,810
And what it's been doing for like nine plus years
58
00:04:05,810 --> 00:04:09,150
is tweeting out a number, one per day.
59
00:04:09,150 --> 00:04:15,950
This morning's was 3,327.
60
00:04:15,950 --> 00:04:18,476
Yesterday's was 3,326.
61
00:04:18,476 --> 00:04:19,790
Ah, ah, ah.
62
00:04:19,790 --> 00:04:23,270
And so presumably someone's just written a program, Python or something else,
63
00:04:23,270 --> 00:04:25,860
that's just generating these tweets once a day.
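That guess — a small program generating one tweet a day — takes very little code. Here is a minimal, purely illustrative Python sketch; the account's real implementation isn't public, and `post_tweet` is a hypothetical stand-in for whatever Twitter API call it actually uses:

```python
def next_tweet(previous_number):
    """Each day's tweet is just the previous day's number plus one."""
    n = previous_number + 1
    # Format with a thousands separator, in the Count's signature style.
    return n, f"{n:,}! Ah, ah, ah."

def post_tweet(text):
    # Hypothetical stand-in for a real Twitter API call.
    print(text)

number, text = next_tweet(3326)  # yesterday's number, per the transcript
post_tweet(text)                 # prints: 3,327! Ah, ah, ah.
```

A real bot would wrap this in a once-a-day scheduler (e.g. cron) and persist the counter between runs.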
64
00:04:25,860 --> 00:04:28,760
Even more amusing though, is that every tweet for the past nine years
65
00:04:28,760 --> 00:04:31,980
has 20 or 30 comments on it, from people who are following it.
66
00:04:31,980 --> 00:04:36,890
So, perhaps consider following this same account and the same application of CS
67
00:04:36,890 --> 00:04:37,660
as well.
68
00:04:37,660 --> 00:04:40,650
Wanted to also thank CS50's team behind the cameras.
69
00:04:40,650 --> 00:04:44,750
You might recall the teaching fellows last year in particular,
70
00:04:44,750 --> 00:04:49,220
when everything was on Zoom, kindly put together this visualization of TCP/IP
71
00:04:49,220 --> 00:04:53,690
and the passing of messages among routers and in turn computers,
72
00:04:53,690 --> 00:04:56,990
for instance, from Phyllis at bottom right to Brian at top left.
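The idea that visualization acted out — a message hopping from machine to router to router until it reaches its destination — can be sketched as a simple forwarding loop. The names below are borrowed from the transcript; the topology itself is invented for illustration:

```python
# A toy network: each node knows only which neighbor to forward toward.
next_hop = {
    "Phyllis": "router1",
    "router1": "router2",
    "router2": "Brian",
}

def deliver(source, destination):
    """Follow next-hop entries until the message arrives, recording the path."""
    path = [source]
    while path[-1] != destination:
        path.append(next_hop[path[-1]])
    return path

print(deliver("Phyllis", "Brian"))  # ['Phyllis', 'router1', 'router2', 'Brian']
```

Real IP routing is far richer (routing tables per destination, TTLs, retransmission), but the hop-by-hop structure is the same.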
73
00:04:56,990 --> 00:05:00,530
Just wanted to thank the team but also reveal to you
74
00:05:00,530 --> 00:05:04,310
all that these takes were not perfect by any means.
75
00:05:04,310 --> 00:05:07,280
And in fact, here's just 60 seconds or so of outtakes of us
76
00:05:07,280 --> 00:05:14,050
trying to get data from point A to point B. Buffering.
77
00:05:14,050 --> 00:05:16,270
OK.
78
00:05:16,270 --> 00:05:16,810
Josh.
79
00:05:16,810 --> 00:05:19,000
Yes.
80
00:05:19,000 --> 00:05:26,230
Ellen, ooh [LAUGHS] [INAUDIBLE] No.
81
00:05:26,230 --> 00:05:26,950
Oh, wait.
82
00:05:26,950 --> 00:05:32,840
83
00:05:32,840 --> 00:05:33,950
That was amazing, Josh.
84
00:05:33,950 --> 00:05:36,650
85
00:05:36,650 --> 00:05:38,915
Uh, um, Sophie.
86
00:05:38,915 --> 00:05:41,680
87
00:05:41,680 --> 00:05:48,650
[LAUGHS] Amazing.
88
00:05:48,650 --> 00:05:50,840
That was perfect.
89
00:05:50,840 --> 00:05:51,607
Tony.
90
00:05:51,607 --> 00:05:54,290
[LAUGHS]
91
00:05:54,290 --> 00:05:55,615
AUDIENCE: I think I--
92
00:05:55,615 --> 00:05:57,510
AUDIENCE: Hey, [INAUDIBLE]
93
00:05:57,510 --> 00:05:59,040
DAVID J. MALAN: [INAUDIBLE]
94
00:05:59,040 --> 00:06:00,290
AUDIENCE: Oh, now you expect--
95
00:06:00,290 --> 00:06:00,915
DAVID J. MALAN: Guy?
96
00:06:00,915 --> 00:06:04,880
97
00:06:04,880 --> 00:06:05,725
That was amazing.
98
00:06:05,725 --> 00:06:06,637
Thank you all.
99
00:06:06,637 --> 00:06:07,470
AUDIENCE: --so good.
100
00:06:07,470 --> 00:06:09,835
101
00:06:09,835 --> 00:06:10,710
DAVID J. MALAN: All right.
102
00:06:10,710 --> 00:06:13,418
If we could, too, a round of applause for all the teaching fellows,
103
00:06:13,418 --> 00:06:15,300
teaching assistants, and course assistants
104
00:06:15,300 --> 00:06:18,750
who make the course possible as well.
105
00:06:18,750 --> 00:06:22,330
Before we now do a bit of review of the semester,
106
00:06:22,330 --> 00:06:25,448
I thought we'd take first a higher level view of where we've come from.
107
00:06:25,448 --> 00:06:27,990
Recall, of course, from the syllabus and literally week zero,
108
00:06:27,990 --> 00:06:30,120
we claimed this: that what ultimately matters in this course
109
00:06:30,120 --> 00:06:32,520
is not so much where you end up relative to classmates
110
00:06:32,520 --> 00:06:35,310
but where you end up relative to yourself when you began.
111
00:06:35,310 --> 00:06:36,435
And we really do mean that.
112
00:06:36,435 --> 00:06:39,143
There are certainly classmates of yours who have been programming
113
00:06:39,143 --> 00:06:40,330
since they were 10 years old.
114
00:06:40,330 --> 00:06:43,432
But there are 2/3 of your classmates for whom that was not in fact the case.
115
00:06:43,432 --> 00:06:46,140
And so behind you, in front of you, to the left, and to the right
116
00:06:46,140 --> 00:06:49,930
today are so many classmates who have had a very shared experience with you.
117
00:06:49,930 --> 00:06:53,100
But the only person that really matters at the end of the day, in terms
118
00:06:53,100 --> 00:06:56,910
of how you've progressed in this class truly, is where you in fact began.
119
00:06:56,910 --> 00:06:59,820
And I realized that with CS, and especially this course,
120
00:06:59,820 --> 00:07:01,950
and with programming assignments especially,
121
00:07:01,950 --> 00:07:04,950
it can feel like week after week that you're not really making progress,
122
00:07:04,950 --> 00:07:07,830
because it might feel like you're struggling every darn week.
123
00:07:07,830 --> 00:07:12,147
But that's just really because we keep moving the bar higher and higher
124
00:07:12,147 --> 00:07:14,730
or pushing the finish line a little further and further ahead.
125
00:07:14,730 --> 00:07:18,420
Because think back to week one when this for instance-- whoops.
126
00:07:18,420 --> 00:07:22,200
When this alone was hard and you were just
127
00:07:22,200 --> 00:07:25,770
trying to get Mario to ascend a pyramid that might look a little something
128
00:07:25,770 --> 00:07:26,580
like this.
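That week-one Mario exercise was done in C; purely as an illustration of how little it takes in hindsight, the same right-aligned pyramid can be sketched in a few lines of Python:

```python
def pyramid(height):
    """Build a right-aligned pyramid of '#' blocks, one row per line."""
    rows = []
    for i in range(1, height + 1):
        # Pad each row with spaces on the left so the bricks align right.
        rows.append(" " * (height - i) + "#" * i)
    return "\n".join(rows)

print(pyramid(4))
# prints:
#    #
#   ##
#  ###
# ####
```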
129
00:07:26,580 --> 00:07:30,090
Or the week after, when you started dabbling with readability,
130
00:07:30,090 --> 00:07:30,960
or two weeks after.
131
00:07:30,960 --> 00:07:33,910
Mr. and Mrs. Dursley of number four, Privet Drive, and so forth,
132
00:07:33,910 --> 00:07:36,780
trying to analyze just how complex a sentence like that was,
133
00:07:36,780 --> 00:07:39,990
and manipulating strings and characters for the first time.
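That sentence-complexity exercise can be sketched with the Coleman-Liau index, the readability formula the problem set was built around: index = 0.0588 * L - 0.296 * S - 15.8, with L letters per 100 words and S sentences per 100 words. This naive Python version counts sentence-ending punctuation crudely (so "Mr." would count as a sentence), and is only an approximation:

```python
def grade_level(text):
    """Approximate U.S. grade level via the Coleman-Liau index."""
    letters = sum(c.isalpha() for c in text)
    words = len(text.split())
    sentences = sum(text.count(p) for p in ".!?")
    L = letters / words * 100    # letters per 100 words
    S = sentences / words * 100  # sentences per 100 words
    return 0.0588 * L - 0.296 * S - 15.8

# A famously simple text scores below grade one.
print(grade_level("One fish. Two fish. Red fish. Blue fish."))
```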
134
00:07:39,990 --> 00:07:44,228
And then of course, we progressed to deeper dives into algorithms
135
00:07:44,228 --> 00:07:46,770
and actually implementing something that's all too real world
136
00:07:46,770 --> 00:07:50,760
these days in implementing electoral algorithms in a few different forms.
137
00:07:50,760 --> 00:07:54,030
Dabbling thereafter in a bit of forensics, a bit of imagery,
138
00:07:54,030 --> 00:07:57,780
and taking images like this here and filtering them in a number of ways.
139
00:07:57,780 --> 00:08:00,330
Ultimately understanding, hopefully, how these things
140
00:08:00,330 --> 00:08:01,980
are implemented underneath the hood.
141
00:08:01,980 --> 00:08:05,220
So that henceforth, when all you're doing is tapping an icon on your phone
142
00:08:05,220 --> 00:08:07,350
or clicking a command on your computer, you
143
00:08:07,350 --> 00:08:10,320
can infer, even if you didn't write that particular code,
144
00:08:10,320 --> 00:08:12,683
how the thing is likely working.
145
00:08:12,683 --> 00:08:15,600
And even if you had started to get your footing then around week four,
146
00:08:15,600 --> 00:08:18,690
then things escalated quickly further to data structures.
147
00:08:18,690 --> 00:08:23,370
But recall for your spell checker, you implemented a fairly sophisticated data
148
00:08:23,370 --> 00:08:24,930
structure known as a hash table.
149
00:08:24,930 --> 00:08:26,805
And even if you struggled to get that working,
150
00:08:26,805 --> 00:08:30,330
again, think back five weeks prior you were just
151
00:08:30,330 --> 00:08:33,240
trying to get for loops to work and variables to work.
152
00:08:33,240 --> 00:08:36,480
And so each week, realize, there was significant progress.
153
00:08:36,480 --> 00:08:39,539
And then if you aggregate all these most recent weeks with Python,
154
00:08:39,539 --> 00:08:42,270
and SQL, HTML, JavaScript, and CSS.
155
00:08:42,270 --> 00:08:44,460
I mean, you built your very own web application.
156
00:08:44,460 --> 00:08:46,710
And many of you will go on and build something grander
157
00:08:46,710 --> 00:08:50,580
for your own final project, or focus again on C, or on Python alone,
158
00:08:50,580 --> 00:08:51,310
or the like.
159
00:08:51,310 --> 00:08:54,030
But ultimately aggregating all of these technologies
160
00:08:54,030 --> 00:08:57,307
and kind of stitching together something that you yourself created.
161
00:08:57,307 --> 00:08:59,640
We might have put some of the foundation there in place,
162
00:08:59,640 --> 00:09:02,970
but the end result ultimately is yours.
163
00:09:02,970 --> 00:09:05,340
So at the end of the day, as we promised in week zero,
164
00:09:05,340 --> 00:09:08,130
this course is really about computational thinking,
165
00:09:08,130 --> 00:09:09,870
cleaning up your thought process, getting
166
00:09:09,870 --> 00:09:12,810
you to think a little more logically, more methodically,
167
00:09:12,810 --> 00:09:16,830
and to express yourself just as logically and methodically.
168
00:09:16,830 --> 00:09:19,470
But it's also about, in some form, critical thinking.
169
00:09:19,470 --> 00:09:21,975
And at the end of the day, what computer science is,
170
00:09:21,975 --> 00:09:26,340
is really just taking input and producing output, ideally correct output.
171
00:09:26,340 --> 00:09:28,840
And all the hard stuff is in the middle there.
172
00:09:28,840 --> 00:09:32,040
But what we do hope you have in your toolkit, so to speak,
173
00:09:32,040 --> 00:09:34,290
is all the more of a mental model, all the more
174
00:09:34,290 --> 00:09:36,600
of an understanding of first principles from which you
175
00:09:36,600 --> 00:09:40,350
can derive new outputs, new conclusions, based on those inputs.
176
00:09:40,350 --> 00:09:44,070
And certainly today there's so much misinformation or miseducation
177
00:09:44,070 --> 00:09:44,650
in the world.
178
00:09:44,650 --> 00:09:47,700
And just being able to take input and produce proper output in and of
179
00:09:47,700 --> 00:09:49,860
itself is a compelling skill.
180
00:09:49,860 --> 00:09:52,410
And indeed when you all find yourselves invariably
181
00:09:52,410 --> 00:09:55,710
in engineering positions, where you're asked to build something
182
00:09:55,710 --> 00:09:58,238
because you now can or perhaps you're in a managerial role
183
00:09:58,238 --> 00:10:00,030
where you decide you should build something
184
00:10:00,030 --> 00:10:01,620
because you know people who can.
185
00:10:01,620 --> 00:10:05,250
I would also start to consider, even though the past 10 plus weeks have all
186
00:10:05,250 --> 00:10:08,730
been about build this because we asked you to,
187
00:10:08,730 --> 00:10:10,890
really start to consider whether it's for fun,
188
00:10:10,890 --> 00:10:13,380
for profession, for political purposes, or the like.
189
00:10:13,380 --> 00:10:14,760
Should you build something?
190
00:10:14,760 --> 00:10:17,760
And actually considering, now that you have this skill,
191
00:10:17,760 --> 00:10:20,280
how you can use it most responsibly.
192
00:10:20,280 --> 00:10:22,890
And not just make a website do something or make an app
193
00:10:22,890 --> 00:10:27,210
do something because it can be done, but really start to ask and ask of others,
194
00:10:27,210 --> 00:10:28,720
should we be doing this?
195
00:10:28,720 --> 00:10:32,790
It's just a skill that you can but don't necessarily have to use.
196
00:10:32,790 --> 00:10:34,920
Now, when it comes to writing some actual code,
197
00:10:34,920 --> 00:10:38,430
keep in mind that you might continue to evaluate
198
00:10:38,430 --> 00:10:40,440
or your employer or your colleagues might
199
00:10:40,440 --> 00:10:42,840
continue to evaluate your code along these same axes.
200
00:10:42,840 --> 00:10:44,250
These are not CS50 specific.
201
00:10:44,250 --> 00:10:45,570
Correctness.
202
00:10:45,570 --> 00:10:47,310
Does it do what it's supposed to do?
203
00:10:47,310 --> 00:10:48,000
Design.
204
00:10:48,000 --> 00:10:50,730
How well qualitatively is it implemented?
205
00:10:50,730 --> 00:10:51,450
And then style.
206
00:10:51,450 --> 00:10:52,320
How readable is it?
207
00:10:52,320 --> 00:10:53,250
How pretty is it?
208
00:10:53,250 --> 00:10:56,250
And these three axes should really guide all of your thinking,
209
00:10:56,250 --> 00:11:01,440
whether it's for a test, or a project, or an open source project, or the like.
210
00:11:01,440 --> 00:11:03,610
All three of these things really matter.
211
00:11:03,610 --> 00:11:06,382
And so, if you're in the mindset of wondering, oh, do I
212
00:11:06,382 --> 00:11:07,840
have to worry about style for this?
213
00:11:07,840 --> 00:11:09,220
Do I have to comment this?
214
00:11:09,220 --> 00:11:11,020
The answer is always yes.
215
00:11:11,020 --> 00:11:14,170
This is what it means to be a good programmer, a good engineer,
216
00:11:14,170 --> 00:11:17,440
to optimize these kinds of axes.
217
00:11:17,440 --> 00:11:20,500
Now what about those tools in the toolkit?
218
00:11:20,500 --> 00:11:23,020
Well, let's focus on just a couple of them here.
219
00:11:23,020 --> 00:11:25,758
Full circle, at the end of the semester, abstraction, recall,
220
00:11:25,758 --> 00:11:27,550
was one of the tools in the toolkit that we
221
00:11:27,550 --> 00:11:31,540
proposed; it's all about taking complicated problems, complicated ideas,
222
00:11:31,540 --> 00:11:33,430
and simplifying them to really the essence.
223
00:11:33,430 --> 00:11:35,680
So you can focus on really just what matters
224
00:11:35,680 --> 00:11:38,170
or what helps you get real work done.
225
00:11:38,170 --> 00:11:41,418
And then related to that was also this notion of precision.
226
00:11:41,418 --> 00:11:43,210
Even as you abstract things away, you still
227
00:11:43,210 --> 00:11:46,240
have to be super precise when you're writing code for a computer
228
00:11:46,240 --> 00:11:48,380
or just giving instructions to another human
229
00:11:48,380 --> 00:11:53,020
so that they are implementing your ideas, your algorithms correctly.
230
00:11:53,020 --> 00:11:56,980
And sometimes these two goals, abstraction and precision
231
00:11:56,980 --> 00:11:58,717
can rather be at odds with one another.
232
00:11:58,717 --> 00:12:00,550
And what we thought we'd do is give everyone
233
00:12:00,550 --> 00:12:03,300
a sheet of paper today, which you probably received on the way in,
234
00:12:03,300 --> 00:12:04,180
if not a pen as well.
235
00:12:04,180 --> 00:12:06,610
If you didn't receive one, hopefully you or a friend near you
236
00:12:06,610 --> 00:12:08,680
has a sheet of paper and a pen or pencil.
237
00:12:08,680 --> 00:12:10,030
Do go ahead and grab that.
238
00:12:10,030 --> 00:12:13,930
And we thought we'd come full circle too and see if we can't get a brave
239
00:12:13,930 --> 00:12:16,720
volunteer to come up on the stage here.
240
00:12:16,720 --> 00:12:19,020
And we just need someone to give some stage directions.
241
00:12:19,020 --> 00:12:19,520
All right.
242
00:12:19,520 --> 00:12:21,340
I like it when people start pointing and pointing.
243
00:12:21,340 --> 00:12:22,870
How about you being pointed at?
244
00:12:22,870 --> 00:12:24,300
Yes.
245
00:12:24,300 --> 00:12:24,850
Yes, you.
246
00:12:24,850 --> 00:12:25,540
Yes.
247
00:12:25,540 --> 00:12:27,768
Come on down.
248
00:12:27,768 --> 00:12:29,810
Well there'll be one more opportunity after this.
249
00:12:29,810 --> 00:12:30,520
Come on down.
250
00:12:30,520 --> 00:12:31,990
What's your name?
251
00:12:31,990 --> 00:12:32,830
AUDIENCE: Claire.
252
00:12:32,830 --> 00:12:33,300
DAVID J. MALAN: Claire.
253
00:12:33,300 --> 00:12:33,800
OK.
254
00:12:33,800 --> 00:12:36,370
A round of applause for Claire for being so enthusiastic.
255
00:12:36,370 --> 00:12:37,552
[APPLAUSE]
256
00:12:37,552 --> 00:12:39,400
257
00:12:39,400 --> 00:12:40,540
Come on over here.
258
00:12:40,540 --> 00:12:43,000
Would you like to make a quick introduction to the group?
259
00:12:43,000 --> 00:12:43,625
AUDIENCE: Yeah.
260
00:12:43,625 --> 00:12:45,154
Hey.
261
00:12:45,154 --> 00:12:47,860
[CHUCKLES] I'm Claire.
262
00:12:47,860 --> 00:12:48,590
Yeah.
263
00:12:48,590 --> 00:12:50,380
That's all you need to know about me.
264
00:12:50,380 --> 00:12:50,830
DAVID J. MALAN: All right.
265
00:12:50,830 --> 00:12:51,340
[CHUCKLES]
266
00:12:51,340 --> 00:12:53,770
So what I'm about to hand Claire is a sheet of paper
267
00:12:53,770 --> 00:12:55,640
that has a drawing on it.
268
00:12:55,640 --> 00:13:00,100
And the goal at hand is for you all to ultimately follow
269
00:13:00,100 --> 00:13:02,410
Claire's hopefully very precise instructions,
270
00:13:02,410 --> 00:13:05,368
because she's going to give you step by step instructions, an algorithm
271
00:13:05,368 --> 00:13:08,930
if you will, for drawing something on that sheet of paper.
272
00:13:08,930 --> 00:13:09,430
All right.
273
00:13:09,430 --> 00:13:11,410
We're going to keep it in this manila envelope
274
00:13:11,410 --> 00:13:12,868
so that folks can't see through it.
275
00:13:12,868 --> 00:13:17,110
But this is what we would like you to give verbal instructions
276
00:13:17,110 --> 00:13:18,460
to the audience to draw.
277
00:13:18,460 --> 00:13:21,130
And you can say anything you want, but you may not
278
00:13:21,130 --> 00:13:25,032
make physical hand gestures or the like, and/or dip it down
279
00:13:25,032 --> 00:13:25,990
so everyone can see it.
280
00:13:25,990 --> 00:13:27,010
AUDIENCE: Oh, that's so true.
281
00:13:27,010 --> 00:13:27,670
That's so true.
282
00:13:27,670 --> 00:13:28,030
DAVID J. MALAN: All right.
283
00:13:28,030 --> 00:13:28,390
Go ahead.
284
00:13:28,390 --> 00:13:28,980
Step one.
285
00:13:28,980 --> 00:13:29,605
AUDIENCE: Wait.
286
00:13:29,605 --> 00:13:30,772
I could say whatever I want?
287
00:13:30,772 --> 00:13:32,230
DAVID J. MALAN: Related to this problem.
288
00:13:32,230 --> 00:13:32,830
Yes, indeed.
289
00:13:32,830 --> 00:13:33,755
[CHUCKLES]
290
00:13:33,755 --> 00:13:34,630
AUDIENCE: Oh, my God.
291
00:13:34,630 --> 00:13:38,080
DAVID J. MALAN: Give them instructions for recreating this picture on their paper.
292
00:13:38,080 --> 00:13:40,614
AUDIENCE: OK.
293
00:13:40,614 --> 00:13:53,080
Start with like a square but, but it's--
294
00:13:53,080 --> 00:13:54,460
DAVID J. MALAN: No hand gestures.
295
00:13:54,460 --> 00:13:54,780
AUDIENCE: OK.
296
00:13:54,780 --> 00:13:55,280
OK.
297
00:13:55,280 --> 00:13:55,870
Sorry, sorry.
298
00:13:55,870 --> 00:13:59,170
Start with the square, but it's like a diamond kind--
299
00:13:59,170 --> 00:14:01,810
like there's a point on top.
300
00:14:01,810 --> 00:14:04,000
[LAUGHS] Wait.
301
00:14:04,000 --> 00:14:05,680
I should not be the one doing this.
302
00:14:05,680 --> 00:14:06,430
OK.
303
00:14:06,430 --> 00:14:08,680
So it's like a square but--
304
00:14:08,680 --> 00:14:12,657
yeah, start with a square.
305
00:14:12,657 --> 00:14:13,240
DAVID J. MALAN: OK.
306
00:14:13,240 --> 00:14:14,290
Step two?
307
00:14:14,290 --> 00:14:18,520
AUDIENCE: Step two is that on one of the sides of the square
308
00:14:18,520 --> 00:14:20,035
there's another square.
309
00:14:20,035 --> 00:14:21,934
[CHUCKLES]
310
00:14:21,934 --> 00:14:25,100
311
00:14:25,100 --> 00:14:27,100
DAVID J. MALAN: Doing really well on the abstraction.
312
00:14:27,100 --> 00:14:29,030
AUDIENCE: I don't feel like I'm doing too hot.
313
00:14:29,030 --> 00:14:29,530
OK.
314
00:14:29,530 --> 00:14:30,880
Woo.
315
00:14:30,880 --> 00:14:33,057
Does this affect my grade in any way?
316
00:14:33,057 --> 00:14:33,640
DAVID J. MALAN: No.
317
00:14:33,640 --> 00:14:34,140
No.
318
00:14:34,140 --> 00:14:34,780
[LAUGHTER]
319
00:14:34,780 --> 00:14:35,140
AUDIENCE: OK.
320
00:14:35,140 --> 00:14:35,848
DAVID J. MALAN: Go on.
321
00:14:35,848 --> 00:14:36,620
Two squares.
322
00:14:36,620 --> 00:14:37,120
Step three.
323
00:14:37,120 --> 00:14:39,100
AUDIENCE: Then there's another square.
324
00:14:39,100 --> 00:14:39,960
[CHUCKLES]
325
00:14:39,960 --> 00:14:40,840
326
00:14:40,840 --> 00:14:41,800
But they're not square.
327
00:14:41,800 --> 00:14:44,470
They're kind of slanted.
328
00:14:44,470 --> 00:14:46,360
There's another square in between--
329
00:14:46,360 --> 00:14:52,420
next to those squares, connecting those squares.
330
00:14:52,420 --> 00:14:54,550
DAVID J. MALAN: Any step four?
331
00:14:54,550 --> 00:14:59,353
AUDIENCE: Step four is that it should look like a cube.
332
00:14:59,353 --> 00:15:00,802
[CHUCKLES]
333
00:15:00,802 --> 00:15:02,740
334
00:15:02,740 --> 00:15:03,370
DAVID J. MALAN: OK.
335
00:15:03,370 --> 00:15:05,050
So let's go ahead and pause here.
336
00:15:05,050 --> 00:15:06,112
Pause here.
337
00:15:06,112 --> 00:15:07,945
Let's thank Claire for coming on up bravely.
338
00:15:07,945 --> 00:15:08,445
[APPLAUSE]
339
00:15:08,445 --> 00:15:10,720
I'll take this.
340
00:15:10,720 --> 00:15:12,852
Let's go ahead and collect just a few of these.
341
00:15:12,852 --> 00:15:14,560
If maybe Carter and Valerie, you wouldn't
342
00:15:14,560 --> 00:15:16,600
mind helping me grab just a few sheets of paper.
343
00:15:16,600 --> 00:15:20,230
If you'd like to volunteer what it is you drew in those seconds,
344
00:15:20,230 --> 00:15:21,790
just hand it over if you would like.
345
00:15:21,790 --> 00:15:25,100
No need for a name or anything like that.
346
00:15:25,100 --> 00:15:25,600
OK.
347
00:15:25,600 --> 00:15:26,020
All right.
348
00:15:26,020 --> 00:15:26,560
Very eager.
349
00:15:26,560 --> 00:15:27,890
Thank you.
350
00:15:27,890 --> 00:15:28,390
OK.
351
00:15:28,390 --> 00:15:30,580
Thank you.
352
00:15:30,580 --> 00:15:32,870
All right.
353
00:15:32,870 --> 00:15:33,370
Thank you.
354
00:15:33,370 --> 00:15:34,480
Thank you.
355
00:15:34,480 --> 00:15:36,436
OK.
356
00:15:36,436 --> 00:15:37,870
[STRAINING] Sorry.
357
00:15:37,870 --> 00:15:38,820
OK.
358
00:15:38,820 --> 00:15:39,460
That's plenty.
359
00:15:39,460 --> 00:15:40,180
Let's come on up.
360
00:15:40,180 --> 00:15:42,670
If you want to-- want to hand me yours too.
361
00:15:42,670 --> 00:15:43,600
OK.
362
00:15:43,600 --> 00:15:44,870
Sorry to reach.
363
00:15:44,870 --> 00:15:45,370
All right.
364
00:15:45,370 --> 00:15:49,130
So Carter if you want to meet me up on stage for a second.
365
00:15:49,130 --> 00:15:52,540
So we have a whole bunch of submissions here
366
00:15:52,540 --> 00:15:56,770
that represent what it was Claire was describing.
367
00:15:56,770 --> 00:16:00,790
Let me go ahead and just project here in a moment.
368
00:16:00,790 --> 00:16:01,640
Use my camera.
369
00:16:01,640 --> 00:16:03,050
So here we have one.
370
00:16:03,050 --> 00:16:03,550
Let's see.
371
00:16:03,550 --> 00:16:06,530
Carter feel free to just bring those on up here.
372
00:16:06,530 --> 00:16:07,030
OK.
373
00:16:07,030 --> 00:16:10,750
So here we have one I'll hold up.
374
00:16:10,750 --> 00:16:11,350
All right.
375
00:16:11,350 --> 00:16:12,490
Some squares overlapping.
376
00:16:12,490 --> 00:16:13,990
Starts to look more like a cube.
377
00:16:13,990 --> 00:16:15,820
Thank you so much.
378
00:16:15,820 --> 00:16:18,200
Here maybe in more primitive form--
379
00:16:18,200 --> 00:16:18,700
[LAUGHTER]
380
00:16:18,700 --> 00:16:20,660
--was another one.
381
00:16:20,660 --> 00:16:23,200
This one kind of started to have wheels, which was kind of--
382
00:16:23,200 --> 00:16:24,610
[LAUGHTER]
383
00:16:24,610 --> 00:16:28,270
And then things started to take shape, perhaps at the very end, both big cube
384
00:16:28,270 --> 00:16:29,260
and small cube.
385
00:16:29,260 --> 00:16:32,680
What it was that Claire was showing us now, if we project it,
386
00:16:32,680 --> 00:16:34,390
was in fact this.
387
00:16:34,390 --> 00:16:36,280
And it's actually-- exactly what, Claire,
388
00:16:36,280 --> 00:16:39,250
you just went through is actually a perfect example of why abstraction
389
00:16:39,250 --> 00:16:41,410
can be hard and where the line is when you're just
390
00:16:41,410 --> 00:16:42,868
trying to communicate instructions.
391
00:16:42,868 --> 00:16:45,520
So in fairness, might have been nice to just start
392
00:16:45,520 --> 00:16:47,803
with, we're going to draw a cube and here's how,
393
00:16:47,803 --> 00:16:49,720
because that was kind of a spoiler at the end.
394
00:16:49,720 --> 00:16:51,610
But that too, a cube is an abstraction.
395
00:16:51,610 --> 00:16:53,260
But it's not very precise, right?
396
00:16:53,260 --> 00:16:54,700
Like, how big is the cube?
397
00:16:54,700 --> 00:16:56,567
At what angle is it rotated?
398
00:16:56,567 --> 00:16:57,650
How are you looking at it?
399
00:16:57,650 --> 00:17:00,067
And so when you were struggling to describe these squares,
400
00:17:00,067 --> 00:17:02,740
but no they're kind of like diamonds or whatnot, I mean,
401
00:17:02,740 --> 00:17:05,950
that's because of this tension between what it is you're trying to abstract
402
00:17:05,950 --> 00:17:08,170
but what it is you're trying to communicate.
403
00:17:08,170 --> 00:17:11,230
You could have gone maybe the complete other direction
404
00:17:11,230 --> 00:17:14,800
and maybe have been super precise and not abstract this thing away as a cube.
405
00:17:14,800 --> 00:17:16,540
But say to everyone, all right, everyone,
406
00:17:16,540 --> 00:17:18,380
put your pen down on the paper.
407
00:17:18,380 --> 00:17:23,720
Now draw a diagonal line to, say, southwest at 45 degrees.
408
00:17:23,720 --> 00:17:25,300
Now, do another one that's south.
409
00:17:25,300 --> 00:17:28,490
You could really get into the weeds and tell people to go up, down, left,
410
00:17:28,490 --> 00:17:28,990
right.
411
00:17:28,990 --> 00:17:31,573
Of course, it could get a little tricky if they sort of follow
412
00:17:31,573 --> 00:17:32,980
the direction incorrectly.
413
00:17:32,980 --> 00:17:34,930
But it would be hard for us all to know what
414
00:17:34,930 --> 00:17:38,690
it is we're drawing if all we're hearing are these very low level instructions.
415
00:17:38,690 --> 00:17:40,940
But that's what you're doing when you're writing code.
416
00:17:40,940 --> 00:17:43,090
You might implement a function called cube.
417
00:17:43,090 --> 00:17:46,150
How it works is via those low level instructions.
418
00:17:46,150 --> 00:17:47,740
But after that, you just don't care.
419
00:17:47,740 --> 00:17:50,350
You'd much rather think about it as a cube function,
420
00:17:50,350 --> 00:17:54,370
maybe with some arguments that speak to the size or the rotation of it
421
00:17:54,370 --> 00:17:57,140
or the like, and that's where, again, abstraction can come in.
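That cube function might look something like the following sketch, with the low-level corner math hidden behind one call that takes position and size arguments. All of the names here are invented for illustration, not taken from any lecture code:

```python
def square(x, y, side):
    """Low-level detail: the four corners of one face."""
    return [(x, y), (x + side, y), (x + side, y + side), (x, y + side)]

def cube(x, y, side, depth):
    """The abstraction: a front face plus a back face offset by `depth`.
    Callers just think 'cube here, this big'; the corner math stays hidden."""
    front = square(x, y, side)
    back = square(x + depth, y + depth, side)
    # Connecting edges pair up corresponding corners of the two faces.
    edges = list(zip(front, back))
    return front, back, edges

front, back, edges = cube(0, 0, 100, 40)
```

Anyone using `cube` never repeats the "pen down, diagonal line southwest" style of instructions; they only tweak the arguments.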
422
00:17:57,140 --> 00:17:59,020
So as we've discussed for so many weeks now,
423
00:17:59,020 --> 00:18:01,570
these trade-offs were manifest even in week zero,
424
00:18:01,570 --> 00:18:05,080
even if we didn't necessarily put our finger on it just then.
425
00:18:05,080 --> 00:18:07,480
Why don't we do things in a slightly different direction?
426
00:18:07,480 --> 00:18:09,620
If we could get one other volunteer.
427
00:18:09,620 --> 00:18:10,120
OK.
428
00:18:10,120 --> 00:18:10,840
Come on down.
429
00:18:10,840 --> 00:18:12,160
I saw your hand first.
430
00:18:12,160 --> 00:18:16,690
One other volunteer, who this time we're going to give the pen to.
431
00:18:16,690 --> 00:18:19,450
We're going to give the pen to.
432
00:18:19,450 --> 00:18:20,800
And what's your name?
433
00:18:20,800 --> 00:18:22,760
Jonathan, come on up.
434
00:18:22,760 --> 00:18:25,940
So I'm going to make this screen be drawable in just a moment.
435
00:18:25,940 --> 00:18:31,210
But what we need you to do first, on the honor system, is close your eyes.
436
00:18:31,210 --> 00:18:31,900
All right.
437
00:18:31,900 --> 00:18:33,085
Eyes are closed.
438
00:18:33,085 --> 00:18:34,960
Everyone else in the audience is about to see
439
00:18:34,960 --> 00:18:36,660
the picture that we want you to draw.
440
00:18:36,660 --> 00:18:39,160
And you all the audience are going to give Jonathan the step
441
00:18:39,160 --> 00:18:40,840
by step instructions this time around.
442
00:18:40,840 --> 00:18:42,380
So eyes stay closed.
443
00:18:42,380 --> 00:18:46,090
This is what we're going to want Jonathan to draw.
444
00:18:46,090 --> 00:18:49,060
So kind of ingrain it in your mind.
445
00:18:49,060 --> 00:18:52,540
If you need a refresher we can have him close his eyes again.
446
00:18:52,540 --> 00:18:53,890
That's what we want him to draw.
447
00:18:53,890 --> 00:18:55,540
I'm going to go back to the blank screen.
448
00:18:55,540 --> 00:18:55,720
All right.
449
00:18:55,720 --> 00:18:57,095
Jonathan, you can open your eyes.
450
00:18:57,095 --> 00:18:58,780
We have a blank canvas.
451
00:18:58,780 --> 00:19:03,130
And now, step one, what would you like Jonathan to draw first?
452
00:19:03,130 --> 00:19:04,293
AUDIENCE: Draw a circle.
453
00:19:04,293 --> 00:19:05,710
DAVID J. MALAN: Draw a circle, I heard.
454
00:19:05,710 --> 00:19:10,150
455
00:19:10,150 --> 00:19:12,860
OK, it's a little smaller I'm hearing now.
456
00:19:12,860 --> 00:19:13,360
OK.
457
00:19:13,360 --> 00:19:15,680
You can move it.
458
00:19:15,680 --> 00:19:16,180
Oh, no.
459
00:19:16,180 --> 00:19:16,763
Don't do that.
460
00:19:16,763 --> 00:19:17,930
All right.
461
00:19:17,930 --> 00:19:18,940
We'll give you one redo.
462
00:19:18,940 --> 00:19:21,820
Use three fingers to delete everything.
463
00:19:21,820 --> 00:19:23,350
Three fingers all together.
464
00:19:23,350 --> 00:19:24,130
Yep.
465
00:19:24,130 --> 00:19:26,920
There we-- farther apart.
466
00:19:26,920 --> 00:19:27,660
There we go.
467
00:19:27,660 --> 00:19:29,060
No, it's back.
468
00:19:29,060 --> 00:19:29,560
OK.
469
00:19:29,560 --> 00:19:30,110
I'll do this part.
470
00:19:30,110 --> 00:19:30,610
[CHUCKLES]
471
00:19:30,610 --> 00:19:31,210
OK.
472
00:19:31,210 --> 00:19:31,780
All right.
473
00:19:31,780 --> 00:19:33,580
So I heard-- thank you.
474
00:19:33,580 --> 00:19:35,080
I heard draw a circle.
475
00:19:35,080 --> 00:19:38,560
Would anyone like to finish the sentence more precisely?
476
00:19:38,560 --> 00:19:39,910
A smaller circle--
477
00:19:39,910 --> 00:19:41,380
[INAUDIBLE]
478
00:19:41,380 --> 00:19:41,965
--on top.
479
00:19:41,965 --> 00:19:43,960
AUDIENCE: A medium sized circle.
480
00:19:43,960 --> 00:19:46,760
DAVID J. MALAN: A medium sized circle at the top.
481
00:19:46,760 --> 00:19:47,260
All right.
482
00:19:47,260 --> 00:19:48,220
That's pretty good.
483
00:19:48,220 --> 00:19:49,617
Medium size circle at the top.
484
00:19:49,617 --> 00:19:50,950
And no more deleting after this.
485
00:19:50,950 --> 00:19:51,370
Good.
486
00:19:51,370 --> 00:19:51,820
[APPLAUSE]
487
00:19:51,820 --> 00:19:52,320
All right.
488
00:19:52,320 --> 00:19:53,492
Step two.
489
00:19:53,492 --> 00:19:54,820
AUDIENCE: A line straight down.
490
00:19:54,820 --> 00:19:56,440
DAVID J. MALAN: A line straight down.
491
00:19:56,440 --> 00:20:03,323
AUDIENCE: [INAUDIBLE]
492
00:20:03,323 --> 00:20:03,990
DAVID J. MALAN: Yeah.
493
00:20:03,990 --> 00:20:05,080
OK.
494
00:20:05,080 --> 00:20:05,580
Good.
495
00:20:05,580 --> 00:20:05,820
All right.
496
00:20:05,820 --> 00:20:06,570
That was step two.
497
00:20:06,570 --> 00:20:07,990
Nicely done.
498
00:20:07,990 --> 00:20:08,490
What's that?
499
00:20:08,490 --> 00:20:09,676
Step three.
500
00:20:09,676 --> 00:20:13,167
AUDIENCE: Draw a line down from the bottom of the line to the left.
501
00:20:13,167 --> 00:20:15,500
DAVID J. MALAN: Draw a line down from the bottom to the left.
502
00:20:15,500 --> 00:20:19,790
AUDIENCE: [INAUDIBLE]
503
00:20:19,790 --> 00:20:21,650
DAVID J. MALAN: OK.
504
00:20:21,650 --> 00:20:22,560
Good.
505
00:20:22,560 --> 00:20:23,060
All right.
506
00:20:23,060 --> 00:20:23,540
Next.
507
00:20:23,540 --> 00:20:24,332
Let's go over here.
508
00:20:24,332 --> 00:20:27,020
Next one?
509
00:20:27,020 --> 00:20:29,325
Same thing, but on the right.
510
00:20:29,325 --> 00:20:29,950
AUDIENCE: Yeah.
511
00:20:29,950 --> 00:20:30,950
DAVID J. MALAN: Yes.
512
00:20:30,950 --> 00:20:32,040
All right.
513
00:20:32,040 --> 00:20:32,540
That's what?
514
00:20:32,540 --> 00:20:33,457
One, two, three, four.
515
00:20:33,457 --> 00:20:35,750
Step five?
516
00:20:35,750 --> 00:20:36,910
Yes, step five.
517
00:20:36,910 --> 00:20:38,210
AUDIENCE: Do that again higher.
518
00:20:38,210 --> 00:20:39,590
Closer to the circle.
519
00:20:39,590 --> 00:20:43,670
DAVID J. MALAN: Do that again but higher, closer to the circle on the right side.
520
00:20:43,670 --> 00:20:49,280
AUDIENCE: [INAUDIBLE]
521
00:20:49,280 --> 00:20:50,540
DAVID J. MALAN: Oh.
522
00:20:50,540 --> 00:20:52,320
We're going to have to go with it.
523
00:20:52,320 --> 00:20:54,050
Step six.
524
00:20:54,050 --> 00:20:55,382
Step six?
525
00:20:55,382 --> 00:20:59,270
AUDIENCE: Starting from the neck, draw a line down and to the right.
526
00:20:59,270 --> 00:21:02,821
DAVID J. MALAN: Starting from the neck, draw a line down and to the right.
527
00:21:02,821 --> 00:21:08,018
AUDIENCE: [INAUDIBLE]
528
00:21:08,018 --> 00:21:09,310
DAVID J. MALAN: You don't like that.
529
00:21:09,310 --> 00:21:09,810
He's--
530
00:21:09,810 --> 00:21:10,870
AUDIENCE: [INAUDIBLE]
531
00:21:10,870 --> 00:21:11,710
DAVID J. MALAN: What do you want him to do?
532
00:21:11,710 --> 00:21:12,235
Step six.
533
00:21:12,235 --> 00:21:14,870
534
00:21:14,870 --> 00:21:16,520
AUDIENCE: [INAUDIBLE] step four.
535
00:21:16,520 --> 00:21:19,117
DAVID J. MALAN: [GUFFAWS] No undo.
536
00:21:19,117 --> 00:21:21,288
AUDIENCE: [INAUDIBLE]
537
00:21:21,288 --> 00:21:23,330
DAVID J. MALAN: Where the other line ends-- say again.
538
00:21:23,330 --> 00:21:28,065
AUDIENCE: [INAUDIBLE]
539
00:21:28,065 --> 00:21:29,440
DAVID J. MALAN: Where the other line--
540
00:21:29,440 --> 00:21:35,350
AUDIENCE: [INAUDIBLE]
541
00:21:35,350 --> 00:21:40,690
DAVID J. MALAN: Near the vertical line where the other line ends
542
00:21:40,690 --> 00:21:43,360
draw a line that goes down.
543
00:21:43,360 --> 00:21:44,130
AUDIENCE: Yeah.
544
00:21:44,130 --> 00:21:44,970
DAVID J. MALAN: OK.
545
00:21:44,970 --> 00:21:45,950
A couple more steps.
546
00:21:45,950 --> 00:21:46,770
Step seven.
547
00:21:46,770 --> 00:21:47,460
Seven.
548
00:21:47,460 --> 00:21:50,550
AUDIENCE: Draw a horizontally slanting to the right line
549
00:21:50,550 --> 00:21:52,335
from the end of the line you just drew.
550
00:21:52,335 --> 00:21:55,680
DAVID J. MALAN: Draw a horizontally slanting line from the end of the line
551
00:21:55,680 --> 00:21:56,725
you just drew.
552
00:21:56,725 --> 00:21:57,600
AUDIENCE: Diagonally.
553
00:21:57,600 --> 00:21:59,196
DAVID J. MALAN: Diagonally.
554
00:21:59,196 --> 00:22:02,477
AUDIENCE: [INAUDIBLE]
555
00:22:02,477 --> 00:22:03,060
DAVID J. MALAN: OK.
556
00:22:03,060 --> 00:22:06,540
We're resorting to hand gestures now, but I think that's what you mean.
557
00:22:06,540 --> 00:22:07,630
Yes.
558
00:22:07,630 --> 00:22:08,130
OK.
559
00:22:08,130 --> 00:22:09,480
Good.
560
00:22:09,480 --> 00:22:10,800
Good.
561
00:22:10,800 --> 00:22:11,310
All right.
562
00:22:11,310 --> 00:22:12,480
One or two final steps.
563
00:22:12,480 --> 00:22:13,740
Let's get as close as we can.
564
00:22:13,740 --> 00:22:15,480
AUDIENCE: [INAUDIBLE]
565
00:22:15,480 --> 00:22:17,067
DAVID J. MALAN: Say hi.
566
00:22:17,067 --> 00:22:18,150
AUDIENCE: Make him say hi.
567
00:22:18,150 --> 00:22:19,440
DAVID J. MALAN: Make him say hi.
568
00:22:19,440 --> 00:22:21,360
AUDIENCE: No.
569
00:22:21,360 --> 00:22:22,083
DAVID J. MALAN: No.
570
00:22:22,083 --> 00:22:28,397
AUDIENCE: [INAUDIBLE] High.
571
00:22:28,397 --> 00:22:28,980
DAVID J. MALAN: OK.
572
00:22:28,980 --> 00:22:29,490
High.
573
00:22:29,490 --> 00:22:32,240
AUDIENCE: [INAUDIBLE]
574
00:22:32,240 --> 00:22:32,840
DAVID J. MALAN: OK.
575
00:22:32,840 --> 00:22:34,190
And maybe one final step.
576
00:22:34,190 --> 00:22:36,230
We'll give them one more--
577
00:22:36,230 --> 00:22:39,050
say again.
578
00:22:39,050 --> 00:22:39,550
Again.
579
00:22:39,550 --> 00:22:42,200
AUDIENCE: Put one of the lines from high to the circle.
580
00:22:42,200 --> 00:22:46,160
DAVID J. MALAN: Put one of those lines from high to the circle.
581
00:22:46,160 --> 00:22:52,280
AUDIENCE: [INAUDIBLE]
582
00:22:52,280 --> 00:22:55,970
DAVID J. MALAN: A line between high and the circle.
583
00:22:55,970 --> 00:22:56,840
All right.
584
00:22:56,840 --> 00:22:58,520
Let's show Jonathan.
585
00:22:58,520 --> 00:22:59,720
That's pretty darn close.
586
00:22:59,720 --> 00:23:03,350
Let's show him what we had in mind was this.
587
00:23:03,350 --> 00:23:06,620
So a round of applause for Jonathan too, if we could.
588
00:23:06,620 --> 00:23:09,110
A bigger round of applause for Jonathan, if we could.
589
00:23:09,110 --> 00:23:10,415
[APPLAUSE]
590
00:23:10,415 --> 00:23:11,720
591
00:23:11,720 --> 00:23:14,480
All right.
592
00:23:14,480 --> 00:23:17,420
There is this thing in computer science known as pair programming,
593
00:23:17,420 --> 00:23:18,890
where you actually program with someone else.
594
00:23:18,890 --> 00:23:20,270
And that's actually not all that dissimilar
595
00:23:20,270 --> 00:23:22,320
from trying to communicate your ideas to someone else.
596
00:23:22,320 --> 00:23:24,712
But notice just all of the ambiguities, and it certainly
597
00:23:24,712 --> 00:23:26,670
doesn't help that we're in a big space, but all
598
00:23:26,670 --> 00:23:28,587
of the ambiguities that arise when you're just
599
00:23:28,587 --> 00:23:30,140
trying to convey something precisely.
600
00:23:30,140 --> 00:23:33,530
So this is not necessarily as constrained as a program would be.
601
00:23:33,530 --> 00:23:35,490
But it's representative of how, at the end of the day,
602
00:23:35,490 --> 00:23:37,680
even after all these weeks, this stuff is hard.
603
00:23:37,680 --> 00:23:40,430
And in fact, it's not necessarily ever going
604
00:23:40,430 --> 00:23:43,160
to be completely straightforward, because the problems you're
605
00:23:43,160 --> 00:23:45,230
going to try solving down the road presumably,
606
00:23:45,230 --> 00:23:47,420
if you continue to apply these skills, themselves
607
00:23:47,420 --> 00:23:49,610
are just going to get more and more sophisticated.
608
00:23:49,610 --> 00:23:53,030
But hopefully, the feeling you get from accomplishing something as a result
609
00:23:53,030 --> 00:23:55,220
is just going to rise with them as well.
610
00:23:55,220 --> 00:23:58,970
Before we now do a bit of review, just wanted to offer a few suggestions
611
00:23:58,970 --> 00:24:02,948
in answer to an FAQ, which is, what do I do after a class like CS50?
612
00:24:02,948 --> 00:24:05,990
Typically about half of you will go on and take one or more other classes
613
00:24:05,990 --> 00:24:09,032
in CS, which is great building on this kind of foundation, and about half
614
00:24:09,032 --> 00:24:10,160
of you will not.
615
00:24:10,160 --> 00:24:10,910
This will be it.
616
00:24:10,910 --> 00:24:13,790
But very likely, certainly given how the world
617
00:24:13,790 --> 00:24:16,650
is trending, you'll have opportunities in the arts, humanities,
618
00:24:16,650 --> 00:24:20,120
social sciences, or beyond to just apply programming to data
619
00:24:20,120 --> 00:24:22,230
sets, to problems, in those domains.
620
00:24:22,230 --> 00:24:24,740
And so, toward that end, we would encourage
621
00:24:24,740 --> 00:24:31,100
you to start thinking about how you can transition from what has been your
622
00:24:31,100 --> 00:24:34,940
Codespace in the cloud to something client side, like using your own Mac
623
00:24:34,940 --> 00:24:36,570
or PC from here on out.
624
00:24:36,570 --> 00:24:40,460
So that you're not reliant on a course's infrastructure, a particular website.
625
00:24:40,460 --> 00:24:43,220
And even though we used a fairly industry standard tool,
626
00:24:43,220 --> 00:24:45,590
you can actually get almost all of that stuff running,
627
00:24:45,590 --> 00:24:48,110
with some effort perhaps, on your own Mac and PC.
628
00:24:48,110 --> 00:24:51,200
So a terminal window actually comes built into macOS.
629
00:24:51,200 --> 00:24:53,780
If you go to your applications folder, utilities,
630
00:24:53,780 --> 00:24:57,080
there is a program literally called terminal that has always been there,
631
00:24:57,080 --> 00:24:59,205
even if you've never used it, that will behave very
632
00:24:59,205 --> 00:25:01,432
similarly to what VS Code does as well.
633
00:25:01,432 --> 00:25:03,140
In the world of Windows, you can similarly
634
00:25:03,140 --> 00:25:05,780
install a version of the terminal window software
635
00:25:05,780 --> 00:25:09,470
that we used in the cloud, too, to actually run similar commands like cd,
636
00:25:09,470 --> 00:25:11,180
and ls, and much more.
637
00:25:11,180 --> 00:25:13,820
We would encourage you ultimately, to learn Git.
638
00:25:13,820 --> 00:25:16,100
You've been indirectly using Git this semester.
639
00:25:16,100 --> 00:25:18,830
When you run certain commands, we have been
640
00:25:18,830 --> 00:25:21,470
using Git underneath the hood of some of CS50's tools
641
00:25:21,470 --> 00:25:25,040
that essentially push your code, so to speak, to the cloud,
642
00:25:25,040 --> 00:25:26,690
to a place like GitHub.com.
643
00:25:26,690 --> 00:25:30,920
But, Git itself is an incredibly powerful and just useful
644
00:25:30,920 --> 00:25:34,280
tool for one, backing up your code somewhere else to the cloud, which
645
00:25:34,280 --> 00:25:37,700
is effectively what we've used it for, but two, collaboration,
646
00:25:37,700 --> 00:25:40,820
so that you can actually share your code more readily with other people.
647
00:25:40,820 --> 00:25:44,450
And three, building much bigger pieces of software, where each of you
648
00:25:44,450 --> 00:25:46,520
work on different files, different folders,
649
00:25:46,520 --> 00:25:48,560
or even just different parts of the same file,
650
00:25:48,560 --> 00:25:50,578
and then somehow merge all of your handiwork
651
00:25:50,578 --> 00:25:53,120
together at the end of the day to build something much bigger
652
00:25:53,120 --> 00:25:55,520
than you as one person could alone.
653
00:25:55,520 --> 00:25:58,100
VS Code itself now too, we've been hosting it
654
00:25:58,100 --> 00:26:00,380
in the cloud, a real version of VS Code.
655
00:26:00,380 --> 00:26:03,590
But it's much more commonly used on people's own Macs and PCs.
656
00:26:03,590 --> 00:26:06,230
And you can download it onto your own Mac and PC.
657
00:26:06,230 --> 00:26:08,300
You might have to jump through a few more hoops
658
00:26:08,300 --> 00:26:10,250
to get things like C working, though Python
659
00:26:10,250 --> 00:26:12,410
is much easier to get working as well.
660
00:26:12,410 --> 00:26:14,600
Some of the configuration won't be quite the same,
661
00:26:14,600 --> 00:26:16,860
like your prompt might look a little different and the like.
662
00:26:16,860 --> 00:26:18,230
But that's just going to be the case any time
663
00:26:18,230 --> 00:26:20,480
you sit down in the future at a different system,
664
00:26:20,480 --> 00:26:23,647
it's going to look and feel a little different to things you've used before.
665
00:26:23,647 --> 00:26:25,880
But hopefully there'll be enough familiarities
666
00:26:25,880 --> 00:26:29,420
that you can get yourself up and running pretty quickly, nonetheless.
667
00:26:29,420 --> 00:26:30,440
Hosting a website.
668
00:26:30,440 --> 00:26:34,040
Not necessarily something you have to do or will do for your final project,
669
00:26:34,040 --> 00:26:35,300
depending on your proposal.
670
00:26:35,300 --> 00:26:39,350
But there's lots of ways to just host your own portfolio page, home
671
00:26:39,350 --> 00:26:43,430
page, website, whatever, on the internet itself
672
00:26:43,430 --> 00:26:46,760
using tools like these, GitHub, or Netlify, or other tools too.
673
00:26:46,760 --> 00:26:50,265
Most of which have free student friendly plans.
674
00:26:50,265 --> 00:26:51,890
Some of these are indeed paid services.
675
00:26:51,890 --> 00:26:53,640
But they very often have entry level plans
676
00:26:53,640 --> 00:26:56,810
that are totally fine if it's just you on the internet
677
00:26:56,810 --> 00:26:59,553
and you don't expect to have thousands or tens of thousands of users.
678
00:26:59,553 --> 00:27:01,470
It's a drop in the bucket for these companies.
679
00:27:01,470 --> 00:27:03,980
And so they very often have free tiers of service.
680
00:27:03,980 --> 00:27:06,500
If you want to host something more dynamic, something
681
00:27:06,500 --> 00:27:10,415
like CS50 Finance that takes user input and output, uses sessions,
682
00:27:10,415 --> 00:27:13,490
uses databases, you might like something like Heroku.
683
00:27:13,490 --> 00:27:17,150
And for instance, we have some documentation on one of CS50's websites
684
00:27:17,150 --> 00:27:20,180
for actually moving your implementation of CS50 Finance
685
00:27:20,180 --> 00:27:24,590
over to this third party application called Heroku so that you can actually
686
00:27:24,590 --> 00:27:27,920
run it or something like it in the cloud as well, here too
687
00:27:27,920 --> 00:27:29,750
using a free tier of service.
688
00:27:29,750 --> 00:27:31,200
All of these providers.
689
00:27:31,200 --> 00:27:33,740
These are big cloud providers these days.
690
00:27:33,740 --> 00:27:36,380
Amazon, Microsoft, Google, and others all
691
00:27:36,380 --> 00:27:40,040
have student-friendly accounts that you can sign up for during or shortly
692
00:27:40,040 --> 00:27:44,210
after you're in school that just give you free compute time and storage.
693
00:27:44,210 --> 00:27:47,300
GitHub itself has this whole student pack that by transitivity gives you
694
00:27:47,300 --> 00:27:50,160
access to a whole bunch of discounts on other things as well.
695
00:27:50,160 --> 00:27:51,952
So if you're liking this stuff and you just
696
00:27:51,952 --> 00:27:54,680
want to learn more perhaps over break, by playing on your own,
697
00:27:54,680 --> 00:27:57,710
these then would be some good starting points.
698
00:27:57,710 --> 00:27:59,630
And as for just keeping abreast of trends
699
00:27:59,630 --> 00:28:02,180
in programming and technology or the like,
700
00:28:02,180 --> 00:28:04,530
there's so many different blogs and websites out there.
701
00:28:04,530 --> 00:28:07,640
But here are just a couple of different subreddits, so to speak,
702
00:28:07,640 --> 00:28:10,940
on Reddit that are very programming specific, Stack Overflow with which
703
00:28:10,940 --> 00:28:14,750
you've probably interacted, Server Fault, which is similar,
704
00:28:14,750 --> 00:28:17,990
TechCrunch, Y Combinator, and other sites too.
705
00:28:17,990 --> 00:28:20,150
And ultimately, we would encourage all of you
706
00:28:20,150 --> 00:28:22,610
to stay in touch, certainly beyond today.
707
00:28:22,610 --> 00:28:25,055
By the time you finish your final projects
708
00:28:25,055 --> 00:28:27,120
we'll have something waiting for you.
709
00:28:27,120 --> 00:28:29,720
And if you want to stay engaged, either on the teaching staff
710
00:28:29,720 --> 00:28:33,240
or just as a lifelong learner of computer science and programming, by all means,
711
00:28:33,240 --> 00:28:34,820
check out any of these URLs here.
712
00:28:34,820 --> 00:28:38,030
But in just a few weeks' time, you will have one of these to your name--
713
00:28:38,030 --> 00:28:41,840
Your very own I took CS50 t-shirt, which we will distribute before long--
714
00:28:41,840 --> 00:28:43,640
[APPLAUSE]
715
00:28:43,640 --> 00:28:45,330
--as well.
716
00:28:45,330 --> 00:28:48,800
And now, if we may, we have an opportunity here
717
00:28:48,800 --> 00:28:51,950
to synthesize the past several weeks of material.
718
00:28:51,950 --> 00:28:55,070
If you would like to go ahead and open up the URL that we
719
00:28:55,070 --> 00:28:56,240
put on the screen earlier.
720
00:28:56,240 --> 00:28:57,720
I'll toss it up here again.
721
00:28:57,720 --> 00:28:59,970
You can use your phone or your laptop.
722
00:28:59,970 --> 00:29:02,750
You might recall for a previous problem set,
723
00:29:02,750 --> 00:29:06,770
we asked you to propose a whole bunch of review questions, multiple choice
724
00:29:06,770 --> 00:29:10,100
or the like that synthesized the past several weeks of material.
725
00:29:10,100 --> 00:29:14,240
We took some of our favorite submissions of those, ported them to this Poll
726
00:29:14,240 --> 00:29:16,610
Everywhere platform, so that we could interactively
727
00:29:16,610 --> 00:29:19,470
see where everyone's minds are at, where their understanding is at.
728
00:29:19,470 --> 00:29:23,090
And I think you'll find all of these are written by you and your classmates.
729
00:29:23,090 --> 00:29:26,648
And we slipped a few fun ones in there, also written by you, along the way.
730
00:29:26,648 --> 00:29:28,940
Carter, if you want to come on up here to get us ready.
731
00:29:28,940 --> 00:29:31,820
If you haven't yet opened the website, go to this URL
732
00:29:31,820 --> 00:29:35,180
here on your phone or your laptop.
733
00:29:35,180 --> 00:29:37,490
And let me go ahead and switch us over here
734
00:29:37,490 --> 00:29:41,480
before Carter takes control of this machine here.
735
00:29:41,480 --> 00:29:43,690
Here's that same 2D bar code again.
736
00:29:43,690 --> 00:29:45,890
Feel free to background that now.
737
00:29:45,890 --> 00:29:49,250
And in just a moment, we've got a 20 question quiz show.
738
00:29:49,250 --> 00:29:50,485
It's all multiple choice.
739
00:29:50,485 --> 00:29:52,610
So long as you have internet access, whether you're
740
00:29:52,610 --> 00:29:54,650
here physically or online right now, you should
741
00:29:54,650 --> 00:29:58,130
be able to buzz in within 10 to 20 seconds of seeing a question.
742
00:29:58,130 --> 00:29:59,630
And I'll read each one aloud.
743
00:29:59,630 --> 00:30:01,560
I think, Carter, we're just about good to go.
744
00:30:01,560 --> 00:30:05,210
So, does everyone have the software up and running
745
00:30:05,210 --> 00:30:06,980
on their phone or their laptop?
746
00:30:06,980 --> 00:30:09,260
If not, no big deal, just look on with a friend.
747
00:30:09,260 --> 00:30:12,205
But otherwise, Carter, do you want to say hello too and tee us up?
748
00:30:12,205 --> 00:30:13,580
CARTER: Absolutely. Hi, everyone.
749
00:30:13,580 --> 00:30:17,060
We're going to go ahead and get started here with our first question.
750
00:30:17,060 --> 00:30:18,300
Speed here matters.
751
00:30:18,300 --> 00:30:19,970
So our first question, David, go ahead.
752
00:30:19,970 --> 00:30:20,708
What is it?
753
00:30:20,708 --> 00:30:23,750
DAVID J. MALAN: What does CSS stand for, is the first question, written by you.
754
00:30:23,750 --> 00:30:27,830
Four possible options are, Cascading Style Sheet,
755
00:30:27,830 --> 00:30:32,600
coding style sheet, cascading style system, coded style sheet.
756
00:30:32,600 --> 00:30:38,570
Fifteen seconds, up to 300 responses already both here in person and online.
757
00:30:38,570 --> 00:30:40,140
Give folks a few more seconds.
758
00:30:40,140 --> 00:30:41,570
What does CSS stand for?
759
00:30:41,570 --> 00:30:43,700
These are the four options that were provided.
760
00:30:43,700 --> 00:30:47,540
Three, two, one, Carter.
761
00:30:47,540 --> 00:30:51,900
Cascading Style Sheets at 86% is indeed the right answer.
762
00:30:51,900 --> 00:30:54,710
So congrats to those of you 86% who got that one.
763
00:30:54,710 --> 00:30:55,790
Here's the leaderboard.
764
00:30:55,790 --> 00:30:58,040
You all have fairly random user names.
765
00:30:58,040 --> 00:31:01,835
But if your username is on this board here or really any of the 86% of you
766
00:31:01,835 --> 00:31:04,460
that just got that right, all of you are currently in the lead.
767
00:31:04,460 --> 00:31:07,610
But we'll see if this shifts before long.
768
00:31:07,610 --> 00:31:11,330
Question two, which best describes the role of a compiler,
769
00:31:11,330 --> 00:31:13,760
is our next question.
770
00:31:13,760 --> 00:31:16,790
Debug one's code, run the written program,
771
00:31:16,790 --> 00:31:19,190
distinguish between functions and arguments,
772
00:31:19,190 --> 00:31:22,280
turn source code into machine code.
773
00:31:22,280 --> 00:31:23,990
Three hundred responses in so far.
774
00:31:23,990 --> 00:31:25,220
Ten seconds to go.
775
00:31:25,220 --> 00:31:29,960
Which best describes the role of a compiler?
776
00:31:29,960 --> 00:31:31,400
Three seconds.
777
00:31:31,400 --> 00:31:32,720
Just crossed 400.
778
00:31:32,720 --> 00:31:37,730
And Carter, turning source code into machine code, at 92%.
779
00:31:37,730 --> 00:31:40,610
Some excellent progress there, is indeed the correct answer.
780
00:31:40,610 --> 00:31:45,120
And indeed more generally, a compiler just converts one language to another.
781
00:31:45,120 --> 00:31:48,950
The use cases we've seen for it have been only source code to machine code.
782
00:31:48,950 --> 00:31:50,870
But as you go out into the real world, you'll
783
00:31:50,870 --> 00:31:54,500
actually find there to be compilers from one source code language
784
00:31:54,500 --> 00:31:57,080
to another source code language that itself might
785
00:31:57,080 --> 00:31:59,060
be runnable or compilable thereafter.
786
00:31:59,060 --> 00:32:00,950
Good job to all of you guests.
787
00:32:00,950 --> 00:32:02,960
And Carter number three.
788
00:32:02,960 --> 00:32:06,560
What is the type of argc, asks a classmate.
789
00:32:06,560 --> 00:32:08,570
Int, str, char, float.
790
00:32:08,570 --> 00:32:11,420
791
00:32:11,420 --> 00:32:13,010
What is the type of argc?
792
00:32:13,010 --> 00:32:16,680
793
00:32:16,680 --> 00:32:18,600
All right about 350 responses.
794
00:32:18,600 --> 00:32:20,460
Seven seconds to go.
795
00:32:20,460 --> 00:32:24,360
About to cross the 400 threshold in three, two, one.
796
00:32:24,360 --> 00:32:28,257
The type of argc is int is indeed correct.
797
00:32:28,257 --> 00:32:30,090
But we're now starting to distinguish folks.
798
00:32:30,090 --> 00:32:32,310
Only 55% there.
799
00:32:32,310 --> 00:32:34,170
Char is not correct.
800
00:32:34,170 --> 00:32:38,100
You might be thinking of argv in C. But even that is not a char.
801
00:32:38,100 --> 00:32:43,090
It's a char star array or a char star star in fact.
802
00:32:43,090 --> 00:32:44,130
So it's not just a char.
803
00:32:44,130 --> 00:32:46,590
Str is in Python.
804
00:32:46,590 --> 00:32:49,140
But even that too, if you were thinking of sys.argv,
805
00:32:49,140 --> 00:32:51,700
that would be a list of strs, not a single str.
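In C, those types can be sketched as follows; the count_args helper here is hypothetical, not from the lecture, and simply mirrors main's own signature to exercise the types:

```c
#include <stdio.h>

// argc ("argument count") is an int: how many words were typed at the
// prompt. argv ("argument vector") is an array of strings, declared as
// char *argv[] (equivalently char **argv), not a single char.
// Hypothetical helper mirroring main's signature:
int count_args(int argc, char *argv[])
{
    for (int i = 0; i < argc; i++)
    {
        // each argv[i] is one string (char *)
        printf("argv[%i] is %s\n", i, argv[i]);
    }
    return argc;  // echo the int count back
}
```

Calling count_args(argc, argv) from main would print each argument and return the int count.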
806
00:32:51,700 --> 00:32:52,200
All right.
807
00:32:52,200 --> 00:32:55,000
Carter, the leaderboard.
808
00:32:55,000 --> 00:32:55,500
All right.
809
00:32:55,500 --> 00:32:56,940
There are our guests all in.
810
00:32:56,940 --> 00:32:57,600
Still tied.
811
00:32:57,600 --> 00:32:58,710
And number four.
812
00:32:58,710 --> 00:33:03,990
What is the searching efficiency of a balanced binary search tree?
813
00:33:03,990 --> 00:33:10,850
Big O of n, big O of n squared, big O of log n, big O of n log n.
814
00:33:10,850 --> 00:33:15,110
What is the searching efficiency of a balanced binary search tree?
815
00:33:15,110 --> 00:33:19,250
The balance being key because, as folks continue buzzing in,
816
00:33:19,250 --> 00:33:25,760
recall that binary search trees can degrade, devolve into linked lists--
817
00:33:25,760 --> 00:33:29,380
Big O of log n is correct, for 54%.
818
00:33:29,380 --> 00:33:29,880
All right.
819
00:33:29,880 --> 00:33:31,400
Now, people are getting annoyed.
820
00:33:31,400 --> 00:33:32,400
But let's keep going.
821
00:33:32,400 --> 00:33:33,980
Number five.
822
00:33:33,980 --> 00:33:35,840
Leaderboard's not yet that interesting.
823
00:33:35,840 --> 00:33:40,010
More subtle, what was the CS50 duck's Halloween costume?
824
00:33:40,010 --> 00:33:42,740
He's here in winter dress today thanks to Valerie.
825
00:33:42,740 --> 00:33:48,590
A skeleton, a vampire, Frankenstein, or a ghost.
826
00:33:48,590 --> 00:33:54,020
What was its costume at Halloween a few weeks back?
827
00:33:54,020 --> 00:33:57,050
Answers are coming in a little slower this time.
828
00:33:57,050 --> 00:34:00,170
People online are perhaps clicking on the video.
829
00:34:00,170 --> 00:34:03,860
And vampire is correct at 69%.
830
00:34:03,860 --> 00:34:05,780
Nicely done.
831
00:34:05,780 --> 00:34:06,530
All right.
832
00:34:06,530 --> 00:34:08,690
Guests are still shuffled in the top.
833
00:34:08,690 --> 00:34:12,320
Oh, and we're starting to see some leaders pull ahead.
834
00:34:12,320 --> 00:34:15,679
The time in which you buzz in is also taken into account now.
835
00:34:15,679 --> 00:34:19,130
In C how can we unify several variables of different types
836
00:34:19,130 --> 00:34:21,320
into a single new type?
837
00:34:21,320 --> 00:34:25,960
Trees, arrays, structs, tables.
838
00:34:25,960 --> 00:34:26,770
Oh, it got quiet.
839
00:34:26,770 --> 00:34:30,199
In C how can we unify several variables of different types
840
00:34:30,199 --> 00:34:33,040
into a single new type?
841
00:34:33,040 --> 00:34:34,719
Eight seconds.
842
00:34:34,719 --> 00:34:36,280
Four hundred responses in.
843
00:34:36,280 --> 00:34:39,610
844
00:34:39,610 --> 00:34:40,630
Four hundred fifty.
845
00:34:40,630 --> 00:34:44,679
And the answer is structs are indeed correct.
846
00:34:44,679 --> 00:34:46,270
Recall that we had a student struct.
847
00:34:46,270 --> 00:34:48,370
And we saw structs later on for nodes that
848
00:34:48,370 --> 00:34:52,780
allowed us to cluster multiple variables or data types inside of our own brand
849
00:34:52,780 --> 00:34:55,389
new structure that we then typedef to a name.
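A minimal sketch in C of that idea; the student type and its fields here are illustrative, echoing the lecture's example rather than its exact code:

```c
#include <string.h>

// struct clusters variables of different types (a string and an int
// here), and typedef gives the new structure its own one-word name.
typedef struct
{
    char *name;
    int grade;
}
student;

// Illustrative convenience constructor for the new type
student make_student(char *name, int grade)
{
    student s;
    s.name = name;
    s.grade = grade;
    return s;
}
```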
850
00:34:55,389 --> 00:34:57,250
Carter, should we see the leaderboard now?
851
00:34:57,250 --> 00:34:58,270
[GRUMBLING]
852
00:34:58,270 --> 00:34:58,930
All right.
853
00:34:58,930 --> 00:35:05,230
Whoever guests 4045 and 4383 are have eked ahead ever so slightly.
854
00:35:05,230 --> 00:35:08,080
So buzzing and fast can now benefit your score too.
855
00:35:08,080 --> 00:35:09,550
Next question, Carter.
856
00:35:09,550 --> 00:35:13,540
In Python, which of the following statements is false,
857
00:35:13,540 --> 00:35:17,680
tuples are an ordered immutable set of data, dictionaries associate keys
858
00:35:17,680 --> 00:35:20,770
with values, arrays in Python are fixed size,
859
00:35:20,770 --> 00:35:23,320
Python is an object oriented language?
860
00:35:23,320 --> 00:35:25,250
Which of those statements is false?
861
00:35:25,250 --> 00:35:29,470
862
00:35:29,470 --> 00:35:30,550
Three seconds.
863
00:35:30,550 --> 00:35:33,430
Answers coming in more slowly.
864
00:35:33,430 --> 00:35:35,950
But the most popular answer is correct.
865
00:35:35,950 --> 00:35:40,342
Arrays in Python are indeed not of a fixed size, which is why that's false.
866
00:35:40,342 --> 00:35:42,550
They're not even called arrays, they're called lists.
867
00:35:42,550 --> 00:35:45,760
And recall that they dynamically grow and shrink effectively implemented
868
00:35:45,760 --> 00:35:48,400
for you as a linked list.
869
00:35:48,400 --> 00:35:50,200
All right.
870
00:35:50,200 --> 00:35:50,860
All right.
871
00:35:50,860 --> 00:35:52,270
We have a leader.
872
00:35:52,270 --> 00:35:56,290
Whoever 4383 is, nicely done.
873
00:35:56,290 --> 00:36:02,560
What does strcmp return in C, S-T-R-C-M-P. Does it return a boolean,
874
00:36:02,560 --> 00:36:04,255
an integer, a string, or a char?
875
00:36:04,255 --> 00:36:08,060
876
00:36:08,060 --> 00:36:10,280
What does strcmp return in C?
877
00:36:10,280 --> 00:36:12,305
Used to compare two strings of course.
878
00:36:12,305 --> 00:36:15,180
879
00:36:15,180 --> 00:36:21,150
Recall that it returns potentially not just true/false but--
880
00:36:21,150 --> 00:36:24,300
ooh, an integer is indeed correct.
881
00:36:24,300 --> 00:36:25,560
Does anyone recall why?
882
00:36:25,560 --> 00:36:29,170
Why is it an int and not just a simple true/false?
883
00:36:29,170 --> 00:36:30,660
Why are three values helpful?
884
00:36:30,660 --> 00:36:33,450
885
00:36:33,450 --> 00:36:34,020
Exactly.
886
00:36:34,020 --> 00:36:38,250
It returns 0, if they're equal, or returns negative value or positive
887
00:36:38,250 --> 00:36:41,370
value based on whether one string comes before or after
888
00:36:41,370 --> 00:36:45,000
the other ASCII-betically, so to speak, based on its ASCII code.
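As a quick sketch of the behavior just described: strcmp returns an int whose sign encodes the ordering (the C standard guarantees only the sign, not the magnitude). The helper name `order` here is just for illustration:

```c
#include <string.h>

// Classify strcmp's int result into -1, 0, or +1: zero when the
// strings are equal, negative when the first sorts before the second
// "ASCII-betically", positive when it sorts after.
int order(const char *a, const char *b)
{
    int result = strcmp(a, b);
    if (result < 0)
    {
        return -1;
    }
    if (result > 0)
    {
        return 1;
    }
    return 0;
}
```

Three possible outcomes is exactly why an int, not a bool, is the return type.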
889
00:36:45,000 --> 00:36:46,590
The results, Carter.
890
00:36:46,590 --> 00:36:52,200
All right, 4383 still doing quite well, but being caught up with here.
891
00:36:52,200 --> 00:36:59,160
What does David Malan's phone number 949-468-2750 play when you call it?
892
00:36:59,160 --> 00:37:02,010
The Harvard alma mater, a parody of a Yale song,
893
00:37:02,010 --> 00:37:06,390
a recording of David Malan singing "Never Gonna Give You Up."
894
00:37:06,390 --> 00:37:08,190
Feel free to call or text.
895
00:37:08,190 --> 00:37:11,175
I can't get it now, but we have nicely automated that process.
896
00:37:11,175 --> 00:37:14,170
897
00:37:14,170 --> 00:37:15,040
Four seconds.
898
00:37:15,040 --> 00:37:17,340
Four hundred responses in, and the answer, of course,
899
00:37:17,340 --> 00:37:20,050
is "Never Gonna Give You Up."
900
00:37:20,050 --> 00:37:21,880
Thanks to a little programming and a script
901
00:37:21,880 --> 00:37:24,400
that our friend Rongxin wrote that essentially answers
902
00:37:24,400 --> 00:37:28,450
the phone automatically and replies with a URL or a song.
903
00:37:28,450 --> 00:37:30,880
Carter?
904
00:37:30,880 --> 00:37:34,470
Oh, dethroned.
905
00:37:34,470 --> 00:37:35,320
Dethroned.
906
00:37:35,320 --> 00:37:37,360
Two six eight eight, nicely done.
907
00:37:37,360 --> 00:37:39,970
Next question.
908
00:37:39,970 --> 00:37:42,370
From which of the following places does malloc
909
00:37:42,370 --> 00:37:50,080
get free memory for a program to use, heap, stack, array, or pointer?
910
00:37:50,080 --> 00:37:52,330
From which of the following places does malloc
911
00:37:52,330 --> 00:37:54,685
get free memory for a program to use?
912
00:37:54,685 --> 00:37:58,650
913
00:37:58,650 --> 00:38:00,540
Answers are a little slower this time.
914
00:38:00,540 --> 00:38:01,260
Five seconds.
915
00:38:01,260 --> 00:38:04,890
916
00:38:04,890 --> 00:38:07,880
And the answer is in--
917
00:38:07,880 --> 00:38:09,098
[GRUMBLINGS]
918
00:38:09,098 --> 00:38:11,852
919
00:38:11,852 --> 00:38:13,230
[LAUGHTER]
920
00:38:13,230 --> 00:38:13,770
OK.
921
00:38:13,770 --> 00:38:17,110
That's the answer we were given in the problem sets.
922
00:38:17,110 --> 00:38:20,460
But I think we would beg to differ.
923
00:38:20,460 --> 00:38:22,205
Pretty sure, Carter, would you go with--
924
00:38:22,205 --> 00:38:23,580
CARTER: I would go with the heap.
925
00:38:23,580 --> 00:38:25,247
DAVID J. MALAN: I think it's indeed the heap.
926
00:38:25,247 --> 00:38:27,030
So this answer, not correct.
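A small sketch of the point being corrected here: malloc hands back memory from the heap, which is why the allocation can outlive the function that made it (a stack-local variable could not). The function name is just for illustration:

```c
#include <stdlib.h>

// malloc gets free memory from the heap, not the stack, so the
// pointer returned here remains valid after this function returns.
// The caller owns the block and must eventually free it.
int *make_counter(void)
{
    int *p = malloc(sizeof(int));
    if (p != NULL)
    {
        *p = 0;
    }
    return p; // returning the address of a stack local instead would be a bug
}
```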
927
00:38:27,030 --> 00:38:28,380
[BOOING]
928
00:38:28,380 --> 00:38:29,220
I know.
929
00:38:29,220 --> 00:38:29,760
I know.
930
00:38:29,760 --> 00:38:31,800
We just transcribed what you gave us, though.
931
00:38:31,800 --> 00:38:34,050
Let's see how that affects the scores.
932
00:38:34,050 --> 00:38:35,280
AUDIENCE: No.
933
00:38:35,280 --> 00:38:38,010
DAVID J. MALAN: OK, 2688 is still doing OK.
934
00:38:38,010 --> 00:38:39,120
Next question.
935
00:38:39,120 --> 00:38:40,690
About 10 or so to go.
936
00:38:40,690 --> 00:38:44,490
Suppose I have an unsorted list of items, store receipts perhaps,
937
00:38:44,490 --> 00:38:47,730
should I sort the items before searching for an element?
938
00:38:47,730 --> 00:38:49,920
Yes, you should always sort before searching.
939
00:38:49,920 --> 00:38:52,740
No, you should never sort before searching.
940
00:38:52,740 --> 00:38:56,235
If you will be searching the list many times, then yes, you should sort first.
941
00:38:56,235 --> 00:38:58,110
If you will be searching the list many times,
942
00:38:58,110 --> 00:39:01,140
then no, you should not sort first.
943
00:39:01,140 --> 00:39:02,880
Some nuanced replies.
944
00:39:02,880 --> 00:39:05,760
Five seconds.
945
00:39:05,760 --> 00:39:07,900
Fewer answers than usual at this point.
946
00:39:07,900 --> 00:39:11,762
And, if you will be searching the list many times, then yes,
947
00:39:11,762 --> 00:39:12,720
you should sort first.
948
00:39:12,720 --> 00:39:14,790
An example that we discussed of trade offs.
949
00:39:14,790 --> 00:39:16,915
Because if you're just going to do a one off search
950
00:39:16,915 --> 00:39:19,800
and never again, why bother incurring n log n or n
951
00:39:19,800 --> 00:39:22,515
squared time to actually sort the thing.
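The trade-off described here can be sketched with the standard library's qsort and bsearch: pay the O(n log n) sort once, then every subsequent search is O(log n) instead of linear. The helper names are just for illustration:

```c
#include <stdlib.h>

// Comparator for qsort/bsearch over ints.
static int compare_ints(const void *a, const void *b)
{
    int x = *(const int *) a, y = *(const int *) b;
    return (x > y) - (x < y);
}

// Incur the sort cost once up front...
void sort_once(int *items, size_t n)
{
    qsort(items, n, sizeof(int), compare_ints);
}

// ...so that each later lookup can use binary search, which
// only works because the array is now sorted.
int contains(const int *items, size_t n, int key)
{
    return bsearch(&key, items, n, sizeof(int), compare_ints) != NULL;
}
```

For a one-off search, a single linear scan is cheaper than sorting first.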
952
00:39:22,515 --> 00:39:25,410
953
00:39:25,410 --> 00:39:26,100
All right.
954
00:39:26,100 --> 00:39:27,420
Some shuffling happening.
955
00:39:27,420 --> 00:39:29,040
But, 2688, nicely done.
956
00:39:29,040 --> 00:39:30,360
Next question.
957
00:39:30,360 --> 00:39:34,440
When you run the CREATE INDEX command in SQL, what type of data structure
958
00:39:34,440 --> 00:39:41,270
do you create, array, B-trees, linked lists, hash tables?
959
00:39:41,270 --> 00:39:44,960
When you run CREATE INDEX, recall we did this with the movie titles,
960
00:39:44,960 --> 00:39:48,110
the TV show titles, to speed things up, so that things
961
00:39:48,110 --> 00:39:50,420
wouldn't be super long and linear.
962
00:39:50,420 --> 00:39:53,580
We did a different data structure.
963
00:39:53,580 --> 00:39:54,080
All right.
964
00:39:54,080 --> 00:39:55,550
About 400 responses in.
965
00:39:55,550 --> 00:39:59,570
The answer is indeed B-trees.
966
00:39:59,570 --> 00:40:00,690
B-trees.
967
00:40:00,690 --> 00:40:02,390
Not to be confused with binary tree.
968
00:40:02,390 --> 00:40:06,530
A B-tree typically has more children than just two
969
00:40:06,530 --> 00:40:10,310
that pulls the data even higher up from the leaves of the tree.
970
00:40:10,310 --> 00:40:12,620
Could use a hash table, could use a linked list,
971
00:40:12,620 --> 00:40:15,920
but indeed the technology in databases is generally these things
972
00:40:15,920 --> 00:40:18,560
called B-trees, certainly in SQLite.
973
00:40:18,560 --> 00:40:20,810
Carter.
974
00:40:20,810 --> 00:40:25,370
Oh, dethroned, but 4179 has now pulled ahead.
975
00:40:25,370 --> 00:40:26,040
Nicely done.
976
00:40:26,040 --> 00:40:27,350
Next question.
977
00:40:27,350 --> 00:40:36,980
What HTTP status code means I'm a teapot, 000, 418, 007, 128?
978
00:40:36,980 --> 00:40:44,120
This, recall, was an April Fool's joke by technical people some years ago that
979
00:40:44,120 --> 00:40:46,640
has become part of computing lore.
980
00:40:46,640 --> 00:40:48,980
It's still there, though, in the document.
981
00:40:48,980 --> 00:40:53,760
In two seconds we'll know that it's 418, indeed.
982
00:40:53,760 --> 00:40:56,000
Let's see how that affected things.
983
00:40:56,000 --> 00:41:02,300
Four one seven nine is way down on the list, 7280 is number one now.
984
00:41:02,300 --> 00:41:03,200
Nicely done.
985
00:41:03,200 --> 00:41:06,620
What is an example of a SQL injection attack,
986
00:41:06,620 --> 00:41:10,220
when someone submits malicious SQL commands via web form,
987
00:41:10,220 --> 00:41:13,250
physically destroying the computer hardware that stores a SQL
988
00:41:13,250 --> 00:41:16,370
database, overwhelming a server with thousands of requests
989
00:41:16,370 --> 00:41:21,410
to access a database, injection attacks are only in movies or TV?
990
00:41:21,410 --> 00:41:22,670
Five seconds.
991
00:41:22,670 --> 00:41:26,110
Some fun answers.
992
00:41:26,110 --> 00:41:27,850
About four hundred responses in.
993
00:41:27,850 --> 00:41:31,990
And indeed, when someone submits malicious SQL commands via web
994
00:41:31,990 --> 00:41:36,250
form because the programmer is not escaping the code using the question
995
00:41:36,250 --> 00:41:40,090
mark syntax that we've seen using CS50's library or other third party
996
00:41:40,090 --> 00:41:41,500
libraries like it.
997
00:41:41,500 --> 00:41:43,450
Carter.
998
00:41:43,450 --> 00:41:46,983
Seven two eight oh is still the guest to beat.
999
00:41:46,983 --> 00:41:47,650
Nearing the end.
1000
00:41:47,650 --> 00:41:48,730
A few more questions.
1001
00:41:48,730 --> 00:41:52,780
How are the elements of an array stored in memory, contiguously,
1002
00:41:52,780 --> 00:41:55,180
in random locations that happen to be available,
1003
00:41:55,180 --> 00:41:58,840
as a linked list, as a binary tree?
1004
00:41:58,840 --> 00:42:04,720
How are the elements of an array stored in memory?
1005
00:42:04,720 --> 00:42:06,730
About five seconds to go.
1006
00:42:06,730 --> 00:42:09,430
Almost have everyone in.
1007
00:42:09,430 --> 00:42:14,800
Two, one, and contiguously is indeed the right answer.
1008
00:42:14,800 --> 00:42:18,580
Back to back to back. In random locations that happen to be available
1009
00:42:18,580 --> 00:42:21,940
is probably describing your use of malloc in the heap,
1010
00:42:21,940 --> 00:42:24,550
but you would then need a linked list or some other structure
1011
00:42:24,550 --> 00:42:26,590
to stitch those locations together.
1012
00:42:26,590 --> 00:42:29,440
An array, by definition, is contiguous.
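That contiguity is observable with pointer arithmetic: element i+1 sits exactly sizeof(int) bytes past element i. A minimal sketch (the helper name is just for illustration):

```c
#include <stddef.h>

// Because array elements are stored back to back in memory,
// the byte distance between elements i and j is always
// (j - i) * sizeof(element); pointer arithmetic relies on this.
ptrdiff_t gap_between(const int *arr, size_t i, size_t j)
{
    return (const char *) &arr[j] - (const char *) &arr[i];
}
```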
1013
00:42:29,440 --> 00:42:31,140
Carter.
1014
00:42:31,140 --> 00:42:37,260
Seven two eight oh is hanging on to that lead by about 499 points.
1015
00:42:37,260 --> 00:42:40,620
Next up is which SQL query would allow you
1016
00:42:40,620 --> 00:42:46,530
to select the ID of a specific movie star Zendaya in a table of movie stars,
1017
00:42:46,530 --> 00:42:50,880
select ID where name equals Zendaya, select star ID from movie stars
1018
00:42:50,880 --> 00:42:53,640
where name equals Zendaya, select ID from movie stars
1019
00:42:53,640 --> 00:42:56,460
where name equals Zendaya, select ID from movie stars
1020
00:42:56,460 --> 00:42:59,100
where name equals quote unquote Zendaya?
1021
00:42:59,100 --> 00:43:01,350
And I'm spoiling it.
1022
00:43:01,350 --> 00:43:04,020
I should have read out some quotes earlier too.
1023
00:43:04,020 --> 00:43:05,970
One second.
1024
00:43:05,970 --> 00:43:08,100
The last one is correct.
1025
00:43:08,100 --> 00:43:11,220
And indeed, this one's almost correct but lacks the single quotes.
1026
00:43:11,220 --> 00:43:12,660
Zendaya is not a single--
1027
00:43:12,660 --> 00:43:14,820
it's not a SQL key word.
1028
00:43:14,820 --> 00:43:16,090
It's of course a string.
1029
00:43:16,090 --> 00:43:17,955
So it does need to be escaped there.
1030
00:43:17,955 --> 00:43:20,220
But 63% of you realized that.
1031
00:43:20,220 --> 00:43:22,690
Seventy-two eighty is still in the lead.
1032
00:43:22,690 --> 00:43:25,170
I think we have a few more questions to go.
1033
00:43:25,170 --> 00:43:29,070
Why is a hash table faster to search than a linked list,
1034
00:43:29,070 --> 00:43:33,390
even though the runtime for both is big O of n?
1035
00:43:33,390 --> 00:43:37,350
The hash table actually has big O of n squared runtime,
1036
00:43:37,350 --> 00:43:41,442
the hash table optimally has omega of 1 runtime,
1037
00:43:41,442 --> 00:43:43,650
the hash table creates shorter length lists to search
1038
00:43:43,650 --> 00:43:48,870
rather than one long linked list, the hash table takes less memory.
1039
00:43:48,870 --> 00:43:54,000
And this was an example of practical versus theoretical differences.
1040
00:43:54,000 --> 00:43:58,890
And indeed-- that was interesting --with 83% of you buzzing in,
1041
00:43:58,890 --> 00:44:01,540
the hash table creates shorter linked lists, ideally,
1042
00:44:01,540 --> 00:44:05,730
if you have a good hash function rather than one long linked list.
1043
00:44:05,730 --> 00:44:08,400
Even though technically it's still in big O of n.
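The "shorter lists" idea can be sketched with a toy hash function like the ones from the speller-style examples: bucket words by first letter, so 26 short lists replace one long linked list. The function name and scheme here are illustrative, not CS50's distribution code:

```c
#include <ctype.h>

// A toy hash function: bucket words by first letter, yielding 26
// shorter lists instead of one long linked list. Searching is still
// O(n) in theory, but each bucket holds roughly n / 26 items in
// practice, assuming a reasonably even spread of first letters.
unsigned int hash(const char *word)
{
    return (unsigned int) (toupper((unsigned char) word[0]) - 'A') % 26;
}
```

A better hash function spreads keys more evenly, shortening the worst bucket further.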
1044
00:44:08,400 --> 00:44:12,510
Seventy-two eighty seemed to know that and is pulling ahead of the crowd.
1045
00:44:12,510 --> 00:44:14,790
Still a few questions.
1046
00:44:14,790 --> 00:44:20,100
Game of Thrones is a dot, dot, dot, comedy, drama, historical fantasy,
1047
00:44:20,100 --> 00:44:24,660
documentary, romance, sci-fi, or all of the above?
1048
00:44:24,660 --> 00:44:31,110
This was written by your classmates, recall, based on our SQL week.
1049
00:44:31,110 --> 00:44:37,440
In five seconds we'll be reminded that, according to our CSV file,
1050
00:44:37,440 --> 00:44:39,990
they were all of the above.
1051
00:44:39,990 --> 00:44:40,530
OK.
1052
00:44:40,530 --> 00:44:43,450
All of the above.
1053
00:44:43,450 --> 00:44:46,290
All right, 7280 did OK with that.
1054
00:44:46,290 --> 00:44:47,700
Next question.
1055
00:44:47,700 --> 00:44:49,800
Which of the following is a golden rule when
1056
00:44:49,800 --> 00:44:54,180
allocating memory, every block of memory that you malloc must be freed,
1057
00:44:54,180 --> 00:44:58,140
only memory that you malloc should be freed, do not free a block of memory
1058
00:44:58,140 --> 00:45:00,030
more than once, all of the above?
1059
00:45:00,030 --> 00:45:03,330
1060
00:45:03,330 --> 00:45:07,130
More into the nuances of C, this golden rule when allocating memory--
1061
00:45:07,130 --> 00:45:08,880
didn't have to worry about this in Python.
1062
00:45:08,880 --> 00:45:13,680
We did in C. In two seconds we'll know that all of the above
1063
00:45:13,680 --> 00:45:16,080
are indeed things you must do.
1064
00:45:16,080 --> 00:45:18,300
Not doing those would be in fact bugs.
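The three golden rules can be captured in one small helper (an illustrative pattern, not CS50's API): free everything you malloc, free only what you malloc'd, and never free the same block twice. Nulling the pointer after freeing makes an accidental second call harmless, since free(NULL) is a no-op:

```c
#include <stdlib.h>

// Free a malloc'd block through a pointer-to-pointer, then null it
// out so a second call on the same pointer safely does nothing,
// guarding against double-free bugs.
void safe_free(void **pp)
{
    if (pp != NULL && *pp != NULL)
    {
        free(*pp);
        *pp = NULL;
    }
}
```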
1065
00:45:18,300 --> 00:45:20,520
Carter, the leaderboard.
1066
00:45:20,520 --> 00:45:24,300
Still doing well 7280, whoever you are.
1067
00:45:24,300 --> 00:45:25,890
Last few questions.
1068
00:45:25,890 --> 00:45:28,110
Last question, in fact.
1069
00:45:28,110 --> 00:45:29,460
Last question.
1070
00:45:29,460 --> 00:45:34,650
What do the binary bulbs on stage spell today?
1071
00:45:34,650 --> 00:45:40,680
The answers could be, face with medical mask, face with tears of joy,
1072
00:45:40,680 --> 00:45:44,212
snowman without snow, or red heart?
1073
00:45:44,212 --> 00:45:45,540
AUDIENCE: [INAUDIBLE]
1074
00:45:45,540 --> 00:45:48,790
DAVID J. MALAN: What do the binary bulbs on stage spell?
1075
00:45:48,790 --> 00:45:52,982
Six, five, four, three--
1076
00:45:52,982 --> 00:45:56,170
AUDIENCE: Two, one.
1077
00:45:56,170 --> 00:45:59,650
DAVID J. MALAN: The answer is the red heart.
1078
00:45:59,650 --> 00:46:02,560
Taking a look at the leaderboard here, who's our winner?
1079
00:46:02,560 --> 00:46:05,660
1080
00:46:05,660 --> 00:46:08,420
The winner is-- oh!
1081
00:46:08,420 --> 00:46:10,250
Guest 3487.
1082
00:46:10,250 --> 00:46:12,290
A big round of applause for our guest.
1083
00:46:12,290 --> 00:46:13,175
Thank you to Carter.
1084
00:46:13,175 --> 00:46:14,951
[APPLAUSE]
1085
00:46:14,951 --> 00:46:17,940
1086
00:46:17,940 --> 00:46:24,030
So it's nice that there was some opportunity here, because recall that in week zero,
1087
00:46:24,030 --> 00:46:27,690
we did start talking about emoji, and really about data, and representation,
1088
00:46:27,690 --> 00:46:30,922
and we talked not just about binary but ASCII and then Unicode.
1089
00:46:30,922 --> 00:46:33,630
And then when we had Unicode, we had all of these additional bits
1090
00:46:33,630 --> 00:46:34,630
that we could play with.
1091
00:46:34,630 --> 00:46:36,900
And we could start to represent not just letters
1092
00:46:36,900 --> 00:46:39,270
of the English alphabet, as in ASCII, but really
1093
00:46:39,270 --> 00:46:42,150
letters of any human alphabet and even alphabets
1094
00:46:42,150 --> 00:46:43,565
that are continuing to develop.
1095
00:46:43,565 --> 00:46:45,690
And indeed, this was face with medical mask, which
1096
00:46:45,690 --> 00:46:50,130
we claimed at the time was just how a Mac or PC or Android phone or iPhone
1097
00:46:50,130 --> 00:46:54,240
nowadays would interpret and display a pattern of bits like this.
1098
00:46:54,240 --> 00:46:59,310
This happens to be the four bytes that represent that particular emoji.
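As a sketch of that four-byte claim: U+1F637, face with medical mask, encodes to exactly four bytes in UTF-8 (0xF0 0x9F 0x98 0xB7). This assumes a UTF-8 execution character set, which is the default on modern compilers:

```c
#include <string.h>

// The emoji U+1F637 (face with medical mask) is one Unicode code
// point, but in UTF-8 it occupies four bytes: 0xF0 0x9F 0x98 0xB7.
// strlen counts bytes, not characters, so it reports 4 here.
size_t emoji_byte_count(void)
{
    const char *mask = "\U0001F637";
    return strlen(mask);
}
```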
1099
00:46:59,310 --> 00:47:01,590
And over time, humans have been deciding to use
1100
00:47:01,590 --> 00:47:06,720
different patterns for new emojis that might not have existed yesterday.
1101
00:47:06,720 --> 00:47:09,270
And indeed, most any time you update your Mac, or your PC,
1102
00:47:09,270 --> 00:47:12,660
or your phone these days, at least on a semi-annual basis,
1103
00:47:12,660 --> 00:47:15,030
you're getting some new and improved emojis.
1104
00:47:15,030 --> 00:47:17,185
And they're not just these faces now.
1105
00:47:17,185 --> 00:47:19,560
They're of course, representing different human emotions,
1106
00:47:19,560 --> 00:47:21,390
different physical objects, and ultimately
1107
00:47:21,390 --> 00:47:23,730
among the Unicode consortium's goals, is to be
1108
00:47:23,730 --> 00:47:26,160
able to represent all human languages.
1109
00:47:26,160 --> 00:47:29,670
But were it not for certain groups of people and certain individuals,
1110
00:47:29,670 --> 00:47:32,302
these things would all rather look fairly similar.
1111
00:47:32,302 --> 00:47:34,260
And indeed today, we're so pleased to be joined
1112
00:47:34,260 --> 00:47:36,240
by an old classmate of mine Jennifer 8.
1113
00:47:36,240 --> 00:47:39,240
Lee, who was class of '99 here at the college, who's
1114
00:47:39,240 --> 00:47:43,140
gone off to do many, many different things in life, prolifically so.
1115
00:47:43,140 --> 00:47:45,138
Not only has she been a writer, an author,
1116
00:47:45,138 --> 00:47:48,180
a journalist for the New York Times, a producer of films like The Harvard
1117
00:47:48,180 --> 00:47:51,060
Computers, The Search for General Tso, and The Emoji
1118
00:47:51,060 --> 00:47:53,940
Story, which focuses on exactly today's topic.
1119
00:47:53,940 --> 00:47:56,940
Jenny and her colleagues have been involved particularly
1120
00:47:56,940 --> 00:48:01,620
with championing representation of different types of people,
1121
00:48:01,620 --> 00:48:02,820
and cultures, and languages.
1122
00:48:02,820 --> 00:48:06,120
And these are just a few of the emojis that our friend Jenny has indeed
1123
00:48:06,120 --> 00:48:09,330
brought into creation on our phones and laptops.
1124
00:48:09,330 --> 00:48:12,120
Jenny too is the original inspiration for what
1125
00:48:12,120 --> 00:48:14,288
has become, it seems, my Twitter recommendations
1126
00:48:14,288 --> 00:48:15,330
and all of these puppets.
1127
00:48:15,330 --> 00:48:17,970
I was visiting her in Manhattan one time some years ago,
1128
00:48:17,970 --> 00:48:21,550
she had on herself a couple of puppets known as Muppet whatnots.
1129
00:48:21,550 --> 00:48:24,300
At the time you could go to FAO Schwarz, or the website thereof,
1130
00:48:24,300 --> 00:48:26,160
an old toy store, and you could actually
1131
00:48:26,160 --> 00:48:27,510
configure your very own Muppets.
1132
00:48:27,510 --> 00:48:29,218
And I thought this was the coolest thing.
1133
00:48:29,218 --> 00:48:31,800
And literally on the cab ride home from her place
1134
00:48:31,800 --> 00:48:34,477
was I logging into the website configuring a couple of puppets.
1135
00:48:34,477 --> 00:48:36,060
A couple of weeks later, they arrived.
1136
00:48:36,060 --> 00:48:38,315
And then they rather sat on my shelf for a couple of years,
1137
00:48:38,315 --> 00:48:41,190
as I wondered why I had just bought two Muppets in the back of a cab.
1138
00:48:41,190 --> 00:48:43,530
But brought them into the office at one point.
1139
00:48:43,530 --> 00:48:45,600
A colleague saw them, drew inspiration from them,
1140
00:48:45,600 --> 00:48:48,720
and now have they been woven really into the fabric of this course,
1141
00:48:48,720 --> 00:48:51,570
in particular, and a lot of the course's pedagogy, at least
1142
00:48:51,570 --> 00:48:55,030
incarnated here just for fun but also in video form as well.
1143
00:48:55,030 --> 00:48:57,780
Which is only to say, so glad that our friend Jenny 8.
1144
00:48:57,780 --> 00:49:00,910
Lee is here for us today to talk about these emoji.
1145
00:49:00,910 --> 00:49:01,410
Jenny.
1146
00:49:01,410 --> 00:49:01,530
JENNIFER 8.
1147
00:49:01,530 --> 00:49:02,030
LEE: Hi.
1148
00:49:02,030 --> 00:49:03,618
[APPLAUSE]
1149
00:49:03,618 --> 00:49:07,780
1150
00:49:07,780 --> 00:49:09,370
All right.
1151
00:49:09,370 --> 00:49:10,480
This is very exciting.
1152
00:49:10,480 --> 00:49:14,530
I took CS50 in 1994.
1153
00:49:14,530 --> 00:49:16,270
To give you a sense, one of my blockmates
1154
00:49:16,270 --> 00:49:21,310
was the first intern for Netscape, if you guys have ever heard of Netscape.
1155
00:49:21,310 --> 00:49:25,262
And I graduated just as Google was--
1156
00:49:25,262 --> 00:49:27,220
we did not have Google when we were undergrads.
1157
00:49:27,220 --> 00:49:32,410
So, it's an honor, obviously, to be at CS50.
1158
00:49:32,410 --> 00:49:34,900
It's also very impressive to see how David has turned it
1159
00:49:34,900 --> 00:49:39,580
from entry level computer science course into a lifestyle
1160
00:49:39,580 --> 00:49:41,560
brand that is world renowned.
1161
00:49:41,560 --> 00:49:42,880
So, it's an honor.
1162
00:49:42,880 --> 00:49:48,460
And I'm going to talk to you today about how an emoji becomes an emoji.
1163
00:49:48,460 --> 00:49:51,070
So first I'm going to talk about my journey
1164
00:49:51,070 --> 00:49:54,730
down the rabbit hole of how I got involved with emoji.
1165
00:49:54,730 --> 00:49:59,050
So, this is my friend Yiying Lu, she is a designer
1166
00:49:59,050 --> 00:50:01,540
famous for designing the Twitter Fail Whale, which
1167
00:50:01,540 --> 00:50:06,680
was this kind of image that popped up when Twitter went down,
1168
00:50:06,680 --> 00:50:08,890
which back in the day was rather often.
1169
00:50:08,890 --> 00:50:12,940
So, she's Chinese-Australian-American, which
1170
00:50:12,940 --> 00:50:15,550
is a weird interesting combination.
1171
00:50:15,550 --> 00:50:19,240
So one day we were texting about dumplings, because that
1172
00:50:19,240 --> 00:50:20,650
is what Chinese-ish women do.
1173
00:50:20,650 --> 00:50:21,820
We text about food.
1174
00:50:21,820 --> 00:50:24,820
And so I sent her this picture of dumplings.
1175
00:50:24,820 --> 00:50:29,260
And then she said, yum, yum, yum, yum, yum knife and fork, knife
1176
00:50:29,260 --> 00:50:30,310
and fork, knife and fork.
1177
00:50:30,310 --> 00:50:33,160
And then she was like, oh, I'm surprised that Apple
1178
00:50:33,160 --> 00:50:34,780
doesn't have a dumpling emoji.
1179
00:50:34,780 --> 00:50:37,000
And I'm like, oh, yeah, that's kind of weird.
1180
00:50:37,000 --> 00:50:42,040
And it's one of those things where the thought comes to your head
1181
00:50:42,040 --> 00:50:43,960
and then it leaves.
1182
00:50:43,960 --> 00:50:48,760
It was just an observation, but then half an hour later onto my phone
1183
00:50:48,760 --> 00:50:52,360
pops up this dumpling emoji with hearts.
1184
00:50:52,360 --> 00:50:55,810
Actually, you can't see it here, but it actually had blinking eyes.
1185
00:50:55,810 --> 00:50:58,480
So she called it bling bling dumpling.
1186
00:50:58,480 --> 00:50:59,380
She's a designer.
1187
00:50:59,380 --> 00:51:05,740
So she decided she was going to fix this lack of dumpling emoji problem.
1188
00:51:05,740 --> 00:51:09,310
And I was actually really puzzled.
1189
00:51:09,310 --> 00:51:12,190
How could there be no dumpling emoji, right?
1190
00:51:12,190 --> 00:51:15,730
Because I knew that emoji are originally Japanese.
1191
00:51:15,730 --> 00:51:17,800
This by the way was back in 2015.
1192
00:51:17,800 --> 00:51:23,620
So, Japanese food's super well represented on the emoji keyboard.
1193
00:51:23,620 --> 00:51:28,150
You have ramen, you have tempura, you have curry, you have--
1194
00:51:28,150 --> 00:51:31,390
actually, wait, bento box, curry, then tempura.
1195
00:51:31,390 --> 00:51:36,070
You even have slightly weird foods like--
1196
00:51:36,070 --> 00:51:36,640
let's see.
1197
00:51:36,640 --> 00:51:41,360
You had these things on a stick, which are fish cakes, I discovered.
1198
00:51:41,360 --> 00:51:45,190
Then you have this white and pink swirly thing, which is also a fish cake.
1199
00:51:45,190 --> 00:51:49,450
You even have this triangle thing that looks like it's had a bikini wax.
1200
00:51:49,450 --> 00:51:54,760
But, in essence, there were all these foods that were on the keyboard,
1201
00:51:54,760 --> 00:51:57,100
but there was no dumpling, right?
1202
00:51:57,100 --> 00:52:00,640
And I was like, dumplings are this universal food.
1203
00:52:00,640 --> 00:52:03,250
Every culture has some version of a dumpling,
1204
00:52:03,250 --> 00:52:07,300
whether or not it's empanadas, or ravioli, or--
1205
00:52:07,300 --> 00:52:08,440
God, what else?
1206
00:52:08,440 --> 00:52:15,070
--ravioli, pierogi, momos-- the whole idea is all cultures have basically
1207
00:52:15,070 --> 00:52:16,270
found the idea--
1208
00:52:16,270 --> 00:52:21,760
this concept of yummy goodness within a carbohydrate shell, whether or not it's
1209
00:52:21,760 --> 00:52:24,490
baked, or steamed, or fried.
1210
00:52:24,490 --> 00:52:27,100
So dumplings are universal.
1211
00:52:27,100 --> 00:52:29,890
Emoji, I didn't use them that much.
1212
00:52:29,890 --> 00:52:32,270
But I was like, they're all so kind of universal.
1213
00:52:32,270 --> 00:52:37,270
So the fact there was no dumpling emoji told me whatever system was in place
1214
00:52:37,270 --> 00:52:38,050
failed.
1215
00:52:38,050 --> 00:52:39,370
And I actually had no idea.
1216
00:52:39,370 --> 00:52:40,690
I was like, who controls emoji?
1217
00:52:40,690 --> 00:52:42,640
I'm going to go fix this problem.
1218
00:52:42,640 --> 00:52:46,390
There is something wrong with the universe if there's no dumpling emoji.
1219
00:52:46,390 --> 00:52:49,280
And I took it upon myself to go fix that.
1220
00:52:49,280 --> 00:52:54,306
So I Googled, and I basically discovered there was this thing called the Unicode
1221
00:52:54,306 --> 00:53:00,820
Consortium, which is a non-profit based in Mountain View, California
1222
00:53:00,820 --> 00:53:04,160
that when I looked had these like 12 full voting members.
1223
00:53:04,160 --> 00:53:06,250
So this is late 2015.
1224
00:53:06,250 --> 00:53:12,200
Of those 12, 9 were multinational US tech companies.
1225
00:53:12,200 --> 00:53:17,440
So there was Oracle, IBM, Microsoft, Adobe, Google, Apple,
1226
00:53:17,440 --> 00:53:19,090
Facebook, and Yahoo.
1227
00:53:19,090 --> 00:53:22,180
So these were eight, I think.
1228
00:53:22,180 --> 00:53:27,430
And then you had the German software company SAP,
1229
00:53:27,430 --> 00:53:30,970
the Chinese company called Huawei, and then the government of Oman.
1230
00:53:30,970 --> 00:53:34,270
So these were basically the people who were in charge
1231
00:53:34,270 --> 00:53:36,910
and had full voting power on Unicode.
1232
00:53:36,910 --> 00:53:42,490
So they paid $18,000 a year to have this full voting
1233
00:53:42,490 --> 00:53:43,990
power, which is a lot of money.
1234
00:53:43,990 --> 00:53:50,140
I was kind of very indignant on how this cabal of tech companies
1235
00:53:50,140 --> 00:53:58,130
basically controlled this global curated image based language on your keyboard.
1236
00:53:58,130 --> 00:54:01,720
So there was a little bit of a loophole, which
1237
00:54:01,720 --> 00:54:05,560
is you could pay $18,000 a year to have full voting power,
1238
00:54:05,560 --> 00:54:09,470
or you could pay $75 a year as an individual.
1239
00:54:09,470 --> 00:54:11,210
You had no voting power.
1240
00:54:11,210 --> 00:54:17,155
But, you had the ability to sign up for the email list
1241
00:54:17,155 --> 00:54:18,530
and also show up at the meetings.
1242
00:54:18,530 --> 00:54:22,580
So, I put in my credit card, got on the email list.
1243
00:54:22,580 --> 00:54:28,447
And was checking my email one day, when there
1244
00:54:28,447 --> 00:54:31,280
was an invite that said they were going to have a quarterly meeting.
1245
00:54:31,280 --> 00:54:34,190
And I think this is going to be October of 2015.
1246
00:54:34,190 --> 00:54:36,950
And I looked-- it was in Sunnyvale.
1247
00:54:36,950 --> 00:54:37,970
I looked in my calendar.
1248
00:54:37,970 --> 00:54:41,270
I looked, and it turned out that I was actually
1249
00:54:41,270 --> 00:54:44,330
going to be able to be in Silicon Valley at that time.
1250
00:54:44,330 --> 00:54:47,840
So I took a bus to Apple where they were having that meeting.
1251
00:54:47,840 --> 00:54:51,320
And I don't know completely what I thought I was going to see.
1252
00:54:51,320 --> 00:54:55,070
I think maybe it was going to be maybe like a Sanders Theater or a little mini
1253
00:54:55,070 --> 00:54:57,320
Congress, people making emoji decisions.
1254
00:54:57,320 --> 00:54:58,610
But that was not what it was.
1255
00:54:58,610 --> 00:55:03,110
Basically, this is the room where it happens, in 2015, where
1256
00:55:03,110 --> 00:55:04,820
the people who were deciding emoji.
1257
00:55:04,820 --> 00:55:07,250
These were emoji decision makers, which were not
1258
00:55:07,250 --> 00:55:10,922
the most demographically diverse group.
1259
00:55:10,922 --> 00:55:12,380
They had a sense of humor about it.
1260
00:55:12,380 --> 00:55:17,450
One guy had a shirt that said shadowy emoji overlord.
1261
00:55:17,450 --> 00:55:21,860
And so I decided, along with my friend Yiying Lu, to create a group
1262
00:55:21,860 --> 00:55:26,450
called Emoji Nation, whose motto is emoji by the people, for the people.
1263
00:55:26,450 --> 00:55:33,410
And it brought the voice of the normal world into the decision making chain.
1264
00:55:33,410 --> 00:55:39,530
So, we launched a little campaign about dumpling emojis.
1265
00:55:39,530 --> 00:55:42,560
We made a Kickstarter video.
1266
00:55:42,560 --> 00:55:43,562
Let's see.
1267
00:55:43,562 --> 00:55:46,520
SPEAKER 2: Dumplings are one of the most universal cross-cultural foods
1268
00:55:46,520 --> 00:55:47,570
in the world.
1269
00:55:47,570 --> 00:55:53,150
Georgia has khinkali, Japan has gyoza, Korea has mandu, Italy has ravioli,
1270
00:55:53,150 --> 00:55:57,200
Poland has pierogi, Russia has pelmeni, Argentina has empanadas,
1271
00:55:57,200 --> 00:56:00,110
Jewish people have kreplach, China has potstickers,
1272
00:56:00,110 --> 00:56:02,960
Nepal and Tibet have momos.
1273
00:56:02,960 --> 00:56:05,030
Yet somehow, despite their popularity, there
1274
00:56:05,030 --> 00:56:07,950
is no dumpling emoji in the standard set.
1275
00:56:07,950 --> 00:56:08,840
Why is that?
1276
00:56:08,840 --> 00:56:13,280
Emoji exist for pizza, tempura, sushi, spaghetti, hot dog, and now tacos,
1277
00:56:13,280 --> 00:56:15,650
which Taco Bell takes credit for.
1278
00:56:15,650 --> 00:56:17,570
We need to right this disparity.
1279
00:56:17,570 --> 00:56:18,800
Dumplings are global.
1280
00:56:18,800 --> 00:56:20,190
Emoji are global.
1281
00:56:20,190 --> 00:56:21,815
Isn't it time we brought them together?
1282
00:56:21,815 --> 00:56:24,380
1283
00:56:24,380 --> 00:56:27,380
Oh, yeah, and while we're at it, how about an emoji for Chinese takeout?
1284
00:56:27,380 --> 00:56:30,958
1285
00:56:30,958 --> 00:56:31,458
JENNIFER 8.
1286
00:56:31,458 --> 00:56:36,680
LEE: So this is Thanksgiving of 2015.
1287
00:56:36,680 --> 00:56:38,630
I wrote a dumpling emoji proposal.
1288
00:56:38,630 --> 00:56:41,870
This is it, different styles, like whether or not
1289
00:56:41,870 --> 00:56:44,270
it's a head-on view or slightly diagonal view.
1290
00:56:44,270 --> 00:56:47,660
1291
00:56:47,660 --> 00:56:53,600
That's Yiying with one of the then co-chairs of the Emoji Subcommittee.
1292
00:56:53,600 --> 00:56:57,710
And so along with dumpling, we also did takeout box, we got chopsticks,
1293
00:56:57,710 --> 00:57:02,360
and then fortune cookie, which actually, I have to be honest,
1294
00:57:02,360 --> 00:57:05,240
I don't think fortune cookie would have gotten in on its own merits
1295
00:57:05,240 --> 00:57:07,610
were it not on the coattails of the other three.
1296
00:57:07,610 --> 00:57:09,530
So we got these four through.
1297
00:57:09,530 --> 00:57:11,930
And that is how they look today.
1298
00:57:11,930 --> 00:57:15,540
And I have to say that dumpling looks really photorealistic in the Apple
1299
00:57:15,540 --> 00:57:16,040
world.
1300
00:57:16,040 --> 00:57:19,002
Unlike the fortune cookie which has no slit.
1301
00:57:19,002 --> 00:57:20,210
It looks like a dead Pac-Man.
1302
00:57:20,210 --> 00:57:25,340
I don't know what is going on with that design.
1303
00:57:25,340 --> 00:57:26,690
So very proud.
1304
00:57:26,690 --> 00:57:30,110
I also did a lot of research on Chinese food in America,
1305
00:57:30,110 --> 00:57:32,360
and wrote a book called The Fortune Cookie Chronicles,
1306
00:57:32,360 --> 00:57:34,735
produced a documentary called The Search for General Tso.
1307
00:57:34,735 --> 00:57:39,710
So I have a lot of moral authority on the issues of Asian food in America.
1308
00:57:39,710 --> 00:57:44,240
Not all things, but this one I felt like I had made a mark
1309
00:57:44,240 --> 00:57:47,720
on a 2,500 year history of emoji.
1310
00:57:47,720 --> 00:57:50,960
I'm sorry-- of dumplings, by moving them into emoji.
1311
00:57:50,960 --> 00:57:54,350
So it kind of gets into this very complicated thing.
1312
00:57:54,350 --> 00:57:56,060
How does an emoji become an emoji?
1313
00:57:56,060 --> 00:57:59,310
And it's actually fairly complex.
1314
00:57:59,310 --> 00:58:02,660
So let's say you have an idea for an emoji.
1315
00:58:02,660 --> 00:58:05,450
You write a proposal.
1316
00:58:05,450 --> 00:58:09,440
And then you submit it to the Emoji Subcommittee
1317
00:58:09,440 --> 00:58:12,020
that then debates and thinks about it.
1318
00:58:12,020 --> 00:58:14,610
Sometimes they have feedback and they kick it back to you.
1319
00:58:14,610 --> 00:58:17,830
And if so, then you have to revise it.
1320
00:58:17,830 --> 00:58:19,880
And it kind of goes around and around in a circle.
1321
00:58:19,880 --> 00:58:23,540
And once they're happy with it, they kick it
1322
00:58:23,540 --> 00:58:25,550
to the full Unicode technical committee, which
1323
00:58:25,550 --> 00:58:31,250
is a governing body within Unicode on things technical and encoding.
1324
00:58:31,250 --> 00:58:34,790
So what are the kinds of things that impact
1325
00:58:34,790 --> 00:58:36,590
whether an emoji can be an emoji?
1326
00:58:36,590 --> 00:58:40,010
So one, is there popular demand?
1327
00:58:40,010 --> 00:58:43,670
Is it frequently requested?
1328
00:58:43,670 --> 00:58:48,080
And at this point one of the very crude ways that we measure
1329
00:58:48,080 --> 00:58:50,480
is, if you search for it on Google, does it
1330
00:58:50,480 --> 00:58:58,370
have more than 500 million results, which is what elephant gets in English.
1331
00:58:58,370 --> 00:59:00,740
And that's a median--
1332
00:59:00,740 --> 00:59:04,460
elephant is kind of right in the middle of popular emoji and not popular emoji.
1333
00:59:04,460 --> 00:59:07,080
So, we used that as a benchmark.
1334
00:59:07,080 --> 00:59:10,260
There's a plus if there's multiple usages and meanings.
1335
00:59:10,260 --> 00:59:14,730
For example, sloth.
1336
00:59:14,730 --> 00:59:16,860
That was an emoji that we did.
1337
00:59:16,860 --> 00:59:20,250
It's both literally an emoji of an animal,
1338
00:59:20,250 --> 00:59:22,080
but it also has lots of connotations.
1339
00:59:22,080 --> 00:59:27,090
So if something has lots of multiple meanings, that gives it a bump.
1340
00:59:27,090 --> 00:59:29,640
Another thing is whether it's visually distinctive.
1341
00:59:29,640 --> 00:59:31,827
Does it work at little tiny emoji sizes?
1342
00:59:31,827 --> 00:59:34,410
And that's actually really hard, because there are some things
1343
00:59:34,410 --> 00:59:39,420
that I think could have been emoji but don't completely
1344
00:59:39,420 --> 00:59:41,670
work when you try to shrink it down.
1345
00:59:41,670 --> 00:59:43,930
And I'll give some examples of that later.
1346
00:59:43,930 --> 00:59:48,610
And then filling the gap or completeness is another factor.
1347
00:59:48,610 --> 00:59:52,350
So, for a long time we had red heart, yellow heart, green heart, blue heart,
1348
00:59:52,350 --> 00:59:53,220
purple heart.
1349
00:59:53,220 --> 00:59:54,630
There was no orange heart.
1350
00:59:54,630 --> 00:59:57,900
And so there was a gay designer from Adobe
1351
00:59:57,900 --> 01:00:00,310
who was actually very heartbroken by that.
1352
01:00:00,310 --> 01:00:04,260
So he had been substituting the pumpkin to get the orange for the rainbow.
1353
01:00:04,260 --> 01:00:05,820
And so he proposed an orange heart.
1354
01:00:05,820 --> 01:00:07,860
And that was obviously at that point, you're
1355
01:00:07,860 --> 01:00:10,560
like, yes, that will complete a set.
1356
01:00:10,560 --> 01:00:16,500
And another thing is, is it already something that one of the companies
1357
01:00:16,500 --> 01:00:21,660
has, and therefore everyone else is going to adopt it?
1358
01:00:21,660 --> 01:00:26,490
And so a good example for that is the binary--
1359
01:00:26,490 --> 01:00:30,750
I think it was the non-gender binary emoji, the pink, blue, and white flag.
1360
01:00:30,750 --> 01:00:36,360
So I have to say WhatsApp is by far one of the most rogue platforms.
1361
01:00:36,360 --> 01:00:38,923
So they just randomly added it one day.
1362
01:00:38,923 --> 01:00:39,840
And we just noticed it.
1363
01:00:39,840 --> 01:00:40,890
And we're like, oh, God.
1364
01:00:40,890 --> 01:00:42,140
Given that they have it, we have to do it.
1365
01:00:42,140 --> 01:00:45,180
Now we have the [INAUDIBLE].
1366
01:00:45,180 --> 01:00:49,920
So factors of exclusion or against inclusion, to be more PC.
1367
01:00:49,920 --> 01:00:53,010
Sometimes if it's too specific or narrow,
1368
01:00:53,010 --> 01:00:54,910
that works against being included.
1369
01:00:54,910 --> 01:00:59,610
So poutine, which the Canadians love, was kind of really specific.
1370
01:00:59,610 --> 01:01:01,830
And I know it's really important to the Canadians.
1371
01:01:01,830 --> 01:01:08,100
But it just didn't have enough global appeal. Another factor is if it's redundant.
1372
01:01:08,100 --> 01:01:10,710
So an example for that is a couple of years
1373
01:01:10,710 --> 01:01:14,040
ago Butterball proposed a roasted turkey emoji.
1374
01:01:14,040 --> 01:01:17,640
But we already had an unroasted live emoji of a turkey.
1375
01:01:17,640 --> 01:01:20,550
So it wasn't clear that we needed the cooked version
1376
01:01:20,550 --> 01:01:24,030
to go with the live version, so that didn't pass.
1377
01:01:24,030 --> 01:01:25,215
Not visually discernible.
1378
01:01:25,215 --> 01:01:29,820
So this one's actually really tricky and knocks out a lot of things.
1379
01:01:29,820 --> 01:01:32,220
So, it knocked out kimchi, for example.
1380
01:01:32,220 --> 01:01:34,980
Really hard to do kimchi on emoji sizes.
1381
01:01:34,980 --> 01:01:39,240
How are you going-- is it in the jar or is it just in a little bowl?
1382
01:01:39,240 --> 01:01:42,490
So kimchi kind of died on that.
1383
01:01:42,490 --> 01:01:46,740
Another one that was really hard was a cave emoji, actually.
1384
01:01:46,740 --> 01:01:48,540
Really hard at emoji sizes.
1385
01:01:48,540 --> 01:01:53,310
And then, this is interesting, no logos, brands, deities, or celebrities.
1386
01:01:53,310 --> 01:01:56,610
And this is a new policy we just introduced, which is no more flags.
1387
01:01:56,610 --> 01:02:00,332
Flags were killing us, for all kinds of complicated reasons.
1388
01:02:00,332 --> 01:02:02,415
And there is much regret that we ever added flags.
1389
01:02:02,415 --> 01:02:05,020
1390
01:02:05,020 --> 01:02:09,090
And lots of politics, so at this point, no more flags.
1391
01:02:09,090 --> 01:02:14,370
So once it gets passed into the full Unicode technical committee,
1392
01:02:14,370 --> 01:02:17,160
the proposal gets voted on once a year.
1393
01:02:17,160 --> 01:02:20,470
And then they pass all the emoji for the next year.
1394
01:02:20,470 --> 01:02:22,470
We just actually did that a couple of weeks ago.
1395
01:02:22,470 --> 01:02:26,850
And it takes a while because it gets sent to all the companies like Apple,
1396
01:02:26,850 --> 01:02:28,260
Google, Adobe, Facebook.
1397
01:02:28,260 --> 01:02:31,590
And then they add it to all your devices.
1398
01:02:31,590 --> 01:02:32,760
And then, ta-da.
1399
01:02:32,760 --> 01:02:37,170
It takes about 18 to 24 months, from when you first have your proposal
1400
01:02:37,170 --> 01:02:40,360
to when it lands onto your devices.
1401
01:02:40,360 --> 01:02:43,585
So Emojination has worked on a bunch of emoji.
1402
01:02:43,585 --> 01:02:45,460
And so we've kind of shepherded this through.
1403
01:02:45,460 --> 01:02:49,320
So one of the interesting questions is, why is it that Unicode controls emoji?
1404
01:02:49,320 --> 01:02:52,320
So a lot of it has to go--
1405
01:02:52,320 --> 01:02:53,880
has to do with the history of emoji.
1406
01:02:53,880 --> 01:02:56,670
They were originally popularized in Japan.
1407
01:02:56,670 --> 01:03:00,540
There was a very-- one of the initial sets is from 1999 from DoCoMo.
1408
01:03:00,540 --> 01:03:03,930
These were actually recently collected by the Museum
1409
01:03:03,930 --> 01:03:06,570
of Modern Art in New York City.
1410
01:03:06,570 --> 01:03:13,110
And so all the Japanese vendors had these little glyphs
1411
01:03:13,110 --> 01:03:15,780
that they added to their character set.
1412
01:03:15,780 --> 01:03:20,515
And the main problem is if you were DoCoMo, you had one set.
1413
01:03:20,515 --> 01:03:22,390
If you were in SoftBank, you had another set.
1414
01:03:22,390 --> 01:03:26,430
So no matter what you could only text the people who are on your platform,
1415
01:03:26,430 --> 01:03:28,470
not across platforms.
1416
01:03:28,470 --> 01:03:32,070
And that was a real big problem when Apple and Google started
1417
01:03:32,070 --> 01:03:35,430
introducing smartphones into Japan.
1418
01:03:35,430 --> 01:03:38,850
And there was this kind of understanding and expectation
1419
01:03:38,850 --> 01:03:42,450
that if you did something in your smartphone
1420
01:03:42,450 --> 01:03:48,540
you also want it to show up in email and be sent into the ether
1421
01:03:48,540 --> 01:03:51,678
and someone else is supposed to get the same image that you sent.
1422
01:03:51,678 --> 01:03:52,720
So that was not the case.
1423
01:03:52,720 --> 01:03:55,260
So in 2007, they went to Unicode and asked
1424
01:03:55,260 --> 01:03:59,400
them to basically unify the emoji set.
1425
01:03:59,400 --> 01:04:02,040
And Unicode is interesting, because its mission
1426
01:04:02,040 --> 01:04:05,850
is to enable everyone speaking every language on Earth
1427
01:04:05,850 --> 01:04:08,580
to be able to use the language on computers and smartphones.
1428
01:04:08,580 --> 01:04:10,890
And they actually see this as a human right.
1429
01:04:10,890 --> 01:04:14,280
Because at a certain point, if your language cannot be captured digitally,
1430
01:04:14,280 --> 01:04:15,580
it's going to disappear.
1431
01:04:15,580 --> 01:04:20,640
So they spent a lot of time doing Chinese, Arabic,
1432
01:04:20,640 --> 01:04:23,400
Cyrillic in the very early days.
1433
01:04:23,400 --> 01:04:27,570
In 2001, they actually had a proposal for Klingon, which they did not
1434
01:04:27,570 --> 01:04:29,560
actually accept at that point.
1435
01:04:29,560 --> 01:04:31,800
So they have three major projects.
1436
01:04:31,800 --> 01:04:33,600
They encode characters, including emoji.
1437
01:04:33,600 --> 01:04:36,150
That's actually what they're most famous for.
1438
01:04:36,150 --> 01:04:38,740
They also have a bunch of localization resources, called CLDR.
1439
01:04:38,740 --> 01:04:45,690
So that's like, in this country they use this as a currency,
1440
01:04:45,690 --> 01:04:50,160
and they use this kind of time format, and it's whether or not
1441
01:04:50,160 --> 01:04:52,800
it's month month, date date, year year year year.
1442
01:04:52,800 --> 01:04:56,290
In other countries it's date date, month month, year year year year,
1443
01:04:56,290 --> 01:04:57,460
and so on.
1444
01:04:57,460 --> 01:05:00,870
So they tell you what country cares about what.
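The kind of per-locale data described above can be sketched in Python. This is purely illustrative: the `LOCALE_DATA` table and `format_date` helper are made up for this example, not actual CLDR structures.

```python
# Illustrative sketch of the kind of per-locale data CLDR provides.
# The table below is hand-made for this example, not real CLDR data.
from datetime import date

LOCALE_DATA = {
    "en_US": {"currency": "USD", "date_format": "%m/%d/%Y"},  # month/day/year
    "en_GB": {"currency": "GBP", "date_format": "%d/%m/%Y"},  # day/month/year
    "ja_JP": {"currency": "JPY", "date_format": "%Y/%m/%d"},  # year/month/day
}

def format_date(d, locale):
    """Format a date the way the given locale expects."""
    return d.strftime(LOCALE_DATA[locale]["date_format"])

d = date(2022, 4, 27)
print(format_date(d, "en_US"))  # 04/27/2022
print(format_date(d, "en_GB"))  # 27/04/2022
```

In practice, libraries such as ICU ship the real CLDR data, which is why nobody has to hard-code tables like this.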
1445
01:05:00,870 --> 01:05:03,060
And then they also then have the libraries
1446
01:05:03,060 --> 01:05:06,450
so that no one's basically programming things from scratch.
1447
01:05:06,450 --> 01:05:11,640
So, what's really funny is if you say CLDR really fast, it sounds like seal deer.
1448
01:05:11,640 --> 01:05:14,700
And this really confused one of the girlfriends of one of the engineers
1449
01:05:14,700 --> 01:05:16,500
why he was always talking about seal deers.
1450
01:05:16,500 --> 01:05:22,890
And so she basically surgically attached a bunch of antlers to this little guy
1451
01:05:22,890 --> 01:05:25,020
and made a seal deer.
1452
01:05:25,020 --> 01:05:29,640
And so it took three years between 2007 to 2010
1453
01:05:29,640 --> 01:05:31,860
to introduce the first Unicode emoji set.
1454
01:05:31,860 --> 01:05:33,780
So, these were the ones that came out.
1455
01:05:33,780 --> 01:05:36,390
It took many, many years to figure out how
1456
01:05:36,390 --> 01:05:39,930
to reconcile all the different images and which ones should we include,
1457
01:05:39,930 --> 01:05:42,030
which ones we shouldn't include.
1458
01:05:42,030 --> 01:05:45,510
And as you guys probably know from CS50, a Unicode code point
1459
01:05:45,510 --> 01:05:48,360
is a unique number assigned to each Unicode character.
1460
01:05:48,360 --> 01:05:51,570
So you can represent that emoji--
1461
01:05:51,570 --> 01:05:59,020
face with tears of joy, as this, or this, or the binary code.
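That mapping from character to number is easy to see in Python, where `ord` gives a character's code point; a minimal sketch using face with tears of joy (U+1F602):

```python
# A code point is just a number; Python exposes it directly.
emoji = "\U0001F602"           # face with tears of joy, code point U+1F602

print(ord(emoji))              # 128514, the code point as a decimal number
print(hex(ord(emoji)))         # 0x1f602
print(bin(ord(emoji)))         # the "binary code" for the same character
print(emoji.encode("utf-8"))   # the bytes actually sent over the wire
```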
1462
01:05:59,020 --> 01:06:02,730
So, emojis were just hanging out on your phone
1463
01:06:02,730 --> 01:06:06,390
after 2010, until 2011 when Apple suddenly made them
1464
01:06:06,390 --> 01:06:09,430
much easier to access on your phone.
1465
01:06:09,430 --> 01:06:14,670
And one of the confusing things, of course, is emoji are very ambiguous.
1466
01:06:14,670 --> 01:06:16,410
And it's not always clear what they mean.
1467
01:06:16,410 --> 01:06:18,130
And that's one of the great joys, right?
1468
01:06:18,130 --> 01:06:21,150
It can be more--
1469
01:06:21,150 --> 01:06:24,570
there's much more interpretation on--
1470
01:06:24,570 --> 01:06:27,087
in terms between the sender and the receiver.
1471
01:06:27,087 --> 01:06:28,920
So if you actually look-- if you start doing
1472
01:06:28,920 --> 01:06:30,735
that on Google, the autocompletes are like,
1473
01:06:30,735 --> 01:06:32,610
what does it mean when a guy sends it to you?
1474
01:06:32,610 --> 01:06:34,485
What does it mean when a girl sends it to you?
1475
01:06:34,485 --> 01:06:39,990
And clearly, many, many people have been confused by that emoji when
1476
01:06:39,990 --> 01:06:40,960
it's been sent to them.
1477
01:06:40,960 --> 01:06:42,360
So who can propose emoji?
1478
01:06:42,360 --> 01:06:44,550
And a short answer is basically anyone.
1479
01:06:44,550 --> 01:06:48,700
There's a Google form that is open between April and August.
1480
01:06:48,700 --> 01:06:52,110
So the hijab emoji actually was originally
1481
01:06:52,110 --> 01:06:56,490
proposed by a 15-year-old girl who is Saudi Arabian but lived in Germany.
1482
01:06:56,490 --> 01:07:00,518
Rayouf Alhumedhi, who actually got into Harvard and then chose Stanford,
1483
01:07:00,518 --> 01:07:02,310
so I was giving her a hard time about that.
1484
01:07:02,310 --> 01:07:02,700
I know.
1485
01:07:02,700 --> 01:07:03,200
Whoa.
1486
01:07:03,200 --> 01:07:05,880
[LAUGHS] I'm kind of on that.
1487
01:07:05,880 --> 01:07:07,780
I was like hmm.
1488
01:07:07,780 --> 01:07:11,610
So, she wrote the proposal, and it got through.
1489
01:07:11,610 --> 01:07:14,310
And she's actually the subject of the documentary
1490
01:07:14,310 --> 01:07:16,600
that we put together called The Emoji Story.
1491
01:07:16,600 --> 01:07:19,950
We also have a group of Argentinians who fought really hard for the mate
1492
01:07:19,950 --> 01:07:21,600
emoji, their national drink.
1493
01:07:21,600 --> 01:07:25,590
And then there was this nonprofit for girls advocacy
1494
01:07:25,590 --> 01:07:27,900
that really wanted a menstruation emoji.
1495
01:07:27,900 --> 01:07:32,585
And they sent in this bloody underpants proposal, which is really terrible,
1496
01:07:32,585 --> 01:07:33,210
I'll be honest.
1497
01:07:33,210 --> 01:07:38,220
So we kind of worked with them and got blood drop, which actually is one of--
1498
01:07:38,220 --> 01:07:42,240
actually, it's done pretty well statistically.
1499
01:07:42,240 --> 01:07:46,050
We were kind of surprised actually how popular it is.
1500
01:07:46,050 --> 01:07:49,650
The skin tone emoji were actually proposed not from within Unicode,
1501
01:07:49,650 --> 01:07:50,370
clearly.
1502
01:07:50,370 --> 01:07:53,220
It was done by a mom from Houston who's also
1503
01:07:53,220 --> 01:07:59,230
an entrepreneur, because her daughter asked her-- came home one day and said,
1504
01:07:59,230 --> 01:08:01,170
I'd really like an emoji that looks like me.
1505
01:08:01,170 --> 01:08:05,580
And her mom, Katrina Parrott, was like, that's great, honey.
1506
01:08:05,580 --> 01:08:07,350
What's an emoji?
1507
01:08:07,350 --> 01:08:10,320
But she actually had worked in procurement with NASA,
1508
01:08:10,320 --> 01:08:12,150
and so she understood the form for proposals,
1509
01:08:12,150 --> 01:08:17,609
and she actually was the one we should thank for having five skin tones today.
1510
01:08:17,609 --> 01:08:18,840
Woman's flat shoe.
1511
01:08:18,840 --> 01:08:22,890
And the one piece bathing suit, as opposed
1512
01:08:22,890 --> 01:08:27,330
to just the teeny weeny yellow polka dot bikini,
1513
01:08:27,330 --> 01:08:31,770
is from a mother of three, now four, who just wrote that,
1514
01:08:31,770 --> 01:08:34,770
because she was very offended that all of the shoe emoji
1515
01:08:34,770 --> 01:08:37,637
had high heels for women.
1516
01:08:37,637 --> 01:08:38,970
I actually really like this guy.
1517
01:08:38,970 --> 01:08:42,300
Some random guy in Germany came up with this emoji.
1518
01:08:42,300 --> 01:08:44,430
As we like to say, it's the Colbert emoji.
1519
01:08:44,430 --> 01:08:49,050
He wrote a proposal, and it got accepted because it was a really good proposal.
1520
01:08:49,050 --> 01:08:52,740
Then you even have governments, the Finnish government--
1521
01:08:52,740 --> 01:08:55,950
literally the Finnish government, their equivalent of the Department of State
1522
01:08:55,950 --> 01:08:59,880
proposed a sauna emoji, which these are the images.
1523
01:08:59,880 --> 01:09:01,770
And I think they're really ugly.
1524
01:09:01,770 --> 01:09:03,960
There's so many problems with this emoji.
1525
01:09:03,960 --> 01:09:07,120
But we helped them as Emojination.
1526
01:09:07,120 --> 01:09:13,268
First we got rid of the clubbed feet and then gave them examples.
1527
01:09:13,268 --> 01:09:18,220
Do you want them to hold the ladle, do you want the steam around it,
1528
01:09:18,220 --> 01:09:25,510
do you want it with clothing or not clothing?
1529
01:09:25,510 --> 01:09:30,290
We actually did a little bit of a towel for the more modest among us.
1530
01:09:30,290 --> 01:09:31,870
So it got passed.
1531
01:09:31,870 --> 01:09:36,010
And then the way it ended up is basically person in a steamy room.
1532
01:09:36,010 --> 01:09:38,080
So this is how it evolved.
1533
01:09:38,080 --> 01:09:41,500
So you can see that is what Finland submitted.
1534
01:09:41,500 --> 01:09:42,910
That is what we submitted.
1535
01:09:42,910 --> 01:09:46,040
And then that is how it's ended up on your phone.
1536
01:09:46,040 --> 01:09:50,560
And that is basically supposed to mean sauna emoji.
1537
01:09:50,560 --> 01:09:52,390
So one of the questions is like, why do I
1538
01:09:52,390 --> 01:09:54,910
care so much about emoji and representation of emoji?
1539
01:09:54,910 --> 01:09:57,160
And a lot of it has to do with the fact that I grew up
1540
01:09:57,160 --> 01:10:02,740
speaking Chinese and going to Saturday Chinese school.
1541
01:10:02,740 --> 01:10:06,565
And as you can see there's some really interesting parallels
1542
01:10:06,565 --> 01:10:10,510
between modern day emoji and Chinese radicals and characters
1543
01:10:10,510 --> 01:10:11,540
from a long time ago.
1544
01:10:11,540 --> 01:10:12,670
So this is fire.
1545
01:10:12,670 --> 01:10:13,910
This is mouth.
1546
01:10:13,910 --> 01:10:15,160
This is tree.
1547
01:10:15,160 --> 01:10:16,330
This is moon.
1548
01:10:16,330 --> 01:10:18,100
This is sun.
1549
01:10:18,100 --> 01:10:20,510
And you can mix and match them in Chinese as well.
1550
01:10:20,510 --> 01:10:25,600
So, one of the interesting ones is two trees together basically makes
1551
01:10:25,600 --> 01:10:26,710
a forest.
1552
01:10:26,710 --> 01:10:32,230
You have a sun and a moon together, and that means bright in Chinese.
1553
01:10:32,230 --> 01:10:33,220
It's kind of fun.
1554
01:10:33,220 --> 01:10:35,410
Then, this one's fun, right?
1555
01:10:35,410 --> 01:10:41,110
So it's basically a pig underneath a roof.
1556
01:10:41,110 --> 01:10:44,380
So you're like, oh, maybe that means farm.
1557
01:10:44,380 --> 01:10:48,250
Or I don't know, a barn or some kind of animal thing.
1558
01:10:48,250 --> 01:10:51,880
But actually, that in Chinese means home or family.
1559
01:10:51,880 --> 01:10:56,710
So home is where your pigs are, which I think says a lot about society
1560
01:10:56,710 --> 01:11:01,673
and what people cared about way back in ancient China.
1561
01:11:01,673 --> 01:11:02,840
This is one of my favorites.
1562
01:11:02,840 --> 01:11:06,400
So, this is a character for woman or female-- nü.
1563
01:11:06,400 --> 01:11:10,240
And I guess, it kind of looks like she's curtsying or something.
1564
01:11:10,240 --> 01:11:18,550
So super interesting character if you grow up writing your characters.
1565
01:11:18,550 --> 01:11:20,290
So this is a woman underneath a roof.
1566
01:11:20,290 --> 01:11:24,100
And you're like, oh, that might mean wife, or family, or something.
1567
01:11:24,100 --> 01:11:28,880
But it actually doesn't, it means peace.
1568
01:11:28,880 --> 01:11:32,860
So the idea is like things are at peace when
1569
01:11:32,860 --> 01:11:37,270
the woman is under a roof, which I always felt kind of weird
1570
01:11:37,270 --> 01:11:39,160
about that growing up.
1571
01:11:39,160 --> 01:11:43,960
Another one is-- OK there's the woman, and then you have a child or boy child,
1572
01:11:43,960 --> 01:11:45,620
specifically.
1573
01:11:45,620 --> 01:11:48,520
So you're like, oh, that might mean family, or mother, or something.
1574
01:11:48,520 --> 01:11:50,080
But actually it means good.
1575
01:11:50,080 --> 01:11:52,120
So the standard for good in ancient China
1576
01:11:52,120 --> 01:11:56,620
was a woman with a boy child, which, as a six-year-old,
1577
01:11:56,620 --> 01:11:59,390
I found problematic as well.
1578
01:11:59,390 --> 01:12:03,430
And all kinds of [CHUCKLES] interesting things in Chinese
1579
01:12:03,430 --> 01:12:04,900
use the female radical.
1580
01:12:04,900 --> 01:12:08,230
So three women together means evil.
1581
01:12:08,230 --> 01:12:10,480
This one means greedy.
1582
01:12:10,480 --> 01:12:12,850
This one means slave.
1583
01:12:12,850 --> 01:12:16,570
This one means jealous.
1584
01:12:16,570 --> 01:12:20,510
This one means betrayal or adultery, which I think is interesting.
1585
01:12:20,510 --> 01:12:23,770
So, in case you want to bring this to your favorite 10-year-old,
1586
01:12:23,770 --> 01:12:28,000
we have a Chinese and emoji kids' book coming out
1587
01:12:28,000 --> 01:12:31,330
from MIT Press in the fall called Hanmoji.
1588
01:12:31,330 --> 01:12:32,590
So it's from MITeen Press.
1589
01:12:32,590 --> 01:12:33,223
It's super fun.
1590
01:12:33,223 --> 01:12:35,890
So it's a lot of these concepts, but a little bit more rigorous.
1591
01:12:35,890 --> 01:12:43,180
And this idea of gender in emoji was really important to a bunch of us
1592
01:12:43,180 --> 01:12:44,930
as we were working through the issues.
1593
01:12:44,930 --> 01:12:48,350
So, for a long time, on the emoji keyboard
1594
01:12:48,350 --> 01:12:52,060
there are all kinds of jobs you could have as a man.
1595
01:12:52,060 --> 01:12:54,400
You could be a police officer, you could be a detective,
1596
01:12:54,400 --> 01:12:57,550
you could be Buckingham Palace guard, you could even be Santa,
1597
01:12:57,550 --> 01:13:01,090
you can be Black Santa, right?
1598
01:13:01,090 --> 01:13:05,290
As of 2015, if you're a woman, there are only four jobs
1599
01:13:05,290 --> 01:13:07,430
you could have on the emoji keyboard.
1600
01:13:07,430 --> 01:13:11,140
So you could be a princess, you could be a bride, you could be a dancer,
1601
01:13:11,140 --> 01:13:12,670
or you could be a Playboy bunny.
1602
01:13:12,670 --> 01:13:14,330
So those were your four choices.
1603
01:13:14,330 --> 01:13:21,800
And so we worked really hard on trying to diversify what women could be.
1604
01:13:21,800 --> 01:13:25,120
And one of the ways we did it was through this idea of combining emoji.
1605
01:13:25,120 --> 01:13:29,470
So in emojiland there's something called ZWJ, a zero-width joiner.
1606
01:13:29,470 --> 01:13:33,080
And a lot of emoji that you see are actually glued together.
1607
01:13:33,080 --> 01:13:36,070
So the rainbow plus flag is how you get rainbow flag.
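The rainbow flag example can be reproduced in Python just by concatenating code points; the variation selector U+FE0F is part of the standard sequence:

```python
ZWJ = "\u200D"                 # zero-width joiner

flag = "\U0001F3F3"            # waving white flag
vs16 = "\uFE0F"                # variation selector-16: "render as emoji"
rainbow = "\U0001F308"         # rainbow

# Rainbow flag = white flag + variation selector + ZWJ + rainbow.
rainbow_flag = flag + vs16 + ZWJ + rainbow

print(rainbow_flag)            # renders as one glyph on a supporting platform
print(len(rainbow_flag))       # 4 code points glued together
```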
1608
01:13:36,070 --> 01:13:39,460
And this is actually how we worked on introducing a bunch
1609
01:13:39,460 --> 01:13:44,440
of the occupations in emojiland.
1610
01:13:44,440 --> 01:13:52,660
So a lot of these are-- the chef is a woman plus the frying pan,
1611
01:13:52,660 --> 01:13:57,175
or a teacher is a woman-- or man, actually-- plus a school.
1612
01:13:57,175 --> 01:14:00,340
1613
01:14:00,340 --> 01:14:03,670
So one of the interesting things is,
1614
01:14:03,670 --> 01:14:09,520
as a result of all the gender parity stuff, we actually had
1615
01:14:09,520 --> 01:14:14,110
to make male and female versions of all the emoji, because some of them
1616
01:14:14,110 --> 01:14:17,140
originally were passed with one gender, like man in tuxedo.
1617
01:14:17,140 --> 01:14:20,290
And now, because we had gendered versions of everything,
1618
01:14:20,290 --> 01:14:21,643
we now have woman in tuxedo.
1619
01:14:21,643 --> 01:14:22,810
I don't know if you noticed.
1620
01:14:22,810 --> 01:14:25,810
There's also a man in a wedding dress to complement
1621
01:14:25,810 --> 01:14:28,630
the woman in a wedding dress.
1622
01:14:28,630 --> 01:14:30,400
There's now actually also bearded woman.
1623
01:14:30,400 --> 01:14:32,500
I don't know if you've noticed that.
1624
01:14:32,500 --> 01:14:33,700
So it gets interesting.
1625
01:14:33,700 --> 01:14:36,910
Because originally, at a certain point we had passed a woman breastfeeding.
1626
01:14:36,910 --> 01:14:39,970
And then there were all of these complaints coming into Unicode
1627
01:14:39,970 --> 01:14:41,887
about, what about men as caretakers?
1628
01:14:41,887 --> 01:14:43,720
You can't actually tell she's breastfeeding.
1629
01:14:43,720 --> 01:14:45,262
It's more just like she's holding it.
1630
01:14:45,262 --> 01:14:47,800
So people are like, what about the man as a caretaker?
1631
01:14:47,800 --> 01:14:49,790
Paternity leave and da, da, da, da, da, da.
1632
01:14:49,790 --> 01:14:56,260
So, there is now man nursing the child.
1633
01:14:56,260 --> 01:15:04,210
And the other ways you can combine the emoji are through skin tones.
1634
01:15:04,210 --> 01:15:06,700
So unfortunately, those are not through ZWJs.
1635
01:15:06,700 --> 01:15:10,660
This is an older technology where all the skin
1636
01:15:10,660 --> 01:15:15,250
tones are basically the yellow character plus a little square box at the end.
1637
01:15:15,250 --> 01:15:17,200
We call them skin tone modifiers.
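A minimal Python sketch of how those modifiers combine with a base character; thumbs up is just an arbitrary example:

```python
# Skin tone variants: base emoji + one of five Fitzpatrick modifiers
# (U+1F3FB through U+1F3FF), appended directly -- no ZWJ needed.
thumbs_up = "\U0001F44D"
modifiers = [chr(cp) for cp in range(0x1F3FB, 0x1F400)]  # the five tones

for mod in modifiers:
    # each pairing renders as a single thumbs-up glyph with that skin tone
    print(thumbs_up + mod)
```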
1638
01:15:17,200 --> 01:15:22,240
And in terms of the things that we worked on at Emojination,
1639
01:15:22,240 --> 01:15:27,760
one of the hardest ones was to create the interracial couples.
1640
01:15:27,760 --> 01:15:30,940
And we worked on that with Tinder, which really cared about it.
1641
01:15:30,940 --> 01:15:33,340
Because apparently, which I thought was interesting,
1642
01:15:33,340 --> 01:15:36,610
when you introduce online dating into a community,
1643
01:15:36,610 --> 01:15:39,340
the rates of interracial marriage go up.
1644
01:15:39,340 --> 01:15:41,440
And there's a pretty interesting academic paper
1645
01:15:41,440 --> 01:15:44,500
that systematically looks at the rollout from different countries
1646
01:15:44,500 --> 01:15:47,540
and different communities.
1647
01:15:47,540 --> 01:15:51,250
So it was really nice to see it introduced on the phone.
1648
01:15:51,250 --> 01:15:53,260
One of my friends cried.
1649
01:15:53,260 --> 01:15:56,570
In terms of Emojination emoji, we've worked on a lot.
1650
01:15:56,570 --> 01:16:00,580
So these are just a sampling of the ones that we've done.
1651
01:16:00,580 --> 01:16:02,800
I really liked-- let's see.
1652
01:16:02,800 --> 01:16:04,420
DNA, I feel really good about.
1653
01:16:04,420 --> 01:16:08,200
Lobster, on behalf of people from Maine.
1654
01:16:08,200 --> 01:16:12,040
Yarn and thread, for all the people who like knitting.
1655
01:16:12,040 --> 01:16:16,390
There is bagel emoji, on behalf of all New Yorkers.
1656
01:16:16,390 --> 01:16:20,530
This emoji actually, which we called microbe, was like very sleepy
1657
01:16:20,530 --> 01:16:22,562
on the keyboard, until 2020.
1658
01:16:22,562 --> 01:16:23,770
And it really had its moment.
1659
01:16:23,770 --> 01:16:26,440
I'm really proud of that one.
1660
01:16:26,440 --> 01:16:32,890
And there is yoga emoji, sponge, so these are just a sampling of the ones
1661
01:16:32,890 --> 01:16:33,790
that we've worked on.
1662
01:16:33,790 --> 01:16:36,340
And this is a sampling of the people who have contributed.
1663
01:16:36,340 --> 01:16:38,710
You too, if you feel really passionate about emoji,
1664
01:16:38,710 --> 01:16:42,890
could impact billions of keyboards worldwide.
1665
01:16:42,890 --> 01:16:46,810
So it's interesting to see in terms of frequency of use.
1666
01:16:46,810 --> 01:16:51,220
It's very power law, right?
1667
01:16:51,220 --> 01:16:56,440
These are actually orders of magnitude.
1668
01:16:56,440 --> 01:17:00,610
So one is half of this, two is half of one, all the way down.
1669
01:17:00,610 --> 01:17:03,910
And one of the most stunning things I was surprised to see
1670
01:17:03,910 --> 01:17:10,240
is that face with tears of joy by itself is almost 10% of all emoji sent,
1671
01:17:10,240 --> 01:17:13,780
9.9% of emoji is just that one character.
1672
01:17:13,780 --> 01:17:18,050
Number two is red heart, which I guess you guys can see in its binary form.
1673
01:17:18,050 --> 01:17:20,360
And then it falls off pretty quickly.
1674
01:17:20,360 --> 01:17:26,620
So I know I'm hearing that face with tears of joy is very Boomer or very Gen X,
1675
01:17:26,620 --> 01:17:33,430
and that maybe among you guys it's a little bit blasé or déclassé at this
1676
01:17:33,430 --> 01:17:34,330
point.
1677
01:17:34,330 --> 01:17:35,650
So the future emoji--
1678
01:17:35,650 --> 01:17:39,040
we really don't-- Unicode does not want to be encoding emoji.
1679
01:17:39,040 --> 01:17:42,190
And along the way I became a vice chair of the Unicode Emoji Subcommittee.
1680
01:17:42,190 --> 01:17:46,450
So I went from shaking my fist at the institution
1681
01:17:46,450 --> 01:17:50,150
to becoming part of the institution.
1682
01:17:50,150 --> 01:17:54,070
So there's one idea: encoded hashes of arbitrary images.
1683
01:17:54,070 --> 01:18:01,120
Can we create a system where instead of just using a binary code
1684
01:18:01,120 --> 01:18:05,980
to represent all the different emoji, we actually can do specific images,
1685
01:18:05,980 --> 01:18:09,040
we create hashes, and then you
1686
01:18:09,040 --> 01:18:13,955
can look up by the hash which image you're looking at.
1687
01:18:13,955 --> 01:18:14,830
So that was the idea.
1688
01:18:14,830 --> 01:18:16,360
This is from a Stanford professor.
1689
01:18:16,360 --> 01:18:17,680
It didn't really take off.
1690
01:18:17,680 --> 01:18:23,050
Then there was this idea of using Wikipedia, or Wikimedia, the Wikidata QID
1691
01:18:23,050 --> 01:18:26,770
numbers, which I didn't know about until this proposal came along.
1692
01:18:26,770 --> 01:18:31,990
But everything in Wikipedia has a number and that
1693
01:18:31,990 --> 01:18:34,610
allows it to match things between different languages.
1694
01:18:34,610 --> 01:18:39,920
So in Chinese the page for Obama is matched with the English page,
1695
01:18:39,920 --> 01:18:42,250
and with the Arabic page.
1696
01:18:42,250 --> 01:18:43,520
And that went nowhere.
1697
01:18:43,520 --> 01:18:48,718
So, what I'm going to finish with is telling you what the new emoji are.
1698
01:18:48,718 --> 01:18:51,010
You guys are among the first people to hear about this,
1699
01:18:51,010 --> 01:18:53,587
because no one's really been paying attention.
1700
01:18:53,587 --> 01:18:55,420
So, this is published a couple of weeks ago,
1701
01:18:55,420 --> 01:19:01,180
but it made no news, because you have to be looking at the Unicode register.
1702
01:19:01,180 --> 01:19:04,630
So first off, more hearts, because you guys all love hearts.
1703
01:19:04,630 --> 01:19:07,378
So there's light blue heart, gray heart, and pink heart.
1704
01:19:07,378 --> 01:19:08,420
There was kind of a debate.
1705
01:19:08,420 --> 01:19:09,587
Do we need more pink hearts?
1706
01:19:09,587 --> 01:19:11,380
And the answer seems to be yes.
1707
01:19:11,380 --> 01:19:12,760
Light blue is really interesting.
1708
01:19:12,760 --> 01:19:17,890
Because in some cultures, light blue and dark blue are different colors.
1709
01:19:17,890 --> 01:19:20,830
In our culture, we just call them versions of blue.
1710
01:19:20,830 --> 01:19:24,730
It's sort of like how in English pink and red are different colors,
1711
01:19:24,730 --> 01:19:29,658
but in some cultures there isn't a difference between pink and red.
1712
01:19:29,658 --> 01:19:31,450
And then there were a bunch of bird things.
1713
01:19:31,450 --> 01:19:35,410
The wing emoji is coming, blackbird and goose.
1714
01:19:35,410 --> 01:19:37,840
I don't really know why.
1715
01:19:37,840 --> 01:19:40,450
Hyacinth as a flower.
1716
01:19:40,450 --> 01:19:43,600
This is very popular in Iranian culture.
1717
01:19:43,600 --> 01:19:44,530
Jellyfish.
1718
01:19:44,530 --> 01:19:47,650
I don't know.
1719
01:19:47,650 --> 01:19:53,440
I'm very suspicious of a jellyfish, because they used man-of-war
1720
01:19:53,440 --> 01:19:56,320
as one of their phrases that they searched for.
1721
01:19:56,320 --> 01:19:57,460
And that had a billion--
1722
01:19:57,460 --> 01:19:59,500
I think-- it had a lot of entries.
1723
01:19:59,500 --> 01:20:03,423
And I feel like those were not about the actual invertebrate.
1724
01:20:03,423 --> 01:20:05,090
There was something else going on there.
1725
01:20:05,090 --> 01:20:07,990
But it kind of rode in on that.
1726
01:20:07,990 --> 01:20:10,180
Moose on behalf of the Canadians.
1727
01:20:10,180 --> 01:20:15,092
Donkey on behalf of, I guess, the Democrats.
1728
01:20:15,092 --> 01:20:17,050
So that was interesting because you had to have
1729
01:20:17,050 --> 01:20:19,090
the donkey look different from a horse.
1730
01:20:19,090 --> 01:20:22,060
And there was a whole debate like, do you want a donkey head?
1731
01:20:22,060 --> 01:20:23,650
Or do you want a donkey body?
1732
01:20:23,650 --> 01:20:25,880
Do you want donkey with fluffy ears?
1733
01:20:25,880 --> 01:20:28,330
Do you want-- all kinds of donkey debate.
1734
01:20:28,330 --> 01:20:33,100
And it was actually originally proposed in 2019 and just got into this year.
1735
01:20:33,100 --> 01:20:36,250
Ginger and peapod.
1736
01:20:36,250 --> 01:20:37,420
These are kind of weird.
1737
01:20:37,420 --> 01:20:39,670
The food things kind of got in, in a weird way.
1738
01:20:39,670 --> 01:20:43,180
Ginger was good because it also represented root.
1739
01:20:43,180 --> 01:20:45,925
And then wireless got in, which is interesting.
1740
01:20:45,925 --> 01:20:48,550
Because we couldn't use the phrase Wi-Fi, because that's actually
1741
01:20:48,550 --> 01:20:50,530
trademarked by the Wi-Fi people.
1742
01:20:50,530 --> 01:20:54,340
And then on behalf of Sikhs, Khanda finally got in.
1743
01:20:54,340 --> 01:20:58,060
It was the largest religion that wasn't already represented
1744
01:20:58,060 --> 01:20:59,440
on the emoji keyboard.
1745
01:20:59,440 --> 01:21:01,795
And then on behalf of the faces--
1746
01:21:01,795 --> 01:21:02,590
[LAUGHTER]
1747
01:21:02,590 --> 01:21:04,720
--shaking face.
1748
01:21:04,720 --> 01:21:09,205
So, I'm glad you guys are really excited by that.
1749
01:21:09,205 --> 01:21:12,610
It is unclear to me--
1750
01:21:12,610 --> 01:21:15,520
I was not a big proponent of this, but your excitement about it
1751
01:21:15,520 --> 01:21:17,560
makes me change my mind.
1752
01:21:17,560 --> 01:21:19,952
Then, folding hand fan.
1753
01:21:19,952 --> 01:21:21,910
I actually find that one interesting, because I
1754
01:21:21,910 --> 01:21:23,500
think it was just college students.
1755
01:21:23,500 --> 01:21:25,540
Or fresh-out-of-college students were like, we
1756
01:21:25,540 --> 01:21:27,430
want to do a proposal that passes.
1757
01:21:27,430 --> 01:21:31,330
And they were very opportunistic and just chose fan.
1758
01:21:31,330 --> 01:21:33,190
And then first they submitted electric fan.
1759
01:21:33,190 --> 01:21:37,600
And then we told them, oh, the longevity for electric fan
1760
01:21:37,600 --> 01:21:40,600
isn't great, even though it's been around for a couple years.
1761
01:21:40,600 --> 01:21:44,470
Why don't we go with the folding hand fan, which has a much longer history.
1762
01:21:44,470 --> 01:21:50,710
And then, this one is actually a big deal: the afro hair pick.
1763
01:21:50,710 --> 01:21:53,710
There was a lot of controversy and debate about curly hair.
1764
01:21:53,710 --> 01:21:57,010
And it was supposed to represent afros.
1765
01:21:57,010 --> 01:21:58,750
And then Apple did not do that.
1766
01:21:58,750 --> 01:22:01,210
So, everyone else has very afro-looking hair.
1767
01:22:01,210 --> 01:22:03,190
Apple just makes it look wavy.
1768
01:22:03,190 --> 01:22:06,040
And so people were upset that Black hair
1769
01:22:06,040 --> 01:22:07,600
wasn't represented in the emoji set.
1770
01:22:07,600 --> 01:22:10,900
And so this was a proposal that someone worked on.
1771
01:22:10,900 --> 01:22:12,200
And then animals.
1772
01:22:12,200 --> 01:22:12,700
Sorry.
1773
01:22:12,700 --> 01:22:14,140
Not animals, instruments.
1774
01:22:14,140 --> 01:22:16,527
Maracas and flute.
1775
01:22:16,527 --> 01:22:17,110
And that's it.
1776
01:22:17,110 --> 01:22:19,880
So, if you have any questions,
1777
01:22:19,880 --> 01:22:22,270
you can look at emojination.org.
1778
01:22:22,270 --> 01:22:25,210
You can email me for all things emoji.
1779
01:22:25,210 --> 01:22:26,680
jenny@emojination.org.
1780
01:22:26,680 --> 01:22:29,740
And remember, you guys can actually impact
1781
01:22:29,740 --> 01:22:31,870
billions of keyboards around the world.
1782
01:22:31,870 --> 01:22:35,020
I mean, it's a little bit of impact per human, but there are billions,
1783
01:22:35,020 --> 01:22:36,850
so that adds up to a lot.
1784
01:22:36,850 --> 01:22:39,220
And if you have any more questions, I am here
1785
01:22:39,220 --> 01:22:41,770
and can give lots of answers.
1786
01:22:41,770 --> 01:22:45,310
And I'm really thrilled actually to bring
1787
01:22:45,310 --> 01:22:49,060
the emoji flag waving to such a large crowd
1788
01:22:49,060 --> 01:22:52,180
and especially a large, diverse and very motivated crowd.
1789
01:22:52,180 --> 01:22:57,467
And one of the interesting things is, we've kind of--
1790
01:22:57,467 --> 01:22:59,425
I'm not a proponent of this, but they've slowly
1791
01:22:59,425 --> 01:23:01,570
decreased the number of emoji per year.
1792
01:23:01,570 --> 01:23:02,620
It was 70.
1793
01:23:02,620 --> 01:23:03,400
Then it was 50.
1794
01:23:03,400 --> 01:23:04,550
Then it was 30.
1795
01:23:04,550 --> 01:23:06,190
And this year we only did 20.
1796
01:23:06,190 --> 01:23:09,970
And I'm a little bit sad about that.
1797
01:23:09,970 --> 01:23:17,230
But I hope that, if there are more exciting proposals that
1798
01:23:17,230 --> 01:23:21,020
can be submitted to Unicode, we might be able to dial that number back up.
1799
01:23:21,020 --> 01:23:23,155
So, that is me.
1800
01:23:23,155 --> 01:23:24,400
Am I good?
1801
01:23:24,400 --> 01:23:24,920
Yeah.
1802
01:23:24,920 --> 01:23:25,420
So--
1803
01:23:25,420 --> 01:23:25,920
[APPLAUSE]
1804
01:23:25,920 --> 01:23:26,470
Thank you.
1805
01:23:26,470 --> 01:23:27,454
[LAUGHTER]
1806
01:23:27,454 --> 01:23:30,406
[APPLAUSE CONTINUES]
1807
01:23:30,406 --> 01:23:32,595
1808
01:23:32,595 --> 01:23:35,470
DAVID J. MALAN: This is about 20 years late, but thank you so much, Jenny.
1809
01:23:35,470 --> 01:23:39,290
We have an "I took CS50" t-shirt for you.
1810
01:23:39,290 --> 01:23:42,050
On the way out too, we have some CS50 stress balls for you.
1811
01:23:42,050 --> 01:23:44,008
Cannot wait to see your final projects.
1812
01:23:44,008 --> 01:23:45,800
Come on up if you'd like to chat with Jenny.
1813
01:23:45,800 --> 01:23:47,570
This was CS50.
1814
01:23:47,570 --> 01:23:48,140
See you soon.
1815
01:23:48,140 --> 01:23:50,890
[CHEERS AND APPLAUSE]
1816
01:23:50,890 --> 01:24:23,000
[MUSIC PLAYING]