1
00:00:02,730 --> 00:00:03,400
Yeah.
2
00:00:11,800 --> 00:00:17,120
Hi. Hi. Yeah, nice to meet you. Nice to meet
you as well. Yeah. I'm John Kim. Yeah, I'm the
3
00:00:17,400 --> 00:00:21,450
communication leader, and today
is the first session.
4
00:00:22,810 --> 00:00:30,660
We'll talk about our lives and our interests. I'll
do my best. Yeah. Could you introduce yourself briefly?
5
00:00:30,990 --> 00:00:39,780
So my name is Chris Ha. I'm currently a Caterpillar
Fellow, and I'm part of Cat Digital, which is a
6
00:00:39,790 --> 00:00:40,820
business
unit that
7
00:00:41,760 --> 00:00:48,630
owns and manages digital products, yeah, within
Caterpillar. Yeah. And I've been working at
8
00:00:48,640 --> 00:00:57,650
Caterpillar for 22 years, and before
that, three years at a utility company called Duke
9
00:00:57,660 --> 00:01:05,330
Energy. And currently I live in Champaign, and I
have a wife who works at the university, as
10
00:01:05,340 --> 00:01:07,290
well as a
daughter who's in college.
11
00:01:12,140 --> 00:01:17,150
You mentioned you've stayed at Caterpillar for 22 years.
Have you thought about other jobs at all?
12
00:01:18,320 --> 00:01:24,430
Pardon me? Have you thought about,
like, getting another job? Yeah, sure. I mean, I've,
13
00:01:24,440 --> 00:01:25,510
I've,
I've
14
00:01:26,610 --> 00:01:33,900
had some opportunities to explore, yeah, other things,
but frankly, Caterpillar has been giving me
15
00:01:33,910 --> 00:01:41,160
a lot of great opportunities in a lot of
different areas, and most of the projects I was fortunate
16
00:01:41,660 --> 00:01:44,050
enough to
be part of,
17
00:01:44,830 --> 00:01:46,210
I've been,
uh,
18
00:01:48,420 --> 00:01:51,850
very, very passionate about the
work that I've done. Yeah, so,
19
00:01:53,160 --> 00:01:58,790
although there might have been, you know,
some other opportunities at times, I decided
20
00:01:58,800 --> 00:02:04,310
to just stay at Caterpillar because I felt like
it's the right, you know, choice for me and my
21
00:02:04,320 --> 00:02:09,270
family as well as my career. So I've been very
happy with Caterpillar so far. Yeah, so far I am
22
00:02:09,280 --> 00:02:15,650
very happy too. I only came to Caterpillar last
year, but I think it's a great
23
00:02:15,660 --> 00:02:18,970
company and it gives me
a ton of opportunities now.
24
00:02:20,480 --> 00:02:24,230
So let's talk about
what you do at Cat.
25
00:02:24,930 --> 00:02:33,390
So I am a technical leader within
Cat Digital, yeah, and my job is to
26
00:02:34,150 --> 00:02:37,150
provide thought leadership
in some of the
27
00:02:38,270 --> 00:02:44,120
big enterprise initiatives
for the enterprise, such as condition monitoring,
28
00:02:44,390 --> 00:02:50,330
electrification, and more recently
the artificial intelligence space, yeah.
29
00:02:51,150 --> 00:03:02,120
So my, you know, role
as an individual contributor is to ensure
30
00:03:02,130 --> 00:03:05,700
that we're providing a
solid and robust technical
31
00:03:06,780 --> 00:03:17,440
basis for our digital
products, as well as to explore maybe the latest
32
00:03:17,660 --> 00:03:20,070
AI technology where
we can provide,
33
00:03:20,770 --> 00:03:22,880
you know,
good, solid
34
00:03:24,240 --> 00:03:29,890
analytics, yeah, to
gain, you know, business value
35
00:03:30,530 --> 00:03:37,380
by applying them to real, practical Caterpillar
applications. Yeah. When I was looking for a job
36
00:03:37,390 --> 00:03:45,400
opportunity at Cat, I came across some apps
developed by Cat, the ones called MineStar and MineStar
37
00:03:45,410 --> 00:03:50,560
Solutions. Were those apps
developed by you and your team?
38
00:03:51,850 --> 00:03:58,630
Not, not really. The MineStar that you're
talking about is, actually, it was
39
00:03:58,640 --> 00:04:00,160
done by,
40
00:04:00,900 --> 00:04:08,290
currently, ICS, the Integrated Components and Solutions
division, as well as the Resource Industries side;
41
00:04:08,340 --> 00:04:15,910
mostly we have a team in Australia, yeah, that
does this. Cat Digital is not the one
42
00:04:15,920 --> 00:04:18,520
that's behind it. However,
we do have some,
43
00:04:19,260 --> 00:04:23,310
uh, projects where we're going
to be collaborating with
44
00:04:23,980 --> 00:04:31,470
the autonomy and automation group, which actually
owns the MineStar technology, as well as
45
00:04:31,480 --> 00:04:37,010
the commercial group and the RI side of
it, which is Resource Industries. So we have some
46
00:04:37,860 --> 00:04:43,270
integration and collaboration we hope to do in the
future, but MineStar was not part of
47
00:04:43,280 --> 00:04:46,190
Cat Digital
to begin with.
48
00:04:47,710 --> 00:04:53,060
As for myself, I am a
stock investor, and everyone was talking about, like, Tesla
49
00:04:53,460 --> 00:04:56,880
autonomous driving. So
compared to those
50
00:04:58,950 --> 00:05:07,500
big companies, right, how far is Cat behind
or ahead, like, in terms of technology? So I'm not
51
00:05:07,510 --> 00:05:12,800
an expert at it, but I know
that we've been using, you know, autonomous,
52
00:05:15,000 --> 00:05:21,290
you know, perception-type computer vision
technology for a long time, yeah. And we've
53
00:05:21,400 --> 00:05:29,950
had, you know, the big mining trucks
at the mine sites running automatically, you know, driverless
54
00:05:30,680 --> 00:05:37,810
mining trucks. It's been, you know, going on for a
long time, and millions of tons of, you know,
55
00:05:37,820 --> 00:05:39,270
payload
have been
56
00:05:41,190 --> 00:05:47,520
carried by those MineStar trucks we have.
So from that perspective, you know, automation, autonomous
57
00:05:47,530 --> 00:05:54,760
trucks especially, has been, you know, at Caterpillar
for a long time now. You know, the
58
00:05:54,770 --> 00:06:01,580
technology-wise it might be slightly different,
but I'm sure that we, at least from our
59
00:06:01,590 --> 00:06:07,960
industry perspective, not automotive but the heavy
machinery industry, are one of the best,
60
00:06:07,970 --> 00:06:09,730
if not the
best, in that area.
61
00:06:11,540 --> 00:06:18,170
So currently, like, yeah, do we make profits
by selling the software by itself? Well,
62
00:06:18,180 --> 00:06:24,790
MineStar is actually a software system as well
as hardware, right? So there's a lot
63
00:06:24,800 --> 00:06:30,520
of hardware that has to be installed, but we
also have software that supports it, and
64
00:06:30,530 --> 00:06:36,410
they have an entire team that has to be
dedicated to supporting that, both at the dealership
65
00:06:36,420 --> 00:06:40,650
and customer level, as well as the
team that supports it from,
66
00:06:40,980 --> 00:06:45,490
mostly, from our RI side of it, with the
back-end technologies developed by, I would think, the
67
00:06:45,500 --> 00:06:53,680
ICS side. Yeah. So we don't sell the software as
it is; we sell it with the machine. And yeah,
68
00:06:53,690 --> 00:06:57,240
I think you have to have the hardware in order to
use the software. You have to have a lot of
69
00:06:57,250 --> 00:07:02,240
equipment, right? For perception, you actually
have to have, you know, the lidars and all those
70
00:07:02,250 --> 00:07:07,120
things installed. So those are
the must-haves, and then I'm sure we
71
00:07:07,130 --> 00:07:11,020
do also have, as part of the
package, right,
72
00:07:12,150 --> 00:07:16,810
the software. But I don't know whether we actually
sell it as software by itself, without any type of
73
00:07:16,820 --> 00:07:22,490
support, right? Yeah. As I understand it,
Tesla and those other companies, they sell it
74
00:07:22,500 --> 00:07:30,870
on a subscription basis. So to use it, like,
you have to subscribe to the service. Yeah, with this
75
00:07:30,880 --> 00:07:36,190
one, I'm sure there is a subscription.
MineStar is also various different things. There's MineStar
76
00:07:36,200 --> 00:07:40,930
Command, there's MineStar Fleet, there's MineStar
Health. So it's not just the one
77
00:07:41,100 --> 00:07:47,360
MineStar; there are different levels of things
that MineStar can do. So yeah, but that's
78
00:07:47,370 --> 00:07:57,460
not really my area of expertise. So what was
your flagship, like, application that you did? So, OK,
79
00:07:57,500 --> 00:08:04,060
here is what we do right now. The
condition monitoring side of it is something where we do
80
00:08:04,070 --> 00:08:05,330
analytics
to,
81
00:08:06,030 --> 00:08:07,670
ohh,
find out
82
00:08:08,460 --> 00:08:10,930
whether there would be
anything wrong with the
83
00:08:11,790 --> 00:08:21,000
asset, in advance, so that you avoid downtime. Maybe
we can sell parts in advance, or we
84
00:08:21,010 --> 00:08:27,820
can reduce unplanned, you know, maintenance. So we
try to predict failures in the future so that we
85
00:08:27,830 --> 00:08:36,340
can, you know, ask our customers to plan ahead so
that they don't have to suffer the
86
00:08:36,350 --> 00:08:41,800
downtime, because especially in the, you
know, resource industries, the downtime is
87
00:08:42,170 --> 00:08:42,730
the
key.
88
00:08:44,910 --> 00:08:50,440
And they can better manage their machines. Yeah,
the uptime, right; you don't want
89
00:08:50,450 --> 00:08:55,260
machines to go down all of a sudden, so that
you can really do the work, right? So you see some
90
00:08:55,270 --> 00:09:00,760
symptoms from the machine, because the machine has a lot
of sensors, and what we try to do is
91
00:09:00,770 --> 00:09:05,800
warn them, you know, ahead, so they can
go and fix it before it actually breaks down.
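A minimal sketch in Python of the kind of sensor-trend warning described here. This is an illustration only, not Caterpillar's actual analytics; the signal, the failure limit, and the timing are all invented for the example.

    import numpy as np

    # Hypothetical hourly bearing-temperature readings drifting upward.
    rng = np.random.default_rng(0)
    hours = np.arange(24 * 7)  # one week of data
    temps = 70 + 0.05 * hours + rng.normal(0, 0.5, hours.size)

    FAILURE_LIMIT = 95.0  # assumed spec limit for this example

    # Fit a linear trend and extrapolate to the limit.
    slope, intercept = np.polyfit(hours, temps, 1)
    if slope > 0:
        hours_left = (FAILURE_LIMIT - temps[-1]) / slope
        print(f"Projected to cross {FAILURE_LIMIT} deg in ~{hours_left:.0f} h;"
              " schedule maintenance before then.")
    else:
        print("No upward trend detected.")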
92
00:09:06,660 --> 00:09:14,210
Uh, I came across a news article, like,
uh, about you getting a big award, and
93
00:09:14,220 --> 00:09:21,010
something about data analytics. What was it
about? I read the article, but I didn't quite understand
94
00:09:21,020 --> 00:09:26,010
what you actually did. Yeah. So I think
the award that you're talking about is an annual
95
00:09:26,020 --> 00:09:34,170
award. It's given at the Business of Data symposium
to an innovative idea that was
96
00:09:34,180 --> 00:09:36,270
done by a
North American analytics team.
97
00:09:37,150 --> 00:09:38,060
Yeah,
and
98
00:09:38,880 --> 00:09:46,690
we had some stiff competition, such as Verizon.
It was Verizon, yeah, yeah, and there
99
00:09:46,700 --> 00:09:48,880
were like four
other
100
00:09:50,570 --> 00:09:56,380
companies that were nominated for it, but we
were lucky that we got the award, and we were
101
00:09:56,390 --> 00:10:04,000
very proud. We do a lot of analytics, but
this particular one was about what's called ACE,
102
00:10:04,070 --> 00:10:10,020
an Analytics Crowdsourcing Environment. When people
want to do analytics, there are many
103
00:10:10,030 --> 00:10:16,640
different people that have to collaborate. So it's
more like a contribution model: different
104
00:10:16,650 --> 00:10:20,730
people creating the analytics
model and being able to quickly
105
00:10:20,840 --> 00:10:21,590
validate
106
00:10:22,280 --> 00:10:31,830
and deploy. And we created this environment so
that the time from creating the model,
107
00:10:31,840 --> 00:10:38,670
testing the model, then deploying it, which usually takes
months, is now in days or
108
00:10:38,680 --> 00:10:45,720
weeks, yeah. So we had a huge improvement
in streamlining that process and the collaboration. So
109
00:10:46,760 --> 00:10:52,120
it's a collaboration especially,
not just among the data scientists or engineers.
110
00:10:52,200 --> 00:10:57,840
The data scientist has to get feedback
from the subject matter expert. But these people may
111
00:10:57,850 --> 00:11:06,240
not be coders, right? So we can't just work
with them in an environment that engineers work in. We
112
00:11:06,250 --> 00:11:13,240
have to have a very collaborative environment,
like a website. So it's a web application that they
113
00:11:13,250 --> 00:11:19,260
can work on, so that when a developer creates something and
tries to test it, they get feedback from
114
00:11:19,270 --> 00:11:19,580
the
115
00:11:20,810 --> 00:11:27,380
subject matter expert right away.
It's more of both a validation tool and a
116
00:11:27,390 --> 00:11:37,170
communications tool, and it improved our workflow process
by 5 to 10X. I always love using
117
00:11:37,180 --> 00:11:43,420
Microsoft Teams when I'm working with my
coworkers, so is this system that you're talking about
118
00:11:43,430 --> 00:11:44,120
at
119
00:11:44,980 --> 00:11:51,850
all that different from the Microsoft Teams that we
use? Yeah, well, Microsoft Teams is a very good
120
00:11:51,860 --> 00:11:59,390
collaboration tool, but this is very specific to the
analytics side of it. So, say
121
00:11:59,400 --> 00:12:04,330
you're developing some sort of software or some sort
of coding, right, and let's say it's
122
00:12:04,340 --> 00:12:12,770
Python-based. A lot of people work in a
Jupyter notebook, and when they look at the
123
00:12:12,780 --> 00:12:14,770
answers, they
try to create
124
00:12:14,910 --> 00:12:21,660
another piece of code, or script something to visualize the
data, you know. I
125
00:12:21,670 --> 00:12:26,060
don't know whether you're a software engineer, but when
you have Python code that's 200 lines,
126
00:12:27,740 --> 00:12:32,160
like 90% of it is just getting the data, how
to get the data and show the result. But the actual
127
00:12:32,170 --> 00:12:38,130
logic, yeah, right, that logic is only
10% of it, yeah. So we're trying to avoid
128
00:12:38,140 --> 00:12:43,790
having data scientists really focus on the part that's
non-value-added work, and have that already
129
00:12:43,800 --> 00:12:49,630
set up so they can focus on just
the logic part of it and the visualization. And the data
130
00:12:49,640 --> 00:12:55,160
extraction is already sort of there, with a button
you can click.
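A minimal sketch in Python of the 90/10 split being described. The file name, column names, and the three-sigma alert rule are invented for illustration; the point is how little of the script is the actual logic, which is the part a platform like ACE lets the data scientist focus on.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Boilerplate: data access (the part the platform would pre-wire).
    df = pd.read_csv("telemetry.csv", parse_dates=["timestamp"])
    df = df.dropna(subset=["engine_temp"]).sort_values("timestamp")

    # Core logic: the ~10% the data scientist actually cares about.
    limit = df["engine_temp"].mean() + 3 * df["engine_temp"].std()
    df["alert"] = df["engine_temp"] > limit

    # Boilerplate: presentation (also provided by the platform).
    plt.plot(df["timestamp"], df["engine_temp"])
    alerts = df[df["alert"]]
    plt.scatter(alerts["timestamp"], alerts["engine_temp"], color="red")
    plt.savefig("engine_temp_alerts.png")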
131
00:12:55,170 --> 00:12:57,210
But at the same time, if you look at the
result, right, it's not
132
00:12:57,690 --> 00:13:04,140
you as the data scientist or engineer who has
to validate it; it has to be validated by some other people,
133
00:13:04,150 --> 00:13:10,130
like the experts, right? And they're not necessarily coders.
So if you give them a Jupyter notebook to work with,
134
00:13:10,190 --> 00:13:18,130
or some sort of static picture,
that's really not interactive, right? So
135
00:13:18,670 --> 00:13:25,180
here they can come in through a website link,
and with that they can provide the feedback. And yeah, so
136
00:13:25,190 --> 00:13:27,840
it's that type of
very easy-to-use tool.
137
00:13:28,280 --> 00:13:33,550
And also the people who are not experts,
right, who are not experts in
138
00:13:33,560 --> 00:13:38,800
coding, they can come in and collaborate. So that process
has worked out really well. It used
139
00:13:38,810 --> 00:13:44,370
to take a lot of time; now the
workflow has been improved by 5X to 10X.
140
00:13:46,820 --> 00:13:54,590
So are you yourself, like, a programmer, a coder? I don't
code anymore, but yeah, I used to do a little bit. But
141
00:13:54,600 --> 00:14:01,030
we have much smarter ones, a younger
generation who are better than me, right? So I'm
142
00:14:01,040 --> 00:14:07,690
more there to provide them guidelines and maybe
directions, and, you know, provide my expertise on
143
00:14:07,700 --> 00:14:13,830
products and the business side of it. I
used to be a coder too, like, I used to program, and
144
00:14:13,840 --> 00:14:16,270
there was one
program I was really
145
00:14:16,630 --> 00:14:22,970
proud of. But after a few years, if I look at
the code that I wrote myself, I don't understand
146
00:14:23,140 --> 00:14:28,880
my own code. Yeah. Yeah. These days, you know,
things are all different. Syntax is different, and we
147
00:14:28,890 --> 00:14:33,820
have lots of languages that are more
higher-level languages where, you know, people don't have
148
00:14:33,830 --> 00:14:40,130
to really write a lot of lines. So we have
a lot of, you know, the younger generation who can definitely
149
00:14:40,140 --> 00:14:46,370
do better coding, so we just
have to hire the right people. So at
150
00:14:46,640 --> 00:14:53,060
Cat Digital, yeah, we have a
lot of software engineers, application developers, et cetera. So
151
00:14:53,070 --> 00:14:54,140
programming
skills are
152
00:14:55,270 --> 00:15:00,520
one of the fundamental skills in computer science
and those IT industries. But these days, with
153
00:15:00,530 --> 00:15:05,170
AI and, like,
automatic coding and such,
154
00:15:05,330 --> 00:15:12,230
it's not necessarily... Do you think that in
the future, like, AI, artificial
155
00:15:12,240 --> 00:15:19,040
intelligence, will really wipe out intermediate programmers?
Well, I know that you can ask
156
00:15:19,050 --> 00:15:22,950
ChatGPT, for instance, to write simple
code, right? So it will do that
157
00:15:24,660 --> 00:15:31,230
sort of thing; you know, simple code like that can
be done. But when you want to do really
158
00:15:31,660 --> 00:15:37,730
professional-level, right, production-level code, there
always has to be, as yet, right, a
159
00:15:37,740 --> 00:15:39,810
person who has to sort of
take a look at
160
00:15:41,840 --> 00:15:48,270
how those things are done. But I
guess the process can be improved, right? The process can
161
00:15:48,280 --> 00:15:54,240
be, yeah. Like translation, for instance. If you were
to have ChatGPT do 90% of the
162
00:15:54,250 --> 00:15:59,930
work, and a person checks, you know, what the errors
are, that would be much faster than having a
163
00:15:59,940 --> 00:16:07,060
person do it manually, right? So it
would be a similar type of thing. Things like AI will
164
00:16:07,070 --> 00:16:10,550
always be trying, looking,
to replace things.
165
00:16:11,890 --> 00:16:18,640
Uh, but I guess not entirely, though. Yeah, not
entirely, not entirely. I think we still have the
166
00:16:18,710 --> 00:16:19,970
change
management.
167
00:16:20,680 --> 00:16:26,510
That thing is a big deal from the
people perspective. Even if, let's say, you mentioned
168
00:16:26,520 --> 00:16:28,210
the
Tesla, right? So,
169
00:16:29,230 --> 00:16:34,460
it's possible that everybody could just ride in
a car and it would drive automatically, but I don't
170
00:16:34,470 --> 00:16:39,480
think that most people will be
comfortable doing that, right? Yeah. So you still
171
00:16:39,490 --> 00:16:44,400
have to be able to override it with the manual controls,
and it might take a long time for everybody, actually,
172
00:16:44,410 --> 00:16:49,950
even though the technology might be there, and its own
error rate means 99.99% of the time it would be
173
00:16:49,960 --> 00:16:56,400
safe. But I don't think people will adapt to it quickly,
because it's more about how, you know, humans feel,
174
00:16:56,410 --> 00:16:57,920
whether they're comfortable with
it. So, similarly,
175
00:16:59,390 --> 00:17:05,310
AI technology might already be there, or
it's getting there; it's going to take a
176
00:17:05,320 --> 00:17:07,200
little bit of time.
But what's important is that,
177
00:17:08,330 --> 00:17:14,280
especially for our younger generations, as well
as us, you know, the market is changing all the
178
00:17:14,290 --> 00:17:20,920
time, the job market, right? And we just have
to be aware of all the change, at a much
179
00:17:20,930 --> 00:17:26,590
faster pace these days; we have to be aware of the
change. Yeah, and then we have to start leveraging
180
00:17:26,600 --> 00:17:33,460
these tools. Yeah, because somebody
has to leverage them, right? So if you
181
00:17:33,470 --> 00:17:38,470
keep up with these things and always try
to reinvent yourself by using tools
182
00:17:38,590 --> 00:17:44,070
like this, I think you'll be OK. That's
my opinion. We don't actually have to be
183
00:17:44,080 --> 00:17:51,670
programmers, but knowing how to use the
programs, like ChatGPT, like we use Google every
184
00:17:51,680 --> 00:17:55,710
day, yeah. So knowing
how to use the programs
185
00:17:57,010 --> 00:18:02,370
versus not knowing makes a huge difference
in the future. Just, yeah, know how to use the
186
00:18:02,380 --> 00:18:03,770
tools. I
agree, yeah.
187
00:18:04,840 --> 00:18:13,170
Yeah. Have you used ChatGPT? Yes, we actually
have a couple of different projects that we're
188
00:18:13,180 --> 00:18:20,170
trying. So our executives, right, are very highly interested
in ChatGPT, and I happen to be a
189
00:18:20,180 --> 00:18:20,790
part
of the
190
00:18:21,620 --> 00:18:27,690
AI COE, Center of Excellence, team. So we
are looking to leverage something like ChatGPT for
191
00:18:28,180 --> 00:18:31,270
Caterpillar applications, but there's
a little bit of a
192
00:18:33,170 --> 00:18:39,740
caveat, that ChatGPT is a very general
tool, yeah, that leverages most of the, you know, public
193
00:18:39,750 --> 00:18:46,520
information to train itself. But from the
industry perspective, you know, with things like AI, they
194
00:18:46,530 --> 00:18:52,280
want to use it to gain competitive
advantage, right? And in order to get a
195
00:18:52,290 --> 00:18:57,260
competitive advantage, you cannot just use
any public data. You have to start using your
196
00:18:57,270 --> 00:19:02,930
private, proprietary data. But you also
don't want this to get out,
197
00:19:03,170 --> 00:19:05,870
so that other people can
use it. Umm, so,
198
00:19:06,810 --> 00:19:14,040
protection of IP, and legal issues, and
having the trained model stay within your sort of
199
00:19:14,230 --> 00:19:19,720
industry, in order to make sure that you
have a competitive advantage, might be what the
200
00:19:19,730 --> 00:19:26,020
industry focuses on. So what that means is that legal
people have to be involved in doing
201
00:19:26,030 --> 00:19:30,340
these things, and usually when legal teams are involved,
it takes a little bit of time to make sure
202
00:19:30,350 --> 00:19:36,580
that everything is protected. So while the technology
might be out there, while the use case might be
203
00:19:36,590 --> 00:19:37,520
identified,
204
00:19:37,590 --> 00:19:42,610
it might take a little bit of time for it
to be really productionized and make a real
205
00:19:42,620 --> 00:19:44,140
impact.
Hmm.
206
00:19:44,990 --> 00:19:50,600
So I was very curious, right. Like, when we
think about search engines, Google is always the king
207
00:19:50,610 --> 00:19:57,320
of the search engine industry, like, always,
but how come Microsoft suddenly... Well, Microsoft
208
00:19:57,330 --> 00:20:03,570
is an investor in this OpenAI, right? So
other companies are also doing this
209
00:20:03,580 --> 00:20:10,520
thing, but this one, it's more like it thinks like a
human. So a lot of it is conversational-type things,
210
00:20:10,530 --> 00:20:15,130
and I think the interface that
we're going to be using from here
211
00:20:15,200 --> 00:20:20,620
on would be more human-like, rather
than just typing in a certain word; more conversational-
212
00:20:20,630 --> 00:20:26,380
like. Right. So I think what's
maybe the difference between what Google did and
213
00:20:26,390 --> 00:20:33,900
this generative AI, which is ChatGPT, is that
there would be a lot more human-like user
214
00:20:33,910 --> 00:20:39,840
interface. So that's what they really, really
like. Right. So yeah, you know, there's a
215
00:20:39,850 --> 00:20:45,540
little bit of nuance. If you
think about, like, even translation, for instance,
216
00:20:46,280 --> 00:20:52,310
you know, France French is a little
different than, you know, Canadian French. Yeah. If you
217
00:20:52,320 --> 00:21:00,990
wanted to write a story, but a, you
know, more of a children's type of story, it might be
218
00:21:01,000 --> 00:21:08,630
different than, you know, an adult type of novel, right?
I think this type of nuance is something
219
00:21:08,640 --> 00:21:15,050
that ChatGPT can do, where
the human can, you know, experience a little bit of
220
00:21:15,060 --> 00:21:15,930
that
interface.
221
00:21:16,010 --> 00:21:16,910
And
even,
222
00:21:17,760 --> 00:21:22,580
even for now, you've
been experiencing with the, you know, the...
223
00:21:23,480 --> 00:21:24,120
Like
a,
224
00:21:25,570 --> 00:21:32,080
what is that, Amazon? Amazon, yeah. How
they ask it questions, saying things like that. So a
225
00:21:32,090 --> 00:21:38,340
lot of interfaces are more like this conversational stuff,
right? So there's Google, the search, you
226
00:21:38,350 --> 00:21:45,250
know; if you really explain things in long
sentences, I don't think that's how people
227
00:21:45,260 --> 00:21:49,470
usually search. But with
ChatGPT, right, you can
228
00:21:50,200 --> 00:21:55,510
even write, like, two or three entire sentences,
and it'll still process it and try to give you
229
00:21:55,520 --> 00:21:58,970
the best answer. But
what's important is that it can...
230
00:21:59,900 --> 00:22:02,810
It's still based on the training
data set. I know there are some
231
00:22:03,710 --> 00:22:09,400
different things that it can actually learn from. Yeah,
but still, the other day I asked
232
00:22:09,410 --> 00:22:15,040
about myself, right: who is Chris
Ha at Caterpillar? And it gave me, you know, a
233
00:22:15,050 --> 00:22:20,320
completely wrong answer, because I don't have a
lot of public information that's out there for Chat
234
00:22:20,330 --> 00:22:26,240
GPT to train on about me. So you just need
to be careful, right? There are some amazing things
235
00:22:26,250 --> 00:22:33,360
ChatGPT can do, but a lot of times
it also gives you a completely wrong answer, or may even
236
00:22:33,370 --> 00:22:33,630
lie
237
00:22:33,700 --> 00:22:40,280
about it, yeah. Or give no answer. So, you
know, right now, any technology goes
238
00:22:40,290 --> 00:22:45,240
through what's called a hype curve. Yeah, so right now
we're at that peak of the hype, or
239
00:22:45,250 --> 00:22:51,500
getting there with the hype. So there will surely
be, like, you know, an unexpected type of,
240
00:22:52,490 --> 00:22:59,600
I'm sorry, an unreasonable expectation; there
will be, people will be disappointed,
241
00:22:59,610 --> 00:23:06,960
they'll
come down from the high expectations, and then,
242
00:23:07,010 --> 00:23:12,900
yeah, it will, yeah, slowly it's going
to get back up there once people find the
243
00:23:12,910 --> 00:23:20,360
right use cases, and it'll plateau out at
some sort of good use cases where people can
244
00:23:20,370 --> 00:23:22,170
actually
gain
245
00:23:22,260 --> 00:23:27,320
value out of it. But right now it's
going through that big hype cycle, in my opinion.
246
00:23:27,330 --> 00:23:35,330
Yeah. So other big tech companies like Amazon and
Google, why were they just standing by? Well, no,
247
00:23:35,340 --> 00:23:40,960
I'm pretty sure that they're also doing the
research, of course. Yeah. They have their own, you
248
00:23:40,970 --> 00:23:48,820
know, own version of what's called generative
AI, which is like ChatGPT. Yeah.
249
00:23:50,160 --> 00:23:55,810
It's just that, you know, ChatGPT was one of the
better ones that sort of came out. Yeah. And
250
00:23:55,820 --> 00:24:01,770
I'm sure others... I mean, Google tried it, and
they didn't really succeed with the last ad. But yeah,
251
00:24:01,880 --> 00:24:03,560
I'm sure
there are other,
252
00:24:04,220 --> 00:24:09,500
you know, tech companies that are doing
this right now. They have been doing it. It's
253
00:24:09,510 --> 00:24:14,850
not like ChatGPT came out all of a sudden, right?
There's GPT 2.0, 2.5, and then the open source
254
00:24:14,860 --> 00:24:16,290
that it's based
on, right?
255
00:24:18,020 --> 00:24:20,510
Ah, then
ChatGPT is
256
00:24:22,510 --> 00:24:29,170
just the interface for them, like, and then,
and then, more of a demo version of it. But
257
00:24:29,870 --> 00:24:36,700
it has been there, yeah, the GPT 2.0, the one
that all this has been
258
00:24:37,070 --> 00:24:43,740
studied on and developed throughout, yeah. Yeah, I know
there's the case like when Apple made
259
00:24:44,290 --> 00:24:51,220
the iPhone, the technologies they used
already existed; they revolutionized the
260
00:24:52,020 --> 00:24:59,110
interface, to make people understand the interface. And
so when I think of Chat
261
00:24:59,840 --> 00:25:05,610
GPT coming down the line, it's kind of a
similar case; it just makes it easy for people.
262
00:25:06,510 --> 00:25:12,360
Yeah, I think they did also improve on,
you know, making a much more human-like type of,
263
00:25:12,370 --> 00:25:19,800
yeah, type of logic, and conversational-like.
So it wasn't just about the UI side of it;
264
00:25:19,810 --> 00:25:25,160
it had improvement over time, OK? So
maybe they felt like, hey, this is good enough to
265
00:25:25,170 --> 00:25:30,000
really make a big splash. So they advertised it,
right? It just came out, and a lot of people
266
00:25:30,010 --> 00:25:34,280
grabbed onto it. But people have been studying
these types of things for a while.
267
00:25:39,470 --> 00:25:45,360
So you think Google and the other big tech
companies will come out with their own versions? Well,
268
00:25:45,370 --> 00:25:53,150
Google already came out, right? They did something.
I think they tried to
269
00:25:53,850 --> 00:25:59,220
demonstrate their own version of the AI in an
advertisement. It wasn't very successful at
270
00:25:59,230 --> 00:25:59,960
the
time, but
271
00:26:00,740 --> 00:26:05,190
I'm pretty sure they'll come up with the next
version, I don't know. So this is still
272
00:26:05,200 --> 00:26:06,700
a,
a,
273
00:26:07,380 --> 00:26:13,250
a topic for different tech companies to
really, yeah, grab onto, and to make sure
274
00:26:13,260 --> 00:26:17,690
that they also
bring their own versions, yeah.
275
00:26:19,010 --> 00:26:25,940
When I think about, like, those kinds of AI and
big programs, all the big techs in the US can manage
276
00:26:25,950 --> 00:26:33,090
to do that. But what about a smaller country like
South Korea? It's nothing much in terms of
277
00:26:33,100 --> 00:26:34,660
population,
but can those
278
00:26:37,830 --> 00:26:45,220
companies and countries survive,
and how can they use the AI technology
279
00:26:45,330 --> 00:26:46,410
for
their industry?
280
00:26:47,740 --> 00:26:48,510
You
mean like?
281
00:26:49,800 --> 00:26:55,400
Ah, yeah, labor and those. I'm sure they're
using it, and I'm sure there are lots of,
282
00:26:55,790 --> 00:26:57,840
there are,
I mean, Korea is a
283
00:26:58,850 --> 00:27:06,140
heavy hitter, right, among the tech community
throughout the globe. So I don't think that we
284
00:27:06,150 --> 00:27:10,530
should underestimate... I mean, certainly you
shouldn't, we shouldn't, yeah,
285
00:27:10,540 --> 00:27:17,930
underestimate their capabilities. You know, Samsung
and all these tech companies, and I know,
286
00:27:18,250 --> 00:27:24,470
yeah, they're probably using it on a daily
basis for their own marketing and their
287
00:27:24,480 --> 00:27:28,850
advertising. Sure, but are they
going to come up with
288
00:27:28,920 --> 00:27:30,540
their
own version?
289
00:27:32,040 --> 00:27:38,510
I mean, if not, they're already using their own
versions on the development side of it. Yeah, you
290
00:27:38,520 --> 00:27:44,810
know, they may not advertise it, or they try not
to sell it. But I am sure there are a lot of
291
00:27:44,820 --> 00:27:51,190
people there doing AI within those companies
already. You think the technology itself is not very
292
00:27:51,200 --> 00:27:52,270
difficult
to...
293
00:27:53,240 --> 00:27:54,190
No,
I mean,
294
00:27:56,520 --> 00:28:02,570
the technology is out there; there are people
always developing it, right? So it's open source,
295
00:28:02,620 --> 00:28:09,090
it's open source, and there's a community that
has always been there, that is always,
296
00:28:10,100 --> 00:28:14,510
you know, exploring. So
I'm sure that
297
00:28:16,390 --> 00:28:18,550
companies like,
you know,
298
00:28:20,570 --> 00:28:24,260
Kakao or
Naver, or these
299
00:28:24,990 --> 00:28:31,060
sorts of what you would think of as more
Google-like companies, yeah. So yeah, I'm pretty sure that they
300
00:28:31,070 --> 00:28:37,820
already have their own versions of AI. And
I mean, there's a reason why Google was not successful
301
00:28:37,830 --> 00:28:45,090
in Korea: because Naver and Kakao have their
own versions, which Koreans probably feel are much
302
00:28:45,100 --> 00:28:51,630
stronger at getting them what they want versus Google,
right? They have, I think they have things like blogs,
303
00:28:51,640 --> 00:28:54,940
and yeah. So there's,
so you have to
304
00:28:55,010 --> 00:29:02,810
understand the market also, and they try to
customize, yeah, customize their search engines
305
00:29:02,820 --> 00:29:03,550
for
these
306
00:29:04,350 --> 00:29:07,220
markets, right? And it seems
like that was the case.
307
00:29:08,100 --> 00:29:14,560
If you look at most of the other companies
in other countries, and globally, right, Google is used
308
00:29:15,300 --> 00:29:21,880
almost, you know, dominantly. Yeah. Except,
except Korea. Yeah. And maybe China as well. But
309
00:29:21,920 --> 00:29:27,250
there is a reason why Naver and
Kakao are doing better, yeah, just in South Korea.
310
00:29:27,260 --> 00:29:33,830
Yeah. Yeah. Because they have a very
good version. Same thing, actually, even for Uber, right?
311
00:29:33,840 --> 00:29:40,620
Uber is not very popular in Korea. There
was a legal issue. But that's a different story. Yeah,
312
00:29:40,630 --> 00:29:45,290
they basically got kicked out
by the government. Yeah. OK. Yeah.
313
00:29:45,390 --> 00:29:51,980
It's not the technology there, and not a cultural
issue; it was by the law. Yeah, well, that's
314
00:29:52,070 --> 00:30:00,190
a, yeah, that's a
side thing. I don't know why companies in
315
00:30:00,430 --> 00:30:02,010
other countries cannot
do that.
316
00:30:03,460 --> 00:30:07,200
Yeah, there's a deeper story. I
didn't look into that, but yeah.
317
00:30:09,080 --> 00:30:17,110
Yeah, in the US, like, the government paid off,
like, the taxi drivers, by allowing...