Subtitle transcript for 012 Instantiating Transformers Models (TensorFlow)

How do you instantiate a Transformers model? In this video we will look at how we can create and use a model from the Transformers library.

As we've seen before, the TFAutoModel class allows you to instantiate a pretrained model from any checkpoint on the Hugging Face Hub. It will pick the right model class from the library to instantiate the proper architecture and load the weights of the pretrained model inside it. As we can see, when given a BERT checkpoint we end up with a TFBertModel, and similarly for GPT-2 or BART.

Behind the scenes, this API can take the name of a checkpoint on the Hub, in which case it will download and cache the configuration file as well as the model weights file. You can also specify the path to a local folder that contains a valid configuration file and a model weights file.

To instantiate the pretrained model, the TFAutoModel API first opens the configuration file to look at the configuration class that should be used. The configuration class depends on the type of the model (BERT, GPT-2 or BART, for instance). Once it has the proper configuration class, it can instantiate that configuration, which is a blueprint that tells it how to create the model. It also uses this configuration class to find the proper model class, which is combined with the loaded configuration to load the model. This model is not yet our pretrained model, as it has just been initialized with random weights. The last step is to load the weights from the model file inside this model.

To easily load the configuration of a model from any checkpoint, or from a folder containing the configuration file, we can use the AutoConfig class. Like the TFAutoModel class, it will pick the right configuration class from the library. We can also use the specific class corresponding to a checkpoint, but then we will need to change the code each time we want to try a different model.

As we said before, the configuration of a model is a blueprint that contains all the information necessary to create the model architecture. For instance, the BERT model associated with the bert-base-cased checkpoint has 12 layers, a hidden size of 768, and a vocabulary size of 28,996. Once we have the configuration, we can create a model that has the same architecture as our checkpoint but is randomly initialized.
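A minimal sketch of the steps described so far, assuming the transformers and TensorFlow packages are installed (the checkpoint name and configuration attributes follow the standard transformers API):

    from transformers import TFAutoModel, AutoConfig, TFBertModel

    # Instantiate a pretrained model from a Hub checkpoint; this downloads
    # and caches the configuration file and the model weights file.
    model = TFAutoModel.from_pretrained("bert-base-cased")

    # Load just the configuration, the blueprint of the architecture.
    config = AutoConfig.from_pretrained("bert-base-cased")
    print(config.num_hidden_layers)  # 12
    print(config.hidden_size)        # 768
    print(config.vocab_size)         # 28996

    # Build a model with the same architecture as the checkpoint,
    # but with randomly initialized weights.
    random_model = TFBertModel(config)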
We can then train it from scratch like any TensorFlow model. We can also change any part of the configuration by using keyword arguments. The second snippet of code instantiates a randomly initialized BERT model with 10 layers instead of 12.

Saving a model once it's trained or fine-tuned is very easy: we just have to use the save_pretrained method. Here the model will be saved in a folder named my-bert-model inside the current working directory. Such a model can then be reloaded using the from_pretrained method.
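A sketch of these last steps, again assuming the standard transformers API (my-bert-model is the folder name mentioned in the video):

    from transformers import AutoConfig, TFAutoModel, TFBertModel

    # Override part of the configuration with a keyword argument:
    # same architecture as bert-base-cased, but 10 layers instead of 12.
    config = AutoConfig.from_pretrained("bert-base-cased", num_hidden_layers=10)
    model = TFBertModel(config)  # randomly initialized

    # ... train or fine-tune the model ...

    # Save the configuration and weights to a local folder.
    model.save_pretrained("my-bert-model")

    # Reload the saved model later from that same folder.
    reloaded = TFAutoModel.from_pretrained("my-bert-model")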
