Transcript: 011 Instantiating a Transformers Model (PyTorch)

How to instantiate a Transformers model?

In this video we will look at how we can create and use a model from the Transformers library.

As we've seen before, the AutoModel class allows you to instantiate a pretrained model from any checkpoint on the Hugging Face Hub. It will pick the right model class from the library to instantiate the proper architecture and load the weights of the pretrained model inside it. As we can see, when given a BERT checkpoint we end up with a BertModel, and similarly for GPT-2 or BART.

Behind the scenes, this API can take the name of a checkpoint on the Hub, in which case it will download and cache the configuration file as well as the model weights file. You can also specify the path to a local folder that contains a valid configuration file and a model weights file.

To instantiate the pretrained model, the AutoModel API will first open the configuration file to look at the configuration class that should be used. The configuration class depends on the type of the model (BERT, GPT-2, or BART, for instance). Once it has the proper configuration class, it can instantiate that configuration, which is a blueprint that describes how to create the model. It also uses this configuration class to find the proper model class, which is combined with the loaded configuration to load the model. This model is not yet our pretrained model, as it has only just been initialized with random weights. The last step is to load the weights from the model file into this model.

To easily load the configuration of a model from any checkpoint, or from a folder containing the configuration file, we can use the AutoConfig class. Like the AutoModel class, it will pick the right configuration class from the library. We can also use the specific class corresponding to a checkpoint, but then we will need to change the code each time we want to try a different model.

As we said before, the configuration of a model is a blueprint that contains all the information necessary to create the model architecture. For instance, the BERT model associated with the bert-base-cased checkpoint has 12 layers, a hidden size of 768, and a vocabulary size of 28,996. Once we have the configuration, we can create a model that has the same architecture as our checkpoint but is randomly initialized. We can then train it from scratch like any PyTorch module.
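The video walks through these steps with code on screen that the subtitles do not reproduce. Here is a minimal sketch of the equivalent calls, assuming the standard Transformers API; the gpt2 checkpoint name and all variable names are illustrative additions, while bert-base-cased and the printed values come from the video itself.

from transformers import AutoModel, AutoConfig, BertConfig, BertModel

# AutoModel reads the checkpoint's configuration to pick the right
# architecture, then loads the pretrained weights into it.
bert = AutoModel.from_pretrained("bert-base-cased")  # a BertModel
gpt2 = AutoModel.from_pretrained("gpt2")             # a GPT2Model (illustrative checkpoint)

# AutoConfig loads only the configuration, the model's blueprint.
config = AutoConfig.from_pretrained("bert-base-cased")
print(type(config).__name__)     # BertConfig
print(config.num_hidden_layers)  # 12
print(config.hidden_size)        # 768
print(config.vocab_size)         # 28996

# The checkpoint-specific class works too, but ties the code to BERT.
bert_config = BertConfig.from_pretrained("bert-base-cased")

# Building a model from a configuration alone yields the same
# architecture with random weights, ready to be trained from scratch.
untrained_bert = BertModel(bert_config)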
We can also change any part of the configuration by using keyword arguments. The second snippet of code shown in the video instantiates a randomly initialized BERT model with 10 layers instead of 12.

Saving a model once it's trained or fine-tuned is very easy: we just have to use the save_pretrained method. Here the model will be saved in a folder named my-bert-model inside the current working directory. Such a model can then be reloaded using the from_pretrained method.
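The subtitles describe those on-screen snippets without reproducing them. A minimal sketch of what they likely look like, assuming the standard Transformers API (num_hidden_layers is the BertConfig field that controls the layer count; variable names are illustrative):

from transformers import BertConfig, BertModel

# Keyword arguments override individual configuration fields:
# here, 10 hidden layers instead of the default 12.
config = BertConfig.from_pretrained("bert-base-cased", num_hidden_layers=10)
ten_layer_bert = BertModel(config)  # randomly initialized, 10 layers

# After training or fine-tuning, save the configuration and the
# weights to a folder in the current working directory...
ten_layer_bert.save_pretrained("my-bert-model")

# ...and reload them later from that same folder.
reloaded = BertModel.from_pretrained("my-bert-model")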
