Data Pump Import - Learning Oracle 12c [Video] (transcript)

In this lesson, we're going to look at data migration using Data Pump Import. Data Pump Import is the partner tool to Data Pump Export: data is exported, or taken out of the database, into files using Export, and Import takes it from the file back into a table in the database. We used to use an older tool to import data, the imp command-line tool, which could import data that had been exported using the exp tool. But starting in 10g, Oracle recommends the use of Data Pump Import, primarily because it has a number of additional features and can be up to 15 times faster at importing data. It also offers features such as compression and encryption, which are important when we migrate data from one place to another. It's also worth noting that Data Pump Import and Export are integrated with tools such as Oracle Enterprise Manager and can be accessed from the GUI using those tools.

So let's take a look at what we have here. We already have some exported files.
In order to run Data Pump Import, we're going to create a parameter file that specifies how we want the data imported. First, to verify something, let's log in as Scott and select star from dept_copy. We have no table called dept_copy at this point. What we're going to do is import data from the dump file that was dumped out of the DEPT table into a table called dept_copy. So we need to construct our import parameter file. I'm just going to create a text file here called dept_imp.par and add some parameters. The user ID will be Scott. The directory object we're using will be dept_external. The dump file we'll specify as dept_exp01.dump; notice that we have that file right here, and it contains the data from the DEPT table. The log file will be dept_imp.log. And here's the key: we're going to say remap_table equals scott.dept:dept_copy. Every reference in this dump file to the scott.dept table will now be mapped to a new table called dept_copy. Save and close the file. And in order to invoke our Data Pump Import tool, we use impdp, with parfile set to dept_imp.par.
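Put together, the parameter file walked through above might look like the following sketch. The directory object name (dept_external) and dump file name (dept_exp01.dump) follow the transcript; in another environment they would match whatever your export actually produced.

```
# dept_imp.par -- Data Pump Import parameter file
# (names follow the lesson; adjust directory/dumpfile for your environment)
userid=scott                       # impdp prompts for the password
directory=dept_external            # directory object pointing at the dump files
dumpfile=dept_exp01.dump           # file produced by the earlier export
logfile=dept_imp.log               # import log written to the same directory
remap_table=scott.dept:dept_copy   # load scott.dept's data into dept_copy
```

It would then be invoked from the operating system prompt as `impdp parfile=dept_imp.par`.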
So that is our dept_imp parameter file; hit Enter. Notice that it imports the scott.dept_copy table and loads four rows. Let's take a look at the data: select star from dept_copy, which, if you recall, did not exist before we ran our import. And there is the data. Let's look at one more thing: what would happen if we ran this again, if it was our intent to load the same data into the table again? Notice that it completes with 1 error. It says the table scott.dept_copy exists, and that any dependent metadata and data will be skipped. So if we want to do something like load the data again, we need to make a modification to our parameter file. We're just going to add table_exists_action equals append. There are a number of different table_exists_action values that we can use, such as append and truncate. We want to append that data into the table again, so we'll save the file and run our Data Pump Import again. Again, we're importing data from that dump file into the table, and it says it completed successfully.
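After the modification described above, the parameter file would look something like this. The valid table_exists_action values are skip (the default, which caused the earlier error), append, truncate, and replace.

```
# dept_imp.par -- second run: append into the existing table
userid=scott
directory=dept_external
dumpfile=dept_exp01.dump
logfile=dept_imp.log
remap_table=scott.dept:dept_copy
table_exists_action=append   # other values: skip (default), truncate, replace
```

With skip, an existing table is left untouched; truncate empties it before loading; replace drops and recreates it.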
Let's log in as Scott and do a select star from dept_copy. Notice that the data was imported again. So that's how we can use Data Pump Import to perform data migration in an Oracle database.
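The verification steps from the lesson can be sketched in SQL*Plus as follows. The row counts are an assumption based on the transcript (the standard DEPT demo table has four rows, and the appending run loads them a second time):

```sql
-- Connect as the importing user and inspect the remapped table
CONNECT scott
SELECT * FROM dept_copy;
-- After the first import: the four DEPT rows.
-- After the second run with table_exists_action=append: the same
-- rows again, since append loads the dump file's data on top of
-- what is already in the table.
```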
