Let us go through some of the differences between Data Pump, the successor to the original Export-Import utilities. This blog will help you understand the key differences and features of both.
| SrNo. | Data Pump | Export-Import |
|-------|-----------|---------------|
| 1 | Data Pump was introduced in Oracle 10g, and Oracle recommends it in all subsequent releases. | Export-Import is available in versions prior to 10g and is still shipped with 11g Release 2; however, the original Export has been deprecated since Oracle 11g in favor of Data Pump. |
| 2 | Data Pump Export is about twice as fast as the original Export. | The original Export is slower than Data Pump Export. |
| 3 | Data Pump Import is approximately 15 to 40 times faster than the original Import. | The original Import is very slow compared with Data Pump Import. |
| 4 | Data Pump uses the direct path access method where possible: it bypasses the SGA and reads data blocks directly, and it decides on its own which access method is best. | Direct path must be enabled explicitly with DIRECT=y; otherwise the much slower conventional path is used. Data Pump's direct path support is also more capable than that of the original Export-Import. |
| 5 | Data Pump jobs require directory objects, in which the dump files and log files are created. Users running Data Pump jobs need grants on this directory, which provides much better security. | Files and logs can be created wherever the user chooses, which can allow users to access data they should not have access to. |
| 6 | Data can be exported or imported directly over a network link. | Not possible with the original Export-Import. |
| 7 | The PARALLEL option increases the number of worker processes (the processes that load and unload data in the dump file), which speeds up the job. | No such option exists in the original Export-Import. |
| 8 | Object selection is much finer grained: the INCLUDE and EXCLUDE parameters give very good control over which objects are exported or imported. | Object selection is limited to tables, triggers, statistics, constraints, grants, and indexes. |
| 9 | Once the job has started, it keeps running in the background even if the client session is closed (it runs in server processes). | If the client session is closed after the job starts, the job is killed with it (it depends entirely on the export-import client). |
| 10 | The status of the individual worker processes can be obtained along with the status of the job. | It is not possible to view the status separately. |
| 11 | A job can be stopped, killed, or restarted, and dump files can be added while the job is running. | A job cannot be restarted once it is killed, and it cannot be stopped without killing it. |
| 12 | A SQL file can be generated during import containing the DDL for all objects that would be imported. | Only the INDEXFILE option exists, which generates DDL for indexes. |
| 13 | TABLE_EXISTS_ACTION controls what happens when an imported object already exists; valid keywords are APPEND, REPLACE, [SKIP], and TRUNCATE. | Not possible in the original Export-Import. |
| 14 | The estimated size of the dump file can be generated without actually exporting the data. | Not possible in the original Export-Import. |
| 15 | The VERSION parameter generates a dump file compatible with the database version into which it will be imported. | Not possible in the original Export-Import. |
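To illustrate the directory-object requirement described above, here is a minimal sketch of the setup plus a basic schema export. It assumes a live Oracle database; the path `/u01/app/dumps`, the directory name `dp_dir`, and the `scott/tiger` account are illustrative placeholders.

```shell
# As a DBA, create a directory object and grant access on it
# (dp_dir, /u01/app/dumps and scott are hypothetical names):
sqlplus / as sysdba <<'EOF'
CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/dumps';
GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;
EOF

# Export the SCOTT schema; dump file and log file land in dp_dir:
expdp scott/tiger SCHEMAS=scott DIRECTORY=dp_dir \
      DUMPFILE=scott.dmp LOGFILE=scott_exp.log
```

Because the grant is on the directory object rather than on an arbitrary filesystem path, the DBA controls exactly where each user may read and write dump files.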
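The PARALLEL, INCLUDE/EXCLUDE, ESTIMATE_ONLY, and VERSION features above can be sketched as follows (same hypothetical `dp_dir` and `scott` names; a parameter file is used because INCLUDE filters need quoting that is awkward to escape on the shell command line):

```shell
# scott.par -- a parameter file avoids shell-quoting problems
cat > scott.par <<'EOF'
SCHEMAS=scott
DIRECTORY=dp_dir
DUMPFILE=scott_%U.dmp
PARALLEL=4
INCLUDE=TABLE:"IN ('EMP','DEPT')"
EXCLUDE=STATISTICS
EOF

# Four worker processes; the %U substitution in DUMPFILE gives
# each worker its own dump file:
expdp scott/tiger PARFILE=scott.par

# Estimate the dump size without writing any table data:
expdp scott/tiger SCHEMAS=scott ESTIMATE_ONLY=YES ESTIMATE=BLOCKS

# Produce a dump file readable by an older (10.2) target database:
expdp scott/tiger SCHEMAS=scott DIRECTORY=dp_dir \
      DUMPFILE=scott_102.dmp VERSION=10.2
```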
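On the import side, the SQLFILE, TABLE_EXISTS_ACTION, network-link, and job-control features can be sketched like this (again assuming the hypothetical `dp_dir`/`scott` setup; `srclink` is a placeholder database link, and `SYS_EXPORT_SCHEMA_01` is the kind of default job name Data Pump assigns):

```shell
# Write the DDL for everything in the dump file to scott_ddl.sql
# without actually loading any objects:
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp SQLFILE=scott_ddl.sql

# Append rows into tables that already exist instead of failing:
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp \
      TABLE_EXISTS_ACTION=APPEND

# Pull a table straight from a remote database over a database
# link -- no intermediate dump file is written:
impdp scott/tiger TABLES=emp NETWORK_LINK=srclink DIRECTORY=dp_dir

# Re-attach to a running (or stopped) job and control it
# interactively; these commands are typed at the Export> prompt:
expdp scott/tiger ATTACH=SYS_EXPORT_SCHEMA_01
# Export> STATUS                 -- show job and worker-process status
# Export> ADD_FILE=extra_%U.dmp  -- add dump files mid-job
# Export> STOP_JOB               -- stop now; resume later with START_JOB
```

Because the job itself runs in server processes, closing this client session (or the original one) does not kill it, which is exactly the behavior contrasted in the table above.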
Note:
Data Pump is most effective for very large jobs; for small jobs the original Export-Import may still be preferable.