
Data Migration With Oracle DataPump Tool

Vijay Kumar Tiwari, "Data Migration With Oracle DataPump Tool", International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume-2, Issue-3, April 2018. URL: https://www.ijtsrd.com/papers/ijtsrd11557.pdf. Paper URL: http://www.ijtsrd.com/computer-science/database/11557/data-migration-with-oracle-datapump-tool-/vijay-kumar-tiwari



Vijay Kumar Tiwari
IT Consultant, HCL Tech, Texas, United States

ABSTRACT

Data Migration is the procedure of moving data between computer storage systems. It is used when building a new system or enhancing an existing one: data is extracted from the old system and loaded into the new one. Dedicated tools and processes make such a cumbersome task achievable without much interference. Data verification is a further, essential step: the source and the target of the data must be scrutinized and inspected after the migration. Appropriate planning, backed by specific reasoning, is the key to a successful transfer of data.

CASE STUDY:-

This case study shares the experience of a real-life migration of a 5 TB table in Oracle through the Data Pump tool. Initially it looked quite easy, because we thought it was just an export and import of a table into another database, but that was not quite the case. To migrate the large table, we had to break the activity down into the following steps.

Step1: The export of the table was taken through expdp as shown below; the %u substitution numbers the dump files, and filesize caps each file at 100 GB.

expdp \'/ as sysdba\' directory=DATAPUMP_COP dumpfile=expdp_OBtables%u.dmp filesize=100g logfile=cap_cs.log tables=CAP_CS.CAP_CS_OS_SLS

Step2: The export dump files were copied to the target location by OS command, using a similar directory configuration, and the table was created on the target from the metadata of the export dump.
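The paper does not show the command that created the table from the dump's metadata in Step2. A minimal sketch of one way to do this with impdp follows; it assumes the dump files have already been copied, that a directory object named DATAPUMP1 exists on the target, and it mirrors the exclude list used later in Step3.

# Sketch: create only the table definition on the target, loading no rows.
# CONTENT=METADATA_ONLY builds the object; the EXCLUDE list keeps
# indexes, constraints and statistics out, matching the plan in Step3.
impdp system/system123 directory=DATAPUMP1 \
  dumpfile=expdp_OBtables01.dmp,expdp_OBtables02.dmp \
  content=METADATA_ONLY exclude=index,CONSTRAINT,statistics \
  tables=CAP_CS.CAP_CS_OS_SLS logfile=impdp_meta.log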
Step3: Create a par file to import only the data into the target database. Index creation, statistics and constraints were all excluded during the import; these would be created later, once the data had been inserted into the table. Hence a separate par file was prepared that carried no constraint, index or statistics objects. Generation of archive logs during the import also had to be avoided, so the disable-archive-logging transform was placed in the par file. Par file details as below.

impdp.par:

dumpfile=expdp_OBtables01.dmp,expdp_OBtables02.dmp
logfile=impdp_OB.log
directory=DATAPUMP1
exclude=index,CONSTRAINT,statistics
METRICS=YES
parallel=32
transform=disable_archive_logging:y

On the OS prompt, run the command below.

impdp system/system123 PARFILE=impdp.par

Note: here we assume that a user and a tablespace of sufficient size were already available on the target database.
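An import of this size runs for a long while, so it is worth watching from a second session. The sketch below uses the standard Oracle dictionary views for Data Pump and long-running operations; nothing in it is specific to the paper's run.

-- List Data Pump jobs and their current state
SELECT owner_name, job_name, operation, state
FROM dba_datapump_jobs;

-- Rough progress of long-running import operations
SELECT sid, opname, sofar, totalwork, units
FROM v$session_longops
WHERE opname LIKE '%IMPORT%' AND sofar <> totalwork;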

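Step4 below revisits the source database to gather the index and constraint definitions. The paper does not show how that information was procured; a common way to extract such DDL, sketched here with DBMS_METADATA and the object names from Step4, is:

-- Run on the SOURCE database (SQL*Plus) to capture DDL for replay on the target
SET LONG 1000000 PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('CONSTRAINT', 'XPKCS_OB_CS_SLS', 'CAP_CS') FROM dual;
SELECT DBMS_METADATA.GET_DDL('INDEX', 'XPKCS_OB_CS_SLS', 'CAP_CS') FROM dual;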
Step4: By running the import in the above fashion, it completed in just one hour. The next task at hand was to create the index and constraints on the table, so the source database was revisited for information gathering. Once procured, this information was incorporated into the table using the commands below.

ALTER SESSION ENABLE PARALLEL DDL;

ALTER TABLE CAP_CS.CS_OB_CS_SLS
  ADD CONSTRAINT XPKCS_OB_CS_SLS
  PRIMARY KEY (BATCH_ID, REC_ID, SEQ_NBR) DISABLE;

CREATE UNIQUE INDEX CAP_CS.XPKCS_OB_CS_SLS
  ON CAP_CS.CS_OB_CS_SLS (BATCH_ID, REC_ID, SEQ_NBR) PARALLEL 20;

ALTER TABLE CAP_CS.CS_OB_CS_SLS ENABLE PRIMARY KEY;

ALTER INDEX CAP_CS.XPKCS_OB_CS_SLS NOPARALLEL;

This approach led us to complete the entire import procedure in 4 hours, a commendable downtime for migrating a 5 TB table.

CONCLUSION:

Oracle Data Pump is a great tool for the fast movement of data between databases, and much of this performance is derived from the use of its features.
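The abstract singles out post-migration verification of source against target as an essential step, though the paper does not show how it was performed. A minimal closing check is a row-count comparison; the database link name SRC_DB below is hypothetical.

-- Row counts on source (via the hypothetical SRC_DB link) and on target should match
SELECT (SELECT COUNT(*) FROM CAP_CS.CAP_CS_OS_SLS@SRC_DB) AS source_rows,
       (SELECT COUNT(*) FROM CAP_CS.CAP_CS_OS_SLS) AS target_rows
FROM dual;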
