
Datapump export job failed with ORA-01555: snapshot too old: rollback segment number x with name “_SYSSMU8$” too small

ORA-01555 errors can happen when a query is unable to access enough undo to build a copy of the data as it existed when the query started.
Committed “versions” of blocks are maintained along with newer uncommitted “versions” of those blocks so that queries can access data as it existed in the database at the time of the query. These are referred to as “consistent read” blocks and are maintained using Oracle undo management.

Error logs:

ORA-31693: Table data object "RPROD"."AOUP_DAILY_STBACT_SUMM_DET" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number 8 with name "_SYSSMU8$" too small

As part of the solution, set UNDO_RETENTION to a higher value. Also make sure the undo tablespace is large enough to actually hold that much undo.

SQL> show parameter undo
NAME TYPE VALUE
------------------------------------ ----------- ---------------
undo_management string AUTO
undo_retention integer 900
undo_tablespace string UNDOTBS1
 
SQL> ALTER SYSTEM SET UNDO_RETENTION = 1800 scope=both;
System altered.
 
SQL> show parameter undo
NAME TYPE VALUE
------------------------------------ ----------- ---------------
undo_management string AUTO
undo_retention integer 1800
undo_tablespace string UNDOTBS1

Export was successful after above changes.
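If a higher UNDO_RETENTION alone is not enough, the undo tablespace itself may need more room so that unexpired undo is not overwritten. A minimal sketch, with hypothetical datafile paths and sizes (not taken from the failing system):

SQL> -- check the current size of the undo tablespace
SQL> SELECT ROUND(SUM(bytes)/1024/1024) AS size_mb FROM dba_data_files WHERE tablespace_name = 'UNDOTBS1';

SQL> -- either resize an existing undo datafile ...
SQL> ALTER DATABASE DATAFILE '/u01/oradata/db1/undotbs01.dbf' RESIZE 4G;

SQL> -- ... or add another datafile to the undo tablespace
SQL> ALTER TABLESPACE undotbs1 ADD DATAFILE '/u01/oradata/db1/undotbs02.dbf' SIZE 2G AUTOEXTEND ON MAXSIZE 8G;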


ORA-39126: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS and KUPW$WORKER.DISPATCH_WO ORA-01187: cannot read from file because it failed verification tests

The Data Pump export backup failed with the Oracle errors mentioned below. It failed because verification tests failed on the temp file temp01. As part of the solution, I deleted the old temporary tablespace and added a new temporary tablespace with a new temp file.

Data Pump Export Error Logs:

Export: Release 11.2.0.1.0 - Production on Mon Jun 20 11:47:01 2016
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_FULL_09": system/******** directory=data_pump_bkup dumpfile=full_db_export.dmp logfile=export.log full=y
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS []
ORA-01187: cannot read from file because it failed verification tests
ORA-01110: data file 201: '/u02/oradata/db1/temp01.dbf'
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
ORA-06512: at "SYS.KUPW$WORKER", line 8159
----- PL/SQL Call Stack -----
object line object
handle number name
0x1ed213be8 19028 package body SYS.KUPW$WORKER
0x1ed213be8 8191 package body SYS.KUPW$WORKER
0x1ed213be8 8980 package body SYS.KUPW$WORKER
0x1ed213be8 1651 package body SYS.KUPW$WORKER
0x1b97859d0 2 anonymous block
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WO
ORA-01187: cannot read from file because it failed verification te
ORA-01110: data file 201: '/u02/oradata/db1/temp01.dbf'
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
ORA-06512: at "SYS.KUPW$WORKER", line 8159
----- PL/SQL Call Stack -----
object line object
handle number name
0x1ed213be8 19028 package body SYS.KUPW$WORKER
0x1ed213be8 8191 package body SYS.KUPW$WORKER
0x1ed213be8 8980 package body SYS.KUPW$WORKER
0x1ed213be8 1651 package body SYS.KUPW$WORKER
0x1b97859d0 2 anonymous block
Job "SYSTEM"."SYS_EXPORT_FULL_09" stopped due to fatal error at 11:

Solution:

Drop the old temporary tablespace and add a new temporary tablespace with a new temp file; the steps are sketched below.
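A minimal sketch of those steps, assuming the new tablespace is named TEMP2 and the file path/size are placeholders rather than values from the failing system:

SQL> -- create a new temporary tablespace with a fresh temp file
SQL> CREATE TEMPORARY TABLESPACE temp2 TEMPFILE '/u02/oradata/db1/temp02.dbf' SIZE 2G AUTOEXTEND ON;

SQL> -- point the database default at the new temporary tablespace
SQL> ALTER DATABASE DEFAULT TEMPORARY TABLESPACE temp2;

SQL> -- drop the old temporary tablespace together with its failed temp file
SQL> DROP TABLESPACE temp INCLUDING CONTENTS AND DATAFILES;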

Thank you. Stay tuned. 🙂


How to kill an Oracle Data Pump export job

We can kill an Oracle Data Pump job by two methods: the first kills the Data Pump job from the Data Pump export prompt, and the second runs a PL/SQL block against the DBMS_DATAPUMP package at the SQL prompt as SYSDBA.

//To simulate both scenarios, I am going to start an Oracle Data Pump export as below:

[oracle@dbserver ~]$ expdp system/manager full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
Export: Release 11.2.0.3.0 - Production on Fri Apr 11 16:43:56 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.3.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
Estimate in progress using BLOCKS method…
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3.431 GB
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
. .. …

First method: kill the Data Pump job from the Data Pump export prompt:

//After initiating the export backup, verify the Data Pump job by issuing the following query as SYSDBA:

SQL> select * from dba_datapump_jobs;

OWNER_NAME JOB_NAME
------------------------------ ------------------------------
OPERATION JOB_MODE
------------------------------ ------------------------------
STATE DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
------------------------------ ---------- ----------------- -----------------
SYSTEM SYS_EXPORT_FULL_01
EXPORT FULL
EXECUTING 1 1 3

//Now attach to the Data Pump export prompt using the JOB_NAME (ATTACH parameter) as below and issue the Data Pump command KILL_JOB.

[oracle@dbserver ~]$ expdp system/manager attach=SYS_EXPORT_FULL_01
Export: Release 11.2.0.3.0 - Production on Fri Apr 11 17:01:13 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.3.0 - 64bit Production

Job: SYS_EXPORT_FULL_01
Owner: SYSTEM
Operation: EXPORT
Creator Privs: TRUE
GUID: F6C3A9B1D87AC043E0430100007F07F7
Start Time: Friday, 11 April, 2014 17:00:38
Mode: FULL
Instance: orcl
Max Parallelism: 1
EXPORT Job Parameters:
Parameter Name Parameter Value:
CLIENT_COMMAND system/******** full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
State: EXECUTING
Bytes Processed: 0
Current Parallelism: 1
Job Error Count: 0
Dump File: /backup/Export/Full_export.dmp
bytes written: 4,096

Worker 1 Status:
Process Name: DW00
State: EXECUTING
Object Schema: ELET
Object Type: DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
Completed Objects: 80
Worker Parallelism: 1

Export> KILL_JOB
Are you sure you wish to stop this job ([yes]/no): yes
[oracle@dbserver ~]$

//The Data Pump export job has been killed successfully. The same message will be displayed in the Data Pump logfile, as below:

[oracle@dbserver ~]$ expdp system/manager full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
Export: Release 11.2.0.3.0 - Production on Fri Apr 11 16:43:56 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.3.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
Estimate in progress using BLOCKS method…
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3.431 GB
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/ROLE
Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
. .. …
Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 16:45:35

--x--

Second method: kill the Data Pump job by running a PL/SQL block:

//After initiating the Oracle Data Pump export, verify the Data Pump job by issuing the following query as SYSDBA:

SQL> select * from dba_datapump_jobs;

OWNER_NAME JOB_NAME
------------------------------ ------------------------------
OPERATION JOB_MODE
------------------------------ ------------------------------
STATE DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
------------------------------ ---------- ----------------- -----------------
SYSTEM SYS_EXPORT_FULL_01
EXPORT FULL
EXECUTING 1 1 3

//To kill the Data Pump job, the PL/SQL package needs two inputs: the JOB_NAME of the Data Pump job and the OWNER_NAME of the user who initiated the export.

SQL> DECLARE
  h1 NUMBER;
BEGIN
  -- attach to the running job by JOB_NAME and OWNER_NAME
  h1 := DBMS_DATAPUMP.ATTACH('SYS_EXPORT_FULL_01','SYSTEM');
  -- immediate = 1 aborts the workers; keep_master = 0 drops the master table
  DBMS_DATAPUMP.STOP_JOB(h1, 1, 0);
END;
/

PL/SQL procedure successfully completed.
SQL>

//The Data Pump export job has been killed successfully. The same message will be displayed in the Data Pump logfile, as below:

[oracle@dbserver ~]$ expdp system/manager full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
Export: Release 11.2.0.3.0 - Production on Fri Apr 11 17:00:37 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.3.0 - 64bit Production

Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** full=y directory=bkupdir dumpfile=Full_export.dmp logfile=Export_log.LOG
Estimate in progress using BLOCKS method…
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3.431 GB
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/ROLE
Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
. .. …
Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 17:01:23

Using either of the above two methods, we can kill an Oracle Data Pump export job.

***********************************************************************


Note: Please don't hesitate to reach out with any query or feedback.

Thanking you.

Have an easy life ahead.


ORA-39171: Job is experiencing a resumable wait

While exporting one of my databases with the Data Pump backup utility, the backup got stuck with the errors ORA-01691: unable to extend lob segment and ORA-39171: Job is experiencing a resumable wait.

The Oracle errors are pretty much self-explanatory. After diagnosing ORA-01691 and ORA-39171, I found that the SYSTEM tablespace was short of allocated space; after adding a new datafile to it (ALTER TABLESPACE ... ADD DATAFILE), the backup job resumed and completed successfully.
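A minimal sketch of the fix, with a hypothetical datafile path and size (not values from the affected system):

SQL> -- add a datafile so the suspended (resumable) export can continue
SQL> ALTER TABLESPACE system ADD DATAFILE '/u02/oradata/db1/system02.dbf' SIZE 2G AUTOEXTEND ON MAXSIZE 8G;

SQL> -- sessions currently in a resumable wait can be checked with:
SQL> SELECT name, status, error_msg FROM dba_resumable;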

//Export Logs as below:

OracleDB:ora 19> expdp system/manager directory=dirdmp schemas=Prod dumpfile=Prod_260314%U.dmp logfile=Prod_260314.log compression=all parallel=4


Data Pump Export-Import Performance tips

Oracle Data Pump offers many benefits and performance gains over the original Export/Import utilities. We can significantly increase Data Pump export/import performance by considering the following methods:

Export performance tips:

  • Parallelism in Data Pump.

With the help of the PARALLEL parameter (a tuning parameter), we can dynamically increase or decrease the resources consumed by each job. The worker (PARALLEL) count should exactly match the number of dump files, and a common guideline is to set it to no more than twice the number of CPUs.
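A minimal sketch of a parallel export, where the worker count, directory object and file names are placeholders:

expdp system/manager full=y directory=data_pump_bkup parallel=4 dumpfile=full_export_%U.dmp logfile=full_export.log

The %U substitution variable generates one dump file per worker (full_export_01.dmp, full_export_02.dmp, and so on), so the dump file count keeps pace with PARALLEL=4.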


INCLUDE & EXCLUDE database objects with Data Pump Export Utility

Most of the time we need to back up a single database object instead of a full schema or database. In this scenario we can use the Data Pump parameters INCLUDE and EXCLUDE to export or import objects as per our convenience.
These parameters are used to limit the export/import to specific objects.

INCLUDE: Data Pump parameter that specifies the objects to be included in the export/import.
EXCLUDE: Data Pump parameter that specifies the objects to be excluded from the export/import.
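A minimal sketch of both parameters, assuming the scott schema and a bkupdir directory object (both placeholders):

expdp system/manager schemas=scott directory=bkupdir dumpfile=scott_tabs.dmp include=TABLE:\"IN ('EMP','DEPT')\"

expdp system/manager schemas=scott directory=bkupdir dumpfile=scott_noidx.dmp exclude=INDEX

On the command line the object filter usually needs shell escaping (as shown above), or it can be placed in a parameter file to avoid quoting issues.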


Wildcard Search in Oracle Data Pump

Consider the following case for a wildcard search in an Oracle Data Pump export.

If your database contains thousands of tables and you wish to back up only selected tables, you would probably use the TABLES Data Pump parameter and supply multiple schema.table names in comma-separated format. That is feasible only while the table count stays within 10 or 20.

In this case you can use a wildcard search (% - percent) in the TABLES Data Pump parameter.
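A minimal sketch, assuming the scott schema and a bkupdir directory object (placeholders); the % wildcard matches every table whose name starts with EMP:

expdp system/manager directory=bkupdir dumpfile=scott_emp_like.dmp logfile=scott_emp_like.log tables=scott.emp%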


Export oracle tablespace with Data Pump

We can also export tablespaces with the help of the TABLESPACES parameter in the Data Pump utility.

In this mode, only the tables contained in the specified set of tablespaces are backed up, together with their dependent objects.
Note: in this mode both object metadata and data are exported.

Syntax:
TABLESPACES = tablespace_1, tablespace_2 [, ...]
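A minimal sketch, assuming USERS and EXAMPLE are the tablespaces to back up and bkupdir is the directory object (all placeholders):

expdp system/manager directory=bkupdir dumpfile=users_example_ts.dmp logfile=users_example_ts.log tablespaces=USERS,EXAMPLE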


Export Import oracle schema with data pump

We can export a single schema as well as multiple schemas/users with the help of the SCHEMAS Data Pump parameter.

Consider the following hands-on example of exporting a schema/user with the Data Pump utility:

Note:
If the EXP_FULL_DATABASE role is not granted to the user performing the export, that user can export only his own schema, not others (the default).

 

Here we are exporting the scott schema (a single schema) from the orcl database:

expdp system/manager schemas=scott directory=data_pump_bkup dumpfile=scott_export.dmp logfile=scott_export.log

. . exported "SCOTT"."DEPT" 5.937 KB 4 rows
. . exported "SCOTT"."EMP" 8.570 KB 14 rows
. . exported "SCOTT"."SALGRADE" 5.867 KB 5 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
/home/oracle/app/data_pump_bkup/scott_export.dmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 11:58:35
Importing the above exported schema (scott):

impdp system/manager schemas=scott directory=data_pump_bkup dumpfile=scott_export.dmp logfile=scott_import.log

--x--

 

We can also export multiple schemas with the help of the same SCHEMAS Data Pump parameter.

In the following case, we are exporting the scott and test schemas from the orcl database:

expdp system/manager schemas=scott,test directory=data_pump_bkup dumpfile=scott_test_export.dmp logfile=scott_test_export.log

. . exported "SCOTT"."DEPT" 5.937 KB 4 rows
. . exported "SCOTT"."EMP" 8.570 KB 14 rows
. . exported "SCOTT"."SALGRADE" 5.867 KB 5 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
. . exported "TEST"."DEPT" 0 KB 0 rows
. . exported "TEST"."EMP" 0 KB 0 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
/home/oracle/app/data_pump_bkup/scott_test_export.dmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 12:07:36
Importing the above exported schemas (scott and test):

impdp system/manager schemas=scott,test directory=data_pump_bkup dumpfile=scott_test_export.dmp logfile=scott_test_import.log

. . imported "SCOTT"."DEPT" 5.937 KB 4 rows
. . imported "SCOTT"."EMP" 8.570 KB 14 rows
. . imported "SCOTT"."SALGRADE" 5.867 KB 5 rows
. . imported "SCOTT"."BONUS" 0 KB 0 rows
. . imported "TEST"."DEPT" 0 KB 0 rows
. . imported "TEST"."EMP" 0 KB 0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully completed at 12:09:06
--x--

Likewise, we can export and import multiple schemas with the help of the Data Pump utility.

************************************************************************
Note: Please don't hesitate to reach out with any query or feedback.

Thanking you.

Have an easy life ahead.


Export Import oracle tables with Data Pump utility

We can export a single table as well as multiple tables with the help of the TABLES Data Pump parameter.

Consider the following hands-on example of exporting tables with the Data Pump utility.

//Consider an example of a single table export; here we are exporting the table emp from the scott schema.
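A minimal sketch, assuming the data_pump_bkup directory object used elsewhere in these posts:

expdp system/manager tables=scott.emp directory=data_pump_bkup dumpfile=scott_emp.dmp logfile=scott_emp.log

Multiple tables can be listed in comma-separated form, for example tables=scott.emp,scott.dept.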


Oracle full database backup with Data Pump Export Utility

Introduction to the Oracle Data Pump utility:
This utility backs up a database into operating system files called dump files. These files can be read back only by the Data Pump import utility.

The roles required for full Data Pump export/import are EXP_FULL_DATABASE and IMP_FULL_DATABASE; without these roles we cannot back up and restore the whole database, only perform schema-level export/import.
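A minimal sketch of a full database export, assuming a data_pump_bkup directory object (a placeholder) and the SYSTEM user, which already holds the required roles:

expdp system/manager full=y directory=data_pump_bkup dumpfile=full_db_export.dmp logfile=full_db_export.log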


Enhanced compression options in oracle 12c data pump

The COMPRESSION_ALGORITHM parameter, with its options (BASIC | LOW | MEDIUM | HIGH), is used to compress Data Pump dump files. Prior to Oracle 12c, these compression algorithms were available only in RMAN backups, and Data Pump export offered only basic compression.

From Oracle 12c onwards, the same feature is also available in Data Pump.
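A minimal sketch, assuming a bkupdir directory object (a placeholder); note that dump-file data compression (COMPRESSION=ALL) generally requires the Advanced Compression Option license:

expdp system/manager schemas=scott directory=bkupdir dumpfile=scott_comp.dmp logfile=scott_comp.log compression=all compression_algorithm=medium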


Exporting views as tables – Oracle 12c Data Pump

Data Pump Export with VIEWS_AS_TABLES:

Now, with the help of the VIEWS_AS_TABLES export parameter, we can export one or more views as tables; this handy Data Pump feature was introduced in Oracle 12c.

According to Oracle, the VIEWS_AS_TABLES parameter exports data in unencrypted format and creates unencrypted tables. If the data is sensitive, it is strongly recommended to enable encryption while unloading.
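A minimal sketch, assuming a view scott.emp_v and a bkupdir directory object (both placeholders):

expdp system/manager directory=bkupdir dumpfile=emp_view.dmp logfile=emp_view.log views_as_tables=scott.emp_v

On import, the view's data arrives as an ordinary table named EMP_V in the target schema.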