Dontcheff

Data Pump Legacy Mode

In DBA, Oracle database, Oracle utilities on May 19, 2011 at 16:42

During the past 10 years we have seen three new versions of Oracle: 9i, 10g and 11g. I am often asked, and I often ask as well, which is the best version or which are the best features of Oracle. Ask 10 people and I bet you will get 10 different opinions. Here is how I see it:

Real Application Clusters and Data Guard in 9i
Grid Control and Data Pump in 10g
Automatic SQL Tuning and Edition-Based Redefinition in 11g

There have been so many new, very good and important features, but let us say that they are all part of the progress of mankind 🙂 And of the recent boom of IT. Perhaps?

In 11gR2, Oracle decided to introduce Data Pump Legacy Mode in order to provide backward compatibility for scripts and parameter files used with the original Export and Import utilities. The documentation briefly says: “This feature enables users to continue using original Export and Import scripts with Data Pump Export and Import. Development time is reduced as new scripts do not have to be created.”

If you check AristaDBA’s Oracle Blog and read the first three paragraphs, you will probably see what it is about. I totally agree with everything written there.

And I don’t really get it: why do we have Data Pump Legacy Mode? Do lazy Oracle clients badly need it? Was exp/imp so, so good that we must have it back? How about a RAC Legacy Mode if I want to use the gc_files_to_lock or freeze_db_for_fast_instance_recovery parameters? There really was a parameter called freeze_db_for_fast_instance_recovery; I am not making this up. Run this one:

SELECT kspponm,
       DECODE(ksppoflg, 1, 'Obsolete', 2, 'Underscored') AS "Status"
  FROM x$ksppo
 WHERE kspponm LIKE '%freeze%'
 ORDER BY kspponm;

Or how about running DBMS_SCHEDULER in DBMS_JOB mode 🙂 And what if you want to run DB2 in Oracle mode?

However, the Data Pump Legacy Mode feature exists, and once you use any conventional export/import parameter, you put Data Pump into legacy mode. Just one parameter is enough. In Oracle terminology, Data Pump enters legacy mode once it determines that a parameter unique to original Export or Import is present. Of course, some parameters, like buffer, commit, compress, object_consistent, recordlength, resumable, statistics, compile, filesize, tts_owners, streams_configuration and streams_instantiation, are simply ignored.
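Other legacy parameters are not ignored but are remapped to their Data Pump equivalents: file becomes dumpfile, log becomes logfile, grants=n becomes exclude=grant, and so on. A minimal sketch (the file names are hypothetical and the exact remapping messages in your log will differ):

 C:\>expdp julian/password file=test.dmp log=test.log grants=n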

Now, here is a paradox, or a trivial documentation error: deferred_segment_creation is set to TRUE by default in 11gR2, so an empty table may have no segment at all, and the documentation claims that the original exp utility cannot export such tables. Data Pump, for its part, does not ignore tables without segments, regardless of the mode it runs in. Note additionally that legacy mode can be activated by adding even just one ignorable parameter (like buffer, for example):

 SQL> create table TEST (c1 number, c2 varchar2(10), c3 date) storage (initial 5M);

 Table created.

 SQL> select bytes, blocks, segment_type, segment_name from dba_segments where segment_name='TEST';

 no rows selected

 C:\>expdp julian/password dumpfile=data_pump_dir:abc_%U.dat schemas=julian include=TABLE:in('TEST') logfile=abc.log buffer=1024

 Export: Release 11.2.0.2.0 - Production on Wed May 11 07:53:45 2011

 Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
 
 Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
 With the Partitioning, OLAP, Data Mining and Real Application Testing options
  Legacy Mode Active due to the following parameters:
  Legacy Mode Parameter: "buffer=1024" Location: Command Line, ignored.
  Legacy Mode has set reuse_dumpfiles=true parameter.
 Starting "JULIAN"."SYS_EXPORT_SCHEMA_01":  julian/******** dumpfile=data_pump_dir:abc_%U.dat schemas=julian 
  include=TABLE:in('TEST') logfile=abc.log reuse_dumpfiles=true 
 Estimate in progress using BLOCKS method...
 Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
 Total estimation using BLOCKS method: 0 KB
 Processing object type SCHEMA_EXPORT/TABLE/TABLE
 . . exported "JULIAN"."TEST"                                 0 KB       0 rows
 Master table "JULIAN"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
 ******************************************************************************
 Dump file set for JULIAN.SYS_EXPORT_SCHEMA_01 is:
   C:\ORACLE\ADMIN\JMD\DPDUMP\ABC_01.DAT
 Job "JULIAN"."SYS_EXPORT_SCHEMA_01" successfully completed at 07:54:07

But it is not true that you cannot export tables with no segments with the original exp either. Here is the proof:

 C:\>exp userid=julian/abc file=C:\Oracle\admin\JMD\dpdump\a.dmp tables=julian.test

 Export: Release 11.2.0.2.0 - Production on Wed May 11 08:31:08 2011

 Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

 Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
 With the Partitioning, OLAP, Data Mining and Real Application Testing options
 Export done in AL32UTF8 character set and AL16UTF16 NCHAR character set

 About to export specified tables via Conventional Path ...
 . . exporting table                           TEST          0 rows exported
 Export terminated successfully without warnings.

But forget about this Legacy Mode. Do not use it. Pretend it does not exist 🙂

Let us now look at some new features of Data Pump. Note that in 11gR2 a couple of restrictions have already been removed:

– The restriction that in TABLES mode all tables had to reside in the same schema.
– The restriction that only one object (table or partition) could be specified if wildcards were used as part of the object name (both are illustrated in the sketch after this list).
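A minimal sketch of the relaxed TABLES mode, assuming the demo schemas hr and sh exist and the connecting user has the DATAPUMP_EXP_FULL_DATABASE role (needed to export tables outside your own schema):

 C:\>expdp system/password dumpfile=data_pump_dir:multi.dmp tables=hr.employees,sh.sales% logfile=multi.log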

For RAC users: Data Pump worker processes can now be distributed across all Oracle RAC instances, across a subset of the instances, or confined to the instance where the Data Pump job is started. It is also now possible to start Data Pump jobs and run them on different Oracle RAC instances simultaneously.
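This is controlled by the new CLUSTER and SERVICE_NAME parameters. A minimal sketch (dp_svc is an assumed service name mapped to a subset of the instances; cluster=n would instead confine the job to the instance where it was started):

 C:\>expdp julian/password dumpfile=data_pump_dir:rac_%U.dat schemas=julian cluster=y service_name=dp_svc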

For XMLType column users: there is a new DISABLE_APPEND_HINT value for the DATA_OPTIONS parameter of Import, which disables the APPEND hint while loading data objects.
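A minimal sketch (hypothetical dump file name), typically useful when loading into existing tables that other sessions may be accessing at the same time:

 C:\>impdp julian/password dumpfile=data_pump_dir:xml.dmp data_options=disable_append_hint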

For EBR users: specific editions can be exported and imported. Using the SOURCE_EDITION parameter on export and the TARGET_EDITION parameter on import, you can export objects from a certain edition of the database and import them into a specific edition.
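A minimal sketch (v37 and v38 are assumed edition names):

 C:\>expdp julian/password dumpfile=data_pump_dir:ed.dmp schemas=julian source_edition=v37
 C:\>impdp julian/password dumpfile=data_pump_dir:ed.dmp schemas=julian target_edition=v38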

P.S. I think Data Pump is among the Oracle tools with the fewest bugs! Although it is hard to say nowadays what is a bug, what is a feature and what is a duplicate enhancement… There are about 100 types of bugs.

P.P.S. I attended the 11g beta OCP exam as soon as it was released. There were more than 300 questions, yet not a single one on Data Pump! And 11g came with so many new features.

  1. I have difficulty purging prior editions that clutter up my system: we deploy new code every 2 weeks, hence a new edition every time:

    ORA$BASE -> v37 -> v38, for example.

    I actualized all objects from ORA$BASE into v37, and for v38 the same thing: actualized all v37 objects into v38.

    Then I tried this:

    DROP EDITION v37 CASCADE;

    and no luck… I got this:
    SQL Error: ORA-38810: Implementation restriction: cannot drop edition that has a parent and a child

    Any words of wisdom?

    Cheers!
