# File and Directory Related Parameters
SQLFILE - writes the DDL contained in the export dump file to a SQL script instead of performing the import (see the example at the end of this list).
REUSE_DATAFILES={Y | N} - specifies whether the import should reuse existing data files when creating tablespaces.
DIRECTORY
LOGFILE
NOLOGFILE
PARFILE
DUMPFILE
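For instance, a minimal sketch (the directory object dpump_dir, dump file, and scott/tiger credentials are illustrative) that writes the DDL in the dump file to a script instead of performing the import:
$ impdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott.dmp SQLFILE=scott_ddl.sql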
# Filtering Parameters
TABLE_EXISTS_ACTION - tells Data Pump import what to do when a table already exists. You can specify the following values (see the example after the notes below):
SKIP - skips the table if it already exists
APPEND - appends rows to the existing table
TRUNCATE - truncates the table and reloads the data from the export dump file
REPLACE - drops the table if it exists, recreates it, and reloads it
CONTENT
EXCLUDE
INCLUDE
QUERY
Note 1: When you use CONTENT=DATA_ONLY, you cannot use either EXCLUDE or INCLUDE.
Note 2: When you use the QUERY parameter, Data Pump uses the external tables data access method rather than the direct path method.
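A sketch of these filtering parameters together (all names and the WHERE clause are illustrative). Since the QUERY value contains quotes, it is easiest to put everything in a parameter file, say emp_imp.par:
DIRECTORY=dpump_dir
DUMPFILE=scott.dmp
TABLES=emp
TABLE_EXISTS_ACTION=TRUNCATE
QUERY=emp:"WHERE deptno = 10"
Then run it with:
$ impdp scott/tiger PARFILE=emp_imp.par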
# Job-Related Parameters
JOB_NAME
STATUS
PARALLEL
# Import Mode related Parameters
FULL
TABLES
SCHEMAS
TABLESPACES
TRANSPORT_TABLESPACES
TRANSPORT_FULL_CHECK - applicable only if you are also using the NETWORK_LINK parameter.
TRANSPORT_DATAFILES - used during a transportable tablespace import to specify the list of data files the job should import into the target database.
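For example (path, dump file, and credentials are illustrative), after copying the data files to the target server:
$ impdp system/manager DIRECTORY=dpump_dir DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u01/oradata/users01.dbf'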
# REMAP Parameters
REMAP_SCHEMA - from_schema:to_schema (moves objects from one schema to another)
REMAP_TABLESPACE - 'source_tbs_name':'target_tbs_name' (moves objects from one tablespace to another)
REMAP_DATAFILE - 'filename_in_source_platform':'filename_in_target_platform' (useful when importing to a platform with a different file naming convention; it changes the source file name to the target file name in the SQL statements referenced in the dump file)
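For example, to move scott's objects into the hr schema and from the USERS tablespace into HR_DATA (all names are illustrative):
$ impdp system/manager DIRECTORY=dpump_dir DUMPFILE=scott.dmp REMAP_SCHEMA=scott:hr REMAP_TABLESPACE=users:hr_data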
# Transform Parameters
TRANSFORM - transform_name:value[:object_type]
transform_name can be one of:
SEGMENT_ATTRIBUTES={Y | N}
STORAGE={Y | N}
OID={Y | N}
PCTSPACE=percentage_multiplier
object_type - the type of objects to which the transform should be applied.
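For example, to import tables without their physical segment attributes, so they pick up the defaults of the target tablespace (names are illustrative):
$ impdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott.dmp TRANSFORM=SEGMENT_ATTRIBUTES:N:table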
# Network Link Parameter
NETWORK_LINK - perform an import across the network without using dump files
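For example, assuming a database link prod_db that points to the source database (link and table names are illustrative), no dump file is involved; DIRECTORY is still used for the log file:
$ impdp scott/tiger DIRECTORY=dpump_dir NETWORK_LINK=prod_db TABLES=emp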
# Flashback parameter
FLASHBACK_TIME
FLASHBACK_SCN
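On import these two parameters are meaningful only together with NETWORK_LINK. For example (link name and SCN value are illustrative):
$ impdp scott/tiger DIRECTORY=dpump_dir NETWORK_LINK=prod_db TABLES=emp FLASHBACK_SCN=3004956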
Thanks
Monday, December 15, 2008
Data Pump Export Parameters
The following are the various Data Pump export parameters.
# File and directory related parameters
DIRECTORY - Directory object name
DUMPFILE - File name. Can specify multiple file names
LOGFILE - File name
NOLOGFILE - {Y | N}
COMPRESSION - {METADATA_ONLY | NONE} (only metadata can be compressed; compressing table data is not possible)
FILESIZE - the maximum size of each dump file. If the size is reached and no further dump file can be written, the job is stopped.
PARFILE - Parameter file name.
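For example (names are illustrative), the %U substitution in DUMPFILE makes Data Pump create a new file whenever FILESIZE is reached:
$ expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott_%U.dmp FILESIZE=2G LOGFILE=scott_exp.log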
# Export Mode related parameters
FULL
SCHEMAS
TABLES
TABLESPACES
TRANSPORT_TABLESPACES
TRANSPORT_FULL_CHECK - {Y | N}. Y checks two-way dependencies (nothing inside the tablespace set may depend on anything outside it, and vice versa); N checks one-way dependencies only.
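For example, a transportable tablespace export with the stricter two-way check (the tablespace must be made read-only first; all names are illustrative):
$ expdp system/manager DIRECTORY=dpump_dir DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=users TRANSPORT_FULL_CHECK=Y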
# Export Filtering parameters
CONTENT - {ALL | DATA_ONLY | METADATA_ONLY}
EXCLUDE - object_type[:name_clause] (filters out types of database objects, e.g., PACKAGE, INDEX)
INCLUDE - object_type[:name_clause] (the name_clause lets you apply a SQL expression against object names)
QUERY (filters the rows within an object)
SAMPLE - [[schema_name.]table_name:]sample_percent
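A sketch combining the filters (names are illustrative; note that EXCLUDE and INCLUDE cannot be used in the same job). Contents of exp_filter.par:
DIRECTORY=dpump_dir
DUMPFILE=scott.dmp
SCHEMAS=scott
EXCLUDE=INDEX
QUERY=emp:"WHERE sal > 1000"
Then run it with:
$ expdp scott/tiger PARFILE=exp_filter.par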
# Estimation Parameters
ESTIMATE - {BLOCKS | STATISTICS} (how much space the export job is going to consume)
ESTIMATE_ONLY - {Y | N} (estimates the space without actually starting the job)
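For example, to get a statistics-based space estimate without writing any dump file (credentials are illustrative):
$ expdp scott/tiger SCHEMAS=scott ESTIMATE_ONLY=Y ESTIMATE=STATISTICS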
# The network link parameters
NETWORK_LINK (database_link) (exports data from a remote database and writes the dump file on the local server)
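For example, with a database link prod_db to the remote database (link and names are illustrative), the dump file is written on the local server:
$ expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=remote_emp.dmp NETWORK_LINK=prod_db TABLES=emp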
# Job related parameters
JOB_NAME (explicitly names the job; the master table gets the same name as the job)
STATUS (displays the status of the job at the specified interval, in seconds)
FLASHBACK_SCN (If you specify this parameter, the export will be consistent as of this SCN)
FLASHBACK_TIME (export will be consistent as of this time)
PARALLEL (lets you set the number of worker processes)
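For example (names are illustrative), four worker processes writing to multiple files, with the status reported every 60 seconds:
$ expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott_%U.dmp PARALLEL=4 JOB_NAME=scott_exp STATUS=60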
# Encryption Parameter
ENCRYPTION_PASSWORD - pwd (prevents encrypted column data from being written as clear text in the dump file)
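For example, assuming the emp table has columns encrypted with Transparent Data Encryption (table name and password are illustrative):
$ expdp hr/hr DIRECTORY=dpump_dir DUMPFILE=emp_enc.dmp TABLES=emp ENCRYPTION_PASSWORD=mypassword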
# Interactive mode export Parameters/Commands
ATTACH=[schema_name.]job_name (attaches the client to a running export job so you can intervene in it)
Ctrl + C (switches the client from logging mode to the interactive command prompt; see the example below)
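For example, to attach to a running job named scott_exp (job name is illustrative) and issue commands at the Export> prompt:
$ expdp scott/tiger ATTACH=scott.scott_exp
Export> STATUS
Export> STOP_JOB=IMMEDIATE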
Thanks
Wednesday, December 10, 2008
Directory Objects
A Directory Object is a logical structure that represents a physical location on the file system.
It is used by external tables, the UTL_FILE package, and Data Pump.
Important Points
- Always owned by the SYS user (even when created by another user).
- Names must be unique.
- You must have CREATE ANY DIRECTORY privilege to create Directory object.
- By default the owner gets READ and WRITE privileges on the directory object.
- Grant the necessary privileges on the directory object if you want other users to access it.
- READ and WRITE privileges mean that the Oracle database reads and writes files on behalf of the user; the user has no direct access to these files.
Creating Directory Object
SQL> Create or Replace Directory data_pump_dir As '/oracle/export/dpump';
SQL> Grant Read, Write on Directory data_pump_dir to public;
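Once created and granted, the directory object is referenced by name in Data Pump (credentials and file name are illustrative):
$ expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott.dmp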
Data Dictionary Views
SQL> Select * from ALL_DIRECTORIES;
It returns owner, directory name and directory path.
SQL> Select * from ALL_TAB_PRIVS where privilege in ('READ', 'WRITE');
It shows whether you have the required read/write privileges.
Thanks