The following are the various Data Pump export (expdp) parameters.
# File and directory related parameters
DIRECTORY - Directory object name
DUMPFILE - Dump file name. Multiple file names can be specified.
LOGFILE - Log file name
NOLOGFILE - Y N. Suppresses creation of the log file.
COMPRESSION - METADATA_ONLY NONE. (Only metadata can be compressed; compressing table data is not possible)
FILESIZE - Maximum size of each dump file. If the size is reached and no further dump file is available, the job is stopped.
PARFILE - Parameter file name.
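For example, a simple export using these file parameters could look like the following (a rough sketch; the scott/tiger login, the directory object dpump_dir and the file names are just placeholders). With no mode parameter, Data Pump defaults to exporting the schema of the connected user.

expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott_%U.dmp LOGFILE=scott_exp.log FILESIZE=2G COMPRESSION=METADATA_ONLY

The same parameters can also be listed one per line in a text file and passed with PARFILE=scott_exp.par.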
# Export Mode related parameters
FULL - Y N. Exports the entire database.
SCHEMAS - Exports the listed schemas.
TABLES - Exports the listed tables.
TABLESPACES - Exports all objects contained in the listed tablespaces.
TRANSPORT_TABLESPACES - Exports only the metadata of the listed tablespaces (transportable tablespace mode).
TRANSPORT_FULL_CHECK - Y N. Y - checks dependencies in both directions between objects inside and outside the transportable set. N - checks one-way dependencies only.
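For example, each export mode maps to one of the parameters above (the logins, schema and table names are placeholders):

expdp system/manager DIRECTORY=dpump_dir DUMPFILE=full.dmp FULL=Y
expdp system/manager DIRECTORY=dpump_dir DUMPFILE=hr_scott.dmp SCHEMAS=hr,scott
expdp hr/hr DIRECTORY=dpump_dir DUMPFILE=emp_dept.dmp TABLES=employees,departments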
# Export Filtering parameters
CONTENT - ALL DATA_ONLY METADATA_ONLY
EXCLUDE - object_type[:name_clause] (filters out the given types of database objects, e.g. PACKAGE, INDEX)
INCLUDE - object_type[:name_clause] (the name_clause lets you apply a SQL expression to object names)
QUERY - [[schema.]table_name:]query_clause (filters the rows within an object)
SAMPLE - [[schema_name.]table_name:]sample_percent
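Because the name_clause and query_clause usually contain quotes that the operating-system shell tends to mangle, the filtering parameters are easiest to keep in a parameter file. A rough sketch (hr_filter.par is a made-up file name and the objects are placeholders):

DIRECTORY=dpump_dir
DUMPFILE=hr_filtered.dmp
SCHEMAS=hr
EXCLUDE=INDEX:"LIKE 'EMP%'"
QUERY=hr.employees:"WHERE department_id = 10"

Run it with: expdp hr/hr PARFILE=hr_filter.par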
# Estimation Parameters
ESTIMATE - {BLOCKS STATISTICS} (method used to estimate how much space the export job is going to consume)
ESTIMATE_ONLY - {Y N} (estimates the space without actually performing the export)
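For example, to see how much space a schema export would need without writing a dump file (names are placeholders):

expdp hr/hr DIRECTORY=dpump_dir LOGFILE=estimate.log SCHEMAS=hr ESTIMATE=BLOCKS ESTIMATE_ONLY=Y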
# The network link parameters
NETWORK_LINK - database_link (exports data from a remote database over a database link and writes the dump file on the local server)
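For example, assuming a database link named remote_db already exists in the local database and points to the source instance (all names are placeholders):

expdp hr/hr DIRECTORY=dpump_dir DUMPFILE=remote_hr.dmp SCHEMAS=hr NETWORK_LINK=remote_db

The rows are pulled across the link, but the dump file is written on the local server.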
# Job related parameters
JOB_NAME (explicitly names the job; the master table gets the same name as job_name)
STATUS (displays the status of the job at the specified interval, in seconds)
FLASHBACK_SCN (If you specify this parameter, the export will be consistent as of this SCN)
FLASHBACK_TIME (export will be consistent as of this time)
PARALLEL (lets you set the number of worker processes)
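A sketch that combines several of the job parameters (the job name and interval are arbitrary choices); with PARALLEL it makes sense to use the %U substitution variable in DUMPFILE so that each worker process can write to its own file:

expdp hr/hr DIRECTORY=dpump_dir DUMPFILE=hr_%U.dmp SCHEMAS=hr JOB_NAME=HR_EXP_JOB PARALLEL=4 STATUS=60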
# Encryption Parameter
ENCRYPTION_PASSWORD - pwd (prevents encrypted column data from being written as clear text in the dump file)
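For example (the password shown is only an illustration):

expdp hr/hr DIRECTORY=dpump_dir DUMPFILE=hr_enc.dmp TABLES=employees ENCRYPTION_PASSWORD=MyPwd123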
# Interactive mode export parameters/commands
ATTACH - [schema.]job_name (attaches to a running export job so you can intervene in it)
Ctrl + C (switches the export client from logging mode to interactive-command mode)
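For example, pressing Ctrl + C during a running export (or attaching to it from another session with ATTACH) gives you the Export> prompt, where commands such as STATUS, ADD_FILE, PARALLEL, STOP_JOB, START_JOB and KILL_JOB are available (the job name is a placeholder):

expdp hr/hr ATTACH=HR_EXP_JOB
Export> STATUS
Export> STOP_JOB=IMMEDIATE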
Thanks