- Create an S3 bucket of your choice, e.g. `your_oracle_dump_bucket`. Make a note of the bucket name, as it is used in the policy creation below.
- Create the policy and role as mentioned in the referenced URL:
  - Create a policy that grants access to the bucket
  - Create a role and attach the policy to it
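For the policy step, a minimal sketch of the IAM policy, assuming the bucket name `your_oracle_dump_bucket` (the RDS Oracle S3 integration needs `s3:GetObject`, `s3:ListBucket`, and `s3:PutObject` on the bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::your_oracle_dump_bucket",
        "arn:aws:s3:::your_oracle_dump_bucket/*"
      ]
    }
  ]
}
```

The role this policy is attached to must trust the `rds.amazonaws.com` service principal so that RDS can assume it.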
- Go to `RDS > Databases > {yourdatabase} > Connectivity`. Under `Manage IAM roles`, select the role created above, choose `S3_INTEGRATION` as the feature, and click `Add role`.
- Create an option group: go to `RDS > Option groups > Create group`, or select an existing option group and click `Add option`. Choose `S3_INTEGRATION` as the option name and `1.0` as the version, select `Apply immediately`, and click `Add option`.
- If a new option group was created, modify the database to attach it with `Apply immediately`, then wait until the database becomes available again.
- Log in as the master user of the RDS database.
- Run the PL/SQL block below to create an export dump file, replacing the `{your_dumpfile}` and `{your_schema}` placeholders:
```sql
DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => '{your_dumpfile}', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''{your_schema}'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
```
- You can check whether the export job has completed using the query below:
```sql
SELECT owner_name, job_name, operation, job_mode, degree, state
FROM dba_datapump_jobs
WHERE state = 'EXECUTING';
```
- Once the query above returns zero rows, move on to the next step.
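If you drive these steps from a script, the export PL/SQL block can be generated by substituting the placeholders. A sketch (executing the block, e.g. with the `python-oracledb` driver, is left to your tooling):

```python
# Sketch: build the DBMS_DATAPUMP export block with your own values.
# The {dumpfile} and {schema} placeholders are filled via str.format.
EXPORT_TEMPLATE = """DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => '{dumpfile}', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''{schema}'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;"""


def build_export_block(dumpfile: str, schema: str) -> str:
    """Return the PL/SQL export block with the placeholders filled in."""
    return EXPORT_TEMPLATE.format(dumpfile=dumpfile, schema=schema)
```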
- Use the query below to upload all the dump files to the S3 bucket `your_oracle_dump_bucket`:
```sql
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
  p_bucket_name    => 'your_oracle_dump_bucket',
  p_prefix         => '',
  p_s3_prefix      => '',
  p_directory_name => 'DATA_PUMP_DIR')
AS task_id FROM DUAL;
```
- Take the `task_id` returned by the previous query and run the query below with it to get the transfer status:
```sql
SELECT text
FROM TABLE(rdsadmin.rds_file_util.read_text_file('BDUMP', 'dbtask-{task_id}.log'));
```
- Once the dump files have been transferred, delete the `.dmp` files by following the Clean Up steps below.
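The status log read above follows the naming pattern `dbtask-{task_id}.log` in the `BDUMP` directory. A small helper (a sketch) to build that query from a task id:

```python
def task_status_query(task_id: str) -> str:
    """Build the SQL that reads the S3 transfer task log from BDUMP.

    The log file is named dbtask-<task_id>.log, matching the status
    query shown above.
    """
    return (
        "SELECT text FROM TABLE("
        f"rdsadmin.rds_file_util.read_text_file('BDUMP', 'dbtask-{task_id}.log'))"
    )
```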
- After any import/export activity, the dump files must be deleted.
- The following query lists all the dump files in `DATA_PUMP_DIR`:
```sql
SELECT * FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR')) ORDER BY mtime;
```
- The following block deletes a specific `FILENAME` from the listing above:
```sql
BEGIN
  UTL_FILE.FREMOVE('DATA_PUMP_DIR', '{FILENAME}');
END;
/
```