Oracle RDS Data Dump with S3

Prerequisites for S3 Integration with RDS

  • Create an S3 bucket of your choice, e.g. your_oracle_dump_bucket. Make a note of the bucket name, as it is needed when creating the policy below.
  • Create the IAM policy and role as described at the referenced URL:
    • Create policy
    • Create role
  • Go to RDS > Databases > {yourdatabase} > Connectivity, and under Manage IAM roles select the role created above, choose S3_INTEGRATION as the feature, and click Add role.
  • Create an option group under RDS > Option Groups > Create Group (or select an existing option group) and click Add Option. Choose S3_INTEGRATION as the Option Name and 1.0 as the Version, select Apply Immediately, and click Add Option.
  • If a new option group was created, modify the database to attach this option group with Apply immediately, then wait until the database becomes available again.

Export RDS dump to S3

  • Log in as the master user of the RDS database.
  • Run the block below to create an export dump file, replacing the {your_dumpfile} and {your_schema} placeholders:
DECLARE
  hdnl NUMBER;
BEGIN
  -- Open a schema-level export job
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => NULL);
  -- Write the dump file and the log file to DATA_PUMP_DIR
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => '{your_dumpfile}', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Limit the export to the given schema
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''{your_schema}'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
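  • Note that START_JOB returns immediately and the export continues asynchronously. If you would rather have the session block until the export finishes, a minimal sketch of a variant (same placeholders as above) can call DBMS_DATAPUMP.WAIT_FOR_JOB right after starting the job:
DECLARE
  hdnl      NUMBER;
  job_state VARCHAR2(30);
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => '{your_dumpfile}', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''{your_schema}'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
  -- Block until the Data Pump job completes or stops
  DBMS_DATAPUMP.WAIT_FOR_JOB(hdnl, job_state);
END;
/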
  • Otherwise, you can check whether the export job has completed using the query below:
SELECT owner_name, job_name, operation, job_mode, degree, state FROM dba_datapump_jobs WHERE state = 'EXECUTING';
  • Once the above query returns zero records, the export has finished. You can optionally review the export log, as shown below, before uploading the dump files.
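  • To confirm the export actually succeeded (and not just finished), read the Data Pump log written next to the dump file. This assumes the log file name exp.log used in the export block above:
SELECT text FROM TABLE(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR', 'exp.log'));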

  • Use the query below to upload the dump files from DATA_PUMP_DIR to the S3 bucket your_oracle_dump_bucket (with an empty p_prefix, every file in the directory is uploaded):

SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
      p_bucket_name    =>  'your_oracle_dump_bucket', 
      p_prefix         =>  '', 
      p_s3_prefix      =>  '', 
      p_directory_name =>  'DATA_PUMP_DIR') 
AS TASK_ID FROM DUAL;
  • Take the task_id returned by the previous query and run the query below with that task_id to check the status of the upload:
SELECT text FROM TABLE(rdsadmin.rds_file_util.read_text_file('BDUMP', 'dbtask-{task_id}.log'));
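  • If you have lost track of the task_id, you can usually locate the task logs by listing the BDUMP directory (a sketch; it assumes rds_file_util.listdir can read BDUMP the same way it reads DATA_PUMP_DIR):
SELECT filename, mtime FROM TABLE(rdsadmin.rds_file_util.listdir('BDUMP')) WHERE filename LIKE 'dbtask-%' ORDER BY mtime DESC;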
  • Once the dump files have been transferred, delete the .dmp files by following the Clean Up Dump Files section below.

Clean Up Dump Files

  • After the import/export activity, the dump files must be deleted to reclaim storage on the instance.
  • The following query lists all files in DATA_PUMP_DIR:
SELECT * FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR')) ORDER BY MTIME;
  • The following block deletes a specific FILENAME from the list above:
BEGIN
  -- Remove the named file from DATA_PUMP_DIR
  UTL_FILE.FREMOVE('DATA_PUMP_DIR', '{FILENAME}');
END;
/
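  • If there are several dump files to remove, the sketch below (assuming every .dmp file in DATA_PUMP_DIR has already been uploaded to S3 and is safe to delete) combines the listing and removal calls:
BEGIN
  -- Loop over every .dmp file currently present in DATA_PUMP_DIR and remove it
  FOR f IN (SELECT filename
              FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR'))
             WHERE LOWER(filename) LIKE '%.dmp')
  LOOP
    UTL_FILE.FREMOVE('DATA_PUMP_DIR', f.filename);
  END LOOP;
END;
/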