# Oracle RDS Data Dump with S3

## Prerequisites for S3 Integration with RDS
- Create an S3 bucket of your choice, e.g. `your_oracle_dump_bucket`. Note the bucket name; it is needed when creating the IAM policy below.
- Create the IAM policy and role as described in the [AWS documentation](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html):
  - Create the policy
  - Create the role
- Go to `RDS > Databases > {yourdatabase} > Connectivity`, and under `Manage IAM roles` select the role created above, choose `S3_INTEGRATION` as the feature, and click `Add role`.
- Create an option group (`RDS > Option Groups > Create Group`) or select an existing option group and click `Add Option`. Choose `S3_INTEGRATION` as the Option Name, Version `1.0`, select `Apply Immediately`, and click `Add Option`.
- If you created a new option group, modify the database to attach it, apply immediately, and wait until the database becomes available again.
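- (Optional) You can confirm that the `S3_INTEGRATION` option is active by checking whether the `rdsadmin` S3 package is visible to your user. This is a quick sanity check, not a step from the AWS guide:
```sql
-- If S3 integration is enabled, the RDSADMIN_S3_TASKS package should be listed
SELECT owner, object_name, object_type
  FROM all_objects
 WHERE owner = 'RDSADMIN'
   AND object_name LIKE 'RDSADMIN_S3%';
```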

## Export RDS dump to S3
- Log in as the master user of the RDS database.
- Run the block below to create an export dump file, replacing the `{your_dumpfile}` and `{your_schema}` placeholders:
```sql
DECLARE
  hdnl NUMBER;
BEGIN
  -- Open a schema-mode Data Pump export job and keep its handle
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => null);
  -- The dump file and log file are written to DATA_PUMP_DIR on the RDS instance
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => '{your_dumpfile}', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  -- Export only the schema you are interested in
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''{your_schema}'')');
  -- Start the job; it runs asynchronously in the background
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
```
- You can check whether the job has completed using the query below:
```sql
SELECT owner_name, job_name, operation, job_mode, degree, state FROM dba_datapump_jobs WHERE state = 'EXECUTING';
```
- Once the above query returns zero rows, the export is finished and you can move on to uploading the dump file to S3.
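- Optionally, you can also read the Data Pump log (the `exp.log` added in the block above) straight from `DATA_PUMP_DIR` to check progress or errors. This uses the same `rds_file_util.read_text_file` helper that is used later for the S3 task status:
```sql
-- Reads the Data Pump export log written to DATA_PUMP_DIR
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR', 'exp.log'));
```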

- Use the query below to upload all files in `DATA_PUMP_DIR` to the S3 bucket `your_oracle_dump_bucket`:
```sql
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
    p_bucket_name    => 'your_oracle_dump_bucket',
    p_prefix         => '',               -- only upload files whose names start with this prefix ('' = all files)
    p_s3_prefix      => '',               -- key prefix (folder) to use inside the S3 bucket
    p_directory_name => 'DATA_PUMP_DIR')
  AS task_id FROM DUAL;
```
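- If you only want to upload the dump file rather than everything in `DATA_PUMP_DIR`, `p_prefix` can filter by file-name prefix. This is the same call, sketched here under the assumption that the dump file name starts with `{your_dumpfile}`:
```sql
-- Uploads only files whose names start with '{your_dumpfile}' (e.g. the .dmp file itself)
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
    p_bucket_name    => 'your_oracle_dump_bucket',
    p_prefix         => '{your_dumpfile}',
    p_s3_prefix      => '',
    p_directory_name => 'DATA_PUMP_DIR')
  AS task_id FROM DUAL;
```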
- Take the `task_id` returned by the previous query and run the query below (substituting the `task_id`) to check the upload status:
```sql
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP', 'dbtask-{task_id}.log'));
```
- Once the dump file has been transferred, delete the `.dmp` file as described in [Cleanup](#cleanup-dump-files).

## Cleanup Dump files
- After the import/export activity, the dump files must be deleted.
- The following query lists all files in `DATA_PUMP_DIR`:
```sql
SELECT * FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR')) ORDER BY MTIME;
```
- The following block deletes a specific `FILENAME` from the listing above:
```sql
BEGIN
  UTL_FILE.FREMOVE('DATA_PUMP_DIR', '{FILENAME}');
END;
/
```
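- If the export produced several dump pieces, a short PL/SQL loop over `listdir` can remove them in one go. This is a sketch that assumes you want to delete every `.dmp` file in `DATA_PUMP_DIR`:
```sql
-- Sketch: remove every .dmp file currently sitting in DATA_PUMP_DIR
BEGIN
  FOR f IN (SELECT filename
              FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR'))
             WHERE filename LIKE '%.dmp') LOOP
    UTL_FILE.FREMOVE('DATA_PUMP_DIR', f.filename);
  END LOOP;
END;
/
```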