@avoidik
Last active May 10, 2025 19:13
Revisions

  1. avoidik revised this gist May 9, 2025. 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion repro.sh
@@ -45,7 +45,7 @@ eval $(aws configure export-credentials --format env)
   export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN AWS_CREDENTIAL_EXPIRATION

   hyperfine \
-    --prepare "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob ; dd if=/dev/urandom bs=1024 count=1000000 of=${SOURCE_FILE_PATH} conv=notrunc status=none" \
+    --prepare "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob ; dd if=/dev/urandom bs=1024 count=10000000 of=${SOURCE_FILE_PATH} conv=notrunc status=none" \
     --cleanup "rm -f ${SOURCE_FILE_PATH}" \
     --conclude "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
     --min-runs 5 \
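For scale: dd writes bs × count bytes, so this revision grows the generated test file from roughly 1 GiB to roughly 9.5 GiB. A quick check of the arithmetic (my calculation, not part of the gist):

```ruby
# Bytes written by each --prepare line: bs * count.
old_bytes = 1024 * 1_000_000
new_bytes = 1024 * 10_000_000

# Convert to GiB for readability.
puts (old_bytes / 1024.0**3).round(2)  # 0.95
puts (new_bytes / 1024.0**3).round(2)  # 9.54
```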
  2. avoidik revised this gist May 9, 2025. 1 changed file with 5 additions and 0 deletions.
5 changes: 5 additions & 0 deletions repro.sh
@@ -10,6 +10,11 @@ if [[ -z "$AWS_REGION" ]]; then
     exit 1
   fi

+ if [[ -z "$GEM_HOME" ]]; then
+   echo "Error: please set GEM_HOME first!"
+   exit 1
+ fi
+
  commands_sanity=("ruby" "bundle" "aws" "dd" "hyperfine")

  for cmd_var in "${commands_sanity[@]}"; do
  3. avoidik revised this gist May 9, 2025. 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions app.rb
@@ -24,14 +24,13 @@ def initialize
   def upload
     file = File.basename(SOURCE_FILE_PATH)
     target_path = File.join(TARGET_DIRECTORY, file)
-    multipart_chunk_size = CHUNK_SIZE

     File.open(SOURCE_FILE_PATH, 'r') do |local_file|
       options = {
         key: target_path,
         body: local_file,
         public: false,
-        multipart_chunk_size: multipart_chunk_size,
+        multipart_chunk_size: CHUNK_SIZE,
         concurrency: CONCURRENCY,
       }
       @directory.files.create(options)
  4. avoidik revised this gist May 9, 2025. 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions repro.sh
@@ -41,7 +41,7 @@ export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN AWS_CREDENTIAL_

   hyperfine \
     --prepare "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob ; dd if=/dev/urandom bs=1024 count=1000000 of=${SOURCE_FILE_PATH} conv=notrunc status=none" \
-    --cleanup "rm -f $SOURCE_FILE_PATH" \
+    --cleanup "rm -f ${SOURCE_FILE_PATH}" \
     --conclude "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
     --min-runs 5 \
     --parameter-list test_chunk_size 5242880,10485760,104857600,1073741824 \
@@ -73,7 +73,7 @@ hyperfine \
     'CHUNK_SIZE={test_chunk_size} CONCURRENCY={concurrency} ruby app.rb' \
     "aws configure set s3.multipart_chunksize {test_chunk_size} ; \
      aws configure set s3.max_concurrent_requests {concurrency} ; \
-     aws s3 cp $SOURCE_FILE_PATH s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
+     aws s3 cp ${SOURCE_FILE_PATH} s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
     --export-markdown benchmark.md

  # hyperfine \
  5. avoidik revised this gist May 8, 2025. 1 changed file with 23 additions and 15 deletions.
38 changes: 23 additions & 15 deletions repro.sh
@@ -47,31 +47,39 @@ hyperfine \
     --parameter-list test_chunk_size 5242880,10485760,104857600,1073741824 \
     --parameter-list concurrency 1,5,10 \
     --command-name 'measure-transfer-rate-fog-5mb-1' \
-    --command-name 'measure-transfer-rate-fog-5mb-5' \
-    --command-name 'measure-transfer-rate-fog-5mb-10' \
+    --command-name 'measure-transfer-rate-awscli-5mb-1' \
     --command-name 'measure-transfer-rate-fog-10mb-1' \
-    --command-name 'measure-transfer-rate-fog-10mb-5' \
-    --command-name 'measure-transfer-rate-fog-10mb-10' \
+    --command-name 'measure-transfer-rate-awscli-10mb-1' \
     --command-name 'measure-transfer-rate-fog-100mb-1' \
-    --command-name 'measure-transfer-rate-fog-100mb-5' \
-    --command-name 'measure-transfer-rate-fog-100mb-10' \
+    --command-name 'measure-transfer-rate-awscli-100mb-1' \
     --command-name 'measure-transfer-rate-fog-1000mb-1' \
-    --command-name 'measure-transfer-rate-fog-1000mb-5' \
-    --command-name 'measure-transfer-rate-fog-1000mb-10' \
-    --command-name 'measure-transfer-rate-awscli-5mb-1' \
+    --command-name 'measure-transfer-rate-awscli-1000mb-1' \
+    --command-name 'measure-transfer-rate-fog-5mb-5' \
     --command-name 'measure-transfer-rate-awscli-5mb-5' \
-    --command-name 'measure-transfer-rate-awscli-5mb-10' \
-    --command-name 'measure-transfer-rate-awscli-10mb-1' \
+    --command-name 'measure-transfer-rate-fog-10mb-5' \
     --command-name 'measure-transfer-rate-awscli-10mb-5' \
-    --command-name 'measure-transfer-rate-awscli-10mb-10' \
-    --command-name 'measure-transfer-rate-awscli-100mb-1' \
+    --command-name 'measure-transfer-rate-fog-100mb-5' \
     --command-name 'measure-transfer-rate-awscli-100mb-5' \
-    --command-name 'measure-transfer-rate-awscli-100mb-10' \
-    --command-name 'measure-transfer-rate-awscli-1000mb-1' \
+    --command-name 'measure-transfer-rate-fog-1000mb-5' \
     --command-name 'measure-transfer-rate-awscli-1000mb-5' \
+    --command-name 'measure-transfer-rate-fog-5mb-10' \
+    --command-name 'measure-transfer-rate-awscli-5mb-10' \
+    --command-name 'measure-transfer-rate-fog-10mb-10' \
+    --command-name 'measure-transfer-rate-awscli-10mb-10' \
+    --command-name 'measure-transfer-rate-fog-100mb-10' \
+    --command-name 'measure-transfer-rate-awscli-100mb-10' \
+    --command-name 'measure-transfer-rate-fog-1000mb-10' \
     --command-name 'measure-transfer-rate-awscli-1000mb-10' \
     'CHUNK_SIZE={test_chunk_size} CONCURRENCY={concurrency} ruby app.rb' \
     "aws configure set s3.multipart_chunksize {test_chunk_size} ; \
      aws configure set s3.max_concurrent_requests {concurrency} ; \
      aws s3 cp $SOURCE_FILE_PATH s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
     --export-markdown benchmark.md
+
+ # hyperfine \
+ #   --show-output \
+ #   --runs 1 \
+ #   --parameter-list test_chunk_size 5242880,10485760,104857600,1073741824 \
+ #   --parameter-list concurrency 1,5,10 \
+ #   'echo CHUNK_SIZE={test_chunk_size} CONCURRENCY={concurrency} command 1 ; sleep 1 ; exit 0' \
+ #   'echo CHUNK_SIZE={test_chunk_size} CONCURRENCY={concurrency} command 2 ; sleep 1 ; exit 0'
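The reshuffled `--command-name` list above appears to track how hyperfine pairs two commands with the cross product of two parameter lists: concurrency varies slowest, then chunk size, with the fog and awscli commands alternating. A sketch of that enumeration (the nesting order is my inference from the new name sequence, not taken from hyperfine's documentation):

```ruby
# Enumerate the 24 benchmark names: concurrency slowest, then chunk size,
# then the two commands (fog, awscli) alternating.
labels = { 5_242_880 => '5mb', 10_485_760 => '10mb',
           104_857_600 => '100mb', 1_073_741_824 => '1000mb' }

names = [1, 5, 10].flat_map do |conc|
  labels.values.flat_map do |size|
    %w[fog awscli].map { |tool| "measure-transfer-rate-#{tool}-#{size}-#{conc}" }
  end
end

puts names.first  # measure-transfer-rate-fog-5mb-1
puts names.last   # measure-transfer-rate-awscli-1000mb-10
```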
  6. avoidik created this gist May 8, 2025.
3 changes: 3 additions & 0 deletions Gemfile
@@ -0,0 +1,3 @@
source 'https://rubygems.org'

gem 'fog-aws', '~> 3.26'
45 changes: 45 additions & 0 deletions app.rb
@@ -0,0 +1,45 @@
require 'fog/aws'

BUCKET_NAME = ENV['BUCKET_NAME']
SOURCE_FILE_PATH = ENV['SOURCE_FILE_PATH']
TARGET_DIRECTORY = ENV['TARGET_DIRECTORY']
STORAGE_REGION = ENV['STORAGE_REGION'] || 'eu-central-1'
CHUNK_SIZE = (ENV['CHUNK_SIZE'] || 104_857_600).to_i
CONCURRENCY = (ENV['CONCURRENCY'] || 10).to_i

class S3Backup
  def initialize
    @connection = Fog::Storage.new(
      provider: 'AWS',
      use_iam_profile: false,
      aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
      aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
      aws_session_token: ENV['AWS_SESSION_TOKEN'],
      aws_credentials_expire_at: ENV['AWS_CREDENTIAL_EXPIRATION'],
      region: STORAGE_REGION,
    )
    @directory = @connection.directories.get(BUCKET_NAME) || @connection.directories.create(key: BUCKET_NAME)
  end

  def upload
    file = File.basename(SOURCE_FILE_PATH)
    target_path = File.join(TARGET_DIRECTORY, file)
    multipart_chunk_size = CHUNK_SIZE

    File.open(SOURCE_FILE_PATH, 'r') do |local_file|
      options = {
        key: target_path,
        body: local_file,
        public: false,
        multipart_chunk_size: multipart_chunk_size,
        concurrency: CONCURRENCY,
      }
      @directory.files.create(options)
    end

    puts 'Upload completed successfully.'
  end
end

backup = S3Backup.new
backup.upload
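One subtlety when reading numeric settings from `ENV`: in Ruby, `nil.to_i` is `0` and every Integer (0 included) is truthy, so an expression of the form `ENV['CHUNK_SIZE'].to_i || default` can never fall back to the default. The fallback has to wrap the raw string before the conversion. A minimal demonstration:

```ruby
# `.to_i || default` never falls back: nil.to_i is 0, and 0 is truthy in Ruby.
broken = nil.to_i || 104_857_600
puts broken  # 0

# Apply the default to the raw value first, then convert.
fixed = (nil || '104857600').to_i
puts fixed   # 104857600
```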
77 changes: 77 additions & 0 deletions repro.sh
@@ -0,0 +1,77 @@
#!/usr/bin/env bash

if [[ -z "$AWS_PROFILE" ]]; then
  echo "Error: please set AWS_PROFILE first!"
  exit 1
fi

if [[ -z "$AWS_REGION" ]]; then
  echo "Error: please set AWS_REGION first!"
  exit 1
fi

commands_sanity=("ruby" "bundle" "aws" "dd" "hyperfine")

for cmd_var in "${commands_sanity[@]}"; do
  if ! command -v "$cmd_var" &> /dev/null ; then
    echo "Error: '$cmd_var' is not installed!"
    exit 1
  fi
done

bundle install

TARGET_ACCOUNT="$(aws sts get-caller-identity --output text --query 'Account')"
BUCKET_NAME="transfer-test-in-${TARGET_ACCOUNT}"

if ! aws s3api head-bucket --bucket "$BUCKET_NAME" 2>/dev/null ; then
  aws s3api create-bucket --bucket "$BUCKET_NAME" \
    --create-bucket-configuration LocationConstraint="$AWS_REGION"
  aws s3api put-bucket-encryption --bucket "$BUCKET_NAME" \
    --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'
fi

SOURCE_FILE_PATH="${PWD}/file.blob"

export BUCKET_NAME="$BUCKET_NAME" SOURCE_FILE_PATH="$SOURCE_FILE_PATH" TARGET_DIRECTORY="transfer" STORAGE_REGION="$AWS_REGION"

eval "$(aws configure export-credentials --format env)"

export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN AWS_CREDENTIAL_EXPIRATION

hyperfine \
  --prepare "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob ; dd if=/dev/urandom bs=1024 count=1000000 of=${SOURCE_FILE_PATH} conv=notrunc status=none" \
  --cleanup "rm -f $SOURCE_FILE_PATH" \
  --conclude "aws s3 rm s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
  --min-runs 5 \
  --parameter-list test_chunk_size 5242880,10485760,104857600,1073741824 \
  --parameter-list concurrency 1,5,10 \
  --command-name 'measure-transfer-rate-fog-5mb-1' \
  --command-name 'measure-transfer-rate-fog-5mb-5' \
  --command-name 'measure-transfer-rate-fog-5mb-10' \
  --command-name 'measure-transfer-rate-fog-10mb-1' \
  --command-name 'measure-transfer-rate-fog-10mb-5' \
  --command-name 'measure-transfer-rate-fog-10mb-10' \
  --command-name 'measure-transfer-rate-fog-100mb-1' \
  --command-name 'measure-transfer-rate-fog-100mb-5' \
  --command-name 'measure-transfer-rate-fog-100mb-10' \
  --command-name 'measure-transfer-rate-fog-1000mb-1' \
  --command-name 'measure-transfer-rate-fog-1000mb-5' \
  --command-name 'measure-transfer-rate-fog-1000mb-10' \
  --command-name 'measure-transfer-rate-awscli-5mb-1' \
  --command-name 'measure-transfer-rate-awscli-5mb-5' \
  --command-name 'measure-transfer-rate-awscli-5mb-10' \
  --command-name 'measure-transfer-rate-awscli-10mb-1' \
  --command-name 'measure-transfer-rate-awscli-10mb-5' \
  --command-name 'measure-transfer-rate-awscli-10mb-10' \
  --command-name 'measure-transfer-rate-awscli-100mb-1' \
  --command-name 'measure-transfer-rate-awscli-100mb-5' \
  --command-name 'measure-transfer-rate-awscli-100mb-10' \
  --command-name 'measure-transfer-rate-awscli-1000mb-1' \
  --command-name 'measure-transfer-rate-awscli-1000mb-5' \
  --command-name 'measure-transfer-rate-awscli-1000mb-10' \
  'CHUNK_SIZE={test_chunk_size} CONCURRENCY={concurrency} ruby app.rb' \
  "aws configure set s3.multipart_chunksize {test_chunk_size} ; \
   aws configure set s3.max_concurrent_requests {concurrency} ; \
   aws s3 cp $SOURCE_FILE_PATH s3://${BUCKET_NAME}/${TARGET_DIRECTORY}/file.blob" \
  --export-markdown benchmark.md
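For context on the tested chunk sizes (my arithmetic, not part of the gist): 5,242,880 bytes is S3's minimum multipart part size, and a multipart upload is capped at 10,000 parts, so the part count each chunk size implies for the ~1 GiB test file this initial version generates is worth a sanity check:

```ruby
# Parts needed per tested multipart chunk size for the ~1 GiB test file
# (dd bs=1024 count=1000000). S3 caps a multipart upload at 10,000 parts.
file_bytes = 1024 * 1_000_000

[5_242_880, 10_485_760, 104_857_600, 1_073_741_824].each do |chunk|
  parts = (file_bytes.to_f / chunk).ceil
  puts "#{chunk} bytes/part -> #{parts} parts"
end
```

All four sizes stay far below the 10,000-part cap; the 1 GiB chunk reduces the transfer to a single part, i.e. effectively a non-multipart upload.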