Nagarjuna N nags28

import org.jenkinsci.plugins.credentialsbinding.impl.CredentialNotFoundException
import hudson.util.Secret
import com.cloudbees.plugins.credentials.CredentialsScope
import org.jenkinsci.plugins.plaincredentials.impl.StringCredentialsImpl
import com.cloudbees.plugins.credentials.domains.Domain
node {
    try {
        withCredentials([
@nags28
nags28 / combineS3Files.py
Created March 4, 2020 09:17 — forked from jasonrdsouza/combineS3Files.py
Python script to efficiently concatenate S3 files
'''
This script performs efficient concatenation of files stored in S3. Given a
folder, output location, and optional suffix, all files with the given suffix
will be concatenated into one file stored in the output location.
Concatenation is performed within S3 when possible, falling back to local
operations when necessary.
Run `python combineS3Files.py -h` for more info.
'''
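The script's implementation is truncated in this preview. As a hedged sketch of the server-side technique the docstring describes (not the gist's own code), the following uses boto3's multipart upload_part_copy to stitch objects together without downloading them. Bucket and key names are placeholders, and parts other than the last must be at least 5 MB, which is why small files force the local fallback the docstring mentions.

import boto3

s3 = boto3.client('s3')

def concat_in_s3(bucket, source_keys, dest_key):
    # Start a multipart upload whose parts are copied server-side from
    # the source objects, so no bytes travel through this machine.
    upload = s3.create_multipart_upload(Bucket=bucket, Key=dest_key)
    parts = []
    for number, key in enumerate(source_keys, start=1):
        # Every part except the last must be >= 5 MB, hence the gist's
        # fallback to local concatenation for small files.
        result = s3.upload_part_copy(
            Bucket=bucket, Key=dest_key,
            UploadId=upload['UploadId'], PartNumber=number,
            CopySource={'Bucket': bucket, 'Key': key},
        )
        parts.append({'PartNumber': number,
                      'ETag': result['CopyPartResult']['ETag']})
    s3.complete_multipart_upload(
        Bucket=bucket, Key=dest_key, UploadId=upload['UploadId'],
        MultipartUpload={'Parts': parts},
    )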
@nags28
nags28 / get_cloudwatch_logs.py
Created February 12, 2020 11:29 — forked from eldondevcg/get_cloudwatch_logs.py
Pull down cloudwatch logs with boto
# IF YOU INCUR HUGE COSTS WITH THIS OR IT BREAKS, DON'T BLAME ME License
# This is a throw-away script I wrote to pull the JSON events for all of the streams in a CloudWatch log group.
# For some reason, the naive way to set up VPC network logging writes to a different stream in the CloudWatch
# log group for each network interface.
# That is great for diagnosing lots of things and generating verbose logs, but for the broad-stroke analysis I
# was doing, all I really wanted was the basic data. This would have been easier if I had logged to S3, but I
# did not see a way to do that in 2 clicks.
group_name = 'CHANGEME'
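The rest of the script is cut off in this preview. As a minimal sketch of the same idea using boto3 (the gist itself uses the older boto library, so none of these calls are the gist's own): enumerate every stream in the log group and print its events as JSON. 'CHANGEME' is the gist's placeholder group name, and per-stream event pagination is simplified.

import json
import boto3

logs = boto3.client('logs')
group_name = 'CHANGEME'  # placeholder from the gist

# Walk every stream in the group; describe_log_streams is paginated.
paginator = logs.get_paginator('describe_log_streams')
for page in paginator.paginate(logGroupName=group_name):
    for stream in page['logStreams']:
        # Simplified: a long stream would need nextForwardToken paging.
        events = logs.get_log_events(
            logGroupName=group_name,
            logStreamName=stream['logStreamName'],
            startFromHead=True,
        )
        for event in events['events']:
            print(json.dumps(event))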
@nags28
nags28 / Jenkinsfile
Created February 7, 2020 09:42 — forked from oifland/Jenkinsfile
Loops in Jenkinsfiles
// Related to https://issues.jenkins-ci.org/browse/JENKINS-26481
abcs = ['a', 'b', 'c']
node('master') {
    stage('Test 1: loop of echo statements') {
        echo_all(abcs)
    }
    stage('Test 2: loop of sh commands') {
@nags28
nags28 / checkDockerDisks.sh
Created January 31, 2020 06:27 — forked from robsonke/checkDockerDisks.sh
This Bash script loops through all running Docker containers on a host and lists the disk usage per mount. If any mount exceeds 65% usage, it emails you.
#!/bin/bash
# get all running docker container names
containers=$(sudo docker ps | awk '{if(NR>1) print $NF}')
host=$(hostname)
# loop through all containers
for container in $containers
do
  echo "Container: $container"
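The preview cuts off here. As a hedged Python rendering of the same check (the gist itself is Bash, and this is not its code): list the running containers, run df -P inside each one, and flag mounts above the gist's 65% threshold; the email step is left out.

import subprocess

THRESHOLD = 65  # percent, matching the gist

def running_containers():
    # Equivalent to the gist's `docker ps | awk ...` name extraction.
    out = subprocess.check_output(
        ['docker', 'ps', '--format', '{{.Names}}'], text=True)
    return out.split()

def check_disks(container):
    # 'df -P' prints POSIX columns: filesystem, blocks, used,
    # available, capacity, mount point.
    out = subprocess.check_output(
        ['docker', 'exec', container, 'df', '-P'], text=True)
    for line in out.splitlines()[1:]:
        fields = line.split()
        usage = int(fields[4].rstrip('%'))
        if usage > THRESHOLD:
            print(f'{container}: {fields[5]} at {usage}%')

for container in running_containers():
    check_disks(container)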