@thekensta
Created October 14, 2015 20:27
spark_s3a_instructions.sh
# For a local environment
# Install hadoop and apache-spark via Homebrew:
#   brew install hadoop apache-spark

# Apache Spark conf file:
# libexec/conf/spark-defaults.conf
# Make the Hadoop AWS jars available to Spark
# (paths below are for Hadoop 2.7.1; adjust versions/paths to your install)
spark.executor.extraClassPath /usr/local/Cellar/hadoop/2.7.1/libexec/share/hadoop/tools/lib/aws-java-sdk-1.7.4.jar:/usr/local/Cellar/hadoop/2.7.1/libexec/share/hadoop/tools/lib/hadoop-aws-2.7.1.jar
spark.driver.extraClassPath /usr/local/Cellar/hadoop/2.7.1/libexec/share/hadoop/tools/lib/aws-java-sdk-1.7.4.jar:/usr/local/Cellar/hadoop/2.7.1/libexec/share/hadoop/tools/lib/hadoop-aws-2.7.1.jar
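The jar versions and Cellar paths above are pinned to Hadoop 2.7.1; a quick way to find the correct jars on your own machine (a sketch, assuming a Homebrew Hadoop install) is:

```shell
# Locate the AWS jars shipped with the installed Hadoop
# ("brew --prefix hadoop" resolves the current versioned Cellar path)
HADOOP_LIB="$(brew --prefix hadoop)/libexec/share/hadoop/tools/lib"
ls "$HADOOP_LIB"/hadoop-aws-*.jar "$HADOOP_LIB"/aws-java-sdk-*.jar
```

Join the two paths it prints with `:` to build the `extraClassPath` values.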

# Add the following file:
# libexec/conf/hdfs-site.xml
# See: http://stackoverflow.com/questions/30262567/unable-to-load-aws-credentials-when-using-spark-sql-through-beeline
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.s3a.access.key</name>
    <value>xxx</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>xxx</value>
  </property>
</configuration>
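If you manage several environments, the hdfs-site.xml above can also be generated by a short script; a minimal sketch using only the Python standard library (the `make_hdfs_site` helper and its arguments are illustrative, and the `xxx` placeholders remain yours to fill in):

```python
import xml.etree.ElementTree as ET

def make_hdfs_site(access_key, secret_key):
    """Build the hdfs-site.xml body holding the S3A credential properties."""
    conf = ET.Element("configuration")
    for name, value in [("fs.s3a.access.key", access_key),
                        ("fs.s3a.secret.key", secret_key)]:
        prop = ET.SubElement(conf, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    return ET.tostring(conf, encoding="unicode")

# Example: print the file body, ready to save as libexec/conf/hdfs-site.xml
print('<?xml version="1.0"?>\n' + make_hdfs_site("xxx", "xxx"))
```

Keep real keys out of version control; this file should be local-only.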