@iambudi
Forked from viecode09/Hadoop_install_osx.md
Last active December 22, 2024 03:22

Revisions

  1. iambudi revised this gist Apr 2, 2019. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion Hadoop_install_osx.md
    @@ -29,7 +29,7 @@ Edit Core-site.xml, The file can be located at `/usr/local/Cellar/hadoop/3.1.1/l
    </property>
    ```

    Edit mapred-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/mapred-site.xml and by default will be blank add below config
    Edit mapred-site.xml. The file can be located at `/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/mapred-site.xml`; it is blank by default, so add the config below.
    ```
    <configuration>
    <property>
  2. iambudi revised this gist Apr 2, 2019. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion Hadoop_install_osx.md
    @@ -47,7 +47,7 @@ Edit hdfs-site.xml, The file can be located at `/usr/local/Cellar/hadoop/3.1.1/l
    </property>
    </configuration>
    ```
    To simplify life edit a ~/.profile or ~/.zshrc and add the following commands. By default ~/.profile or ~/.zshrc might not exist.
    To simplify life, edit `~/.profile`, `~/.zshrc`, or whichever shell profile you use, and add the following commands. By default the profile file might not exist.

    ```
    alias hstart="/usr/local/Cellar/hadoop/3.1.1/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/3.1.1/sbin/start-yarn.sh"
  3. iambudi revised this gist Apr 2, 2019. 1 changed file with 5 additions and 5 deletions.
    10 changes: 5 additions & 5 deletions Hadoop_install_osx.md
    @@ -8,14 +8,14 @@ $ brew search hadoop
    $ brew install hadoop
    ```

    Hadoop will be installed at path /usr/local/Cellar/hadoop
    Hadoop will be installed at path `/usr/local/Cellar/hadoop`

    STEP 3: Configure Hadoop:

    Edit hadoop-env.sh, the file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh where 2.6.0 is the hadoop version. Change the line
    Edit hadoop-env.sh. The file can be located at `/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/hadoop-env.sh`, where 3.1.1 is the Hadoop version. Change the line.


    Edit Core-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/core-site.xml add between `<configuration></configuration>`
    Edit core-site.xml. The file can be located at `/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/core-site.xml`; add the following between `<configuration></configuration>`

    ```
    <property>
    @@ -38,7 +38,7 @@ Edit mapred-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/
    </property>
    </configuration>
    ```
    Edit hdfs-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/hdfs-site.xml add
    Edit hdfs-site.xml. The file can be located at `/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/hdfs-site.xml`; add:
    ```
    <configuration>
    <property>
    @@ -63,7 +63,7 @@ Before running Hadoop format HDFS
    ```$ hdfs namenode -format```


    STEP 4: To verify if SSH Localhost is working check for files ~/.ssh/id_rsa and the ~/.ssh/id_rsa.pub files.
    STEP 4: To verify that SSH to localhost is working, check for the files `~/.ssh/id_rsa` and `~/.ssh/id_rsa.pub`.
    If they don’t exist generate the keys using below command

    ```$ ssh-keygen -t rsa``` :: Just press enter if asking for password
  4. iambudi revised this gist Apr 2, 2019. 1 changed file with 13 additions and 14 deletions.
    27 changes: 13 additions & 14 deletions Hadoop_install_osx.md
    @@ -15,11 +15,8 @@ STEP 3: Configure Hadoop:
    Edit hadoop-env.sh, the file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh where 2.6.0 is the hadoop version. Change the line


    ```export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"```
    to
    Edit Core-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/core-site.xml add between `<configuration></configuration>`

    ```export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc=" ```
    Edit Core-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/core-site.xml add below config
    ```
    <property>
    <name>hadoop.tmp.dir</name>
    @@ -32,7 +29,7 @@ Edit Core-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/li
    </property>
    ```

    Edit mapred-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/mapred-site.xml and by default will be blank add below config
    Edit mapred-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/mapred-site.xml and by default will be blank add below config
    ```
    <configuration>
    <property>
    @@ -41,7 +38,7 @@ Edit mapred-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/
    </property>
    </configuration>
    ```
    Edit hdfs-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hdfs-site.xml add
    Edit hdfs-site.xml, The file can be located at /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/hdfs-site.xml add
    ```
    <configuration>
    <property>
    @@ -50,32 +47,34 @@ Edit hdfs-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/li
    </property>
    </configuration>
    ```
    To simplify life edit a ~/.profile and add the following commands. By default ~/.profile might not exist.
    To simplify life edit a ~/.profile or ~/.zshrc and add the following commands. By default ~/.profile or ~/.zshrc might not exist.

    ```
    alias hstart=<"/usr/local/Cellar/hadoop/2.6.0/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/start-yarn.sh">
    alias hstop=<"/usr/local/Cellar/hadoop/2.6.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/stop-dfs.sh">
    alias hstart="/usr/local/Cellar/hadoop/3.1.1/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/3.1.1/sbin/start-yarn.sh"
    alias hstop="/usr/local/Cellar/hadoop/3.1.1/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/3.1.1/sbin/stop-dfs.sh"
    ```

    and source it

    ```$ source ~/.profile```
    ```$ source ~/.profile``` or ```$ source ~/.zshrc```

    Before running Hadoop format HDFS

    ```$ hdfs namenode -format```


    STEP 4: To verify if SSH Localhost is working check for files ~/.ssh/id_rsa and the ~/.ssh/id_rsa.pub files. If they don’t exist generate the keys using below command
    STEP 4: To verify if SSH Localhost is working check for files ~/.ssh/id_rsa and the ~/.ssh/id_rsa.pub files.
    If they don’t exist generate the keys using below command

    ```$ ssh-keygen -t rsa```
    ```$ ssh-keygen -t rsa``` :: Just press enter if asking for password

    Enable Remote Login: open Mac “System Preferences” -> “Sharing”. Check “Remote Login”

    Enable Remote Login: “System Preferences” -> “Sharing”. Check “Remote Login”
    Authorize SSH Keys: To allow your system to accept login, we have to make it aware of the keys that will be used

    ```$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys```

    Test login.
    Test login. It should not prompt for any password; otherwise, when running Hadoop it will display the error `Permission denied (publickey,password,keyboard-interactive)`

    ```
    $ ssh localhost
  5. @viecode09 viecode09 created this gist Mar 18, 2017.
    92 changes: 92 additions & 0 deletions Hadoop_install_osx.md
    @@ -0,0 +1,92 @@
    STEP 1: First Install HomeBrew, download it from http://brew.sh

    ```$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"```

    STEP 2: Install Hadoop
    ```
    $ brew search hadoop
    $ brew install hadoop
    ```

    Hadoop will be installed at path /usr/local/Cellar/hadoop
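
    As a quick sanity check (a sketch, assuming Homebrew's default `/usr/local` prefix and that brew linked the `hadoop` command onto the PATH), confirm the install:

    ```
    # Assumes Homebrew's default prefix; the version directory name will vary
    brew --prefix hadoop          # prints the linked install prefix
    ls /usr/local/Cellar/hadoop   # lists the installed version directory, e.g. 2.6.0
    hadoop version                # confirms the hadoop CLI resolves on the PATH
    ```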

    STEP 3: Configure Hadoop:

    Edit hadoop-env.sh, the file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh where 2.6.0 is the hadoop version. Change the line


    ```export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"```
    to

    ```export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc=" ```
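
    To confirm the change was saved (an illustrative check; the 2.6.0 path follows this guide and will differ for other Hadoop versions):

    ```
    # Print the HADOOP_OPTS line from the edited file to verify the extra krb5 flags are present
    grep -n "HADOOP_OPTS" /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh
    ```
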
    Edit Core-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/core-site.xml add below config
    ```
    <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
    <description>A base for other temporary directories.</description>
    </property>
    <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
    </property>
    ```
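
    The `hadoop.tmp.dir` value above points to a directory that may not exist yet; creating it up front (an extra step not in the original guide) avoids HDFS startup surprises:

    ```
    # Create the temp directory referenced by hadoop.tmp.dir (path taken from the config above)
    mkdir -p /usr/local/Cellar/hadoop/hdfs/tmp
    ```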

    Edit mapred-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/mapred-site.xml and by default will be blank add below config
    ```
    <configuration>
    <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9010</value>
    </property>
    </configuration>
    ```
    Edit hdfs-site.xml, The file can be located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hdfs-site.xml add
    ```
    <configuration>
    <property>
    <name>dfs.replication</name>
    <value></value>
    </property>
    </configuration>
    ```
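
    A stray angle bracket in any of these files will keep the daemons from starting, so a quick well-formedness check is worth it (a sketch using `xmllint`, which ships with macOS; paths follow the 2.6.0 layout used above):

    ```
    # Check that each edited config file parses as XML; no output means well-formed
    cd /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop
    xmllint --noout core-site.xml mapred-site.xml hdfs-site.xml
    ```
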
    To simplify life edit a ~/.profile and add the following commands. By default ~/.profile might not exist.

    ```
    alias hstart=<"/usr/local/Cellar/hadoop/2.6.0/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/start-yarn.sh">
    alias hstop=<"/usr/local/Cellar/hadoop/2.6.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/stop-dfs.sh">
    ```

    and source it

    ```$ source ~/.profile```

    Before running Hadoop format HDFS

    ```$ hdfs namenode -format```


    STEP 4: To verify if SSH Localhost is working check for files ~/.ssh/id_rsa and the ~/.ssh/id_rsa.pub files. If they don’t exist generate the keys using below command

    ```$ ssh-keygen -t rsa```

    Enable Remote Login: “System Preferences” -> “Sharing”. Check “Remote Login”
    Authorize SSH Keys: To allow your system to accept login, we have to make it aware of the keys that will be used

    ```$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys```
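
    If `ssh localhost` still prompts for a password after this, the usual culprit is permissions; sshd ignores keys kept in a group- or world-writable location, so tightening them (an extra step not in the original guide) usually fixes it:

    ```
    # Restrict ~/.ssh and authorized_keys so sshd will accept the key
    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys
    ```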

    Test login.

    ```
    $ ssh localhost
    Last login: Fri Mar 6 20:30:53 2015
    $ exit
    ```

    STEP 5: Run Hadoop

    ```$ hstart```

    and stop using

    ```$ hstop```
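
    Once `hstart` completes, a quick way to confirm the daemons are actually up (assuming a JDK that provides `jps` is on the PATH):

    ```
    # Each Hadoop daemon runs in its own JVM and should show up here
    jps
    # Typical entries: NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
    ```

    The NameNode web UI is another handy check: it listens on port 50070 in Hadoop 2.x and 9870 in 3.x.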