
$HADOOP_HOME is deprecated, Hadoop

Ask Time: 2013-06-05T17:50:50    Author: Mazy


I tried to install Hadoop on a single-node cluster (my own laptop, Ubuntu 12.04). I followed this tutorial and checked it line by line twice: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Everything seems correct. I set up core-site.xml, mapred-site.xml, and hdfs-site.xml.

When I run the following command as hduser (via su):

hduser@maziyar-Lenovo-IdeaPad-U300s:~$ /usr/local/hadoop/usr/sbin/start-all.sh

I get the following errors:

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /usr/local/hadoop/usr/libexec/../logs/hadoop-hduser-namenode-maziyar-Lenovo-IdeaPad-U300s.out
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/slaves: No such file or directory
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/masters: No such file or directory
starting jobtracker, logging to /usr/local/hadoop/usr/libexec/../logs/hadoop-hduser-jobtracker-maziyar-Lenovo-IdeaPad-U300s.out
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/slaves: No such file or directory
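Those `cat: ... No such file or directory` lines suggest the start scripts expect `masters` and `slaves` files in the conf directory named in the error messages. As a sketch (shown here against a scratch directory so it is safe to run; in practice the target would be `/usr/local/hadoop/usr/etc/hadoop` from the errors above), a single-node setup typically just lists `localhost` in both files:

```shell
# Demo on a throwaway directory; substitute the real conf directory
# (/usr/local/hadoop/usr/etc/hadoop) on an actual install.
conf=$(mktemp -d)

# For a single-node cluster, master and worker are both the local machine.
echo localhost > "$conf/masters"
echo localhost > "$conf/slaves"

cat "$conf/masters" "$conf/slaves"   # prints "localhost" twice
rm -rf "$conf"
```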

I added export HADOOP_HOME_WARN_SUPPRESS="TRUE" to hadoop-env.sh and still get the same warning.

In the file /home/hduser/.bashrc, which is where I guess my error is coming from, I have:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/jdk-7u10-linuxi586/usr/java/jdk1.7.0_10

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat $1 | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/usr/sbin

I added /usr/sbin as a bin directory because start-all.sh and the other commands are there.

I also tried "HADOOP_PREFIX" instead of "HADOOP_HOME" in the .bashrc file, but I still get the same warning.
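For reference, a minimal sketch of that switch in the current shell, using the same install root as the .bashrc above (whether it silences the warning depends on no other startup file still exporting HADOOP_HOME):

```shell
# Drop the deprecated variable and use HADOOP_PREFIX instead.
unset HADOOP_HOME
export HADOOP_PREFIX=/usr/local/hadoop          # same root as in .bashrc
export PATH="$PATH:$HADOOP_PREFIX/usr/sbin"     # where start-all.sh lives

# Verify the swap took effect in this session.
[ -z "$HADOOP_HOME" ] && echo "HADOOP_HOME is unset"
echo "HADOOP_PREFIX=$HADOOP_PREFIX"
```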

I have these folders in my hadoop directory:

maziyar@maziyar-Lenovo-IdeaPad-U300s:/usr/local/hadoop$ ls -lha
total 20K
drwxr-xr-x  5 hduser hadoop 4.0K May 30 15:25 .
drwxr-xr-x 12 root   root   4.0K May 30 15:25 ..
drwxr-xr-x  4 hduser hadoop 4.0K May 30 15:25 etc
drwxr-xr-x 12 hduser hadoop 4.0K Jun  4 21:29 usr
drwxr-xr-x  4 hduser hadoop 4.0K May 30 15:25 var

I downloaded the latest version of Apache Hadoop last week: hadoop-1.1.2-1.i386.rpm.

Author: Mazy. Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/16936745/hadoop-home-is-deprecated-hadoop
VikasG :

I tried setting export HADOOP_HOME_WARN_SUPPRESS="TRUE" in my conf/hadoop-env.sh file and the warning vanished, although I am still unsure why the warning appeared in the first place.
2013-06-13T06:19:23
Agraj :

Replacing HADOOP_HOME with HADOOP_PREFIX in ~/.bashrc solved this for me.

Did you try logging out of the current session after making this change and then trying again? The changes you make to your bash profile only take effect when you log into the shell again.
2013-10-11T09:33:31
Alex Bitek :

Your bash session may still have the HADOOP_HOME variable defined. Try echo $HADOOP_HOME and see whether you get any value.

If you get a value for HADOOP_HOME, the shell is picking it up from some config file; check those files (~/.bashrc, ~/.profile, /etc/profile, /etc/bash.bashrc, etc.) and remove the exported HADOOP_HOME variable.

Open a new session after you have set the HADOOP_PREFIX environment variable instead of HADOOP_HOME in ~/.bashrc. Once you are sure $HADOOP_HOME is not exported in any config file, you should no longer see that warning message.
2013-07-01T12:01:55
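The file-by-file check described above can be scripted; this is a sketch that scans the usual bash startup files (the list is an assumption; add any others your distribution sources):

```shell
# Report every common bash startup file that still mentions HADOOP_HOME;
# each hit is a place the deprecated variable may be getting exported.
for f in "$HOME/.bashrc" "$HOME/.profile" /etc/profile /etc/bash.bashrc; do
    if [ -f "$f" ] && grep -q 'HADOOP_HOME' "$f"; then
        echo "HADOOP_HOME found in: $f"
    fi
done
echo "scan complete"
```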
Adnan Khan :

A "deprecated" warning means that the thing you are using is considered obsolete, or that support for it will end soon, as described on the website.

By this I mean that you have installed OpenJDK with Hadoop. What I did instead of installing OpenJDK was install the Oracle JDK. Maybe you should try doing that.

Let me know if this helped.

Regards.
2016-03-05T03:53:44