
hadoop hdfs points to file:/// not hdfs://

Ask Time:2012-09-12T23:06:32         Author:fair_data


So I installed Hadoop via Cloudera Manager cdh3u5 on CentOS 5. When I run the command

hadoop fs -ls /

I expected to see the contents of hdfs://localhost.localdomain:8020/

However, it returned the contents of file:///

Now, it goes without saying that I can still access hdfs:// through

hadoop fs -ls hdfs://localhost.localdomain:8020/

But when it came to installing other applications such as Accumulo, Accumulo would automatically detect the Hadoop filesystem as file:///.

Question is, has anyone run into this issue, and how did you resolve it?

I had a look at HDFS thrift server returns content of local FS, not HDFS, which was a similar issue, but it did not solve this one. Also, I do not get this issue with Cloudera Manager cdh4.

Author:fair_data, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/12391226/hadoop-hdfs-points-to-file-not-hdfs
Donald Miner :

By default, Hadoop is going to use local mode. You probably need to set fs.default.name to hdfs://localhost.localdomain:8020/ in $HADOOP_HOME/conf/core-site.xml.

To do this, you add this to core-site.xml:

 <property>
   <name>fs.default.name</name>
   <value>hdfs://localhost.localdomain:8020/</value>
 </property>

The reason why Accumulo is confused is because it's using the same default configuration to figure out where HDFS is... and it's defaulting to file://
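As a quick sanity check before restarting anything, you can inspect core-site.xml directly to see whether the property is set at all. This is a minimal sketch; the conf path below assumes a typical CDH package layout (/etc/hadoop/conf) and may differ on your machine:

```shell
# Hypothetical conf path; adjust for your installation.
CONF=/etc/hadoop/conf/core-site.xml

# If fs.default.name is absent, Hadoop falls back to the local filesystem.
if grep -q 'fs.default.name' "$CONF" 2>/dev/null; then
  grep -A1 'fs.default.name' "$CONF"
else
  echo "fs.default.name not set; Hadoop defaults to file:///"
fi
```

If the property is missing or points at file:///, both `hadoop fs -ls /` and anything built on the same configuration (like Accumulo) will see the local filesystem.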
2012-09-12T15:42:22