Thursday, October 04, 2012

Hadoop - Single Node Setup

Wanted to try Hadoop for a long time, and got a chance to take a test drive today. Primed a fresh CentOS 6 Linux VM on my Windows box (thanks to VMware Player) & set up Hadoop with the help of the instructions below.

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Everything went well until this error was thrown by ./start-all.sh:

2012-10-04 19:18:13,900 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
    at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:162)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:198)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:228)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:262)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:496)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1279)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)


The same issue has been reported before - https://issues.apache.org/jira/browse/HDFS-2515

Running "strace -fe open start-dfs.sh" tells you exactly which files the configuration is getting loaded from.


Resolution:

Even though I had configured core-site.xml, mapred-site.xml & hdfs-site.xml under the /usr/local/hadoop/conf/ folder, by default the system was reading the *.xml files under /etc/hadoop/. Once I updated the configuration files in the /etc/hadoop location, everything started working.
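For reference, the "file:///" in the error means the NameNode fell back to the default filesystem URI because it never saw fs.default.name. Below is a minimal sketch of dropping a core-site.xml with that property into /etc/hadoop/, the directory Hadoop was actually reading from. The value hdfs://localhost:54310 is the one used in the linked tutorial - adjust host/port to your own setup.

```shell
# Sketch only: write a minimal core-site.xml into the config dir Hadoop
# actually loads from (/etc/hadoop on this CentOS install).
# hdfs://localhost:54310 is the tutorial's value, not a requirement.
CONF_DIR=/etc/hadoop
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
EOF
```

The same goes for mapred-site.xml and hdfs-site.xml: whatever you edited under /usr/local/hadoop/conf/ has to end up in the directory the daemons really read, which is what the strace trick above reveals.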

Very exciting. Now that the environment is ready, I have to try some samples & step into a multi-node cluster setup.
