
1. Exception during start-dfs.sh
<p>I set up a two-node cluster using Hadoop.</p> <p>When I ran <code>start-dfs.sh</code> I got this error:</p> <pre><code>starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-mohit-ubuntu.out
slave: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-balaji-ubuntu.out
slave: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
slave: Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName
slave:     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
slave:     at java.security.AccessController.doPrivileged(Native Method)
slave:     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
slave:     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
slave:     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
slave:     at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
slave: Could not find the main class: org.apache.hadoop.util.PlatformName. Program will exit.
slave: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/datanode/DataNode
master: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-mohit-ubuntu.out
master: starting secondarynamenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-mohit-ubuntu.out
</code></pre> <p>After getting this error I changed <code>HADOOP_CLASSPATH</code> to <code>export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$HADOOP_HOME/lib/commons*.jar:$HADOOP_HOME:$HADDOP_HOME/hadoop-core-0.20.203.0.jar:$HADOOP_HOME/conf</code></p> <p>But it didn't help.</p>
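A <code>NoClassDefFoundError</code> for <code>org.apache.hadoop.util.PlatformName</code> that appears only on the slave usually means the Hadoop core jar is not on that node's classpath. One thing worth checking: Java does not expand shell-style globs such as <code>commons*.jar</code> inside a classpath string, so each jar has to be listed explicitly. The sketch below builds the classpath that way; the <code>/usr/local/hadoop</code> layout and the loop variable names are assumptions based on the paths in the question, not a confirmed fix:

```shell
#!/bin/sh
# Sketch: build HADOOP_CLASSPATH from explicit jar paths instead of
# embedding globs like commons*.jar, which Java will not expand.
# HADOOP_HOME default is an assumption taken from the question's logs.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}

CLASSPATH="$HADOOP_HOME/conf"
# Let the shell (not Java) expand the globs, then append each real jar.
for jar in "$HADOOP_HOME"/hadoop-core-*.jar "$HADOOP_HOME"/lib/*.jar; do
  if [ -f "$jar" ]; then
    CLASSPATH="$CLASSPATH:$jar"
  fi
done
export HADOOP_CLASSPATH="$CLASSPATH"
echo "$HADOOP_CLASSPATH"
```

Since the error comes from the slave, whatever classpath change is made would also need to be applied on that node (e.g. in its own <code>conf/hadoop-env.sh</code>), not just on the master.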