
Unable to execute Map/Reduce job

I've been trying to figure out how to execute my Map/Reduce job for almost 2 days now. I keep getting a ClassNotFoundException.

I've installed a Hadoop cluster on Ubuntu using Cloudera CDH4.3.0. The .java file (DemoJob.java, which is not inside any package) is in a folder called inputs, and all the required jar files are inside inputs/lib.

I followed http://www.cloudera.com/content/cloudera-content/cloudera-docs/HadoopTutorial/CDH4/Hadoop-Tutorial/ht_topic_5_2.html for reference.

1. I compile the .java file using:

   ```
   javac -cp "inputs/lib/hadoop-common.jar:inputs/lib/hadoop-map-reduce-core.jar" -d Demo inputs/DemoJob.java
   ```

   (The link says `-cp` should be `/usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*`, but I don't have those folders on my system at all.)

2. I create the jar file using:

   ```
   jar cvf Demo.jar Demo
   ```

3. I move the 2 input files to HDFS. (This is where I'm confused: do I need to move the jar file to HDFS as well? The link doesn't say so. But if the jar is not in HDFS, how does the `hadoop jar ...` command work? That is, how does it combine the jar file, which is on the local Linux file system, with the input files, which are in HDFS?)

4. I run my code using:

   ```
   hadoop jar Demo.jar DemoJob /Inputs/Text1.txt /Inputs/Text2.txt /Outputs
   ```

I keep getting `ClassNotFoundException : DemoJob`.

Somebody please help.
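
For reference, my driver is structured roughly along the lines of the sketch below. This is only an illustrative outline of a CDH4-style driver matching the command above, not my exact code: the `DemoMapper` / `DemoReducer` names and the `Text` output types are placeholders, and the real DemoJob.java sets its own classes.

```java
// Illustrative outline of a CDH4-style MapReduce driver named DemoJob
// (in the default package). Placeholder pieces are marked in the comments.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DemoJob {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "demo job");   // older constructor, still available in CDH4
        job.setJarByClass(DemoJob.class);      // tells the framework which jar to ship to the cluster

        // Placeholders: the real job sets its own mapper/reducer classes here, e.g.
        // job.setMapperClass(DemoMapper.class);
        // job.setReducerClass(DemoReducer.class);

        // Placeholder output types; the real job may use different writables.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));   // /Inputs/Text1.txt
        FileInputFormat.addInputPath(job, new Path(args[1]));   // /Inputs/Text2.txt
        FileOutputFormat.setOutputPath(job, new Path(args[2])); // /Outputs

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```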
 
