Hadoop pipes problem

I have configured Hadoop in pseudo-distributed mode (single-node cluster) on Ubuntu 10.04.

I have a problem running Hadoop Pipes code. My code is the following:

```cpp
#include "/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/include/hadoop/Pipes.hh"
#include "/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/include/hadoop/TemplateFactory.hh"
#include "/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/include/hadoop/StringUtils.hh"
#include "/home/hadoop/project/hadoop-0.20.2/src/c++/libhdfs/hdfs.h"

const std::string WORDCOUNT = "WORDCOUNT";
const std::string INPUT_WORDS = "INPUT_WORDS";
const std::string OUTPUT_WORDS = "OUTPUT_WORDS";

hdfsFS fs;
hdfsFile writefile;
const char* writepath = "/temp/mest";

class WordCountMap: public HadoopPipes::Mapper {
public:
  HadoopPipes::TaskContext::Counter* inputWords;

  WordCountMap(HadoopPipes::TaskContext& context) {
    fs = hdfsConnect("192.168.0.133", 54310);
    inputWords = context.getCounter(WORDCOUNT, INPUT_WORDS);
  }

  ~WordCountMap() {
    hdfsCloseFile(fs, writefile);
  }

  void map(HadoopPipes::MapContext& context) {
    hdfsFile writefile = hdfsOpenFile(fs, writepath, O_WRONLY|O_CREAT, 0, 0, 0);
    std::vector<std::string> words =
        HadoopUtils::splitString(context.getInputValue(), " ");
    for (unsigned int i = 0; i < words.size(); ++i) {
      context.emit(words[i], "1");
    }
    context.incrementCounter(inputWords, words.size());
  }
};

class WordCountReduce: public HadoopPipes::Reducer {
public:
  HadoopPipes::TaskContext::Counter* outputWords;

  WordCountReduce(HadoopPipes::TaskContext& context) {
    outputWords = context.getCounter(WORDCOUNT, OUTPUT_WORDS);
  }

  void reduce(HadoopPipes::ReduceContext& context) {
    int sum = 0;
    while (context.nextValue()) {
      sum += HadoopUtils::toInt(context.getInputValue());
    }
    context.emit(context.getInputKey(), HadoopUtils::toString(sum));
    context.incrementCounter(outputWords, 1);
  }
};

int main(int argc, char *argv[]) {
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<WordCountMap, WordCountReduce>());
}
```

It compiled successfully.

I ran it with the following command:

> bin/hadoop pipes -D java.pipes.recordreader=true -D java.pipes.recordwriter=true -input gutenberg -output manish_gut2 -program bin/cat

but when I run it, it shows the following errors:

```
11/05/04 16:13:12 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
11/05/04 16:13:12 INFO mapred.FileInputFormat: Total input paths to process : 3
11/05/04 16:13:13 INFO mapred.JobClient: Running job: job_201105041611_0001
11/05/04 16:13:14 INFO mapred.JobClient:  map 0% reduce 0%
11/05/04 16:13:24 INFO mapred.JobClient: Task Id : attempt_201105041611_0001_m_000000_0, Status : FAILED
java.io.IOException: pipe child exception
    at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
    at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readByte(DataInputStream.java:250)
    at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
    at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
    at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)
attempt_201105041611_0001_m_000000_0: Hadoop Pipes Exception: RecordReader not defined at /export/crawlspace/chris/work/branch-0.20/src/c++/pipes/impl/HadoopPipes.cc:692 in virtual void HadoopPipes::TaskContextImpl::runMap(std::string, int, bool)

[the identical exception and stack trace repeat for attempts m_000001_0, m_000001_1, m_000000_1, m_000000_2 and m_000001_2]

11/05/04 16:13:44 INFO mapred.JobClient: Job complete: job_201105041611_0001
11/05/04 16:13:44 INFO mapred.JobClient: Counters: 3
11/05/04 16:13:44 INFO mapred.JobClient:   Job Counters
11/05/04 16:13:44 INFO mapred.JobClient:     Launched map tasks=8
11/05/04 16:13:44 INFO mapred.JobClient:     Data-local map tasks=8
11/05/04 16:13:44 INFO mapred.JobClient:     Failed map tasks=1
Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
    at org.apache.hadoop.mapred.pipes.Submitter.runJob(Submitter.java:248)
    at org.apache.hadoop.mapred.pipes.Submitter.run(Submitter.java:479)
    at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:494)
```

I don't know what I'm doing wrong. How can I run this program? How do I resolve these errors?
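
For reference, the "Hadoop Pipes Exception: RecordReader not defined" in the log above is typically raised when no record reader is configured for the C++ task. In Hadoop 0.20 the submitter properties are spelled `hadoop.pipes.java.recordreader` and `hadoop.pipes.java.recordwriter`; the `java.pipes.*` names in the command above are not recognized, so the job runs with `hadoop.pipes.java.recordreader` at its default of `false`. Note also that `-program` should point at the compiled Pipes binary in HDFS rather than at `bin/cat`. Assuming the wordcount executable has been uploaded to a hypothetical HDFS path `bin/wordcount`, a corrected invocation would look like:

> bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input gutenberg -output manish_gut2 -program bin/wordcount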
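
Separately, the mapper's HDFS file handling contains a likely bug that is independent of the failure above: `map()` declares a local `writefile` that shadows the global one, so a new handle is opened (and leaked) on every call to `map()`, while the destructor closes the global handle that was never opened. A minimal sketch of the open-once, close-once pattern, keeping the original connection details and `/temp/mest` path:

```cpp
WordCountMap(HadoopPipes::TaskContext& context) {
  fs = hdfsConnect("192.168.0.133", 54310);
  // Assign to the global handle instead of shadowing it with a local in map().
  writefile = hdfsOpenFile(fs, writepath, O_WRONLY | O_CREAT, 0, 0, 0);
  inputWords = context.getCounter(WORDCOUNT, INPUT_WORDS);
}

~WordCountMap() {
  hdfsCloseFile(fs, writefile);  // close the handle opened in the constructor
  hdfsDisconnect(fs);            // also release the HDFS connection
}
```

With this change, `map()` no longer needs to open the file at all; it can write through the already-open `writefile` handle if output to HDFS is actually required.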