EOFException thrown by a Hadoop pipes program
First of all, I am a Hadoop newbie.

I have a small Hadoop pipes program that throws java.io.EOFException. The program takes a small text file as input and uses hadoop.pipes.java.recordreader and hadoop.pipes.java.recordwriter. The input is very simple, for example:

    1 262144 42.8084 15.9157 4.1324 0.06 0.1

However, Hadoop throws an EOFException, and I can't see the reason. Below is the stack trace:

    10/12/08 23:04:04 INFO mapred.JobClient: Running job: job_201012081252_0016
    10/12/08 23:04:05 INFO mapred.JobClient:  map 0% reduce 0%
    10/12/08 23:04:16 INFO mapred.JobClient: Task Id : attempt_201012081252_0016_m_000000_0, Status : FAILED
    java.io.IOException: pipe child exception
        at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
        at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)
    Caused by: java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:267)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)

By the way, I ran this in fully-distributed mode (a cluster with 3 worker nodes).

Any help is appreciated. Thanks!
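For context, a pipes job of this kind is typically submitted along the following lines. This is only a sketch: the input/output paths and the program name are placeholders, not the asker's actual setup. The two `-D` properties mentioned in the question tell Hadoop to use its built-in Java record reader and writer rather than ones supplied by the C++ binary:

    # Hedged sketch of a typical Hadoop Pipes submission; paths and the
    # program name are placeholders for illustration only.
    hadoop pipes \
      -D hadoop.pipes.java.recordreader=true \
      -D hadoop.pipes.java.recordwriter=true \
      -input /user/me/input \
      -output /user/me/output \
      -program /user/me/bin/my_pipes_program

With these properties set, each input line reaches the C++ mapper through the pipes binary protocol, which is the layer (BinaryProtocol$UplinkReaderThread) that appears at the bottom of the stack trace above.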