<p>To expand on Tariq's links, and to detail one possible implementation of a <code>&lt;Text, IntWritable&gt;</code> tree map:</p> <pre><code>public class TreeMapWritable extends TreeMap&lt;Text, IntWritable&gt; implements Writable {
    @Override
    public void write(DataOutput out) throws IOException {
        // write out the number of entries
        out.writeInt(size());
        // output each entry pair
        for (Map.Entry&lt;Text, IntWritable&gt; entry : entrySet()) {
            entry.getKey().write(out);
            entry.getValue().write(out);
        }
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        // clear current contents - Hadoop re-uses objects
        // between calls to your map / reduce methods
        clear();
        // read how many items to expect
        int count = in.readInt();
        // deserialize a key and value pair, insert into map
        while (count-- &gt; 0) {
            Text key = new Text();
            key.readFields(in);
            IntWritable value = new IntWritable();
            value.readFields(in);
            put(key, value);
        }
    }
}
</code></pre> <p>Basically, the default serialization factory in Hadoop expects output objects to implement the <code>Writable</code> interface (the <code>readFields</code> and <code>write</code> methods detailed above). In this way you can extend pretty much any class to retrofit the serialization methods.</p> <p>Another option is to enable Java serialization (which uses the default Java serialization mechanism) via <code>org.apache.hadoop.io.serializer.JavaSerialization</code> by configuring the <code>io.serializations</code> configuration property, but I wouldn't recommend that.</p>
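<p>To see the write/readFields pattern above in isolation, here is a plain-Java sketch (class and method names are mine, not Hadoop's) that round-trips a <code>TreeMap&lt;String, Integer&gt;</code> through <code>DataOutput</code>/<code>DataInput</code>, with <code>writeUTF</code>/<code>readUTF</code> standing in for <code>Text</code> and <code>writeInt</code>/<code>readInt</code> for <code>IntWritable</code>:</p>

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.Map;
import java.util.TreeMap;

// Hypothetical stand-alone demo of the Writable-style wire format:
// entry count first, then each key/value pair in order.
public class TreeMapRoundTrip {

    static void writeMap(TreeMap<String, Integer> map, DataOutput out) throws IOException {
        // write out the number of entries
        out.writeInt(map.size());
        // output each entry pair (key then value, as in TreeMapWritable.write)
        for (Map.Entry<String, Integer> e : map.entrySet()) {
            out.writeUTF(e.getKey());
            out.writeInt(e.getValue());
        }
    }

    static TreeMap<String, Integer> readMap(DataInput in) throws IOException {
        // a fresh map plays the role of clear() in readFields
        TreeMap<String, Integer> map = new TreeMap<>();
        // read how many items to expect, then deserialize each pair
        int count = in.readInt();
        while (count-- > 0) {
            String key = in.readUTF();
            int value = in.readInt();
            map.put(key, value);
        }
        return map;
    }

    public static void main(String[] args) throws IOException {
        TreeMap<String, Integer> original = new TreeMap<>();
        original.put("hadoop", 2);
        original.put("writable", 5);

        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        writeMap(original, new DataOutputStream(bytes));

        TreeMap<String, Integer> copy =
            readMap(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));

        System.out.println(copy.equals(original)); // true
    }
}
```

<p>The count-prefix is what lets the reader know when to stop; Hadoop's real <code>Writable</code> types serialize themselves the same way, just with their own binary encodings.</p>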
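<p>For completeness, if you did want the <code>JavaSerialization</code> route despite the caveat, the property is typically set in <code>core-site.xml</code>. A sketch (keeping the default <code>WritableSerialization</code> first so existing <code>Writable</code> types keep working):</p>

<pre><code>&lt;property&gt;
  &lt;name&gt;io.serializations&lt;/name&gt;
  &lt;value&gt;org.apache.hadoop.io.serializer.WritableSerialization,org.apache.hadoop.io.serializer.JavaSerialization&lt;/value&gt;
&lt;/property&gt;
</code></pre>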
 
