mapreduce.LoadIncrementalHFiles :: Not able to load the HFiles into HBase
I would greatly appreciate your help on this.

I want to insert the output of my map-reduce job into an HBase table using the HBase bulk loading API: `LoadIncrementalHFiles.doBulkLoad(new Path(outFile), hTable)`.

I am emitting the `KeyValue` data type from my mapper and then using `HFileOutputFormat` to prepare my HFiles with its default reducer.

When I run my map-reduce job, it completes without any errors and creates the output files; however, the final step, inserting the HFiles into HBase, does not happen. I get the error below after my map-reduce job completes:

```
13/09/08 03:39:51 WARN mapreduce.LoadIncrementalHFiles: Skipping non-directory hdfs://localhost:54310/user/xx.xx/output/_SUCCESS
13/09/08 03:39:51 WARN mapreduce.LoadIncrementalHFiles: Bulk load operation did not find any files to load in directory output/. Does it contain files in subdirectories that correspond to column family names?
```

But I can see that the output directory contains:

```
1. _SUCCESS
2. _logs
3. _0/2aa96255f7f5446a8ea7f82aa2bd299e (the file which contains my data)
```

I have no clue why the bulk loader is not picking up the files from the output directory.

Below is the code of my map-reduce driver class:

```java
public static void main(String[] args) throws Exception {
    String inputFile = args[0];
    String tableName = args[1];
    String outFile = args[2];
    Path inputPath = new Path(inputFile);
    Path outPath = new Path(outFile);

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // set the configurations
    conf.set("mapred.job.tracker", "localhost:54311");

    // input data to HTable using a map-reduce job
    Job job = new Job(conf, "MapReduce - Word Frequency Count");
    job.setJarByClass(MapReduce.class);

    job.setInputFormatClass(TextInputFormat.class);
    FileInputFormat.addInputPath(job, inputPath);
    fs.delete(outPath);
    FileOutputFormat.setOutputPath(job, outPath);

    job.setMapperClass(MapReduce.MyMap.class);
    job.setMapOutputKeyClass(ImmutableBytesWritable.class);
    job.setMapOutputValueClass(KeyValue.class);

    HTable hTable = new HTable(conf, tableName.toUpperCase());

    // auto-configure partitioner and reducer
    HFileOutputFormat.configureIncrementalLoad(job, hTable);

    job.waitForCompletion(true);

    // load generated HFiles into the table
    LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
    loader.doBulkLoad(new Path(outFile), hTable);
}
```

I would appreciate it if anybody could help me figure out what is going wrong here and preventing my data from being inserted into HBase.

Thanks in advance!
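For reference, my mapper follows the general pattern below. This is a simplified sketch, not my exact code: the class name `MyMap` matches the driver above, but the column family `cf`, the qualifier `count`, and the parsing of the input line are illustrative placeholders.

```java
// Simplified sketch of the mapper: emits (row key, KeyValue) pairs so that
// HFileOutputFormat's default reducer can sort them and write the HFiles.
// The column family "cf" and qualifier "count" are placeholders.
public static class MyMap
        extends Mapper<LongWritable, Text, ImmutableBytesWritable, KeyValue> {

    private static final byte[] FAMILY = Bytes.toBytes("cf");       // hypothetical family
    private static final byte[] QUALIFIER = Bytes.toBytes("count"); // hypothetical qualifier

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String word = value.toString().trim();
        if (word.isEmpty()) {
            return;
        }
        byte[] rowKey = Bytes.toBytes(word);

        // KeyValue carries the row key, family, qualifier, and cell value
        KeyValue kv = new KeyValue(rowKey, FAMILY, QUALIFIER, Bytes.toBytes(1L));
        context.write(new ImmutableBytesWritable(rowKey), kv);
    }
}
```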