
  1. Create a backup of HBase data on S3 and then restore it
<p>I have an HBase cluster running on Amazon EC2 nodes. I want to create a backup of my HBase table, so I came up with this <a href="http://blog.bizosys.com/2011/12/hbase-backup-to-amazon-s3.html#comment-form" rel="nofollow">tool</a>. I was able to create a backup of the table <code>dummy</code> on S3 using the following command:</p> <pre><code>java com.bizosys.oneline.maintenance.HBaseBackup mode=backup.full backup.folder=s3://mybucket/ tables=dummy
</code></pre> <p>But when I tried to restore the same data into another table (<code>model</code>), it failed with the following:</p> <pre><code>13/10/24 10:52:52 WARN mapred.FileOutputCommitter: Output path is null in cleanup
13/10/24 10:52:52 WARN mapred.LocalJobRunner: job_local_0002
java.lang.NullPointerException
    at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.retrieveBlock(Jets3tFileSystemStore.java:209)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy5.retrieveBlock(Unknown Source)
    at org.apache.hadoop.fs.s3.S3InputStream.blockSeekTo(S3InputStream.java:160)
    at org.apache.hadoop.fs.s3.S3InputStream.read(S3InputStream.java:119)
    at java.io.DataInputStream.readFully(DataInputStream.java:195)
    at java.io.DataInputStream.readFully(DataInputStream.java:169)
    at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)
    at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1486)
    at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1475)
    at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1470)
    at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.initialize(SequenceFileRecordReader.java:50)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:522)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
13/10/24 10:52:53 INFO mapred.JobClient: Job complete: job_local_0002
13/10/24 10:52:53 INFO mapred.JobClient: Counters: 0
Error in Job completetion Params tablename inputputdir model s3://mybucket/Wed_Oct_23_19_45_49_IST_2013/model
Access Failure to s3://mybucket/Wed_Oct_23_19_45_49_IST_2013/model , tries=1
</code></pre> <p>The restore command I used was:</p> <pre><code>java com.bizosys.oneline.maintenance.HBaseBackup mode=restore backup.folder=s3://mybucket/Wed_Oct_23_19_45_49_IST_2013 tables="model"
</code></pre> <p>FYI, please don't suggest that there is an option to install HBase and take backups on EMR. I know about that, but for some reasons I am not using it.</p>
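For reference, the tool is driven by <code>key=value</code> pairs on the command line (<code>mode</code>, <code>backup.folder</code>, <code>tables</code>). The sketch below is my own illustration of how such arguments can be split into parameters, not the tool's actual parsing code; note that a shell-quoted value like <code>tables="model"</code> may arrive with literal quotes that need stripping.

```python
def parse_kv_args(argv):
    """Split key=value command-line tokens into a dict (illustrative only)."""
    params = {}
    for token in argv:
        key, _, value = token.partition("=")
        # Strip surrounding quotes, e.g. tables="model" -> model
        params[key] = value.strip('"')
    return params

args = parse_kv_args([
    "mode=restore",
    "backup.folder=s3://mybucket/Wed_Oct_23_19_45_49_IST_2013",
    'tables="model"',
])
print(args["mode"])    # restore
print(args["tables"])  # model
```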