Error in reading a txt file on HDFS and copying/writing its content into a newly created file on the LOCAL filesystem
I am trying to read a file on HDFS and copy its content into a newly created file on the local filesystem, using the following Java program. FYI, I have installed a single-node Hadoop cluster on my machine.

HdfsCli.java:

```java
package com;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCli {

    public void readFile(String file) throws IOException {
        // Load the cluster configuration from the local Hadoop install.
        Configuration conf = new Configuration();
        String hadoopConfPath = "/opt/hadoop/etc/hadoop/";
        conf.addResource(new Path(hadoopConfPath + "core-site.xml"));
        conf.addResource(new Path(hadoopConfPath + "hdfs-site.xml"));
        conf.addResource(new Path(hadoopConfPath + "mapred-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        // For the join type of queries, the output file in HDFS has 'r' in it.
        // String type = "r";

        Path path = new Path(file);
        if (!fileSystem.exists(path)) {
            System.out.println("File " + file + " does not exist");
            return;
        }

        // Stream the HDFS file into a new file on the local filesystem.
        FSDataInputStream in = fileSystem.open(path);
        String filename = file.substring(file.lastIndexOf('/') + 1);
        OutputStream out = new BufferedOutputStream(new FileOutputStream(
                new File("/home/DAS_Pig/" + filename)));

        byte[] b = new byte[1024];
        int numBytes = 0;
        while ((numBytes = in.read(b)) > 0) {
            out.write(b, 0, numBytes);
        }

        conf.clear();
        in.close();
        out.close();
        fileSystem.close();
    }

    public static void main(String[] args) throws IOException {
        HdfsCli hc = new HdfsCli();
        hc.readFile("hdfs://localhost:9000//DasData//salaries.txt");
        System.out.println("Successfully Done!");
    }
}
```

However, when I run this code, the following error comes up:

```
Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
    at org.apache.hadoop.ipc.Client.call(Client.java:1066)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
    at com.HDFSClient.readFile(HDFSClient.java:22)
    at com.HdfsCli.main(HdfsCli.java:57)
```

I am new to Hadoop development. Can anyone guide me in resolving this? Thank you!
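As an aside on the copy itself: Hadoop ships a helper that wraps the same open/read/write loop, so the transfer can be expressed in a few lines. The sketch below is a minimal illustration, not the asker's code; it assumes the single-node cluster at `hdfs://localhost:9000` and the paths from the question, and the class name `HdfsCopySketch` is invented for the example. Note that the `Server IPC version 9 cannot communicate with client version 4` message typically indicates a client/cluster version mismatch (Hadoop 1.x client jars talking to a 2.x cluster), so either version of the code will fail until the client jars on the classpath come from the same release line as the running cluster.

```java
package com;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch: copy one HDFS file to the local filesystem using
// FileSystem.copyToLocalFile, which performs the open/read/write loop
// from the question internally.
public class HdfsCopySketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumption: the single-node cluster from the question.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");

        FileSystem fs = FileSystem.get(conf);
        Path src = new Path("/DasData/salaries.txt");      // file on HDFS
        Path dst = new Path("/home/DAS_Pig/salaries.txt"); // local target
        fs.copyToLocalFile(src, dst);
        fs.close();

        System.out.println("Copied " + src + " to " + dst);
    }
}
```

One detail of this helper: it writes through the checksummed local filesystem, so a `.crc` sidecar file appears next to the destination; the `copyToLocalFile(boolean delSrc, Path src, Path dst, boolean useRawLocalFileSystem)` overload can skip that, and the manual stream copy in the question avoids it entirely.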
 
