
I am trying to write a file to HDFS. Below is my sample code:

    URI uri = URI.create(sURI);
    System.setProperty(HADOOP_USER_NAME, grailsApplication.config.hadoop.user.name);
    Configuration conf = new Configuration();
    conf.set(FS_DEFAULT_NAME, grailsApplication.config.fs.default.name);
    conf.set(DFS_REPLICATION, grailsApplication.config.dfs.replication);
    Path path = new Path(uri);
    FileSystem file = FileSystem.get(uri, conf);
    FSDataOutputStream outputStream;
    if (file.exists(path))
        outputStream = file.append(path);   // append if the file already exists
    else
        outputStream = file.create(path);   // otherwise create it

    outputStream.write(request.data.getBytes());
    outputStream.close();

I get the exception below. Please advise what I might be doing wrong.

HDFS write failed org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.RecoveryInProgressException): Failed to close file /EligibilityDataFeederJob/status.txt. Lease recovery is in progress. Try again later.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:3071)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.appendFileInternal(FSNamesystem.java:2861)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.appendFileInt(FSNamesystem.java:3145)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.appendFile(FSNamesystem.java:3108)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.append(NameNodeRpcServer.java:598)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.append(ClientNamenodeProtocolServerSideTranslatorPB.java:415)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2036)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2034)

1 Answer


Your code performs an append: outputStream = file.append(new Path(uri));. The append operation generally works more reliably when the replication factor is set to 1, so check the replication factor you are using. This error occurs because the replicas of a block may end up with different generation stamp values.
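Separately from the replication factor, the message "Lease recovery is in progress. Try again later." means the previous writer's lease on the file has not been released yet, so the NameNode refuses a new append. One common workaround, sketched below rather than taken from your code, is to explicitly trigger lease recovery and poll until the file is closed before retrying the append. This assumes Hadoop 2.x, where FileSystem.get against an hdfs:// URI returns a DistributedFileSystem; recoverLease and isFileClosed are real DistributedFileSystem methods, while the class name, timeout, and polling interval here are illustrative:

    import java.io.IOException;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class HdfsAppendHelper {

        // Trigger lease recovery on an existing file and wait until the
        // NameNode reports it closed before opening it for append.
        public static FSDataOutputStream appendAfterLeaseRecovery(
                URI uri, Configuration conf, Path path)
                throws IOException, InterruptedException {
            FileSystem fs = FileSystem.get(uri, conf);
            if (fs instanceof DistributedFileSystem) {
                DistributedFileSystem dfs = (DistributedFileSystem) fs;
                // Ask the NameNode to recover the previous writer's lease;
                // returns true if the file is already closed.
                dfs.recoverLease(path);
                long deadline = System.currentTimeMillis() + 60_000L; // illustrative 60s cap
                while (!dfs.isFileClosed(path)) {
                    if (System.currentTimeMillis() > deadline) {
                        throw new IOException(
                                "Timed out waiting for lease recovery on " + path);
                    }
                    Thread.sleep(1_000L); // poll once per second
                }
            }
            return fs.append(path);
        }
    }

Keeping dfs.replication=1 avoids the generation-stamp mismatch across replicas described above, but the stale lease still has to be recovered before the NameNode will grant a new append on the file.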


1 Comment

I have dfs.replication=1 and am using it in my code as conf.set(DFS_REPLICATION, grailsApplication.config.dfs.replication);. In hdfs-site.xml I also have <property> <name>dfs.replication</name> <value>1</value> </property>.
