
I'm running a Java app using hadoop-2.0.5-alpha. My code looks like:

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

FileSystem fileSystem = FileSystem.get(conf);
Path path = new Path("/tmp/sample.txt");
System.out.println(fileSystem.exists(path));

But I get an exception:

com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status

I haven't been able to find much on what causes this error. Any thoughts?

  • Can you provide your environment? What's your HDFS version? It seems you are using hadoop-2.0.5-alpha in the client, but the server version may be too low. Type hadoop version to check the version. Commented Aug 7, 2013 at 1:46
  • Ah yes, you may be correct here. Let me try changing the version and I'll post back shortly. Commented Aug 7, 2013 at 1:48
  • That was it, thanks. Can you post that as an answer? Commented Aug 7, 2013 at 2:01
  • OK. Thanks! I'll post it. Commented Aug 7, 2013 at 3:14
  • I had the same issue (Hadoop server 2.10 in Docker, Spark 2.3.2+hadoop-client 2.6.5). It magically disappeared after recreating Hadoop container from scratch. Commented Nov 8, 2022 at 10:18

1 Answer


It seems you are using hadoop-2.0.5-alpha in the client, but the server version may be too low.

Type hadoop version to check the version.
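This protobuf error typically means the client and server speak different RPC wire formats: Hadoop 2.x clients use protobuf-based RPC, while 1.x/0.20.x servers still use the older Writable-based RPC, so a 2.x client against a 1.x NameNode fails with InvalidProtocolBufferException. On the client side you can print the version programmatically with org.apache.hadoop.util.VersionInfo.getVersion(). As a minimal sketch (the rpcCompatible helper below is hypothetical, not part of the Hadoop API), comparing major versions is enough to spot this particular mismatch:

```java
public class VersionCheck {
    // "2.0.5-alpha" -> 2
    static int major(String version) {
        return Integer.parseInt(version.split("\\.")[0]);
    }

    // Hypothetical check: protobuf RPC arrived in Hadoop 2.x, so a major
    // version mismatch between client and server breaks the wire protocol.
    static boolean rpcCompatible(String clientVersion, String serverVersion) {
        return major(clientVersion) == major(serverVersion);
    }

    public static void main(String[] args) {
        System.out.println(rpcCompatible("2.0.5-alpha", "1.2.1"));       // false
        System.out.println(rpcCompatible("2.0.5-alpha", "2.0.5-alpha")); // true
    }
}
```

In practice, just run hadoop version on both the client machine and the NameNode host and make sure the major versions agree.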
