3

I was trying to run the matrix multiplication example by Mr. Norstad at the following link: http://www.norstad.org/matrix-multiply/index.html. It runs successfully on Hadoop 0.20.2, but when I run it on Hadoop 1.0.3 I get the error below. Is this a problem with my Hadoop configuration, or a compatibility problem in the code, which the author wrote for Hadoop 0.20? Please also guide me on how I can fix this error in either case. Here is the error I am getting.

Exception in thread "main" java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:180)
        at java.io.DataInputStream.readFully(DataInputStream.java:152)
        at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1486)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1475)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1470)
        at TestMatrixMultiply.fillMatrix(TestMatrixMultiply.java:60)
        at TestMatrixMultiply.readMatrix(TestMatrixMultiply.java:87)
        at TestMatrixMultiply.checkAnswer(TestMatrixMultiply.java:112)
        at TestMatrixMultiply.runOneTest(TestMatrixMultiply.java:150)
        at TestMatrixMultiply.testRandom(TestMatrixMultiply.java:278)
        at TestMatrixMultiply.main(TestMatrixMultiply.java:308)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

Thanks in advance

Regards, waqas

6
  • If it fails because it can't read the SequenceFile header, why don't you regenerate the file? Or just use 0.20.2? Commented May 25, 2012 at 13:10
  • Thomas, yes of course 0.20.2 is an option, but I wanted to use the newer version (1.0.3). And since this all works fine with 0.20.2, the method for writing and reading sequence files must be correct (at least for 0.20.2). Do you think there might be a difference in SequenceFile reading and writing support between 0.20.2 and 1.0.3? Commented May 29, 2012 at 11:55
  • 1
    This error can only occur if the size of the input file is less than 4 bytes - can you provide the file size of the input file(s) in your original question? Commented May 29, 2012 at 23:44
  • @Chris, thanks. I found the problem. In fact it reads the _SUCCESS file along with the others when using the FileSystem.listStatus method. Can you please tell me how to filter out _SUCCESS files when using listStatus? Thanks. Commented May 30, 2012 at 14:43
  • I see you've answered your own question - stackoverflow.com/questions/10817824/… - please post an answer linking to that question so as to mark this question as answered, and to help those who stumble across this question in the future. Commented May 30, 2012 at 15:21

2 Answers

5

I also encountered the same problem; in my case it threw this exception because I hadn't closed the SequenceFile.Writer object. The problem was resolved when I added a sequenceFileWriter.close() statement to my code.
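The failure mode is easy to reproduce without Hadoop at all. The sketch below uses plain java.io (not the SequenceFile API) to show that data written through a buffered stream that is never closed may never reach disk, so a later read hits an EOFException just like the one in the stack trace:

```java
import java.io.*;

public class UnclosedWriterDemo {
    // Write an int through a buffered stream but never close it, then try to
    // read it back. Returns true if the read fails with EOFException, i.e.
    // the data never made it to disk.
    static boolean unclosedWriteIsLost(File f) throws IOException {
        DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream(f)));
        out.writeInt(42); // sits in the 8 KB buffer, not on disk
        // out.close();   // the missing call this answer is about

        try (DataInputStream in = new DataInputStream(new FileInputStream(f))) {
            in.readInt(); // fewer than 4 bytes on disk
            return false;
        } catch (EOFException e) {
            return true;
        }
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("seqdemo", ".bin");
        f.deleteOnExit();
        System.out.println("data lost without close(): " + unclosedWriteIsLost(f));
    }
}
```

An unclosed SequenceFile.Writer can leave a truncated (even empty) file behind in the same way, which is exactly the "less than 4 bytes" condition mentioned in the comments.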

Also, if the input of a MapReduce program is the output of a previous MapReduce program, you have to explicitly write code to ignore the _SUCCESS file.
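A minimal sketch of that filter, in plain Java so it stands alone (in a real job you would wrap the same predicate in an org.apache.hadoop.fs.PathFilter and pass it to FileSystem.listStatus): skip any file whose name starts with "_" or ".", which covers _SUCCESS and _logs.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FilterHiddenOutputs {
    // Hadoop's output committer writes a _SUCCESS marker (and historically a
    // _logs directory) into the job output directory. Treat any name starting
    // with '_' or '.' as non-data, mirroring Hadoop's hidden-file convention.
    static boolean isDataFile(String name) {
        return !name.startsWith("_") && !name.startsWith(".");
    }

    public static void main(String[] args) {
        List<String> listing = Arrays.asList("part-00000", "part-00001", "_SUCCESS", "_logs");
        List<String> dataFiles = listing.stream()
                .filter(FilterHiddenOutputs::isDataFile)
                .collect(Collectors.toList());
        System.out.println(dataFiles); // [part-00000, part-00001]
    }
}
```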

PS: I am using the CDH4 Cloudera Hadoop distribution.


2

I got it right. In fact it was not filtering out the _SUCCESS file automatically, and reading this marker file caused the exception mentioned in the question. I filtered the files and now it's working fine. Here is the link to how I filtered the files: Filter log files (_success and _log) in FileSystem.listStatus

