
We are developing a project that processes files using Lambda functions triggered by API Gateway requests. The function fetches a file from an S3 bucket and starts reading it. Up to this point everything works as expected, but when the file reading starts we receive the following error:

(...)
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.24.amzn1.x86_64/jre/lib/rt.jar: error reading zip file
2017-02-06 19:15:06 <9025af71-eca0-11e6-82d2-9ff4b9184005> ERROR JRestlessHandlerContainer:339 - container failure
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.24.amzn1.x86_64/jre/lib/rt.jar: error reading zip file
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.24.amzn1.x86_64/jre/lib/rt.jar: error reading zip file
END RequestId: (some id)
REPORT RequestId: (some id)  Duration: 3047.44 ms    Billed Duration: 3100 ms        Memory Size: 1536 MB    Max Memory Used: 94 MB  

Exception in thread "main" java.lang.Error: java.lang.NoClassDefFoundError: java/lang/Throwable$WrappedPrintWriter
    at lambdainternal.AWSLambda.<clinit>(AWSLambda.java:59)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at lambdainternal.LambdaRTEntry.main(LambdaRTEntry.java:94)
Caused by: java.lang.NoClassDefFoundError:     java/lang/Throwable$WrappedPrintWriter
    at java.lang.Throwable.printStackTrace(Throwable.java:721)
    at lambdainternal.UserFault.trace(UserFault.java:43)
    at lambdainternal.UserFault.makeUserFault(UserFault.java:26)
    at lambdainternal.AWSLambda.startRuntime(AWSLambda.java:290)
    at lambdainternal.AWSLambda.<clinit>(AWSLambda.java:57)
    ... 3 more
START RequestId: (some id) Version: $LATEST
END RequestId: (some id)

We use our own custom file reading/processing library (a Java project), since the files are also customized to our needs; we added it to our project using Maven. Our Lambda jar is generated with the Maven Shade plugin:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.3</version>
    <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Testing the project locally, it works fine and we can read the file information. Our code uses JAX-RS and Spring to handle the API Gateway requests (we aren't sure whether this could affect the results), but so far we haven't been able to solve the problem when running the project on Lambda. During our tests we deliberately increased the function's timeout and memory, but no matter what the limits are, the error persists.

Thank you in advance.

  • Can you show your entire pom.xml file? Did you see this: docs.aws.amazon.com/lambda/latest/dg/… Commented Feb 19, 2017 at 22:18
  • This "container failure" that's shown there can indeed indicate that the error is on the AWS side. You should take this to their support team or the AWS forums. Commented Feb 20, 2017 at 6:13
  • Hi @JeshanBabooa, it's what we suspect too. We already posted this issue in the AWS forums as well, but until now we had no answer and decided to post it here too. Commented Feb 20, 2017 at 17:21
  • Hi @Zigglzworth, unfortunately I can't post the entire pom.xml, but before we added the file processing library our function worked fine, and the plugin suggested there for generating the jar is the same one we're using. Commented Feb 20, 2017 at 17:27
  • When you built the project you did so on your local machine (right?), but maybe there are native dependencies used somewhere. Try spinning up an Amazon Linux EC2 instance and building your Java project there, then upload that to your Lambda function. Commented Feb 20, 2017 at 22:52

2 Answers

1

Through logging, debugging our code, and contacting AWS support, we found that one of our libraries was creating files inside a forbidden directory, as explained here: https://aws.amazon.com/lambda/faqs/ , which caused the Lambda function to fail.

They explained that if you need to create files, you must use the /tmp directory; we didn't notice this until we hit the problem. We had read the documentation over and over, but this one detail still caught us out. It happens, I suppose. Anyway, after changing this in the library, the function now executes exactly as expected.
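For reference, a minimal sketch of the fix under the assumption above (the class and method names here are ours for illustration, not from the asker's actual library): any scratch file the code creates should be resolved against /tmp, the only writable path in the Lambda execution environment.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class TmpFileExample {

    // /tmp is the only writable directory inside a Lambda function's
    // execution environment; writing anywhere else fails.
    private static final Path TMP_DIR = Paths.get("/tmp");

    /** Writes content to a scratch file under /tmp and returns its path. */
    public static Path writeWorkFile(String name, byte[] content) throws IOException {
        Path target = TMP_DIR.resolve(name);
        return Files.write(target, content);
    }

    public static void main(String[] args) throws IOException {
        Path written = writeWorkFile("example.dat", "hello".getBytes());
        System.out.println("Wrote " + Files.size(written) + " bytes to " + written);
        Files.deleteIfExists(written); // clean up; /tmp space is limited
    }
}
```

Note that /tmp is ephemeral and capacity-limited, so it is good practice to delete scratch files when the handler is done with them, as the example does.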

Thank you all for the help.


2 Comments

Could you point to a specific question from the FAQs? I couldn't find it by skimming.
Well, I couldn't find it either (that's odd, but I believe they've updated the page since I last saw it). Anyway, I found another link that may help: docs.aws.amazon.com/lambda/latest/dg/limits.html . Search for something like 'The /tmp directory can be used for ...'
0

Are you still having the problem? It could be that the Lambda service is undergoing maintenance in the particular region you are using. This error suggests the JRE is corrupted and the classes provided by rt.jar cannot be accessed. Contact support if necessary. You can also check AWS service health on the status dashboard.

2 Comments

Hi, yes, the error persists. We are still looking for a solution.
Not sure about the downvote. We have had similar problems with Lambdas in the past and they resolved on their own. As I said, contact support if you are being charged for the service while getting errors.
