I get the error below when I package (jar) and run my DefaultHadoopJob.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Tool
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 12 more
Could not find the main class: DefaultHadoopJobDriver. Program will exit.


Commands used to build the jar:

# jar -cvf dhj.jar 
# hadoop -jar dhj.jar DefaultHadoopJobDriver

The above command gave me the error "Failed to load Main-Class manifest attribute from dhj.jar".
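One way to double-check what actually ended up in the manifest is to read it back with the JDK's java.util.jar API. This is a hedged stand-alone sketch (the class name ManifestCheck and the default jar path are placeholders, not part of the job):

```java
import java.io.File;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Prints the Main-Class attribute of a jar's manifest, or a notice if it is absent.
public class ManifestCheck {
    public static String mainClassOf(String jarPath) throws Exception {
        try (JarFile jar = new JarFile(jarPath)) {
            Manifest mf = jar.getManifest();
            if (mf == null) return null; // jar has no manifest at all
            return mf.getMainAttributes().getValue(Attributes.Name.MAIN_CLASS);
        }
    }

    public static void main(String[] args) throws Exception {
        String path = args.length > 0 ? args[0] : "dhj.jar";
        if (!new File(path).exists()) {
            System.out.println(path + " not found");
            return;
        }
        String mc = mainClassOf(path);
        System.out.println(mc == null ? "No Main-Class attribute" : "Main-Class: " + mc);
    }
}
```

If this prints "No Main-Class attribute" for your jar, the -e flag (or the manifest file) didn't take effect.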

I rebuilt the jar with a manifest using the commands below:

jar -cvfe dhj.jar DefaultHadoopJobDriver .
hadoop -jar dhj.jar DefaultHadoopJobDriver

This returned the original error message that I reported above.

My Hadoop job has a single class, DefaultHadoopJobDriver, that extends Configured and implements Tool; its run method is the only code, handling job creation and setting the input and output paths. Also, I'm using the new API.

I'm running Hadoop 1.2.1, and the job works fine from Eclipse.

I suspect this has something to do with the classpath. Please help.
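Since a classpath problem is suspected: one quick way to test whether the missing class is visible to the JVM at all is a small probe like this (ClasspathProbe is a hypothetical helper name, not part of Hadoop or the job):

```java
// Reports whether a given class can be found on the current classpath,
// without initializing it (so no static initializers run).
public class ClasspathProbe {
    public static boolean isLoadable(String className) {
        try {
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String target = args.length > 0 ? args[0] : "org.apache.hadoop.util.Tool";
        System.out.println(target + " loadable: " + isLoadable(target));
    }
}
```

Running it with plain java (no Hadoop jars on the classpath) should print false for org.apache.hadoop.util.Tool, which reproduces the NoClassDefFoundError condition above.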

3 Answers


To execute that jar, you don't pass -jar to hadoop. The command is:

 hadoop jar <jar> [mainClass] args...

If the jar still throws java.lang.ClassNotFoundException, you can use the

hadoop classpath

command to check whether hadoop-core-1.2.1.jar is present on your Hadoop installation's classpath.

If it's not present in that list, you have to add the jar to Hadoop's lib directory.
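If the hadoop classpath command isn't convenient, a similar check can be approximated from plain Java by dumping the JVM's own classpath entries. A minimal sketch (ShowClasspath is a made-up helper name):

```java
import java.io.File;

// Prints each entry of the JVM's classpath on its own line,
// roughly analogous to inspecting the output of `hadoop classpath`.
public class ShowClasspath {
    public static String[] entries() {
        return System.getProperty("java.class.path").split(File.pathSeparator);
    }

    public static void main(String[] args) {
        for (String entry : entries()) {
            System.out.println(entry);
        }
    }
}
```

Piping the output through grep for hadoop-core tells you whether the jar is actually visible to the JVM that's running.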




Try building your Hadoop Java code with all the Hadoop jars available in Hadoop's lib folder. In this case you are missing the Hadoop util class, which lives in hadoop-core-*.jar.

The classpath can be baked into the jar at build time, or supplied externally. Note that the hadoop launcher does not take -cp/-jar flags the way java does; the usual way to add extra jars is the HADOOP_CLASSPATH environment variable:

    HADOOP_CLASSPATH=<path_containing_hadoop_jars> hadoop jar <jar_name>

6 Comments

Same error. I compiled using javac -classpath $(hadoop classpath) -d bin src/* and built the jar using jar -cvfe dhj.jar DefaultHadoopJobDriver -C bin . I also tried running with hadoop -cp $(hadoop classpath) -jar dhj.jar DefaultHadoopJobDriver.

Have you exported your code as a runnable jar with all the Hadoop jars on its build path?

It's not a runnable jar; it's just a regular jar with the class file and the manifest.

In case anyone is using Maven and lands here: dependency issues can be resolved by asking Maven to bundle every jar the project requires inside the project's own jar (a shaded, or "fat", jar). That way, Hadoop doesn't have to look elsewhere for dependencies; it finds them right there. Here's how to do this:

  1. Open pom.xml.

  2. Add a <build><plugins> section inside your <project> tag, if you don't already have one.

  3. Add the following inside <plugins>:

    <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>1.7.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.slf4j:slf4j-api</exclude>
                                <exclude>junit:junit</exclude>
                                <exclude>jmock:jmock</exclude>
                                <exclude>xml-apis:xml-apis</exclude>
                                <exclude>org.testng:testng</exclude>
                                <exclude>org.mortbay.jetty:jetty</exclude>
                                <exclude>org.mortbay.jetty:jetty-util</exclude>
                                <exclude>org.mortbay.jetty:servlet-api-2.5</exclude>
                                <exclude>tomcat:jasper-runtime</exclude>
                                <exclude>tomcat:jasper-compiler</exclude>
                                <exclude>org.apache.hadoop:hadoop-core</exclude>
                                <exclude>org.apache.mahout:mahout-math</exclude>
                                <exclude>commons-logging:commons-logging</exclude>
                                <exclude>org.mortbay.jetty:jsp-api-2.1</exclude>
                                <exclude>org.mortbay.jetty:jsp-2.1</exclude>
                                <exclude>org.eclipse.jdt:core</exclude>
                                <exclude>ant:ant</exclude>
                                <exclude>org.apache.hadoop:avro</exclude>
                                <exclude>jline:jline</exclude>
                                <exclude>log4j:log4j</exclude>
                                <exclude>org.yaml:snakeyaml</exclude>
                                <exclude>javax.ws.rs:jsr311-api</exclude>
                                <exclude>org.slf4j:jcl-over-slf4j</exclude>
                                <exclude>javax.servlet:servlet-api</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/jruby.home</exclude>
                                    <exclude>META-INF/license</exclude>
                                    <exclude>META-INF/maven</exclude>
                                    <exclude>META-INF/services</exclude>
                                </excludes>
                            </filter>
                        </filters>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    

Now build your project again, and run with the normal hadoop jar my.jar ... command. It shouldn't complain about dependencies now. Hope this helps!
