5

I'm trying Hadoop's basic MapReduce example from the tutorial at http://java.dzone.com/articles/hadoop-basics-creating

The full code of the class (also available at the URL above) is:

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class Dictionary {

    public static class WordMapper extends Mapper<Text, Text, Text, Text> {
        private Text word = new Text();

        public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString(), ",");
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(key, word);
            }
        }
    }

    public static class AllTranslationsReducer extends Reducer<Text, Text, Text, Text> {
        private Text result = new Text();

        public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
            String translations = "";
            for (Text val : values) {
                translations += "|" + val.toString();
            }
            result.set(translations);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("welcome to Java 1");
        Configuration conf = new Configuration();
        System.out.println("welcome to Java 2");
        Job job = new Job(conf, "dictionary");
        job.setJarByClass(Dictionary.class);
        job.setMapperClass(WordMapper.class);
        job.setReducerClass(AllTranslationsReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(KeyValueTextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path("/tmp/hadoop-cscarioni/dfs/name/file"));
        FileOutputFormat.setOutputPath(job, new Path("output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

But when I run it in Eclipse, I get the following error:

welcome to Java 1
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:73)
at Dictionary.main(Dictionary.java:43)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 2 more
2 Comments

  • Have you referenced Apache Commons Logging in your project? (Dec 8, 2012)
  • I get this error when I forget to launch the YARN services. Just run sbin/start-yarn.sh before you execute the job. (Aug 4, 2014)

5 Answers

10

Please note that the exception is NoClassDefFoundError instead of ClassNotFoundException.

Note: NoClassDefFoundError is thrown when a class is not visible at run time but was visible at compile time. This can happen in the distribution or packaging of JAR files, where not all the required class files were included.

To fix: check for differences between your build-time and runtime classpaths.

NoClassDefFoundError and ClassNotFoundException are different: one is an Error and the other is an Exception.

NoClassDefFoundError: arises when the JVM has trouble loading a class it expected to find; a program that compiled successfully cannot run because a required class file cannot be loaded.

ClassNotFoundException: indicates that the class was not found on the classpath, i.e., we are explicitly trying to load a class by name and the class/JAR containing it does not exist on the classpath.
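
To see the difference in action, here is a minimal sketch (my own illustration, not part of the original answer). It assumes hadoop-core.jar is on the compile classpath but commons-logging.jar is missing at run time, which is exactly the situation in the question:

import org.apache.hadoop.conf.Configuration;

public class ClassLoadingDemo {
    public static void main(String[] args) {
        // ClassNotFoundException (a checked exception): we explicitly ask the
        // class loader for a class by name and it is not on the classpath.
        try {
            Class.forName("org.apache.commons.logging.LogFactory");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }

        // NoClassDefFoundError (an Error): this line compiled fine against
        // hadoop-core, but at run time the static initializer of Configuration
        // cannot load LogFactory, so the JVM throws NoClassDefFoundError --
        // the same failure shown in the question's stack trace.
        Configuration conf = new Configuration();
        System.out.println(conf);
    }
}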


1 Comment

Not exactly true. The named file was found in the classpath at run time, but for some reason (there are about a dozen different ones) it could not be successfully loaded.
6

NoClassDefFoundError occurs when a class is not visible at run time but was at compile time. This may be related to how the JAR files were assembled, because not all the required class files were included.

So try adding the commons-logging-1.1.1 JAR to your classpath; you can get it from http://commons.apache.org/logging/download_logging.cgi

1 Comment

Not exactly true. The named file was found in the classpath at run time, but for some reason (there are about a dozen different ones) it could not be successfully loaded.
4

NoClassDefFoundError occurs when the named class is successfully located in the classpath, but for some reason cannot be loaded and verified. Most often the problem is that another class needed for the verification of the named class is either missing or is the wrong version.

Generally speaking, this error means "double-check that you have all the right JAR files (of the right version) in your classpath".


2

It's a very common error when you run a Hadoop MapReduce program in a local IDE (Eclipse).

You have probably already added hadoop-core.jar to your build path, so no compile error is detected in your program. But you get the error when you run it, because hadoop-core depends on commons-logging.jar (as well as some other JARs). You may need to add the JARs under /lib to your build path.

I recommend using Maven or another dependency management tool to manage the dependencies.
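
For example, with Maven it is enough to declare hadoop-core in the pom.xml; its transitive dependencies (including commons-logging) are then pulled onto the classpath automatically. A minimal sketch (the version number is only an example and should match your Hadoop installation):

<dependencies>
  <dependency>
    <!-- hadoop-core brings in commons-logging and the other required JARs transitively -->
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.4</version>
  </dependency>
</dependencies>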


0

Please read this article: http://kishorer.in/2014/10/22/running-a-wordcount-mapreduce-example-in-hadoop-2-4-1-single-node-cluster-in-ubuntu-14-04-64-bit/. It explains how to reference dependencies in Eclipse without Maven. However, Maven is the preferred way, from what I understand.

1 Comment

The site seems to be down. However, there is a snapshot on the Wayback Machine: web.archive.org/web/20160122234327/http://kishorer.in/2014/10/…
