
I have successfully run MapReduce Java code on this machine, and now I am trying to run MapReduce code written in Python on the same machine. For this I am using Hadoop 3.2.1 and hadoop-streaming-3.2.1.jar.

I have tested the code locally with the command

[dsawale@localhost ~]$ cat Desktop/sample.txt | python PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py | sort | python PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py

and it displays the correct output.
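(The mapper and reducer scripts themselves are not shown in the question; as a sketch only, a typical Streaming word-count pair behaves something like the two functions below. The function names are illustrative, not taken from the question.)

```python
import sys
from itertools import groupby

def mapper(lines):
    # Emit "word\t1" for every word on stdin, Streaming-style.
    for line in lines:
        for word in line.split():
            yield "%s\t1" % word

def reducer(sorted_lines):
    # Sum the counts per word; input must already be sorted by key,
    # which is what the `sort` stage in the shell pipeline provides.
    for word, group in groupby(sorted_lines, key=lambda l: l.split("\t")[0]):
        total = sum(int(l.split("\t")[1]) for l in group)
        yield "%s\t%d" % (word, total)

# Mirrors the pipeline: cat sample.txt | mapper | sort | reducer
print(list(reducer(sorted(mapper(["hello world hello"])))))
# → ['hello\t2', 'world\t1']
```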

But when I try to run it on the Hadoop cluster with the command

[dsawale@localhost ~]$ hadoop jar Desktop/JAR/hadoop-streaming-3.2.1.jar -mapper mapper.py -reducer reducer.py -file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py -file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py -input /sample.txt -output pysamp

I get the following output:

packageJobJar: [PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py, PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py, /tmp/hadoop-unjar6715579504628929924/] [] /tmp/streamjob3211585412475799030.jar tmpDir=null
Streaming Command Failed!

This is my very first Python MapReduce program. Could you please help me get rid of this error? Thanks!

Configuration files: mapred-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
    </property>
    <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
    </property>
    <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
    </property>
</configuration>

core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.permission</name>
        <value>false</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/home/dsawale/hadoop-3.2.1/hadoop2_data/hdfs/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/home/dsawale/hadoop-3.2.1/hadoop2_data/hdfs/datanode</value>
    </property>
</configuration>

yarn-site.xml:

<configuration>
    <!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.auxservices.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
</configuration>
  • Which is your mapper script and reducer script? The filepaths you have mentioned for -file is different from -mapper and -reducer. Commented Mar 23, 2020 at 14:56
  • Mapper script is : PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py Reducer script: PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py Commented Mar 23, 2020 at 15:15
  • They must be passed to mapper and reducer arguments. Updated it as the answer. Commented Mar 23, 2020 at 15:21

1 Answer


The file paths passed to the -mapper and -reducer arguments are incorrect: the command uses mapper.py and reducer.py, which do not match the scripts shipped with -file (and both -file options point at WordCountMapper.py).

Try:

hadoop jar Desktop/JAR/hadoop-streaming-3.2.1.jar \
-mapper PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py \
-reducer PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py \
-file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py \
-file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py \
-input /sample.txt \
-output pysamp
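Separately (an assumption, not confirmed by the output shown): Hadoop Streaming executes the -mapper and -reducer programs directly inside the task containers, so if fixing the paths alone doesn't help, check that each script starts with a shebang line and has the executable bit set. A sketch, demonstrated on a stand-in file rather than the real paths from the question:

```shell
# Each script's first line should be a shebang, e.g. #!/usr/bin/env python,
# and the executable bit must be set. Shown here on a throwaway copy:
cat > /tmp/WordCountMapper.py <<'EOF'
#!/usr/bin/env python
import sys
for line in sys.stdin:
    for word in line.split():
        print("%s\t1" % word)
EOF
chmod +x /tmp/WordCountMapper.py
test -x /tmp/WordCountMapper.py && echo "mapper is executable"
```

Alternatively, you can sidestep the executable-bit requirement by invoking the interpreter explicitly, e.g. -mapper "python WordCountMapper.py".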