
I am using a 10-node HDP cluster where I am trying to run a simple WordCount job using a Bash shell script. Below is the command line I am using:

    yarn jar /usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming-2.7.3.2.6.5.0-292.jar \
    -mapper 'wc -l' \
    -reducer './reducer_wordcount.sh' \
    -file /home/pathirippilly/map_reduce_jobs/shell_scripts/reducer_wordcount.sh \
    -numReduceTasks 1 \
    -input /user/pathirippilly/cards/smalldeck.txt \
    -output /user/pathirippilly/mapreduce_jobs/output_shell

  1. Here reducer_wordcount.sh is the reducer shell script, which is available in my local directory /home/pathirippilly/map_reduce_jobs/shell_scripts
  2. smalldeck.txt is the input file in the HDFS directory /user/pathirippilly/cards
  3. /user/pathirippilly/mapreduce_jobs/output_shell is the output directory
  4. The version of Hadoop I am using is Hadoop 2.7.3.2.6.5.0-292
  5. I am running the above MapReduce job in YARN mode

reducer_wordcount.sh contains:

    #! /user/bin/env bash
    awk '{line_count += $1} END  { print line_count }'
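For context, the mapper/reducer pair can be simulated locally with a plain pipe (a hedged sketch; on the cluster each map task runs `wc -l` over one input split and emits one partial count, and the single reducer sums them — the `26\n26` input below is just a made-up stand-in for two map outputs):

```shell
# Simulate two map tasks that each emitted a line count of 26,
# then run the reducer's awk logic to sum the partial counts.
printf '26\n26\n' | awk '{line_count += $1} END { print line_count }'
# prints 52, the total line count across both (simulated) splits
```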

When I run this on my cluster, I get the below error for reducer_wordcount.sh:

    Error: java.lang.RuntimeException: Error in configuring object
            at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
            at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
            at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
            at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:410)
            at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
            at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
            at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
    Caused by: java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
            ... 9 more
    Caused by: java.lang.RuntimeException: configuration exception
            at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:222)
            at org.apache.hadoop.streaming.PipeReducer.configure(PipeReducer.java:67)
            ... 14 more
    Caused by: java.io.IOException: Cannot run program "/hdp01/hadoop/yarn/local/usercache/pathirippilly/appcache/application_1533622723243_17238/container_e38_1533622723243_17238_01_000004/./reducer_wordcount.sh": error=2, No such file or directory
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
            at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:209)
            ... 15 more
    Caused by: java.io.IOException: error=2, No such file or directory
            at java.lang.UNIXProcess.forkAndExec(Native Method)
            at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
            at java.lang.ProcessImpl.start(ProcessImpl.java:134)
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)

If I instead pass the reducer logic directly as a command-line argument, as below, it works:

    yarn jar /usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming.jar \
    -mapper 'wc -l' \
    -reducer "awk '{line_count += \$1} END  { print line_count }'" \
    -numReduceTasks 1 \
    -input /user/pathirippilly/cards/smalldeck.txt \
    -output /user/pathirippilly/mapreduce_jobs/output_shell

I am pretty new to Hadoop Streaming, so any help is appreciated. The full job output is given below:

    18/09/09 10:10:02 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.
    packageJobJar: [reducer_wordcount.sh] [/usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming-2.7.3.2.6.5.0-292.jar] /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir/streamjob8506373101127930734.jar tmpDir=null
    18/09/09 10:10:03 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
    18/09/09 10:10:03 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
    18/09/09 10:10:03 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
    18/09/09 10:10:03 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
    18/09/09 10:10:05 INFO mapred.FileInputFormat: Total input paths to process : 1
    18/09/09 10:10:06 INFO mapreduce.JobSubmitter: number of splits:2
    18/09/09 10:10:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1533622723243_17238
    18/09/09 10:10:08 INFO impl.YarnClientImpl: Submitted application application_1533622723243_17238
    18/09/09 10:10:08 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19288/proxy/application_1533622723243_17238/
    18/09/09 10:10:08 INFO mapreduce.Job: Running job: job_1533622723243_17238
    18/09/09 10:10:14 INFO mapreduce.Job: Job job_1533622723243_17238 running in uber mode : false
    18/09/09 10:10:14 INFO mapreduce.Job:  map 0% reduce 0%
    18/09/09 10:10:19 INFO mapreduce.Job:  map 100% reduce 0%
    18/09/09 10:10:23 INFO mapreduce.Job: Task Id : attempt_1533622723243_17238_r_000000_0, Status : FAILED
    Error: java.lang.RuntimeException: Error in configuring object
            at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
            at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
            at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
            at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:410)
            at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
            at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
            at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
    Caused by: java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
            ... 9 more
    Caused by: java.lang.RuntimeException: configuration exception
            at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:222)
            at org.apache.hadoop.streaming.PipeReducer.configure(PipeReducer.java:67)
            ... 14 more
    Caused by: java.io.IOException: Cannot run program "/hdp01/hadoop/yarn/local/usercache/pathirippilly/appcache/application_1533622723243_17238/container_e38_1533622723243_17238_01_000004/./reducer_wordcount.sh": error=2, No such file or directory
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
            at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:209)
            ... 15 more
    Caused by: java.io.IOException: error=2, No such file or directory
            at java.lang.UNIXProcess.forkAndExec(Native Method)
            at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
            at java.lang.ProcessImpl.start(ProcessImpl.java:134)
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
            ... 16 more
Comment:

In reducer_wordcount.sh it's /usr/bin/env, not /user/bin/env. – Commented Sep 10, 2018 at 2:26

1 Answer


Refer to Making files available for tasks and Packaging files for job submission in the Hadoop Streaming documentation.

Basically, you only need the file name for the script in -reducer, not a path:

    -reducer 'reducer_wordcount.sh' -file /local/path/to/reducer_wordcount.sh

Make sure the file is executable:

    chmod +x /local/path/to/reducer_wordcount.sh

You can optionally rename the file using the # marker, as shown in the links, but your local script name already matches the name given to -reducer, so that's not necessary.

You also need to fix the shebang: it should be #!/usr/bin/env bash, not #! /user/bin/env bash.
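Putting the script fixes together, here is a minimal local sketch (the script body is the asker's own awk one-liner; the sample counts fed to it are hypothetical stand-ins for map outputs):

```shell
# Write the corrected reducer script with the fixed shebang.
cat > reducer_wordcount.sh <<'EOF'
#!/usr/bin/env bash
awk '{line_count += $1} END { print line_count }'
EOF

# The executable bit must be set before the file is shipped to the cluster.
chmod +x reducer_wordcount.sh

# Quick local smoke test: feed it two partial counts, as map outputs would look.
printf '10\n15\n' | ./reducer_wordcount.sh
# prints 25
```

If the shebang still pointed at /user/bin/env, the `./reducer_wordcount.sh` invocation would fail with the same "No such file or directory" (error=2) seen in the container logs, because the kernel cannot find the interpreter named on the shebang line.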

(By the way, this job is counting lines, not necessarily "words", despite the WordCount name.)


11 Comments

But I did give the full path, only for -file: ` -file /home/pathirippilly/map_reduce_jobs/shell_scripts/reducer_wordcount.sh `. This is because I am submitting the job from /home/pathirippilly/ but my reducer script is in the path above. The reducer is not available on the cluster nodes, so I am using -file to ship it to the cluster.
As you said, even if I execute: ` hadoop jar /usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming-2.7.3.2.6.5.0-292.jar -input /user/pathirippilly/cards/smalldeck.txt -output /user/pathirippilly/mapreduce_jobs/output_shell -mapper 'wc -l' -reducer 'reducer_wordcount.sh' ` I still get the same error: ` No such file or directory `. I am still confused about where I am going wrong. –
From what you just pasted, you're not giving a -file flag, which is still required if you refer to the two links. You additionally need to set the executable bit on your script
I didn't get it :( Can you show me just one small example, or the full format of the arguments? I am pretty new to this.
The example is in the links... One of them shows a Python script, but that doesn't really matter here. Are you even able to run the examples shown in the documentation?
