MapReduce client jars for Hadoop 2.4.1 in Eclipse
Problem description
When I run my Hadoop MapReduce word count jar from the shell in the hadoop folder, it runs properly and the output is generated correctly.
Since I use YARN with Hadoop 2.4.1, when I run the MapReduce sample program from Eclipse, the map phase completes but the job fails in the reduce phase.
It's clear that the problem is with the jar configuration.
Please find the jars that I have added...
This is the error I am facing:
INFO: reduce task executor complete.
Nov 21, 2014 8:50:35 PM org.apache.hadoop.mapred.LocalJobRunner$Job run
WARNING: job_local1638918104_0001
java.lang.Exception: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask.setLocalMapFiles(Ljava/util/Map;)V
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask.setLocalMapFiles(Ljava/util/Map;)V
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:309)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:722)
Exception in thread "Thread-12" java.lang.NoClassDefFoundError: org/apache/commons/httpclient/HttpMethod
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:562)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.httpclient.HttpMethod
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    ... 1 more
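A NoSuchMethodError like this usually means two different Hadoop versions are mixed on the Eclipse classpath. As a quick check, here is a minimal diagnostic sketch (my own illustration, not part of the original question) that prints which jar each of the classes from the stack trace was actually loaded from:

// ClasspathCheck.java - minimal diagnostic sketch.
// If the two printed locations belong to jars from different Hadoop
// versions, a NoSuchMethodError like the one above is the expected symptom.
public class ClasspathCheck {
    public static void main(String[] args) {
        System.out.println(org.apache.hadoop.mapred.ReduceTask.class
                .getProtectionDomain().getCodeSource().getLocation());
        System.out.println(org.apache.hadoop.mapred.LocalJobRunner.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}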
Recommended answer
As per the screenshot, you are manually adding all the dependent jars to the classpath.
It is highly recommended to use Maven for this, which automates adding the dependent jars to the classpath; we only need to declare the main dependencies.
I used the following dependencies in my pom.xml, and they let me run without any issues:
<properties>
    <hadoop.version>2.5.2</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-yarn-api</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-yarn-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-auth</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-yarn-server-nodemanager</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-yarn-server-resourcemanager</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
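With these dependencies in place, the job can be run straight from Eclipse. (The hadoop-client artifact should already pull in most of the others transitively, so the list is somewhat redundant but harmless.) As a sanity check, here is a minimal word count driver, essentially the stock Hadoop example; it is a sketch, not the original poster's code, and the input/output paths come from program arguments:

// WordCount.java - minimal word count job to verify the Maven setup.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Emits (word, 1) for every token in the input line.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Sums the counts for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input dir
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output dir (must not exist)
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}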
Coming to your problem: I checked the classpath, and there are exactly 82 jar files available.
It would be a tedious job to hunt down each jar like this. You can add the jars grouped by function, as linked here.
The other workaround is to add all the jar files from the installed Hadoop directory, i.e. everything under <hadoop-installed>/share/hadoop/, plus all the jars from each lib folder; that is the best you can do without Maven (a small sketch of this follows below).
Or add only the Avro-specific jars, since the exception in the screenshot is thrown by an Avro class. That could solve the Avro jar issue, but you may then face other dependency issues.
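If you do go the manual route, here is a small sketch of the "add everything under share/hadoop" workaround (my own illustration; the default install path is an assumption, adjust it for your machine). It walks the directory tree and prints every jar as one classpath string:

// ListHadoopJars.java - illustrative sketch for the manual workaround.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ListHadoopJars {
    public static void main(String[] args) throws IOException {
        // Default install location is an assumption; pass yours as args[0].
        Path shareDir = Paths.get(args.length > 0 ? args[0]
                : "/usr/local/hadoop/share/hadoop");
        try (Stream<Path> files = Files.walk(shareDir)) {
            // Collect every jar (including those under the lib folders)
            // into one path.separator-delimited classpath string.
            String classpath = files
                    .filter(p -> p.toString().endsWith(".jar"))
                    .map(Path::toString)
                    .collect(Collectors.joining(
                            System.getProperty("path.separator")));
            System.out.println(classpath);
        }
    }
}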
I also faced the same problem while working with Hadoop v1. Later I realized this and switched to Maven with Hadoop v2, so there are no worries about dependent jars.
Your focus stays on Hadoop and the business needs. :)
Hope it helps.