log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:441)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
at org.apache.hadoop.util.Shell.run(Shell.java:379)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:833)
at test.WordCount.run(WordCount.java:150)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at test.WordCount.main(WordCount.java:155)
Hadoop itself runs fine, and the WordCount jar executes successfully when submitted to Hadoop directly, but running it from Eclipse throws this NullPointerException. The run arguments are:
input output
DFS also connects fine for upload and download.
Q start-all starts normally, the firewall is off, and safemode has been turned off too. This is a pseudo-distributed setup on Windows under Cygwin, yet a -put operation still throws the error below. Could an expert please advise? I've been stuck on this all afternoon; I've only just started with Hadoop and I'm out of ideas.
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/dz64/input4/wordcount.txt could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
        at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
        at org.apache.hadoop.ipc.Client.call(Client.java:740)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.jav
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy0.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2937
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:281
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
Answer: the hadoop fs -put error
Check the datanode's logs; there will be errors in them. The datanode did not start successfully.
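Before retrying the -put, you can also ask the namenode how many datanodes it can actually see, from a few lines of client code. A minimal sketch against the 0.20-era API used in this thread; the namenode address is an assumption, so adjust it to your fs.default.name:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class DatanodeCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://localhost:9000"); // assumed namenode URI
        DistributedFileSystem dfs = (DistributedFileSystem) FileSystem.get(conf);
        // Datanodes the namenode currently counts as live
        DatanodeInfo[] nodes = dfs.getDataNodeStats();
        System.out.println("datanodes reporting: " + nodes.length);
        for (DatanodeInfo node : nodes) {
            System.out.println(node.getName() + ", " + node.getRemaining() + " bytes free");
        }
    }
}

If this prints zero datanodes, the "could only be replicated to 0 nodes" error above follows directly, and the datanode log is the right place to dig.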
Q Hadoop 2.3.0, fully distributed install on Linux
Eclipse is installed on Windows; running WordCount there errors as follows:
 20:12:01,750 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(996)) - fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: This version of SLF4J requires log4j version 1.2.12 or later. See also http://www.slf4j.org/codes.html#log4j_version
 20:12:02,760 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 20:12:02,812 ERROR [main] util.Shell (Shell.java:getWinUtilsPath(336)) - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:92)
at org.apache.hadoop.security.Groups.<init>(Groups.java:76)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:239)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:718)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:703)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:605)
at org.apache.hadoop.mapreduce.task.JobContextImpl.<init>(JobContextImpl.java:72)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:133)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:123)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:128)
at wc.WordCount.main(WordCount.java:103)
Job start!
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/protobuf/ServiceException
at org.apache.hadoop.ipc.ProtobufRpcEngine.<clinit>(ProtobufRpcEngine.java:69)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1821)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1786)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880)
at org.apache.hadoop.ipc.RPC.getProtocolEngine(RPC.java:203)
at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:334)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:241)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:141)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:569)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:512)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:142)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2316)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:466)
at wc.WordCount.main(WordCount.java:135)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ServiceException
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 22 more
Right: add System.setProperty("hadoop.home.dir", "d:/hadoop"); to your code, and check whether winutils.exe exists under the bin directory of your Hadoop directory on Windows; if it's missing, download one and copy it in.
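In code form, a minimal sketch of that workaround; D:/hadoop is an assumed location on your machine, not a value from the post, and D:/hadoop/bin/winutils.exe must actually exist there:

public class Launcher {
    public static void main(String[] args) throws Exception {
        // Must run before the first Hadoop class triggers
        // org.apache.hadoop.util.Shell's static initializer.
        System.setProperty("hadoop.home.dir", "D:/hadoop"); // assumed path
        // ... then build your Configuration / Job and submit as usual ...
    }
}

Setting the HADOOP_HOME environment variable to the same directory before starting Eclipse achieves the same effect.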
Q package .day140109;

import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

public class Dedup {

    public static class Map extends MapReduceBase implements
            Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    public static class Reduce extends MapReduceBase implements
            Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(Dedup.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);
        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf); // line 73
    }
}
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(Unknown Source)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
at org.apache.hadoop.util.Shell.run(Shell.java:379)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:833)
at .day140109.Dedup.main(Dedup.java:73)
I just grabbed a wordcount example here. Otherwise, could you give me an example I can run? My cluster is Hadoop 2.2 and it doesn't seem to come with the bundled examples....
There are plenty of examples posted on this site; see also the minimal one below.
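Since you asked for something runnable, here is a minimal WordCount for the Hadoop 2.x API, offered as a sketch rather than anything official; the class name MiniWordCount is arbitrary, and the input/output paths are taken from the command line (hadoop jar miniwordcount.jar MiniWordCount <in> <out>):

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MiniWordCount {
    // Splits each input line into words and emits <word, 1>
    public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        public void map(Object key, Text value, Context ctx)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                ctx.write(word, ONE);
            }
        }
    }
    // Sums the counts for each word
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();
        public void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            result.set(sum);
            ctx.write(key, result);
        }
    }
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "mini word count");
        job.setJarByClass(MiniWordCount.class); // lets the cluster find this jar
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}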
Q I set up a three-machine cluster
with one namenode and two datanodes.
When I run bin/start-all.sh on the namenode to start the cluster, it reports the error below. Please advise!
This script is Deprecated. Instead use start-dfs.sh and start-mapred.sh
starting namenode, logging to /home/dbrg/HadoopInstall/hadoop-0.21.0/bin/../logs/hadoop-root-namenode-dbrg.out
dbrg: starting datanode, logging to /home/dbrg/HadoopInstall/hadoop-0.21.0/bin/../logs/hadoop-root-datanode-dbrg.out
localhost: datanode running as process 4238. Stop it first.
dbrg: datanode running as process 4238. Stop it first.
localhost: starting secondarynamenode, logging to /home/dbrg/HadoopInstall/hadoop-0.21.0/bin/../logs/hadoop-root-secondarynamenode-dbrg.out
dbrg: secondarynamenode running as process 4431. Stop it first.
localhost: Exception in thread "main" java.lang.RuntimeException: com.sun.org.apache.xerces.internal.impl.io.MalformedByteSequenceException: Invalid byte 1 of 1-byte UTF-8 sequence.
localhost:      at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1513)
localhost:      at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1378)
localhost:      at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1324)
localhost:      at org.apache.hadoop.conf.Configuration.get(Configuration.java:545)
localhost:      at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:159)
localhost:      at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:188)
localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:113)
localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:481)
localhost: Caused by: com.sun.org.apache.xerces.internal.impl.io.MalformedByteSequenceException: Invalid byte 1 of 1-byte UTF-8 sequence.
starting jobtracker, logging to /home/dbrg/HadoopInstall/hadoop-0.21.0/bin/../logs/hadoop-root-jobtracker-dbrg.out
localhost: starting tasktracker, logging to /home/dbrg/HadoopInstall/hadoop-0.21.0/bin/../logs/hadoop-root-tasktracker-dbrg.out
dbrg: tasktracker running as process 4762. Stop it first.
dbrg: tasktracker running as process 4762. Stop it first.
Is there an IaaS forum?
Q 12:25:47,472 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = xiaohua-PC/192.168.1.100
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
 12:25:47,550 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.net.BindException: Problem binding to localhost/127.0.0.1:9000 : Address already in use: bind
at org.apache.hadoop.ipc.Server.bind(Server.java:190)
at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:253)
at org.apache.hadoop.ipc.Server.<init>(Server.java:1026)
at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:488)
at org.apache.hadoop.ipc.RPC.getServer(RPC.java:450)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:191)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:279)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:956)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
Caused by: java.net.BindException: Address already in use: bind
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:344)
at sun.nio.ch.Net.bind(Net.java:336)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:199)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.apache.hadoop.ipc.Server.bind(Server.java:188)
It's a port conflict. You can use 360 to see which program is occupying the port and kill it, or try a different port. The default one is
Can you even install 360 on Linux?
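Whatever tool you use to inspect ports, you can also probe one from plain Java on any OS. A tiny sketch, reusing the address from the log above (localhost:9000):

import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class PortProbe {
    public static void main(String[] args) throws Exception {
        // If another process already owns 127.0.0.1:9000, bind() throws the same
        // java.net.BindException the namenode logged above.
        ServerSocket socket = new ServerSocket();
        socket.bind(new InetSocketAddress("127.0.0.1", 9000));
        System.out.println("port 9000 is free");
        socket.close();
    }
}

A leftover namenode from an earlier start is a common owner of the port, so it is worth checking for a running Hadoop process first.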
Q A Hadoop test program errors out; it used to run fine, please advise
It sat untouched for a few days and now it fails. The command:
$ hadoop jar /home/hadoop/myprogram/BAK/wordcount.jar WordCount /test/test.txt /test/out
And the output:
14/04/25 12:13:27 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:54311
14/04/25 12:13:28 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/hadoop/.staging/job_1_17362/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
        at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1384)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2477)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:555)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
        at org.apache.hadoop.ipc.Client.call(Client.java:1347)
        at org.apache.hadoop.ipc.Client.call(Client.java:1300)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:330)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1226)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1078)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
14/04/25 12:13:28 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1_17362
14/04/25 12:13:28 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/hadoop/.staging/job_1_17362/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
(server-side stack identical to the one above)
Exception in thread "main" org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/hadoop/.staging/job_1_17362/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
(full stack identical to the one above)
14/04/25 12:13:28 ERROR hdfs.DFSClient: Failed to close file /tmp/hadoop-yarn/staging/hadoop/.staging/job_1_17362/job.jar
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/hadoop/.staging/job_1_17362/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
(server-side stack identical to the one above)
Output of my jps command:
1255 SecondaryNameNode
1426 ResourceManager
1525 NodeManager
1119 DataNode
1023 NameNode
You have mail in /var/spool/mail/hadoop
HDFS commands also run fine:
$ hadoop fs -ls /test
Found 2 items
-rw-r--r--   1 hadoop supergroup          0  12:04 /test/HadoopData.txt
-rw-r--r--   1 hadoop supergroup         78  00:53 /test/test.txt
Please advise, I'm desperate!
Q Hadoop is in a Linux environment, version 2.3.0;
Eclipse is on Windows 7.
WordCount.java is as follows:
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    /**
     * User-defined map function that processes the input file as <key, value> pairs.
     * The Map step extends the Mapper class from the org.apache.hadoop.mapreduce
     * package and overrides its map method. By adding two lines that print the key
     * and value to the console, you can see that value holds one line of the text
     * file (terminated by a newline) and key is the offset of that line's first
     * character from the start of the file. A StringTokenizer then splits each line
     * into words and emits <word, 1> as the map output; the MapReduce framework
     * handles the rest. map is called once per line of data. Tokenizer: word splitter.
     */
    public static class TokenizerMapper extends
            Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        /*
         * Override the map method of the Mapper class
         */
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            // System.out.println(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken()); // take the next token and write it out
                context.write(word, one);
            }
        }
    }

    /**
     * User-defined reduce function; if there are multiple reducers, each one
     * processes its own share of the map output. The Reduce step extends the
     * Reducer class from the org.apache.hadoop.mapreduce package and overrides its
     * reduce method. The map output <key, values> has a single word as key and a
     * list of counts for that word as values; the map output is the reduce input,
     * so reduce only has to iterate over values and sum them to get a word's total count.
     */
    public static class IntSumReducer extends
            Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        /**
         * Environment setup
         */
        // EJob is a helper class (not shown in the post) that packages the bin
        // directory into a temporary jar for submission from Eclipse.
        File jarFile = EJob.createTempJar("bin");
        ClassLoader classLoader = EJob.getClassLoader();
        Thread.currentThread().setContextClassLoader(classLoader);
        /**
         * Configuration for connecting to the hadoop cluster
         */
        Configuration conf = new Configuration(true);
        conf.set("fs.default.name", "hdfs://192.168.52.128:9000");
        conf.set("hadoop.job.user", "hadoop");
        conf.set("mapreduce.framework.name", "yarn");
        // 192.168.52.128:9001
        conf.set("mapreduce.jobtracker.address", "192.168.52.128:9001");
        conf.set("yarn.resourcemanager.hostname", "192.168.52.128");
        conf.set("yarn.resourcemanager.admin.address", "192.168.52.128:8033");
        conf.set("yarn.resourcemanager.address", "192.168.52.128:8032");
        conf.set("yarn.resourcemanager.resource-tracker.address", "192.168.52.128:8036");
        conf.set("yarn.resourcemanager.scheduler.address", "192.168.52.128:8030");
        System.setProperty("hadoop.home.dir", "E:/hadoop/hadoop-2.3.0");

        String[] otherArgs = new String[2];
        otherArgs[0] = "hdfs://192.168.52.128:9000/dir_input"; // input directory; the files must be uploaded there beforehand
        String time = new SimpleDateFormat("yyyyMMddHHmmss").format(new Date());
        otherArgs[1] = "hdfs://192.168.52.128:9000/t/" + time; // output directory; each run needs a fresh directory, hence the timestamp

        /*
         * setJobName() names this Job. A sensible name makes the Job easier to
         * find and monitor on the JobTracker and TaskTracker pages.
         */
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        ((JobConf) job.getConfiguration()).setJar(jarFile.toString()); // with this line the mapreduce task can be submitted straight from eclipse; if you package this file into a jar instead, comment it out, otherwise the environment will not be found at run time
        // job.setMaxMapAttempts(100); // sets the maximum number of map attempts; it does not necessarily fix how many maps the job runs
        // job.setNumReduceTasks(5); // sets the number of reducers, i.e. how many output files are produced
        /*
         * The Map (split), Combiner (merging of intermediate results) and Reduce
         * (final merge) classes for this Job. Using the Reduce class as the
         * combiner merges the map-side intermediate results and avoids putting
         * pressure on network data transfer.
         */
        job.setMapperClass(TokenizerMapper.class); // run the user-defined map function
        job.setCombinerClass(IntSumReducer.class); // merge the user-defined map output locally, which saves bandwidth
        job.setReducerClass(IntSumReducer.class); // run the user-defined reduce function
        /*
         * Next set the key and value types of the Job output <key, value>. Since
         * the result is <word, count>, key is "Text" (the counterpart of Java's
         * String) and value is "IntWritable" (the counterpart of Java's int).
         */
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        /*
         * Add the input folder or file path, i.e. the path of the input data.
         * The input files are split into splits, which are broken into <key, value>
         * pairs that feed the user-defined map function. Each split should stay
         * below the HDFS block size (64M by default); otherwise the split has to
         * fetch its remainder beyond the block size from other machines, and the
         * extra network traffic slows the computation down. The default is
         * TextInputFormat, i.e. plain-text input data files.
         */
        System.out.println("Job start!");
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        /*
         * Set the output path. The default is TextOutputFormat, i.e. a plain-text
         * output file with fields separated by tabs.
         */
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        /*
         * Run with the settings above
         */
        if (job.waitForCompletion(true)) {
            System.out.println("ok!");
        } else {
            System.out.println("error!");
            System.exit(0);
        }
    }
}
Right-clicking and choosing Run as hadoop errors as follows:
 14:26:58,734 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(996)) - fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: This version of SLF4J requires log4j version 1.2.12 or later. See also http://www.slf4j.org/codes.html#log4j_version
 14:26:59,393 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Job start!
Exception in thread "main" java.lang.VerifyError: class org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$AppendRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldS
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
at java.lang.Class.privateGetPublicMethods(Unknown Source)
at java.lang.Class.privateGetPublicMethods(Unknown Source)
at java.lang.Class.getMethods(Unknown Source)
at sun.misc.ProxyGenerator.generateClassFile(Unknown Source)
at sun.misc.ProxyGenerator.generateProxyClass(Unknown Source)
at java.lang.reflect.Proxy.getProxyClass0(Unknown Source)
at java.lang.reflect.Proxy.newProxyInstance(Unknown Source)
at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:334)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:241)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:141)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:569)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:512)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:142)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2316)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:466)
at wc.WordCount.main(WordCount.java:137)
Download it with Maven:
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
</dependency>
Q In eclipse, run as -> run on hadoop, testing a small mapreduce dedup program (from /thread-.html) fails with a class-not-found error:
14/11/14 13:39:56 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/11/14 13:39:56 INFO input.FileInputFormat: Total input paths to process : 2
14/11/14 13:39:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/11/14 13:39:56 WARN snappy.LoadSnappy: Snappy native library not loaded
14/11/14 13:39:58 INFO mapred.JobClient: Running job: job__0018
14/11/14 13:39:59 INFO mapred.JobClient:  map 0% reduce 0%
14/11/14 13:40:09 INFO mapred.JobClient: Task Id : attempt__0018_m_, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: com.hebut.mr.Dedup$Map
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:849)
Software versions: eclipse jee 4.4.1, hadoop-1.1.2, hadoop-eclipse-plugin-1.1.2.jar.
I'm not familiar with either Java or Hadoop; I've only been at Hadoop for a month, so please advise. When I tested the wordcount example earlier I hit this same error; after a lot of fiddling I fixed it by recompiling the plugin. Now this dedup test throws it again.
dedup.java:
package com.hebut.mr;

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class Dedup {

    // map copies the value from the input into the output key and emits it directly
    public static class Map extends Mapper<Object, Text, Text, Text> {
        private static Text line = new Text(); // one line of data

        // the map function
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            line = value;
            context.write(line, new Text(""));
        }
    }

    // reduce copies the key from the input into the output key and emits it directly
    public static class Reduce extends Reducer<Text, Text, Text, Text> {
        // the reduce function
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            context.write(key, new Text(""));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // this line is critical
        conf.set("mapred.job.tracker", "192.168.1.2:9001");
        String[] ioArgs = new String[] { "dedup_in", "dedup_out" };
        String[] otherArgs = new GenericOptionsParser(conf, ioArgs).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: Data Deduplication <in> <out>");
            System.exit(2);
        }
        Job job = new Job(conf, "Data Deduplication");
        job.setJarByClass(Dedup.class);
        // set the Map, Combine and Reduce classes
        job.setMapperClass(Map.class);
        job.setCombinerClass(Reduce.class);
        job.setReducerClass(Reduce.class);
        // set the output types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        // set the input and output directories
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
Packaging a jar and running it on the cluster with the hadoop jar command is the most reliable approach. As for submitting tasks from Eclipse with the plugin, some articles say it can be done, but very few people actually manage it; I never got it working myself. One partial workaround is sketched below.
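The "No job jar file set ... See JobConf(Class) or JobConf#setJar(String)" warning at the top of this thread points at that workaround: export the jar yourself and tell the job configuration where it is before submitting. A sketch of just the relevant driver lines, assuming the imports from dedup.java above plus org.apache.hadoop.mapred.JobConf; the jar path is a placeholder you must replace:

Job job = new Job(conf, "Data Deduplication");
job.setJarByClass(Dedup.class);
// Placeholder path: a jar exported from Eclipse that contains com.hebut.mr.Dedup
// and its nested Map and Reduce classes.
((JobConf) job.getConfiguration()).setJar("/home/hadoop/dedup.jar");

This is the same setJar trick the WordCount thread above uses to submit straight from Eclipse; even so, plugin behavior varies between versions.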
Q When I run the crawl script, the solr step reports the following error:
14/11/20 22:34:49 INFO solr.SolrDeleteDuplicates: SolrDeleteDuplicates: starting...
14/11/20 22:34:49 INFO solr.SolrDeleteDuplicates: SolrDeleteDuplicates: Solr url: http://192.168.83.208:8983/solr/xhnutch
14/11/20 22:34:50 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/11/20 22:34:51 INFO mapred.JobClient: Running job: job__0076
14/11/20 22:34:52 INFO mapred.JobClient:  map 0% reduce 0%
14/11/20 22:35:02 INFO mapred.JobClient: Task Id : attempt__0076_m_, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.nutch.indexer.solr.SolrDeleteDuplicates$SolrInputFormat
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:857)
at org.apache.hadoop.mapreduce.JobContext.getInputFormatClass(JobContext.java:187)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:722)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.ClassNotFoundException: org.apache.nutch.indexer.solr.SolrDeleteDuplicates$SolrInputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:810)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:855)
... 8 more
(the same task failure and stack trace repeat for five more attempts, 22:35:06 through 22:35:24)
But the compiled job file does contain that class, and running in local mode works without error. Searching Google turned up almost nothing useful; I hope someone can give me an answer.
Solved: after Job job = new Job(getConf(), "solrdedup"); add job.setJarByClass(SolrDeleteDuplicates.class); and it works.
Q I've just started learning Hadoop. I've deployed a Hadoop 2.4.1 cluster and compiled the eclipse plugin, and I'm now running the WordCount program from the book 《Hadoop实战》 (Hadoop in Action). The jar is built and the input file has been uploaded, but running it fails with java.io.IOException: Job
[root@master ~]# hdfs dfs -ls /
Found 1 items
drwxr-xr-x   - root supergroup          0  00:56 /input
[root@master ~]# hdfs dfs -ls /input
Found 1 items
-rw-r--r--   3 root supergroup         22  00:56 /input/file.txt
[root@master ~]#
Running the program errors as follows:
[root@master ~]# hadoop jar wordcount.jar WordCount hdfs://master:9000/input/file.txt hdfs://master:9000/output
14/08/26 01:14:50 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/08/26 01:14:50 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/08/26 01:14:50 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
14/08/26 01:14:51 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/08/26 01:14:51 INFO mapred.FileInputFormat: Total input paths to process : 1
14/08/26 01:14:51 INFO mapreduce.JobSubmitter: number of splits:1
14/08/26 01:14:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local01
14/08/26 01:14:51 WARN conf.Configuration: file:/data/hadoop/tmp/mapred/staging/root/.staging/job_local01/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.  Ignoring.
14/08/26 01:14:51 WARN conf.Configuration: file:/data/hadoop/tmp/mapred/staging/root/.staging/job_local01/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.  Ignoring.
14/08/26 01:14:51 WARN conf.Configuration: file:/data/hadoop/tmp/mapred/local/localRunner/root/job_local01/job_local01.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.  Ignoring.
14/08/26 01:14:51 WARN conf.Configuration: file:/data/hadoop/tmp/mapred/local/localRunner/root/job_local01/job_local01.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.  Ignoring.
14/08/26 01:14:51 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
14/08/26 01:14:51 INFO mapreduce.Job: Running job: job_local01
14/08/26 01:14:51 INFO mapred.LocalJobRunner: OutputCommitter set in config null
14/08/26 01:14:51 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapred.FileOutputCommitter
14/08/26 01:14:52 INFO mapred.LocalJobRunner: Waiting for map tasks
14/08/26 01:14:52 INFO mapred.LocalJobRunner: Starting task: attempt_local01_m_
14/08/26 01:14:52 INFO mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
14/08/26 01:14:52 INFO mapred.MapTask: Processing split: hdfs://master:9000/input/file.txt:0+22
14/08/26 01:14:52 INFO mapred.MapTask: numReduceTasks: 1
14/08/26 01:14:52 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
14/08/26 01:14:52 INFO mapred.MapTask: (EQUATOR) 0 kvi 857584)
14/08/26 01:14:52 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
14/08/26 01:14:52 INFO mapred.MapTask: soft limit at
14/08/26 01:14:52 INFO mapred.MapTask: bufstart = 0; bufvoid =
14/08/26 01:14:52 INFO mapred.MapTask: kvstart = ; length = 6553600
14/08/26 01:14:52 INFO mapred.LocalJobRunner:
14/08/26 01:14:52 INFO mapred.MapTask: Starting flush of map output
14/08/26 01:14:52 INFO mapred.MapTask: Spilling map output
14/08/26 01:14:52 INFO mapred.MapTask: bufstart = 0; bufend = 38; bufvoid =
14/08/26 01:14:52 INFO mapred.MapTask: kvstart = 857584); kvend = 857536); length = 13/6553600
14/08/26 01:14:52 INFO mapred.MapTask: Finished spill 0
14/08/26 01:14:52 INFO mapred.Task: Task:attempt_local01_m_ is done. And is in the process of committing
14/08/26 01:14:52 INFO mapred.LocalJobRunner: hdfs://master:9000/input/file.txt:0+22
14/08/26 01:14:52 INFO mapred.Task: Task 'attempt_local01_m_' done.
14/08/26 01:14:52 INFO mapred.LocalJobRunner: Finishing task: attempt_local01_m_
14/08/26 01:14:52 INFO mapred.LocalJobRunner: map task executor complete.
14/08/26 01:14:52 INFO mapred.LocalJobRunner: Waiting for reduce tasks
14/08/26 01:14:52 INFO mapred.LocalJobRunner: Starting task: attempt_local01_r_
14/08/26 01:14:52 INFO mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
14/08/26 01:14:52 INFO mapred.ReduceTask: Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@fcc06e0
14/08/26 01:14:52 INFO reduce.MergeManagerImpl: MergerManager: memoryLimit=, maxSingleShuffleLimit=, mergeThreshold=, ioSortFactor=10, memToMemMergeOutputsThreshold=10
14/08/26 01:14:52 INFO reduce.EventFetcher: attempt_local01_r_ Thread started: EventFetcher for fetching Map Completion Events
14/08/26 01:14:52 INFO reduce.LocalFetcher: localfetcher#1 about to shuffle output of map attempt_local01_m_ decomp: 48 len: 52 to MEMORY
14/08/26 01:14:52 INFO reduce.InMemoryMapOutput: Read 48 bytes from map-output for attempt_local01_m_
14/08/26 01:14:52 INFO reduce.MergeManagerImpl: closeInMemoryFile -> map-output of size: 48, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory -> 48
14/08/26 01:14:52 INFO reduce.EventFetcher: EventFetcher is interrupted.. Returning
14/08/26 01:14:52 INFO mapred.LocalJobRunner: 1 / 1 copied.
14/08/26 01:14:52 INFO reduce.MergeManagerImpl: finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
14/08/26 01:14:52 WARN io.ReadaheadPool: Failed readahead on ifile
EBADF: Bad file descriptor
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.posix_fadvise(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.posixFadviseIfPossible(NativeIO.java:263)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX$CacheManipulator.posixFadviseIfPossible(NativeIO.java:142)
        at org.apache.hadoop.io.ReadaheadPool$ReadaheadRequestImpl.run(ReadaheadPool.java:206)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
14/08/26 01:14:52 INFO mapred.Merger: Merging 1 sorted segments
14/08/26 01:14:52 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 42 bytes
14/08/26 01:14:52 INFO reduce.MergeManagerImpl: Merged 1 segments, 48 bytes to disk to satisfy reduce memory limit
14/08/26 01:14:52 INFO reduce.MergeManagerImpl: Merging 1 files, 52 bytes from disk
14/08/26 01:14:52 INFO reduce.MergeManagerImpl: Merging 0 segments, 0 bytes from memory into reduce
14/08/26 01:14:52 INFO mapred.Merger: Merging 1 sorted segments
14/08/26 01:14:52 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 42 bytes
14/08/26 01:14:52 INFO mapred.LocalJobRunner: 1 / 1 copied.
14/08/26 01:14:52 INFO mapred.LocalJobRunner: reduce task executor complete.
14/08/26 01:14:52 WARN mapred.LocalJobRunner: job_local01
java.lang.Exception: java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.mapred.Reducer.<init>()
        at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.mapred.Reducer.<init>()
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:409)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.mapred.Reducer.<init>()
        at java.lang.Class.getConstructor0(Class.java:2849)
        at java.lang.Class.getDeclaredConstructor(Class.java:2053)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
        ... 8 more
Is this a permissions problem?
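Judging from the innermost Caused by, this looks less like a permissions problem than a reducer-wiring problem: ReflectionUtils is trying to instantiate org.apache.hadoop.mapred.Reducer itself, which has no concrete constructor. One common way to end up there is registering the framework type instead of your own class (a wrong import or an autocomplete slip). A hedged sketch of the difference; MyReduce is a placeholder, not the book's code:

import java.io.IOException;
import java.util.Iterator;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class ReducerWiring {
    // A concrete reducer; the framework can instantiate this reflectively.
    public static class MyReduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) sum += values.next().get();
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) {
        JobConf conf = new JobConf(ReducerWiring.class);
        // conf.setReducerClass(Reducer.class); // compiles, but produces the
        //                                      // NoSuchMethodException above
        conf.setReducerClass(MyReduce.class);   // must be a concrete class
    }
}

Worth comparing against the book's driver before suspecting HDFS permissions.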
