How to Compile 64-bit Hadoop 2.x on CentOS 6.5 (Part 1)

The Hadoop 2.x packages downloadable from the Apache site do not include a ready-to-use 64-bit build. To run Hadoop on a 64-bit system, we have to recompile it ourselves; using the stock 32-bit build on a 64-bit system leads to native-library incompatibility problems. The most immediate symptom comes from Hadoop's NativeCodeLoader at startup:
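In its typical form, the warning reads:

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable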



Before getting started, here is a table summarizing the author's (散仙) build environment:

    No.  Item                           Notes
    1    CentOS 6.5, 64-bit             Linux environment
    2    Apache Ant 1.9                 Ant build tool
    3    Apache Maven 3.2.1             Maven build and packaging
    4    gcc, gcc-c++, make             build toolchain
    5    protobuf-2.5.0                 serialization library
    6    JDK 1.7                        Java environment
    7    Hadoop 2.2.0 source package    downloaded from the official site
    8    one humble engineer            the protagonist
    9    Hadoop QQ group 376932160      technical discussion

Now to the main task. Since the environment is CentOS, most of the build dependencies can be installed conveniently with yum.

1. Install gcc and the related build tools; the following yum commands are all that is needed:



    yum -y install gcc
    yum install -y bzip2-devel
    yum -y install gcc-c++
    yum install make
    yum install autoconf automake libtool cmake ncurses-devel openssl-devel gcc*
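As an optional sanity check that the toolchain is actually in place:

    gcc --version
    g++ --version
    make --version
    cmake --version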


2. Install the JDK, set up its environment variables, and then check the result:
    [root@ganglia ~]# java -version
    java version "1.5.0"
    gij (GNU libgcj) version 4.4.7 20120313 (Red Hat 4.4.7-4)
    Copyright (C) 2007 Free Software Foundation, Inc.
    This is free software; see the source for copying conditions.  There is NO
    warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
    [root@ganglia ~]#
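Note that the output above is still the GCJ-based java that ships with CentOS, which means the PATH does not yet point at the new JDK. A minimal sketch of the environment variables, assuming the JDK was unpacked to /usr/local/jdk1.7.0_25 (the path the Maven output in step 3 reports); append these lines to /etc/profile and run source /etc/profile:

    # illustrative paths; adjust to where the JDK was actually unpacked
    export JAVA_HOME=/usr/local/jdk1.7.0_25
    export PATH=$JAVA_HOME/bin:$PATH
    export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

After this, java -version should report 1.7.0_25 instead of gij.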

3. Install Maven, and test the installation when done:
    [root@ganglia ~]# mvn -v
    Apache Maven 3.2.1 (ea8b2b07643dbb1b84b6d16e1f08391b666bc1e9; 2014-02-15T01:37:52+08:00)
    Maven home: /usr/local/maven
    Java version: 1.7.0_25, vendor: Oracle Corporation
    Java home: /usr/local/jdk1.7.0_25/jre
    Default locale: zh_CN, platform encoding: UTF-8
    OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
    [root@ganglia ~]#
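The installation itself is just a matter of unpacking the binary tarball; a sketch, assuming the apache-maven-3.2.1 binary distribution and the /usr/local/maven home shown above:

    tar -zxvf apache-maven-3.2.1-bin.tar.gz
    mv apache-maven-3.2.1 /usr/local/maven
    echo 'export MAVEN_HOME=/usr/local/maven' >> /etc/profile
    echo 'export PATH=$MAVEN_HOME/bin:$PATH' >> /etc/profile
    source /etc/profile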

4. Install Ant, and again verify the installation:
    [root@ganglia ~]# ant -version
    Apache Ant(TM) version 1.9.4 compiled on April 29 2014
    [root@ganglia ~]#
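Ant installs the same way as Maven; a sketch, assuming the apache-ant-1.9.4 binary tarball and an /usr/local/ant home (illustrative paths):

    tar -zxvf apache-ant-1.9.4-bin.tar.gz
    mv apache-ant-1.9.4 /usr/local/ant
    echo 'export ANT_HOME=/usr/local/ant' >> /etc/profile
    echo 'export PATH=$ANT_HOME/bin:$PATH' >> /etc/profile
    source /etc/profile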

5. Install protobuf. Download the source tarball from the official site, upload it to the Linux machine and unpack it, then run the following commands from the top-level source directory:

    wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.bz2

    ./configure
    make
    make check
    make install

Then run protoc to confirm the installation succeeded:
    [root@ganglia protobuf-2.5.0]# protoc
    Missing input file.
    [root@ganglia protobuf-2.5.0]#
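"Missing input file." means protoc runs but was given nothing to compile, so the installation is fine. A more direct check is the version flag; and if protoc instead complains that libprotobuf.so cannot be found, running ldconfig once after make install usually resolves it:

    [root@ganglia protobuf-2.5.0]# protoc --version
    libprotoc 2.5.0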


6. Download the Hadoop 2.2.0 source (src) package from the official Hadoop site and unpack it, as sketched below, then inspect the directory:
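A sketch of fetching and unpacking it, assuming the Apache archive mirror layout (the exact mirror URL may differ):

    wget https://archive.apache.org/dist/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
    tar -zxvf hadoop-2.2.0-src.tar.gz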
    [root@ganglia ~]# cd hadoop-2.2.0-src
    [root@ganglia hadoop-2.2.0-src]# ll
    total 108
    -rw-r--r--.  1 67974 users  9968 Oct  7  2013 BUILDING.txt
    drwxr-xr-x.  2 67974 users  4096 Oct  7  2013 dev-support
    drwxr-xr-x.  4 67974 users  4096 Jun  9 17:05 hadoop-assemblies
    drwxr-xr-x.  3 67974 users  4096 Jun  9 17:27 hadoop-client
    drwxr-xr-x.  9 67974 users  4096 Jun  9 17:14 hadoop-common-project
    drwxr-xr-x.  3 67974 users  4096 Jun  9 17:26 hadoop-dist
    drwxr-xr-x.  7 67974 users  4096 Jun  9 17:20 hadoop-hdfs-project
    drwxr-xr-x. 11 67974 users  4096 Jun  9 17:25 hadoop-mapreduce-project
    drwxr-xr-x.  4 67974 users  4096 Jun  9 17:06 hadoop-maven-plugins
    drwxr-xr-x.  3 67974 users  4096 Jun  9 17:27 hadoop-minicluster
    drwxr-xr-x.  4 67974 users  4096 Jun  9 17:03 hadoop-project
    drwxr-xr-x.  3 67974 users  4096 Jun  9 17:05 hadoop-project-dist
    drwxr-xr-x. 12 67974 users  4096 Jun  9 17:26 hadoop-tools
    drwxr-xr-x.  4 67974 users  4096 Jun  9 17:24 hadoop-yarn-project
    -rw-r--r--.  1 67974 users 15164 Oct  7  2013 LICENSE.txt
    -rw-r--r--.  1 67974 users   101 Oct  7  2013 NOTICE.txt
    -rw-r--r--.  1 67974 users 16569 Oct  7  2013 pom.xml
    -rw-r--r--.  1 67974 users  1366 Oct  7  2013 README.txt
    [root@ganglia hadoop-2.2.0-src]#


7. Edit /root/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml and add the patch below. This is a bug in the Hadoop 2.2.0 release itself: the hadoop-auth module is missing a test-scoped jetty-util dependency (tracked upstream as HADOOP-10110), and the build fails without it. Other 2.x versions may or may not need this, so check before patching. The relevant section should end up looking like this:
    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-all</artifactId>
      <scope>test</scope>
    </dependency>
    <!-- begin added content -->
    <dependency>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty-util</artifactId>
      <scope>test</scope>
    </dependency>
    <!-- end added content -->
    <dependency>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty</artifactId>
      <scope>test</scope>
    </dependency>


8. With the patch applied, return to the root of hadoop-2.2.0-src and run the build-and-package commands:
    mvn clean
    mvn package -Pdist,native -DskipTests -Dtar
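One practical note: the Hadoop build is memory-hungry, and Maven can die with an OutOfMemoryError on the default heap. Giving it more heap up front is a common workaround (the values here are illustrative):

    export MAVEN_OPTS="-Xms256m -Xmx512m"
    mvn package -Pdist,native -DskipTests -Dtar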

Here -Pdist,native activates the profiles that build the full binary distribution along with the native libraries, -DskipTests skips the unit tests, and -Dtar produces a .tar.gz package. Then wait roughly half an hour for the build to finish (less with a fast connection, since Maven downloads many dependencies on the first run). On success, the tail of the output looks like this:
    [INFO]
    [INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-minicluster ---
    [INFO] Using default encoding to copy filtered resources.
    [INFO]
    [INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-minicluster ---
    [INFO] No sources to compile
    [INFO]
    [INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-minicluster ---
    [INFO] Using default encoding to copy filtered resources.
    [INFO]
    [INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-minicluster ---
    [INFO] No sources to compile
    [INFO]
    [INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-minicluster ---
    [INFO] Tests are skipped.
    [INFO]
    [INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-minicluster ---
    [WARNING] JAR will be empty - no content was marked for inclusion!
    [INFO] Building jar: /root/hadoop-2.2.0-src/hadoop-minicluster/target/hadoop-minicluster-2.2.0.jar
    [INFO]
    [INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-minicluster ---
    [INFO] No sources in project. Archive not created.
    [INFO]
    [INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-minicluster ---
    [INFO] No sources in project. Archive not created.
    [INFO]
    [INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-minicluster ---
    [INFO]
    [INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-minicluster ---
    [INFO] Building jar: /root/hadoop-2.2.0-src/hadoop-minicluster/target/hadoop-minicluster-2.2.0-javadoc.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................ SUCCESS [01:43 min]
    [INFO] Apache Hadoop Project POM ......................... SUCCESS [01:21 min]
    [INFO] Apache Hadoop Annotations ......................... SUCCESS [42.256 s]
    [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.291 s]
    [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [41.053 s]
    [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [44.283 s]
    [INFO] Apache Hadoop Auth ................................ SUCCESS [01:49 min]
    [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [18.950 s]
    [INFO] Apache Hadoop Common .............................. SUCCESS [05:31 min]
    [INFO] Apache Hadoop NFS ................................. SUCCESS [40.498 s]
    [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.050 s]
    [INFO] Apache Hadoop HDFS ................................ SUCCESS [03:43 min]
    [INFO] Apache Hadoop HttpFS .............................. SUCCESS [26.962 s]
    [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [47.056 s]
    [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [4.237 s]
    [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.029 s]
    [INFO] hadoop-yarn ....................................... SUCCESS [01:25 min]
    [INFO] hadoop-yarn-api ................................... SUCCESS [40.841 s]
    [INFO] hadoop-yarn-common ................................ SUCCESS [31.228 s]
    [INFO] hadoop-yarn-server ................................ SUCCESS [0.161 s]
    [INFO] hadoop-yarn-server-common ......................... SUCCESS [12.289 s]
    [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [19.271 s]
    [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [3.586 s]
    [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [14.674 s]
    [INFO] hadoop-yarn-server-tests .......................... SUCCESS [1.153 s]
    [INFO] hadoop-yarn-client ................................ SUCCESS [7.861 s]
    [INFO] hadoop-yarn-applications .......................... SUCCESS [0.106 s]
    [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.540 s]
    [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.168 s]
    [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [29.360 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.353 s]
    [INFO] hadoop-yarn-site .................................. SUCCESS [0.128 s]
    [INFO] hadoop-yarn-project ............................... SUCCESS [29.610 s]
    [INFO] hadoop-mapreduce-client-common .................... SUCCESS [19.908 s]
    [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.357 s]
    [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [12.116 s]
    [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [5.807 s]
    [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [6.713 s]
    [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.001 s]
    [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [7.684 s]
    [INFO] hadoop-mapreduce .................................. SUCCESS [3.664 s]
    [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [5.645 s]
    [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [29.953 s]
    [INFO] Apache Hadoop Archives ............................ SUCCESS [2.277 s]
    [INFO] Apache Hadoop Rumen ............................... SUCCESS [7.743 s]
    [INFO] Apache Hadoop Gridmix ............................. SUCCESS [5.608 s]
    [INFO] Apache Hadoop Data Join ........................... SUCCESS [3.385 s]
    [INFO] Apache Hadoop Extras .............................. SUCCESS [3.509 s]
    [INFO] Apache Hadoop Pipes ............................... SUCCESS [8.266 s]
    [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.073 s]
    [INFO] Apache Hadoop Tools ............................... SUCCESS [0.025 s]
    [INFO] Apache Hadoop Distribution ........................ SUCCESS [23.928 s]
    [INFO] Apache Hadoop Client .............................. SUCCESS [6.876 s]
    [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.514 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 26:04 min
    [INFO] Finished at: 2014-06-09T17:27:26+08:00
    [INFO] Final Memory: 96M/239M
    [INFO] ------------------------------------------------------------------------


The packaged Hadoop distribution ends up at:
    [root@ganglia target]# pwd
    /root/hadoop-2.2.0-src/hadoop-dist/target
    [root@ganglia target]# ll
    total 282348
    drwxr-xr-x. 2 root root      4096 Jun  9 17:26 antrun
    -rw-r--r--. 1 root root      1618 Jun  9 17:26 dist-layout-stitching.sh
    -rw-r--r--. 1 root root       635 Jun  9 17:26 dist-tar-stitching.sh
    drwxr-xr-x. 9 root root      4096 Jun  9 17:26 hadoop-2.2.0
    -rw-r--r--. 1 root root  96183833 Jun  9 17:27 hadoop-2.2.0.tar.gz
    -rw-r--r--. 1 root root      2745 Jun  9 17:26 hadoop-dist-2.2.0.jar
    -rw-r--r--. 1 root root 192903472 Jun  9 17:27 hadoop-dist-2.2.0-javadoc.jar
    drwxr-xr-x. 2 root root      4096 Jun  9 17:27 javadoc-bundle-options
    drwxr-xr-x. 2 root root      4096 Jun  9 17:26 maven-archiver
    drwxr-xr-x. 2 root root      4096 Jun  9 17:26 test-dir
    [root@ganglia target]#

The compiled native libraries live in the following directory; file shows that they are indeed 64-bit:
    [root@ganglia native]# pwd
    /root/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native
    [root@ganglia native]# ll
    total 3596
    -rw-r--r--. 1 root root  733114 Jun  9 17:26 libhadoop.a
    -rw-r--r--. 1 root root 1487236 Jun  9 17:26 libhadooppipes.a
    lrwxrwxrwx. 1 root root      18 Jun  9 17:26 libhadoop.so -> libhadoop.so.1.0.0
    -rwxr-xr-x. 1 root root  411870 Jun  9 17:26 libhadoop.so.1.0.0
    -rw-r--r--. 1 root root  581944 Jun  9 17:26 libhadooputils.a
    -rw-r--r--. 1 root root  273330 Jun  9 17:26 libhdfs.a
    lrwxrwxrwx. 1 root root      16 Jun  9 17:26 libhdfs.so -> libhdfs.so.0.0.0
    -rwxr-xr-x. 1 root root  181042 Jun  9 17:26 libhdfs.so.0.0.0
    [root@ganglia native]# file libhadoop.so
    libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
    [root@ganglia native]# file libhadoop.so.1.0.0
    libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
    [root@ganglia native]#


At this point the build has completed successfully, and the freshly generated hadoop tar.gz under the target directory can be used to deploy our Hadoop cluster.
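A minimal sketch of putting the artifact to use (the target paths are illustrative):

    cd /root/hadoop-2.2.0-src/hadoop-dist/target
    tar -zxvf hadoop-2.2.0.tar.gz -C /usr/local/
    # or, for an existing installation that only lacks 64-bit native code,
    # swap in just the native libraries:
    cp -r hadoop-2.2.0/lib/native/* /usr/local/hadoop-2.2.0/lib/native/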
