Reposted from: http://blog.csdn.net/osg_yanglinping/article/details/25702333
I. Preparation
Download the plugin source: https://github.com/winghc/hadoop2x-eclipse-plugin
Download Ant: http://www.apache.org/dist/ant/binaries/apache-ant-1.9.4-bin.zip
II. Modify the source
1. Modify the ivy configuration file (under hadoop2x-eclipse-plugin\src\ivy)
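The original post does not show the ivy change itself. In this plugin's source tree the usual edit is to src\ivy\libraries.properties, aligning the version properties (which the build.xml below reads as ${...version} placeholders) with the jars actually shipped in your Hadoop distribution. The following is a hedged sketch only; verify every version number against the file names in hadoop-2.4.0\share\hadoop\common\lib before using it:

```properties
# src/ivy/libraries.properties (excerpt) -- example values only;
# match each one to the jar versions in your hadoop-2.4.0 distribution
hadoop.version=2.4.0
commons-cli.version=1.2
commons-collections.version=3.2.1
commons-configuration.version=1.6
commons-lang.version=2.6
jackson.version=1.8.8
slf4j-api.version=1.7.5
slf4j-log4j12.version=1.7.5
guava.version=11.0.2
protobuf.version=2.5.0
netty.version=3.6.2.Final
log4j.version=1.2.17
```

If a copy step in the build later fails with "file not found", the first thing to check is that the version property here matches the jar on disk.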
2. Modify the build.xml configuration file used by the Ant build
Replace the file contents with:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project default="jar" name="eclipse-plugin">
  <import file="../build-contrib.xml"/>

  <path id="eclipse-sdk-jars">
    <fileset dir="${eclipse.home}/plugins/">
      <include name="org.eclipse.ui*.jar"/>
      <include name="org.eclipse.jdt*.jar"/>
      <include name="org.eclipse.core*.jar"/>
      <include name="org.eclipse.equinox*.jar"/>
      <include name="org.eclipse.debug*.jar"/>
      <include name="org.eclipse.osgi*.jar"/>
      <include name="org.eclipse.swt*.jar"/>
      <include name="org.eclipse.jface*.jar"/>
      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
      <include name="com.jcraft.jsch*.jar"/>
    </fileset>
  </path>

  <path id="hadoop-sdk-jars">
    <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
      <include name="hadoop*.jar"/>
    </fileset>
    <fileset dir="${hadoop.home}/share/hadoop/hdfs">
      <include name="hadoop*.jar"/>
    </fileset>
    <fileset dir="${hadoop.home}/share/hadoop/common">
      <include name="hadoop*.jar"/>
    </fileset>
  </path>

  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <path refid="eclipse-sdk-jars"/>
    <path refid="hadoop-sdk-jars"/>
  </path>

  <target name="check-contrib" unless="eclipse.home">
    <property name="skip.contrib" value="yes"/>
    <echo message="eclipse.home unset: skipping eclipse plugin"/>
  </target>

  <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
    <echo message="contrib: ${name}"/>
    <javac
        encoding="${build.encoding}"
        srcdir="${src.dir}"
        includes="**/*.java"
        destdir="${build.classes}"
        debug="${javac.debug}"
        deprecation="${javac.deprecation}">
      <classpath refid="classpath"/>
    </javac>
  </target>

  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/common">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/hdfs">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/yarn">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/classes" verbose="true">
      <fileset dir="${root}/src/java">
        <include name="*.xml"/>
      </fileset>
    </copy>
    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <!-- The following line is newly added; it is not in the original build.xml -->
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <jar
        jarfile="${build.dir}/hadoop-${name}-${version}.jar"
        manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
        <attribute name="Bundle-ClassPath"
                   value="classes/,
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-1.2.jar,
 lib/commons-configuration-1.6.jar,
 lib/commons-httpclient-3.1.jar,
 lib/commons-lang-${commons-lang.version}.jar,
 lib/commons-collections-${commons-collections.version}.jar,
 lib/jackson-core-asl-1.8.8.jar,
 lib/jackson-mapper-asl-1.8.8.jar,
 lib/slf4j-log4j12-1.7.5.jar,
 lib/slf4j-api-1.7.5.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar"/>
      </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>
</project>
III. Build the plugin (make sure you are online while building; it will then go smoothly)
The cmd steps are as follows:
Open cmd and run:
cd /d D:\11\hadoop2x-eclipse-plugin-master\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin
Configure the Ant environment variables, then run (note that paths containing spaces must be quoted):
ant jar -Dversion=2.4.0 -Declipse.home="D:\Program Files\MyEclipse 8.5" -Dhadoop.home=D:\11\hadoop-2.4.0\hadoop-2.4.0
Ant then starts with:
Buildfile: D:\11\hadoop2x-eclipse-plugin-master\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin\build.xml
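The steps above can be sketched as one cmd session. The ANT_HOME location is an assumption (wherever you unzipped apache-ant-1.9.4); the other paths follow the post:

```bat
:: Hedged sketch: make Ant available in the current cmd session
:: (D:\apache-ant-1.9.4 is an example install location)
set ANT_HOME=D:\apache-ant-1.9.4
set PATH=%ANT_HOME%\bin;%PATH%

:: Change into the plugin's build directory
cd /d D:\11\hadoop2x-eclipse-plugin-master\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin

:: Run the build; quote values that contain spaces
ant jar -Dversion=2.4.0 ^
    -Declipse.home="D:\Program Files\MyEclipse 8.5" ^
    -Dhadoop.home=D:\11\hadoop-2.4.0\hadoop-2.4.0
```

Setting the variables with `set` affects only the current cmd window; use the system environment-variable dialog to make them permanent.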
IV. Install and test the plugin
A build folder is generated under the plugin source root; the successfully built plugin is inside it.
Copy hadoop-eclipse-plugin-2.4.0.jar into the plugins directory of the Eclipse installation you built against.
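As a hedged example, the copy step looks like this (the build output path follows this project's layout; the destination Eclipse directory is an assumption, adjust it to your installation):

```bat
:: Example only -- adjust both paths to your machine
copy D:\11\hadoop2x-eclipse-plugin-master\hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin\hadoop-eclipse-plugin-2.4.0.jar "D:\eclipse\plugins\"
```

Restart Eclipse with the -clean flag if the plugin does not appear, so the plugin cache is rebuilt.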
Start Eclipse. If you find the plugin has not loaded, don't worry: the Eclipse version you built against may be too old. Install the plugin into a newer Eclipse and the configuration will complete successfully.
With that, the hadoop-2.4.0 Eclipse plugin has been built successfully. Good luck!
This is my first blog post, so please forgive anything I got wrong, haha!