This repository was archived by the owner on Dec 15, 2025. It is now read-only.

Commit e5bedc0

Refactor Travis CI to support multiple Java Builds
* Moved the mapred-site and yarn-site XML files into newly created folders containing the artifacts for either Hadoop 2.6 or 3.2; the right set is picked up in travis.yml depending on the testing needs.
* Moved the spark-env file into newly created folders containing the artifacts for either Spark 1.6 or 2.4; the right one is picked up in travis.yml depending on the testing needs.
* Created a hadoop-env.sh file for Hadoop 3.2 to store the environment variables required to start the HDFS and YARN services.
* Removed hardcoded values from hadoop.conf and spark.conf; these are now filled in depending on the testing needs.
* Added an `install_hadoop_spark` script that downloads the Hadoop and Spark binaries depending on the testing needs.
* Added a `config_hadoop_spark` script that sets up Hadoop, Spark, and HiBench depending on the testing needs.
* Added a `jdk_ver` script to detect the Java version currently installed in the Travis CI environment.
* Modified the `restart_hadoop_spark` script to be agnostic to the binaries required for testing.
* travis/config_hadoop_spark.sh:
  * For Java 8 and 11, skip the `sql` test, since Hive is no longer used to perform queries; newer Spark versions run queries through `SparkSession` and no longer use `import org.apache.spark.sql`.
* .travis.yml:
  * Added `dist: trusty` to keep using this distro; Travis defaults to xenial if none is specified, and any newer Ubuntu version on Travis does not support OpenJDK 7.
  * Refactored the CI flow to download, set up, run, and test Hadoop and Spark depending on the required JDK (7, 8, or 11).
  * HiBench is configured depending on the required JDK (7, 8, or 11).
  * HiBench is built depending on the required JDK (7, 8, or 11).
  * Benchmarks are run for all configured JDK versions.

Signed-off-by: Luis Ponce <luis.f.ponce.navarro@linux.intel.com>
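The `travis/jdk_ver.sh` helper mentioned above is not included in this page, but its job is to print the major Java version (7, 8, or 11) of the JDK on PATH. A minimal sketch of such a script, assuming the usual `java -version` output formats; the function name and parsing details are illustrative, not the actual script:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of travis/jdk_ver.sh (the real script is not shown here).

parse_java_major() {
  # $1 is a JDK version string such as "1.7.0_95", "1.8.0_222", or "11.0.2".
  local raw=$1
  if [[ $raw == 1.* ]]; then
    # Pre-JDK-9 releases report "1.<major>.x", so "1.8.0_222" -> 8.
    raw=${raw#1.}
  fi
  # JDK 9+ reports "<major>.x.y", so "11.0.2" -> 11.
  echo "${raw%%.*}"
}

# In CI, the raw string would come from the JVM itself; `java -version`
# prints to stderr, so it could be extracted with something like:
#   java -version 2>&1 | awk -F'"' '/version/ {print $2; exit}'
parse_java_major "11.0.2"
```

With a version string like `11.0.2` this prints `11`, which matches the values the `.travis.yml` branches below compare against.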
1 parent 4eb1191 commit e5bedc0

16 files changed

Lines changed: 804 additions & 36 deletions

.travis.yml

Lines changed: 47 additions & 23 deletions
@@ -1,6 +1,9 @@
+dist: trusty
 sudo: required
 language: java
 jdk:
+- openjdk11
+- openjdk8
 - openjdk7
 before_install:
 - cat /etc/hosts # optionally check the content *before*
@@ -10,32 +13,53 @@ before_install:
 - cat /proc/cpuinfo | grep cores | wc -l
 - free -h
 install:
-- hibench=$(pwd)
-- cd /opt/
-- wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz
-- tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
-- wget https://archive.apache.org/dist/hadoop/core/hadoop-2.6.5/hadoop-2.6.5.tar.gz
-- tar -xzf hadoop-2.6.5.tar.gz
-- cd ${hibench}
-- cp ./travis/spark-env.sh /opt/spark-1.6.0-bin-hadoop2.6/conf/
-- cp ./travis/core-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-- cp ./travis/hdfs-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-- cp ./travis/mapred-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-- cp ./travis/yarn-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-- cp ./travis/hibench.conf ./conf/
-- cp ./travis/benchmarks.lst ./conf/
+- |
+  export java_ver=$(./travis/jdk_ver.sh)
+  if [[ "$java_ver" == 11 ]]; then
+    export HADOOP_VER=3.2.0
+    export SPARK_VER=2.4.3
+    export SPARK_PACKAGE_TYPE=without-hadoop-scala-2.12
+  elif [[ "$java_ver" == 8 ]]; then
+    export HADOOP_VER=3.2.0
+    export SPARK_VER=2.4.3
+    export SPARK_PACKAGE_TYPE=without-hadoop
+  elif [[ "$java_ver" == 7 ]]; then
+    export HADOOP_VER=2.6.5
+    export SPARK_VER=1.6.0
+    export SPARK_PACKAGE_TYPE=hadoop2.6
+  else
+    exit 1
+  fi
+
+  # Folders where are stored Spark and Hadoop depending on version required
+  export SPARK_BINARIES_FOLDER=spark-$SPARK_VER-bin-$SPARK_PACKAGE_TYPE
+  export HADOOP_BINARIES_FOLDER=hadoop-$HADOOP_VER
+  export HADOOP_CONF_DIR=/opt/$HADOOP_BINARIES_FOLDER/etc/hadoop/
+  export HADOOP_HOME=/opt/$HADOOP_BINARIES_FOLDER
+
+  sudo -E ./travis/install_hadoop_spark.sh
+  sudo -E ./travis/config_hadoop_spark.sh
 before_script:
 - "export JAVA_OPTS=-Xmx512m"
 cache:
   directories:
   - $HOME/.m2
 script:
-- mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
-- mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
-- mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
-- sudo -E ./travis/configssh.sh
-- sudo -E ./travis/restart_hadoop_spark.sh
-- cp ./travis/hadoop.conf ./conf/
-- cp ./travis/spark.conf ./conf/
-- /opt/hadoop-2.6.5/bin/yarn node -list 2
-- sudo -E ./bin/run_all.sh
+- |
+  if [[ "$java_ver" == 11 ]]; then
+    mvn clean package -Psparkbench -Phadoopbench -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.12 -Dmaven-compiler-plugin.version=3.8.0 -Dexclude-streaming
+  elif [[ "$java_ver" == 8 ]]; then
+    mvn clean package -q -Dmaven.javadoc.skip=true -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.11
+    sudo -E ./travis/configssh.sh
+    sudo -E ./travis/restart_hadoop_spark.sh
+    sudo -E ./bin/run_all.sh
+  elif [[ "$java_ver" == 7 ]]; then
+    mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
+    mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
+    mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
+    sudo -E ./travis/configssh.sh
+    sudo -E ./travis/restart_hadoop_spark.sh
+    sudo -E ./bin/run_all.sh
+  else
+    exit 1
+  fi
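The `install_hadoop_spark.sh` script invoked in the `install` phase is not part of this page. A minimal sketch of what it plausibly does with the exported variables, assuming downloads come from the archive.apache.org mirror; the function names and exact URL paths are assumptions, not the script from this commit:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of travis/install_hadoop_spark.sh (not shown in this diff).
set -euo pipefail

hadoop_url() {
  # archive.apache.org keeps Hadoop releases under .../hadoop/common/.
  echo "https://archive.apache.org/dist/hadoop/common/hadoop-$1/hadoop-$1.tar.gz"
}

spark_url() {
  # $1 = Spark version, $2 = package type (e.g. "hadoop2.6", "without-hadoop").
  echo "https://archive.apache.org/dist/spark/spark-$1/spark-$1-bin-$2.tgz"
}

install_hadoop_spark() {
  # Download and unpack into /opt using the variables exported by .travis.yml
  # (HADOOP_VER, SPARK_VER, SPARK_PACKAGE_TYPE).
  cd /opt
  wget -q "$(hadoop_url "$HADOOP_VER")"
  tar -xzf "hadoop-$HADOOP_VER.tar.gz"
  wget -q "$(spark_url "$SPARK_VER" "$SPARK_PACKAGE_TYPE")"
  tar -xzf "spark-$SPARK_VER-bin-$SPARK_PACKAGE_TYPE.tgz"
}

# Example: the URLs the Java 7 configuration (Hadoop 2.6.5 / Spark 1.6.0)
# would fetch. install_hadoop_spark itself is only called inside CI.
hadoop_url 2.6.5
spark_url 1.6.0 hadoop2.6
```

This keeps `/opt/$HADOOP_BINARIES_FOLDER` and `/opt/$SPARK_BINARIES_FOLDER` consistent with the paths the `.travis.yml` above exports.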
Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@
 # - SPARK_YARN_DIST_ARCHIVES, Comma separated list of archives to be distributed with the job.

 # Options for the daemons used in the standalone deploy mode
-export SPARK_MASTER_IP= localhost
+export SPARK_MASTER_IP= localhost
 # - SPARK_MASTER_IP, to bind the master to a different IP address or hostname
 # - SPARK_MASTER_PORT / SPARK_MASTER_WEBUI_PORT, to use non-default ports for the master
 # - SPARK_MASTER_OPTS, to set config properties only for the master (e.g. "-Dx=y")
