This repository was archived by the owner on Dec 15, 2025. It is now read-only.

Commit 2e04a1d

Refactor Travis CI to support multiple Java Builds
* Moved the mapred-site and yarn-site XML files into newly created folders that contain the artifacts for either Hadoop 2.6 or 3.2; the right set is picked up depending on the testing needs in `.travis.yml`.
* Moved the spark-env file into newly created folders that contain the artifacts for either Spark 1.6 or 2.4; the right one is picked up depending on the testing needs in `.travis.yml`.
* Created a `hadoop-env.sh` file for Hadoop 3.2 to store the environment variables required to start the HDFS and YARN services.
* Removed hardcoded values from `hadoop.conf` and `spark.conf`; these are now filled in depending on the testing needs.
* Added an `install_hadoop_spark` script that downloads the Hadoop and Spark binaries required by the test configuration.
* Added a `config_hadoop_spark` script that sets up Hadoop, Spark, and HiBench depending on the testing needs.
* Added a `jdk_ver` script that picks up the Java version currently installed on the Travis CI worker.
* Modified the `restart_hadoop_spark` script to be agnostic to the binaries required for testing.
* travis/config_hadoop_spark.sh:
  * For Java 8 and 11, skip the `sql` test, since Hive is no longer used to perform queries. Newer Spark versions perform queries through `SparkSession` and no longer use `import org.apache.spark.sql`.
* .travis.yml:
  * Added `dist: trusty` to keep using this distro; Travis defaults to xenial if none is specified, and any newer Ubuntu version on Travis does not support openjdk7.
  * Refactored the CI flow to download, set up, run, and test Hadoop and Spark according to the required JDK (7, 8, or 11).
  * HiBench is configured according to the required JDK (7, 8, or 11).
  * HiBench is built according to the required JDK (7, 8, or 11).
  * Benchmarks are run for all configured JDK versions.

Signed-off-by: Luis Ponce <luis.f.ponce.navarro@linux.intel.com>
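The commit message references a new `travis/jdk_ver.sh` helper, but only `.travis.yml` is shown on this page. As a rough sketch of what such a helper could look like (the `jdk_major` function name and the parsing logic are assumptions, not the committed script):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: print the major version (7, 8, or 11) of the JDK
# on PATH, based on the first line of `java -version` output, e.g.
#   openjdk version "11.0.3" 2019-04-16
#   java version "1.8.0_212"

jdk_major() {
  # $1 = first line of `java -version` output
  local ver major minor
  ver=$(sed -E 's/.*version "([0-9]+)\.([0-9]+).*/\1 \2/' <<<"$1")
  major=${ver%% *}
  minor=${ver##* }
  # Pre-9 JDKs report "1.x", so the interesting number is the second field.
  if [[ "$major" == 1 ]]; then
    echo "$minor"
  else
    echo "$major"
  fi
}

# Example: detect the JDK on PATH, if one is installed.
# (`java -version` writes to stderr, hence the 2>&1.)
if command -v java >/dev/null 2>&1; then
  jdk_major "$(java -version 2>&1 | head -n 1)"
fi
```

This matches the contract the `.travis.yml` diff below relies on: the script's stdout is captured into `java_ver` and compared against the bare values `7`, `8`, and `11`.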
1 parent 4eb1191 commit 2e04a1d

16 files changed

Lines changed: 773 additions & 35 deletions

.travis.yml

Lines changed: 44 additions & 23 deletions
```diff
@@ -1,6 +1,9 @@
+dist: trusty
 sudo: required
 language: java
 jdk:
+  - openjdk11
+  - openjdk8
   - openjdk7
 before_install:
   - cat /etc/hosts # optionally check the content *before*
@@ -10,32 +13,50 @@ before_install:
   - cat /proc/cpuinfo | grep cores | wc -l
   - free -h
 install:
-  - hibench=$(pwd)
-  - cd /opt/
-  - wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz
-  - tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
-  - wget https://archive.apache.org/dist/hadoop/core/hadoop-2.6.5/hadoop-2.6.5.tar.gz
-  - tar -xzf hadoop-2.6.5.tar.gz
-  - cd ${hibench}
-  - cp ./travis/spark-env.sh /opt/spark-1.6.0-bin-hadoop2.6/conf/
-  - cp ./travis/core-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-  - cp ./travis/hdfs-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-  - cp ./travis/mapred-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-  - cp ./travis/yarn-site.xml /opt/hadoop-2.6.5/etc/hadoop/
-  - cp ./travis/hibench.conf ./conf/
-  - cp ./travis/benchmarks.lst ./conf/
+  - |
+    export java_ver=$(./travis/jdk_ver.sh)
+    if [[ "$java_ver" == 11 ]]; then
+      export HADOOP_VER=3.2.0
+      export SPARK_VER=2.4.3
+      export SPARK_PACKAGE_TYPE=without-hadoop-scala-2.12
+    elif [[ "$java_ver" == 8 ]]; then
+      export HADOOP_VER=3.2.0
+      export SPARK_VER=2.4.3
+      export SPARK_PACKAGE_TYPE=without-hadoop
+    elif [[ "$java_ver" == 7 ]]; then
+      export HADOOP_VER=2.6.5
+      export SPARK_VER=1.6.0
+      export SPARK_PACKAGE_TYPE=hadoop2.6
+    else
+      exit 1
+    fi
+
+    # Folders where Spark and Hadoop are stored, depending on the required version
+    export SPARK_BINARIES_FOLDER=spark-$SPARK_VER-bin-$SPARK_PACKAGE_TYPE
+    export HADOOP_BINARIES_FOLDER=hadoop-$HADOOP_VER
+    export HADOOP_CONF_DIR=/opt/$HADOOP_BINARIES_FOLDER/etc/hadoop/
+
+    sudo -E ./travis/install_hadoop_spark.sh
+    sudo -E ./travis/config_hadoop_spark.sh
 before_script:
   - "export JAVA_OPTS=-Xmx512m"
 cache:
   directories:
     - $HOME/.m2
 script:
-  - mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
-  - mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
-  - mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
-  - sudo -E ./travis/configssh.sh
-  - sudo -E ./travis/restart_hadoop_spark.sh
-  - cp ./travis/hadoop.conf ./conf/
-  - cp ./travis/spark.conf ./conf/
-  - /opt/hadoop-2.6.5/bin/yarn node -list 2
-  - sudo -E ./bin/run_all.sh
+  - |
+    if [[ "$java_ver" == 11 ]]; then
+      mvn clean package -q -Psparkbench -Phadoopbench -Dmaven.javadoc.skip=true -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.12 -Dmaven-compiler-plugin.version=3.8.0 -Dexclude-streaming
+    elif [[ "$java_ver" == 8 ]]; then
+      mvn clean package -q -Dmaven.javadoc.skip=true -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.11
+    elif [[ "$java_ver" == 7 ]]; then
+      mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
+      mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
+      mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
+    else
+      exit 1
+    fi
+
+    sudo -E ./travis/configssh.sh
+    sudo -E ./travis/restart_hadoop_spark.sh
+    sudo -E ./bin/run_all.sh
```
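The `travis/install_hadoop_spark.sh` script invoked above is not shown on this page; only its env-var contract (`HADOOP_VER`, `SPARK_VER`, `SPARK_PACKAGE_TYPE`) is visible in the diff. A minimal sketch of what it could do, assuming the standard Apache archive URL layout (the function names and exact URLs are assumptions, not the committed code):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of travis/install_hadoop_spark.sh: download and
# unpack the Hadoop and Spark tarballs selected by .travis.yml.

hadoop_url() {  # $1 = Hadoop version, e.g. 2.6.5
  echo "https://archive.apache.org/dist/hadoop/core/hadoop-$1/hadoop-$1.tar.gz"
}

spark_url() {  # $1 = Spark version, $2 = package type, e.g. 2.4.3 without-hadoop
  echo "https://archive.apache.org/dist/spark/spark-$1/spark-$1-bin-$2.tgz"
}

install_hadoop_spark() {
  # Unpack into /opt, matching the HADOOP_CONF_DIR path exported in .travis.yml.
  cd /opt
  wget -q "$(hadoop_url "$HADOOP_VER")"
  tar -xzf "hadoop-$HADOOP_VER.tar.gz"
  wget -q "$(spark_url "$SPARK_VER" "$SPARK_PACKAGE_TYPE")"
  tar -xzf "spark-$SPARK_VER-bin-$SPARK_PACKAGE_TYPE.tgz"
}
```

With `HADOOP_VER=2.6.5` this builds the same `hadoop-2.6.5.tar.gz` archive URL that the old `.travis.yml` downloaded inline, which is why the inline `wget`/`tar` steps could be deleted from the diff above.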
