Commit 0dd9c8e

Updated README file
1 parent f9fc1f7 commit 0dd9c8e

2 files changed

Lines changed: 73 additions & 18 deletions

tests/translators/airflow/Dockerfile_Airflow

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@ RUN apt-get -y install cmake-data
 RUN apt-get -y install sudo
 RUN apt-get -y install vim --fix-missing
 RUN apt-get -y install gcc
-RUN apt-get -y install gcc-multilib
+#RUN apt-get -y install gcc-multilib
 
 # Python stuff
 RUN apt-get -y install python3 python3-pip

tests/translators/airflow/README

Lines changed: 72 additions & 17 deletions
@@ -1,43 +1,98 @@
-Download wfcommons:
-pip install wfcommons
+This README file describes the steps to install and run Airflow, and then run a translated workflow.
+
+There are three sections:
+- Installing Airflow on bare-metal
+- Installing Airflow via Docker
+- Running a translated workflow
+
+
+Install Airflow on bare-metal
+------------------------------
+
+1. Install Airflow
 
-Download airflow with:
 pip install apache-airflow==2.10.2 --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.2/constraints-3.12.txt"
 
-Download MySQL and MySQLClient:
+2. Install MySQL and MySQLClient
+
 apt-get -y install pkg-config
 apt-get install -y mysql-server
 apt-get install -y python3-dev build-essential
 apt-get install -y default-libmysqlclient-dev
 pip install mysqlclient
 
-Setup database for Airflow:
+3. Set up the database for Airflow
+
 mysqld --explicit-defaults-for-timestamp &
-In MySQL:
+In the MySQL client, type the following:
 CREATE DATABASE airflow_db CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
 CREATE USER 'airflow_user'@'%' IDENTIFIED BY 'airflow_pass';
 GRANT ALL PRIVILEGES ON airflow_db.* TO 'airflow_user';
 
-Set airflow home directory:
+4. Set the environment variable for Airflow's home directory
+
 export AIRFLOW_HOME="$(pwd)"
 
-Edit AIRFLOW_HOME/airflow.cfg (may need to run 'airflow dags list' to make the file appear):
+5. Edit $AIRFLOW_HOME/airflow.cfg (you may need to run `airflow dags list` to create this file in the first place)
+
+Update the "sql_alchemy_conn = ..." line to be:
+
 sql_alchemy_conn = mysql+mysqldb://airflow_user:airflow_pass@localhost:3306/airflow_db
 
-Finish setting up database:
+6. Finish setting up the database
+
 airflow db migrate
 
-Move the translated workflow.py file to AIRFLOW_HOME/dags/
 
-Run workflow:
-airflow dags test {workflow_name}
+Installing Airflow via Docker
+-----------------------------
+
+A much simpler alternative is to use Docker.
+
+1. Build the Docker image
+
+docker build -t wfcommons-dev -f Dockerfile_Airflow .
+
+(if building on a Mac, add the `--platform linux/amd64` argument after `build` above)
+
+2. Run the Docker container in the directory that contains the translated workflow (see the last section below)
+
+docker run -it --rm -v .:/home/wfcommons/mount wfcommons-dev /bin/bash
+
+
+Running a translated workflow with Airflow
+-------------------------------------------
+
+Assuming that you have run the Airflow translator, for instance using this Python code:
+
+```
+import pathlib
+
+from wfcommons import BlastRecipe
+from wfcommons.wfbench import WorkflowBenchmark, AirflowTranslator
+
+# create a workflow benchmark object to generate specifications based on a recipe
+benchmark = WorkflowBenchmark(recipe=BlastRecipe, num_tasks=45)
+
+# generate a specification based on performance characteristics
+benchmark.create_benchmark(pathlib.Path("/tmp/"), cpu_work=100, data=10, percent_cpu=0.6)
+
+# generate an Airflow workflow
+translator = AirflowTranslator(benchmark.workflow)
+translator.translate(output_folder=pathlib.Path("/tmp/translated_workflow/"))
+```
+
+The above will create a JSON workflow file in /tmp/blast-benchmark-45.json.
+In that file, the workflow name (which is used below) is set to
+"Blast-Benchmark". The above will also create the translated workflow in the
+/tmp/translated_workflow/ directory. Some directories and files need to be copied/moved as follows:
+
+cp -r /tmp/translated_workflow/ $AIRFLOW_HOME/dags/
+mv $AIRFLOW_HOME/dags/translated_workflow/workflow.py $AIRFLOW_HOME/dags/
 
+Finally, run the workflow as:
 
+airflow dags test Blast-Benchmark (note the "Blast-Benchmark" workflow name from above)
 
 
 
-(and running the translator)
-Initialize the AirflowTranslator with the workflow to translate and the directory that will contain the workflow's input files.
-Build the Airflow Docker container and run it in the directory with the translated dag. (Make sure there isn't a pre-existing 'airflow' directory.)
-docker build -t wfcommons-dev -f Dockerfile_Airflow .
-docker run -it --rm -v .:/home/wfcommons wfcommons-dev /bin/bash
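The new README tells the reader to pass the workflow name from the generated JSON file (not the file name) to `airflow dags test`. A minimal sketch of how one might confirm that name before running the DAG — the `workflow_name` helper is hypothetical, and the assumption that the benchmark JSON stores the name in a top-level "name" field is mine, not stated in this commit:

```python
import json
import pathlib


def workflow_name(json_path: pathlib.Path) -> str:
    """Read the workflow name from a wfcommons-generated JSON file.

    Assumes (for illustration) that the JSON has a top-level "name"
    field, e.g. "Blast-Benchmark" for the example in the README.
    """
    with json_path.open() as f:
        spec = json.load(f)
    return spec["name"]


# e.g. workflow_name(pathlib.Path("/tmp/blast-benchmark-45.json"))
```

If the name read here differs from the argument given to `airflow dags test`, Airflow will not find the DAG.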
