@@ -12,8 +12,8 @@ distributed jobs:
 4. A spot fleet of EC2 instances
 
 All of them can be managed through the AWS Management Console. However, this code helps to get
-started quickly and run a job autonomously if all the configuration is correct. The code includes
-a fabric script that links all these components and prepares the infrastructure to run a distributed
+started quickly and run a job autonomously if all the configuration is correct. The code runs a
+script that links all these components and prepares the infrastructure to run a distributed
 job. When the job is completed, the code is also able to stop resources and clean up components.
 
 ## Running the code
@@ -29,18 +29,13 @@ and you can modify the worker code to build your own. Anytime you modify the wor
 to update the docker registry using the Makefile script inside the worker directory.
 
 ### Step 2
-After the first script runs successfully, the job can now be submitted to AWS using EITHER of the
-following commands:
+After the first script runs successfully, the job can now be submitted to AWS using the following command:
 
     $ python run.py submitJob files/exampleJob.json
 
-OR
-
-    $ python run_batch_general.py
-
-Running either script uploads the tasks that are configured in the json file. This assumes that your
+Running the script uploads the tasks that are configured in the json file. This assumes that your
 data is stored in S3, and the json file has the paths to find input and output directories. You have to
-customizethe exampleJob.json file or the run_batch_general file with paths that make sense for your project.
+customize the exampleJob.json file or the run_batch_general file with paths that make sense for your project.
 Each job will be run in parallel - you define each task in your input file to guide the parallelization.
 
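For illustration only, a job file of this kind could be sketched as below. The field names (`jobName`, `tasks`, `input`, `output`) and bucket paths are assumptions for this sketch, not the project's confirmed schema; consult files/exampleJob.json in the repository for the real format.

```json
{
  "jobName": "example-distributed-job",
  "tasks": [
    {
      "input": "s3://my-bucket/input/part-0/",
      "output": "s3://my-bucket/output/part-0/"
    },
    {
      "input": "s3://my-bucket/input/part-1/",
      "output": "s3://my-bucket/output/part-1/"
    }
  ]
}
```

The idea is that each entry under `tasks` points at its own S3 input and output prefix, so each entry can be dispatched as an independent parallel job.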
 ### Step 3