README.md: 9 additions & 5 deletions
@@ -84,8 +84,8 @@ The backend looks for a file named `config.json` at the location from where it i
## First Steps
-Using the DRES CLI, the `help` command lists available commands and their usage. If one would like to know more about a certain command, use the argument `-h`.
-Following the first steps towards a successful installment of a (distributed) retrieval evaluation campagn. A prerequisit is the previous deployment, see [Setup](#setup) and a running DRES instance.
+Using the DRES CLI, the `help` command lists available commands and their usage. To learn more about a certain command `$cmd`, use the argument `-h`: `DRES> $cmd -h`
+The following are the first steps towards a successful (distributed) retrieval evaluation campaign. A prerequisite is a previous deployment (see [Setup](#setup)) and a running DRES instance.
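A CLI session following this pattern might look as below (a sketch: `user` is used here as an assumed example command; the actual command names come from `help` on your installed DRES version):

```
DRES> help
DRES> user -h
```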
### Create User
@@ -130,17 +130,18 @@ Then, navigate to _Evaluation Template Builder_ and create a new competition. Fo
### Create Competition Run
An evaluation template serves as the template for one or more _evaluation runs_.
-Please keep in mind, that once a _run_ was created, changes on the template are not reflected in the run.
+Please keep in mind that once a _run_ has been created, changes to the template are not reflected in that run.
-Evaluation runs are created from the _Evaluations_view, where one uses the "+" button to create a new one.
+Evaluation runs are created from the _Evaluation Template Overview_ view, where one uses the "Exit" (a running person) button to create a new one.
In a non-distributed setting, it might be desirable that participants cannot view the actual run from the frontend,
but require an external source for the query hints (e.g. a large monitor). This could be achieved by unchecking the corresponding option in the dialog.
+A run must be of type `SYNCHRONOUS` or `ASYNCHRONOUS`. The former synchronises task presentation and task execution for all participants, while the latter enables task execution individually per participant.
### Running the evaluation
As evaluation _operator_, one has to first start the run, then switch to a fitting task and ultimately start the task.
-Query hints are displayed as configured to all viewers, once they are all loaded (depending on the setup, this might take a breif moment).
+Query hints are displayed as configured to all viewers, once they are all loaded (depending on the setup, this might take a brief moment).
Viewers and participants are shown the message "_Waiting for host to start task_". In case this seems to take too long,
the operator can switch to the admin view and force all participants to be ready by clicking the red ones.
@@ -153,6 +154,9 @@ It is recommended that all programmatic interaction with the DRES server is done
**Notice:** We strongly recommend using the [client OpenAPI Specification](doc/oas-client.json) to generate client code for submissions (and for interacting with the server in general)!
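As a sketch, such a client could be produced with a standard OpenAPI generator (the generator choice, target language, and output directory here are assumptions, not part of DRES):

```
openapi-generator-cli generate -i doc/oas-client.json -g python -o dres-client
```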
+**Notice:** With version 2.0.0, we provide a new POST submission endpoint which is more flexible than the legacy GET one.
+In version 2.0.0, the legacy submission endpoint remains available under API version 1 but is deprecated. We highly encourage moving to the new POST-based endpoint.
+
---
For legacy reasons, we provide further information below: