Unlike the Import UI, which lets us monitor or debug usage, we have no insight into the stand-alone client.
One thing we could do is to ship the `report.json` content at the end of the run; it should be well under 1 MB in size.
Perhaps we can have a new recon API endpoint to save the `report.json` in a GCS bucket as a time-stamped file.
To make sure disconnected (or poorly-connected) users can still use the tool, we should probably do this at the very end, so that even if the upload is interrupted, the command will already have done its job.
Finally, if we also report back the GCS file path, users can then more conveniently share that URL while asking us questions (rather than sending along a `report.json`).
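To make the idea concrete, here's a rough sketch of what the endpoint's handler could look like. The bucket name, object prefix, and function names are all illustrative assumptions, not decisions:

```python
# Sketch: save an uploaded report.json to GCS as a time-stamped object
# and return its gs:// path so the CLI can print it for the user.
from datetime import datetime, timezone

BUCKET = "datcom-recon-reports"  # hypothetical bucket name


def timestamped_object_name(prefix: str = "reports") -> str:
    """Build a time-stamped object name, e.g. reports/20240101T120000Z_report.json."""
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"{prefix}/{ts}_report.json"


def save_report(report_content: str) -> str:
    """Upload the report content to GCS; return the path users can share."""
    from google.cloud import storage  # requires google-cloud-storage

    client = storage.Client()
    blob = client.bucket(BUCKET).blob(timestamped_object_name())
    blob.upload_from_string(report_content, content_type="application/json")
    return f"gs://{BUCKET}/{blob.name}"
```

The time-stamped name keeps uploads from different runs (and users) from clobbering each other, and the returned `gs://` path is exactly what the user would paste into a question.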
@chejennifer, @beets ... wdyt?