Commit 708b19b

Donglai Wei and claude committed
Skip model build when inference.saved_prediction_path is set
When saved_prediction_path is configured, create a lightweight nn.Identity() model and skip checkpoint loading; no GPU is needed for decode-only mode.

Works with: just test waterz_decoding ""

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
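The shortcut described in the commit message can be sketched as a small helper. This is a hypothetical builder, assuming PyTorch and a config object with an optional `inference.saved_prediction_path` field; the actual wiring inside `main()` is not shown in this commit:

```python
from types import SimpleNamespace

import torch.nn as nn


def build_model(cfg, real_builder):
    """Hypothetical sketch: return nn.Identity() when predictions will be
    read from disk, so the expensive model build (and any GPU work) is
    skipped entirely."""
    saved = getattr(getattr(cfg, "inference", None), "saved_prediction_path", "")
    if isinstance(saved, str) and saved.strip():
        # Decode-only mode: the network is never invoked, so a no-op
        # module satisfies any downstream code that expects a model object.
        return nn.Identity()
    return real_builder(cfg)


# Example: a config pointing at saved predictions yields the placeholder.
cfg = SimpleNamespace(inference=SimpleNamespace(saved_prediction_path="/data/pred.h5"))
model = build_model(cfg, lambda c: nn.Linear(4, 4))
print(type(model).__name__)  # Identity
```

Since `nn.Identity()` holds no parameters, it also loads instantly and never triggers checkpoint restore.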
1 parent e399302 commit 708b19b

1 file changed: scripts/main.py (5 additions, 2 deletions)
```diff
@@ -889,10 +889,13 @@ def main():
     ):
         return

-    # Check for cached intermediate predictions early so we can skip both the
+    # Check for cached/external predictions early so we can skip both the
     # expensive model build and checkpoint restore for test/tune modes.
+    _saved_pred = getattr(getattr(cfg, "inference", None), "saved_prediction_path", "")
+    has_saved_prediction = bool(_saved_pred and isinstance(_saved_pred, str) and _saved_pred.strip())
     tta_cached = args.mode in ("test", "tune", "tune-test") and (
-        _has_tta_prediction_file(cfg)
+        has_saved_prediction
+        or _has_tta_prediction_file(cfg)
         or _has_cached_predictions_in_output_dir(
             cfg,
             mode=args.mode,
```
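The two added guard lines can be exercised in isolation. Below is a minimal sketch using `SimpleNamespace` objects to stand in for the config (an assumption about its shape), showing why the nested `getattr` calls make the check safe when the `inference` section or the key is missing:

```python
from types import SimpleNamespace


def has_saved_prediction(cfg):
    # Mirror of the guard added in the diff: the inner getattr tolerates a
    # missing `inference` section (getattr on None still returns the ""
    # default), and strip() rejects whitespace-only paths.
    _saved_pred = getattr(getattr(cfg, "inference", None), "saved_prediction_path", "")
    return bool(_saved_pred and isinstance(_saved_pred, str) and _saved_pred.strip())


print(has_saved_prediction(
    SimpleNamespace(inference=SimpleNamespace(saved_prediction_path="/p.h5"))))  # True
print(has_saved_prediction(
    SimpleNamespace(inference=SimpleNamespace(saved_prediction_path="   "))))    # False
print(has_saved_prediction(SimpleNamespace()))  # False: no inference section
```

The result then short-circuits `tta_cached`, so `_has_tta_prediction_file` and the cached-output scan are only consulted when no external prediction path was supplied.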
