Fix imports for fake op wrappers used in export #366
geoffreyQiu wants to merge 1 commit into NVIDIA:main
Conversation
Greptile Summary

This PR fixes missing imports for fake op wrapper modules.

Confidence Score: 5/5

Safe to merge — changes are additive import fixes with no logic modifications. Both files receive only import additions. The load-order invariant (real ops before fake impls) is correctly preserved via isort guards in exportable_embedding.py. In fused_hstu_op.py the new hstu.hstu_ops_gpu import is a submodule of hstu, so Python's package initialization guarantees the parent's __init__.py runs first regardless of line order, making an isort guard unnecessary there. No logic changes, no regressions. No files require special attention.
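The package-initialization guarantee the summary relies on (importing `pkg.sub` always runs `pkg/__init__.py` first, regardless of import line order) can be demonstrated with a small self-contained sketch. The package name `pkg` and the `ORDER` list are hypothetical, used only to make the execution order observable:

```python
import os
import sys
import tempfile

# Build a throwaway package on disk: pkg/__init__.py and pkg/sub.py.
# This mirrors the hstu / hstu.hstu_ops_gpu relationship from the PR,
# with hypothetical names.
tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "pkg")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("ORDER = ['pkg']\n")  # records that the parent init ran
with open(os.path.join(pkg_dir, "sub.py"), "w") as f:
    # By the time sub.py executes, pkg/__init__.py has already run,
    # so ORDER exists and already contains 'pkg'.
    f.write("from pkg import ORDER\nORDER.append('sub')\n")

sys.path.insert(0, tmp)
import pkg.sub  # importing the submodule triggers the parent first

print(pkg.ORDER)  # ['pkg', 'sub']
```

This is why `fused_hstu_op.py` needs no isort guard: even if `import hstu.hstu_ops_gpu` were reordered above `import hstu`, the parent's `__init__.py` (which registers the real ops) still runs before the submodule's fake-impl registration.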
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[Module import starts] --> B[_load_inference_emb_ops\nloads inference_emb_ops.so\nregisters torch.ops.INFERENCE_EMB.*]
    B --> C[import dynamicemb.index_range_meta\nregisters fake impls for torch.export]
    C --> D[import dynamicemb.lookup_meta\nregisters fake impls for torch.export]
    D --> E[import hstu_cuda_ops\nregisters torch.ops.hstu_cuda_ops.*]
    E --> F[import commons.ops.cuda_ops.fake_hstu_cuda_ops\nregisters fake impls for torch.export]
    F --> G[isort: on - Normal imports resume]
    G --> H[ExportableEmbedding class available]
    subgraph fused_hstu_op.py
        I[import hstu\nregisters torch.ops.fbgemm.*] --> J[import hstu.hstu_ops_gpu\nregisters fake impls for torch.export]
    end
```
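The flowchart's invariant — the compiled library must register the real ops before the fake (shape-only) impls for torch.export can attach to them — can be illustrated with a toy registry. The names here (`OpRegistry`, `register_real`, `register_fake`) are hypothetical stand-ins, not the torch.library API:

```python
class OpRegistry:
    """Toy op registry: a fake impl may only be attached to an op
    whose real (compiled) implementation has already been loaded."""

    def __init__(self):
        self._real = {}
        self._fake = {}

    def register_real(self, name, fn):
        # In the PR this corresponds to loading the .so
        # (e.g. inference_emb_ops.so registering torch.ops.INFERENCE_EMB.*).
        self._real[name] = fn

    def register_fake(self, name, fn):
        # Mirrors the load-order invariant: fail fast if the real op
        # is not there yet, which is what the isort guards prevent.
        if name not in self._real:
            raise RuntimeError(f"real op '{name}' not loaded yet")
        self._fake[name] = fn

    def call(self, name, *args, tracing=False):
        table = self._fake if tracing else self._real
        return table[name](*args)


reg = OpRegistry()
reg.register_real("hstu_attn", lambda q, k: q + k)  # from the compiled lib
reg.register_fake("hstu_attn", lambda q, k: 0)      # shape-only stub for export

print(reg.call("hstu_attn", 1, 2))                  # 3  (eager path)
print(reg.call("hstu_attn", 1, 2, tracing=True))    # 0  (export/tracing path)
```

Reversing the two `register_*` calls raises `RuntimeError`, which is the failure mode the missing imports in this PR could have produced at export time.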
Reviews (1): Last reviewed commit: "Fix imports for fake ops wrapper used in..."
/build
/build
❌ Pipeline #48606612 -- failed
Result: 8/14 jobs passed
Close as merged to #363
Description

Adds the imports needed to register fake op implementations for torch.export.