Commit 13cf84a
feat(ascend): op-simple group — Add, Mul, Cast, Cat, Matmul, Gemm, Linear (#65)
* feat(ascend): op-simple group — Add, Mul, Cast, Cat, Matmul, Gemm, Linear
Seven foundational Ascend operators:
| op | impl |
|---|---|
| Add | aclnnAdd |
| Mul | aclnnMul |
| Cast | aclnnCast |
| Cat | aclnnCat |
| Matmul | aclnnMatmul |
| Gemm | aclnnMm (also carries the cached-executor / workspace-pool rework) |
| Linear | aclnnMatmul + optional bias |
Also ships:
- `src/base/<op>.h` for the 5 new ops (cast/cat/linear/matmul/mul);
`add.h` and `gemm.h` existed on master and are updated in-place
- `src/cpu/<op>/<op>.h` reference impls for cast/cat/linear/mul (add/gemm/matmul
had CPU refs on master already)
- `tests/test_<op>.py` for each operator (add and gemm have MODIFY diffs;
others are new)
* fix(ascend): Add/Cat destructor — use `release()` for executor-owned caches
- `add/kernel.h`: swap destroy() → release() on in_cache_/oth_cache_/out_cache_
  and drop aclDestroyAclOpExecutor (the caches are referenced by the Repeatable
  executor, so destroying them causes a double-free at shutdown, per the pattern
  documented in common.h and commit 64c367c).
- `cat/kernel.h`: release all in_caches_[i] in the destructor; without it,
~AclTensorCache() on vector teardown double-frees descriptors held by
tensor_list_ / executor_.
- Also group the alpha_* storage members with blank lines to match file
convention.
* test: generate `implementation_index` dynamically from `active_implementation_indices`
Replaces hardcoded `(0, 1)` / `(0, 1, 2)` tuples in test_add, test_gemm,
test_rms_norm, test_swiglu with a union over the locally-available devices'
active implementation indices.
New helper `tests.utils.all_active_implementation_indices(op_cls)` iterates
only `get_available_devices()`, avoiding the `std::abort` that `DispatchFunc`
hits for device types outside the build's `ActiveDevices` set.
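A minimal sketch of what this helper could look like (`get_available_devices`
and `active_implementation_indices` are the names used in this change; the
import path and body are assumptions):

```python
# Hypothetical sketch of tests/utils.all_active_implementation_indices.
# Only devices actually available in this build are queried, so DispatchFunc
# is never asked about a device type outside ActiveDevices (which aborts).
from infini import get_available_devices  # import path assumed


def all_active_implementation_indices(op_cls):
    indices = set()
    for device in get_available_devices():
        indices.update(op_cls.active_implementation_indices(device))
    return tuple(sorted(indices))
```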
Effect on Ascend CI: skipped-test count drops from 3246 to 1686 — impl=1
(`cuBLASLt`) is no longer parametrized when no CUDA device is visible, and
RmsNorm/Swiglu's custom-kernel slot drops out of the matrix on op-simple,
where the framework layer hasn't merged the AscendC impl yet.
* test(conftest): joint `(device, implementation_index)` parametrize
Replaces the per-test `@pytest.mark.parametrize("implementation_index", ...)`
+ runtime `if impl not in active_indices: skip` pattern with a single hook in
`conftest.pytest_generate_tests` that emits only the (device, impl) pairs
actually active on each device.
Rationale: kernel dispatch is per-device, so cross-device union (previous
`all_active_implementation_indices` helper) polluted the matrix with impls
that the selected device can't run — runtime-skipped noise. Joint generation
keeps the matrix to its semantic cell: "this device has this impl, so run it".
- `tests/conftest.py`: when both `device` and `implementation_index` are in
  fixturenames, emit pairs via `op_cls.active_implementation_indices(dev)`;
  fall back to a skipped placeholder (`id="skip"`) when no device has an
  active impl, avoiding `[NOTSET-...]` test IDs (sketched after this list).
- `tests/{test_add,test_gemm,test_rms_norm,test_swiglu}.py`: drop the hardcoded
`implementation_index` parametrize decorator and the runtime `active_indices`
guard — conftest now handles both.
- `tests/utils.py`: remove the `all_active_implementation_indices` helper
(superseded by per-device generation in conftest).
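A minimal sketch of the hook described above — `pytest_generate_tests`,
`metafunc`, and `pytest.param` are standard pytest; `get_available_devices`,
`active_implementation_indices`, and `_op_class_from_module` are the project
helpers named in this message, and the rest of the body is an assumption:

```python
# tests/conftest.py — sketch of joint (device, implementation_index) generation.
import pytest


def pytest_generate_tests(metafunc):
    # Only act when a test requests both fixtures.
    if not {"device", "implementation_index"} <= set(metafunc.fixturenames):
        return
    # Project helpers from this message; _op_class_from_module is sketched
    # under the follow-up conftest refactor below.
    op_cls = _op_class_from_module(metafunc.module)
    pairs, ids = [], []
    for dev in get_available_devices():
        for impl in op_cls.active_implementation_indices(dev):
            pairs.append((dev, impl))
            ids.append(f"{dev}-impl{impl}")
    if not pairs:
        # Readable placeholder instead of [NOTSET-...] test IDs.
        pairs = [pytest.param(None, None, id="skip",
                              marks=pytest.mark.skip(reason="no active implementation"))]
        ids = None
    metafunc.parametrize(("device", "implementation_index"), pairs, ids=ids)
```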
Same test outcome on Ascend CI (1935 passed / 1686 skipped), but the remaining
skips are now semantically mandatory (uint dtypes unsupported by `torch_npu`,
Gemm impl=2 SFINAE-only workaround, an op missing its ascend impl on op-simple,
pending PR #66) rather than mechanism artifacts.
* refactor(conftest): dedupe `_op_class_from_module`, short-circuit redundant fixture
Post-review cleanup of the joint-parametrize refactor (1dd288f):
- Extract `_op_class_from_module` as a shared helper (sketched after this list); `skip_op_without_platform_impl` fixture now calls it instead of re-deriving the snake→pascal class name inline.
- Short-circuit the fixture when `implementation_index` is already in callspec — `pytest_generate_tests` has already pruned empty-impl pairs, so per-case `active_implementation_indices` calls are wasted work.
- Drop `try/except ImportError` inside the helper — collection has already imported `infini.ops` via test modules; masking a real import failure only turns it into a cryptic NOTSET fixture.
- Drop the `devices[0] if devices else "cpu"` fallback — `get_available_devices()` always includes `"cpu"`, making the `else` arm unreachable.
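A possible shape for the shared helper — the snake→pascal derivation and the
`infini.ops` module come from this message; the exact lookup is an assumption:

```python
# Hypothetical sketch of tests/conftest._op_class_from_module.
import infini.ops


def _op_class_from_module(module):
    # "tests.test_rms_norm" (or "test_rms_norm") -> "rms_norm" -> "RmsNorm"
    snake = module.__name__.rpartition(".")[-1].removeprefix("test_")
    pascal = "".join(part.capitalize() for part in snake.split("_"))
    # No try/except: a missing op class should fail loudly at collection time.
    return getattr(infini.ops, pascal)
```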
* refactor(cpu): flatten nested `DispatchFunc` in Cast; snake_case variables in Linear
Per PR #65 review:
- `src/cpu/cast/cast.h`: replace nested `DispatchFunc(in_dtype, ...)` inside
`DispatchFunc(out_dtype, ...)` with a single multi-dispatch call
`DispatchFunc<kCpu, AllTypes, AllTypes>({in, out}, [](in_tag, out_tag) {...})`
per the multi-dispatch idiom documented in `CONTRIBUTING.md`.
- `src/cpu/linear/linear.h`: rename PascalCase locals to snake_case:
`A/B/Out/Bias` → `a_ptr/b_ptr/out_ptr/bias_ptr`,
`A_batch/B_batch/Out_batch` → `a_batch/b_batch/out_batch`,
`M/N/K` → `m/n/k` (matching master's `src/cpu/gemm/gemm.h` which already
uses lowercase dim names `m_/n_/k_`).
* refactor(cpu/linear): drop redundant `&& bias` guard + narrating comment
- `if (bias_ptr && bias)` → `if (bias_ptr)` (line 75). `bias_ptr` is
`nullptr` iff `!bias` by construction at line 38, so `&& bias` is dead.
- Remove `// Determine `m`, `n`, `k` from shapes and transpose flags.` —
the three lines below literally do exactly that; self-describing now that
names are snake_case.
---------
Co-authored-by: zhangyue <zhangyue@example.com>

1 parent a05713b
27 files changed
Lines changed: 1584 additions & 107 deletions
File tree
- src
  - ascend
    - add
    - cast
    - cat
    - gemm
    - linear
    - matmul
    - mul
  - base
  - cpu
    - cast
    - cat
    - linear
    - mul
- tests