subgroups/sbom-sg/outcomes/QualityGuide/SBOM-Document-Quality-Guide.en.md

12 additions & 8 deletions
@@ -373,18 +373,22 @@ For example, the two SBOMs below use different package names - one as 'hello' an
 Although verifying that these represent the same package is possible by comparing, for instance, their PURLs, the fact that different tools may write different values for the same key frequently leads to confusion and poses challenges to the smooth operation of SBOM management.
 
 #### 5.1.3 Improvement Measures
-- Establish and disseminate unified naming conventions based on industry standards or internal guidelines, including concrete examples (such as regular expression format examples).
-- Deploy automated verification tools to assess whether package names, version numbers, and supplier details comply with the set standards.
-- Set up periodic reviews and feedback loops among stakeholders to refine and update the naming guidelines.
+- Organizations involved in the software supply chain shall agree to document values - such as purl - that uniquely identify a package.
+- A standardized notation rule based on industry standards or internal guidelines shall be formulated, and concrete examples (e.g., format examples using regular expressions) shall be widely shared among the organizations involved in the software supply chain.
+- Verification tools shall be shared among the organizations in the software supply chain, and a mechanism shall be established to check whether the following items - crucial for package identification and prone to inconsistent notation - comply with the established rules:
+  - Package name
+  - Package version
+  - Package supplier name
+  - Package source
+  - Presence of purl
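The verification mechanism described in the added bullets could be sketched as follows. This is a minimal, hypothetical checker: the field names and regular expressions are illustrative assumptions, not rules defined by the guide, and real rules would have to be agreed upon by the organizations in the supply chain.

```python
import re

# Hypothetical notation rules - illustrative only; actual rules must be
# agreed upon by the organizations in the software supply chain.
RULES = {
    "name": re.compile(r"^[a-z0-9][a-z0-9._-]*$"),   # lowercase package names
    "version": re.compile(r"^\d+\.\d+\.\d+$"),       # three-part versions
    "supplier": re.compile(r"^Organization: .+$"),   # SPDX-style supplier
    "purl": re.compile(r"^pkg:[a-z]+/.+$"),          # minimal purl shape
}

def check_component(component: dict) -> list[str]:
    """Return a list of rule violations for one SBOM component entry."""
    violations = []
    for field, pattern in RULES.items():
        value = component.get(field)
        if value is None:
            # Catches, e.g., an absent purl.
            violations.append(f"missing field: {field}")
        elif not pattern.match(value):
            violations.append(f"non-compliant {field}: {value!r}")
    return violations

component = {
    "name": "hello",
    "version": "2.10",   # violates the hypothetical three-part version rule
    "supplier": "Organization: Example Corp",
    # "purl" is absent
}
print(check_component(component))
```

Such a checker only encodes whatever rules the organizations have agreed on; it flags the version format and the missing purl here, but cannot decide whether 'hello' and a differently named entry are the same package.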
 
 
 #### 5.1.4 Evaluation Methods
-- Quantitatively monitor rule violation counts (errors or deviations) using automated checkers before and after guideline implementation.
-- Conduct random sampling of SBOM entries to manually verify adherence to the naming conventions.
-- Gather qualitative feedback from internal users regarding improvements in consistency and clarity.
+- Randomly check whether each documented item adheres to the standardized notation rules, and evaluate the compliance rate.
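The random-sampling evaluation could be sketched as follows. This is a hypothetical helper, not part of the guide: the `is_compliant` predicate stands in for whatever checker the organizations agree on, and the sample size is an assumption.

```python
import random

def compliance_rate(entries, is_compliant, sample_size=50, seed=0):
    """Randomly sample SBOM entries and return the share that passes
    the agreed notation-rule check."""
    rng = random.Random(seed)  # fixed seed so an audit is reproducible
    sample = rng.sample(entries, min(sample_size, len(entries)))
    passed = sum(1 for entry in sample if is_compliant(entry))
    return passed / len(sample)

# Toy data: 80 compliant entries and 20 non-compliant ones.
entries = [{"ok": True}] * 80 + [{"ok": False}] * 20
rate = compliance_rate(entries, lambda e: e["ok"], sample_size=100)
print(f"{rate:.0%}")
```

Tracking this rate before and after the notation rules are introduced gives a simple quantitative signal of whether the rules are taking hold.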
 
 
 #### 5.1.5 Risks and Considerations
-- Automated tools might not capture all edge cases or exceptions in naming.
-- Updates to naming guidelines may induce short-term confusion if not clearly communicated and documented.
+- If the documentation rules are inadequately defined or overly complex, there is a risk of misclassification due to incomplete handling of exceptional cases or unique notations.
+- If the rules deviate from actual operational practices, there is a risk of misclassification resulting from either insufficient or excessively stringent checks.
+- It should be noted that complete automation of tools and processes is challenging; therefore, final checks and exception handling will require manual review.
 
 
 ### 5.2 Standardization and Normalization of Component Granularity