[Accepted in ICSE 2026]
This repository contains the complete artifact and replication package for the ICSE 2026 submission: "Why Attention Fails: A Taxonomy of Faults in Attention-Based Neural Networks"
Modern AI systems, including LLMs like ChatGPT and Gemini, depend on attention mechanisms. However, reliability failures in these architectures are often poorly understood and inadequately addressed by existing taxonomies. Our study provides the first systematic taxonomy of faults in attention-based neural networks (ABNNs), based on empirical analysis of 555 real-world faults from 96 projects and 10 frameworks.
This repository is intended for both researchers and practitioners. It offers curated datasets, code for analysis and taxonomy development, and scripts and validation artifacts for replication and artifact evaluation.
```
Fault-Taxonomy-for-Attention-Based-Neural-Networks/
├── Code/
│   ├── requirements.txt            # Python dependencies
│   ├── config.json                 # Configuration settings for scripts
│   ├── validation_framework.py     # Validation and sanity-check framework
│   ├── manual_analysis.py          # Manual qualitative coding/annotation pipeline
│   ├── statistical_analysis.py     # Statistical analysis scripts (RQ2/RQ4)
│   ├── main_pipeline.py            # Orchestrates the full study pipeline
│   ├── data_collection.py          # Data crawling and bug collection
│   ├── visualization_framework.py  # Figure and visualization generation
│   ├── taxonomy_development.py     # Core taxonomy coding and analysis
│   └── readme.md                   # Detailed instructions for code reproduction
├── Qualitative/
│   ├── agreement_full_manual_analysis.md
│   ├── agreement_pilot_analysis.md
│   ├── agreement_fault_relevance.md
│   ├── agreement_taxonomy_uniqueness.md
│   ├── Manual_Analysis.xlsx
│   ├── agreement_taxonomy_development.md
│   ├── agreement_analysis_summary.md
│   ├── agreement_content_sufficiency.md
│   └── readme.md                   # Guide to inter-rater agreement & qualitative reliability
└── ICSE_2026_ABNN.pdf              # Submitted research paper
```
## Read the Paper

- The main findings, taxonomy, and methodology are in `ICSE_2026_ABNN.pdf` (repository root).
## Replication & Artifact Evaluation

- Code and Data: All scripts and supporting files are in `./Code/`. Follow the setup and run instructions in `./Code/readme.md`.
- Qualitative Reliability: All manual annotation and inter-rater agreement analyses supporting the taxonomy are documented in `./Qualitative/`. See `./Qualitative/readme.md` for the folder structure and how to interpret the agreement files.
## Environment Setup

- Python 3.9+ is recommended. Install dependencies via `pip install -r ./Code/requirements.txt`.
- Adjust `config.json` as needed to reproduce or extend analyses.
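The setup steps above can be scripted as follows (a minimal sketch assuming a POSIX shell run from the repository root; the `.venv` environment name is illustrative and not part of the package):

```shell
# Verify the interpreter meets the recommended 3.9+ baseline
python3 -c 'import sys; assert sys.version_info >= (3, 9), sys.version'

# Create and activate an isolated environment (optional but recommended)
python3 -m venv .venv
. .venv/bin/activate

# Install the pinned dependencies (run from the repository root)
if [ -f ./Code/requirements.txt ]; then
    pip install -r ./Code/requirements.txt
else
    echo "requirements.txt not found -- run this from the repository root"
fi
```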
## Reproducing Results

- Main pipeline: run `main_pipeline.py` to orchestrate end-to-end reproduction.
- For custom analyses (RQ2, RQ4, taxonomy development), see the dedicated scripts and explanations in `./Code/readme.md`.
## Artifact Evaluation

- All required outputs, analysis checkpoints, and validation artifacts are included.
- For further information or troubleshooting, see the respective `readme.md` files in each folder.
- `Code/`
  - Purpose: Scripts for data collection, manual and statistical analysis, taxonomy development, and visualization.
  - Instructions: Step-by-step usage instructions for running each script and reproducing the study results are in `./Code/readme.md`.
- `Qualitative/`
  - Purpose: Reliability and validity assessment of the manual annotation and taxonomy development, including agreement statistics, consensus-building records, and qualitative validity metrics.
  - Instructions: See `./Qualitative/readme.md` for explanations of each file and guidance on using these artifacts in your own studies.
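The agreement files above document inter-rater statistics for the manual annotation. As a hedged illustration of the kind of metric involved, Cohen's kappa for two annotators over the same items can be computed like this (the function and fault-category labels are illustrative, not the study's actual implementation):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal labels to the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items labelled identically
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's label marginals
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical fault-category labels from two annotators
a = ["API", "API", "shape", "training", "shape", "API"]
b = ["API", "shape", "shape", "training", "shape", "API"]
print(round(cohens_kappa(a, b), 3))  # → 0.739
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it (rather than raw agreement) is the conventional reliability statistic for qualitative coding.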
## Paper Reference

```bibtex
@article{anonymous2026attention,
  title   = {Why Attention Fails: A Taxonomy of Faults in Attention-Based Neural Networks},
  author  = {Anonymous Author(s)},
  journal = {ICSE 2026 (submitted)},
  year    = {2026}
}
```

- Correspondence: Please use the anonymized contact provided in the paper or the GitHub repository.
- This replication package and dataset are provided for research and artifact-evaluation purposes. Check `ICSE_2026_ABNN.pdf` for up-to-date publication and citation information.
- For more information or updates, visit the ICSE 2026 conference site.

For any issues, questions, or requests, please consult the individual README files inside each subfolder or contact the authors listed in the paper.