---
title: 'diffblas: algorithmically differentiated BLAS routines'
tags:
authors:
affiliations:
date: 25 February 2026
bibliography: paper.bib
---

# Summary

diffblas is a library of BLAS routines differentiated, following the principles of algorithmic differentiation [@griewank2008], from their reference implementations in the LAPACK repository on GitHub, using the source-to-source automatic differentiation tool Tapenade [@tapenade].
It supports four modes of differentiation: tangent (`_d`), vector tangent (`_dv`), adjoint (`_b`), and vector adjoint (`_bv`).
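In the usual AD notation [@griewank2008], for a routine computing $y = F(x)$ these four modes propagate, respectively,

$$\dot{y} = F'(x)\,\dot{x}, \qquad \dot{Y} = F'(x)\,\dot{X}, \qquad \bar{x} = F'(x)^{\top}\bar{y}, \qquad \bar{X} = F'(x)^{\top}\bar{Y},$$

where the vector modes propagate several tangent or adjoint directions (the columns of $\dot{X}$ and $\bar{Y}$) in a single call.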
In addition to differentiating the standard Fortran-style BLAS interface, diffblas also provides differentiated CBLAS routines, facilitating interoperability with C and other languages.
Its API mirrors that of BLAS/CBLAS, with additional arguments specifying the differentiation variables, making integration into existing workflows straightforward.
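To illustrate what a tangent-mode routine computes, the sketch below implements the tangent of `cblas_dgemv` using only standard CBLAS calls. The function name `dgemv_tangent` and its argument order are illustrative assumptions for this example; a diffblas `_d` routine computes the same quantities, and its exact prototype is given by the library headers.

```c
#include <stdio.h>
#include <cblas.h>

/* Tangent of y := alpha*A*x + beta*y with respect to (A, x, y).
 * Given directional derivatives ad, xd, yd of the inputs, compute
 *   yd := alpha*(ad*x + a*xd) + beta*yd
 * followed by the primal update. Illustrative sketch only: the actual
 * diffblas `_d` routine may name and order its arguments differently. */
static void dgemv_tangent(const enum CBLAS_ORDER order,
                          const enum CBLAS_TRANSPOSE trans,
                          const int m, const int n, const double alpha,
                          const double *a, const double *ad, const int lda,
                          const double *x, const double *xd, const int incx,
                          const double beta, double *y, double *yd,
                          const int incy)
{
    /* yd := alpha*ad*x + beta*yd */
    cblas_dgemv(order, trans, m, n, alpha, ad, lda, x, incx, beta, yd, incy);
    /* yd += alpha*a*xd */
    cblas_dgemv(order, trans, m, n, alpha, a, lda, xd, incx, 1.0, yd, incy);
    /* y := alpha*a*x + beta*y (primal last, so the inputs above are intact) */
    cblas_dgemv(order, trans, m, n, alpha, a, lda, x, incx, beta, y, incy);
}

int main(void)
{
    double a[4] = {1, 2, 3, 4}, ad[4] = {0, 0, 0, 0};  /* row-major 2x2 */
    double x[2] = {1, 1},       xd[2] = {1, 0};        /* tangent seed  */
    double y[2] = {0, 0},       yd[2] = {0, 0};

    dgemv_tangent(CblasRowMajor, CblasNoTrans, 2, 2, 1.0,
                  a, ad, 2, x, xd, 1, 0.0, y, yd, 1);
    printf("y  = (%g, %g)\n", y[0], y[1]);
    printf("yd = (%g, %g)\n", yd[0], yd[1]);
    return 0;
}
```

Linked against any CBLAS backend (for example with `-lopenblas`), this prints `y = (3, 7)` and `yd = (1, 3)`.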
diffblas relies on the underlying standard BLAS implementation and is agnostic to the backend (OpenBLAS, BLIS, MKL, Apple Accelerate, or libblastrampoline), ensuring both high performance and portability.
Precompiled artifacts are available on GitHub, simplifying installation and use across different programming environments.
By providing efficient and accurate derivatives of linear algebra operations, diffblas facilitates gradient-based optimization, sensitivity analysis, and derivative-based workflows in scientific computing.

# Statement of need

Linear algebra routines such as those in BLAS and LAPACK are widely used in scientific computing, optimization, and machine learning. However, they do not provide derivatives, which are often required by gradient-based algorithms.
diffblas addresses this gap by providing algorithmically differentiated BLAS routines derived directly from the reference LAPACK implementations, following the relevant differentiation rules [@giles2008].
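For example, for a matrix product $C = AB$ these rules give the tangent and adjoint

$$\dot{C} = \dot{A}B + A\dot{B}, \qquad \bar{A} = \bar{C}B^{\top}, \qquad \bar{B} = A^{\top}\bar{C},$$

so that the derivative of a GEMM-like routine is itself a pair of GEMM-like operations.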
Automatic differentiation tools, such as Tapenade [@tapenade], ADOL-C [@ADOLC], and TAF [@TAF], provide general mechanisms to compute derivatives of code.
@jonasson2020 derives scalar reverse-mode derivative formulae for BLAS routines and provides their Fortran implementations. Such an approach is generally more efficient than differentiating through the BLAS source with an AD tool [@jonasson2020]. However, that work does not consider the forward mode or the vector modes.
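As a concrete instance of such a hand-derived rule, the adjoint of the dot product $r = x^{\top}y$ is $\bar{x} \leftarrow \bar{x} + \bar{r}\,y$ and $\bar{y} \leftarrow \bar{y} + \bar{r}\,x$, which can itself be expressed with BLAS calls. The sketch below is illustrative and does not reproduce the code of @jonasson2020; the name `ddot_adjoint` is assumed for this example.

```c
#include <cblas.h>

/* Hand-derived adjoint of r = dot(x, y): given the output adjoint rb,
 * accumulate xb += rb*y and yb += rb*x with two axpy calls, instead of
 * differentiating through the dot-product loop with an AD tool.
 * Name and argument order are illustrative. */
static void ddot_adjoint(const int n,
                         const double *x, const int incx, double *xb,
                         const double *y, const int incy, double *yb,
                         const double rb)
{
    cblas_daxpy(n, rb, y, incy, xb, incx);  /* xb += rb * y */
    cblas_daxpy(n, rb, x, incx, yb, incy);  /* yb += rb * x */
}
```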
Derivatives of linear algebra routines are available in Python frameworks such as JAX [@jax] and PyTorch [@pytorch], but these implementations cannot be used outside those frameworks. Enzyme and Enzyme.jl perform optimized BLAS and CuBLAS differentiation at the MLIR level [@Enzyme], but the derivative code is not available externally. ...
This work was inspired in part by the need to differentiate a Fortran code, HFBTHO [@HFBTHO], that uses BLAS and LAPACK routines, and to use the differentiated application for gradient-based optimization. We were able to differentiate HFBTHO using Tapenade by providing it the source code of the BLAS routines used by HFBTHO, along with their dependencies [@HFBTHOAD]. We hand-coded the derivatives of the LAPACK routine DSYEVR.
Providing both the standard BLAS and the CBLAS interfaces ensures that diffblas can be adopted across programming environments, facilitating derivative computations in diverse scientific computing projects. Precompiled artifacts on GitHub further simplify integration and enable rapid deployment from multiple languages.

# Acknowledgements

This work was supported in part by the Applied Mathematics activity within the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, and by the Office of Nuclear Physics SciDAC program under Contract No. DE-AC02-06CH11357. This work was also supported in part by NSF CSSI grant 2104068.
ChatGPT was used to check the spelling, grammar, and clarity of the English text in this paper.