Create a tracking system for benchmarks and performance regressions. #723

@im7mortal

Description

The project currently lacks a system to track performance and detect regressions.

I am currently looking into criterion-rs to track baselines and detect regressions. I don’t think we need extremely precise benchmarking, since most of the heavy work happens in the compression/decompression, encryption/decryption, and hashing libraries. This library mostly performs basic operations, and if there is an issue (for example in iteration loops), it should be relatively easy to detect.
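As a rough sketch of what criterion-rs tracking could look like in this repository (the bench target name and the measured loop are placeholders, not existing code — the real benchmark would call the library's own entry points):

```toml
# Cargo.toml (fragment): register a criterion bench target.
[dev-dependencies]
criterion = "0.5"

[[bench]]
name = "roundtrip"
harness = false
```

```rust
// benches/roundtrip.rs — hypothetical sketch. The closure body is a
// stand-in workload; black_box prevents the compiler from optimizing
// the measured work away.
use criterion::{black_box, criterion_group, criterion_main, Criterion};

fn bench_roundtrip(c: &mut Criterion) {
    let input = vec![0u8; 64 * 1024];
    c.bench_function("roundtrip 64 KiB", |b| {
        b.iter(|| black_box(&input).iter().map(|&x| x as u64).sum::<u64>())
    });
}

criterion_group!(benches, bench_roundtrip);
criterion_main!(benches);
```

Since we don't need extreme precision, criterion's default sample counts should be more than enough here.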

Since I proposed moving to a plugin-like architecture, performance checks will be important. Regarding the hybrid solution I presented earlier: after further research, I now think that using `dyn` traits should not significantly affect performance. Most of the heavy work is done in lower-level libraries, and their aggressive optimizations will remain unchanged, so the dispatch overhead should be close to zero.

However, the ecosystem first needs a good benchmarking system.

I ran some scripts with AI assistance and got the following results from quick local benchmarks, including on a 10-year-old Raspberry Pi 3B. I think having an automated system for quick assessment would be useful.
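Criterion already supports saving and comparing named baselines via its `--save-baseline` and `--baseline` CLI flags, so the automated quick check could start as a simple CI job along these lines (the workflow file, job names, and caching step are placeholder assumptions, not something that exists in the repo yet):

```yaml
# .github/workflows/bench.yml — hypothetical sketch
name: quick-bench
on: [pull_request]
jobs:
  bench:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Record this PR's numbers under a named baseline.
      - run: cargo bench -- --save-baseline pr
      # Comparing against a stored "main" baseline would additionally
      # require caching/restoring target/criterion between runs.
```

Hardware like the Raspberry Pi could be covered later with a self-hosted runner if we decide the extra signal is worth it.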

(Screenshot: local benchmark results, including the Raspberry Pi 3B run.)
