Commit 9310b81

Enhance documentation in lib.rs with comprehensive architecture overview and usage examples (#27)
- Expanded module documentation to provide a detailed architecture overview, including a trait hierarchy and data structure descriptions.
- Added a Quick Start section with code examples for using the `CacheBuilder` and policy-specific APIs, improving usability for new users.
- Included a table of eviction policies with feature flags, complexity, and best use cases to guide users in selecting appropriate policies.
- Documented thread safety options and metrics features, ensuring clarity on concurrent usage and performance tracking.
- Improved overall structure and navigation of the documentation to align with Rust API Guidelines.
1 parent 94e0f4f commit 9310b81

1 file changed

Lines changed: 176 additions & 2 deletions

src/lib.rs

@@ -1,6 +1,180 @@
-//! cachekit: cache policies and tiered cache management primitives.
+//! High-performance cache primitives with pluggable eviction policies.
 //!
-//! See `docs/design.md` for internal architecture and invariants.
+//! `cachekit` provides a trait-based cache framework with 17 eviction policies,
+//! arena-backed data structures, and optional metrics — all designed for
+//! allocation-free hot paths and predictable tail latency.
+//!
+//! # Quick Start
+//!
+//! The fastest way to get a cache is through the [`builder`] module:
+//!
+//! ```
+//! use cachekit::builder::{CacheBuilder, CachePolicy};
+//!
+//! let mut cache = CacheBuilder::new(1000).build::<u64, String>(CachePolicy::Lru);
+//! cache.insert(1, "hello".to_string());
+//! assert_eq!(cache.get(&1), Some(&"hello".to_string()));
+//! ```
+//!
+//! For direct access to policy-specific APIs, use the concrete types:
+//!
+//! ```
+//! use cachekit::policy::lru_k::LrukCache;
+//! use cachekit::traits::{CoreCache, LrukCacheTrait};
+//!
+//! let mut cache = LrukCache::with_k(1000, 2);
+//! cache.insert(42, "value");
+//! cache.get(&42); // second access — now scan-resistant
+//! assert!(cache.k_distance(&42).is_some());
+//! ```
+//!
+//! # Architecture
+//!
+//! ```text
+//! ┌──────────────────────────────────────────────────────────────────────┐
+//! │                               cachekit                               │
+//! │                                                                      │
+//! │  traits    Trait hierarchy (ReadOnlyCache → CoreCache → …)           │
+//! │  builder   Unified CacheBuilder + Cache<K,V> wrapper                 │
+//! │  policy    17 eviction policies behind feature flags                 │
+//! │  ds        Arena, ring buffer, intrusive list, ghost list, …         │
+//! │  store     Storage backends (HashMap, slab, weighted)                │
+//! │  metrics   Hit/miss counters and snapshots (feature-gated)           │
+//! │  error     ConfigError and InvariantError types                      │
+//! └──────────────────────────────────────────────────────────────────────┘
+//! ```
+//!
+//! **Policy ↔ Storage separation.** Policies only manage metadata and eviction
+//! ordering; the underlying storage is a separate concern. This lets each
+//! policy use the most cache-friendly layout (contiguous arenas, ring buffers,
+//! frequency buckets) without coupling to a single map implementation.
+//!
+//! # Trait Hierarchy
+//!
+//! All caches implement [`traits::CoreCache`], which extends
+//! [`traits::ReadOnlyCache`]. Policy-specific behaviour is expressed through
+//! additional traits:
+//!
+//! | Trait | Extends | Purpose |
+//! |---|---|---|
+//! | [`ReadOnlyCache`](traits::ReadOnlyCache) | — | `contains`, `len`, `capacity` (no side effects) |
+//! | [`CoreCache`](traits::CoreCache) | `ReadOnlyCache` | `insert`, `get`, `clear` |
+//! | [`MutableCache`](traits::MutableCache) | `CoreCache` | `remove` (not available on FIFO) |
+//! | [`FifoCacheTrait`](traits::FifoCacheTrait) | `CoreCache` | `pop_oldest`, `age_rank` |
+//! | [`LruCacheTrait`](traits::LruCacheTrait) | `MutableCache` | `pop_lru`, `touch`, `recency_rank` |
+//! | [`LfuCacheTrait`](traits::LfuCacheTrait) | `MutableCache` | `pop_lfu`, `frequency` |
+//! | [`LrukCacheTrait`](traits::LrukCacheTrait) | `MutableCache` | `pop_lru_k`, `k_distance` |
+//!
+//! Write generic code against the trait you need:
+//!
+//! ```
+//! use cachekit::traits::{CoreCache, ReadOnlyCache};
+//!
+//! fn utilization<K, V, C: ReadOnlyCache<K, V>>(cache: &C) -> f64 {
+//!     cache.len() as f64 / cache.capacity() as f64
+//! }
+//! ```
+//!
+//! # Eviction Policies
+//!
+//! Each policy is gated behind a `policy-*` feature flag. The `default` feature
+//! set enables S3-FIFO, LRU, Fast-LRU, LRU-K, and Clock. Enable `policy-all`
+//! for everything.
+//!
+//! | Policy | Feature | Eviction basis | Complexity | Best for |
+//! |---|---|---|---|---|
+//! | FIFO | `policy-fifo` | Insertion order | O(1) | Streaming, predictable eviction |
+//! | LRU | `policy-lru` | Recency | O(1) | Temporal locality |
+//! | Fast-LRU | `policy-fast-lru` | Recency (no Arc) | O(1) | Max single-threaded throughput |
+//! | LRU-K | `policy-lru-k` | K-th access time | O(1) | Scan resistance (databases) |
+//! | LFU | `policy-lfu` | Frequency (buckets) | O(1) | Stable hot spots |
+//! | Heap-LFU | `policy-heap-lfu` | Frequency (heap) | O(log n) | Large caches, frequent eviction |
+//! | 2Q | `policy-two-q` | Two-queue promotion | O(1) | Mixed workloads |
+//! | S3-FIFO | `policy-s3-fifo` | Three-queue FIFO | O(1) | CDN, scan-heavy workloads |
+//! | ARC | `policy-arc` | Adaptive recency/freq | O(1) | Unknown/changing workloads |
+//! | CAR | `policy-car` | Clock + ARC | O(1) | Adaptive with low overhead |
+//! | SLRU | `policy-slru` | Segmented LRU | O(1) | Buffer pools, scans |
+//! | Clock | `policy-clock` | Reference bit | O(1) amortised | Low-overhead LRU approx |
+//! | Clock-PRO | `policy-clock-pro` | Adaptive clock | O(1) amortised | Scan-resistant clock |
+//! | NRU | `policy-nru` | Not-recently-used bit | O(n) worst case | Small caches, coarse recency |
+//! | LIFO | `policy-lifo` | Reverse insertion | O(1) | Stack-like / undo buffers |
+//! | MRU | `policy-mru` | Most recent access | O(1) | Cyclic / sequential scans |
+//! | MFU | `policy-mfu` | Highest frequency | O(1) | Niche inverse-frequency |
+//! | Random | `policy-random` | Uniform random | O(1) | Baselines |
+//!
+//! # Feature Flags
+//!
+//! | Flag | Default | Description |
+//! |---|---|---|
+//! | `policy-s3-fifo` | yes | S3-FIFO policy |
+//! | `policy-lru` | yes | LRU policy |
+//! | `policy-fast-lru` | yes | Fast-LRU (no Arc wrapping) |
+//! | `policy-lru-k` | yes | LRU-K policy |
+//! | `policy-clock` | yes | Clock (second-chance) policy |
+//! | `policy-all` | no | Enable every policy |
+//! | `metrics` | no | Hit/miss counters, [`metrics::snapshot::CacheMetricsSnapshot`] |
+//! | `concurrency` | no | `parking_lot`-backed concurrent data structures |
+//!
+//! Disable defaults and cherry-pick for smaller builds:
+//!
+//! ```toml
+//! [dependencies]
+//! cachekit = { version = "0.4", default-features = false, features = ["policy-s3-fifo"] }
+//! ```
+//!
+//! # Data Structures (`ds`)
+//!
+//! The [`ds`] module exposes the building blocks used by policies:
+//!
+//! - [`ClockRing`](ds::ClockRing) — fixed-capacity ring buffer with reference bits
+//! - [`SlotArena`](ds::SlotArena) — index-addressed arena with O(1) alloc/free
+//! - [`IntrusiveList`](ds::IntrusiveList) — doubly-linked list with arena-backed nodes
+//! - [`FrequencyBuckets`](ds::FrequencyBuckets) — O(1) frequency counter buckets
+//! - [`GhostList`](ds::GhostList) — bounded evicted-key history
+//! - [`LazyMinHeap`](ds::LazyMinHeap) — lazy-deletion min-heap
+//! - [`FixedHistory`](ds::FixedHistory) — fixed-size circular access history
+//! - [`KeyInterner`](ds::KeyInterner) — deduplicating key storage
+//! - [`ShardSelector`](ds::ShardSelector) — deterministic shard routing
+//!
+//! All structures pre-allocate and reuse memory; none allocate on the hot path.
+//!
+//! # Thread Safety
+//!
+//! Individual caches are **not** thread-safe by default. Options:
+//!
+//! 1. Wrap in `Arc<RwLock<Cache>>` for coarse-grained sharing.
+//! 2. Enable the `concurrency` feature for `parking_lot`-backed concurrent
+//!    variants ([`ConcurrentClockRing`](ds::clock_ring::ConcurrentClockRing),
+//!    [`ConcurrentSlotArena`](ds::slot_arena::ConcurrentSlotArena), etc.).
+//! 3. Use the [`ConcurrentCache`](traits::ConcurrentCache) marker trait to
+//!    constrain generic code to thread-safe implementations.
+//!
+//! # Metrics
+//!
+//! Enable the `metrics` feature to get lightweight hit/miss/eviction counters.
+//! Detailed snapshots are available via
+//! [`CacheMetricsSnapshot`](metrics::snapshot::CacheMetricsSnapshot).
+//!
+//! # Error Handling
+//!
+//! Fallible constructors (e.g.
+//! [`S3FifoCache::try_with_ratios`](policy::s3_fifo::S3FifoCache::try_with_ratios))
+//! return [`ConfigError`](error::ConfigError) for invalid parameters. Debug-only
+//! invariant checks produce [`InvariantError`](error::InvariantError).
+//!
+//! # Choosing a Policy
+//!
+//! ```text
+//!                   ┌─ temporal locality? ──► LRU / Fast-LRU
+//!                   │
+//! What does your ───┼─ frequency matters? ──► LFU / Heap-LFU
+//! workload look     │
+//! like?             ├─ scan-heavy / mixed? ─► S3-FIFO / 2Q / LRU-K / ARC
+//!                   │
+//!                   ├─ unknown / changing? ──► ARC / Clock-PRO
+//!                   │
+//!                   └─ simple / streaming? ──► FIFO / Clock
+//! ```
 
 pub mod ds;
 pub mod error;
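The policy ↔ storage separation the new docs describe can be illustrated in isolation. The sketch below is a self-contained miniature, not the crate's actual implementation: the trait names mirror the documented hierarchy, but `FifoCache` and its internals are stand-ins. Storage (a `HashMap`) is kept apart from the eviction metadata (a `VecDeque` of keys), and the docs' `utilization` helper is reused as generic code written against the read-only trait.

```rust
use std::collections::{HashMap, VecDeque};
use std::hash::Hash;

// Read-only queries: no side effects on eviction metadata.
trait ReadOnlyCache<K, V> {
    fn contains(&self, key: &K) -> bool;
    fn len(&self) -> usize;
    fn capacity(&self) -> usize;
}

// Core mutation API layered on top of the read-only one.
trait CoreCache<K, V>: ReadOnlyCache<K, V> {
    fn insert(&mut self, key: K, value: V);
    fn get(&mut self, key: &K) -> Option<&V>;
}

// A toy FIFO policy: the store holds values, the queue holds
// only eviction-order metadata. Swapping the store for a slab
// or arena would not change the policy logic.
struct FifoCache<K, V> {
    capacity: usize,
    store: HashMap<K, V>,
    order: VecDeque<K>,
}

impl<K: Eq + Hash + Clone, V> FifoCache<K, V> {
    fn new(capacity: usize) -> Self {
        Self { capacity, store: HashMap::new(), order: VecDeque::new() }
    }
}

impl<K: Eq + Hash + Clone, V> ReadOnlyCache<K, V> for FifoCache<K, V> {
    fn contains(&self, key: &K) -> bool { self.store.contains_key(key) }
    fn len(&self) -> usize { self.store.len() }
    fn capacity(&self) -> usize { self.capacity }
}

impl<K: Eq + Hash + Clone, V> CoreCache<K, V> for FifoCache<K, V> {
    fn insert(&mut self, key: K, value: V) {
        if !self.store.contains_key(&key) {
            if self.store.len() == self.capacity {
                if let Some(oldest) = self.order.pop_front() {
                    self.store.remove(&oldest); // evict in insertion order
                }
            }
            self.order.push_back(key.clone());
        }
        self.store.insert(key, value);
    }
    fn get(&mut self, key: &K) -> Option<&V> { self.store.get(key) }
}

// Generic code against the trait, as in the docs' example.
fn utilization<K, V, C: ReadOnlyCache<K, V>>(cache: &C) -> f64 {
    cache.len() as f64 / cache.capacity() as f64
}

fn main() {
    let mut cache = FifoCache::new(2);
    cache.insert(1, "a");
    cache.insert(2, "b");
    cache.insert(3, "c"); // capacity reached: evicts key 1, the oldest
    assert!(!cache.contains(&1));
    assert_eq!(cache.get(&3), Some(&"c"));
    assert_eq!(utilization(&cache), 1.0);
    println!("ok");
}
```

The design point is that `FifoCache` could replace its `HashMap` with any storage backend without touching the trait impls' signatures, which is exactly the decoupling the module overview claims.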

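Option 1 under the new Thread Safety section needs no crate support at all. A minimal std-only sketch, using a plain `HashMap` as a stand-in for any of the crate's caches (a `cachekit` cache would be wrapped the same way):

```rust
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use std::thread;

fn main() {
    // Stand-in for a single-threaded cache; coarse-grained sharing
    // via Arc<RwLock<...>> works for any CoreCache-style type.
    let cache = Arc::new(RwLock::new(HashMap::<u64, String>::new()));

    let handles: Vec<_> = (0..4u64)
        .map(|i| {
            let cache = Arc::clone(&cache);
            thread::spawn(move || {
                // Writers take the exclusive lock; readers would use .read().
                cache.write().unwrap().insert(i, format!("value-{i}"));
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }

    let guard = cache.read().unwrap();
    assert_eq!(guard.len(), 4);
    assert_eq!(guard.get(&2).map(String::as_str), Some("value-2"));
    println!("ok");
}
```

The trade-off versus the `concurrency` feature is contention: one lock serializes all writers, which is fine for coarse sharing but not for the fine-grained concurrent variants the docs list.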