Cache key derivation
Each entry is identified by a key derived from the tool invocation.
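A purely illustrative sketch of one such derivation, assuming the key combines the tool name with a canonical (key-sorted) serialization of the call's inputs; the function name, hashing, and serialization scheme are assumptions, not the documented implementation:

```typescript
import { createHash } from "node:crypto";

// Hypothetical key derivation: hash the tool name plus a canonical
// (key-sorted) JSON serialization of the inputs so that identical calls
// map to the same cache entry regardless of property order.
function deriveCacheKey(toolName: string, inputs: Record<string, unknown>): string {
  const canonical = JSON.stringify(inputs, (_key, value) =>
    value !== null && typeof value === "object" && !Array.isArray(value)
      ? Object.fromEntries(Object.entries(value).sort(([a], [b]) => a.localeCompare(b)))
      : value,
  );
  return createHash("sha256").update(`${toolName}:${canonical}`).digest("hex");
}

// Example: same tool, same inputs, same key.
deriveCacheKey("lookup_customer", { region: "eu", id: 42 });
```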
Configuration

Caching is controlled at two levels: global defaults in the root config and per-tool overrides.

Global defaults
Set default caching in `.hyperterse`:
| Field | Type | Default | Description |
|---|---|---|---|
| `enabled` | boolean | `false` | Whether caching is active for all DB-backed tools. |
| `ttl` | integer | `120` | Time-to-live in seconds. |
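As an illustration of these fields (the TypeScript shape below is an assumption for clarity, not the `.hyperterse` format itself):

```typescript
// Illustrative shape for the global cache settings; field names and
// defaults are taken from the table above.
interface CacheSettings {
  /** Whether caching is active for all DB-backed tools. Default: false. */
  enabled: boolean;
  /** Time-to-live in seconds. Default: 120. */
  ttl: number;
}

const runtimeDefaults: CacheSettings = { enabled: false, ttl: 120 };
```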
Per-tool override
Precedence
- Tool-level config (highest priority).
- Global root config (`tools.cache`).
- Runtime defaults (`enabled: false`, `ttl: 120`).
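A minimal sketch of this resolution order, reusing the settings shape from the earlier sketch; the function and parameter names are illustrative, not part of the actual executor:

```typescript
// Same two-field shape as in the earlier sketch.
type CacheSettings = { enabled: boolean; ttl: number };

const RUNTIME_DEFAULTS: CacheSettings = { enabled: false, ttl: 120 };

// Later spreads win, so per-tool settings override the global tools.cache
// block, which in turn overrides the runtime defaults.
function resolveCacheSettings(
  globalConfig?: Partial<CacheSettings>, // from tools.cache in the root config
  toolConfig?: Partial<CacheSettings>,   // per-tool override
): CacheSettings {
  return { ...RUNTIME_DEFAULTS, ...globalConfig, ...toolConfig };
}

// Example: the global config enables caching with a 300 s TTL; one tool
// overrides the TTL to 30 s.
const effective = resolveCacheSettings({ enabled: true, ttl: 300 }, { ttl: 30 });
// effective => { enabled: true, ttl: 30 }
```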
Behavior
The cache operates as a read-through layer in front of connectors and handler scripts.

On execution
- Before connector execution, the executor checks the cache using the derived key.
- Hit: cached result returned immediately; connector is not called.
- Miss: connector executes; result stored with configured TTL before being returned.
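A minimal sketch of this read-through flow, assuming a plain in-memory map and a connector callback; the names are illustrative, not the executor's actual API:

```typescript
// Illustrative read-through wrapper: return a cached result on a hit,
// otherwise run the connector and store its result with the configured TTL.
type Entry = { value: unknown; expiresAt: number };
const cache = new Map<string, Entry>();

async function executeWithCache(
  key: string,                           // the derived cache key
  ttlSeconds: number,
  runConnector: () => Promise<unknown>,
): Promise<unknown> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    // Hit: the connector is not called. Cloning on retrieval mirrors the
    // "results are cloned" characteristic noted later in this section.
    return structuredClone(hit.value);
  }
  // Miss (or expired entry): execute the connector, store, then return.
  const result = await runConnector();
  cache.set(key, {
    value: structuredClone(result),
    expiresAt: Date.now() + ttlSeconds * 1000,
  });
  return result;
}
```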
Eviction
- TTL-based. Entries expire after their configured TTL.
- Capacity-based. The cache enforces memory bounds (128 MiB default). When approaching capacity, least-recently-used entries are evicted.
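The sketch below shows the general shape of combined TTL and LRU eviction; for simplicity it bounds the cache by entry count, whereas the real cache enforces a memory bound (128 MiB by default):

```typescript
// Illustrative TTL + LRU eviction. A Map preserves insertion order, so
// re-inserting on access keeps the most recently used entries at the end
// and lets us evict from the front when the capacity bound is exceeded.
type Entry = { value: unknown; expiresAt: number };

class TtlLruCache {
  private entries = new Map<string, Entry>();
  constructor(private maxEntries: number) {}

  get(key: string): unknown | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt <= Date.now()) {
      this.entries.delete(key); // TTL-based eviction on expiry
      return undefined;
    }
    // Mark as most recently used by moving the entry to the end.
    this.entries.delete(key);
    this.entries.set(key, entry);
    return entry.value;
  }

  set(key: string, value: unknown, ttlSeconds: number): void {
    this.entries.delete(key);
    this.entries.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
    // Capacity-based eviction: drop least-recently-used entries first.
    while (this.entries.size > this.maxEntries) {
      const oldestKey = this.entries.keys().next().value as string;
      this.entries.delete(oldestKey);
    }
  }
}
```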
No explicit invalidation
There is no API for manual cache invalidation. Entries are evicted only by TTL or capacity pressure. For immediate invalidation, disable caching on affected tools and manage the cache externally (e.g., through a Redis adapter with handler logic).
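For example, with caching disabled on the tool, a handler could cache results in Redis itself and delete keys when the underlying data changes. The sketch below uses the node-redis client; the handler shape, key naming, and TTL are assumptions, not a documented API:

```typescript
import { createClient } from "redis";

// Illustrative external caching inside a handler, with explicit invalidation.
// Assumes built-in caching is disabled for this tool.
const redis = createClient();
await redis.connect();

// Hypothetical read handler: serve from Redis when possible, otherwise
// query the database and cache the result with a TTL.
async function getCustomer(id: string, queryDb: (id: string) => Promise<unknown>) {
  const key = `customer:${id}`;
  const cached = await redis.get(key);
  if (cached !== null) return JSON.parse(cached);
  const row = await queryDb(id);
  await redis.set(key, JSON.stringify(row), { EX: 120 }); // TTL in seconds
  return row;
}

// Explicit invalidation after a write, which the built-in cache cannot do.
async function updateCustomer(id: string, writeDb: (id: string) => Promise<void>) {
  await writeDb(id);
  await redis.del(`customer:${id}`);
}
```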
Characteristics

| Property | Value |
|---|---|
| Scope | Process-local. Not shared between instances. |
| Storage | In-memory. No persistence across restarts. |
| Thread safety | Concurrent-safe. Reads and writes do not block execution. |
| Serialization | Results are cloned on store and retrieval to prevent mutation. |
| Distributed | None. Each instance maintains its own cache. |
When to enable caching
Enable for:

- Read-heavy tools with stable data (reference tables, configuration lookups).
- Expensive queries where staleness within the TTL window is acceptable.
- High-frequency tools where connector load reduction matters.
Avoid for:

- Write operations or tools that must return real-time data.
- Highly variable inputs where cache fill exceeds hit rate.
- Tools where result freshness is critical for correctness.