echocache

package module
v1.2.1
Published: Feb 22, 2026 License: MIT Imports: 9 Imported by: 0

README

EchoCache

EchoCache is a generic, concurrency-safe Go caching library that prevents redundant computations under high load by combining a pluggable storage backend with singleflight deduplication: when many goroutines request the same missing key at the same time, only one refresh function is executed and the result is shared with all waiting callers.

Two caching modes are available:

Mode | API | Description
Synchronous | EchoCache.FetchWithCache | Callers block until the value is computed and cached.
Lazy / Stale-While-Revalidate | EchoCacheLazy.FetchWithLazyRefresh | A stale value is returned immediately; a background goroutine silently refreshes the entry.

Features

  • Generic – works with any Go type (string, structs, slices, …).
  • Singleflight deduplication – collapses concurrent misses for the same key into a single computation.
  • Stale-while-revalidate – serve stale data instantly and refresh in the background.
  • Pluggable backends – swap storage without changing application code:
    • In-memory LRU (NewLRUCache) – bounded, no expiry.
    • In-memory Expirable LRU (NewLRUExpirableCache) – bounded + per-entry TTL.
    • In-memory Single-entry (NewSingleCache) – single slot with TTL, zero allocations on hits.
    • Redis (NewRedisCache) – persistent, distributed, JSON serialisation.
    • NATS JetStream (NewNatsCache) – distributed KeyValue-backed cache.

Installation

go get github.com/logocomune/echocache

Requires Go 1.21+.

Quick Start

Synchronous cache (EchoCache)
package main

import (
    "context"
    "fmt"
    "time"

    "github.com/logocomune/echocache"
    "github.com/logocomune/echocache/store"
)

func main() {
    // Create a cache backed by a 128-entry in-memory LRU.
    cache := echocache.NewEchoCache[string](store.NewLRUCache[string](128))

    ctx := context.Background()
    key := "greeting"

    // First call: cache miss – the refresh function is executed.
    value, cached, err := cache.FetchWithCache(ctx, key, func(ctx context.Context) (string, error) {
        time.Sleep(50 * time.Millisecond) // simulate slow work
        return "Hello, World!", nil
    })
    fmt.Printf("value=%q cached=%v err=%v\n", value, cached, err)
    // Output: value="Hello, World!" cached=false err=<nil>

    // Second call: cache hit – refresh function is NOT executed.
    value, cached, err = cache.FetchWithCache(ctx, key, func(ctx context.Context) (string, error) {
        panic("should not be called")
    })
    fmt.Printf("value=%q cached=%v err=%v\n", value, cached, err)
    // Output: value="Hello, World!" cached=true err=<nil>
}
Concurrent deduplication

When many goroutines request the same key simultaneously, only one refresh is performed. All others wait and receive the same result.

package main

import (
    "context"
    "fmt"
    "sync"
    "time"

    "github.com/logocomune/echocache"
    "github.com/logocomune/echocache/store"
)

func main() {
    cache := echocache.NewEchoCache[string](store.NewLRUCache[string](128))
    ctx := context.Background()

    var wg sync.WaitGroup
    for i := 0; i < 10; i++ { // classic loop form: compatible with the Go 1.21+ requirement above
        wg.Add(1)
        go func(id int) {
            defer wg.Done()
            start := time.Now()
            val, _, _ := cache.FetchWithCache(ctx, "shared-key", func(ctx context.Context) (string, error) {
                time.Sleep(200 * time.Millisecond)
                return "computed once", nil
            })
            fmt.Printf("goroutine %d: %q elapsed=%v\n", id, val, time.Since(start).Round(time.Millisecond))
        }(i)
    }
    wg.Wait()
    // All goroutines complete in ~200 ms, not ~2000 ms.
}
Stale-while-revalidate (EchoCacheLazy)

Return the cached value immediately (even if stale) and refresh silently in the background.

package main

import (
    "context"
    "fmt"
    "time"

    "github.com/logocomune/echocache"
    "github.com/logocomune/echocache/store"
)

type Weather struct {
    City        string
    Temperature float64
}

func main() {
    // LRU backend – no TTL; the lazy refresh interval controls staleness.
    backend := store.NewStaleWhileRevalidateLRUCache[Weather](128)
    cache := echocache.NewLazyEchoCache[Weather](backend, 5*time.Second) // 5-second timeout for each background refresh
    defer cache.ShutdownLazyRefresh()

    ctx := context.Background()
    refresh := func(ctx context.Context) (Weather, error) {
        // Simulate a slow external API call.
        time.Sleep(300 * time.Millisecond)
        return Weather{City: "Rome", Temperature: 22.5}, nil
    }

    // First call: cold cache – blocks until the value is fetched (~300 ms).
    w, _, _ := cache.FetchWithLazyRefresh(ctx, "weather:rome", refresh, 10*time.Second)
    fmt.Printf("%s: %.1f°C\n", w.City, w.Temperature)

    time.Sleep(11 * time.Second) // wait until the 10-second refresh interval expires

    // Second call: returns the stale value INSTANTLY and queues a background refresh.
    w, _, _ = cache.FetchWithLazyRefresh(ctx, "weather:rome", refresh, 10*time.Second)
    fmt.Printf("%s: %.1f°C (stale – refresh enqueued)\n", w.City, w.Temperature)
}
Using a Redis backend
import (
    "time"

    "github.com/logocomune/echocache"
    "github.com/logocomune/echocache/store"
    "github.com/redis/go-redis/v9"
)

rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

// Simple cache
cache := echocache.NewEchoCache[string](
    store.NewRedisCache[string](rdb, "myapp", 5*time.Minute),
)

// Lazy (stale-while-revalidate) cache
lazyBackend := store.NewStaleWhileRevalidateRedisCache[string](rdb, "myapp", 5*time.Minute)
lazyCache := echocache.NewLazyEchoCache[string](lazyBackend, 10*time.Second)
defer lazyCache.ShutdownLazyRefresh()
Using a NATS JetStream backend
import (
    "context"

    "github.com/logocomune/echocache"
    "github.com/logocomune/echocache/store"
    "github.com/nats-io/nats.go"
    "github.com/nats-io/nats.go/jetstream"
)

nc, _ := nats.Connect(nats.DefaultURL)
js, _ := jetstream.New(nc)
kv, _ := js.CreateKeyValue(context.Background(), jetstream.KeyValueConfig{Bucket: "mycache"})

cache := echocache.NewEchoCache[string](store.NewNatsCache[string](kv, "prefix"))

Storage backends reference

Constructor | Interface | Description
store.NewLRUCache[T](size) | Cacher[T] | Bounded in-memory LRU, no expiry
store.NewLRUExpirableCache[T](size, ttl) | Cacher[T] | Bounded in-memory LRU with per-entry TTL
store.NewSingleCache[T](ttl) | Cacher[T] | Single-entry in-memory cache with TTL
store.NewRedisCache[T](client, prefix, ttl) | Cacher[T] | Redis-backed, JSON serialisation
store.NewNatsCache[T](kv, prefix) | Cacher[T] | NATS JetStream KeyValue-backed
store.NewStaleWhileRevalidateLRUCache[T](size) | StaleWhileRevalidateCache[T] | LRU variant for lazy caching
store.NewStaleWhileRevalidateExpiringLRUCache[T](size, ttl) | StaleWhileRevalidateCache[T] | Expirable LRU variant for lazy caching
store.NewStaleWhileRevalidateSingleCache[T](ttl) | StaleWhileRevalidateCache[T] | Single-entry variant for lazy caching
store.NewStaleWhileRevalidateRedisCache[T](client, prefix, ttl) | StaleWhileRevalidateCache[T] | Redis variant for lazy caching
store.NewStaleWhileRevalidateNatsCache[T](kv, prefix) | StaleWhileRevalidateCache[T] | NATS variant for lazy caching
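The single-entry backend is not shown in Quick Start; it wires up the same way as the others. A minimal fragment in the style of the Redis example above (the 30-second TTL is illustrative):

```go
import (
    "time"

    "github.com/logocomune/echocache"
    "github.com/logocomune/echocache/store"
)

// One slot with a 30-second TTL – suited to a single expensive value
// such as a configuration blob or a health snapshot.
cache := echocache.NewEchoCache[string](store.NewSingleCache[string](30 * time.Second))
```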

Benchmarks

Run with:

go test ./... -bench=. -benchmem

Representative results on an AMD Ryzen 7 5800H (16 threads):

BenchmarkFetchWithCache_CacheHit-16              43965662    53.5 ns/op    0 B/op    0 allocs/op
BenchmarkFetchWithCache_CacheMiss-16              2335598   958.4 ns/op  319 B/op    5 allocs/op
BenchmarkFetchWithCache_Concurrent-16            46195622    56.7 ns/op    0 B/op    0 allocs/op
BenchmarkFetchWithLazyRefresh_CacheHit-16        15728220   159.3 ns/op    0 B/op    0 allocs/op
BenchmarkFetchWithLazyRefresh_CacheHitStale-16   24293737    99.0 ns/op    0 B/op    0 allocs/op

BenchmarkLRUCache_Get-16                         49176892    48.9 ns/op    0 B/op    0 allocs/op
BenchmarkLRUCache_Set-16                         44030736    56.8 ns/op    0 B/op    0 allocs/op
BenchmarkLRUExpirableCache_Get-16                26374305    86.9 ns/op    0 B/op    0 allocs/op
BenchmarkLRUExpirableCache_Set-16                20437982   117.7 ns/op    0 B/op    0 allocs/op
BenchmarkSingleCache_Get-16                      87536472    28.2 ns/op    0 B/op    0 allocs/op
BenchmarkSingleCache_Set-16                      26237246    93.0 ns/op    0 B/op    0 allocs/op

License

Distributed under the MIT license.

Documentation

Overview

Package echocache provides a generic, concurrency-safe caching layer built on top of pluggable storage backends. It prevents redundant computations under high concurrency by using singleflight deduplication: when multiple goroutines request the same missing key simultaneously, only one refresh function is executed and the result is shared.

Two caching modes are available:

  • EchoCache – synchronous: the caller blocks until the value is computed and cached.
  • EchoCacheLazy – lazy/stale-while-revalidate: a stale value is returned immediately while a background goroutine silently refreshes the cache entry.

Storage backends live in the github.com/logocomune/echocache/store sub-package and include in-memory LRU, expirable LRU, single-entry, Redis, and NATS JetStream implementations.

Index

Constants

View Source
const (
	NeverExpire = time.Hour * 24 * 365 * 100
)

NeverExpire is a duration of 100 years, used to denote a value that should effectively never expire.
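Any backend that takes a TTL can be made effectively permanent by passing this constant; for example, with the expirable LRU constructor from the store package:

```go
// Entries are evicted only by LRU pressure, never by TTL.
backend := store.NewLRUExpirableCache[string](1024, echocache.NeverExpire)
cache := echocache.NewEchoCache[string](backend)
```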

Variables

This section is empty.

Functions

This section is empty.

Types

type EchoCache

type EchoCache[T any] struct {
	// contains filtered or unexported fields
}

EchoCache is a generic caching mechanism that integrates singleflight to prevent redundant computations. It stores and retrieves data through a Cacher interface, invokes a custom refresh function on cache misses, and ensures that only one computation per key runs at a time under concurrent access.

func NewEchoCache added in v1.2.0

func NewEchoCache[T any](cacher store.Cacher[T]) *EchoCache[T]

NewEchoCache creates a new EchoCache instance that caches values and deduplicates concurrent requests for the same key via singleflight.

func (*EchoCache[T]) FetchWithCache added in v1.2.0

func (ec *EchoCache[T]) FetchWithCache(ctx context.Context, key string, refreshFn store.RefreshFunc[T]) (T, bool, error)

FetchWithCache retrieves a cached value by key or computes it using the given refresh function, caching the result for future use. It returns the value, a boolean that is true when the value was served from the cache (false when it was freshly computed), and an error if retrieval or computation fails.

type EchoCacheLazy added in v1.2.0

type EchoCacheLazy[T any] struct {
	// contains filtered or unexported fields
}

EchoCacheLazy is a lazy-refresh cache implementing the stale-while-revalidate pattern. It returns a cached (possibly stale) value immediately and enqueues a background refresh for keys whose value has aged past the configured interval.

Concurrency guarantees:

  • singleflight deduplicates concurrent refreshes within the same process.
  • TryAcquireRefreshLock / ReleaseRefreshLock (delegated to the backend) prevent thundering-herd across multiple processes / instances.
  • A per-key pending-task map (pendingKeys) ensures at most one refresh task is queued per key, protecting the queue from duplicate entries.
  • ShutdownLazyRefresh waits for all in-flight background tasks to complete before returning (graceful shutdown via sync.WaitGroup).

func NewLazyEchoCache added in v1.2.0

func NewLazyEchoCache[T any](cacher store.StaleWhileRevalidateCache[T], refreshTimeout time.Duration, opts ...LazyEchoCacheOption) *EchoCacheLazy[T]

NewLazyEchoCache initialises a lazy echo cache with the given backend and refresh timeout. Background refresh workers are started immediately. Optional LazyEchoCacheOption values can tune queue size and worker count.

func (*EchoCacheLazy[T]) FetchWithLazyRefresh added in v1.2.0

func (ec *EchoCacheLazy[T]) FetchWithLazyRefresh(ctx context.Context, key string, refreshFn store.RefreshFunc[T], lazyRefreshInterval time.Duration) (T, bool, error)

FetchWithLazyRefresh retrieves a cached value or computes a new value if missing, scheduling a lazy background refresh when the cached value is stale.

If the cached value exists but is older than lazyRefreshInterval, a refresh task is enqueued for background processing (at most one task per key). If the value is missing or an error occurs, refreshFn is called synchronously using the caller's context deadline.

func (*EchoCacheLazy[T]) ShutdownLazyRefresh added in v1.2.0

func (ec *EchoCacheLazy[T]) ShutdownLazyRefresh()

ShutdownLazyRefresh gracefully stops the refresh workers:

  1. Cancels the internal context so workers stop accepting new tasks.
  2. Closes the task queue (idempotent via sync.Once).
  3. Waits for all in-flight worker goroutines to finish.

It is safe to call ShutdownLazyRefresh multiple times; only the first call triggers the shutdown sequence, and subsequent calls are no-ops.

type LazyEchoCacheOption added in v1.2.1

type LazyEchoCacheOption func(*lazyEchoCacheOptions)

LazyEchoCacheOption is a functional option for NewLazyEchoCache.

func WithQueueSize added in v1.2.1

func WithQueueSize(n int) LazyEchoCacheOption

WithQueueSize sets the capacity of the background-refresh task queue. The default is 1000. Values ≤ 0 are ignored.

func WithWorkerCount added in v1.2.1

func WithWorkerCount(n int) LazyEchoCacheOption

WithWorkerCount sets the number of goroutines that drain the background-refresh task queue concurrently. The default is 1. Values ≤ 0 are ignored.
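Putting the two options together with NewLazyEchoCache (the backend is any StaleWhileRevalidateCache created as in the examples above; the numeric values are illustrative):

```go
cache := echocache.NewLazyEchoCache[string](
    backend,
    5*time.Second, // refresh timeout
    echocache.WithQueueSize(5000),
    echocache.WithWorkerCount(4),
)
defer cache.ShutdownLazyRefresh()
```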

Directories

Path | Synopsis
store | Package store provides pluggable cache backend implementations for use with EchoCache and EchoCacheLazy.
