DynamicWhere.ex · v2.1.0 · docs

Cache & Optimization

DynamicWhere.ex caches every reflection lookup it performs — property metadata, validated property paths, collection element types — so that repeated queries never pay the reflection cost twice. The entire cache subsystem is thread-safe, fully configurable, and exposed through a single public class: CacheExpose.

Why a reflection cache?

Building dynamic LINQ expressions from JSON requires walking type metadata for every condition, sort, group, and projection. Reflection is expensive, but its results are stable for a given type and path. Caching turns the per-request reflection work into a one-time cost amortised across the lifetime of the process.

  • First request for Customer.Name resolves the path via reflection and stores the result.
  • Every subsequent request — across any thread — reads from a ConcurrentDictionary.
  • When a store grows past its configured ceiling, an eviction algorithm trims it back down.
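The first two steps follow the classic cache-aside pattern. A minimal self-contained sketch of that pattern using ConcurrentDictionary.GetOrAdd — the store layout and names here are illustrative, not the library's internals:

```csharp
using System;
using System.Collections.Concurrent;
using System.Reflection;

// Minimal sketch of the cache-aside pattern described above.
// Store layout and naming are illustrative, not DynamicWhere.ex internals.
static class PropertyCacheSketch
{
    // Key: (declaring type, property name). ValueTuple keys compare by value.
    private static readonly ConcurrentDictionary<(Type, string), PropertyInfo> Paths = new();

    public static PropertyInfo Resolve(Type type, string name) =>
        // The first call pays the reflection cost; every later call,
        // from any thread, is a plain dictionary read.
        Paths.GetOrAdd((type, name), key => key.Item1.GetProperty(key.Item2));
}

public class Customer { public string Name { get; set; } }

public static class Demo
{
    public static void Main()
    {
        var first  = PropertyCacheSketch.Resolve(typeof(Customer), "Name");
        var second = PropertyCacheSketch.Resolve(typeof(Customer), "Name");
        // Both calls return the same cached PropertyInfo instance.
        Console.WriteLine(ReferenceEquals(first, second)); // True
    }
}
```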

Thread-safety

Every cache store is a ConcurrentDictionary guarded by access-tracking metadata managed by CacheDatabase. Reads and writes from many threads are safe with no locking on the caller's side. Configuration changes via CacheExpose.Configure(...) are eventually consistent — see the configuration page for details.
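The "no locking on the caller's side" claim can be demonstrated with plain .NET, independent of the library: many threads racing GetOrAdd on one ConcurrentDictionary still leave exactly one value stored per key.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public static class ConcurrencyDemo
{
    public static void Main()
    {
        var cache = new ConcurrentDictionary<string, int>();
        var computes = 0;

        // 100 parallel callers race for the same key; no caller-side locks.
        Parallel.For(0, 100, _ =>
            cache.GetOrAdd("Customer.Name", key =>
            {
                // Under contention the value factory may run more than once,
                // but only one result is ever published for the key.
                Interlocked.Increment(ref computes);
                return key.Length;
            }));

        Console.WriteLine(cache["Customer.Name"]); // 13
        Console.WriteLine(cache.Count);            // 1
    }
}
```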

The three cache stores

Reflection results are split across three independent stores so that eviction in one does not invalidate the others. Each store has its own size limit, hit counters, and eviction state.

Store                  Key             Value                              Purpose
TypeProperties         Type            Dictionary<string, PropertyInfo>   All public instance properties per type.
PropertyPath           (Type, string)  string                             Validated & normalized property paths.
CollectionElementType  Type            Type?                              Element type for collection types.

Each store is addressable by name through the CacheMemoryType enum, which lets you target individual stores in the diagnostics API.

Three eviction strategies

When a store exceeds its MaxCacheSize, an eviction pass removes the least valuable entries. The selection algorithm is controlled by the CacheEvictionStrategy enum.

Strategy   What gets evicted
FIFO       Oldest entries first — order of insertion.
LRU        Least recently used — entries untouched the longest.
LFU        Least frequently used — entries with the lowest hit count.
Note
The default configuration is LRU with MaxCacheSize = 1000 per store. This works well for most applications. Tune it via presets or a custom CacheOptions object.
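The eviction pass can be pictured with a small self-contained LRU sketch. This is illustrative only — the library's CacheDatabase tracks access metadata internally, and its real stores are concurrent:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative LRU eviction: when the store exceeds MaxCacheSize, the
// entries untouched the longest are removed first. Not the library's code.
public class LruStoreSketch<TKey, TValue> where TKey : notnull
{
    private readonly Dictionary<TKey, TValue> _entries = new();
    private readonly Dictionary<TKey, long> _lastAccess = new();
    private long _clock;

    public int MaxCacheSize { get; set; } = 1000;

    public void Set(TKey key, TValue value)
    {
        _entries[key] = value;
        _lastAccess[key] = ++_clock;
        if (_entries.Count > MaxCacheSize) Evict();
    }

    public bool TryGet(TKey key, out TValue value)
    {
        if (_entries.TryGetValue(key, out value!))
        {
            _lastAccess[key] = ++_clock; // a hit refreshes the entry's recency
            return true;
        }
        return false;
    }

    private void Evict()
    {
        // Drop the least recently used entries until back at the ceiling.
        foreach (var key in _lastAccess.OrderBy(p => p.Value)
                                       .Take(_entries.Count - MaxCacheSize)
                                       .Select(p => p.Key)
                                       .ToList())
        {
            _entries.Remove(key);
            _lastAccess.Remove(key);
        }
    }
}
```

Swapping the ordering key for an insertion counter gives FIFO, and for a per-entry hit count gives LFU — which is the choice the CacheEvictionStrategy enum represents.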

Quick example

using DynamicWhere.ex.Optimization.Cache.Source;
using DynamicWhere.ex.Optimization.Cache.Config;

// Pick a preset that matches your environment
CacheExpose.Configure(CacheOptions.ForHighMemoryEnvironment());

// Optionally warm the cache at startup
CacheExpose.WarmupCache<Customer>("Name", "Email", "Address.City");

// Inspect at runtime
CacheStatistics stats = CacheExpose.GetCacheStatistics();
