mirror of https://github.com/comfyanonymous/ComfyUI.git
synced 2026-03-13 00:59:59 +00:00
* feat: Add CacheProvider API for external distributed caching

  Introduces a public API for external cache providers, enabling distributed
  caching across multiple ComfyUI instances (e.g., Kubernetes pods).

  New files:
  - comfy_execution/cache_provider.py: CacheProvider ABC, CacheContext/CacheValue
    dataclasses, thread-safe provider registry, serialization utilities

  Modified files:
  - comfy_execution/caching.py: Add provider hooks to BasicCache
    (_notify_providers_store, _check_providers_lookup), subcache exclusion,
    prompt ID propagation
  - execution.py: Add prompt lifecycle hooks (on_prompt_start/on_prompt_end)
    to PromptExecutor, set _current_prompt_id on caches

  Key features:
  - Local-first caching (check local before external for performance)
  - NaN detection to prevent incorrect external cache hits
  - Subcache exclusion (ephemeral subgraph results not cached externally)
  - Thread-safe provider snapshot caching
  - Graceful error handling (provider errors logged, never break execution)

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude <noreply@anthropic.com>

* fix: use deterministic hash for cache keys instead of pickle

  Pickle serialization is NOT deterministic across Python sessions due to hash
  randomization affecting frozenset iteration order. This causes distributed
  caching to fail because different pods compute different hashes for identical
  cache keys.

  Fix: Use _canonicalize() + JSON serialization, which ensures deterministic
  ordering regardless of Python's hash randomization. This is critical for
  cross-pod cache key consistency in Kubernetes deployments.

* test: add unit tests for CacheProvider API

  - Add comprehensive tests for _canonicalize deterministic ordering
  - Add tests for serialize_cache_key hash consistency
  - Add tests for contains_nan utility
  - Add tests for estimate_value_size
  - Add tests for provider registry (register, unregister, clear)
  - Move json import to top level (fix inline import)

* style: remove unused imports in test_cache_provider.py

* fix: move _torch_available before usage and use importlib.util.find_spec

  Fixes ruff F821 (undefined name) and F401 (unused import) errors.

* fix: use hashable types in frozenset test and add dict test

  Frozensets can only contain hashable types, so use nested frozensets instead
  of dicts. Added a separate test for dict handling via serialize_cache_key.

* refactor: expose CacheProvider API via comfy_api.latest.Caching

  - Add Caching class to comfy_api/latest/__init__.py that re-exports from
    comfy_execution.cache_provider (source of truth)
  - Fix docstring: "Skip large values" instead of "Skip small values"
    (small compute-heavy values are good cache targets)
  - Maintain backward compatibility: comfy_execution.cache_provider imports
    still work

  Usage:

      from comfy_api.latest import Caching

      class MyProvider(Caching.CacheProvider):
          def on_lookup(self, context): ...
          def on_store(self, context, value): ...

      Caching.register_provider(MyProvider())

* docs: clarify should_cache filtering criteria

  Change docstring from "Skip large values" to "Skip if download time >
  compute time", which better captures the cost/benefit tradeoff for external
  caching.

* docs: make should_cache docstring implementation-agnostic

  Remove prescriptive filtering suggestions; let implementations decide their
  own caching logic based on their use case.

* feat: add optional ui field to CacheValue

  - Add ui field to CacheValue dataclass (default None)
  - Pass ui when creating CacheValue for external providers
  - Use result.ui (or default {}) when returning from external cache lookup

  This allows external cache implementations to store/retrieve UI data if
  desired, while remaining optional for implementations that skip it.

* refactor: rename _is_cacheable_value to _is_external_cacheable_value

  Clearer name since objects are also cached locally; this specifically checks
  for external caching eligibility.

* refactor: async CacheProvider API + reduce public surface

  - Make on_lookup/on_store async on the CacheProvider ABC
  - Simplify CacheContext: replace cache_key + cache_key_bytes with
    cache_key_hash (str hex digest)
  - Make registry/utility functions internal (_ prefix)
  - Trim comfy_api.latest.Caching exports to the core API only
  - Make cache get/set async throughout the caching.py hierarchy
  - Use asyncio.create_task for fire-and-forget on_store
  - Add NaN gating before provider calls in Core
  - Add await to 5 cache call sites in execution.py

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: remove unused imports (ruff) and update tests for internal API

  - Remove unused CacheContext and _serialize_cache_key imports from caching.py
    (now handled by the _build_context helper)
  - Update test_cache_provider.py to use _-prefixed internal names
  - Update tests for the new CacheContext.cache_key_hash field (str)
  - Make MockCacheProvider methods async to match the ABC

* fix: address coderabbit review feedback

  - Add try/except to _build_context; return None when hashing fails
  - Return None from _serialize_cache_key on total failure (no id()-based
    fallback)
  - Replace hex-like test literal with a non-secret placeholder

* fix: use _-prefixed imports in _notify_prompt_lifecycle

  The lifecycle notification method was importing the old non-prefixed names
  (has_cache_providers, get_cache_providers, logger), which no longer exist
  after the API cleanup.

* fix: add sync get_local/set_local for graph traversal

  ExecutionList in graph.py calls output_cache.get() and .set() from sync
  methods (is_cached, cache_link, get_cache). These cannot await the now-async
  get/set. Add get_local/set_local that bypass external providers and only
  access the local dict, which is all graph traversal needs.

* chore: remove cloud-specific language from cache provider API

  Make all docstrings and comments generic for the OSS codebase. Remove
  references to Kubernetes, Redis, GCS, pods, and other infrastructure-specific
  terminology.

* style: align documentation with codebase conventions

  Strip verbose docstrings and section banners to match the existing minimal
  documentation style used throughout the codebase.

* fix: add usage example to Caching class, remove pickle fallback

  - Add docstring with usage example to the Caching class, matching the
    convention used by sibling APIs (Execution.set_progress, ComfyExtension)
  - Remove the non-deterministic pickle fallback from _serialize_cache_key;
    return None on JSON failure instead of producing unretrievable hashes
  - Move cache_provider imports to the top of execution.py (no circular dep)

* refactor: move public types to comfy_api, eager provider snapshot

  Address review feedback:
  - Move CacheProvider/CacheContext/CacheValue definitions to
    comfy_api/latest/_caching.py (source of truth for the public API)
  - comfy_execution/cache_provider.py re-exports types from there
  - Build _providers_snapshot eagerly on register/unregister instead of lazy
    memoization in _get_cache_providers

* fix: generalize self-inequality check, fail-closed canonicalization

  Address review feedback from guill:
  - Rename _contains_nan to _contains_self_unequal; use not (x == x) instead
    of math.isnan to catch any self-unequal value
  - Remove Unhashable and repr() fallbacks from _canonicalize; raise ValueError
    for unknown types so _serialize_cache_key returns None and external caching
    is skipped (fail-closed)
  - Update tests for the renamed function and new fail-closed behavior

* fix: suppress ruff F401 for re-exported CacheContext

  CacheContext is imported from _caching and re-exported for use by caching.py.
  Add a noqa comment to satisfy the linter.

* fix: enable external caching for subcache (expanded) nodes

  Subcache nodes (from node expansion) now participate in external provider
  store/lookup. They were previously skipped to avoid duplicates, but the cost
  of missing partial-expansion cache hits outweighs redundant stores,
  especially with looping behavior on the horizon.

* fix: wrap register/unregister as explicit static methods

  Define register_provider and unregister_provider as wrapper functions in the
  Caching class instead of re-importing. This locks the public API signature in
  comfy_api/ so internal changes can't accidentally break it.

* fix: use debug-level logging for provider registration

* fix: follow ProxiedSingleton pattern for Caching class

  Add Caching as a nested class inside ComfyAPI_latest inheriting from
  ProxiedSingleton with async instance methods, matching the Execution and
  NodeReplacement patterns. Retain the standalone Caching class for direct
  import convenience.

* fix: inline registration logic in Caching class

  Follow the Execution/NodeReplacement pattern: the public API methods contain
  the actual logic operating on cache_provider module state, not wrapper
  functions delegating to free functions.

* fix: single Caching definition inside ComfyAPI_latest

  Remove the duplicate standalone Caching class. Define it once as a nested
  class in ComfyAPI_latest (matching the Execution/NodeReplacement pattern),
  with a module-level alias for import convenience.

* fix: remove prompt_id from CacheContext, type-safe canonicalization

  Remove prompt_id from CacheContext: it's not relevant for cache matching and
  added unnecessary plumbing (_current_prompt_id on every cache). Lifecycle
  hooks still receive prompt_id directly.

  Include the type name in canonicalized primitives so that int 7 and str "7"
  produce distinct hashes. Also canonicalize dict keys properly instead of
  str() coercion.

* fix: address review feedback on cache provider API

  - Hold references to pending store tasks to prevent "Task was destroyed but
    it is still pending" warnings (bigcat88)
  - Parallel cache lookups with asyncio.gather instead of sequential awaits
    for better performance (bigcat88)
  - Delegate Caching.register/unregister_provider to the existing functions in
    cache_provider.py instead of reimplementing them (bigcat88)

---------
Co-authored-by: Claude <noreply@anthropic.com>
404 lines
13 KiB
Python
"""Tests for external cache provider API."""
|
|
|
|
import importlib.util
|
|
import pytest
|
|
from typing import Optional
|
|
|
|
|
|
def _torch_available() -> bool:
|
|
"""Check if PyTorch is available."""
|
|
return importlib.util.find_spec("torch") is not None
|
|
|
|
|
|
from comfy_execution.cache_provider import (
|
|
CacheProvider,
|
|
CacheContext,
|
|
CacheValue,
|
|
register_cache_provider,
|
|
unregister_cache_provider,
|
|
_get_cache_providers,
|
|
_has_cache_providers,
|
|
_clear_cache_providers,
|
|
_serialize_cache_key,
|
|
_contains_self_unequal,
|
|
_estimate_value_size,
|
|
_canonicalize,
|
|
)
|
|
|
|
|
|
class TestCanonicalize:
    """Test _canonicalize function for deterministic ordering."""

    def test_frozenset_ordering_is_deterministic(self):
        """Frozensets should produce consistent canonical form regardless of iteration order."""
        # Create two frozensets with same content
        fs1 = frozenset([("a", 1), ("b", 2), ("c", 3)])
        fs2 = frozenset([("c", 3), ("a", 1), ("b", 2)])

        result1 = _canonicalize(fs1)
        result2 = _canonicalize(fs2)

        assert result1 == result2

    def test_nested_frozenset_ordering(self):
        """Nested frozensets should also be deterministically ordered."""
        inner1 = frozenset([1, 2, 3])
        inner2 = frozenset([3, 2, 1])

        fs1 = frozenset([("key", inner1)])
        fs2 = frozenset([("key", inner2)])

        result1 = _canonicalize(fs1)
        result2 = _canonicalize(fs2)

        assert result1 == result2

    def test_dict_ordering(self):
        """Dicts should be sorted by key."""
        d1 = {"z": 1, "a": 2, "m": 3}
        d2 = {"a": 2, "m": 3, "z": 1}

        result1 = _canonicalize(d1)
        result2 = _canonicalize(d2)

        assert result1 == result2

    def test_tuple_preserved(self):
        """Tuples should be marked and preserved."""
        t = (1, 2, 3)
        result = _canonicalize(t)

        assert result[0] == "__tuple__"

    def test_list_preserved(self):
        """Lists should be recursively canonicalized."""
        lst = [{"b": 2, "a": 1}, frozenset([3, 2, 1])]
        result = _canonicalize(lst)

        # First element should be canonicalized dict
        assert "__dict__" in result[0]
        # Second element should be canonicalized frozenset
        assert result[1][0] == "__frozenset__"

    def test_primitives_include_type(self):
        """Primitive types should include type name for disambiguation."""
        assert _canonicalize(42) == ("int", 42)
        assert _canonicalize(3.14) == ("float", 3.14)
        assert _canonicalize("hello") == ("str", "hello")
        assert _canonicalize(True) == ("bool", True)
        assert _canonicalize(None) == ("NoneType", None)

    def test_int_and_str_distinguished(self):
        """int 7 and str '7' must produce different canonical forms."""
        assert _canonicalize(7) != _canonicalize("7")

    def test_bytes_converted(self):
        """Bytes should be converted to hex string."""
        b = b"\x00\xff"
        result = _canonicalize(b)

        assert result[0] == "__bytes__"
        assert result[1] == "00ff"

    def test_set_ordering(self):
        """Sets should be sorted like frozensets."""
        s1 = {3, 1, 2}
        s2 = {1, 2, 3}

        result1 = _canonicalize(s1)
        result2 = _canonicalize(s2)

        assert result1 == result2
        assert result1[0] == "__set__"

    def test_unknown_type_raises(self):
        """Unknown types should raise ValueError (fail-closed)."""
        class CustomObj:
            pass

        with pytest.raises(ValueError):
            _canonicalize(CustomObj())

    def test_object_with_value_attr_raises(self):
        """Objects with .value attribute (Unhashable-like) should raise ValueError."""
        class FakeUnhashable:
            def __init__(self):
                self.value = float('nan')

        with pytest.raises(ValueError):
            _canonicalize(FakeUnhashable())

class TestSerializeCacheKey:
    """Test _serialize_cache_key for deterministic hashing."""

    def test_same_content_same_hash(self):
        """Same content should produce same hash."""
        key1 = frozenset([("node_1", frozenset([("input", "value")]))])
        key2 = frozenset([("node_1", frozenset([("input", "value")]))])

        hash1 = _serialize_cache_key(key1)
        hash2 = _serialize_cache_key(key2)

        assert hash1 == hash2

    def test_different_content_different_hash(self):
        """Different content should produce different hash."""
        key1 = frozenset([("node_1", "value_a")])
        key2 = frozenset([("node_1", "value_b")])

        hash1 = _serialize_cache_key(key1)
        hash2 = _serialize_cache_key(key2)

        assert hash1 != hash2

    def test_returns_hex_string(self):
        """Should return hex string (SHA256 hex digest)."""
        key = frozenset([("test", 123)])
        result = _serialize_cache_key(key)

        assert isinstance(result, str)
        assert len(result) == 64  # SHA256 hex digest is 64 chars

    def test_complex_nested_structure(self):
        """Complex nested structures should hash deterministically."""
        # Note: frozensets can only contain hashable types, so we use
        # nested frozensets of tuples to represent dict-like structures
        key = frozenset([
            ("node_1", frozenset([
                ("input_a", ("tuple", "value")),
                ("input_b", frozenset([("nested", "dict")])),
            ])),
            ("node_2", frozenset([
                ("param", 42),
            ])),
        ])

        # Hash twice to verify determinism
        hash1 = _serialize_cache_key(key)
        hash2 = _serialize_cache_key(key)

        assert hash1 == hash2

    def test_dict_in_cache_key(self):
        """Dicts passed directly to _serialize_cache_key should work."""
        key = {"node_1": {"input": "value"}, "node_2": 42}

        hash1 = _serialize_cache_key(key)
        hash2 = _serialize_cache_key(key)

        assert hash1 == hash2
        assert isinstance(hash1, str)
        assert len(hash1) == 64

    def test_unknown_type_returns_none(self):
        """Non-cacheable types should return None (fail-closed)."""
        class CustomObj:
            pass

        assert _serialize_cache_key(CustomObj()) is None

class TestContainsSelfUnequal:
    """Test _contains_self_unequal utility function."""

    def test_nan_float_detected(self):
        """NaN floats should be detected (not equal to itself)."""
        assert _contains_self_unequal(float('nan')) is True

    def test_regular_float_not_detected(self):
        """Regular floats are equal to themselves."""
        assert _contains_self_unequal(3.14) is False
        assert _contains_self_unequal(0.0) is False
        assert _contains_self_unequal(-1.5) is False

    def test_infinity_not_detected(self):
        """Infinity is equal to itself."""
        assert _contains_self_unequal(float('inf')) is False
        assert _contains_self_unequal(float('-inf')) is False

    def test_nan_in_list(self):
        """NaN in list should be detected."""
        assert _contains_self_unequal([1, 2, float('nan'), 4]) is True
        assert _contains_self_unequal([1, 2, 3, 4]) is False

    def test_nan_in_tuple(self):
        """NaN in tuple should be detected."""
        assert _contains_self_unequal((1, float('nan'))) is True
        assert _contains_self_unequal((1, 2, 3)) is False

    def test_nan_in_frozenset(self):
        """NaN in frozenset should be detected."""
        assert _contains_self_unequal(frozenset([1, float('nan')])) is True
        assert _contains_self_unequal(frozenset([1, 2, 3])) is False

    def test_nan_in_dict_value(self):
        """NaN in dict value should be detected."""
        assert _contains_self_unequal({"key": float('nan')}) is True
        assert _contains_self_unequal({"key": 42}) is False

    def test_nan_in_nested_structure(self):
        """NaN in deeply nested structure should be detected."""
        nested = {"level1": [{"level2": (1, 2, float('nan'))}]}
        assert _contains_self_unequal(nested) is True

    def test_non_numeric_types(self):
        """Non-numeric types should not be self-unequal."""
        assert _contains_self_unequal("string") is False
        assert _contains_self_unequal(None) is False
        assert _contains_self_unequal(True) is False

    def test_object_with_nan_value_attribute(self):
        """Objects wrapping NaN in .value should be detected."""
        class NanWrapper:
            def __init__(self):
                self.value = float('nan')

        assert _contains_self_unequal(NanWrapper()) is True

    def test_custom_self_unequal_object(self):
        """Custom objects where not (x == x) should be detected."""
        class NeverEqual:
            def __eq__(self, other):
                return False

        assert _contains_self_unequal(NeverEqual()) is True

class TestEstimateValueSize:
    """Test _estimate_value_size utility function."""

    def test_empty_outputs(self):
        """Empty outputs should have zero size."""
        value = CacheValue(outputs=[])
        assert _estimate_value_size(value) == 0

    @pytest.mark.skipif(
        not _torch_available(),
        reason="PyTorch not available"
    )
    def test_tensor_size_estimation(self):
        """Tensor size should be estimated correctly."""
        import torch

        # 1000 float32 elements = 4000 bytes
        tensor = torch.zeros(1000, dtype=torch.float32)
        value = CacheValue(outputs=[[tensor]])

        size = _estimate_value_size(value)
        assert size == 4000

    @pytest.mark.skipif(
        not _torch_available(),
        reason="PyTorch not available"
    )
    def test_nested_tensor_in_dict(self):
        """Tensors nested in dicts should be counted."""
        import torch

        tensor = torch.zeros(100, dtype=torch.float32)  # 400 bytes
        value = CacheValue(outputs=[[{"samples": tensor}]])

        size = _estimate_value_size(value)
        assert size == 400

class TestProviderRegistry:
    """Test cache provider registration and retrieval."""

    def setup_method(self):
        """Clear providers before each test."""
        _clear_cache_providers()

    def teardown_method(self):
        """Clear providers after each test."""
        _clear_cache_providers()

    def test_register_provider(self):
        """Provider should be registered successfully."""
        provider = MockCacheProvider()
        register_cache_provider(provider)

        assert _has_cache_providers() is True
        providers = _get_cache_providers()
        assert len(providers) == 1
        assert providers[0] is provider

    def test_unregister_provider(self):
        """Provider should be unregistered successfully."""
        provider = MockCacheProvider()
        register_cache_provider(provider)
        unregister_cache_provider(provider)

        assert _has_cache_providers() is False

    def test_multiple_providers(self):
        """Multiple providers can be registered."""
        provider1 = MockCacheProvider()
        provider2 = MockCacheProvider()

        register_cache_provider(provider1)
        register_cache_provider(provider2)

        providers = _get_cache_providers()
        assert len(providers) == 2

    def test_duplicate_registration_ignored(self):
        """Registering same provider twice should be ignored."""
        provider = MockCacheProvider()

        register_cache_provider(provider)
        register_cache_provider(provider)  # Should be ignored

        providers = _get_cache_providers()
        assert len(providers) == 1

    def test_clear_providers(self):
        """_clear_cache_providers should remove all providers."""
        provider1 = MockCacheProvider()
        provider2 = MockCacheProvider()

        register_cache_provider(provider1)
        register_cache_provider(provider2)
        _clear_cache_providers()

        assert _has_cache_providers() is False
        assert len(_get_cache_providers()) == 0

class TestCacheContext:
    """Test CacheContext dataclass."""

    def test_context_creation(self):
        """CacheContext should be created with all fields."""
        context = CacheContext(
            node_id="node-456",
            class_type="KSampler",
            cache_key_hash="a" * 64,
        )

        assert context.node_id == "node-456"
        assert context.class_type == "KSampler"
        assert context.cache_key_hash == "a" * 64


class TestCacheValue:
    """Test CacheValue dataclass."""

    def test_value_creation(self):
        """CacheValue should be created with outputs."""
        outputs = [[{"samples": "tensor_data"}]]
        value = CacheValue(outputs=outputs)

        assert value.outputs == outputs

class MockCacheProvider(CacheProvider):
    """Mock cache provider for testing."""

    def __init__(self):
        self.lookups = []
        self.stores = []

    async def on_lookup(self, context: CacheContext) -> Optional[CacheValue]:
        self.lookups.append(context)
        return None

    async def on_store(self, context: CacheContext, value: CacheValue) -> None:
        self.stores.append((context, value))