Compare commits


5 Commits

Author SHA1 Message Date
Connor Byrne
bd0c868b56 refactor: use canvas event for read-only sync instead of callback
Replace LGraphCanvas.onReadOnlyChanged callback property with a
'litegraph:read-only-changed' CustomEvent dispatched on the canvas
DOM element. canvasStore now subscribes via useEventListener,
matching the existing pattern used for litegraph:set-graph,
subgraph-opened, subgraph-converted, and litegraph:ghost-placement.

Addresses review feedback:
https://github.com/Comfy-Org/ComfyUI_frontend/pull/8998#discussion_r2831838643
https://github.com/Comfy-Org/ComfyUI_frontend/pull/8998#discussion_r3184409804
2026-05-04 13:51:38 -07:00
bymyself
79bfb95d4d fix: space bar panning over Vue nodes in standard nav mode
Bridge LGraphCanvas.read_only to Vue reactivity via onReadOnlyChanged
callback so the existing CSS pointer-events-auto/none toggle on
LGraphNode.vue and NodeWidgets.vue re-evaluates when space key
toggles panning mode. Events then fall through to the LiteGraph
canvas naturally — no per-handler forwarding or force flags needed.
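A minimal sketch of the CSS toggle this fix relies on. The class names come from the commit message; the helper function is illustrative, standing in for the reactive `:class` binding on the Vue node components.

```typescript
// When read_only is true (space-bar panning), pointer-events-none lets
// mouse events fall through the Vue overlay to the LiteGraph canvas.
function nodePointerEventsClass(readOnly: boolean): string {
  return readOnly ? 'pointer-events-none' : 'pointer-events-auto'
}
```

In the real components this expression re-evaluates whenever the bridged reactive flag flips, which is exactly what the callback (and, after the later commit, the canvas event) provides.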

Fixes #7806

Amp-Thread-ID: https://ampcode.com/threads/T-019c796c-e83c-769d-85f4-20a349994bad
2026-05-04 00:33:14 -07:00
jaeone94
04918360eb Use hash lookup for missing asset detection (#11873)
## Summary

Use exact BLAKE3 hash lookups first for missing model/media detection,
and add a separate public-inclusive input asset cache so public input
assets are considered missing-detection candidates without changing the
user-only input assets shown in the UI.

## Changes

- **What**:
  - Added `assetService.checkAssetHash()` for `HEAD /api/assets/hash/{hash}` status-only existence checks.
  - Added strict BLAKE3 hash helpers so only `blake3:<64 hex>` media values and raw 64-hex BLAKE3 model metadata are sent to the hash endpoint.
  - Updated missing media detection to group BLAKE3 candidates by hash, resolve them through the hash endpoint, and fall back to the legacy asset list path for invalid/unverifiable/non-hash values.
  - Updated missing model detection to use hash lookup for BLAKE3-backed asset-supported candidates before falling back to the existing node-type asset matching path.
  - Added `assetService.getInputAssetsIncludingPublic()` backed by a dedicated cache that fetches input assets with `include_public=true` for missing media fallback checks.
  - Kept `assetsStore.inputAssets` user-only for widget/UI display, while invalidating the public-inclusive missing-detection cache when input assets may change.
  - Added abort handling for paginated asset fetches and shared public-input cache callers so one aborted caller does not cancel the shared fetch for other callers.
  - Added regression coverage for hash lookup, fallback behavior, abort paths, public input fallback detection, and cache invalidation.
- **Dependencies**: None.
- **Change size**:
  - Production code: 4 files, 400 insertions, 24 deletions, net +376.
  - Test code: 4 files, 806 insertions, 59 deletions, net +747.
  - Total: 8 files, 1206 insertions, 83 deletions, net +1123.
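The hash-first-then-fallback flow described above can be sketched as follows. The `blake3:` pattern and the `'exists' | 'missing' | 'invalid'` statuses mirror the diff further down; `checkHash` and `legacyFallback` here are hypothetical stand-ins for the real `assetService.checkAssetHash()` call and the legacy asset-list scan.

```typescript
type AssetHashStatus = 'exists' | 'missing' | 'invalid'

// Only prefixed blake3:<64 hex> values qualify for the hash endpoint.
const BLAKE3_ASSET_HASH_PATTERN = /^blake3:[0-9a-f]{64}$/i

function isBlake3AssetHash(value: string): boolean {
  return BLAKE3_ASSET_HASH_PATTERN.test(value)
}

async function isMissing(
  candidate: string,
  checkHash: (hash: string) => Promise<AssetHashStatus>,
  legacyFallback: (candidate: string) => Promise<boolean>
): Promise<boolean> {
  if (isBlake3AssetHash(candidate)) {
    const status = await checkHash(candidate)
    if (status === 'exists') return false
    if (status === 'missing') return true
    // 'invalid' (HTTP 400) falls through to the legacy path below.
  }
  // Filename-like and unverifiable values use the legacy asset list scan.
  return await legacyFallback(candidate)
}
```

An exact-hash `HEAD` lookup settles most candidates in one request; the fallback only runs for values the hash endpoint cannot decide.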

## Review Focus

- The public-inclusive input asset cache is intentionally separate from
`assetsStore.inputAssets`. The existing store data is user-only and
drives the asset widgets/sidebar, so using it for missing input
detection misses public assets. Making that store public-inclusive would
change UI data semantics; this PR instead keeps the UI dataset unchanged
and adds a missing-detection-specific cache in `assetService`.
- Hash lookup is only used when the workflow exposes a valid BLAKE3
hash. Filename-like values and invalid hash values still use the legacy
fallback path.
- Missing model detection keeps the existing fallback behavior for
non-hash candidates and for hash checks that are invalid or fail
transiently.
- Async model download cache refresh behavior is left unchanged; this PR
avoids coupling model download completion to input asset cache
invalidation.
- No browser/e2e test was added because this changes the missing asset
detection data path, not UI interaction or rendering. The behavioral
coverage is in unit tests for the asset service and the missing
media/model scanners.
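The shared-fetch-with-per-caller-abort behavior called out above can be sketched like this. The factory name is illustrative; the race-against-an-abort-promise shape matches the `withCallerAbort` helper in the diff below. All callers share one underlying load, and aborting one caller rejects only that caller's promise.

```typescript
function createSharedLoader<T>(load: () => Promise<T>) {
  let pending: Promise<T> | null = null
  return async function get(signal?: AbortSignal): Promise<T> {
    if (signal?.aborted) throw new DOMException('Aborted', 'AbortError')
    // Start the load once; later callers reuse the in-flight promise.
    pending ??= load()
    const shared = pending
    if (!signal) return await shared
    let removeAbortListener = () => {}
    const abortPromise = new Promise<never>((_, reject) => {
      const onAbort = () => reject(new DOMException('Aborted', 'AbortError'))
      signal.addEventListener('abort', onAbort, { once: true })
      removeAbortListener = () => signal.removeEventListener('abort', onAbort)
    })
    try {
      // The shared fetch keeps running even if this caller aborts.
      return await Promise.race([shared, abortPromise])
    } finally {
      removeAbortListener()
    }
  }
}
```

The key property: the caller's `AbortSignal` is never forwarded to the shared fetch, so the cache still fills for everyone else.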

## Follow-up Items

- Fix `assetsStore.updateAssetTags()` partial-failure recovery. If
`removeAssetTags()` succeeds and `addAssetTags()` fails, the local model
asset cache can roll back to tags that the backend has already removed;
this should be handled in a focused model asset cache PR.
- Consider extracting shared hash-verification flow used by missing
media and missing model scans after this behavior stabilizes.
- Consider adding a concurrency cap or short-lived request cache for
large workflows with many unique hash lookups.
- Consider splitting `assetService.ts` further, e.g. hash helpers, abort
utilities, and the public-inclusive input asset cache.
- Consider tightening the asset hash service API shape so callers do not
directly depend on HTTP-oriented statuses such as `invalid`.
- Consider adding broader mutation-path coverage for public-inclusive
input cache invalidation once the cache has more consumers.

Linear: FE-534

## Screenshots (if applicable)

Before <false positive / missing image / public asset>


https://github.com/user-attachments/assets/db7ce2a9-b169-4fae-bf9f-98bb93d3ee6d

After 


https://github.com/user-attachments/assets/29af9f9e-b536-4fcd-a426-3add40bcb165



┆Issue is synchronized with this [Notion
page](https://www.notion.so/PR-11873-Use-hash-lookup-for-missing-asset-detection-3556d73d36508165babafb16614be0d8)
by [Unito](https://www.unito.io)
2026-05-04 03:59:54 +00:00
Dante
af70d88860 fix: keep finished badge fully opaque in ProgressToastItem (#11542)
## Summary
- Fix reported in **[slack](https://comfy-organization.slack.com/archives/C0A4XMHANP3/p1776801170742579)**
- Move `opacity-50` off the row container onto the asset-name column
only, so the contrast badge (white pill, dark label) is not dimmed to
gray-on-gray when a download completes.
- Matches the Figma intent that the `FINISHED` badge stands out —
designer spec uses `base/foreground` for pill, `base/background` for
text, which is unreadable when the parent is 50% opacity.

<img width="560" height="269" alt="Screenshot 2026-04-23 at 2 46 17 PM"
src="https://github.com/user-attachments/assets/fb84aa57-c348-4a86-9a65-9342c12400e1"
/>
<img width="764" height="332" alt="Screenshot 2026-04-23 at 2 46 41 PM"
src="https://github.com/user-attachments/assets/ecbe6a5f-c2e8-4427-9c1d-f8f123009d2e"
/>


## Before / After

![before /
after](https://raw.githubusercontent.com/Comfy-Org/ComfyUI_frontend/jaewon/fe-237-fix-honeytoast-badge-finished-opacity/.github/pr-images/fe-237-before-after.png)

## Repro
Cloud → trigger a model download → wait for completion → the `FINISHED`
badge is the same tone as the toast surface (see Slack thread
screenshots).

## Test plan
- [ ] Complete a model download in cloud and confirm the `FINISHED`
badge is clearly legible in both themes.
- [ ] File name + subtitle still appear dimmed to signal the row is
completed.
- [ ] Running / failed / pending states unchanged.

- Fixes
[FE-237](https://linear.app/comfyorg/issue/FE-237/fix-honeytoast-badge-text-color-for-finished-job-matches-background)
2026-05-03 08:40:27 +00:00
Christian Byrne
c955309b26 [chore] Update Comfy Registry API types from comfy-api@911406c (#11518)
## Automated API Type Update

This PR updates the Comfy Registry API types from the latest comfy-api
OpenAPI specification.

- API commit: 911406c
- Generated on: 2026-04-17T16:10:40Z

These types are automatically generated using openapi-typescript.

┆Issue is synchronized with this [Notion
page](https://www.notion.so/PR-11518-chore-Update-Comfy-Registry-API-types-from-comfy-api-911406c-3496d73d36508146a1e2e1ee90640fa4)
by [Unito](https://www.unito.io)

Co-authored-by: coderfromthenorth93 <213232275+coderfromthenorth93@users.noreply.github.com>
Co-authored-by: Alexander Brown <drjkl@comfy.org>
2026-05-03 01:01:41 -07:00
17 changed files with 1504 additions and 93 deletions

Binary file not shown.


View File

@@ -0,0 +1,51 @@
import { render, screen } from '@testing-library/vue'
import { describe, expect, it } from 'vitest'
import { createI18n } from 'vue-i18n'
import type { AssetDownload } from '@/stores/assetDownloadStore'
import ProgressToastItem from './ProgressToastItem.vue'
const i18n = createI18n({
legacy: false,
locale: 'en',
messages: {
en: {
progressToast: {
finished: 'Finished',
failed: 'Failed',
pending: 'Pending'
}
}
}
})
function completedJob(): AssetDownload {
return {
taskId: 'task-1',
assetId: 'asset-1',
assetName: 'controlnet-canny.safetensors',
bytesTotal: 100,
bytesDownloaded: 100,
progress: 1,
status: 'completed',
lastUpdate: Date.now()
}
}
describe('ProgressToastItem — completed state', () => {
it('keeps the finished badge outside the dimmed (opacity-50) subtree', () => {
render(ProgressToastItem, {
props: { job: completedJob() },
global: { plugins: [i18n] }
})
const badge = screen.getByText('Finished')
// eslint-disable-next-line testing-library/no-node-access -- verifying structural placement of opacity-50 boundary, which is the subject of this fix
expect(badge.closest('.opacity-50')).toBeNull()
const assetName = screen.getByText('controlnet-canny.safetensors')
// eslint-disable-next-line testing-library/no-node-access -- verifying structural placement of opacity-50 boundary, which is the subject of this fix
expect(assetName.closest('.opacity-50')).not.toBeNull()
})
})

View File

@@ -22,14 +22,9 @@ const isPending = computed(() => job.status === 'created')
<template>
<div
:class="
cn(
'flex items-center justify-between rounded-lg bg-modal-card-background px-4 py-3',
isCompleted && 'opacity-50'
)
"
class="flex items-center justify-between rounded-lg bg-modal-card-background px-4 py-3"
>
<div class="min-w-0 flex-1">
<div :class="cn('min-w-0 flex-1', isCompleted && 'opacity-50')">
<span class="block truncate text-sm text-base-foreground">{{
job.assetName
}}</span>

View File

@@ -408,8 +408,12 @@ export class LGraphCanvas implements CustomEventDispatcher<LGraphCanvasEventMap>
}
set read_only(value: boolean) {
const changed = this.state.readOnly !== value
this.state.readOnly = value
this._updateCursorStyle()
if (changed) {
this.dispatchEvent('litegraph:read-only-changed', { readOnly: value })
}
}
get isDragging(): boolean {

View File

@@ -59,4 +59,9 @@ export interface LGraphCanvasEventMap {
active: boolean
nodeId: NodeId
}
/** The canvas read-only state has changed. */
'litegraph:read-only-changed': {
readOnly: boolean
}
}

View File

@@ -1,7 +1,12 @@
import { beforeEach, describe, expect, it, vi } from 'vitest'
import type { AssetItem } from '@/platform/assets/schemas/assetSchema'
import { assetService } from '@/platform/assets/services/assetService'
import {
MISSING_TAG,
assetService,
isBlake3AssetHash,
toBlake3AssetHash
} from '@/platform/assets/services/assetService'
import { api } from '@/scripts/api'
const mockDistributionState = vi.hoisted(() => ({ isCloud: false }))
@@ -44,6 +49,10 @@ vi.mock('@/i18n', () => ({
const fetchApiMock = vi.mocked(api.fetchApi)
const validBlake3Hash =
'1111111111111111111111111111111111111111111111111111111111111111'
const validBlake3AssetHash = `blake3:${validBlake3Hash}`
function buildResponse(
body: unknown,
init: { ok?: boolean; status?: number } = {}
@@ -180,9 +189,98 @@ describe(assetService.getAssetMetadata, () => {
})
})
describe(isBlake3AssetHash, () => {
it('accepts only prefixed 64-character blake3 hashes', () => {
expect(isBlake3AssetHash(validBlake3AssetHash)).toBe(true)
expect(isBlake3AssetHash('BLAKE3:' + validBlake3Hash.toUpperCase())).toBe(
true
)
expect(isBlake3AssetHash('blake3:abc')).toBe(false)
expect(isBlake3AssetHash(validBlake3Hash)).toBe(false)
})
})
describe(toBlake3AssetHash, () => {
it('normalizes 64-character blake3 hex values to asset hashes', () => {
expect(toBlake3AssetHash(validBlake3Hash)).toBe(validBlake3AssetHash)
expect(toBlake3AssetHash('abc')).toBeNull()
expect(toBlake3AssetHash(undefined)).toBeNull()
})
})
describe(assetService.uploadAssetFromUrl, () => {
beforeEach(() => {
vi.clearAllMocks()
assetService.invalidateInputAssetsIncludingPublic()
})
it('does not invalidate cached input assets when the upload response is invalid', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(buildResponse({ id: 'missing-name' }))
await assetService.getInputAssetsIncludingPublic()
await expect(
assetService.uploadAssetFromUrl({
url: 'https://example.com/input.png',
name: 'input.png',
tags: ['input']
})
).rejects.toThrow('Failed to upload asset')
const cached = await assetService.getInputAssetsIncludingPublic()
expect(cached).toEqual(staleAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
consoleSpy.mockRestore()
})
it('requires upload responses to include created_new', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(
buildResponse(validAsset({ id: 'uploaded-input', tags: ['input'] }))
)
await assetService.getInputAssetsIncludingPublic()
await expect(
assetService.uploadAssetFromUrl({
url: 'https://example.com/input.png',
name: 'input.png',
tags: ['input']
})
).rejects.toThrow('Failed to upload asset')
const cached = await assetService.getInputAssetsIncludingPublic()
expect(cached).toEqual(staleAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
consoleSpy.mockRestore()
})
it('returns validated upload responses with created_new', async () => {
const uploadedAsset = {
...validAsset({ id: 'uploaded-input', tags: ['input'] }),
created_new: true
}
fetchApiMock.mockResolvedValueOnce(buildResponse(uploadedAsset))
await expect(
assetService.uploadAssetFromUrl({
url: 'https://example.com/input.png',
name: 'input.png',
tags: ['input']
})
).resolves.toEqual(uploadedAsset)
})
})
describe(assetService.uploadAssetFromBase64, () => {
beforeEach(() => {
vi.clearAllMocks()
assetService.invalidateInputAssetsIncludingPublic()
})
it('throws before calling the network when data is not a data URL', async () => {
@@ -195,6 +293,63 @@ describe(assetService.uploadAssetFromBase64, () => {
expect(fetchApiMock).not.toHaveBeenCalled()
})
it('does not invalidate cached input assets when the upload response is invalid', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
const fetchSpy = vi
.spyOn(globalThis, 'fetch')
.mockResolvedValueOnce(new Response('hello'))
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(buildResponse({ id: 'missing-name' }))
await assetService.getInputAssetsIncludingPublic()
await expect(
assetService.uploadAssetFromBase64({
data: 'data:text/plain;base64,aGVsbG8=',
name: 'input.txt',
tags: ['input']
})
).rejects.toThrow('Failed to upload asset')
const cached = await assetService.getInputAssetsIncludingPublic()
expect(cached).toEqual(staleAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
fetchSpy.mockRestore()
consoleSpy.mockRestore()
})
it('rejects upload responses with a non-boolean created_new', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
const fetchSpy = vi
.spyOn(globalThis, 'fetch')
.mockResolvedValueOnce(new Response('hello'))
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(
buildResponse({
...validAsset({ id: 'uploaded-input', tags: ['input'] }),
created_new: 'true'
})
)
await assetService.getInputAssetsIncludingPublic()
await expect(
assetService.uploadAssetFromBase64({
data: 'data:text/plain;base64,aGVsbG8=',
name: 'input.txt',
tags: ['input']
})
).rejects.toThrow('Failed to upload asset')
const cached = await assetService.getInputAssetsIncludingPublic()
expect(cached).toEqual(staleAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
fetchSpy.mockRestore()
consoleSpy.mockRestore()
})
})
describe(assetService.uploadAssetAsync, () => {
@@ -354,3 +509,391 @@ describe(assetService.getAssetsByTag, () => {
expect(params.get('include_public')).toBe('true')
})
})
describe(assetService.getAllAssetsByTag, () => {
beforeEach(() => {
vi.clearAllMocks()
})
it('paginates tagged asset requests with include_public=true', async () => {
fetchApiMock
.mockResolvedValueOnce(
buildResponse({
assets: [
validAsset({ id: 'a', tags: ['input'] }),
validAsset({ id: 'b', tags: ['input'] })
]
})
)
.mockResolvedValueOnce(
buildResponse({
assets: [validAsset({ id: 'c', tags: ['input'] })]
})
)
const assets = await assetService.getAllAssetsByTag('input', true, {
limit: 2
})
expect(assets.map((a) => a.id)).toEqual(['a', 'b', 'c'])
const firstUrl = fetchApiMock.mock.calls[0]?.[0] as string
const firstParams = new URL(firstUrl, 'http://localhost').searchParams
expect(firstParams.get('include_public')).toBe('true')
expect(firstParams.get('limit')).toBe('2')
expect(firstParams.has('offset')).toBe(false)
const secondUrl = fetchApiMock.mock.calls[1]?.[0] as string
const secondParams = new URL(secondUrl, 'http://localhost').searchParams
expect(secondParams.get('include_public')).toBe('true')
expect(secondParams.get('limit')).toBe('2')
expect(secondParams.get('offset')).toBe('2')
})
it('paginates from raw response size before filtering missing-tagged assets', async () => {
fetchApiMock
.mockResolvedValueOnce(
buildResponse({
assets: [
validAsset({ id: 'visible', tags: ['input'] }),
validAsset({ id: 'hidden', tags: ['input', MISSING_TAG] })
]
})
)
.mockResolvedValueOnce(
buildResponse({
assets: [validAsset({ id: 'later-public', tags: ['input'] })]
})
)
const assets = await assetService.getAllAssetsByTag('input', true, {
limit: 2
})
expect(assets.map((a) => a.id)).toEqual(['visible', 'later-public'])
expect(fetchApiMock).toHaveBeenCalledTimes(2)
const secondUrl = fetchApiMock.mock.calls[1]?.[0]
if (typeof secondUrl !== 'string') {
throw new Error('Expected a second asset request URL')
}
const secondParams = new URL(secondUrl, 'http://localhost').searchParams
expect(secondParams.get('offset')).toBe('2')
})
it('honors has_more when walking tagged asset pages', async () => {
fetchApiMock
.mockResolvedValueOnce(
buildResponse({
assets: [
validAsset({ id: 'first', tags: ['input'] }),
validAsset({ id: 'second', tags: ['input'] })
],
has_more: true
})
)
.mockResolvedValueOnce(
buildResponse({
assets: [validAsset({ id: 'later-public', tags: ['input'] })],
has_more: false
})
)
const assets = await assetService.getAllAssetsByTag('input', true, {
limit: 3
})
expect(assets.map((a) => a.id)).toEqual(['first', 'second', 'later-public'])
expect(fetchApiMock).toHaveBeenCalledTimes(2)
const secondUrl = fetchApiMock.mock.calls[1]?.[0]
if (typeof secondUrl !== 'string') {
throw new Error('Expected a second asset request URL')
}
const secondParams = new URL(secondUrl, 'http://localhost').searchParams
expect(secondParams.get('offset')).toBe('2')
})
it('passes abort signals through paginated requests', async () => {
const controller = new AbortController()
fetchApiMock.mockResolvedValueOnce(
buildResponse({
assets: [validAsset({ id: 'a', tags: ['input'] })]
})
)
await assetService.getAllAssetsByTag('input', true, {
limit: 2,
signal: controller.signal
})
expect(fetchApiMock).toHaveBeenCalledWith(expect.any(String), {
signal: controller.signal
})
})
it('stops pagination when aborted between pages', async () => {
const controller = new AbortController()
fetchApiMock.mockImplementationOnce(async () => {
controller.abort()
return buildResponse({
assets: [
validAsset({ id: 'a', tags: ['input'] }),
validAsset({ id: 'b', tags: ['input'] })
]
})
})
await expect(
assetService.getAllAssetsByTag('input', true, {
limit: 2,
signal: controller.signal
})
).rejects.toMatchObject({ name: 'AbortError' })
expect(fetchApiMock).toHaveBeenCalledOnce()
})
})
describe(assetService.getInputAssetsIncludingPublic, () => {
beforeEach(() => {
vi.clearAllMocks()
assetService.invalidateInputAssetsIncludingPublic()
})
it('loads input assets with public assets included and reuses the cache', async () => {
const assets = [
validAsset({ id: 'user-input', tags: ['input'] }),
validAsset({ id: 'public-input', tags: ['input'], is_immutable: true })
]
fetchApiMock.mockResolvedValueOnce(buildResponse({ assets }))
const first = await assetService.getInputAssetsIncludingPublic()
const second = await assetService.getInputAssetsIncludingPublic()
expect(first).toEqual(assets)
expect(second).toBe(first)
expect(fetchApiMock).toHaveBeenCalledOnce()
const requestedUrl = fetchApiMock.mock.calls[0]?.[0] as string
const params = new URL(requestedUrl, 'http://localhost').searchParams
expect(params.get('include_public')).toBe('true')
expect(params.get('limit')).toBe('500')
})
it('fetches fresh input assets after explicit invalidation', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const freshAssets = [validAsset({ id: 'fresh-input', tags: ['input'] })]
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(buildResponse({ assets: freshAssets }))
await assetService.getInputAssetsIncludingPublic()
assetService.invalidateInputAssetsIncludingPublic()
const refreshed = await assetService.getInputAssetsIncludingPublic()
expect(refreshed).toEqual(freshAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
})
it('does not let one caller abort the shared input asset load for other callers', async () => {
const firstController = new AbortController()
const secondController = new AbortController()
const assets = [validAsset({ id: 'public-input', tags: ['input'] })]
let resolveResponse!: (response: Response) => void
let serviceSignal: AbortSignal | undefined
fetchApiMock.mockImplementationOnce(async (_url, options) => {
serviceSignal = options?.signal ?? undefined
return await new Promise<Response>((resolve) => {
resolveResponse = resolve
})
})
const first = assetService.getInputAssetsIncludingPublic(
firstController.signal
)
const second = assetService.getInputAssetsIncludingPublic(
secondController.signal
)
firstController.abort()
await expect(first).rejects.toMatchObject({ name: 'AbortError' })
expect(serviceSignal).toBeUndefined()
resolveResponse(buildResponse({ assets }))
await expect(second).resolves.toEqual(assets)
expect(fetchApiMock).toHaveBeenCalledOnce()
})
it('keeps the shared input asset load alive after all callers abort', async () => {
const firstController = new AbortController()
const secondController = new AbortController()
const assets = [validAsset({ id: 'public-input', tags: ['input'] })]
let resolveResponse!: (response: Response) => void
fetchApiMock.mockImplementationOnce(
async () =>
await new Promise<Response>((resolve) => {
resolveResponse = resolve
})
)
const first = assetService.getInputAssetsIncludingPublic(
firstController.signal
)
const second = assetService.getInputAssetsIncludingPublic(
secondController.signal
)
firstController.abort()
secondController.abort()
await expect(first).rejects.toMatchObject({ name: 'AbortError' })
await expect(second).rejects.toMatchObject({ name: 'AbortError' })
resolveResponse(buildResponse({ assets }))
await Promise.resolve()
await expect(assetService.getInputAssetsIncludingPublic()).resolves.toEqual(
assets
)
expect(fetchApiMock).toHaveBeenCalledOnce()
})
it('does not abort in-flight input asset loads when invalidated', async () => {
const assets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const freshAssets = [validAsset({ id: 'fresh-input', tags: ['input'] })]
let resolveResponse!: (response: Response) => void
fetchApiMock
.mockImplementationOnce(
async () =>
await new Promise<Response>((resolve) => {
resolveResponse = resolve
})
)
.mockResolvedValueOnce(buildResponse({ assets: freshAssets }))
const inFlight = assetService.getInputAssetsIncludingPublic()
assetService.invalidateInputAssetsIncludingPublic()
resolveResponse(buildResponse({ assets }))
await expect(inFlight).resolves.toEqual(assets)
await expect(assetService.getInputAssetsIncludingPublic()).resolves.toEqual(
freshAssets
)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
})
it('invalidates cached input assets after deleting an asset', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const freshAssets = [validAsset({ id: 'fresh-input', tags: ['input'] })]
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(buildResponse(null))
.mockResolvedValueOnce(buildResponse({ assets: freshAssets }))
await assetService.getInputAssetsIncludingPublic()
await assetService.deleteAsset('stale-input')
const refreshed = await assetService.getInputAssetsIncludingPublic()
expect(refreshed).toEqual(freshAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(3)
expect(fetchApiMock.mock.calls[1]).toEqual([
'/assets/stale-input',
expect.objectContaining({ method: 'DELETE' })
])
})
it('invalidates cached input assets after an input asset upload', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
const uploadedAsset = validAsset({ id: 'uploaded-input', tags: ['input'] })
const freshAssets = [uploadedAsset]
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(buildResponse(uploadedAsset))
.mockResolvedValueOnce(buildResponse({ assets: freshAssets }))
await assetService.getInputAssetsIncludingPublic()
await assetService.uploadAssetAsync({
source_url: 'https://example.com/input.png',
tags: ['input']
})
const refreshed = await assetService.getInputAssetsIncludingPublic()
expect(refreshed).toEqual(freshAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(3)
})
it('does not invalidate cached input assets for pending async input uploads', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(
buildResponse(
{ task_id: 'task-1', status: 'running' },
{ ok: true, status: 202 }
)
)
await assetService.getInputAssetsIncludingPublic()
await assetService.uploadAssetAsync({
source_url: 'https://example.com/input.png',
tags: ['input']
})
const cached = await assetService.getInputAssetsIncludingPublic()
expect(cached).toEqual(staleAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
})
it('does not invalidate cached input assets for non-input uploads', async () => {
const staleAssets = [validAsset({ id: 'stale-input', tags: ['input'] })]
fetchApiMock
.mockResolvedValueOnce(buildResponse({ assets: staleAssets }))
.mockResolvedValueOnce(buildResponse(validAsset({ tags: ['models'] })))
await assetService.getInputAssetsIncludingPublic()
await assetService.uploadAssetAsync({
source_url: 'https://example.com/model.safetensors',
tags: ['models']
})
const cached = await assetService.getInputAssetsIncludingPublic()
expect(cached).toEqual(staleAssets)
expect(fetchApiMock).toHaveBeenCalledTimes(2)
})
})
describe(assetService.checkAssetHash, () => {
beforeEach(() => {
vi.clearAllMocks()
})
it.each([
[200, 'exists'],
[404, 'missing'],
[400, 'invalid']
] as const)('maps %s responses to %s', async (status, expected) => {
const hash =
'blake3:1111111111111111111111111111111111111111111111111111111111111111'
fetchApiMock.mockResolvedValueOnce(buildResponse(null, { status }))
await expect(assetService.checkAssetHash(hash)).resolves.toBe(expected)
expect(fetchApiMock).toHaveBeenCalledWith(
`/assets/hash/${encodeURIComponent(hash)}`,
{
method: 'HEAD',
signal: undefined
}
)
})
it('throws for unexpected responses', async () => {
fetchApiMock.mockResolvedValueOnce(buildResponse(null, { status: 500 }))
await expect(assetService.checkAssetHash('blake3:abc')).rejects.toThrow(
'Unexpected asset hash check status: 500'
)
})
})

View File

@@ -1,4 +1,5 @@
import { fromZodError } from 'zod-validation-error'
import { z } from 'zod'
import { st } from '@/i18n'
@@ -29,9 +30,14 @@ export interface PaginationOptions {
offset?: number
}
interface AssetPaginationOptions extends PaginationOptions {
signal?: AbortSignal
}
interface AssetRequestOptions extends PaginationOptions {
includeTags: string[]
includePublic?: boolean
signal?: AbortSignal
}
interface AssetExportOptions {
@@ -170,10 +176,61 @@ const ASSETS_DOWNLOAD_ENDPOINT = '/assets/download'
const ASSETS_EXPORT_ENDPOINT = '/assets/export'
const EXPERIMENTAL_WARNING = `EXPERIMENTAL: If you are seeing this please make sure "Comfy.Assets.UseAssetAPI" is set to "false" in your ComfyUI Settings.\n`
const DEFAULT_LIMIT = 500
const INPUT_ASSETS_WITH_PUBLIC_LIMIT = 500
export const MODELS_TAG = 'models'
/** Asset tag used by the backend for placeholder records that are not installed. */
export const MISSING_TAG = 'missing'
/** Result of a HEAD lookup against an exact asset hash. */
export type AssetHashStatus = 'exists' | 'missing' | 'invalid'
const BLAKE3_ASSET_HASH_PATTERN = /^blake3:[0-9a-f]{64}$/i
const BLAKE3_HEX_PATTERN = /^[0-9a-f]{64}$/i
const uploadedAssetResponseSchema = assetItemSchema.extend({
created_new: z.boolean()
})
/** Returns true for a prefixed BLAKE3 asset hash: `blake3:<64 hex>`. */
export function isBlake3AssetHash(value: string): boolean {
return BLAKE3_ASSET_HASH_PATTERN.test(value)
}
/** Converts a raw 64-character BLAKE3 hex digest into an asset hash. */
export function toBlake3AssetHash(hash: string | undefined): string | null {
if (!hash || !BLAKE3_HEX_PATTERN.test(hash)) return null
return `blake3:${hash}`
}
function createAbortError(): DOMException {
return new DOMException('Aborted', 'AbortError')
}
function throwIfAborted(signal?: AbortSignal): void {
if (signal?.aborted) throw createAbortError()
}
async function withCallerAbort<T>(
promise: Promise<T>,
signal?: AbortSignal
): Promise<T> {
throwIfAborted(signal)
if (!signal) return await promise
let removeAbortListener = () => {}
const abortPromise = new Promise<never>((_, reject) => {
const onAbort = () => reject(createAbortError())
signal.addEventListener('abort', onAbort, { once: true })
removeAbortListener = () => signal.removeEventListener('abort', onAbort)
})
try {
return await Promise.race([promise, abortPromise])
} finally {
removeAbortListener()
}
}
/**
* Validates asset response data using Zod schema
*/
@@ -187,11 +244,43 @@ function validateAssetResponse(data: unknown): AssetResponse {
)
}
function validateUploadedAssetResponse(
data: unknown
): AssetItem & { created_new: boolean } {
const result = uploadedAssetResponseSchema.safeParse(data)
if (result.success) {
return result.data
}
console.error('Invalid asset upload response:', fromZodError(result.error))
throw new Error(
st(
'assetBrowser.errorUploadFailed',
'Failed to upload asset. Please try again.'
)
)
}
/**
* Private service for asset-related network requests
* Not exposed globally - used internally by ComfyApi
*/
function createAssetService() {
let inputAssetsIncludingPublic: AssetItem[] | null = null
let inputAssetsIncludingPublicRequestId = 0
let pendingInputAssetsIncludingPublic: Promise<AssetItem[]> | null = null
/** Invalidates the cached public-inclusive input assets without aborting in-flight readers. */
function invalidateInputAssetsIncludingPublic(): void {
inputAssetsIncludingPublicRequestId++
pendingInputAssetsIncludingPublic = null
inputAssetsIncludingPublic = null
}
function invalidateInputAssetsCacheIfNeeded(tags?: string[]): void {
if (tags?.includes('input')) invalidateInputAssetsIncludingPublic()
}
/**
* Handles API response with consistent error handling and Zod validation
*/
@@ -203,7 +292,8 @@ function createAssetService() {
includeTags,
limit = DEFAULT_LIMIT,
offset,
includePublic
includePublic,
signal
} = options
const queryParams = new URLSearchParams({
include_tags: includeTags.join(','),
@@ -217,7 +307,9 @@ function createAssetService() {
}
const url = `${ASSETS_ENDPOINT}?${queryParams.toString()}`
const res = await api.fetchApi(url)
const res = signal
? await api.fetchApi(url, { signal })
: await api.fetchApi(url)
if (!res.ok) {
throw new Error(
`${EXPERIMENTAL_WARNING}Unable to load ${context}: Server returned ${res.status}. Please try again.`
@@ -403,15 +495,16 @@ function createAssetService() {
* @param options - Pagination options
* @param options.limit - Maximum number of assets to return (default: 500)
* @param options.offset - Number of assets to skip (default: 0)
* @param options.signal - Optional abort signal for cancelling the request
* @returns Promise<AssetItem[]> - Full asset objects filtered by tag, excluding missing assets
*/
async function getAssetsByTag(
tag: string,
includePublic: boolean = true,
{ limit = DEFAULT_LIMIT, offset = 0 }: PaginationOptions = {}
{ limit = DEFAULT_LIMIT, offset = 0, signal }: AssetPaginationOptions = {}
): Promise<AssetItem[]> {
const data = await handleAssetRequest(
{ includeTags: [tag], limit, offset, includePublic },
{ includeTags: [tag], limit, offset, includePublic, signal },
`assets for tag ${tag}`
)
@@ -420,6 +513,116 @@ function createAssetService() {
)
}
/**
* Gets every asset for a tag by walking paginated asset API responses.
*
* @param tag - The tag to filter by (e.g., 'models', 'input')
* @param includePublic - Whether to include public assets (default: true)
* @param options - Pagination options
* @param options.limit - Page size for each request (default: 500)
* @param options.signal - Optional abort signal for cancelling requests
* @returns Promise<AssetItem[]> - Full asset objects filtered by tag
*/
async function getAllAssetsByTag(
tag: string,
includePublic: boolean = true,
{ limit = DEFAULT_LIMIT, signal }: AssetPaginationOptions = {}
): Promise<AssetItem[]> {
const assets: AssetItem[] = []
const pageSize = limit > 0 ? limit : DEFAULT_LIMIT
let offset = 0
while (true) {
if (signal?.aborted) throw createAbortError()
const data = await handleAssetRequest(
{
includeTags: [tag],
limit: pageSize,
offset,
includePublic,
signal
},
`assets for tag ${tag}`
)
const batch = data.assets ?? []
assets.push(...batch.filter((asset) => !asset.tags.includes(MISSING_TAG)))
const noMoreFromServer = data.has_more === false
const inferredLastPage =
data.has_more === undefined && batch.length < pageSize
if (batch.length === 0 || noMoreFromServer || inferredLastPage) {
return assets
}
offset += batch.length
}
}
function startInputAssetsIncludingPublicRequest(): Promise<AssetItem[]> {
const requestId = ++inputAssetsIncludingPublicRequestId
pendingInputAssetsIncludingPublic = getAllAssetsByTag('input', true, {
limit: INPUT_ASSETS_WITH_PUBLIC_LIMIT
})
.then((assets) => {
if (requestId === inputAssetsIncludingPublicRequestId) {
inputAssetsIncludingPublic = assets
}
return assets
})
.finally(() => {
if (requestId === inputAssetsIncludingPublicRequestId) {
pendingInputAssetsIncludingPublic = null
}
})
void pendingInputAssetsIncludingPublic.catch(() => {})
return pendingInputAssetsIncludingPublic
}
/**
* Gets cached input assets including public assets for missing media checks.
* A caller's abort cancels only that caller's await; the shared fetch is
* invalidated via invalidateInputAssetsIncludingPublic().
*/
async function getInputAssetsIncludingPublic(
signal?: AbortSignal
): Promise<AssetItem[]> {
throwIfAborted(signal)
if (inputAssetsIncludingPublic) return inputAssetsIncludingPublic
const request =
pendingInputAssetsIncludingPublic ??
startInputAssetsIncludingPublicRequest()
return await withCallerAbort(request, signal)
}
/**
* Checks whether an asset exists for an exact asset hash.
*
* Uses the HEAD /assets/hash/{hash} endpoint and maps status-only responses:
* 200 -> exists, 404 -> missing, and 400 -> invalid hash format.
*/
async function checkAssetHash(
assetHash: string,
signal?: AbortSignal
): Promise<AssetHashStatus> {
const response = await api.fetchApi(
`${ASSETS_ENDPOINT}/hash/${encodeURIComponent(assetHash)}`,
{
method: 'HEAD',
signal
}
)
if (response.status === 200) return 'exists'
if (response.status === 404) return 'missing'
if (response.status === 400) return 'invalid'
throw new Error(`Unexpected asset hash check status: ${response.status}`)
}
/**
* Deletes an asset by ID
* Only available in cloud environment
@@ -438,6 +641,8 @@ function createAssetService() {
`Unable to delete asset ${id}: Server returned ${res.status}`
)
}
invalidateInputAssetsIncludingPublic()
}
/**
@@ -545,7 +750,9 @@ function createAssetService() {
)
}
return await res.json()
const asset = validateUploadedAssetResponse(await res.json())
invalidateInputAssetsCacheIfNeeded(params.tags)
return asset
}
/**
@@ -598,7 +805,9 @@ function createAssetService() {
)
}
return await res.json()
const asset = validateUploadedAssetResponse(await res.json())
invalidateInputAssetsCacheIfNeeded(params.tags)
return asset
}
/**
@@ -628,6 +837,7 @@ function createAssetService() {
if (!parseResult.success) {
throw fromZodError(parseResult.error)
}
invalidateInputAssetsIncludingPublic()
return parseResult.data
}
@@ -658,6 +868,7 @@ function createAssetService() {
if (!parseResult.success) {
throw fromZodError(parseResult.error)
}
invalidateInputAssetsIncludingPublic()
return parseResult.data
}
@@ -709,6 +920,13 @@ function createAssetService() {
)
)
}
if (
params.tags?.includes('input') &&
result.data.type === 'async' &&
result.data.task.status === 'completed'
) {
invalidateInputAssetsIncludingPublic()
}
return result.data
}
@@ -724,6 +942,7 @@ function createAssetService() {
)
)
}
invalidateInputAssetsCacheIfNeeded(params.tags)
return result.data
}
@@ -764,6 +983,10 @@ function createAssetService() {
getAssetsForNodeType,
getAssetDetails,
getAssetsByTag,
getAllAssetsByTag,
getInputAssetsIncludingPublic,
invalidateInputAssetsIncludingPublic,
checkAssetHash,
deleteAsset,
updateAsset,
addAssetTags,

View File

@@ -1,9 +1,11 @@
import { fromAny } from '@total-typescript/shoehorn'
import { describe, expect, it, vi } from 'vitest'
import { beforeEach, describe, expect, it, vi } from 'vitest'
import type { LGraph } from '@/lib/litegraph/src/LGraph'
import type { LGraphNode } from '@/lib/litegraph/src/LGraphNode'
import type { IComboWidget } from '@/lib/litegraph/src/types/widgets'
import type { AssetItem } from '@/platform/assets/schemas/assetSchema'
import type * as AssetServiceModule from '@/platform/assets/services/assetService'
import {
scanAllMediaCandidates,
scanNodeMediaCandidates,
@@ -13,6 +15,13 @@ import {
} from './missingMediaScan'
import type { MissingMediaCandidate } from './types'
const { mockCheckAssetHash, mockGetInputAssetsIncludingPublic } = vi.hoisted(
() => ({
mockCheckAssetHash: vi.fn(),
mockGetInputAssetsIncludingPublic: vi.fn()
})
)
vi.mock('@/utils/graphTraversalUtil', () => ({
collectAllNodes: (graph: { _testNodes: LGraphNode[] }) => graph._testNodes,
getExecutionIdByNode: (
@@ -21,6 +30,21 @@ vi.mock('@/utils/graphTraversalUtil', () => ({
) => node._testExecutionId ?? String(node.id)
}))
vi.mock('@/platform/assets/services/assetService', async () => {
const actual = await vi.importActual<typeof AssetServiceModule>(
'@/platform/assets/services/assetService'
)
return {
...actual,
assetService: {
...actual.assetService,
checkAssetHash: mockCheckAssetHash,
getInputAssetsIncludingPublic: mockGetInputAssetsIncludingPublic
}
}
})
function makeCandidate(
nodeId: string,
name: string,
@@ -70,6 +94,16 @@ function makeGraph(nodes: LGraphNode[]): LGraph {
return fromAny<LGraph, unknown>({ _testNodes: nodes })
}
function makeAsset(name: string, assetHash: string | null = null): AssetItem {
return {
id: name,
name,
asset_hash: assetHash,
mime_type: null,
tags: ['input']
}
}
describe('scanNodeMediaCandidates', () => {
it('returns candidate for a LoadImage node with missing image', () => {
const graph = makeGraph([])
@@ -232,37 +266,43 @@ describe('groupCandidatesByMediaType', () => {
})
describe('verifyCloudMediaCandidates', () => {
it('marks candidates missing when not in input assets', async () => {
const existingHash =
'blake3:1111111111111111111111111111111111111111111111111111111111111111'
const missingHash =
'blake3:2222222222222222222222222222222222222222222222222222222222222222'
beforeEach(() => {
vi.clearAllMocks()
mockCheckAssetHash.mockResolvedValue('missing')
mockGetInputAssetsIncludingPublic.mockResolvedValue([])
})
it('marks candidates missing when the asset hash is not found', async () => {
const candidates = [
makeCandidate('1', 'abc123.png', { isMissing: undefined }),
makeCandidate('2', 'def456.png', { isMissing: undefined })
makeCandidate('1', missingHash, { isMissing: undefined }),
makeCandidate('2', existingHash, { isMissing: undefined })
]
const mockStore = {
updateInputs: async () => {},
inputAssets: [{ asset_hash: 'def456.png', name: 'my-photo.png' }]
}
const checkAssetHash = vi.fn(async (assetHash: string) =>
assetHash === existingHash ? ('exists' as const) : ('missing' as const)
)
await verifyCloudMediaCandidates(candidates, undefined, mockStore)
await verifyCloudMediaCandidates(candidates, undefined, checkAssetHash)
expect(candidates[0].isMissing).toBe(true)
expect(candidates[1].isMissing).toBe(false)
})
it('calls updateInputs before checking assets', async () => {
let updateCalled = false
const candidates = [makeCandidate('1', 'abc.png', { isMissing: undefined })]
it('uses assetService.checkAssetHash by default', async () => {
const candidates = [
makeCandidate('1', existingHash, { isMissing: undefined })
]
mockCheckAssetHash.mockResolvedValue('exists')
const mockStore = {
updateInputs: async () => {
updateCalled = true
},
inputAssets: []
}
await verifyCloudMediaCandidates(candidates)
await verifyCloudMediaCandidates(candidates, undefined, mockStore)
expect(updateCalled).toBe(true)
expect(candidates[0].isMissing).toBe(false)
expect(mockCheckAssetHash).toHaveBeenCalledWith(existingHash, undefined)
})
it('respects abort signal before execution', async () => {
@@ -270,69 +310,221 @@ describe('verifyCloudMediaCandidates', () => {
controller.abort()
const candidates = [
makeCandidate('1', 'abc123.png', { isMissing: undefined })
makeCandidate('1', missingHash, { isMissing: undefined })
]
await verifyCloudMediaCandidates(candidates, controller.signal)
expect(candidates[0].isMissing).toBeUndefined()
expect(mockCheckAssetHash).not.toHaveBeenCalled()
})
it('respects abort signal after updateInputs', async () => {
it('respects abort signal after hash verification', async () => {
const controller = new AbortController()
const candidates = [makeCandidate('1', 'abc.png', { isMissing: undefined })]
const candidates = [
makeCandidate('1', existingHash, { isMissing: undefined })
]
const checkAssetHash = vi.fn(async () => {
controller.abort()
return 'exists' as const
})
const mockStore = {
updateInputs: async () => {
controller.abort()
},
inputAssets: [{ asset_hash: 'abc.png', name: 'photo.png' }]
}
await verifyCloudMediaCandidates(candidates, controller.signal, mockStore)
await verifyCloudMediaCandidates(
candidates,
controller.signal,
checkAssetHash
)
expect(candidates[0].isMissing).toBeUndefined()
})
it('skips candidates already resolved as true', async () => {
const candidates = [makeCandidate('1', 'abc.png', { isMissing: true })]
const candidates = [makeCandidate('1', missingHash, { isMissing: true })]
const mockStore = {
updateInputs: async () => {},
inputAssets: []
}
await verifyCloudMediaCandidates(candidates, undefined, mockStore)
await verifyCloudMediaCandidates(candidates)
expect(candidates[0].isMissing).toBe(true)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
})
it('skips candidates already resolved as false', async () => {
const candidates = [makeCandidate('1', 'abc.png', { isMissing: false })]
const candidates = [makeCandidate('1', existingHash, { isMissing: false })]
const mockStore = {
updateInputs: async () => {},
inputAssets: []
}
await verifyCloudMediaCandidates(candidates, undefined, mockStore)
await verifyCloudMediaCandidates(candidates)
expect(candidates[0].isMissing).toBe(false)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
})
it('skips entirely when no pending candidates', async () => {
let updateCalled = false
const candidates = [makeCandidate('1', 'abc.png', { isMissing: true })]
const candidates = [makeCandidate('1', missingHash, { isMissing: true })]
const mockStore = {
updateInputs: async () => {
updateCalled = true
},
inputAssets: []
}
await verifyCloudMediaCandidates(candidates)
await verifyCloudMediaCandidates(candidates, undefined, mockStore)
expect(updateCalled).toBe(false)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
})
it('falls back to input assets for non-blake3 candidate names', async () => {
const candidates = [
makeCandidate('1', 'photo.png', { isMissing: undefined }),
makeCandidate('2', 'missing.png', { isMissing: undefined })
]
const fetchInputAssets = vi.fn(async () => [
makeAsset('stored-photo.png', 'photo.png')
])
await verifyCloudMediaCandidates(
candidates,
undefined,
undefined,
fetchInputAssets
)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
expect(fetchInputAssets).toHaveBeenCalledOnce()
expect(candidates[0].isMissing).toBe(false)
expect(candidates[1].isMissing).toBe(true)
})
it('uses public input assets for default legacy fallback', async () => {
const candidates = [
makeCandidate('1', 'public-photo.png', { isMissing: undefined })
]
const inputAssets = Array.from({ length: 500 }, (_, index) =>
makeAsset(`asset-${index}.png`)
)
inputAssets[42] = makeAsset('public-asset-record', 'public-photo.png')
mockGetInputAssetsIncludingPublic.mockResolvedValue(inputAssets)
await verifyCloudMediaCandidates(candidates)
expect(mockGetInputAssetsIncludingPublic).toHaveBeenCalledWith(undefined)
expect(candidates[0].isMissing).toBe(false)
})
it('silences aborts while loading legacy fallback input assets', async () => {
const abortError = new Error('aborted')
abortError.name = 'AbortError'
const controller = new AbortController()
const candidates = [
makeCandidate('1', 'photo.png', { isMissing: undefined })
]
const fetchInputAssets = vi.fn(async () => {
controller.abort()
throw abortError
})
await expect(
verifyCloudMediaCandidates(
candidates,
controller.signal,
undefined,
fetchInputAssets
)
).resolves.toBeUndefined()
expect(candidates[0].isMissing).toBeUndefined()
})
it('silences aborts from the default legacy fallback input asset store path', async () => {
const abortError = new Error('aborted')
abortError.name = 'AbortError'
const controller = new AbortController()
const candidates = [
makeCandidate('1', 'photo.png', { isMissing: undefined })
]
mockGetInputAssetsIncludingPublic.mockImplementationOnce(async () => {
controller.abort()
throw abortError
})
await expect(
verifyCloudMediaCandidates(candidates, controller.signal)
).resolves.toBeUndefined()
expect(mockGetInputAssetsIncludingPublic).toHaveBeenCalledWith(
controller.signal
)
expect(candidates[0].isMissing).toBeUndefined()
})
it('falls back to input assets when the hash endpoint returns 400', async () => {
const candidates = [
makeCandidate('1', existingHash, { isMissing: undefined })
]
mockCheckAssetHash.mockResolvedValue('invalid')
const fetchInputAssets = vi.fn(async () => [
makeAsset('photo.png', existingHash)
])
await verifyCloudMediaCandidates(
candidates,
undefined,
undefined,
fetchInputAssets
)
expect(mockCheckAssetHash).toHaveBeenCalledWith(existingHash, undefined)
expect(fetchInputAssets).toHaveBeenCalledOnce()
expect(candidates[0].isMissing).toBe(false)
})
it('falls back to input assets when hash verification fails', async () => {
const warn = vi.spyOn(console, 'warn').mockImplementation(() => {})
const candidates = [
makeCandidate('1', existingHash, { isMissing: undefined })
]
const checkAssetHash = vi.fn(async () => {
throw new Error('network failed')
})
const fetchInputAssets = vi.fn(async () => [
makeAsset('photo.png', existingHash)
])
await verifyCloudMediaCandidates(
candidates,
undefined,
checkAssetHash,
fetchInputAssets
)
expect(fetchInputAssets).toHaveBeenCalledOnce()
expect(candidates[0].isMissing).toBe(false)
expect(warn).toHaveBeenCalledOnce()
warn.mockRestore()
})
it('does not call the hash endpoint for malformed blake3-looking values', async () => {
const malformedHash = 'blake3:abc'
const candidates = [
makeCandidate('1', malformedHash, { isMissing: undefined })
]
const fetchInputAssets = vi.fn(async () => [
makeAsset('legacy.png', malformedHash)
])
await verifyCloudMediaCandidates(
candidates,
undefined,
undefined,
fetchInputAssets
)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
expect(fetchInputAssets).toHaveBeenCalledOnce()
expect(candidates[0].isMissing).toBe(false)
})
it('deduplicates checks for repeated candidate names', async () => {
const candidates = [
makeCandidate('1', missingHash, { isMissing: undefined }),
makeCandidate('2', missingHash, { isMissing: undefined })
]
await verifyCloudMediaCandidates(candidates)
expect(mockCheckAssetHash).toHaveBeenCalledOnce()
expect(candidates[0].isMissing).toBe(true)
expect(candidates[1].isMissing).toBe(true)
})
})

View File

@@ -18,6 +18,12 @@ import {
} from '@/utils/graphTraversalUtil'
import { LGraphEventMode } from '@/lib/litegraph/src/types/globalEnums'
import { resolveComboValues } from '@/utils/litegraphUtil'
import type { AssetItem } from '@/platform/assets/schemas/assetSchema'
import type { AssetHashStatus } from '@/platform/assets/services/assetService'
import {
assetService,
isBlake3AssetHash
} from '@/platform/assets/services/assetService'
/** Map of node types to their media widget name and media type. */
const MEDIA_NODE_WIDGETS: Record<
@@ -106,41 +112,130 @@ export function scanNodeMediaCandidates(
return candidates
}
interface InputVerifier {
updateInputs: () => Promise<unknown>
inputAssets: Array<{ asset_hash?: string | null; name: string }>
type AssetHashVerifier = (
assetHash: string,
signal?: AbortSignal
) => Promise<AssetHashStatus>
type InputAssetFetcher = (signal?: AbortSignal) => Promise<AssetItem[]>
function groupCandidatesForHashLookup(candidates: MissingMediaCandidate[]): {
candidatesByHash: Map<string, MissingMediaCandidate[]>
legacyCandidates: MissingMediaCandidate[]
} {
const candidatesByHash = new Map<string, MissingMediaCandidate[]>()
const legacyCandidates: MissingMediaCandidate[] = []
for (const candidate of candidates) {
if (!isBlake3AssetHash(candidate.name)) {
legacyCandidates.push(candidate)
continue
}
const hashCandidates = candidatesByHash.get(candidate.name)
if (hashCandidates) hashCandidates.push(candidate)
else candidatesByHash.set(candidate.name, [candidate])
}
return { candidatesByHash, legacyCandidates }
}
async function verifyCandidatesByHash(
candidatesByHash: Map<string, MissingMediaCandidate[]>,
legacyCandidates: MissingMediaCandidate[],
signal: AbortSignal | undefined,
checkAssetHash: AssetHashVerifier
): Promise<void> {
await Promise.all(
Array.from(candidatesByHash, async ([assetHash, hashCandidates]) => {
if (signal?.aborted) return
let status: AssetHashStatus
try {
status = await checkAssetHash(assetHash, signal)
if (signal?.aborted) return
} catch (err) {
if (signal?.aborted || isAbortError(err)) return
console.warn(
'[Missing Media Pipeline] Failed to verify asset hash:',
err
)
legacyCandidates.push(...hashCandidates)
return
}
if (status === 'invalid') {
legacyCandidates.push(...hashCandidates)
return
}
for (const candidate of hashCandidates) {
candidate.isMissing = status === 'missing'
}
})
)
}
/**
* Verify cloud media candidates against the input assets fetched from the
* assets store. Mutates candidates' `isMissing` in place.
* Verify cloud media candidates by probing the asset hash endpoint first.
* Invalid hash values fall back to the legacy input asset list check.
*/
export async function verifyCloudMediaCandidates(
candidates: MissingMediaCandidate[],
signal?: AbortSignal,
assetsStore?: InputVerifier
checkAssetHash: AssetHashVerifier = assetService.checkAssetHash,
fetchInputAssets: InputAssetFetcher = fetchMissingInputAssets
): Promise<void> {
if (signal?.aborted) return
const pending = candidates.filter((c) => c.isMissing === undefined)
if (pending.length === 0) return
const store =
assetsStore ?? (await import('@/stores/assetsStore')).useAssetsStore()
const { candidatesByHash, legacyCandidates } =
groupCandidatesForHashLookup(pending)
await verifyCandidatesByHash(
candidatesByHash,
legacyCandidates,
signal,
checkAssetHash
)
await store.updateInputs()
if (signal?.aborted || legacyCandidates.length === 0) return
let inputAssets: AssetItem[]
try {
inputAssets = await fetchInputAssets(signal)
} catch (err) {
if (signal?.aborted || isAbortError(err)) return
throw err
}
if (signal?.aborted) return
const assetHashes = new Set(
store.inputAssets.map((a) => a.asset_hash).filter((h): h is string => !!h)
inputAssets.map((a) => a.asset_hash).filter((h): h is string => !!h)
)
for (const c of pending) {
c.isMissing = !assetHashes.has(c.name)
for (const candidate of legacyCandidates) {
candidate.isMissing = !assetHashes.has(candidate.name)
}
}
async function fetchMissingInputAssets(
signal?: AbortSignal
): Promise<AssetItem[]> {
return await assetService.getInputAssetsIncludingPublic(signal)
}
function isAbortError(err: unknown): boolean {
return (
typeof err === 'object' &&
err !== null &&
'name' in err &&
err.name === 'AbortError'
)
}
/** Group confirmed-missing candidates by file name into view models. */
export function groupCandidatesByName(
candidates: MissingMediaCandidate[]

View File

@@ -19,6 +19,11 @@ import activeSubgraphUnmatchedModel from '@/platform/missingModel/__fixtures__/a
import bypassedSubgraphUnmatchedModel from '@/platform/missingModel/__fixtures__/bypassedSubgraphUnmatchedModel.json' with { type: 'json' }
import type { MissingModelCandidate } from '@/platform/missingModel/types'
import type { ComfyWorkflowJSON } from '@/platform/workflow/validation/schemas/workflowSchema'
import type * as AssetServiceModule from '@/platform/assets/services/assetService'
const { mockCheckAssetHash } = vi.hoisted(() => ({
mockCheckAssetHash: vi.fn()
}))
vi.mock('@/utils/graphTraversalUtil', () => ({
collectAllNodes: (graph: { _testNodes: LGraphNode[] }) => graph._testNodes,
@@ -28,6 +33,20 @@ vi.mock('@/utils/graphTraversalUtil', () => ({
) => node._testExecutionId ?? String(node.id)
}))
vi.mock('@/platform/assets/services/assetService', async () => {
const actual = await vi.importActual<typeof AssetServiceModule>(
'@/platform/assets/services/assetService'
)
return {
...actual,
assetService: {
...actual.assetService,
checkAssetHash: mockCheckAssetHash
}
}
})
/** Helper: create a combo widget mock */
function makeComboWidget(
name: string,
@@ -43,7 +62,7 @@ function makeComboWidget(
}
/** Helper: create an asset widget mock (Cloud combo replacement) */
function makeAssetWidget(name: string, value: string): IBaseWidget {
function makeAssetWidget(name: string, value: unknown): IBaseWidget {
return fromAny<IBaseWidget, unknown>({
type: 'asset',
name,
@@ -551,6 +570,16 @@ describe('scanAllModelCandidates', () => {
expect(result).toEqual([])
})
it('should skip asset widgets with non-string values', () => {
const graph = makeGraph([
makeNode(1, 'SomeNode', [makeAssetWidget('ckpt_name', 123)])
])
const result = scanAllModelCandidates(graph, noAssetSupport)
expect(result).toEqual([])
})
it('should scan both combo and asset widgets on the same node', () => {
const graph = makeGraph([
makeNode(1, 'DualLoaderNode', [
@@ -1411,6 +1440,7 @@ function makeAssetCandidate(
describe('verifyAssetSupportedCandidates', () => {
beforeEach(() => {
vi.clearAllMocks()
mockCheckAssetHash.mockResolvedValue('missing')
mockIsModelLoading.mockReturnValue(false)
mockHasMore.mockReturnValue(false)
mockGetAssets.mockReturnValue([])
@@ -1428,6 +1458,125 @@ describe('verifyAssetSupportedCandidates', () => {
)
})
it('should resolve isMissing=false when the blake3 hash endpoint finds the asset', async () => {
const hash =
'1111111111111111111111111111111111111111111111111111111111111111'
const candidates = [
makeAssetCandidate('model.safetensors', {
hash,
hashType: 'blake3'
})
]
mockCheckAssetHash.mockResolvedValue('exists')
await verifyAssetSupportedCandidates(candidates)
expect(candidates[0].isMissing).toBe(false)
expect(mockCheckAssetHash).toHaveBeenCalledWith(`blake3:${hash}`, undefined)
expect(mockUpdateModelsForNodeType).not.toHaveBeenCalled()
})
it('should fall back to asset store matching when the blake3 hash is not found', async () => {
const hash =
'2222222222222222222222222222222222222222222222222222222222222222'
const candidates = [
makeAssetCandidate('my_model.safetensors', {
hash,
hashType: 'blake3'
})
]
mockCheckAssetHash.mockResolvedValue('missing')
mockGetAssets.mockReturnValue([
{
id: '1',
name: 'my_model.safetensors',
asset_hash: null,
metadata: { filename: 'my_model.safetensors' }
}
])
await verifyAssetSupportedCandidates(candidates)
expect(candidates[0].isMissing).toBe(false)
expect(mockUpdateModelsForNodeType).toHaveBeenCalledWith(
'CheckpointLoaderSimple'
)
})
it('should fall back to asset store matching when hash verification fails', async () => {
const warn = vi.spyOn(console, 'warn').mockImplementation(() => {})
const hash =
'3333333333333333333333333333333333333333333333333333333333333333'
const candidates = [
makeAssetCandidate('my_model.safetensors', {
hash,
hashType: 'blake3'
})
]
mockCheckAssetHash.mockRejectedValue(new Error('network failed'))
mockGetAssets.mockReturnValue([
{
id: '1',
name: 'my_model.safetensors',
asset_hash: null,
metadata: { filename: 'my_model.safetensors' }
}
])
await verifyAssetSupportedCandidates(candidates)
expect(candidates[0].isMissing).toBe(false)
expect(mockUpdateModelsForNodeType).toHaveBeenCalledWith(
'CheckpointLoaderSimple'
)
expect(warn).toHaveBeenCalledOnce()
warn.mockRestore()
})
it('should skip malformed blake3 hashes and use asset store matching', async () => {
const candidates = [
makeAssetCandidate('my_model.safetensors', {
hash: 'abc123',
hashType: 'blake3'
})
]
mockGetAssets.mockReturnValue([
{
id: '1',
name: 'my_model.safetensors',
asset_hash: null,
metadata: { filename: 'my_model.safetensors' }
}
])
await verifyAssetSupportedCandidates(candidates)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
expect(candidates[0].isMissing).toBe(false)
})
it('should not warn or fall back when hash verification is aborted', async () => {
const warn = vi.spyOn(console, 'warn').mockImplementation(() => {})
const abortError = new Error('aborted')
abortError.name = 'AbortError'
const hash =
'4444444444444444444444444444444444444444444444444444444444444444'
const candidates = [
makeAssetCandidate('my_model.safetensors', {
hash,
hashType: 'blake3'
})
]
mockCheckAssetHash.mockRejectedValue(abortError)
await verifyAssetSupportedCandidates(candidates)
expect(candidates[0].isMissing).toBeUndefined()
expect(mockUpdateModelsForNodeType).not.toHaveBeenCalled()
expect(warn).not.toHaveBeenCalled()
warn.mockRestore()
})
it('should resolve isMissing=false when asset with matching hash exists', async () => {
const candidates = [
makeAssetCandidate('model.safetensors', {
@@ -1442,6 +1591,7 @@ describe('verifyAssetSupportedCandidates', () => {
await verifyAssetSupportedCandidates(candidates)
expect(candidates[0].isMissing).toBe(false)
expect(mockCheckAssetHash).not.toHaveBeenCalled()
})
it('should resolve isMissing=false when asset with matching filename exists', async () => {

View File

@@ -24,6 +24,11 @@ import {
} from '@/utils/graphTraversalUtil'
import { LGraphEventMode } from '@/lib/litegraph/src/types/globalEnums'
import { resolveComboValues } from '@/utils/litegraphUtil'
import type { AssetHashStatus } from '@/platform/assets/services/assetService'
import {
assetService,
toBlake3AssetHash
} from '@/platform/assets/services/assetService'
export type MissingModelWorkflowData = FlattenableWorkflowGraph & {
models?: ModelFile[]
@@ -177,7 +182,7 @@ function scanAssetWidget(
getDirectory: ((nodeType: string) => string | undefined) | undefined
): MissingModelCandidate | null {
const value = widget.value
if (!value.trim()) return null
if (typeof value !== 'string' || !value.trim()) return null
if (!isModelFileName(value)) return null
return {
@@ -445,20 +450,68 @@ interface AssetVerifier {
getAssets: (nodeType: string) => AssetItem[] | undefined
}
type AssetHashVerifier = (
assetHash: string,
signal?: AbortSignal
) => Promise<AssetHashStatus>
export async function verifyAssetSupportedCandidates(
candidates: MissingModelCandidate[],
signal?: AbortSignal,
assetsStore?: AssetVerifier
assetsStore?: AssetVerifier,
checkAssetHash: AssetHashVerifier = assetService.checkAssetHash
): Promise<void> {
if (signal?.aborted) return
const pendingCandidates = candidates.filter(
(c) => c.isAssetSupported && c.isMissing === undefined
)
if (pendingCandidates.length === 0) return
const pendingNodeTypes = new Set<string>()
for (const c of candidates) {
if (c.isAssetSupported && c.isMissing === undefined) {
pendingNodeTypes.add(c.nodeType)
const candidatesByHash = new Map<string, MissingModelCandidate[]>()
for (const candidate of pendingCandidates) {
const assetHash = getBlake3AssetHash(candidate)
if (!assetHash) {
pendingNodeTypes.add(candidate.nodeType)
continue
}
const hashCandidates = candidatesByHash.get(assetHash)
if (hashCandidates) hashCandidates.push(candidate)
else candidatesByHash.set(assetHash, [candidate])
}
await Promise.all(
Array.from(candidatesByHash, async ([assetHash, hashCandidates]) => {
if (signal?.aborted) return
try {
const status = await checkAssetHash(assetHash, signal)
if (signal?.aborted) return
if (status === 'exists') {
for (const candidate of hashCandidates) {
candidate.isMissing = false
}
return
}
} catch (err) {
if (signal?.aborted || isAbortError(err)) return
console.warn(
'[Missing Model Pipeline] Failed to verify asset hash:',
err
)
}
for (const candidate of hashCandidates) {
pendingNodeTypes.add(candidate.nodeType)
}
})
)
if (signal?.aborted) return
if (pendingNodeTypes.size === 0) return
const store =
@@ -491,6 +544,20 @@ export async function verifyAssetSupportedCandidates(
}
}
function getBlake3AssetHash(candidate: MissingModelCandidate): string | null {
if (candidate.hashType?.toLowerCase() !== 'blake3') return null
return toBlake3AssetHash(candidate.hash)
}
function isAbortError(err: unknown): boolean {
return (
typeof err === 'object' &&
err !== null &&
'name' in err &&
err.name === 'AbortError'
)
}
function normalizePath(path: string): string {
return path.replace(/\\/g, '/')
}

View File

@@ -1,7 +1,9 @@
import { createTestingPinia } from '@pinia/testing'
import { setActivePinia } from 'pinia'
import { beforeEach, describe, expect, it, vi } from 'vitest'
import { nextTick } from 'vue'
import type { LGraphCanvas } from '@/lib/litegraph/src/litegraph'
import { useCanvasStore } from '@/renderer/core/canvas/canvasStore'
vi.mock('@/composables/useAppMode', () => ({
@@ -35,6 +37,30 @@ vi.mock('@/scripts/app', () => ({
}
}))
vi.mock('@vueuse/core', async (importOriginal) => {
const actual = await importOriginal()
return {
...(actual as Record<string, unknown>),
useEventListener: vi.fn(
(
target: EventTarget,
event: string,
handler: EventListenerOrEventListenerObject
) => {
target.addEventListener(event, handler)
return () => target.removeEventListener(event, handler)
}
)
}
})
function createMockCanvas(readOnly = false): LGraphCanvas {
return {
read_only: readOnly,
canvas: document.createElement('canvas')
} as unknown as LGraphCanvas
}
describe('useCanvasStore', () => {
let store: ReturnType<typeof useCanvasStore>
@@ -84,4 +110,42 @@ describe('useCanvasStore', () => {
expect(originalHandler).toHaveBeenCalledWith(2.0, app.canvas.ds.offset)
})
})
describe('isReadOnly', () => {
it('syncs initial read_only value when canvas is set', async () => {
const mockCanvas = createMockCanvas(true)
store.canvas = mockCanvas as unknown as LGraphCanvas
await nextTick()
expect(store.isReadOnly).toBe(true)
})
it('updates isReadOnly when litegraph:read-only-changed event fires', async () => {
const mockCanvas = createMockCanvas(false)
store.canvas = mockCanvas as unknown as LGraphCanvas
await nextTick()
expect(store.isReadOnly).toBe(false)
// Simulate space key press → LGraphCanvas sets read_only = true
mockCanvas.canvas.dispatchEvent(
new CustomEvent('litegraph:read-only-changed', {
detail: { readOnly: true }
})
)
expect(store.isReadOnly).toBe(true)
// Simulate space key release
mockCanvas.canvas.dispatchEvent(
new CustomEvent('litegraph:read-only-changed', {
detail: { readOnly: false }
})
)
expect(store.isReadOnly).toBe(false)
})
})
})

View File

@@ -56,6 +56,7 @@ export const useCanvasStore = defineStore('canvas', () => {
setMode(val ? 'app' : 'graph')
}
})
const isReadOnly = ref(false)
// Set up scale synchronization when canvas is available
let originalOnChanged: ((scale: number, offset: Point) => void) | undefined =
@@ -131,6 +132,16 @@ export const useCanvasStore = defineStore('canvas', () => {
whenever(
() => canvas.value,
(newCanvas) => {
isReadOnly.value = newCanvas.read_only
useEventListener(
newCanvas.canvas,
'litegraph:read-only-changed',
(event: CustomEvent<{ readOnly: boolean }>) => {
isReadOnly.value = event.detail.readOnly
}
)
useEventListener(
newCanvas.canvas,
'litegraph:set-graph',
@@ -176,6 +187,7 @@ export const useCanvasStore = defineStore('canvas', () => {
rerouteSelected,
appScalePercentage,
linearMode,
isReadOnly,
updateSelectedItems,
getCanvas,
setAppZoomFromPercentage,

View File

@@ -12,7 +12,8 @@ vi.mock('@/renderer/core/canvas/canvasStore', () => {
return {
useCanvasStore: vi.fn(() => ({
getCanvas,
setCursorStyle
setCursorStyle,
isReadOnly: false
}))
}
})

View File

@@ -22,9 +22,7 @@ export function useCanvasInteractions() {
* Whether Vue node components should handle pointer events.
* Returns false when canvas is in read-only/panning mode (e.g., space key held for panning).
*/
const shouldHandleNodePointerEvents = computed(
() => !(canvasStore.canvas?.read_only ?? false)
)
const shouldHandleNodePointerEvents = computed(() => !canvasStore.isReadOnly)
/**
* Returns true if the wheel event target is inside an element that should

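Per the commit message, this computed drives a CSS `pointer-events-auto`/`pointer-events-none` toggle on LGraphNode.vue and NodeWidgets.vue. A dependency-free sketch of that mapping (the function name is hypothetical; the real components bind the classes directly in their templates):

```typescript
// Illustrative helper: map the store-derived flag to the class toggle the
// commit message describes. Class names come from the diff context; the
// helper itself is an assumption.
function nodePointerEventsClass(shouldHandle: boolean): string {
  // When read-only (space-pan) mode is active, disable pointer events on the
  // Vue node DOM so events fall through to the LiteGraph canvas beneath.
  return shouldHandle ? 'pointer-events-auto' : 'pointer-events-none'
}
```

Because the computed now reads the reactive `isReadOnly` ref instead of the plain `canvas.read_only` property, the class binding re-evaluates when the space key toggles panning mode — which is the substance of the fix for #7806.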
View File

@@ -24,7 +24,9 @@ vi.mock('@/scripts/api', () => ({
vi.mock('@/platform/assets/services/assetService', () => ({
assetService: {
getAssetsByTag: vi.fn(),
getAllAssetsByTag: vi.fn(),
getAssetsForNodeType: vi.fn(),
invalidateInputAssetsIncludingPublic: vi.fn(),
updateAsset: vi.fn(),
addAssetTags: vi.fn(),
removeAssetTags: vi.fn()
@@ -1259,6 +1261,9 @@ describe('assetsStore - Deletion State and Input Mapping', () => {
false,
{ limit: 100 }
)
expect(
assetService.invalidateInputAssetsIncludingPublic
).toHaveBeenCalledOnce()
} finally {
mockIsCloud.value = false
}

View File

@@ -123,7 +123,7 @@ export const useAssetsStore = defineStore('assets', () => {
state: inputAssets,
isLoading: inputLoading,
error: inputError,
execute: updateInputs
execute: executeUpdateInputs
} = useAsyncState(fetchInputFiles, [], {
immediate: false,
resetOnExecute: false,
@@ -132,6 +132,12 @@ export const useAssetsStore = defineStore('assets', () => {
}
})
const updateInputs = async () => {
const result = await executeUpdateInputs()
assetService.invalidateInputAssetsIncludingPublic()
return result
}
/**
* Fetch history assets with pagination support
* @param loadMore - true for pagination (append), false for initial load (replace)