From 11432f7d0ef86748527cdc6520b025aefda4823c Mon Sep 17 00:00:00 2001
From: jaeone94 <89377375+jaeone94@users.noreply.github.com>
Date: Fri, 1 May 2026 09:50:51 +0900
Subject: [PATCH] refactor: extract missing model refresh pipeline (#11751)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

## Summary

Extracts the missing-model pipeline orchestration out of `ComfyApp` and into an app-independent platform module, while tightening the workflow-flattening type boundary that refresh needs when rescanning the live LiteGraph graph.

This PR is intentionally refactor-heavy. It is the follow-up to the earlier missing-model refresh work: instead of keeping refresh-specific candidate recheck logic beside the UI, this change makes the refresh path reuse the existing missing-model pipeline and removes the direct dependency on private `ComfyApp` pipeline methods.

Linear: FE-499

Issues covered by this PR:
- Fixes #11678
- Fixes #11680
- Partially addresses #11679 by removing the missing-model refresh path's unsafe `graph.serialize() as unknown as ComfyWorkflowJSON` cast and replacing it with the narrower flattenable workflow contract. Broader workflow serialization/type-boundary cleanup outside this missing-model refresh path remains deferred.

## Changes

- **What**:
  - Added `src/platform/missingModel/missingModelPipeline.ts` as the orchestration module for missing-model detection/verification.
    - `runMissingModelPipeline(...)` now owns the pipeline previously embedded in `ComfyApp`:
      - candidate scan and enrichment
      - active ancestor filtering for muted/bypassed subgraph containers
      - pending warning cache updates
      - OSS folder path and file-size follow-up work
      - cloud asset verification follow-up work
      - surfaced missing-model errors via the existing execution error store
    - `refreshMissingModelPipeline(...)` handles the refresh-specific flow:
      - calls the injected `reloadNodeDefs()` first
      - serializes the current live graph
      - preserves model metadata by preferring active workflow `models`, then falling back to current missing-model candidate metadata
      - delegates back into the same pipeline used during workflow load
  - Kept `ComfyApp` as the compatibility caller instead of the owner of the pipeline.
    - `loadGraphData(...)` now calls `runMissingModelPipeline(...)` with `graph`, `graphData`, `missingNodeTypes`, and `silent` options.
    - `refreshMissingModels(...)` is now a thin wrapper around `refreshMissingModelPipeline(...)` and keeps the existing default `silent: true` refresh behavior.
    - The new pipeline module does not import `@/scripts/app`; app-owned data/actions are passed in as inputs.
  - Moved the workflow node-flattening helpers out of `workflowSchema.ts` and into `src/platform/workflow/core/utils/workflowFlattening.ts`.
    - This includes `flattenWorkflowNodes`, `buildSubgraphExecutionPaths`, and `isSubgraphDefinition`.
    - The move is intentional: these helpers are not zod schema definitions or workflow validation logic. They are core workflow traversal utilities used to flatten root workflow nodes plus nested subgraph definition nodes into the execution-shaped node list needed by missing-model scanning.
  - The refresh path receives data from `LGraph.serialize()`, whose return type is serialized LiteGraph data rather than validated `ComfyWorkflowJSON`. Previously this forced unsafe typing like `graph.serialize() as unknown as ComfyWorkflowJSON`.
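The reload-before-scan ordering of the refresh flow can be sketched as follows. Every name here (`RefreshOptions`, `refreshSketch`, `scanGraph`, `serializeGraph`) is an illustrative stand-in with simplified signatures, not the module's real API:

```typescript
// Minimal sketch of the refresh ordering: the injected reloadNodeDefs()
// must settle before the live graph is serialized and rescanned.
// All names here are hypothetical stand-ins for illustration.
type RefreshOptions = {
  serializeGraph: () => { nodes: unknown[] }
  reloadNodeDefs: () => Promise<void>
  scanGraph: (graph: { nodes: unknown[] }) => string[]
}

async function refreshSketch(opts: RefreshOptions): Promise<string[]> {
  // 1. Refresh node definitions first so the rescan sees current defs.
  await opts.reloadNodeDefs()
  // 2. Only then serialize and rescan the current live graph.
  return opts.scanGraph(opts.serializeGraph())
}

// Record call order to show the reload-before-scan contract.
const order: string[] = []
const result = await refreshSketch({
  serializeGraph: () => ({ nodes: [] }),
  reloadNodeDefs: async () => {
    order.push('reload')
  },
  scanGraph: () => {
    order.push('scan')
    return []
  }
})
```

If `reloadNodeDefs()` rejects, the sketch (like the described pipeline) never reaches the scan step, since the rejection propagates out of the `await`.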
  - The new `FlattenableWorkflowGraph` / `FlattenableWorkflowNode` structural contract describes only what flattening actually needs: `nodes`, `definitions.subgraphs`, node `id`, `type`, `mode`, `widgets_values`, and `properties`.
    - This lets both normal workflow-load data (`ComfyWorkflowJSON`) and refresh-time live graph serialization (`LGraph.serialize()`) flow into the same scan/enrichment path without pretending serialized LiteGraph output is a fully validated workflow schema document.
  - Updated `missingModelScan.ts` to consume that minimal flattenable workflow shape via `MissingModelWorkflowData`.
    - `MissingModelWorkflowData` extends the flattenable workflow contract with optional workflow-level `models` metadata.
    - Removed now-unnecessary casts around execution IDs, flattened nodes, and `widgets_values` object access.
    - Updated `getSelectedModelsMetadata(...)` to accept readonly widget value arrays so flattened workflow data can stay read-only.
  - Reduced the exported surface of the new pipeline module after `knip` flagged unused exported internal option/store interfaces.
  - Kept `workflowSchema.ts` focused on validation schemas. The flattening helpers are not re-exported from the schema module because they are internal workflow core utilities, not public schema API.
- **Breaking**: None intended.
  - Internal imports were updated to the new core utility path.
  - This repo is not exposing these flattening helpers as a public package API, so the old schema-local helper location is treated as an internal implementation detail.
- **Dependencies**: None.

## Review Focus

- **Pipeline extraction / dependency direction**:
  - Please verify that `missingModelPipeline.ts` stays independent from `@/scripts/app`.
  - `ComfyApp` should remain the caller/adapter, not the owner of missing-model pipeline orchestration.
- **Workflow flattening type boundary**:
  - The main type-cleanup goal is removing the refresh-time `graph.serialize() as unknown as ComfyWorkflowJSON` lie.
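The structural contract can be sketched like this. Only the field names come from this PR; the exact optionality and value types of each field are assumptions for illustration:

```typescript
// Structural sketch of the flattenable workflow contract: only what
// flattening needs, so both validated workflow JSON and raw
// LGraph.serialize() output can satisfy it without casts.
// Field optionality and value types are assumptions, not the real source.
interface FlattenableWorkflowNode {
  id: string | number
  type?: string | null
  mode?: number
  widgets_values?: unknown
  properties?: Record<string, unknown>
}

interface FlattenableWorkflowGraph {
  nodes?: readonly FlattenableWorkflowNode[]
  definitions?: {
    subgraphs?: readonly { nodes?: readonly FlattenableWorkflowNode[] }[]
  }
}

// Both a "validated workflow"-shaped object and a "live serialization"-
// shaped object fit the same structural type.
function countRootNodes(graph: FlattenableWorkflowGraph): number {
  return graph.nodes?.length ?? 0
}

const fromWorkflowJson: FlattenableWorkflowGraph = {
  nodes: [{ id: 1, type: 'CheckpointLoaderSimple', mode: 0 }]
}
const fromLiveSerialize: FlattenableWorkflowGraph = {
  nodes: [{ id: '1:2', type: 'CheckpointLoaderSimple' }],
  definitions: { subgraphs: [{ nodes: [] }] }
}
const counts = [countRootNodes(fromWorkflowJson), countRootNodes(fromLiveSerialize)]
```

Because the contract is structural rather than nominal, neither caller needs an `as unknown as` cast; TypeScript checks only that the required fields are present.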
  - `LGraph.serialize()` and validated workflow JSON are not the same contract. The new flattenable workflow contract is deliberately smaller and structural because the missing-model enrichment path only needs enough data to flatten nodes and read embedded model metadata.
  - This is why the flattening helpers moved from `workflowSchema.ts` to `workflow/core/utils`: the logic is reusable workflow traversal, not validation schema.
- **Behavior preservation**:
  - The PR is intended to preserve existing user-facing missing-model behavior while moving ownership out of `app.ts`.
  - Existing async follow-up behavior remains intentionally fire-and-forget:
    - cloud asset verification still surfaces after verification completes
    - OSS folder paths still update asynchronously before surfacing confirmed missing models
    - file-size metadata fetching remains asynchronous
  - More invasive behavior changes, such as adding non-cloud post-fetch `isMissingCandidateActive(...)` re-verification or redesigning the fire-and-forget result contract, are intentionally left for follow-up work because they are not pure extraction.
- **Downloadable model metadata**:
  - `missingModels` returned for download metadata now requires both `url` and `directory`.
  - Candidates without a directory still remain in `confirmedCandidates`, but they are not exposed as downloadable model metadata. This keeps the returned downloadable list aligned with what the download flow can actually use.
- **Test ownership**:
  - Complex missing-model pipeline behavior tests moved out of `src/scripts/app.test.ts` and into `src/platform/missingModel/missingModelPipeline.test.ts`.
  - `app.test.ts` now only covers thin delegation for `app.refreshMissingModels(...)`.
  - Workflow flattening tests moved with the helper from schema tests into `src/platform/workflow/core/utils/workflowFlattening.test.ts`.
- **Deferred follow-ups**:
  - Broader function decomposition for cognitive complexity.
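The downloadable-metadata rule (a candidate needs both `url` and `directory` to become a downloadable entry, but stays confirmed either way) can be sketched as follows. The `Candidate` shape here is a simplified stand-in; the real `MissingModelCandidate` carries more fields:

```typescript
// Simplified candidate shape for illustration only.
interface Candidate {
  name: string
  isMissing: boolean
  url?: string
  directory?: string
}

// Type guard: only candidates with complete download metadata are
// narrowed to a shape the download flow can actually use.
function hasDownloadMetadata(
  c: Candidate
): c is Candidate & { url: string; directory: string } {
  return !!c.url && !!c.directory
}

const confirmed: Candidate[] = [
  {
    name: 'a.safetensors',
    isMissing: true,
    url: 'https://example.com/a',
    directory: 'checkpoints'
  },
  // No directory: stays confirmed (still surfaced as missing),
  // but is excluded from the downloadable list.
  { name: 'b.safetensors', isMissing: true, url: 'https://example.com/b' }
].filter((c) => c.isMissing)

const missingModels = confirmed.filter(hasDownloadMetadata).map((c) => ({
  name: c.name,
  url: c.url,
  directory: c.directory
}))
```

Using a type guard here means the `.map` callback sees `url` and `directory` as non-optional, which is what removes the casts on the download side.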
  - Wider dependency-injection/port cleanup for stores and services beyond the app boundary.
  - Cloud-specific pipeline unit tests, which need a separate `isCloud` mocking strategy.
  - Additional E2E coverage expansion beyond the existing OSS refresh path.
  - More general workflow serialization/type-boundary cleanup outside the missing-model refresh path.

## Validation

- `pnpm format`
- `pnpm lint`
  - Passed. Existing lint output included a pre-existing `no-misused-spread` warning and icon-name logs, but the command exited successfully.
- `pnpm typecheck`
- `pnpm test:unit`
  - `714 passed`, `9514 passed | 8 skipped`
- Pre-push `pnpm knip`
  - Passed after reducing the exported surface of the new pipeline module.

## Screenshots (if applicable)

Not applicable. This PR is a pipeline/type-boundary refactor with no UI changes.

┆Issue is synchronized with this [Notion page](https://app.notion.com/p/PR-11751-refactor-extract-missing-model-refresh-pipeline-3516d73d3650816d9245d4b1324b71c9) by [Unito](https://www.unito.io)

---------

Co-authored-by: DrJKL
Co-authored-by: Alexander Brown
---
 .../missingModel/missingModelPipeline.test.ts |  601 ++++++++++++++++++
 .../missingModel/missingModelPipeline.ts      |  260 ++++++++
 src/platform/missingModel/missingModelScan.ts |   28 +-
 .../core/utils/workflowFlattening.test.ts     |  197 ++++++
 .../workflow/core/utils/workflowFlattening.ts |  166 +++++
 .../validation/schemas/workflowSchema.test.ts |  124 +---
 .../validation/schemas/workflowSchema.ts      |  114 ----
 src/scripts/app.test.ts                       |  147 +----
 src/scripts/app.ts                            |  253 +------
 src/workbench/utils/modelMetadataUtil.ts      |    2 +-
 10 files changed, 1303 insertions(+), 589 deletions(-)
 create mode 100644 src/platform/missingModel/missingModelPipeline.test.ts
 create mode 100644 src/platform/missingModel/missingModelPipeline.ts
 create mode 100644 src/platform/workflow/core/utils/workflowFlattening.test.ts
 create mode 100644 src/platform/workflow/core/utils/workflowFlattening.ts
diff --git
a/src/platform/missingModel/missingModelPipeline.test.ts b/src/platform/missingModel/missingModelPipeline.test.ts new file mode 100644 index 0000000000..b6c1b1cda3 --- /dev/null +++ b/src/platform/missingModel/missingModelPipeline.test.ts @@ -0,0 +1,601 @@ +import { beforeEach, describe, expect, it, vi } from 'vitest' + +import type { LGraph } from '@/lib/litegraph/src/litegraph' +import type { MissingModelCandidate } from '@/platform/missingModel/types' +import type { + ComfyWorkflowJSON, + ModelFile +} from '@/platform/workflow/validation/schemas/workflowSchema' +import { + refreshMissingModelPipeline, + runMissingModelPipeline +} from '@/platform/missingModel/missingModelPipeline' + +const { mockHandles } = vi.hoisted(() => { + const state = { + enrichedCandidates: [] as MissingModelCandidate[] + } + + return { + mockHandles: { + state, + missingModelStore: { + missingModelCandidates: null as MissingModelCandidate[] | null, + createVerificationAbortController: vi.fn(() => new AbortController()), + setFolderPaths: vi.fn(), + setFileSize: vi.fn() + }, + workspaceWorkflow: { + activeWorkflow: null as { + activeState?: Pick | null + pendingWarnings?: unknown + } | null + }, + executionErrorStore: { + surfaceMissingModels: vi.fn() + }, + modelStore: { + loadModelFolders: vi.fn(), + getLoadedModelFolder: vi.fn() + }, + modelToNodeStore: { + getCategoryForNodeType: vi.fn() + }, + scanAllModelCandidates: vi.fn( + ( + _graph: LGraph, + _isAssetSupported: (nodeType: string, widgetName: string) => boolean, + _getDirectory?: (nodeType: string) => string | undefined + ): MissingModelCandidate[] => [] + ), + enrichWithEmbeddedMetadata: vi.fn( + async ( + _candidates: readonly MissingModelCandidate[], + _graphData: ComfyWorkflowJSON, + _checkModelInstalled: ( + name: string, + directory: string + ) => Promise, + _isAssetSupported?: (nodeType: string, widgetName: string) => boolean + ) => state.enrichedCandidates + ), + verifyAssetSupportedCandidates: vi.fn( + async ( + 
_candidates: readonly MissingModelCandidate[], + _signal: AbortSignal + ) => undefined + ), + toastStore: { + add: vi.fn() + }, + assetService: { + shouldUseAssetBrowser: vi.fn() + }, + api: { + getFolderPaths: vi.fn() + }, + fetchModelMetadata: vi.fn(), + isAncestorPathActive: vi.fn((_graph: LGraph, _nodeId: string) => true), + isMissingCandidateActive: vi.fn( + (_graph: LGraph, _candidate: MissingModelCandidate) => true + ) + } + } +}) + +vi.mock('@/platform/distribution/types', () => ({ + isCloud: false +})) + +vi.mock('@/platform/assets/services/assetService', () => ({ + assetService: { + shouldUseAssetBrowser: (nodeType: string, widgetName: string) => + mockHandles.assetService.shouldUseAssetBrowser(nodeType, widgetName) + } +})) + +vi.mock('@/stores/workspaceStore', () => ({ + useWorkspaceStore: () => ({ + workflow: mockHandles.workspaceWorkflow + }) +})) + +vi.mock('@/stores/executionErrorStore', () => ({ + useExecutionErrorStore: () => mockHandles.executionErrorStore +})) + +vi.mock('@/stores/modelStore', () => ({ + useModelStore: () => mockHandles.modelStore +})) + +vi.mock('@/stores/modelToNodeStore', () => ({ + useModelToNodeStore: () => mockHandles.modelToNodeStore +})) + +vi.mock('@/platform/missingModel/missingModelScan', () => ({ + scanAllModelCandidates: ( + graph: LGraph, + isAssetSupported: (nodeType: string, widgetName: string) => boolean, + getDirectory?: (nodeType: string) => string | undefined + ) => + mockHandles.scanAllModelCandidates(graph, isAssetSupported, getDirectory), + enrichWithEmbeddedMetadata: ( + candidates: readonly MissingModelCandidate[], + graphData: ComfyWorkflowJSON, + checkModelInstalled: (name: string, directory: string) => Promise, + isAssetSupported?: (nodeType: string, widgetName: string) => boolean + ) => + mockHandles.enrichWithEmbeddedMetadata( + candidates, + graphData, + checkModelInstalled, + isAssetSupported + ), + verifyAssetSupportedCandidates: ( + candidates: readonly MissingModelCandidate[], + signal: 
AbortSignal + ) => mockHandles.verifyAssetSupportedCandidates(candidates, signal) +})) + +vi.mock('@/platform/updates/common/toastStore', () => ({ + useToastStore: () => mockHandles.toastStore +})) + +vi.mock('@/scripts/api', () => ({ + api: { + getFolderPaths: () => mockHandles.api.getFolderPaths() + } +})) + +vi.mock('@/platform/missingModel/missingModelDownload', () => ({ + fetchModelMetadata: (url: string) => mockHandles.fetchModelMetadata(url) +})) + +vi.mock('@/utils/graphTraversalUtil', () => ({ + isAncestorPathActive: (graph: LGraph, nodeId: string) => + mockHandles.isAncestorPathActive(graph, nodeId), + isMissingCandidateActive: (graph: LGraph, candidate: MissingModelCandidate) => + mockHandles.isMissingCandidateActive(graph, candidate) +})) + +function createWorkflowGraphData(): ComfyWorkflowJSON { + return { + last_node_id: 0, + last_link_id: 0, + nodes: [], + links: [], + groups: [], + config: {}, + extra: {}, + version: 0.4 + } +} + +function createGraph(graphData = createWorkflowGraphData()): LGraph { + return { + serialize: vi.fn(() => graphData) + } as unknown as LGraph +} + +describe('missingModelPipeline', () => { + beforeEach(() => { + vi.clearAllMocks() + mockHandles.state.enrichedCandidates = [] + mockHandles.missingModelStore.missingModelCandidates = null + mockHandles.workspaceWorkflow.activeWorkflow = null + mockHandles.missingModelStore.createVerificationAbortController.mockImplementation( + () => new AbortController() + ) + mockHandles.modelStore.loadModelFolders.mockResolvedValue(undefined) + mockHandles.modelStore.getLoadedModelFolder.mockResolvedValue(undefined) + mockHandles.modelToNodeStore.getCategoryForNodeType.mockReturnValue( + undefined + ) + mockHandles.scanAllModelCandidates.mockReturnValue([]) + mockHandles.api.getFolderPaths.mockResolvedValue({}) + mockHandles.fetchModelMetadata.mockResolvedValue({ fileSize: null }) + mockHandles.isAncestorPathActive.mockReturnValue(true) + 
mockHandles.isMissingCandidateActive.mockReturnValue(true) + }) + + describe('refreshMissingModelPipeline', () => { + it('reloads node definitions before scanning the current graph', async () => { + const order: string[] = [] + const graph = createGraph() + const reloadNodeDefs = vi.fn(async () => { + order.push('reload') + }) + mockHandles.scanAllModelCandidates.mockImplementation(() => { + order.push('scan') + return [] + }) + + await refreshMissingModelPipeline({ + graph, + reloadNodeDefs, + missingModelStore: mockHandles.missingModelStore + }) + + expect(order).toEqual(['reload', 'scan']) + }) + + it('reuses active workflow model metadata when refreshing the current graph', async () => { + const activeModels: ModelFile[] = [ + { + name: 'embedded.safetensors', + url: 'https://example.com/embedded.safetensors', + directory: 'checkpoints' + } + ] + mockHandles.workspaceWorkflow.activeWorkflow = { + activeState: { models: activeModels }, + pendingWarnings: null + } + mockHandles.missingModelStore.missingModelCandidates = [ + { + nodeId: '1', + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'candidate.safetensors', + url: 'https://example.com/candidate.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } + ] + + await refreshMissingModelPipeline({ + graph: createGraph(), + reloadNodeDefs: vi.fn(), + missingModelStore: mockHandles.missingModelStore, + silent: false + }) + + expect(mockHandles.enrichWithEmbeddedMetadata).toHaveBeenCalledWith( + expect.any(Array), + expect.objectContaining({ models: activeModels }), + expect.any(Function), + undefined + ) + expect( + mockHandles.executionErrorStore.surfaceMissingModels + ).toHaveBeenCalledWith([], { silent: false }) + }) + + it('falls back to current missing model metadata when workflow state has no models', async () => { + mockHandles.missingModelStore.missingModelCandidates = [ + { + nodeId: '1', + nodeType: 'CheckpointLoaderSimple', + widgetName: 
'ckpt_name', + name: 'candidate.safetensors', + url: 'https://example.com/candidate.safetensors', + directory: 'checkpoints', + hash: 'abc123', + hashType: 'sha256', + isMissing: true, + isAssetSupported: true + }, + { + nodeId: '2', + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'missing-url.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } + ] + + await refreshMissingModelPipeline({ + graph: createGraph(), + reloadNodeDefs: vi.fn(), + missingModelStore: mockHandles.missingModelStore + }) + + expect(mockHandles.enrichWithEmbeddedMetadata).toHaveBeenCalledWith( + expect.any(Array), + expect.objectContaining({ + models: [ + { + name: 'candidate.safetensors', + url: 'https://example.com/candidate.safetensors', + directory: 'checkpoints', + hash: 'abc123', + hash_type: 'sha256' + } + ] + }), + expect.any(Function), + undefined + ) + expect( + mockHandles.executionErrorStore.surfaceMissingModels + ).toHaveBeenCalledWith([], { silent: true }) + }) + + it('does not add model metadata when no active workflow or current candidate metadata exists', async () => { + const graphData = createWorkflowGraphData() + + await refreshMissingModelPipeline({ + graph: createGraph(graphData), + reloadNodeDefs: vi.fn(), + missingModelStore: mockHandles.missingModelStore + }) + + expect(mockHandles.enrichWithEmbeddedMetadata).toHaveBeenCalledWith( + expect.any(Array), + graphData, + expect.any(Function), + undefined + ) + }) + + it('rejects when injected node definition reload fails', async () => { + const error = new Error('object_info failed') + + await expect( + refreshMissingModelPipeline({ + graph: createGraph(), + reloadNodeDefs: vi.fn().mockRejectedValue(error), + missingModelStore: mockHandles.missingModelStore + }) + ).rejects.toThrow(error) + + expect(mockHandles.scanAllModelCandidates).not.toHaveBeenCalled() + }) + }) + + describe('runMissingModelPipeline', () => { + it('returns confirmed missing models and caches 
pending warning candidates', async () => { + const confirmedCandidate = { + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'missing.safetensors', + url: 'https://example.com/missing.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + const installedCandidate = { + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'installed.safetensors', + directory: 'checkpoints', + isMissing: false, + isAssetSupported: true + } satisfies MissingModelCandidate + const activeWorkflow = { + activeState: null, + pendingWarnings: null + } + mockHandles.state.enrichedCandidates = [ + confirmedCandidate, + installedCandidate + ] + mockHandles.workspaceWorkflow.activeWorkflow = activeWorkflow + + const result = await runMissingModelPipeline({ + graph: createGraph(), + graphData: createWorkflowGraphData(), + missingModelStore: mockHandles.missingModelStore, + missingNodeTypes: ['MissingCustomNode'] + }) + await vi.dynamicImportSettled() + + expect(result).toEqual({ + missingModels: [ + { + name: 'missing.safetensors', + url: 'https://example.com/missing.safetensors', + directory: 'checkpoints', + hash: undefined, + hash_type: undefined + } + ], + confirmedCandidates: [confirmedCandidate] + }) + expect(activeWorkflow.pendingWarnings).toEqual({ + missingNodeTypes: ['MissingCustomNode'], + missingModelCandidates: [confirmedCandidate], + missingMediaCandidates: undefined + }) + }) + + it('does not expose downloadable model metadata without a directory', async () => { + const confirmedCandidate = { + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'missing.safetensors', + url: 'https://example.com/missing.safetensors', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + mockHandles.state.enrichedCandidates = [confirmedCandidate] + + const result = await runMissingModelPipeline({ + graph: createGraph(), + graphData: 
createWorkflowGraphData(), + missingModelStore: mockHandles.missingModelStore + }) + + expect(result).toEqual({ + missingModels: [], + confirmedCandidates: [confirmedCandidate] + }) + }) + + it('fetches file sizes only for candidates with complete download metadata', async () => { + const downloadableCandidate = { + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'downloadable.safetensors', + url: 'https://example.com/downloadable.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + const urlOnlyCandidate = { + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'url-only.safetensors', + url: 'https://example.com/url-only.safetensors', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + mockHandles.state.enrichedCandidates = [ + downloadableCandidate, + urlOnlyCandidate + ] + mockHandles.fetchModelMetadata.mockResolvedValue({ fileSize: 1024 }) + + await runMissingModelPipeline({ + graph: createGraph(), + graphData: createWorkflowGraphData(), + missingModelStore: mockHandles.missingModelStore + }) + await vi.dynamicImportSettled() + + expect(mockHandles.fetchModelMetadata).toHaveBeenCalledOnce() + expect(mockHandles.fetchModelMetadata).toHaveBeenCalledWith( + 'https://example.com/downloadable.safetensors' + ) + expect(mockHandles.missingModelStore.setFileSize).toHaveBeenCalledWith( + 'https://example.com/downloadable.safetensors', + 1024 + ) + }) + + it('clears surfaced and cached missing models when no candidates are confirmed missing', async () => { + const installedCandidate = { + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'installed.safetensors', + directory: 'checkpoints', + isMissing: false, + isAssetSupported: true + } satisfies MissingModelCandidate + const activeWorkflow = { + activeState: null, + pendingWarnings: { + missingModelCandidates: [ + { + nodeType: 'CheckpointLoaderSimple', + 
widgetName: 'ckpt_name', + name: 'stale.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } + ], + missingNodeTypes: undefined, + missingMediaCandidates: undefined + } + } + mockHandles.state.enrichedCandidates = [installedCandidate] + mockHandles.workspaceWorkflow.activeWorkflow = activeWorkflow + + await runMissingModelPipeline({ + graph: createGraph(), + graphData: createWorkflowGraphData(), + missingModelStore: mockHandles.missingModelStore + }) + + expect( + mockHandles.executionErrorStore.surfaceMissingModels + ).toHaveBeenCalledWith([], { silent: false }) + expect(activeWorkflow.pendingWarnings).toBeNull() + }) + + it('drops candidates whose ancestor path is inactive', async () => { + const activeCandidate = { + nodeId: '1', + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'active.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + const inactiveCandidate = { + nodeId: '2', + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'inactive.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + const activeWorkflow = { + activeState: null, + pendingWarnings: null + } + const graph = createGraph() + mockHandles.state.enrichedCandidates = [ + activeCandidate, + inactiveCandidate + ] + mockHandles.workspaceWorkflow.activeWorkflow = activeWorkflow + mockHandles.isAncestorPathActive.mockImplementation( + (_graph: LGraph, nodeId: string) => nodeId !== '2' + ) + + const result = await runMissingModelPipeline({ + graph, + graphData: createWorkflowGraphData(), + missingModelStore: mockHandles.missingModelStore + }) + + expect(result.confirmedCandidates).toEqual([activeCandidate]) + expect(activeWorkflow.pendingWarnings).toEqual({ + missingNodeTypes: undefined, + missingModelCandidates: [activeCandidate], + missingMediaCandidates: undefined + }) + }) + + 
it('skips post-fetch surface when folder path refresh is aborted', async () => { + const controller = new AbortController() + const confirmedCandidate = { + nodeId: '1', + nodeType: 'CheckpointLoaderSimple', + widgetName: 'ckpt_name', + name: 'missing.safetensors', + directory: 'checkpoints', + isMissing: true, + isAssetSupported: true + } satisfies MissingModelCandidate + let resolveFolderPaths!: (paths: Record) => void + const folderPathsPromise = new Promise>( + (resolve) => { + resolveFolderPaths = resolve + } + ) + mockHandles.state.enrichedCandidates = [confirmedCandidate] + mockHandles.missingModelStore.createVerificationAbortController.mockReturnValueOnce( + controller + ) + mockHandles.api.getFolderPaths.mockReturnValueOnce(folderPathsPromise) + + await runMissingModelPipeline({ + graph: createGraph(), + graphData: createWorkflowGraphData(), + missingModelStore: mockHandles.missingModelStore + }) + + controller.abort() + resolveFolderPaths({ checkpoints: ['/models/checkpoints'] }) + await folderPathsPromise + // Settle both .then() and .finally() microtasks on getFolderPaths(). 
+ await Promise.resolve() + await Promise.resolve() + + expect( + mockHandles.missingModelStore.setFolderPaths + ).not.toHaveBeenCalled() + expect( + mockHandles.executionErrorStore.surfaceMissingModels + ).not.toHaveBeenCalled() + }) + }) +}) diff --git a/src/platform/missingModel/missingModelPipeline.ts b/src/platform/missingModel/missingModelPipeline.ts new file mode 100644 index 0000000000..8ccb71018e --- /dev/null +++ b/src/platform/missingModel/missingModelPipeline.ts @@ -0,0 +1,260 @@ +import { st } from '@/i18n' +import type { LGraph } from '@/lib/litegraph/src/litegraph' +import { assetService } from '@/platform/assets/services/assetService' +import { isCloud } from '@/platform/distribution/types' +import { + enrichWithEmbeddedMetadata, + scanAllModelCandidates, + verifyAssetSupportedCandidates +} from '@/platform/missingModel/missingModelScan' +import type { MissingModelWorkflowData } from '@/platform/missingModel/missingModelScan' +import type { MissingModelCandidate } from '@/platform/missingModel/types' +import { useToastStore } from '@/platform/updates/common/toastStore' +import { updatePendingWarnings } from '@/platform/workflow/core/utils/pendingWarnings' +import type { ComfyWorkflow } from '@/platform/workflow/management/stores/comfyWorkflow' +import type { ModelFile } from '@/platform/workflow/validation/schemas/workflowSchema' +import { api } from '@/scripts/api' +import { useExecutionErrorStore } from '@/stores/executionErrorStore' +import { useModelStore } from '@/stores/modelStore' +import { useModelToNodeStore } from '@/stores/modelToNodeStore' +import { useWorkspaceStore } from '@/stores/workspaceStore' +import type { MissingNodeType } from '@/types/comfy' +import { + isAncestorPathActive, + isMissingCandidateActive +} from '@/utils/graphTraversalUtil' + +export interface MissingModelPipelineResult { + missingModels: ModelFile[] + confirmedCandidates: MissingModelCandidate[] +} + +interface MissingModelPipelineStore { + 
missingModelCandidates: MissingModelCandidate[] | null + createVerificationAbortController: () => AbortController + setFolderPaths: (paths: Record) => void + setFileSize: (url: string, size: number) => void +} + +interface RunMissingModelPipelineOptions { + graph: LGraph + graphData: MissingModelWorkflowData + missingModelStore: MissingModelPipelineStore + missingNodeTypes?: MissingNodeType[] + silent?: boolean +} + +interface RefreshMissingModelPipelineOptions { + graph: LGraph + reloadNodeDefs: () => Promise + missingModelStore: MissingModelPipelineStore + silent?: boolean +} + +type MissingModelCandidateWithDownloadMetadata = MissingModelCandidate & { + url: string + directory: string +} + +function cacheModelCandidates( + wf: Pick | null | undefined, + confirmed: MissingModelCandidate[] +) { + if (!wf) return + updatePendingWarnings(wf, { + missingModelCandidates: confirmed + }) +} + +function clearMissingModels( + wf: Pick | null | undefined, + silent: boolean +) { + useExecutionErrorStore().surfaceMissingModels([], { silent }) + cacheModelCandidates(wf, []) +} + +function hasDownloadMetadata( + candidate: MissingModelCandidate +): candidate is MissingModelCandidateWithDownloadMetadata { + return !!candidate.url && !!candidate.directory +} + +function toModelFile(candidate: MissingModelCandidateWithDownloadMetadata) { + return { + name: candidate.name, + url: candidate.url, + directory: candidate.directory, + hash: candidate.hash, + hash_type: candidate.hashType + } +} + +function getCurrentMissingModelMetadata( + missingModelStore: MissingModelPipelineStore +): ModelFile[] { + return ( + missingModelStore.missingModelCandidates + ?.filter(hasDownloadMetadata) + .map(toModelFile) ?? 
[] + ) +} + +export async function runMissingModelPipeline({ + graph, + graphData, + missingModelStore, + missingNodeTypes, + silent = false +}: RunMissingModelPipelineOptions): Promise { + const controller = missingModelStore.createVerificationAbortController() + + const getDirectory = (nodeType: string) => + useModelToNodeStore().getCategoryForNodeType(nodeType) + const isAssetBrowserWidget = isCloud + ? assetService.shouldUseAssetBrowser + : () => false + + const candidates = scanAllModelCandidates( + graph, + isAssetBrowserWidget, + getDirectory + ) + + const modelStore = useModelStore() + await modelStore.loadModelFolders() + const enrichedAll = await enrichWithEmbeddedMetadata( + candidates, + graphData, + async (name, directory) => { + const folder = await modelStore.getLoadedModelFolder(directory) + const models = folder?.models + return !!( + models && Object.values(models).some((m) => m.file_name === name) + ) + }, + isCloud ? isAssetBrowserWidget : undefined + ) + + // Drop candidates whose enclosing subgraph is muted/bypassed. Per-node + // scans only checked each node's own mode; the cascade from an + // inactive container to its interior happens here. + // Asymmetric on purpose: a candidate dropped here is not resurrected if + // the user un-bypasses the container mid-verification. The realtime + // mode-change path (handleNodeModeChange → scanAndAddNodeErrors) is + // responsible for surfacing errors after an un-bypass. + const enrichedCandidates = enrichedAll.filter( + (c) => c.nodeId == null || isAncestorPathActive(graph, String(c.nodeId)) + ) + + const confirmedCandidates = enrichedCandidates.filter( + (c) => c.isMissing === true + ) + const downloadableCandidates = confirmedCandidates.filter(hasDownloadMetadata) + + const missingModels: ModelFile[] = downloadableCandidates.map(toModelFile) + + const activeWf = useWorkspaceStore().workflow.activeWorkflow + updatePendingWarnings(activeWf, { + ...(missingNodeTypes ? 
{ missingNodeTypes } : {}), + missingModelCandidates: confirmedCandidates + }) + + if (enrichedCandidates.length) { + if (isCloud) { + void verifyAssetSupportedCandidates(enrichedCandidates, controller.signal) + .then(() => { + if (controller.signal.aborted) return + // Re-check ancestor: user may have bypassed a container + // while verification was in flight. + const confirmedAfterReverify = enrichedCandidates.filter((c) => + isMissingCandidateActive(graph, c) + ) + useExecutionErrorStore().surfaceMissingModels( + confirmedAfterReverify, + { silent } + ) + cacheModelCandidates(activeWf, confirmedAfterReverify) + }) + .catch((err) => { + if (controller.signal.aborted) return + console.warn( + '[Missing Model Pipeline] Asset verification failed:', + err + ) + useToastStore().add({ + severity: 'warn', + summary: st( + 'toastMessages.missingModelVerificationFailed', + 'Failed to verify missing models. Some models may not be shown in the Errors tab.' + ), + life: 5000 + }) + }) + } else { + if (!confirmedCandidates.length) { + clearMissingModels(activeWf, silent) + return { missingModels, confirmedCandidates } + } + + void api + .getFolderPaths() + .then((paths) => { + if (controller.signal.aborted) return + missingModelStore.setFolderPaths(paths) + }) + .catch((err) => { + console.warn( + '[Missing Model Pipeline] Failed to fetch folder paths:', + err + ) + }) + .finally(() => { + if (controller.signal.aborted) return + useExecutionErrorStore().surfaceMissingModels(confirmedCandidates, { + silent + }) + cacheModelCandidates(activeWf, confirmedCandidates) + }) + + const missingModelDownload = + import('@/platform/missingModel/missingModelDownload') + void Promise.allSettled( + downloadableCandidates.map(async (c) => { + const { fetchModelMetadata } = await missingModelDownload + const metadata = await fetchModelMetadata(c.url) + if (!controller.signal.aborted && metadata.fileSize !== null) { + missingModelStore.setFileSize(c.url, metadata.fileSize) + } + }) + ) + } + 
} else { + clearMissingModels(activeWf, silent) + } + + return { missingModels, confirmedCandidates } +} + +export async function refreshMissingModelPipeline({ + graph, + reloadNodeDefs, + missingModelStore, + silent = true +}: RefreshMissingModelPipelineOptions): Promise<MissingModelPipelineResult> { + await reloadNodeDefs() + const graphData: MissingModelWorkflowData = graph.serialize() + const activeWorkflowState = + useWorkspaceStore().workflow.activeWorkflow?.activeState + const currentModelMetadata = getCurrentMissingModelMetadata(missingModelStore) + const models = activeWorkflowState?.models?.length + ? activeWorkflowState.models + : currentModelMetadata + + return runMissingModelPipeline({ + graph, + graphData: models.length ? { ...graphData, models } : graphData, + missingModelStore, + silent + }) +} diff --git a/src/platform/missingModel/missingModelScan.ts b/src/platform/missingModel/missingModelScan.ts index 54d64f4c83..11302154bc 100644 --- a/src/platform/missingModel/missingModelScan.ts +++ b/src/platform/missingModel/missingModelScan.ts @@ -1,9 +1,6 @@ -import type { - ComfyWorkflowJSON, - ModelFile, - NodeId -} from '@/platform/workflow/validation/schemas/workflowSchema' -import { flattenWorkflowNodes } from '@/platform/workflow/validation/schemas/workflowSchema' +import type { ModelFile } from '@/platform/workflow/validation/schemas/workflowSchema' +import type { FlattenableWorkflowGraph } from '@/platform/workflow/core/utils/workflowFlattening' +import { flattenWorkflowNodes } from '@/platform/workflow/core/utils/workflowFlattening' import type { MissingModelCandidate, MissingModelViewModel, @@ -28,6 +25,10 @@ import { import { LGraphEventMode } from '@/lib/litegraph/src/types/globalEnums' import { resolveComboValues } from '@/utils/litegraphUtil' +export type MissingModelWorkflowData = FlattenableWorkflowGraph & { + models?: ModelFile[] +} + function isComboWidget(widget: IBaseWidget): widget is IComboWidget { return widget.type === 'combo' } @@ -180,7 +181,7 @@ function
scanAssetWidget( if (!isModelFileName(value)) return null return { - nodeId: executionId as NodeId, + nodeId: executionId, nodeType: node.type, widgetName: widget.name, isAssetSupported: true, @@ -206,7 +207,7 @@ function scanComboWidget( const inOptions = options.includes(value) return { - nodeId: executionId as NodeId, + nodeId: executionId, nodeType: node.type, widgetName: widget.name, isAssetSupported: nodeIsAssetSupported, @@ -218,7 +219,7 @@ function scanComboWidget( export async function enrichWithEmbeddedMetadata( candidates: readonly MissingModelCandidate[], - graphData: ComfyWorkflowJSON, + graphData: MissingModelWorkflowData, checkModelInstalled: (name: string, directory: string) => Promise<boolean>, isAssetSupported?: (nodeType: string, widgetName: string) => boolean ): Promise<MissingModelCandidate[]> { @@ -388,7 +389,7 @@ function isAncestorPathActiveInFlattened( function collectEmbeddedModelsWithSource( allNodes: ReturnType<typeof flattenWorkflowNodes>, - graphData: ComfyWorkflowJSON + graphData: MissingModelWorkflowData ): EmbeddedModelWithSource[] { const result: EmbeddedModelWithSource[] = [] @@ -399,9 +400,7 @@ function collectEmbeddedModelsWithSource( ) continue - const selected = getSelectedModelsMetadata( - node as Parameters<typeof getSelectedModelsMetadata>[0] - ) + const selected = getSelectedModelsMetadata(node) + if (!selected?.length) continue + for (const model of selected) { @@ -435,8 +434,7 @@ function findWidgetNameForModel( modelName: string ): string { if (Array.isArray(node.widgets_values) || !node.widgets_values) return '' - const wv = node.widgets_values as Record<string, unknown> - for (const [key, val] of Object.entries(wv)) { + for (const [key, val] of Object.entries(node.widgets_values)) { if (val === modelName) return key } return '' diff --git a/src/platform/workflow/core/utils/workflowFlattening.test.ts b/src/platform/workflow/core/utils/workflowFlattening.test.ts new file mode 100644 index 0000000000..ad432859d2 --- /dev/null +++ b/src/platform/workflow/core/utils/workflowFlattening.test.ts @@ -0,0 +1,197 @@ +import { describe, expect,
it } from 'vitest' + +import type { FlattenableWorkflowNode } from '@/platform/workflow/core/utils/workflowFlattening' +import { + buildSubgraphExecutionPaths, + flattenWorkflowNodes +} from '@/platform/workflow/core/utils/workflowFlattening' + +function node(id: number, type: string): FlattenableWorkflowNode { + return { id, type } +} + +function subgraphDef( + id: string, + nodes: FlattenableWorkflowNode[], + nestedDefs: unknown[] = [] +) { + return { + id, + name: id, + nodes, + definitions: { subgraphs: nestedDefs }, + inputNode: {}, + outputNode: {} + } +} + +describe('buildSubgraphExecutionPaths', () => { + it('returns empty map when there are no subgraph definitions', () => { + expect(buildSubgraphExecutionPaths([node(5, 'SomeNode')], [])).toEqual( + new Map() + ) + }) + + it('returns empty map when no root node matches a subgraph type', () => { + const def = subgraphDef('def-A', []) + expect( + buildSubgraphExecutionPaths([node(5, 'UnrelatedNode')], [def]) + ).toEqual(new Map()) + }) + + it('skips malformed subgraph definitions', () => { + const malformedDef = { + id: 'def-A', + name: 'def-A', + nodes: [null], + inputNode: {}, + outputNode: {} + } + + expect( + buildSubgraphExecutionPaths([node(5, 'def-A')], [malformedDef]) + ).toEqual(new Map()) + }) + + it('maps a single subgraph instance to its execution path', () => { + const def = subgraphDef('def-A', []) + const result = buildSubgraphExecutionPaths([node(5, 'def-A')], [def]) + expect(result.get('def-A')).toEqual(['5']) + }) + + it('collects multiple instances of the same subgraph type', () => { + const def = subgraphDef('def-A', []) + const result = buildSubgraphExecutionPaths( + [node(5, 'def-A'), node(10, 'def-A')], + [def] + ) + expect(result.get('def-A')).toEqual(['5', '10']) + }) + + it('builds nested execution paths for subgraphs within subgraphs', () => { + const innerDef = subgraphDef('def-B', []) + const outerDef = subgraphDef('def-A', [node(70, 'def-B')]) + const result = 
buildSubgraphExecutionPaths( + [node(5, 'def-A')], + [outerDef, innerDef] + ) + expect(result.get('def-A')).toEqual(['5']) + expect(result.get('def-B')).toEqual(['5:70']) + }) + + it('does not recurse infinitely on self-referential subgraph definitions', () => { + const cyclicDef = subgraphDef('def-A', [node(70, 'def-A')]) + const result = buildSubgraphExecutionPaths([node(5, 'def-A')], [cyclicDef]) + expect(result.get('def-A')).toEqual(['5']) + }) + + it('does not recurse infinitely on mutually cyclic subgraph definitions', () => { + const defA = subgraphDef('def-A', [node(70, 'def-B')]) + const defB = subgraphDef('def-B', [node(80, 'def-A')]) + const result = buildSubgraphExecutionPaths([node(5, 'def-A')], [defA, defB]) + expect(result.get('def-A')).toEqual(['5']) + expect(result.get('def-B')).toEqual(['5:70']) + }) +}) + +describe('flattenWorkflowNodes', () => { + it('returns root nodes when no subgraphs exist', () => { + const result = flattenWorkflowNodes({ + nodes: [node(1, 'KSampler'), node(2, 'CLIPLoader')] + }) + + expect(result).toHaveLength(2) + expect(result.map((n) => n.id)).toEqual([1, 2]) + }) + + it('returns empty array when nodes is undefined', () => { + const result = flattenWorkflowNodes({}) + expect(result).toEqual([]) + }) + + it('includes subgraph nodes with prefixed IDs', () => { + const result = flattenWorkflowNodes({ + nodes: [node(5, 'def-A')], + definitions: { + subgraphs: [ + subgraphDef('def-A', [node(10, 'Inner'), node(20, 'Inner2')]) + ] + } + }) + + expect(result).toHaveLength(3) + expect(result.map((n) => n.id)).toEqual([5, '5:10', '5:20']) + }) + + it('skips malformed subgraph definitions', () => { + const result = flattenWorkflowNodes({ + nodes: [node(5, 'def-A')], + definitions: { + subgraphs: [ + { + id: 'def-A', + name: 'def-A', + nodes: [null], + inputNode: {}, + outputNode: {} + } + ] + } + }) + + expect(result.map((n) => n.id)).toEqual([5]) + }) + + it('skips malformed nested subgraph definitions', () => { + const outerDef = 
{ + ...subgraphDef('def-A', [node(10, 'def-B')]), + definitions: { subgraphs: { length: 1 } } + } + const result = flattenWorkflowNodes({ + nodes: [node(5, 'def-A')], + definitions: { + subgraphs: [outerDef] + } + }) + + expect(result.map((n) => n.id)).toEqual([5, '5:10']) + }) + + it('prefixes nested subgraph nodes with full execution path', () => { + const innerDef = subgraphDef('def-B', [node(3, 'Leaf')]) + const outerDef = subgraphDef('def-A', [node(10, 'def-B')], [innerDef]) + const result = flattenWorkflowNodes({ + nodes: [node(5, 'def-A')], + definitions: { + subgraphs: [outerDef] + } + }) + + expect(result.map((n) => n.id)).toEqual([5, '5:10', '5:10:3']) + }) + + it('does not clone phantom nodes from self-referential subgraphs', () => { + const cyclicDef = subgraphDef('def-A', [node(70, 'def-A')]) + const result = flattenWorkflowNodes({ + nodes: [node(5, 'def-A')], + definitions: { + subgraphs: [cyclicDef] + } + }) + + expect(result.map((n) => n.id)).toEqual([5, '5:70']) + }) + + it('does not clone phantom nodes from mutually cyclic subgraphs', () => { + const defA = subgraphDef('def-A', [node(70, 'def-B')]) + const defB = subgraphDef('def-B', [node(80, 'def-A')]) + const result = flattenWorkflowNodes({ + nodes: [node(5, 'def-A')], + definitions: { + subgraphs: [defA, defB] + } + }) + + expect(result.map((n) => n.id)).toEqual([5, '5:70', '5:70:80']) + }) +}) diff --git a/src/platform/workflow/core/utils/workflowFlattening.ts b/src/platform/workflow/core/utils/workflowFlattening.ts new file mode 100644 index 0000000000..4a7c3d31fd --- /dev/null +++ b/src/platform/workflow/core/utils/workflowFlattening.ts @@ -0,0 +1,166 @@ +import type { NodeId } from '@/lib/litegraph/src/litegraph' + +export interface FlattenableWorkflowNode { + id: NodeId + type: string + mode?: number + widgets_values?: readonly unknown[] | Record<string, unknown> + properties?: Record<string, unknown> +} + +export interface FlattenableWorkflowGraph { + nodes?: readonly FlattenableWorkflowNode[] + definitions?: {
subgraphs?: readonly unknown[] + } +} + +interface FlattenableSubgraphDefinition { + id: string + name: string + nodes: FlattenableWorkflowNode[] + definitions?: { + subgraphs?: readonly unknown[] + } + inputNode: unknown + outputNode: unknown +} + +function isFlattenableWorkflowNode( + obj: unknown +): obj is FlattenableWorkflowNode { + if (obj === null || typeof obj !== 'object') return false + + const candidate = obj as Record<string, unknown> + return ( + (typeof candidate.id === 'string' || typeof candidate.id === 'number') && + typeof candidate.type === 'string' + ) +} + +/** + * Type guard to check if an object is a subgraph definition. + * This helps TypeScript understand the type when recursive definitions are unknown. + */ +function isSubgraphDefinition( + obj: unknown +): obj is FlattenableSubgraphDefinition { + if (obj === null || typeof obj !== 'object') return false + + const candidate = obj as Record<string, unknown> + return ( + typeof candidate.id === 'string' && + typeof candidate.name === 'string' && + Array.isArray(candidate.nodes) && + candidate.nodes.every(isFlattenableWorkflowNode) && + 'inputNode' in candidate && + 'outputNode' in candidate + ) +} + +/** + * Builds a map from subgraph definition ID to all execution path prefixes + * where that definition is instantiated in the workflow. + * + * "def-A" -> ["5", "10"] for each container node instantiating that subgraph definition. + */ +export function buildSubgraphExecutionPaths( + rootNodes: readonly FlattenableWorkflowNode[], + allSubgraphDefs: readonly unknown[] +): Map<string, string[]> { + const subgraphDefMap = new Map( + allSubgraphDefs.filter(isSubgraphDefinition).map((s) => [s.id, s]) + ) + const pathMap = new Map<string, string[]>() + const visited = new Set<string>() + + function build( + nodes: readonly FlattenableWorkflowNode[], + parentPrefix: string + ) { + for (const n of nodes ?? []) { + if (typeof n.type !== 'string' || !subgraphDefMap.has(n.type)) continue + if (visited.has(n.type)) continue + + const path = parentPrefix ?
`${parentPrefix}:${n.id}` : String(n.id) + const existing = pathMap.get(n.type) + if (existing) { + existing.push(path) + } else { + pathMap.set(n.type, [path]) + } + + visited.add(n.type) + + const innerDef = subgraphDefMap.get(n.type) + if (innerDef) { + build(innerDef.nodes, path) + } + + visited.delete(n.type) + } + } + + build(rootNodes, '') + return pathMap +} + +/** + * Recursively collect all subgraph definitions from root and nested levels. + */ +export function collectSubgraphDefinitions( + rootDefs: readonly unknown[] +): FlattenableSubgraphDefinition[] { + const result: FlattenableSubgraphDefinition[] = [] + const seen = new Set<string>() + + function collect(defs: readonly unknown[]) { + for (const def of defs) { + if (!isSubgraphDefinition(def)) continue + if (seen.has(def.id)) continue + seen.add(def.id) + result.push(def) + + const nestedSubgraphs = def.definitions?.subgraphs + if (!Array.isArray(nestedSubgraphs) || nestedSubgraphs.length === 0) { + continue + } + collect(nestedSubgraphs) + } + } + + collect(rootDefs) + return result +} + +/** + * Flatten all workflow nodes (root + subgraphs) into a single array. + * Each node's `id` is prefixed with its execution path (e.g. node "3" inside container "11" -> "11:3"). + */ +export function flattenWorkflowNodes( + graphData: FlattenableWorkflowGraph +): Readonly<FlattenableWorkflowNode>[] { + const rootNodes = graphData.nodes ?? [] + const allDefs = collectSubgraphDefinitions( + graphData.definitions?.subgraphs ??
[] + ) + const pathMap = buildSubgraphExecutionPaths(rootNodes, allDefs) + + const allNodes: FlattenableWorkflowNode[] = [...rootNodes] + + const subgraphDefMap = new Map(allDefs.map((s) => [s.id, s])) + for (const [defId, paths] of pathMap.entries()) { + const def = subgraphDefMap.get(defId) + if (!def?.nodes) continue + for (const prefix of paths) { + for (const node of def.nodes) { + allNodes.push({ + ...node, + id: `${prefix}:${node.id}` + }) + } + } + } + + return allNodes +} diff --git a/src/platform/workflow/validation/schemas/workflowSchema.test.ts b/src/platform/workflow/validation/schemas/workflowSchema.test.ts index b3ef7827fe..e6c608de47 100644 --- a/src/platform/workflow/validation/schemas/workflowSchema.test.ts +++ b/src/platform/workflow/validation/schemas/workflowSchema.test.ts @@ -1,16 +1,7 @@ -import { fromPartial } from '@total-typescript/shoehorn' import fs from 'fs' import { describe, expect, it } from 'vitest' -import { - buildSubgraphExecutionPaths, - flattenWorkflowNodes, - validateComfyWorkflow -} from '@/platform/workflow/validation/schemas/workflowSchema' -import type { - ComfyNode, - ComfyWorkflowJSON -} from '@/platform/workflow/validation/schemas/workflowSchema' +import { validateComfyWorkflow } from '@/platform/workflow/validation/schemas/workflowSchema' import { defaultGraph } from '@/scripts/defaultGraph' const WORKFLOW_DIR = 'src/platform/workflow/validation/schemas/__fixtures__' @@ -278,116 +269,3 @@ describe('parseComfyWorkflow', () => { }) }) }) - -function node(id: number, type: string): ComfyNode { - return { id, type } as ComfyNode -} - -function subgraphDef(id: string, nodes: ComfyNode[]) { - return { id, name: id, nodes, inputNode: {}, outputNode: {} } -} - -describe('buildSubgraphExecutionPaths', () => { - it('returns empty map when there are no subgraph definitions', () => { - expect(buildSubgraphExecutionPaths([node(5, 'SomeNode')], [])).toEqual( - new Map() - ) - }) - - it('returns empty map when no root node matches a 
subgraph type', () => { - const def = subgraphDef('def-A', []) - expect( - buildSubgraphExecutionPaths([node(5, 'UnrelatedNode')], [def]) - ).toEqual(new Map()) - }) - - it('maps a single subgraph instance to its execution path', () => { - const def = subgraphDef('def-A', []) - const result = buildSubgraphExecutionPaths([node(5, 'def-A')], [def]) - expect(result.get('def-A')).toEqual(['5']) - }) - - it('collects multiple instances of the same subgraph type', () => { - const def = subgraphDef('def-A', []) - const result = buildSubgraphExecutionPaths( - [node(5, 'def-A'), node(10, 'def-A')], - [def] - ) - expect(result.get('def-A')).toEqual(['5', '10']) - }) - - it('builds nested execution paths for subgraphs within subgraphs', () => { - const innerDef = subgraphDef('def-B', []) - const outerDef = subgraphDef('def-A', [node(70, 'def-B')]) - const result = buildSubgraphExecutionPaths( - [node(5, 'def-A')], - [outerDef, innerDef] - ) - expect(result.get('def-A')).toEqual(['5']) - expect(result.get('def-B')).toEqual(['5:70']) - }) - - it('does not recurse infinitely on self-referential subgraph definitions', () => { - const cyclicDef = subgraphDef('def-A', [node(70, 'def-A')]) - expect(() => - buildSubgraphExecutionPaths([node(5, 'def-A')], [cyclicDef]) - ).not.toThrow() - }) - - it('does not recurse infinitely on mutually cyclic subgraph definitions', () => { - const defA = subgraphDef('def-A', [node(70, 'def-B')]) - const defB = subgraphDef('def-B', [node(80, 'def-A')]) - expect(() => - buildSubgraphExecutionPaths([node(5, 'def-A')], [defA, defB]) - ).not.toThrow() - }) -}) - -describe('flattenWorkflowNodes', () => { - it('returns root nodes when no subgraphs exist', () => { - const result = flattenWorkflowNodes({ - nodes: [node(1, 'KSampler'), node(2, 'CLIPLoader')] - } as ComfyWorkflowJSON) - - expect(result).toHaveLength(2) - expect(result.map((n) => n.id)).toEqual([1, 2]) - }) - - it('returns empty array when nodes is undefined', () => { - const result = 
flattenWorkflowNodes({} as ComfyWorkflowJSON) - expect(result).toEqual([]) - }) - - it('includes subgraph nodes with prefixed IDs', () => { - const result = flattenWorkflowNodes( - fromPartial({ - nodes: [node(5, 'def-A')], - definitions: { - subgraphs: [ - subgraphDef('def-A', [node(10, 'Inner'), node(20, 'Inner2')]) - ] - } - }) - ) - - expect(result).toHaveLength(3) // 1 root + 2 subgraph - expect(result.map((n) => n.id)).toEqual([5, '5:10', '5:20']) - }) - - it('prefixes nested subgraph nodes with full execution path', () => { - const result = flattenWorkflowNodes( - fromPartial({ - nodes: [node(5, 'def-A')], - definitions: { - subgraphs: [ - subgraphDef('def-A', [node(10, 'def-B')]), - subgraphDef('def-B', [node(3, 'Leaf')]) - ] - } - }) - ) - - // root:5, def-A inner: 5:10, def-B inner: 5:10:3 - expect(result.map((n) => n.id)).toEqual([5, '5:10', '5:10:3']) - }) -}) diff --git a/src/platform/workflow/validation/schemas/workflowSchema.ts b/src/platform/workflow/validation/schemas/workflowSchema.ts index 9a6973040d..1f1dc3fefc 100644 --- a/src/platform/workflow/validation/schemas/workflowSchema.ts +++ b/src/platform/workflow/validation/schemas/workflowSchema.ts @@ -504,24 +504,6 @@ export type WorkflowJSON04 = z.infer<typeof zComfyWorkflow> export type ComfyWorkflowJSON = z.infer< typeof zComfyWorkflow | typeof zComfyWorkflow1 > -type SubgraphDefinition = z.infer - -/** - * Type guard to check if an object is a SubgraphDefinition. - * This helps TypeScript understand the type when z.lazy() breaks inference.
- */ -export function isSubgraphDefinition(obj: unknown): obj is SubgraphDefinition { - return ( - obj !== null && - typeof obj === 'object' && - 'id' in obj && - 'name' in obj && - 'nodes' in obj && - Array.isArray((obj as SubgraphDefinition).nodes) && - 'inputNode' in obj && - 'outputNode' in obj - ) -} const zWorkflowVersion = z.object({ version: z.number() }) @@ -574,99 +556,3 @@ const zNodeData = z.object({ const zComfyApiWorkflow = z.record(zNodeId, zNodeData) export type ComfyApiWorkflow = z.infer<typeof zComfyApiWorkflow> - -/** - * Builds a map from subgraph definition ID to all execution path prefixes - * where that definition is instantiated in the workflow. - * - * "def-A" → ["5", "10"] for each container node instantiating that subgraph definition. - */ -export function buildSubgraphExecutionPaths( - rootNodes: ComfyNode[], - allSubgraphDefs: unknown[] -): Map<string, string[]> { - const subgraphDefMap = new Map( - allSubgraphDefs.filter(isSubgraphDefinition).map((s) => [s.id, s]) - ) - const pathMap = new Map<string, string[]>() - const visited = new Set<string>() - - const build = (nodes: ComfyNode[], parentPrefix: string) => { - for (const n of nodes ?? []) { - if (typeof n.type !== 'string' || !subgraphDefMap.has(n.type)) continue - const path = parentPrefix ? `${parentPrefix}:${n.id}` : String(n.id) - const existing = pathMap.get(n.type) - if (existing) { - existing.push(path) - } else { - pathMap.set(n.type, [path]) - } - - if (visited.has(n.type)) continue - visited.add(n.type) - - const innerDef = subgraphDefMap.get(n.type) - if (innerDef) { - build(innerDef.nodes, path) - } - - visited.delete(n.type) - } - } - - build(rootNodes, '') - return pathMap -} - -/** - * Recursively collect all subgraph definitions from root and nested levels.
- */ -function collectAllSubgraphDefs(rootDefs: unknown[]): SubgraphDefinition[] { - const result: SubgraphDefinition[] = [] - const seen = new Set<string>() - - function collect(defs: unknown[]) { - for (const def of defs) { - if (!isSubgraphDefinition(def)) continue - if (seen.has(def.id)) continue - seen.add(def.id) - result.push(def) - if (def.definitions?.subgraphs?.length) { - collect(def.definitions.subgraphs) - } - } - } - - collect(rootDefs) - return result -} - -/** - * Flatten all workflow nodes (root + subgraphs) into a single array. - * Each node's `id` is prefixed with its execution path (e.g. node "3" inside container "11" → "11:3"). - */ -export function flattenWorkflowNodes( - graphData: ComfyWorkflowJSON -): Readonly<ComfyNode>[] { - const rootNodes = graphData.nodes ?? [] - const allDefs = collectAllSubgraphDefs(graphData.definitions?.subgraphs ?? []) - const pathMap = buildSubgraphExecutionPaths(rootNodes, allDefs) - - const allNodes: ComfyNode[] = [...rootNodes] - - const subgraphDefMap = new Map(allDefs.map((s) => [s.id, s])) - for (const [defId, paths] of pathMap.entries()) { - const def = subgraphDefMap.get(defId) - if (!def?.nodes) continue - for (const prefix of paths) { - for (const node of def.nodes) { - allNodes.push({ - ...node, - id: `${prefix}:${node.id}` - }) - } - } - } - - return allNodes -} diff --git a/src/scripts/app.test.ts b/src/scripts/app.test.ts index 8f62552532..138a759376 100644 --- a/src/scripts/app.test.ts +++ b/src/scripts/app.test.ts @@ -1,4 +1,5 @@ -import { createPinia, setActivePinia } from 'pinia' +import { createTestingPinia } from '@pinia/testing' +import { setActivePinia } from 'pinia' import { beforeEach, describe, expect, it, vi } from 'vitest' import type { @@ -6,10 +7,7 @@ import type { LGraphCanvas, LGraphNode } from '@/lib/litegraph/src/litegraph' -import type { - ComfyWorkflowJSON, - ModelFile -} from '@/platform/workflow/validation/schemas/workflowSchema' +import type { ComfyWorkflowJSON } from
'@/platform/workflow/validation/schemas/workflowSchema' import { ComfyApp } from './app' import { createNode } from '@/utils/litegraphUtil' import { @@ -22,14 +20,13 @@ import { } from '@/composables/usePaste' import { getWorkflowDataFromFile } from '@/scripts/metadata/parser' import { useMissingModelStore } from '@/platform/missingModel/missingModelStore' -import type { LoadedComfyWorkflow } from '@/platform/workflow/management/stores/comfyWorkflow' -import type { MissingModelCandidate } from '@/platform/missingModel/types' const { mockToastStore, mockExtensionService, mockNodeOutputStore, - mockWorkspaceWorkflow + mockWorkspaceWorkflow, + mockRefreshMissingModelPipeline } = vi.hoisted(() => ({ mockToastStore: { addAlert: vi.fn(), @@ -44,8 +41,9 @@ const { refreshNodeOutputs: vi.fn() }, mockWorkspaceWorkflow: { - activeWorkflow: null as unknown - } + activeWorkflow: null + }, + mockRefreshMissingModelPipeline: vi.fn() })) vi.mock('@/utils/litegraphUtil', () => ({ @@ -88,6 +86,11 @@ vi.mock('@/stores/workspaceStore', () => ({ })) })) +vi.mock('@/platform/missingModel/missingModelPipeline', () => ({ + refreshMissingModelPipeline: mockRefreshMissingModelPipeline, + runMissingModelPipeline: vi.fn() +})) + function createMockNode(options: { [K in keyof LGraphNode]?: any } = {}) { return { id: 1, @@ -115,16 +118,6 @@ function createTestFile(name: string, type: string): File { return new File([''], name, { type }) } -type ComfyAppMissingModelPipelineTarget = { - runMissingModelPipeline: ( - graphData: ComfyWorkflowJSON, - options?: { silent?: boolean; missingNodeTypes?: string[] } - ) => Promise<{ - missingModels: ModelFile[] - confirmedCandidates: MissingModelCandidate[] - }> -} - function createWorkflowGraphData(): ComfyWorkflowJSON { return { last_node_id: 0, @@ -143,7 +136,7 @@ describe('ComfyApp', () => { let mockCanvas: LGraphCanvas beforeEach(() => { - setActivePinia(createPinia()) + setActivePinia(createTestingPinia({ stubActions: false })) vi.clearAllMocks() app 
= new ComfyApp() mockCanvas = createMockCanvas() as LGraphCanvas @@ -187,104 +180,32 @@ describe('ComfyApp', () => { }) describe('refreshMissingModels', () => { - function mockRefreshMissingModelsApp( - graphData: ComfyWorkflowJSON, - candidates: MissingModelCandidate[] = [] - ) { - mockWorkspaceWorkflow.activeWorkflow = null - Reflect.set(app, 'rootGraphInternal', { + it('delegates to the app-independent missing model refresh pipeline', async () => { + const graph = { nodes: [], - serialize: vi.fn(() => graphData) - }) + serialize: vi.fn(() => createWorkflowGraphData()) + } + const result = { + missingModels: [], + confirmedCandidates: [] + } + Reflect.set(app, 'rootGraphInternal', graph) vi.spyOn(app, 'reloadNodeDefs').mockResolvedValue() - const appWithPrivate = - app as unknown as ComfyAppMissingModelPipelineTarget - const pipelineSpy = vi - .spyOn(appWithPrivate, 'runMissingModelPipeline') - .mockResolvedValue({ - missingModels: [], - confirmedCandidates: [] - }) - useMissingModelStore().missingModelCandidates = candidates - return pipelineSpy - } + mockRefreshMissingModelPipeline.mockResolvedValue(result) - it('reuses active workflow model metadata when refreshing the current graph', async () => { - const graphData = createWorkflowGraphData() - const activeModels = [ - { - name: 'embedded.safetensors', - url: 'https://example.com/embedded.safetensors', - directory: 'checkpoints' - } - ] - const pipelineSpy = mockRefreshMissingModelsApp(graphData, [ - { - nodeId: '1', - nodeType: 'CheckpointLoaderSimple', - widgetName: 'ckpt_name', - name: 'candidate.safetensors', - url: 'https://example.com/candidate.safetensors', - directory: 'checkpoints', - isMissing: true, - isAssetSupported: true - } - ]) - mockWorkspaceWorkflow.activeWorkflow = { - activeState: { models: activeModels } - } as LoadedComfyWorkflow + await expect(app.refreshMissingModels({ silent: false })).resolves.toBe( + result + ) - await app.refreshMissingModels({ silent: false }) + 
expect(mockRefreshMissingModelPipeline).toHaveBeenCalledWith({ + graph, + reloadNodeDefs: expect.any(Function), + missingModelStore: useMissingModelStore(), + silent: false + }) + await mockRefreshMissingModelPipeline.mock.calls[0][0].reloadNodeDefs() expect(app.reloadNodeDefs).toHaveBeenCalled() - expect(pipelineSpy).toHaveBeenCalledWith( - expect.objectContaining({ models: activeModels }), - { silent: false } - ) - }) - - it('falls back to current missing model metadata when workflow state has no models', async () => { - const graphData = createWorkflowGraphData() - const pipelineSpy = mockRefreshMissingModelsApp(graphData, [ - { - nodeId: '1', - nodeType: 'CheckpointLoaderSimple', - widgetName: 'ckpt_name', - name: 'candidate.safetensors', - url: 'https://example.com/candidate.safetensors', - directory: 'checkpoints', - hash: 'abc123', - hashType: 'sha256', - isMissing: true, - isAssetSupported: true - }, - { - nodeId: '2', - nodeType: 'CheckpointLoaderSimple', - widgetName: 'ckpt_name', - name: 'missing-url.safetensors', - directory: 'checkpoints', - isMissing: true, - isAssetSupported: true - } - ]) - - await app.refreshMissingModels() - - expect(pipelineSpy).toHaveBeenCalledWith( - expect.objectContaining({ - models: [ - { - name: 'candidate.safetensors', - url: 'https://example.com/candidate.safetensors', - directory: 'checkpoints', - hash: 'abc123', - hash_type: 'sha256' - } - ] - }), - { silent: true } - ) }) }) diff --git a/src/scripts/app.ts b/src/scripts/app.ts index ba7d740359..5b6c8b214b 100644 --- a/src/scripts/app.ts +++ b/src/scripts/app.ts @@ -34,13 +34,13 @@ import { useWorkflowValidation } from '@/platform/workflow/validation/composable import type { ComfyApiWorkflow, ComfyWorkflowJSON, - ModelFile, NodeId } from '@/platform/workflow/validation/schemas/workflowSchema' import { - isSubgraphDefinition, + collectSubgraphDefinitions, buildSubgraphExecutionPaths -} from '@/platform/workflow/validation/schemas/workflowSchema' +} from 
'@/platform/workflow/core/utils/workflowFlattening' +import type { FlattenableWorkflowNode } from '@/platform/workflow/core/utils/workflowFlattening' import type { ExecutionErrorWsMessage, NodeError, @@ -73,7 +73,6 @@ import { useNodeOutputStore } from '@/stores/nodeOutputStore' import { useJobPreviewStore } from '@/stores/jobPreviewStore' import { KeyComboImpl } from '@/platform/keybindings/keyCombo' import { useKeybindingStore } from '@/platform/keybindings/keybindingStore' -import { useModelStore } from '@/stores/modelStore' import { SYSTEM_NODE_DEFS, useNodeDefStore } from '@/stores/nodeDefStore' import { useNodeReplacementStore } from '@/platform/nodeReplacement/nodeReplacementStore' @@ -87,12 +86,11 @@ import type { NodeExecutionId } from '@/types/nodeIdentification' import { graphToPrompt } from '@/utils/executionUtil' import { getCnrIdFromProperties } from '@/platform/nodeReplacement/cnrIdUtil' import { rescanAndSurfaceMissingNodes } from '@/platform/nodeReplacement/missingNodeScan' -import type { MissingModelCandidate } from '@/platform/missingModel/types' import { - scanAllModelCandidates, - enrichWithEmbeddedMetadata, - verifyAssetSupportedCandidates -} from '@/platform/missingModel/missingModelScan' + refreshMissingModelPipeline, + runMissingModelPipeline +} from '@/platform/missingModel/missingModelPipeline' +import type { MissingModelPipelineResult } from '@/platform/missingModel/missingModelPipeline' import { useMissingModelStore } from '@/platform/missingModel/missingModelStore' import { useMissingMediaStore } from '@/platform/missingMedia/missingMediaStore' import type { MissingMediaCandidate } from '@/platform/missingMedia/types' @@ -100,8 +98,6 @@ import { scanAllMediaCandidates, verifyCloudMediaCandidates } from '@/platform/missingMedia/missingMediaScan' -import { assetService } from '@/platform/assets/services/assetService' -import { useModelToNodeStore } from '@/stores/modelToNodeStore' import { anyItemOverlapsRect } from '@/utils/mathUtil' 
 import {
@@ -154,11 +150,6 @@ import {
   pasteVideoNodes
 } from '@/composables/usePaste'
-interface MissingModelPipelineOptions {
-  missingNodeTypes?: MissingNodeType[]
-  silent?: boolean
-}
-
 export const ANIM_PREVIEW_WIDGET = '$$comfy_animation_preview'

 export function sanitizeNodeName(string: string) {
@@ -1225,7 +1216,7 @@ export class ComfyApp {
     // Collect missing node types from all nodes (root + subgraphs)
     const collectMissingNodes = (
-      nodes: ComfyWorkflowJSON['nodes'],
+      nodes: readonly FlattenableWorkflowNode[],
       pathPrefix: string = '',
       displayName: string = ''
     ) => {
@@ -1270,21 +1261,21 @@ export class ComfyApp {
     }

     collectMissingNodes(graphData.nodes)
-    const subgraphDefs = graphData.definitions?.subgraphs ?? []
+    const subgraphDefs = collectSubgraphDefinitions(
+      graphData.definitions?.subgraphs ?? []
+    )
     const subgraphContainerIdMap = buildSubgraphExecutionPaths(
       graphData.nodes,
       subgraphDefs
     )
     for (const subgraph of subgraphDefs) {
-      if (isSubgraphDefinition(subgraph)) {
-        const paths = subgraphContainerIdMap.get(subgraph.id) ?? []
-        for (const pathPrefix of paths) {
-          collectMissingNodes(
-            subgraph.nodes,
-            pathPrefix,
-            subgraph.name || subgraph.id
-          )
-        }
+      const paths = subgraphContainerIdMap.get(subgraph.id) ?? []
+      for (const pathPrefix of paths) {
+        collectMissingNodes(
+          subgraph.nodes,
+          pathPrefix,
+          subgraph.name || subgraph.id
+        )
       }
     }
@@ -1454,7 +1445,10 @@ export class ComfyApp {
     )

     if (!skipAssetScans) {
-      await this.runMissingModelPipeline(graphData, {
+      await runMissingModelPipeline({
+        graph: this.rootGraph,
+        graphData,
+        missingModelStore: useMissingModelStore(),
         missingNodeTypes: activeMissingNodeTypes,
         silent: silentAssetErrors
       })
@@ -1477,201 +1471,14 @@ export class ComfyApp {
     }
   }

-  private async runMissingModelPipeline(
-    graphData: ComfyWorkflowJSON,
-    { missingNodeTypes, silent = false }: MissingModelPipelineOptions = {}
-  ): Promise<{
-    missingModels: ModelFile[]
-    confirmedCandidates: MissingModelCandidate[]
-  }> {
-    const missingModelStore = useMissingModelStore()
-    const controller = missingModelStore.createVerificationAbortController()
-
-    const getDirectory = (nodeType: string) =>
-      useModelToNodeStore().getCategoryForNodeType(nodeType)
-
-    const candidates = scanAllModelCandidates(
-      this.rootGraph,
-      isCloud
-        ? (nodeType, widgetName) =>
-            assetService.shouldUseAssetBrowser(nodeType, widgetName)
-        : () => false,
-      getDirectory
-    )
-
-    const modelStore = useModelStore()
-    await modelStore.loadModelFolders()
-    const enrichedAll = await enrichWithEmbeddedMetadata(
-      candidates,
-      graphData,
-      async (name, directory) => {
-        const folder = await modelStore.getLoadedModelFolder(directory)
-        const models = folder?.models
-        return !!(
-          models && Object.values(models).some((m) => m.file_name === name)
-        )
-      },
-      isCloud
-        ? (nodeType, widgetName) =>
-            assetService.shouldUseAssetBrowser(nodeType, widgetName)
-        : undefined
-    )
-
-    // Drop candidates whose enclosing subgraph is muted/bypassed. Per-node
-    // scans only checked each node's own mode; the cascade from an
-    // inactive container to its interior happens here.
-    // Asymmetric on purpose: a candidate dropped here is not resurrected if
-    // the user un-bypasses the container mid-verification. The realtime
-    // mode-change path (handleNodeModeChange → scanAndAddNodeErrors) is
-    // responsible for surfacing errors after an un-bypass.
-    const enrichedCandidates = enrichedAll.filter(
-      (c) =>
-        c.nodeId == null ||
-        isAncestorPathActive(this.rootGraph, String(c.nodeId))
-    )
-
-    const missingModels: ModelFile[] = enrichedCandidates
-      .filter((c) => c.isMissing === true && c.url)
-      .map((c) => ({
-        name: c.name,
-        url: c.url ?? '',
-        directory: c.directory ?? '',
-        hash: c.hash,
-        hash_type: c.hashType
-      }))
-
-    const confirmedCandidates = enrichedCandidates.filter(
-      (c) => c.isMissing === true
-    )
-
-    const activeWf = useWorkspaceStore().workflow.activeWorkflow
-    updatePendingWarnings(activeWf, {
-      ...(missingNodeTypes ? { missingNodeTypes } : {}),
-      missingModelCandidates: confirmedCandidates
-    })
-
-    if (enrichedCandidates.length) {
-      if (isCloud) {
-        void verifyAssetSupportedCandidates(
-          enrichedCandidates,
-          controller.signal
-        )
-          .then(() => {
-            if (controller.signal.aborted) return
-            // Re-check ancestor: user may have bypassed a container
-            // while verification was in flight.
-            const confirmed = enrichedCandidates.filter((c) =>
-              isMissingCandidateActive(this.rootGraph, c)
-            )
-            useExecutionErrorStore().surfaceMissingModels(confirmed, { silent })
-            this.cacheModelCandidates(activeWf, confirmed)
-          })
-          .catch((err) => {
-            console.warn(
-              '[Missing Model Pipeline] Asset verification failed:',
-              err
-            )
-            useToastStore().add({
-              severity: 'warn',
-              summary: st(
-                'toastMessages.missingModelVerificationFailed',
-                'Failed to verify missing models. Some models may not be shown in the Errors tab.'
-              ),
-              life: 5000
-            })
-          })
-      } else {
-        const confirmed = enrichedCandidates.filter((c) => c.isMissing === true)
-        if (!confirmed.length) {
-          useExecutionErrorStore().surfaceMissingModels([], { silent })
-          this.cacheModelCandidates(activeWf, [])
-        } else {
-          void api
-            .getFolderPaths()
-            .then((paths) => {
-              if (controller.signal.aborted) return
-              missingModelStore.setFolderPaths(paths)
-            })
-            .catch((err) => {
-              console.warn(
-                '[Missing Model Pipeline] Failed to fetch folder paths:',
-                err
-              )
-            })
-            .finally(() => {
-              if (controller.signal.aborted) return
-              useExecutionErrorStore().surfaceMissingModels(confirmed, {
-                silent
-              })
-              this.cacheModelCandidates(activeWf, confirmed)
-            })
-
-          void Promise.allSettled(
-            confirmed
-              .filter((c) => c.url)
-              .map(async (c) => {
-                const { fetchModelMetadata } =
-                  await import('@/platform/missingModel/missingModelDownload')
-                const metadata = await fetchModelMetadata(c.url!)
-                if (!controller.signal.aborted && metadata.fileSize !== null) {
-                  missingModelStore.setFileSize(c.url!, metadata.fileSize)
-                }
-              })
-          )
-        }
-      }
-    } else {
-      useExecutionErrorStore().surfaceMissingModels([], { silent })
-      this.cacheModelCandidates(activeWf, [])
-    }
-
-    return { missingModels, confirmedCandidates }
-  }
-
-  async refreshMissingModels(options: { silent?: boolean } = {}): Promise<{
-    missingModels: ModelFile[]
-    confirmedCandidates: MissingModelCandidate[]
-  }> {
-    await this.reloadNodeDefs()
-    const graphData = this.rootGraph.serialize() as unknown as ComfyWorkflowJSON
-    const activeWorkflowState =
-      useWorkspaceStore().workflow.activeWorkflow?.activeState
-    const currentModelMetadata =
-      useMissingModelStore()
-        .missingModelCandidates?.filter(
-          (
-            candidate
-          ): candidate is MissingModelCandidate & {
-            url: string
-            directory: string
-          } => !!candidate.url && !!candidate.directory
-        )
-        .map((candidate) => ({
-          name: candidate.name,
-          url: candidate.url,
-          directory: candidate.directory,
-          hash: candidate.hash,
-          hash_type: candidate.hashType
-        })) ?? []
-    const models = activeWorkflowState?.models?.length
-      ? activeWorkflowState.models
-      : currentModelMetadata
-
-    return this.runMissingModelPipeline(
-      models.length ? { ...graphData, models } : graphData,
-      {
-        silent: options.silent ?? true
-      }
-    )
-  }
-
-  private cacheModelCandidates(
-    wf: ComfyWorkflow | null,
-    confirmed: MissingModelCandidate[]
-  ) {
-    if (!wf) return
-    updatePendingWarnings(wf, {
-      missingModelCandidates: confirmed
+  async refreshMissingModels(
+    options: { silent?: boolean } = {}
+  ): Promise<MissingModelPipelineResult> {
+    return refreshMissingModelPipeline({
+      graph: this.rootGraph,
+      reloadNodeDefs: () => this.reloadNodeDefs(),
+      missingModelStore: useMissingModelStore(),
+      silent: options.silent ?? true
     })
   }
diff --git a/src/workbench/utils/modelMetadataUtil.ts b/src/workbench/utils/modelMetadataUtil.ts
index e5137d6f1c..e243de6680 100644
--- a/src/workbench/utils/modelMetadataUtil.ts
+++ b/src/workbench/utils/modelMetadataUtil.ts
@@ -21,7 +21,7 @@ import type { ModelFile } from '@/platform/workflow/validation/schemas/workflowS
  */
 export function getSelectedModelsMetadata(node: {
   type: string
-  widgets_values?: unknown[] | Record<string, unknown>
+  widgets_values?: readonly unknown[] | Record<string, unknown>
   properties?: { models?: ModelFile[] }
 }): ModelFile[] | undefined {
   try {