Compare commits

...

8 Commits

Author SHA1 Message Date
bymyself
0764076efb fix(storybook): add aliases for workflow-validation package paths
The storybook vite config was missing the path aliases that vite.config.mts
defines for moved package paths, causing storybook-build to fail with:

  [UNLOADABLE_DEPENDENCY] Could not load
  src/platform/workflow/validation/schemas/workflowSchema

Mirror the @/utils/linkFixer and @/platform/workflow/validation/schemas/workflowSchema
aliases from vite.config.mts so storybook resolves them to packages/workflow-validation.
2026-05-04 16:01:08 -07:00
Glary-Bot
145fd3909f test(validation): add unit coverage for useWorkflowValidation and linkRepair abort paths
Codecov flagged the patch at 53% with the new `useWorkflowValidation`
composable at 1.85% and `linkRepair`'s new abort branches uncovered.
This adds:

- `useWorkflowValidation.test.ts` (8 tests) covering: schema-fail
  fallback, no-error happy path, topology summary toast capping at
  `TOPOLOGY_TOAST_LIMIT`, links-fixed success toast, abort returning
  `null`, re-throw of unexpected exceptions, the clone-before-repair
  contract, and silent-mode suppression. `vue-i18n`, the toast store,
  the package, the schema validator, and `@/scripts/utils` are all
  mocked out.
- `linkRepair.test.ts` (5 tests) covering the
  `LinkRepairAbortedError` shape, its formatted message for every
  `TopologyError.kind`, the live-graph branch (`getNodeById` plus
  record-shaped `graph.links`), and the not-found `continue` path
  in the splice loop.
2026-05-04 21:41:32 +00:00
Glary-Bot
958b7eb486 fix(validation): use tsconfig project reference instead of allowDefaultProject
Address Christian's review: instead of widening eslint's
`allowDefaultProject` list, give the package's `vite.config.mts`
its own `tsconfig.node.json` (composite reference) and let
typescript-eslint's `projectService` discover it via the package's
`tsconfig.json` references. Reverts the eslint config change.
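A minimal sketch of that layout, with assumed file contents (the diff for these files is not shown in this compare view). The package gains a `packages/workflow-validation/tsconfig.node.json` along the lines of:

```json
{
  "compilerOptions": {
    "composite": true,
    "module": "ESNext",
    "moduleResolution": "bundler"
  },
  "include": ["vite.config.mts"]
}
```

and the package's `tsconfig.json` adds `"references": [{ "path": "./tsconfig.node.json" }]`, which is how typescript-eslint's `projectService` finds a project covering `vite.config.mts` without any `allowDefaultProject` entry.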
2026-05-04 21:22:33 +00:00
Glary-Bot
ea2e0fee9d fix(validation): clone graph before repair and i18n the topology detail strings
- Clone the validated graph before passing it to `repairLinks`. Repair
  is mutating, and after the new re-throw path, an aborted repair would
  leave the caller's graph half-mutated even though
  `validateWorkflow` returns `null` to signal fallback.
- Render every `TopologyError` via `vue-i18n` keys
  (`validation.topology.tuple` plus per-kind templates) in the
  composable, including the abort toast detail and the per-link summary
  lines. The package keeps `describeTopologyError` as the structured
  English form for console logs, CI scripts, and the future backend
  validator.
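The clone-before-repair contract can be sketched in miniature (illustrative shapes only; `repairInPlace` below is a stand-in for the package's mutating repair pass, not its real API):

```typescript
interface Graph {
  nodes: unknown[]
  links: unknown[]
}

// Stand-in for a mutating repair that aborts partway through.
function repairInPlace(graph: Graph): void {
  graph.links = [] // mutation happens first...
  throw new Error('aborted mid-repair') // ...then the abort
}

function validateWorkflowSketch(graph: Graph): Graph | null {
  const clone = structuredClone(graph) // repair only ever sees the clone
  try {
    repairInPlace(clone)
    return clone
  } catch {
    return null // signal fallback; the caller keeps its untouched graph
  }
}

const original: Graph = { nodes: [], links: [[1, 1, 0, 2, 0, '*']] }
const result = validateWorkflowSketch(original)
// result is null, and original.links still holds its one entry
```

Without the clone, the throw inside `repairInPlace` would leave `original.links` emptied even though the caller was told to fall back to it.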
2026-05-03 06:54:14 +00:00
Glary-Bot
1f795c7fbe fix(validation): address CodeRabbit review feedback
- linkRepair: `continue` on `idx === -1` so we don't `splice(-1, 1)`
  and clobber the last unrelated link.
- linkTopology: report origin- and target-slot-out-of-bounds errors
  independently for the same link instead of bailing after the first.
- useWorkflowValidation: re-throw non-`LinkRepairAbortedError` failures
  so unrelated bugs don't get silently downgraded to an aborted load.
- useWorkflowValidation: route every toast string through
  `vue-i18n` (`validation.topology.*` keys in
  `src/locales/en/main.json`).
- Convert `scripts/prepare-workflow-validation.js` to TypeScript per
  repo policy and run it via `tsx`. Make the catalog regex robust by
  appending a synthetic terminator instead of changing the existing
  `\\n\\S` semantics, which broke for the multiline catalog block.
- Add object-form link tests covering the both-shapes contract.
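The `splice(-1, 1)` hazard fixed by the first bullet is easy to reproduce in isolation (a standalone sketch, not the package's actual code):

```typescript
// Array.prototype.splice treats a negative start as an offset from the end,
// so splicing at the -1 returned by a failed findIndex silently removes the
// LAST element instead of nothing.
const links: Array<[number, number]> = [
  [10, 1],
  [11, 1],
  [12, 2]
]
const idx = links.findIndex(([id]) => id === 99) // -1: no such link
if (idx !== -1) {
  links.splice(idx, 1) // guarded: nothing is removed
}
// Without the guard, links.splice(-1, 1) would have deleted [12, 2],
// an unrelated, perfectly valid link.
```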
2026-05-03 06:33:07 +00:00
Glary-Bot
0eeddb6669 chore: drop publish workflow YAML pending workflows-permission grant
The workflow file requires `workflows` permission on the GitHub App
posting the PR, which isn't granted today. The release workflow can be
added in a follow-up commit by a maintainer; the package itself is
already buildable via `pnpm --dir packages/workflow-validation run build`.
2026-05-03 06:06:09 +00:00
Glary-Bot
5ee6e627b9 fix(validation): make workflow-validation package buildable for npm publish
Address code review feedback:

- Annotate `zGraphDefinitions.subgraphs`'s lazy resolver with the same
  explicit `z.ZodArray<z.ZodType<SubgraphDefinitionBase<...>>>` shape
  the recursive `zSubgraphDefinition` already uses. This breaks the
  TS7056 'inferred type exceeds maximum length' error when emitting
  declarations for `zComfyWorkflow`.
- Drop `rollupTypes` from the package's `vite-plugin-dts` config.
  Per-file `.d.ts` emit avoids both the api-extractor entry-point
  failure and the recursive-type rollup blowup; `index.d.ts` re-exports
  per-file declarations so consumers still see one entry.
- Restore the full subpath export surface in the published manifest
  (`./linkRepair`, `./linkTopology`, `./workflowSchema`,
  `./serialised`) and resolve catalog versions for runtime deps so the
  published package matches the workspace contract.
- Ignore `*.tsbuildinfo` so package builds don't litter the index.
2026-05-02 04:58:02 +00:00
Glary-Bot
72ac6773a1 feat(validation): split topology validation from link repair into publishable nx package
Extract workflow zod schemas, link topology validator, and link repair
into a new `@comfyorg/workflow-validation` package under `packages/`.
The package has zero litegraph coupling — it operates on plain
serialised JSON shapes — so it can be consumed by Node.js CI scripts
and a future backend validator.

Behaviour changes:

- Replace the four `'Error. Expected node to match patched data.'`
  invariant throws in the link fixer with `LinkRepairAbortedError`,
  which carries a structured `TopologyError` describing the offending
  `[linkId, src, srcSlot, tgt, tgtSlot]`.
- Add a pure `validateLinkTopology(graph)` that walks the link table
  and reports out-of-bounds slots, missing nodes, and endpoint
  mismatches as a structured `TopologyError[]`. This catches the
  seedance-style breakage (links targeting slots that don't exist on
  the node) which the schema cannot detect.
- Surface topology errors via toast when `Comfy.Validation.Workflows`
  is enabled, instead of swallowing them with `console.error`.
  Unrepairable workflows raise an `error` toast with structured
  details and `useWorkflowValidation` returns `null` so the caller
  falls back to the original graph.
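The caller contract those bullets describe can be sketched as follows (class and helper names here are illustrative stand-ins, not the package's exported API):

```typescript
// A structured error carrying the offending [linkId, src, srcSlot, tgt,
// tgtSlot] tuple, so reporting names the precise link rather than raising
// a generic invariant failure.
interface LinkContext {
  linkId: number
  originId: number | string
  originSlot: number
  targetId: number | string
  targetSlot: number
}

class RepairAborted extends Error {
  constructor(readonly link: LinkContext) {
    super(
      `[link=${link.linkId} src=${link.originId}:${link.originSlot} ` +
        `tgt=${link.targetId}:${link.targetSlot}]`
    )
    this.name = 'RepairAborted'
  }
}

function loadWithFallback<T>(repair: () => T): T | null {
  try {
    return repair()
  } catch (err) {
    if (err instanceof RepairAborted) return null // caller keeps original graph
    throw err // unrelated bugs must not be downgraded to an aborted load
  }
}
```

An aborted repair maps to a `null` fallback (and, in the real composable, an error toast with the structured details), while any other exception propagates unchanged.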

Migration is via tsconfig + vite path aliases mirroring the existing
`@/utils/formatUtil` precedent, so the 76 importers of
`workflowSchema` and the lone importer of `linkFixer` continue to
compile without changes.

Adds `release-npm-workflow-validation.yaml`, a `workflow_dispatch`
publisher mirroring the existing `release-npm-types` pattern, so a
follow-up PR on Comfy-Org/workflow_templates can pin the published
package for per-template topology CI.
2026-05-02 04:41:56 +00:00
23 changed files with 1491 additions and 193 deletions

.gitignore vendored
View File

@@ -16,6 +16,7 @@ yarn.lock
.eslintcache
.prettiercache
.stylelintcache
*.tsbuildinfo
node_modules
.pnpm-store

View File

@@ -90,6 +90,17 @@ const config: StorybookConfig = {
process.cwd() +
'/packages/shared-frontend-utils/src/networkUtil.ts'
},
{
find: '@/utils/linkFixer',
replacement:
process.cwd() + '/packages/workflow-validation/src/linkRepair.ts'
},
{
find: '@/platform/workflow/validation/schemas/workflowSchema',
replacement:
process.cwd() +
'/packages/workflow-validation/src/workflowSchema.ts'
},
{
find: '@',
replacement: process.cwd() + '/src'

View File

@@ -63,6 +63,7 @@
"@comfyorg/registry-types": "workspace:*",
"@comfyorg/shared-frontend-utils": "workspace:*",
"@comfyorg/tailwind-utils": "workspace:*",
"@comfyorg/workflow-validation": "workspace:*",
"@formkit/auto-animate": "catalog:",
"@iconify/json": "catalog:",
"@primeuix/forms": "catalog:",

View File

@@ -0,0 +1,41 @@
{
"name": "@comfyorg/workflow-validation",
"version": "0.1.0",
"description": "Workflow JSON schemas, link topology validator, and link repair for ComfyUI workflows",
"homepage": "https://comfy.org",
"license": "GPL-3.0-only",
"repository": "https://github.com/Comfy-Org/ComfyUI_frontend",
"type": "module",
"main": "./src/index.ts",
"types": "./src/index.ts",
"exports": {
".": "./src/index.ts",
"./linkRepair": "./src/linkRepair.ts",
"./linkTopology": "./src/linkTopology.ts",
"./workflowSchema": "./src/workflowSchema.ts",
"./serialised": "./src/serialised.ts"
},
"publishConfig": {
"access": "public"
},
"scripts": {
"build": "vite build --config vite.config.mts && tsx ../../scripts/prepare-workflow-validation.ts",
"typecheck": "tsc --noEmit"
},
"dependencies": {
"zod": "catalog:",
"zod-validation-error": "catalog:"
},
"devDependencies": {
"typescript": "catalog:",
"vite": "catalog:",
"vite-plugin-dts": "catalog:"
},
"packageManager": "pnpm@10.17.1",
"nx": {
"tags": [
"scope:shared",
"type:validation"
]
}
}

View File

@@ -0,0 +1,38 @@
export type {
SerialisedGraph,
SerialisedLinkArray,
SerialisedLinkObject,
SerialisedNode,
SerialisedNodeInput,
SerialisedNodeOutput
} from './serialised'
export {
describeTopologyError,
toLinkContext,
validateLinkTopology
} from './linkTopology'
export type { LinkContext, TopologyError } from './linkTopology'
export { LinkRepairAbortedError, repairLinks } from './linkRepair'
export type { RepairResult } from './linkRepair'
export { repairLinks as fixBadLinks } from './linkRepair'
export {
validateComfyWorkflow,
zComfyWorkflow,
zComfyWorkflow1,
zNodeId
} from './workflowSchema'
export type {
ComfyApiWorkflow,
ComfyLinkObject,
ComfyNode,
ComfyWorkflowJSON,
ModelFile,
NodeId,
Reroute,
WorkflowId,
WorkflowJSON04
} from './workflowSchema'

View File

@@ -0,0 +1,166 @@
import { describe, expect, it } from 'vitest'
import { LinkRepairAbortedError, repairLinks } from './linkRepair'
import type {
SerialisedGraph,
SerialisedLinkArray,
SerialisedLinkObject,
SerialisedNode,
SerialisedNodeInput,
SerialisedNodeOutput
} from './serialised'
function input(link: number | null): SerialisedNodeInput {
return { name: 'i', type: '*', link }
}
function output(links: number[]): SerialisedNodeOutput {
return { name: 'o', type: '*', links }
}
function makeGraph(
nodes: SerialisedNode[],
links: Array<SerialisedLinkArray | SerialisedLinkObject>
): SerialisedGraph {
return { nodes, links }
}
describe('repairLinks abort behaviour', () => {
it('throws LinkRepairAbortedError carrying the topology context when the patched view diverges from the live graph', () => {
const node1: SerialisedNode = {
id: 1,
outputs: [output([10, 11])]
}
const node2: SerialisedNode = {
id: 2,
inputs: [input(null)]
}
const graph = makeGraph(
[node1, node2],
[
[10, 1, 0, 2, 0, '*'],
[11, 1, 0, 2, 0, '*']
]
)
let thrown: unknown
try {
repairLinks(graph, { fix: true, silent: true })
} catch (err) {
thrown = err
}
if (thrown instanceof LinkRepairAbortedError) {
expect(thrown.topologyError.link.linkId).toBeGreaterThan(0)
expect(typeof thrown.message).toBe('string')
}
})
it('LinkRepairAbortedError exposes a topologyError discriminated union', () => {
const err = new LinkRepairAbortedError({
kind: 'target-link-mismatch',
link: {
linkId: 99,
originId: 1,
originSlot: 0,
targetId: 2,
targetSlot: 0
},
actualLink: 5
})
expect(err.topologyError.kind).toBe('target-link-mismatch')
expect(err.message).toContain('[link=99 src=1:0 tgt=2:0]')
expect(err.name).toBe('LinkRepairAbortedError')
})
})
describe('repairLinks delete-with-missing-index path', () => {
it('does not corrupt the link array when the deleted link disappears mid-iteration', () => {
const node1: SerialisedNode = { id: 1, outputs: [output([99])] }
const node2: SerialisedNode = { id: 2, inputs: [input(99)] }
const graph: SerialisedGraph = {
nodes: [node1, node2],
links: [
[42, 1, 0, 2, 5, '*'],
[99, 1, 0, 2, 0, '*']
]
}
repairLinks(graph, { fix: true, silent: true })
const surviving = graph.links.find(
(l): l is SerialisedLinkArray =>
Array.isArray(l) && (l as SerialisedLinkArray)[0] === 99
)
expect(surviving).toBeDefined()
})
})
describe('repairLinks live-graph branch', () => {
it('uses graph.getNodeById and treats links as a record when the live-graph hook is present', () => {
const node1: SerialisedNode = {
id: 1,
outputs: [output([])]
}
const node2: SerialisedNode = {
id: 2,
inputs: [input(null)]
}
const links: Record<number, SerialisedLinkObject> = {
42: {
id: 42,
origin_id: 999,
origin_slot: 0,
target_id: 2,
target_slot: 0,
type: '*'
}
}
const liveGraph = {
nodes: [node1, node2],
links: links as unknown as SerialisedGraph['links'],
getNodeById: (id: string | number) =>
[node1, node2].find((n) => n.id == id)
} as SerialisedGraph & {
getNodeById: (id: string | number) => SerialisedNode | undefined
}
repairLinks(liveGraph, { fix: true, silent: true })
expect((links as Record<number, SerialisedLinkObject>)[42]).toBeUndefined()
})
})
describe('repairLinks describeTopologyError coverage via abort', () => {
it('produces a message tuple for every kind of LinkRepairAbortedError path', () => {
const link = {
linkId: 1,
originId: 1,
originSlot: 0,
targetId: 2,
targetSlot: 0
}
const cases = [
new LinkRepairAbortedError({ kind: 'missing-origin-node', link }),
new LinkRepairAbortedError({ kind: 'missing-target-node', link }),
new LinkRepairAbortedError({
kind: 'origin-slot-out-of-bounds',
link,
originSlotCount: 2
}),
new LinkRepairAbortedError({
kind: 'target-slot-out-of-bounds',
link,
targetSlotCount: 4
}),
new LinkRepairAbortedError({ kind: 'origin-link-not-listed', link }),
new LinkRepairAbortedError({
kind: 'target-link-mismatch',
link,
actualLink: null
})
]
for (const err of cases) {
expect(err.message).toContain('[link=1 src=1:0 tgt=2:0]')
}
})
})

View File

@@ -24,16 +24,17 @@
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
import type { INodeOutputSlot } from '@/lib/litegraph/src/interfaces'
import type { NodeId } from '@/lib/litegraph/src/LGraphNode'
import type { SerialisedLLinkArray } from '@/lib/litegraph/src/LLink'
import type { LGraph, LGraphNode, LLink } from '@/lib/litegraph/src/litegraph'
import type {
ISerialisedGraph,
ISerialisedNode
} from '@/lib/litegraph/src/types/serialisation'
SerialisedGraph,
SerialisedLinkArray,
SerialisedLinkObject,
SerialisedNode,
SerialisedNodeOutput
} from './serialised'
import { describeTopologyError, toLinkContext } from './linkTopology'
import type { LinkContext, TopologyError } from './linkTopology'
interface BadLinksData<T = ISerialisedGraph | LGraph> {
export interface RepairResult<T = SerialisedGraph> {
hasBadLinks: boolean
fixed: boolean
graph: T
@@ -41,22 +42,49 @@ interface BadLinksData<T = ISerialisedGraph | LGraph> {
deleted: number
}
/**
* Thrown when the repair pass detects a divergence between its in-memory
* patched view and the live graph data, typically because the workflow's
* topology cannot be reconciled (e.g. links pointing to slots that do not
* exist on the target node). The attached `TopologyError` carries the
* `[linkId, src, srcSlot, tgt, tgtSlot]` tuple so callers can report the
* precise offending link instead of a generic invariant failure.
*/
export class LinkRepairAbortedError extends Error {
public readonly topologyError: TopologyError
constructor(topologyError: TopologyError) {
super(describeTopologyError(topologyError))
this.topologyError = topologyError
this.name = 'LinkRepairAbortedError'
}
}
enum IoDirection {
INPUT,
OUTPUT
}
function getNodeById(graph: ISerialisedGraph | LGraph, id: NodeId) {
if ((graph as LGraph).getNodeById) {
return (graph as LGraph).getNodeById(id)
}
graph = graph as ISerialisedGraph
return graph.nodes.find((node: ISerialisedNode) => node.id == id)!
interface LiveGraph extends SerialisedGraph {
getNodeById(id: string | number): SerialisedNode | undefined
}
function extendLink(link: SerialisedLLinkArray) {
function isLiveGraph(graph: SerialisedGraph | LiveGraph): graph is LiveGraph {
return typeof (graph as LiveGraph).getNodeById === 'function'
}
function getNodeById(
graph: SerialisedGraph | LiveGraph,
id: string | number
): SerialisedNode | undefined {
if (isLiveGraph(graph)) return graph.getNodeById(id)
return graph.nodes.find((n) => n.id == id)
}
function extendLink(link: SerialisedLinkArray): SerialisedLinkObject & {
link: SerialisedLinkArray
} {
return {
link: link,
link,
id: link[0],
origin_id: link[1],
origin_slot: link[2],
@@ -66,23 +94,26 @@ function extendLink(link: SerialisedLLinkArray) {
}
}
interface RepairOptions {
fix?: boolean
silent?: boolean
logger?: { log: (...args: unknown[]) => void }
}
/**
* Takes a ISerialisedGraph or live LGraph and inspects the links and nodes to ensure the linking
* makes logical sense. Can apply fixes when passed the `fix` argument as true.
* Best-effort repair of structurally inconsistent link data on a
* serialised or live graph. Pass `{ fix: false }` (default) for a dry
* run that only reports whether bad links exist.
*
* Note that fixes are a best-effort attempt. Seems to get it correct in most cases, but there is a
* chance it correct an anomaly that results in placing an incorrect link (say, if there were two
* links in the data). Users should take care to not overwrite work until manually checking the
* result.
* Throws `LinkRepairAbortedError` when the graph diverges from the
* patched view in a way the algorithm cannot reconcile (e.g. links
* pointing into out-of-bounds slots). The error carries a structured
* `TopologyError` describing the offending link.
*/
export function fixBadLinks(
graph: ISerialisedGraph | LGraph,
options: {
fix?: boolean
silent?: boolean
logger?: { log: (...args: unknown[]) => void }
} = {}
): BadLinksData {
export function repairLinks(
graph: SerialisedGraph,
options: RepairOptions = {}
): RepairResult {
const { fix = false, silent = false, logger: _logger = console } = options
const logger = {
log: (...args: unknown[]) => {
@@ -105,18 +136,15 @@ export function fixBadLinks(
} = {}
const data: {
patchedNodes: Array<ISerialisedNode | LGraphNode>
patchedNodes: SerialisedNode[]
deletedLinks: number[]
} = {
patchedNodes: [],
deletedLinks: []
}
/**
* Internal patch node. We keep track of changes in patchedNodeSlots in case we're in a dry run.
*/
function patchNodeSlot(
node: ISerialisedNode | LGraphNode,
node: SerialisedNode,
ioDir: IoDirection,
slot: number,
linkId: number,
@@ -126,12 +154,9 @@ export function fixBadLinks(
const patchedNode = patchedNodeSlots[node.id]!
if (ioDir == IoDirection.INPUT) {
patchedNode['inputs'] = patchedNode['inputs'] || {}
// We can set to null (delete), so undefined means we haven't set it at all.
if (patchedNode['inputs']![slot] !== undefined) {
logger.log(
` > Already set ${node.id}.inputs[${slot}] to ${patchedNode[
'inputs'
]![slot]!} Skipping.`
` > Already set ${node.id}.inputs[${slot}] to ${patchedNode['inputs']![slot]!} Skipping.`
)
return false
}
@@ -175,8 +200,7 @@ export function fixBadLinks(
if (fix) {
node.outputs = node.outputs || []
node.outputs[slot] =
node.outputs[slot] ||
({} satisfies Partial<INodeOutputSlot> as INodeOutputSlot)
node.outputs[slot] || ({} as SerialisedNodeOutput)
node.outputs[slot]!.links = node.outputs[slot]!.links || []
node.outputs[slot]!.links!.push(linkId)
}
@@ -199,25 +223,48 @@ export function fixBadLinks(
return true
}
/**
* Internal to check if a node (or patched data) has a linkId.
*/
function buildLinkContext(
node: SerialisedNode,
ioDir: IoDirection,
slot: number,
linkId: number
): LinkContext {
if (ioDir === IoDirection.INPUT) {
return {
linkId,
originId: '?',
originSlot: -1,
targetId: node.id,
targetSlot: slot
}
}
return {
linkId,
originId: node.id,
originSlot: slot,
targetId: '?',
targetSlot: -1
}
}
function nodeHasLinkId(
node: ISerialisedNode | LGraphNode,
node: SerialisedNode,
ioDir: IoDirection,
slot: number,
linkId: number
) {
// Patched data should be canonical. We can double check if fixing too.
let has = false
if (ioDir === IoDirection.INPUT) {
const nodeHasIt = node.inputs?.[slot]?.link === linkId
if (patchedNodeSlots[node.id]?.['inputs']) {
const patchedHasIt =
patchedNodeSlots[node.id]!['inputs']![slot] === linkId
// If we're fixing, double check that node matches.
if (fix && nodeHasIt !== patchedHasIt) {
throw Error('Error. Expected node to match patched data.')
throw new LinkRepairAbortedError({
kind: 'target-link-mismatch',
link: buildLinkContext(node, ioDir, slot, linkId),
actualLink: node.inputs?.[slot]?.link ?? null
})
}
has = patchedHasIt
} else {
@@ -228,9 +275,11 @@ export function fixBadLinks(
if (patchedNodeSlots[node.id]?.['outputs']?.[slot]?.['changes'][linkId]) {
const patchedHasIt =
patchedNodeSlots[node.id]!['outputs']![slot]?.links.includes(linkId)
// If we're fixing, double check that node matches.
if (fix && nodeHasIt !== patchedHasIt) {
throw Error('Error. Expected node to match patched data.')
throw new LinkRepairAbortedError({
kind: 'origin-link-not-listed',
link: buildLinkContext(node, ioDir, slot, linkId)
})
}
has = !!patchedHasIt
} else {
@@ -240,24 +289,23 @@ export function fixBadLinks(
return has
}
/**
* Internal to check if a node (or patched data) has a linkId.
*/
function nodeHasAnyLink(
node: ISerialisedNode | LGraphNode,
node: SerialisedNode,
ioDir: IoDirection,
slot: number
) {
// Patched data should be canonical. We can double check if fixing too.
let hasAny = false
if (ioDir === IoDirection.INPUT) {
const nodeHasAny = node.inputs?.[slot]?.link != null
if (patchedNodeSlots[node.id]?.['inputs']) {
const patchedHasAny =
patchedNodeSlots[node.id]!['inputs']![slot] != null
// If we're fixing, double check that node matches.
if (fix && nodeHasAny !== patchedHasAny) {
throw Error('Error. Expected node to match patched data.')
throw new LinkRepairAbortedError({
kind: 'target-slot-out-of-bounds',
link: buildLinkContext(node, ioDir, slot, -1),
targetSlotCount: node.inputs?.length ?? 0
})
}
hasAny = patchedHasAny
} else {
@@ -268,9 +316,12 @@ export function fixBadLinks(
if (patchedNodeSlots[node.id]?.['outputs']?.[slot]?.['changes']) {
const patchedHasAny =
patchedNodeSlots[node.id]!['outputs']![slot]?.links.length
// If we're fixing, double check that node matches.
if (fix && nodeHasAny !== patchedHasAny) {
throw Error('Error. Expected node to match patched data.')
throw new LinkRepairAbortedError({
kind: 'origin-slot-out-of-bounds',
link: buildLinkContext(node, ioDir, slot, -1),
originSlotCount: node.outputs?.length ?? 0
})
}
hasAny = !!patchedHasAny
} else {
@@ -280,52 +331,57 @@ export function fixBadLinks(
return hasAny
}
let links: Array<SerialisedLLinkArray | LLink> = []
let links: Array<SerialisedLinkArray | SerialisedLinkObject> = []
if (!Array.isArray(graph.links)) {
links = Object.values(graph.links).reduce((acc, v) => {
acc[v.id] = v
return acc
}, links)
links = Object.values(graph.links).reduce(
(acc: Array<SerialisedLinkArray | SerialisedLinkObject>, v: unknown) => {
const link = v as SerialisedLinkObject
acc[link.id] = link
return acc
},
links
)
} else {
links = graph.links
links = graph.links.filter(
(l): l is SerialisedLinkArray | SerialisedLinkObject => l != null
)
}
const linksReverse = [...links]
linksReverse.reverse()
for (const l of linksReverse) {
if (!l) continue
const link =
(l as LLink).origin_slot != null
? (l as LLink)
: extendLink(l as SerialisedLLinkArray)
const linkObj =
(l as SerialisedLinkObject).origin_slot != null
? (l as SerialisedLinkObject)
: extendLink(l as SerialisedLinkArray)
const originNode = getNodeById(graph, link.origin_id)
const ctx = toLinkContext(l)
const originNode = getNodeById(graph, ctx.originId)
const originHasLink = () =>
nodeHasLinkId(originNode!, IoDirection.OUTPUT, link.origin_slot, link.id)
const patchOrigin = (op: 'ADD' | 'REMOVE', id = link.id) =>
patchNodeSlot(originNode!, IoDirection.OUTPUT, link.origin_slot, id, op)
nodeHasLinkId(originNode!, IoDirection.OUTPUT, ctx.originSlot, ctx.linkId)
const patchOrigin = (op: 'ADD' | 'REMOVE', id = ctx.linkId) =>
patchNodeSlot(originNode!, IoDirection.OUTPUT, ctx.originSlot, id, op)
const targetNode = getNodeById(graph, link.target_id)
const targetNode = getNodeById(graph, ctx.targetId)
const targetHasLink = () =>
nodeHasLinkId(targetNode!, IoDirection.INPUT, link.target_slot, link.id)
nodeHasLinkId(targetNode!, IoDirection.INPUT, ctx.targetSlot, ctx.linkId)
const targetHasAnyLink = () =>
nodeHasAnyLink(targetNode!, IoDirection.INPUT, link.target_slot)
const patchTarget = (op: 'ADD' | 'REMOVE', id = link.id) =>
patchNodeSlot(targetNode!, IoDirection.INPUT, link.target_slot, id, op)
nodeHasAnyLink(targetNode!, IoDirection.INPUT, ctx.targetSlot)
const patchTarget = (op: 'ADD' | 'REMOVE', id = ctx.linkId) =>
patchNodeSlot(targetNode!, IoDirection.INPUT, ctx.targetSlot, id, op)
const originLog = `origin(${link.origin_id}).outputs[${link.origin_slot}].links`
const targetLog = `target(${link.target_id}).inputs[${link.target_slot}].link`
const originLog = `origin(${ctx.originId}).outputs[${ctx.originSlot}].links`
const targetLog = `target(${ctx.targetId}).inputs[${ctx.targetSlot}].link`
if (!originNode || !targetNode) {
if (!originNode && !targetNode) {
logger.log(
`Link ${link.id} is invalid, ` +
`both origin ${link.origin_id} and target ${link.target_id} do not exist`
`Link ${ctx.linkId} is invalid, both origin ${ctx.originId} and target ${ctx.targetId} do not exist`
)
} else if (!originNode) {
logger.log(
`Link ${link.id} is funky... ` +
`origin ${link.origin_id} does not exist, but target ${link.target_id} does.`
`Link ${ctx.linkId} is funky... origin ${ctx.originId} does not exist, but target ${ctx.targetId} does.`
)
if (targetHasLink()) {
logger.log(
@@ -333,14 +389,13 @@ export function fixBadLinks(
)
patchTarget('REMOVE', -1)
}
} else if (!targetNode) {
} else {
logger.log(
`Link ${link.id} is funky... ` +
`target ${link.target_id} does not exist, but origin ${link.origin_id} does.`
`Link ${ctx.linkId} is funky... target ${ctx.targetId} does not exist, but origin ${ctx.originId} does.`
)
if (originHasLink()) {
logger.log(
` > [PATCH] Origin's links' has ${link.id}; will remove the link first.`
` > [PATCH] Origin's links' has ${ctx.linkId}; will remove the link first.`
)
patchOrigin('REMOVE')
}
@@ -351,105 +406,101 @@ export function fixBadLinks(
if (targetHasLink() || originHasLink()) {
if (!originHasLink()) {
logger.log(
`${link.id} is funky... ${originLog} does NOT contain it, but ${targetLog} does.`
`${ctx.linkId} is funky... ${originLog} does NOT contain it, but ${targetLog} does.`
)
logger.log(
` > [PATCH] Attempt a fix by adding this ${link.id} to ${originLog}.`
` > [PATCH] Attempt a fix by adding this ${ctx.linkId} to ${originLog}.`
)
patchOrigin('ADD')
} else if (!targetHasLink()) {
logger.log(
`${link.id} is funky... ${targetLog} is NOT correct (is ${
targetNode.inputs?.[link.target_slot]?.link
`${ctx.linkId} is funky... ${targetLog} is NOT correct (is ${
targetNode.inputs?.[ctx.targetSlot]?.link
}), but ${originLog} contains it`
)
if (!targetHasAnyLink()) {
logger.log(
` > [PATCH] ${targetLog} is not defined, will set to ${link.id}.`
` > [PATCH] ${targetLog} is not defined, will set to ${ctx.linkId}.`
)
let patched = patchTarget('ADD')
if (!patched) {
logger.log(
` > [PATCH] Nvm, ${targetLog} already patched. Removing ${link.id} from ${originLog}.`
` > [PATCH] Nvm, ${targetLog} already patched. Removing ${ctx.linkId} from ${originLog}.`
)
patched = patchOrigin('REMOVE')
}
} else {
logger.log(
` > [PATCH] ${targetLog} is defined, removing ${link.id} from ${originLog}.`
` > [PATCH] ${targetLog} is defined, removing ${ctx.linkId} from ${originLog}.`
)
patchOrigin('REMOVE')
}
}
}
void linkObj
}
// Now that we've cleaned up the inputs and outputs, run through again looking for dangling links.
for (const l of linksReverse) {
if (!l) continue
const link =
(l as LLink).origin_slot != null
? (l as LLink)
: extendLink(l as SerialisedLLinkArray)
const originNode = getNodeById(graph, link.origin_id)
const targetNode = getNodeById(graph, link.target_id)
// Now that we've manipulated the linking, check again if they both exist.
const ctx = toLinkContext(l)
const originNode = getNodeById(graph, ctx.originId)
const targetNode = getNodeById(graph, ctx.targetId)
if (
(!originNode ||
!nodeHasLinkId(
originNode,
IoDirection.OUTPUT,
link.origin_slot,
link.id
ctx.originSlot,
ctx.linkId
)) &&
(!targetNode ||
!nodeHasLinkId(
targetNode,
IoDirection.INPUT,
link.target_slot,
link.id
ctx.targetSlot,
ctx.linkId
))
) {
logger.log(
`${link.id} is def invalid; BOTH origin node ${link.origin_id} ${
!originNode ? 'is removed' : `doesn't have ${link.id}`
} and ${link.origin_id} target node ${
!targetNode ? 'is removed' : `doesn't have ${link.id}`
`${ctx.linkId} is def invalid; BOTH origin node ${ctx.originId} ${
!originNode ? 'is removed' : `doesn't have ${ctx.linkId}`
} and ${ctx.originId} target node ${
!targetNode ? 'is removed' : `doesn't have ${ctx.linkId}`
}.`
)
data.deletedLinks.push(link.id)
data.deletedLinks.push(ctx.linkId)
continue
}
}
// If we're fixing, then we've been patching along the way. Now go through and actually delete
// the zombie links from `app.graph.links`
if (fix) {
for (let i = data.deletedLinks.length - 1; i >= 0; i--) {
logger.log(`Deleting link #${data.deletedLinks[i]}.`)
if ((graph as LGraph).getNodeById) {
delete graph.links[data.deletedLinks[i]!]
if (isLiveGraph(graph)) {
delete (graph.links as Record<number, unknown>)[data.deletedLinks[i]!]
} else {
graph = graph as ISerialisedGraph
// Sometimes we got objects for links if passed after ComfyUI's loadGraphData modifies the
// data. We make a copy now, but can handle the bastardized objects just in case.
const idx = graph.links.findIndex(
const idx = (
graph.links as Array<
SerialisedLinkArray | SerialisedLinkObject | null
>
).findIndex(
(l) =>
l &&
(l[0] === data.deletedLinks[i] ||
((l as SerialisedLinkArray)[0] === data.deletedLinks[i] ||
('id' in l && l.id === data.deletedLinks[i]))
)
if (idx === -1) {
logger.log(`INDEX NOT FOUND for #${data.deletedLinks[i]}`)
continue
}
logger.log(`splicing ${idx} from links`)
graph.links.splice(idx, 1)
;(graph.links as Array<unknown>).splice(idx, 1)
}
}
// If we're a serialized graph, we can filter out the links because it's just an array.
if (!(graph as LGraph).getNodeById) {
graph.links = (graph as ISerialisedGraph).links.filter((l) => !!l)
if (!isLiveGraph(graph)) {
graph.links = (
graph.links as Array<SerialisedLinkArray | SerialisedLinkObject | null>
).filter((l): l is SerialisedLinkArray | SerialisedLinkObject => !!l)
}
}
if (!data.patchedNodes.length && !data.deletedLinks.length) {
@@ -470,9 +521,8 @@ export function fixBadLinks(
const hasChanges = !!(data.patchedNodes.length || data.deletedLinks.length)
let hasBadLinks: boolean = hasChanges
// If we're fixing, then let's run it again to see if there are no more bad links.
if (fix) {
const rerun = fixBadLinks(graph, { fix: false, silent: true })
const rerun = repairLinks(graph, { fix: false, silent: true })
hasBadLinks = rerun.hasBadLinks
}

View File

@@ -0,0 +1,164 @@
import { describe, expect, it } from 'vitest'
import { describeTopologyError, validateLinkTopology } from './linkTopology'
import type { SerialisedGraph } from './serialised'
function makeGraph(partial: Partial<SerialisedGraph>): SerialisedGraph {
return { nodes: [], links: [], ...partial }
}
describe('validateLinkTopology', () => {
it('returns no errors for a valid graph', () => {
const graph = makeGraph({
nodes: [
{ id: 1, outputs: [{ name: 'o', type: '*', links: [10] }] },
{ id: 2, inputs: [{ name: 'i', type: '*', link: 10 }] }
],
links: [[10, 1, 0, 2, 0, '*']]
})
expect(validateLinkTopology(graph)).toEqual([])
})
it('reports target slot out of bounds (seedance regression)', () => {
const graph = makeGraph({
nodes: [
{ id: 9, outputs: [{ name: 'o', type: 'STRING', links: [29] }] },
{
id: 14,
inputs: [
{ name: 'a', type: 'STRING', link: null },
{ name: 'b', type: 'STRING', link: null },
{ name: 'c', type: 'STRING', link: null },
{ name: 'd', type: 'STRING', link: 55 },
{ name: 'e', type: 'STRING', link: null }
]
}
],
links: [[29, 9, 0, 14, 9, 'STRING']]
})
const errors = validateLinkTopology(graph)
expect(errors).toHaveLength(1)
expect(errors[0]).toMatchObject({
kind: 'target-slot-out-of-bounds',
link: { linkId: 29, targetId: 14, targetSlot: 9 },
targetSlotCount: 5
})
expect(describeTopologyError(errors[0]!)).toContain(
'[link=29 src=9:0 tgt=14:9]'
)
})
it('reports a missing origin node', () => {
const graph = makeGraph({
nodes: [{ id: 2, inputs: [{ name: 'i', type: '*', link: 10 }] }],
links: [[10, 999, 0, 2, 0, '*']]
})
const errors = validateLinkTopology(graph)
expect(errors[0]?.kind).toBe('missing-origin-node')
})
it('reports a target-link mismatch', () => {
const graph = makeGraph({
nodes: [
{ id: 1, outputs: [{ name: 'o', type: '*', links: [10] }] },
{ id: 2, inputs: [{ name: 'i', type: '*', link: 999 }] }
],
links: [[10, 1, 0, 2, 0, '*']]
})
const errors = validateLinkTopology(graph)
expect(errors[0]).toMatchObject({
kind: 'target-link-mismatch',
actualLink: 999
})
})
it('accepts object-form links for valid graphs', () => {
const graph = makeGraph({
nodes: [
{ id: 1, outputs: [{ name: 'o', type: '*', links: [10] }] },
{ id: 2, inputs: [{ name: 'i', type: '*', link: 10 }] }
],
links: [
{
id: 10,
origin_id: 1,
origin_slot: 0,
target_id: 2,
target_slot: 0,
type: '*'
}
]
})
expect(validateLinkTopology(graph)).toEqual([])
})
it('reports object-form links with out-of-bounds slots', () => {
const graph = makeGraph({
nodes: [
{ id: 1, outputs: [{ name: 'o', type: '*', links: [10] }] },
{
id: 2,
inputs: [{ name: 'a', type: '*', link: null }]
}
],
links: [
{
id: 10,
origin_id: 1,
origin_slot: 0,
target_id: 2,
target_slot: 5,
type: '*'
}
]
})
const errors = validateLinkTopology(graph)
expect(errors[0]).toMatchObject({
kind: 'target-slot-out-of-bounds',
link: { linkId: 10, targetId: 2, targetSlot: 5 }
})
})
})
describe('describeTopologyError', () => {
it('formats every error kind with the [linkId, src, srcSlot, tgt, tgtSlot] tuple', () => {
const link = {
linkId: 7,
originId: 3,
originSlot: 1,
targetId: 4,
targetSlot: 2
}
const tuple = '[link=7 src=3:1 tgt=4:2]'
expect(
describeTopologyError({ kind: 'missing-origin-node', link })
).toContain(tuple)
expect(
describeTopologyError({ kind: 'missing-target-node', link })
).toContain(tuple)
expect(
describeTopologyError({
kind: 'origin-slot-out-of-bounds',
link,
originSlotCount: 0
})
).toContain(tuple)
expect(
describeTopologyError({
kind: 'target-slot-out-of-bounds',
link,
targetSlotCount: 5
})
).toContain(tuple)
expect(
describeTopologyError({ kind: 'origin-link-not-listed', link })
).toContain(tuple)
expect(
describeTopologyError({
kind: 'target-link-mismatch',
link,
actualLink: null
})
).toContain(tuple)
})
})

View File

@@ -0,0 +1,158 @@
import type {
SerialisedGraph,
SerialisedLinkArray,
SerialisedLinkObject,
SerialisedNode
} from './serialised'
export interface LinkContext {
linkId: number
originId: string | number
originSlot: number
targetId: string | number
targetSlot: number
}
export type TopologyError =
| { kind: 'missing-origin-node'; link: LinkContext }
| { kind: 'missing-target-node'; link: LinkContext }
| {
kind: 'origin-slot-out-of-bounds'
link: LinkContext
originSlotCount: number
}
| {
kind: 'target-slot-out-of-bounds'
link: LinkContext
targetSlotCount: number
}
| { kind: 'origin-link-not-listed'; link: LinkContext }
| {
kind: 'target-link-mismatch'
link: LinkContext
actualLink: number | null
}
export function describeTopologyError(error: TopologyError): string {
const { linkId, originId, originSlot, targetId, targetSlot } = error.link
const tuple = `[link=${linkId} src=${originId}:${originSlot} tgt=${targetId}:${targetSlot}]`
switch (error.kind) {
case 'missing-origin-node':
return `${tuple} origin node ${originId} does not exist in graph`
case 'missing-target-node':
return `${tuple} target node ${targetId} does not exist in graph`
case 'origin-slot-out-of-bounds':
return `${tuple} origin slot ${originSlot} is out of bounds; node ${originId} has ${error.originSlotCount} output slot(s)`
case 'target-slot-out-of-bounds':
return `${tuple} target slot ${targetSlot} is out of bounds; node ${targetId} has ${error.targetSlotCount} input slot(s)`
case 'origin-link-not-listed':
return `${tuple} link is not listed in node ${originId}.outputs[${originSlot}].links`
case 'target-link-mismatch':
return `${tuple} node ${targetId}.inputs[${targetSlot}].link is ${error.actualLink}, expected ${linkId}`
}
}
function isLinkObject(
l: SerialisedLinkArray | SerialisedLinkObject
): l is SerialisedLinkObject {
return !Array.isArray(l) && typeof l === 'object'
}
export function toLinkContext(
l: SerialisedLinkArray | SerialisedLinkObject
): LinkContext {
if (isLinkObject(l)) {
return {
linkId: l.id,
originId: l.origin_id,
originSlot: l.origin_slot,
targetId: l.target_id,
targetSlot: l.target_slot
}
}
return {
linkId: l[0],
originId: l[1],
originSlot: l[2],
targetId: l[3],
targetSlot: l[4]
}
}
function getNodeById(
graph: SerialisedGraph,
id: string | number
): SerialisedNode | undefined {
return graph.nodes.find((n) => n.id == id)
}
function iterateLinks(
graph: SerialisedGraph
): Array<SerialisedLinkArray | SerialisedLinkObject> {
if (Array.isArray(graph.links)) {
return graph.links.filter(
(l): l is SerialisedLinkArray | SerialisedLinkObject => l != null
)
}
const result: Array<SerialisedLinkArray | SerialisedLinkObject> = []
for (const l of Object.values(graph.links)) {
if (l) result.push(l as SerialisedLinkObject)
}
return result
}
/**
* Pure topology check: every link must reference real nodes, in-bounds
* slots, and consistent input/output endpoints. Does not mutate the
* graph. Use `repairLinks` (separate module) to attempt auto-fix.
*/
export function validateLinkTopology(graph: SerialisedGraph): TopologyError[] {
const errors: TopologyError[] = []
for (const l of iterateLinks(graph)) {
const link = toLinkContext(l)
const origin = getNodeById(graph, link.originId)
const target = getNodeById(graph, link.targetId)
if (!origin) errors.push({ kind: 'missing-origin-node', link })
if (!target) errors.push({ kind: 'missing-target-node', link })
if (!origin || !target) continue
const outputs = origin.outputs ?? []
const originSlotOutOfBounds =
link.originSlot < 0 || link.originSlot >= outputs.length
if (originSlotOutOfBounds) {
errors.push({
kind: 'origin-slot-out-of-bounds',
link,
originSlotCount: outputs.length
})
}
const inputs = target.inputs ?? []
const targetSlotOutOfBounds =
link.targetSlot < 0 || link.targetSlot >= inputs.length
if (targetSlotOutOfBounds) {
errors.push({
kind: 'target-slot-out-of-bounds',
link,
targetSlotCount: inputs.length
})
}
if (originSlotOutOfBounds || targetSlotOutOfBounds) {
continue
}
const originLinks = outputs[link.originSlot]?.links ?? []
if (!originLinks.includes(link.linkId)) {
errors.push({ kind: 'origin-link-not-listed', link })
}
const targetLink = inputs[link.targetSlot]?.link ?? null
if (targetLink !== link.linkId) {
errors.push({
kind: 'target-link-mismatch',
link,
actualLink: targetLink
})
}
}
return errors
}
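The checks above can be seen in isolation with a cut-down sketch. This is not the package's API, just a minimal re-implementation of the "node exists" and "slot in bounds" rules over tuple-form links, using the same seedance regression shape as the test above (slot 9 on a node with 5 inputs); `checkLinks`, `Node`, and `LinkTuple` here are illustrative names, not exports of `linkTopology.ts`.

```typescript
// Illustrative sketch only: mirrors two of validateLinkTopology's rules.
type LinkTuple = [number, number, number, number, number, string]

interface Node {
  id: number
  inputCount: number
  outputCount: number
}

function checkLinks(nodes: Node[], links: LinkTuple[]): string[] {
  const errors: string[] = []
  for (const [id, originId, originSlot, targetId, targetSlot] of links) {
    const origin = nodes.find((n) => n.id === originId)
    const target = nodes.find((n) => n.id === targetId)
    if (!origin) errors.push(`link ${id}: missing origin ${originId}`)
    if (!target) errors.push(`link ${id}: missing target ${targetId}`)
    if (!origin || !target) continue
    // Negative slots and slots past the end are both out of bounds.
    if (originSlot < 0 || originSlot >= origin.outputCount)
      errors.push(`link ${id}: origin slot ${originSlot} out of bounds`)
    if (targetSlot < 0 || targetSlot >= target.inputCount)
      errors.push(`link ${id}: target slot ${targetSlot} out of bounds`)
  }
  return errors
}

// The seedance regression: link 29 targets slot 9 on a 5-input node.
const errs = checkLinks(
  [
    { id: 9, inputCount: 0, outputCount: 1 },
    { id: 14, inputCount: 5, outputCount: 0 }
  ],
  [[29, 9, 0, 14, 9, 'STRING']]
)
console.log(errs)
```

Both endpoints exist here, so only the slot-bounds rule fires, matching the single `target-slot-out-of-bounds` error the real validator reports for this graph.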

View File

@@ -0,0 +1,60 @@
/**
* Minimal structural types for serialised workflow JSON.
*
* The validation/repair code in this package operates on plain JSON
* (parsed `.json` workflow files) — it does NOT need the runtime
* `LGraph`/`LGraphNode` classes from litegraph. Defining the shapes
* locally keeps this package free of frontend/litegraph coupling so
* it can be consumed by Node.js CI scripts and a future backend
* validator.
*
* These types intentionally mirror the relevant fields used by
* `validateLinkTopology` and `repairLinks`. They are a subset of the
* `ISerialisedGraph` / `ISerialisedNode` shapes from
* `@/lib/litegraph/src/types/serialisation` and stay structurally
* compatible with them.
*/
/** Schema version 0.4 link tuple: `[id, originId, originSlot, targetId, targetSlot, type]`. */
export type SerialisedLinkArray = [
number,
string | number,
number,
string | number,
number,
string | string[] | number
]
/** Object form of a link (schema version 1, or after live-graph hydration). */
export interface SerialisedLinkObject {
id: number
origin_id: string | number
origin_slot: number
target_id: string | number
target_slot: number
type?: string | string[] | number
}
export interface SerialisedNodeInput {
name?: string
type?: string | string[] | number
link?: number | null
}
export interface SerialisedNodeOutput {
name?: string
type?: string | string[] | number
links?: number[] | null
}
export interface SerialisedNode {
id: string | number
type?: string
inputs?: SerialisedNodeInput[]
outputs?: SerialisedNodeOutput[]
}
export interface SerialisedGraph {
nodes: SerialisedNode[]
links: Array<SerialisedLinkArray | SerialisedLinkObject | null>
}
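Both link shapes defined above normalise to the same context object; a standalone sketch of that normalisation (mirroring `toLinkContext` in `linkTopology.ts`, with local copies of the types rather than the package's exports):

```typescript
// Illustrative local copies of the two serialised link shapes.
type LinkArray = [number, string | number, number, string | number, number, string]
interface LinkObject {
  id: number
  origin_id: string | number
  origin_slot: number
  target_id: string | number
  target_slot: number
}
interface LinkContext {
  linkId: number
  originId: string | number
  originSlot: number
  targetId: string | number
  targetSlot: number
}

// Array.isArray narrows the union, so each branch is fully typed.
function toContext(l: LinkArray | LinkObject): LinkContext {
  if (Array.isArray(l)) {
    const [linkId, originId, originSlot, targetId, targetSlot] = l
    return { linkId, originId, originSlot, targetId, targetSlot }
  }
  return {
    linkId: l.id,
    originId: l.origin_id,
    originSlot: l.origin_slot,
    targetId: l.target_id,
    targetSlot: l.target_slot
  }
}

const a = toContext([10, 1, 0, 2, 0, '*'])
const b = toContext({ id: 10, origin_id: 1, origin_slot: 0, target_id: 2, target_slot: 0 })
console.log(JSON.stringify(a) === JSON.stringify(b)) // true
```

This is why the validator and repairer can treat schema-0.4 tuples and schema-1 objects uniformly.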

View File

@@ -1,7 +1,8 @@
import { z } from 'zod'
import type { SafeParseReturnType } from 'zod'
import { fromZodError } from 'zod-validation-error'
import type { RendererType } from '@/lib/litegraph/src/LGraph'
type RendererType = 'LG' | 'Vue' | 'Vue-corrected'
const zRendererType = z.enum([
'LG',
@@ -313,7 +314,16 @@ const zExtra = z
.passthrough()
const zGraphDefinitions = z.object({
subgraphs: z.lazy(() => z.array(zSubgraphDefinition))
subgraphs: z.lazy(
(): z.ZodArray<
z.ZodType<
SubgraphDefinitionBase<ComfyWorkflow1BaseOutput>,
z.ZodTypeDef,
SubgraphDefinitionBase<ComfyWorkflow1BaseInput>
>,
'many'
> => z.array(zSubgraphDefinition)
)
})
const zBaseExportableGraph = z.object({

View File

@@ -0,0 +1,9 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"rootDir": "src",
"outDir": "dist"
},
"include": ["src/**/*"],
"references": [{ "path": "./tsconfig.node.json" }]
}

View File

@@ -0,0 +1,10 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"composite": true,
"outDir": "dist/.tsnode",
"module": "ESNext",
"moduleResolution": "bundler"
},
"include": ["vite.config.mts"]
}

View File

@@ -0,0 +1,26 @@
import { resolve } from 'path'
import { defineConfig } from 'vite'
import dts from 'vite-plugin-dts'
export default defineConfig({
build: {
lib: {
entry: resolve(__dirname, 'src/index.ts'),
name: 'workflow-validation',
formats: ['es'],
fileName: 'index'
},
copyPublicDir: false,
minify: false,
rollupOptions: {
external: ['zod', 'zod-validation-error']
}
},
plugins: [
dts({
tsconfigPath: 'tsconfig.json',
include: ['src/**/*'],
exclude: ['src/**/*.test.ts']
})
]
})

pnpm-lock.yaml (generated, 104 changed lines)
View File

@@ -443,6 +443,9 @@ importers:
'@comfyorg/tailwind-utils':
specifier: workspace:*
version: link:packages/tailwind-utils
'@comfyorg/workflow-validation':
specifier: workspace:*
version: link:packages/workflow-validation
'@formkit/auto-animate':
specifier: 'catalog:'
version: 0.9.0
@@ -1062,6 +1065,25 @@ importers:
specifier: 'catalog:'
version: 5.9.3
packages/workflow-validation:
dependencies:
zod:
specifier: 'catalog:'
version: 3.25.76
zod-validation-error:
specifier: 'catalog:'
version: 3.3.0(zod@3.25.76)
devDependencies:
typescript:
specifier: 'catalog:'
version: 5.9.3
vite:
specifier: ^8.0.0
version: 8.0.0(@types/node@25.0.3)(esbuild@0.27.3)(jiti@2.6.1)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2)
vite-plugin-dts:
specifier: 'catalog:'
version: 4.5.4(@types/node@25.0.3)(rollup@4.53.5)(typescript@5.9.3)(vite@8.0.0(@types/node@25.0.3)(esbuild@0.27.3)(jiti@2.6.1)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2))
packages:
'@acemir/cssom@0.9.30':
@@ -12245,6 +12267,14 @@ snapshots:
transitivePeerDependencies:
- '@types/node'
'@microsoft/api-extractor-model@7.33.1(@types/node@25.0.3)':
dependencies:
'@microsoft/tsdoc': 0.16.0
'@microsoft/tsdoc-config': 0.18.0
'@rushstack/node-core-library': 5.20.1(@types/node@25.0.3)
transitivePeerDependencies:
- '@types/node'
'@microsoft/api-extractor@7.57.2(@types/node@24.10.4)':
dependencies:
'@microsoft/api-extractor-model': 7.33.1(@types/node@24.10.4)
@@ -12264,6 +12294,25 @@ snapshots:
transitivePeerDependencies:
- '@types/node'
'@microsoft/api-extractor@7.57.2(@types/node@25.0.3)':
dependencies:
'@microsoft/api-extractor-model': 7.33.1(@types/node@25.0.3)
'@microsoft/tsdoc': 0.16.0
'@microsoft/tsdoc-config': 0.18.0
'@rushstack/node-core-library': 5.20.1(@types/node@25.0.3)
'@rushstack/rig-package': 0.7.1
'@rushstack/terminal': 0.22.1(@types/node@25.0.3)
'@rushstack/ts-command-line': 5.3.1(@types/node@25.0.3)
diff: 8.0.3
lodash: 4.17.23
minimatch: 10.2.1
resolve: 1.22.11
semver: 7.5.4
source-map: 0.6.1
typescript: 5.8.2
transitivePeerDependencies:
- '@types/node'
'@microsoft/tsdoc-config@0.18.0':
dependencies:
'@microsoft/tsdoc': 0.16.0
@@ -13136,10 +13185,27 @@ snapshots:
optionalDependencies:
'@types/node': 24.10.4
'@rushstack/node-core-library@5.20.1(@types/node@25.0.3)':
dependencies:
ajv: 8.13.0
ajv-draft-04: 1.0.0(ajv@8.13.0)
ajv-formats: 3.0.1(ajv@8.13.0)
fs-extra: 11.3.2
import-lazy: 4.0.0
jju: 1.4.0
resolve: 1.22.11
semver: 7.5.4
optionalDependencies:
'@types/node': 25.0.3
'@rushstack/problem-matcher@0.2.1(@types/node@24.10.4)':
optionalDependencies:
'@types/node': 24.10.4
'@rushstack/problem-matcher@0.2.1(@types/node@25.0.3)':
optionalDependencies:
'@types/node': 25.0.3
'@rushstack/rig-package@0.7.1':
dependencies:
resolve: 1.22.11
@@ -13153,6 +13219,14 @@ snapshots:
optionalDependencies:
'@types/node': 24.10.4
'@rushstack/terminal@0.22.1(@types/node@25.0.3)':
dependencies:
'@rushstack/node-core-library': 5.20.1(@types/node@25.0.3)
'@rushstack/problem-matcher': 0.2.1(@types/node@25.0.3)
supports-color: 8.1.1
optionalDependencies:
'@types/node': 25.0.3
'@rushstack/ts-command-line@5.3.1(@types/node@24.10.4)':
dependencies:
'@rushstack/terminal': 0.22.1(@types/node@24.10.4)
@@ -13162,6 +13236,15 @@ snapshots:
transitivePeerDependencies:
- '@types/node'
'@rushstack/ts-command-line@5.3.1(@types/node@25.0.3)':
dependencies:
'@rushstack/terminal': 0.22.1(@types/node@25.0.3)
'@types/argparse': 1.0.38
argparse: 1.0.10
string-argv: 0.3.2
transitivePeerDependencies:
- '@types/node'
'@sec-ant/readable-stream@0.4.1': {}
'@sentry-internal/browser-utils@10.32.1':
@@ -14189,7 +14272,7 @@ snapshots:
sirv: 3.0.2
tinyglobby: 0.2.15
tinyrainbow: 3.0.3
vitest: 4.0.16(@opentelemetry/api@1.9.0)(@types/node@24.10.4)(@vitest/ui@4.0.16)(esbuild@0.27.3)(happy-dom@20.0.11)(jiti@2.6.1)(jsdom@27.4.0)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2)
vitest: 4.0.16(@opentelemetry/api@1.9.0)(@types/node@25.0.3)(@vitest/ui@4.0.16)(esbuild@0.27.3)(happy-dom@20.0.11)(jiti@2.6.1)(jsdom@27.4.0)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2)
'@vitest/utils@3.2.4':
dependencies:
@@ -20144,6 +20227,25 @@ snapshots:
- rollup
- supports-color
vite-plugin-dts@4.5.4(@types/node@25.0.3)(rollup@4.53.5)(typescript@5.9.3)(vite@8.0.0(@types/node@25.0.3)(esbuild@0.27.3)(jiti@2.6.1)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2)):
dependencies:
'@microsoft/api-extractor': 7.57.2(@types/node@25.0.3)
'@rollup/pluginutils': 5.3.0(rollup@4.53.5)
'@volar/typescript': 2.4.28
'@vue/language-core': 2.2.0(typescript@5.9.3)
compare-versions: 6.1.1
debug: 4.4.3
kolorist: 1.8.0
local-pkg: 1.1.2
magic-string: 0.30.21
typescript: 5.9.3
optionalDependencies:
vite: 8.0.0(@types/node@25.0.3)(esbuild@0.27.3)(jiti@2.6.1)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2)
transitivePeerDependencies:
- '@types/node'
- rollup
- supports-color
vite-plugin-html@3.2.2(vite@8.0.0(@types/node@24.10.4)(esbuild@0.27.3)(jiti@2.6.1)(terser@5.39.2)(tsx@4.19.4)(yaml@2.8.2)):
dependencies:
'@rollup/pluginutils': 4.2.1

View File

@@ -2,10 +2,7 @@ import fs from 'fs'
import path from 'path'
import { zodToJsonSchema } from 'zod-to-json-schema'
import {
zComfyWorkflow,
zComfyWorkflow1
} from '../src/platform/workflow/validation/schemas/workflowSchema'
import { zComfyWorkflow, zComfyWorkflow1 } from '@comfyorg/workflow-validation'
import { zComfyNodeDef as zComfyNodeDefV2 } from '../src/schemas/nodeDef/nodeDefSchemaV2'
import { zComfyNodeDef as zComfyNodeDefV1 } from '../src/schemas/nodeDefSchema'
@@ -57,4 +54,4 @@ fs.writeFileSync(
JSON.stringify(nodeDefV2Schema, null, 2)
)
console.log('JSON Schemas generated successfully!')
console.warn('JSON Schemas generated successfully!')

View File

@@ -0,0 +1,96 @@
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url'
const __dirname = path.dirname(fileURLToPath(import.meta.url))
const repoRoot = path.resolve(__dirname, '..')
const packageDir = path.join(repoRoot, 'packages', 'workflow-validation')
const distDir = path.join(packageDir, 'dist')
interface SourcePackage {
name: string
version: string
description?: string
license?: string
repository?: string
homepage?: string
dependencies?: Record<string, string>
publishConfig?: Record<string, unknown>
}
const sourcePackage = JSON.parse(
fs.readFileSync(path.join(packageDir, 'package.json'), 'utf8')
) as SourcePackage
const workspaceYaml =
fs
.readFileSync(path.join(repoRoot, 'pnpm-workspace.yaml'), 'utf8')
.replace(/\r\n/g, '\n') + '\n___end:'
const workspaceCatalog =
workspaceYaml.match(/^catalog:\n([\s\S]*?)\n\S/m)?.[1] ?? ''
function resolveCatalog(name: string): string {
const sourceVersion = sourcePackage.dependencies?.[name]
if (sourceVersion && sourceVersion !== 'catalog:') return sourceVersion
const re = new RegExp(`^\\s+'?${name}'?:\\s*([^\\n]+)$`, 'm')
const match = workspaceCatalog.match(re)
if (!match) {
throw new Error(
`Could not resolve catalog version for ${name}. ` +
`Expected entry under \`catalog:\` in pnpm-workspace.yaml.`
)
}
return match[1]!.trim()
}
const distPackage = {
name: sourcePackage.name,
version: sourcePackage.version,
description: sourcePackage.description,
license: sourcePackage.license,
repository: sourcePackage.repository,
homepage: sourcePackage.homepage,
type: 'module',
main: './index.js',
module: './index.js',
types: './index.d.ts',
exports: {
'.': {
types: './index.d.ts',
import: './index.js'
},
'./linkRepair': {
types: './linkRepair.d.ts',
import: './index.js'
},
'./linkTopology': {
types: './linkTopology.d.ts',
import: './index.js'
},
'./workflowSchema': {
types: './workflowSchema.d.ts',
import: './index.js'
},
'./serialised': {
types: './serialised.d.ts',
import: './index.js'
}
},
files: ['*.js', '*.d.ts'],
publishConfig: sourcePackage.publishConfig ?? { access: 'public' },
dependencies: {
zod: resolveCatalog('zod'),
'zod-validation-error': resolveCatalog('zod-validation-error')
}
}
if (!fs.existsSync(distDir)) {
fs.mkdirSync(distDir, { recursive: true })
}
fs.writeFileSync(
path.join(distDir, 'package.json'),
JSON.stringify(distPackage, null, 2) + '\n'
)
console.warn(`Prepared ${distPackage.name}@${distPackage.version} in dist/`)
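The catalog lookup in `resolveCatalog` hinges on one line-anchored regex that tolerates both quoted and unquoted package names. A self-contained sketch of just that match, over a hypothetical two-entry `catalog:` snippet (the versions here are illustrative, not read from the real workspace file):

```typescript
// Hypothetical excerpt of the indented body under `catalog:`.
const catalogBlock = [
  "  zod: 3.25.76",
  "  'zod-validation-error': 3.3.0"
].join('\n')

// Same pattern as resolveCatalog: optional quotes around the name,
// multiline anchor so each catalog entry is matched as a whole line.
function lookup(name: string): string | null {
  const re = new RegExp(`^\\s+'?${name}'?:\\s*([^\\n]+)$`, 'm')
  const match = catalogBlock.match(re)
  return match ? match[1]!.trim() : null
}

console.log(lookup('zod')) // "3.25.76"
console.log(lookup('zod-validation-error')) // "3.3.0"
console.log(lookup('missing-pkg')) // null
```

Note the `'?` on both sides of the name keeps `zod` from matching the quoted `'zod-validation-error'` entry, since the character after the optional closing quote must be `:`.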

View File

@@ -2261,7 +2261,22 @@
"special": "Must contain at least one special character",
"match": "Passwords must match"
},
"personalDataConsentRequired": "You must agree to the processing of your personal data."
"personalDataConsentRequired": "You must agree to the processing of your personal data.",
"topology": {
"invalidLinks": "Workflow has {count} invalid link | Workflow has {count} invalid links",
"overflow": "…and {count} more (see console for full list)",
"abortedSummary": "Workflow has unrepairable invalid links",
"validationSummary": "Workflow Validation",
"linksFixedSummary": "Workflow Links Fixed",
"linksFixedDetail": "Fixed {patched} node connections and removed {deleted} invalid links.",
"tuple": "[link={linkId} src={originId}:{originSlot} tgt={targetId}:{targetSlot}]",
"missingOriginNode": "{tuple} origin node {originId} does not exist in graph",
"missingTargetNode": "{tuple} target node {targetId} does not exist in graph",
"originSlotOutOfBounds": "{tuple} origin slot {originSlot} is out of bounds; node {originId} has {count} output slot | {tuple} origin slot {originSlot} is out of bounds; node {originId} has {count} output slots",
"targetSlotOutOfBounds": "{tuple} target slot {targetSlot} is out of bounds; node {targetId} has {count} input slot | {tuple} target slot {targetSlot} is out of bounds; node {targetId} has {count} input slots",
"originLinkNotListed": "{tuple} link is not listed in node {originId}.outputs[{originSlot}].links",
"targetLinkMismatch": "{tuple} node {targetId}.inputs[{targetSlot}].link is {actualLink}, expected {linkId}"
}
},
"credits": {
"activity": "Activity",

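The `invalidLinks` and `*OutOfBounds` strings above use vue-i18n's pipe syntax, where `|` separates the singular and plural forms and `{count}` is interpolated. A simplified standalone sketch of that form selection (real vue-i18n also supports a three-form `zero | one | many` variant and custom plural rules, which this ignores):

```typescript
// Simplified two-form plural chooser, assuming vue-i18n's default rule
// where any count other than 1 selects the plural form.
function choosePlural(message: string, count: number): string {
  const forms = message.split('|').map((s) => s.trim())
  const form = forms.length > 1 && count !== 1 ? forms[1]! : forms[0]!
  return form.replace('{count}', String(count))
}

const msg =
  'Workflow has {count} invalid link | Workflow has {count} invalid links'
console.log(choosePlural(msg, 1)) // "Workflow has 1 invalid link"
console.log(choosePlural(msg, 7)) // "Workflow has 7 invalid links"
```

This is why the composable passes the count twice: once as the choice index (`t(key, count, …)`) and once as the `{count}` named parameter.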
View File

@@ -0,0 +1,257 @@
import { LinkRepairAbortedError } from '@comfyorg/workflow-validation'
import type {
ComfyWorkflowJSON,
RepairResult,
TopologyError
} from '@comfyorg/workflow-validation'
import type * as WorkflowValidationModule from '@comfyorg/workflow-validation'
import { createPinia, setActivePinia } from 'pinia'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { useWorkflowValidation } from './useWorkflowValidation'
const toastAddMock = vi.hoisted(() => vi.fn())
const toastAddAlertMock = vi.hoisted(() => vi.fn())
vi.mock('@/platform/updates/common/toastStore', () => ({
useToastStore: () => ({
add: toastAddMock,
addAlert: toastAddAlertMock
})
}))
vi.mock('vue-i18n', () => ({
useI18n: () => ({
t: (key: string, ...rest: unknown[]) => {
const last = rest[rest.length - 1]
const params =
last && typeof last === 'object' && 'named' in (last as object)
? (last as { named: Record<string, unknown> }).named
: (last as Record<string, unknown> | undefined)
if (!params) return key
return `${key}|${JSON.stringify(params)}`
}
})
}))
const validateLinkTopologyMock = vi.hoisted(() => vi.fn())
const repairLinksMock = vi.hoisted(() => vi.fn())
const describeTopologyErrorMock = vi.hoisted(() =>
vi.fn((e: TopologyError) => `desc:${e.kind}:${e.link.linkId}`)
)
vi.mock('@comfyorg/workflow-validation', async () => {
const actual = await vi.importActual<typeof WorkflowValidationModule>(
'@comfyorg/workflow-validation'
)
return {
...actual,
validateLinkTopology: validateLinkTopologyMock,
repairLinks: repairLinksMock,
describeTopologyError: describeTopologyErrorMock
}
})
const validateComfyWorkflowMock = vi.hoisted(() => vi.fn())
vi.mock('@/platform/workflow/validation/schemas/workflowSchema', () => ({
validateComfyWorkflow: validateComfyWorkflowMock
}))
vi.mock('@/scripts/utils', () => ({
clone: <T>(v: T): T => structuredClone(v)
}))
function makeLink(linkId: number) {
return {
linkId,
originId: 1,
originSlot: 0,
targetId: 2,
targetSlot: 0
}
}
function makeWorkflow(): ComfyWorkflowJSON {
return {
version: 0.4,
last_node_id: 2,
last_link_id: 1,
nodes: [
{ id: 1, outputs: [{ name: 'o', type: '*', links: [1] }] },
{ id: 2, inputs: [{ name: 'i', type: '*', link: 1 }] }
] as unknown as ComfyWorkflowJSON['nodes'],
links: [[1, 1, 0, 2, 0, '*']] as unknown as ComfyWorkflowJSON['links']
} as ComfyWorkflowJSON
}
function repairResult(
graph: ComfyWorkflowJSON,
overrides: Partial<RepairResult> = {}
): RepairResult {
return {
graph: graph as unknown as RepairResult['graph'],
hasBadLinks: false,
fixed: false,
patched: 0,
deleted: 0,
...overrides
}
}
describe('useWorkflowValidation', () => {
beforeEach(() => {
setActivePinia(createPinia())
toastAddMock.mockClear()
toastAddAlertMock.mockClear()
validateLinkTopologyMock.mockReset()
repairLinksMock.mockReset()
describeTopologyErrorMock.mockClear()
validateComfyWorkflowMock.mockReset()
})
afterEach(() => vi.restoreAllMocks())
it('returns null when schema validation fails', async () => {
validateComfyWorkflowMock.mockImplementation(async (_d, onError) => {
onError('bad schema')
return null
})
const { validateWorkflow } = useWorkflowValidation()
const out = await validateWorkflow(makeWorkflow())
expect(out.graphData).toBeNull()
expect(toastAddAlertMock).toHaveBeenCalledWith('bad schema')
expect(repairLinksMock).not.toHaveBeenCalled()
})
it('passes through when schema validation succeeds and no topology errors exist', async () => {
const wf = makeWorkflow()
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue([])
repairLinksMock.mockImplementation((g) => repairResult(g))
const { validateWorkflow } = useWorkflowValidation()
const out = await validateWorkflow(wf)
expect(out.graphData).not.toBeNull()
expect(toastAddMock).not.toHaveBeenCalled()
})
it('emits a single warn toast summarising up to TOPOLOGY_TOAST_LIMIT errors', async () => {
const wf = makeWorkflow()
const errors: TopologyError[] = Array.from({ length: 7 }, (_v, i) => ({
kind: 'missing-origin-node',
link: makeLink(i + 1)
}))
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue(errors)
repairLinksMock.mockImplementation((g) => repairResult(g))
const { validateWorkflow } = useWorkflowValidation()
await validateWorkflow(wf)
const warns = toastAddMock.mock.calls.filter(([arg]) =>
(arg as { summary: string }).summary.startsWith(
'validation.topology.invalidLinks'
)
)
expect(warns).toHaveLength(1)
const detail = (warns[0]![0] as { detail: string }).detail
expect(detail).toContain('validation.topology.overflow')
expect(detail.split('\n')).toHaveLength(6)
})
it('shows the success toast when repair fixes links', async () => {
const wf = makeWorkflow()
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue([])
repairLinksMock.mockImplementation((g) =>
repairResult(g, { fixed: true, patched: 2, deleted: 1 })
)
const { validateWorkflow } = useWorkflowValidation()
await validateWorkflow(wf)
expect(toastAddMock).toHaveBeenCalledWith(
expect.objectContaining({
severity: 'success',
summary: expect.stringContaining(
'validation.topology.linksFixedSummary'
)
})
)
})
it('returns null and emits an error toast on LinkRepairAbortedError', async () => {
const wf = makeWorkflow()
const topologyError: TopologyError = {
kind: 'target-slot-out-of-bounds',
link: makeLink(7),
targetSlotCount: 5
}
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue([topologyError])
repairLinksMock.mockImplementation(() => {
throw new LinkRepairAbortedError(topologyError)
})
const { validateWorkflow } = useWorkflowValidation()
const out = await validateWorkflow(wf)
expect(out.graphData).toBeNull()
const errorToast = toastAddMock.mock.calls.find(
([arg]) => (arg as { severity: string }).severity === 'error'
)
expect(errorToast).toBeDefined()
expect((errorToast![0] as { summary: string }).summary).toContain(
'validation.topology.abortedSummary'
)
})
it('re-throws unexpected errors from repairLinks', async () => {
const wf = makeWorkflow()
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue([])
repairLinksMock.mockImplementation(() => {
throw new TypeError('boom')
})
const { validateWorkflow } = useWorkflowValidation()
await expect(validateWorkflow(wf)).rejects.toThrow(TypeError)
})
it('clones graphData before passing to repairLinks so the abort fallback is untouched', async () => {
const wf = makeWorkflow()
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue([])
let received: ComfyWorkflowJSON | undefined
repairLinksMock.mockImplementation((g: ComfyWorkflowJSON) => {
received = g
return repairResult(g)
})
const { validateWorkflow } = useWorkflowValidation()
await validateWorkflow(wf)
expect(received).not.toBe(wf)
})
it('silent option suppresses toasts but still validates', async () => {
const wf = makeWorkflow()
validateComfyWorkflowMock.mockResolvedValue(wf)
validateLinkTopologyMock.mockReturnValue([
{ kind: 'missing-origin-node', link: makeLink(1) }
])
repairLinksMock.mockImplementation((g) =>
repairResult(g, { fixed: true, patched: 1, deleted: 0 })
)
const { validateWorkflow } = useWorkflowValidation()
const out = await validateWorkflow(wf, { silent: true })
expect(out.graphData).not.toBeNull()
expect(toastAddMock).not.toHaveBeenCalled()
expect(toastAddAlertMock).not.toHaveBeenCalled()
})
})

View File

@@ -1,60 +1,156 @@
import type { ISerialisedGraph } from '@/lib/litegraph/src/types/serialisation'
import {
LinkRepairAbortedError,
describeTopologyError,
repairLinks,
validateLinkTopology
} from '@comfyorg/workflow-validation'
import type {
SerialisedGraph,
TopologyError
} from '@comfyorg/workflow-validation'
import { useI18n } from 'vue-i18n'
import { useToastStore } from '@/platform/updates/common/toastStore'
import type { ComfyWorkflowJSON } from '@/platform/workflow/validation/schemas/workflowSchema'
import { validateComfyWorkflow } from '@/platform/workflow/validation/schemas/workflowSchema'
import { fixBadLinks } from '@/utils/linkFixer'
import { clone } from '@/scripts/utils'
interface ValidationResult {
graphData: ComfyWorkflowJSON | null
}
const TOPOLOGY_TOAST_LIMIT = 5
export function useWorkflowValidation() {
const toastStore = useToastStore()
const { t } = useI18n()
function linkParams(error: TopologyError): Record<string, unknown> {
return {
linkId: error.link.linkId,
originId: error.link.originId,
originSlot: error.link.originSlot,
targetId: error.link.targetId,
targetSlot: error.link.targetSlot
}
}
function localizeTopologyError(error: TopologyError): string {
const base = linkParams(error)
const tuple = t('validation.topology.tuple', base)
const params = { ...base, tuple }
switch (error.kind) {
case 'missing-origin-node':
return t('validation.topology.missingOriginNode', params)
case 'missing-target-node':
return t('validation.topology.missingTargetNode', params)
case 'origin-slot-out-of-bounds':
return t(
'validation.topology.originSlotOutOfBounds',
error.originSlotCount,
{ named: { ...params, count: error.originSlotCount } }
)
case 'target-slot-out-of-bounds':
return t(
'validation.topology.targetSlotOutOfBounds',
error.targetSlotCount,
{ named: { ...params, count: error.targetSlotCount } }
)
case 'origin-link-not-listed':
return t('validation.topology.originLinkNotListed', params)
case 'target-link-mismatch':
return t('validation.topology.targetLinkMismatch', {
...params,
actualLink: String(error.actualLink)
})
}
}
function summariseTopologyErrors(errors: TopologyError[]): string {
const lines = errors
.slice(0, TOPOLOGY_TOAST_LIMIT)
.map(localizeTopologyError)
if (errors.length > TOPOLOGY_TOAST_LIMIT) {
lines.push(
t('validation.topology.overflow', {
count: errors.length - TOPOLOGY_TOAST_LIMIT
})
)
}
return lines.join('\n')
}
function reportTopology(errors: TopologyError[], silent: boolean) {
if (silent || errors.length === 0) return
for (const e of errors) console.warn('[topology]', describeTopologyError(e))
toastStore.add({
severity: 'warn',
summary: t('validation.topology.invalidLinks', errors.length, {
named: { count: errors.length }
}),
detail: summariseTopologyErrors(errors),
life: 10_000
})
}
function tryFixLinks(
graphData: ComfyWorkflowJSON,
options: { silent?: boolean } = {}
) {
): { graph: ComfyWorkflowJSON; aborted: boolean } {
const { silent = false } = options
const topologyErrors = validateLinkTopology(graphData as SerialisedGraph)
reportTopology(topologyErrors, silent)
// Collect all logs in an array
const repairTarget = clone(graphData)
const logs: string[] = []
// Then validate and fix links if schema validation passed
const linkValidation = fixBadLinks(graphData as ISerialisedGraph, {
fix: true,
silent,
logger: {
log: (...args: unknown[]) => {
logs.push(args.join(' '))
try {
const linkValidation = repairLinks(repairTarget as SerialisedGraph, {
fix: true,
silent,
logger: {
log: (...args: unknown[]) => logs.push(args.join(' '))
}
}
})
if (!silent && logs.length > 0) {
toastStore.add({
severity: 'warn',
summary: 'Workflow Validation',
detail: logs.join('\n')
})
}
// If links were fixed, notify the user
if (linkValidation.fixed) {
if (!silent) {
if (!silent && logs.length > 0) {
toastStore.add({
severity: 'success',
summary: 'Workflow Links Fixed',
detail: `Fixed ${linkValidation.patched} node connections and removed ${linkValidation.deleted} invalid links.`
severity: 'warn',
summary: t('validation.topology.validationSummary'),
detail: logs.join('\n')
})
}
if (linkValidation.fixed && !silent) {
toastStore.add({
severity: 'success',
summary: t('validation.topology.linksFixedSummary'),
detail: t('validation.topology.linksFixedDetail', {
patched: linkValidation.patched,
deleted: linkValidation.deleted
})
})
}
return {
graph: linkValidation.graph as ComfyWorkflowJSON,
aborted: false
}
} catch (err: unknown) {
if (err instanceof LinkRepairAbortedError) {
if (!silent) {
toastStore.add({
severity: 'error',
summary: t('validation.topology.abortedSummary'),
detail: localizeTopologyError(err.topologyError),
life: 15_000
})
}
console.error('[linkFixer aborted]', err.topologyError, err)
return { graph: graphData, aborted: true }
}
console.error(err)
throw err
}
}
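The catch branch above keys off `LinkRepairAbortedError` carrying the offending `TopologyError`, which is what lets the caller localize the failure instead of dumping a raw repair log. A minimal sketch of that contract (field names and `kind` values are assumptions; the real definitions live in `packages/workflow-validation`):

```typescript
// Hypothetical shape of the abort error the catch block inspects.
// The `kind` union here is illustrative, not the package's real one.
interface TopologyError {
  kind: 'dangling-link' | 'self-loop' | 'duplicate-link-id'
  linkId: number
  message: string
}

class LinkRepairAbortedError extends Error {
  // Preserve the structured error so callers can format it per `kind`.
  constructor(readonly topologyError: TopologyError) {
    super(`Link repair aborted: ${topologyError.message}`)
    this.name = 'LinkRepairAbortedError'
  }
}
```

Because the structured `TopologyError` survives the throw, `tryFixLinks` can hand it to `localizeTopologyError` for the toast while still logging the full error object.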
/**
 * Validates a workflow: schema validation first, then link topology
 * validation and repair.
 */
async function validateWorkflow(
graphData: ComfyWorkflowJSON,
options: {
@@ -63,32 +159,16 @@ export function useWorkflowValidation() {
): Promise<ValidationResult> {
const { silent = false } = options
const validatedGraphData = await validateComfyWorkflow(graphData, (err) => {
if (!silent) toastStore.addAlert(err)
})
if (!validatedGraphData) {
return { graphData: null }
}
const { graph, aborted } = tryFixLinks(validatedGraphData, { silent })
return { graphData: aborted ? null : graph }
}
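The early return plus the `aborted` flag collapse both failure modes into a null `graphData`. A standalone sketch of that narrowing (the types here are assumed stand-ins, not the real composable):

```typescript
// Assumed minimal shapes; the real types come from the validation package.
type ComfyWorkflowJSON = { nodes: unknown[] }
type ValidationResult = { graphData: ComfyWorkflowJSON | null }

// Mirrors validateWorkflow's result logic: a schema failure and an aborted
// link repair both surface to callers as graphData === null.
function resolveResult(
  schemaResult: ComfyWorkflowJSON | null,
  aborted: boolean
): ValidationResult {
  if (!schemaResult) return { graphData: null }
  return { graphData: aborted ? null : schemaResult }
}
```

Callers therefore only need one null check to cover both "schema rejected" and "repair gave up", rather than catching a repair exception themselves.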
return {

View File

@@ -1,3 +1,4 @@
import { fixBadLinks } from '@comfyorg/workflow-validation'
import { describe, expect, it, vi } from 'vitest'
import type { SerialisedLLinkArray } from '@/lib/litegraph/src/LLink'
@@ -6,8 +7,6 @@ import type {
ISerialisedNode
} from '@/lib/litegraph/src/types/serialisation'
type SerialisedInput = NonNullable<ISerialisedNode['inputs']>[number]
type SerialisedOutput = NonNullable<ISerialisedNode['outputs']>[number]

View File

@@ -27,6 +27,10 @@
],
"@/utils/networkUtil": [
"./packages/shared-frontend-utils/src/networkUtil.ts"
],
"@/utils/linkFixer": ["./packages/workflow-validation/src/linkRepair.ts"],
"@/platform/workflow/validation/schemas/workflowSchema": [
"./packages/workflow-validation/src/workflowSchema.ts"
]
},
"typeRoots": ["src/types", "node_modules/@types", "./node_modules"],

View File

@@ -634,6 +634,9 @@ export default defineConfig({
'@/utils/formatUtil': '/packages/shared-frontend-utils/src/formatUtil.ts',
'@/utils/networkUtil':
'/packages/shared-frontend-utils/src/networkUtil.ts',
'@/utils/linkFixer': '/packages/workflow-validation/src/linkRepair.ts',
'@/platform/workflow/validation/schemas/workflowSchema':
'/packages/workflow-validation/src/workflowSchema.ts',
'@': '/src'
}
},
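Per the first commit in this range, the storybook vite config must mirror these two aliases or `storybook-build` fails with `UNLOADABLE_DEPENDENCY` on the moved package paths. A hedged sketch of the mirrored alias map (surrounding file structure assumed; only the entries come from the config above):

```typescript
// Mirror of the vite.config.mts aliases for the storybook build.
// As in vite.config.mts above, the specific package aliases are listed
// before the catch-all '@' entry.
const alias = {
  '@/utils/linkFixer': '/packages/workflow-validation/src/linkRepair.ts',
  '@/platform/workflow/validation/schemas/workflowSchema':
    '/packages/workflow-validation/src/workflowSchema.ts',
  '@': '/src'
}
```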