Mirror of https://github.com/Comfy-Org/ComfyUI_frontend.git (synced 2026-04-19 22:09:37 +00:00)

Compare commits: `perf/fix-f...automation` (14 commits)
Commits (SHA1):

- a20b279013
- fa1ffcba01
- f1db1122f3
- f56abb3ecf
- 95c6811f59
- 88079250eb
- 08ea013c51
- 4aae52c2fc
- 174da3ffa5
- 6b7691422b
- 437f41c553
- 975393b48b
- a44fa1fdd5
- cc3acebceb
````diff
@@ -18,12 +18,20 @@ Cherry-pick backport management for Comfy-Org/ComfyUI_frontend stable release branches
 ## System Context
 
-| Item           | Value                                             |
-| -------------- | ------------------------------------------------- |
-| Repo           | `~/ComfyUI_frontend` (Comfy-Org/ComfyUI_frontend) |
-| Merge strategy | Squash merge (`gh pr merge --squash --admin`)     |
-| Automation     | `pr-backport.yaml` GitHub Action (label-driven)   |
-| Tracking dir   | `~/temp/backport-session/`                        |
+| Item           | Value                                                                       |
+| -------------- | --------------------------------------------------------------------------- |
+| Repo           | `~/ComfyUI_frontend` (Comfy-Org/ComfyUI_frontend)                           |
+| Merge strategy | Auto-merge via workflow (`--auto --squash`); `--admin` only after CI passes |
+| Automation     | `pr-backport.yaml` GitHub Action (label-driven, auto-merge enabled)         |
+| Tracking dir   | `~/temp/backport-session/`                                                  |
+
+## CI Safety Rules
+
+**NEVER merge a backport PR without all CI checks passing.** This applies to both automation-created and manual cherry-pick PRs.
+
+- **Automation PRs:** The `pr-backport.yaml` workflow now enables `gh pr merge --auto --squash`, so clean PRs auto-merge once CI passes. Monitor with polling (`gh pr list --base TARGET_BRANCH --state open`). Do not intervene unless CI fails.
+- **Manual cherry-pick PRs:** After `gh pr create`, wait for CI before merging. Poll with `gh pr checks $PR --watch` or use a sleep+check loop. Only merge after all checks pass.
+- **CI failures:** DO NOT use `--admin` to bypass failing CI. Analyze the failure, present it to the user with possible causes (test backported without implementation, missing dependency, flaky test), and let the user decide the next step.
 
 ## Branch Scope Rules
````
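The playbook's CI safety rules suggest polling with `gh pr checks $PR --watch` or a sleep+check loop. A minimal sketch of such a loop follows; the function name and the interval/attempt defaults are illustrative assumptions, and it assumes an authenticated `gh` CLI. The jq filters are the same ones used elsewhere in this playbook.

```shell
# Hypothetical sleep+check helper: poll a PR's checks until none are pending,
# then report PASSED/FAILED (or TIMEOUT after the attempt budget runs out).
wait_for_ci() {
  local pr="$1" interval="${2:-60}" attempts="${3:-60}"
  local i pending failed
  for ((i = 0; i < attempts; i++)); do
    failed=$(gh pr checks "$pr" --json name,state \
      --jq '[.[] | select(.state == "FAILURE")] | length')
    pending=$(gh pr checks "$pr" --json name,state \
      --jq '[.[] | select(.state == "PENDING" or .state == "QUEUED")] | length')
    if [ "$failed" != "0" ]; then
      echo "FAILED"
      return 1
    elif [ "$pending" = "0" ]; then
      echo "PASSED"
      return 0
    fi
    sleep "$interval"
  done
  echo "TIMEOUT"
  return 2
}
```

Typical use would be `wait_for_ci "$PR" && gh pr merge "$PR" --squash`, keeping `--admin` out of the path until checks are green.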
````diff
@@ -108,11 +116,15 @@ git fetch origin TARGET_BRANCH
 # Quick smoke check: does the branch build?
 git worktree add /tmp/verify-TARGET origin/TARGET_BRANCH
 cd /tmp/verify-TARGET
-source ~/.nvm/nvm.sh && nvm use 24 && pnpm install && pnpm typecheck
+source ~/.nvm/nvm.sh && nvm use 24 && pnpm install && pnpm typecheck && pnpm test:unit
 git worktree remove /tmp/verify-TARGET --force
 ```
 
-If typecheck fails, stop and investigate before continuing. A broken branch after wave N means all subsequent waves will compound the problem.
+If typecheck or tests fail, stop and investigate before continuing. A broken branch after wave N means all subsequent waves will compound the problem.
+
+### Never Admin-Merge Without CI
+
+In a previous bulk session, all 69 backport PRs were merged with `gh pr merge --squash --admin`, bypassing required CI checks. This shipped 3 test failures to a release branch. **Lesson: `--admin` skips all branch protection, including required status checks.** Only use `--admin` after confirming CI has passed (e.g., `gh pr checks $PR` shows all green), or rely on auto-merge (`--auto --squash`) which waits for CI by design.
 
 ## Continuous Backporting Recommendation
````
````diff
@@ -19,23 +19,44 @@ done
 # Wait 3 minutes for automation
 sleep 180
 
-# Check which got auto-PRs
+# Check which got auto-PRs (auto-merge is enabled, so clean ones will self-merge after CI)
 gh pr list --base TARGET_BRANCH --state open --limit 50 --json number,title
 ```
 
-## Step 2: Review & Merge Clean Auto-PRs
+> **Note:** The `pr-backport.yaml` workflow now enables `gh pr merge --auto --squash` on automation-created PRs. Clean PRs will auto-merge once CI passes — no manual merge needed for those.
+
+## Step 2: Wait for CI & Merge Clean Auto-PRs
+
+Most automation PRs will auto-merge once CI passes (via `--auto --squash` in the workflow). Monitor and handle failures:
 
 ```bash
-for pr in $AUTO_PRS; do
-  # Check size
-  gh pr view $pr --json title,additions,deletions,changedFiles \
-    --jq '"Files: \(.changedFiles), +\(.additions)/-\(.deletions)"'
-  # Admin merge
-  gh pr merge $pr --squash --admin
-  sleep 3
+# Wait for CI to complete (~45 minutes for full suite)
+sleep 2700
+
+# Check which PRs are still open (CI may have failed, or auto-merge succeeded)
+STILL_OPEN_PRS=$(gh pr list --base TARGET_BRANCH --state open --limit 50 --json number --jq '.[].number')
+RECENTLY_MERGED=$(gh pr list --base TARGET_BRANCH --state merged --limit 50 --json number,title,mergedAt)
+
+# For PRs still open, check CI status
+for pr in $STILL_OPEN_PRS; do
+  CI_FAILED=$(gh pr checks $pr --json name,state --jq '[.[] | select(.state == "FAILURE")] | length')
+  CI_PENDING=$(gh pr checks $pr --json name,state --jq '[.[] | select(.state == "PENDING" or .state == "QUEUED")] | length')
+  if [ "$CI_FAILED" != "0" ]; then
+    # CI failed — collect details for triage
+    echo "PR #$pr — CI FAILED:"
+    gh pr checks $pr --json name,state,link --jq '.[] | select(.state == "FAILURE") | "\(.name): \(.state)"'
+  elif [ "$CI_PENDING" != "0" ]; then
+    echo "PR #$pr — CI still running ($CI_PENDING checks pending)"
+  else
+    # All checks passed but didn't auto-merge (race condition or label issue)
+    gh pr merge $pr --squash --admin
+    sleep 3
+  fi
 done
 ```
 
+**⚠️ If CI fails: DO NOT admin-merge to bypass.** See "CI Failure Triage" below.
+
 ## Step 3: Manual Worktree for Conflicts
 
 ```bash
````
````diff
@@ -63,6 +84,13 @@ for PR in ${CONFLICT_PRS[@]}; do
   NEW_PR=$(gh pr create --base TARGET_BRANCH --head backport-$PR-to-TARGET \
     --title "[backport TARGET] TITLE (#$PR)" \
     --body "Backport of #$PR..." | grep -oP '\d+$')
+
+  # Wait for CI before merging — NEVER admin-merge without CI passing
+  echo "Waiting for CI on PR #$NEW_PR..."
+  gh pr checks $NEW_PR --watch --fail-fast || {
+    echo "⚠️ CI failed on PR #$NEW_PR — skipping merge, needs triage"
+    continue
+  }
   gh pr merge $NEW_PR --squash --admin
   sleep 3
 done
````
````diff
@@ -82,7 +110,7 @@ After completing all PRs in a wave for a target branch:
 git fetch origin TARGET_BRANCH
 git worktree add /tmp/verify-TARGET origin/TARGET_BRANCH
 cd /tmp/verify-TARGET
-source ~/.nvm/nvm.sh && nvm use 24 && pnpm install && pnpm typecheck
+source ~/.nvm/nvm.sh && nvm use 24 && pnpm install && pnpm typecheck && pnpm test:unit
 git worktree remove /tmp/verify-TARGET --force
 ```
````
````diff
@@ -132,7 +160,8 @@ git rebase origin/TARGET_BRANCH
 # Resolve new conflicts
 git push --force origin backport-$PR-to-TARGET
 sleep 20 # Wait for GitHub to recompute merge state
-gh pr merge $PR --squash --admin
+# Wait for CI after rebase before merging
+gh pr checks $PR --watch --fail-fast && gh pr merge $PR --squash --admin
 ```
 
 ## Lessons Learned
````
````diff
@@ -146,5 +175,31 @@ gh pr merge $PR --squash --admin
 7. **appModeStore.ts, painter files, GLSLShader files** don't exist on core/1.40 — `git rm` these
 8. **Always validate JSON** after resolving locale file conflicts
 9. **Dep refresh PRs** — skip on stable branches. Risk of transitive dep regressions outweighs audit cleanup. Cherry-pick individual CVE fixes instead.
-10. **Verify after each wave** — run `pnpm typecheck` on the target branch after merging a batch. Catching breakage early prevents compounding errors.
+10. **Verify after each wave** — run `pnpm typecheck && pnpm test:unit` on the target branch after merging a batch. Catching breakage early prevents compounding errors.
+11. **Cloud-only PRs don't belong on core/\* branches** — app mode, cloud auth, and cloud-specific UI changes are irrelevant to local users. Always check PR scope against branch scope before backporting.
+12. **Never admin-merge without CI** — `--admin` bypasses all branch protections including required status checks. A bulk session of 69 admin-merges shipped 3 test failures. Always wait for CI to pass first, or use `--auto --squash` which waits by design.
+
+## CI Failure Triage
+
+When CI fails on a backport PR, present failures to the user using this template:
+
+```markdown
+### PR #XXXX — CI Failed
+
+- **Failing check:** test / lint / typecheck
+- **Error:** (summary of the failure message)
+- **Likely cause:** test backported without implementation / missing dependency / flaky test / snapshot mismatch
+- **Recommendation:** backport PR #YYYY first / skip this PR / rerun CI after fixing prerequisites
+```
+
+Common failure categories:
+
+| Category                    | Example                                  | Resolution                                |
+| --------------------------- | ---------------------------------------- | ----------------------------------------- |
+| Test without implementation | Test references function not on branch   | Backport the implementation PR first      |
+| Missing dependency          | Import from module not on branch         | Backport the dependency PR first, or skip |
+| Snapshot mismatch           | Screenshot test differs                  | Usually safe — update snapshots on branch |
+| Flaky test                  | Passes on retry                          | Re-run CI, merge if green on retry        |
+| Type error                  | Interface changed on main but not branch | May need manual adaptation                |
+
+**Never assume a failure is safe to skip.** Present all failures to the user with analysis.
````
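Lesson 8 in the playbook says to always validate JSON after resolving locale file conflicts. A minimal sketch of that check, using Python's stdlib `json` parser as the validator (the helper name is an illustrative assumption; `jq empty` would work equally well):

```shell
# Hypothetical helper: parse one JSON file and report OK or INVALID,
# so a malformed locale file is caught before the backport PR is pushed.
validate_json() {
  if python3 -c 'import json, sys; json.load(open(sys.argv[1]))' "$1" 2>/dev/null; then
    echo "OK: $1"
  else
    echo "INVALID: $1"
  fi
}
```

Run it over the resolved files, e.g. `for f in *.json; do validate_json "$f"; done`.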
````diff
@@ -5,9 +5,9 @@
 Maintain `execution-log.md` with per-branch tables:
 
 ```markdown
-| PR#   | Title | Status                            | Backport PR | Notes   |
-| ----- | ----- | --------------------------------- | ----------- | ------- |
-| #XXXX | Title | ✅ Merged / ⏭️ Skip / ⏸️ Deferred | #YYYY       | Details |
+| PR#   | Title | CI Status                      | Status                            | Backport PR | Notes   |
+| ----- | ----- | ------------------------------ | --------------------------------- | ----------- | ------- |
+| #XXXX | Title | ✅ Pass / ❌ Fail / ⏳ Pending | ✅ Merged / ⏭️ Skip / ⏸️ Deferred | #YYYY       | Details |
 ```
 
 ## Wave Verification Log
````
````diff
@@ -19,6 +19,7 @@ Track verification results per wave:
 
 - PRs merged: #A, #B, #C
 - Typecheck: ✅ Pass / ❌ Fail
+- Unit tests: ✅ Pass / ❌ Fail
 - Issues found: (if any)
 - Human review needed: (list any non-trivial conflict resolutions)
 ```
````
````diff
@@ -41,6 +42,11 @@ Track verification results per wave:
 
 | PR# | Branch | Conflict Type | Resolution Summary |
 
+## CI Failure Report
+
+| PR# | Branch | Failing Check | Error Summary | Cause | Resolution |
+| --- | ------ | ------------- | ------------- | ----- | ---------- |
+
 ## Automation Performance
 
 | Metric | Value |
````
`.github/workflows/ci-tests-e2e.yaml` (4 changed lines, vendored)

````diff
@@ -39,7 +39,7 @@ jobs:
     runs-on: ubuntu-latest
     timeout-minutes: 60
     container:
-      image: ghcr.io/comfy-org/comfyui-ci-container:0.0.13
+      image: ghcr.io/comfy-org/comfyui-ci-container:0.18.2
       credentials:
         username: ${{ github.actor }}
         password: ${{ secrets.GITHUB_TOKEN }}
@@ -87,7 +87,7 @@ jobs:
     needs: setup
     runs-on: ubuntu-latest
     container:
-      image: ghcr.io/comfy-org/comfyui-ci-container:0.0.13
+      image: ghcr.io/comfy-org/comfyui-ci-container:0.18.2
       credentials:
         username: ${{ github.actor }}
         password: ${{ secrets.GITHUB_TOKEN }}
````
`.github/workflows/pr-report.yaml` (2 changed lines, vendored)

````diff
@@ -180,7 +180,7 @@ jobs:
           if git ls-remote --exit-code origin perf-data >/dev/null 2>&1; then
             git fetch origin perf-data --depth=1
             mkdir -p temp/perf-history
-            for file in $(git ls-tree --name-only origin/perf-data baselines/ 2>/dev/null | sort -r | head -10); do
+            for file in $(git ls-tree --name-only origin/perf-data baselines/ 2>/dev/null | sort -r | head -15); do
               git show "origin/perf-data:${file}" > "temp/perf-history/$(basename "$file")" 2>/dev/null || true
             done
             echo "Loaded $(ls temp/perf-history/*.json 2>/dev/null | wc -l) historical baselines"
````
Another workflow file (filename not shown in this capture):

````diff
@@ -77,7 +77,7 @@ jobs:
     needs: setup
     runs-on: ubuntu-latest
     container:
-      image: ghcr.io/comfy-org/comfyui-ci-container:0.0.13
+      image: ghcr.io/comfy-org/comfyui-ci-container:0.18.2
       credentials:
         username: ${{ github.actor }}
         password: ${{ secrets.GITHUB_TOKEN }}
````
Vite config (filename not shown in this capture):

````diff
@@ -1,6 +1,6 @@
 import tailwindcss from '@tailwindcss/vite'
 import vue from '@vitejs/plugin-vue'
-import dotenv from 'dotenv'
+import { config as dotenvConfig } from 'dotenv'
 import path from 'node:path'
 import { fileURLToPath } from 'node:url'
 import { FileSystemIconLoader } from 'unplugin-icons/loaders'
@@ -11,7 +11,7 @@ import { defineConfig } from 'vite'
 import { createHtmlPlugin } from 'vite-plugin-html'
 import vueDevTools from 'vite-plugin-vue-devtools'
 
-dotenv.config()
+dotenvConfig()
 
 const projectRoot = fileURLToPath(new URL('.', import.meta.url))
````
`apps/website/.gitignore` (new file, 2 lines, vendored):

```
dist/
.astro/
```
`apps/website/astro.config.ts` (new file, 24 lines):

```ts
import { defineConfig } from 'astro/config'
import vue from '@astrojs/vue'
import tailwindcss from '@tailwindcss/vite'

export default defineConfig({
  site: 'https://comfy.org',
  output: 'static',
  integrations: [vue()],
  vite: {
    plugins: [tailwindcss()]
  },
  build: {
    assetsPrefix: process.env.VERCEL_URL
      ? `https://${process.env.VERCEL_URL}`
      : undefined
  },
  i18n: {
    locales: ['en', 'zh-CN'],
    defaultLocale: 'en',
    routing: {
      prefixDefaultLocale: false
    }
  }
})
```
`apps/website/package.json` (new file, 80 lines):

```json
{
  "name": "@comfyorg/website",
  "version": "0.0.1",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "astro dev",
    "build": "astro build",
    "preview": "astro preview"
  },
  "dependencies": {
    "@comfyorg/design-system": "workspace:*",
    "@vercel/analytics": "catalog:",
    "vue": "catalog:"
  },
  "devDependencies": {
    "@astrojs/vue": "catalog:",
    "@tailwindcss/vite": "catalog:",
    "astro": "catalog:",
    "tailwindcss": "catalog:",
    "typescript": "catalog:"
  },
  "nx": {
    "tags": ["scope:website", "type:app"],
    "targets": {
      "dev": {
        "executor": "nx:run-commands",
        "continuous": true,
        "options": {
          "cwd": "apps/website",
          "command": "astro dev"
        }
      },
      "serve": {
        "executor": "nx:run-commands",
        "continuous": true,
        "options": {
          "cwd": "apps/website",
          "command": "astro dev"
        }
      },
      "build": {
        "executor": "nx:run-commands",
        "cache": true,
        "dependsOn": ["^build"],
        "options": {
          "cwd": "apps/website",
          "command": "astro build"
        },
        "outputs": ["{projectRoot}/dist"]
      },
      "preview": {
        "executor": "nx:run-commands",
        "continuous": true,
        "dependsOn": ["build"],
        "options": {
          "cwd": "apps/website",
          "command": "astro preview"
        }
      },
      "typecheck": {
        "executor": "nx:run-commands",
        "cache": true,
        "options": {
          "cwd": "apps/website",
          "command": "astro check"
        }
      }
    }
  }
}
```
`apps/website/public/fonts/inter-latin-italic.woff2` (new binary file, not shown)

`apps/website/public/fonts/inter-latin-normal.woff2` (new binary file, not shown)
`apps/website/src/env.d.ts` (new file, 1 line, vendored):

```ts
/// <reference types="astro/client" />
```
`apps/website/src/styles/global.css` (new file, 2 lines):

```css
@import 'tailwindcss';
@import '@comfyorg/design-system/css/base.css';
```
`apps/website/tsconfig.json` (new file, 9 lines):

```json
{
  "extends": "astro/tsconfigs/strict",
  "compilerOptions": {
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": ["src/**/*", "astro.config.mjs"]
}
```
New workflow-fixture JSON, 555 lines (filename not shown in this capture):

```json
{
  "id": "b04d981f-6857-48cc-ac6e-429ab2f6bc8d",
  "revision": 0,
  "last_node_id": 11,
  "last_link_id": 16,
  "nodes": [
    {
      "id": 9,
      "type": "SaveImage",
      "pos": [1451.0058559453123, 189.0019842294924],
      "size": [400, 200],
      "flags": {},
      "order": 2,
      "mode": 0,
      "inputs": [{ "name": "images", "type": "IMAGE", "link": 13 }],
      "outputs": [],
      "properties": {},
      "widgets_values": ["ComfyUI"]
    },
    {
      "id": 4,
      "type": "CheckpointLoaderSimple",
      "pos": [25.988896564209426, 473.9973077158204],
      "size": [400, 200],
      "flags": {},
      "order": 0,
      "mode": 0,
      "inputs": [],
      "outputs": [
        { "name": "MODEL", "type": "MODEL", "slot_index": 0, "links": [11] },
        { "name": "CLIP", "type": "CLIP", "slot_index": 1, "links": [10] },
        { "name": "VAE", "type": "VAE", "slot_index": 2, "links": [12] }
      ],
      "properties": { "Node name for S&R": "CheckpointLoaderSimple" },
      "widgets_values": ["v1-5-pruned-emaonly-fp16.safetensors"]
    },
    {
      "id": 10,
      "type": "d14ff4cf-e5cb-4c84-941f-7c2457476424",
      "pos": [711.776576770508, 420.55569028417983],
      "size": [400, 293],
      "flags": {},
      "order": 1,
      "mode": 0,
      "inputs": [
        { "name": "clip", "type": "CLIP", "link": 10 },
        { "name": "model", "type": "MODEL", "link": 11 },
        { "name": "vae", "type": "VAE", "link": 12 }
      ],
      "outputs": [{ "name": "IMAGE", "type": "IMAGE", "links": [13] }],
      "properties": {
        "proxyWidgets": [["7", "text"], ["6", "text"], ["3", "seed"]]
      },
      "widgets_values": []
    }
  ],
  "links": [
    [10, 4, 1, 10, 0, "CLIP"],
    [11, 4, 0, 10, 1, "MODEL"],
    [12, 4, 2, 10, 2, "VAE"],
    [13, 10, 0, 9, 0, "IMAGE"]
  ],
  "groups": [],
  "definitions": {
    "subgraphs": [
      {
        "id": "d14ff4cf-e5cb-4c84-941f-7c2457476424",
        "version": 1,
        "state": { "lastGroupId": 0, "lastNodeId": 11, "lastLinkId": 16, "lastRerouteId": 0 },
        "revision": 0,
        "config": {},
        "name": "New Subgraph",
        "inputNode": { "id": -10, "bounding": [233, 404.5, 120, 100] },
        "outputNode": { "id": -20, "bounding": [1494, 424.5, 120, 60] },
        "inputs": [
          {
            "id": "85b7a46b-f14c-4297-8b86-3fc73a41da2b",
            "name": "clip",
            "type": "CLIP",
            "linkIds": [14],
            "localized_name": "clip",
            "pos": [333, 424.5]
          },
          {
            "id": "b4040cb7-0457-416e-ad6e-14890b871dd2",
            "name": "model",
            "type": "MODEL",
            "linkIds": [1],
            "localized_name": "model",
            "pos": [333, 444.5]
          },
          {
            "id": "e61199fa-9113-4532-a3d9-879095969171",
            "name": "vae",
            "type": "VAE",
            "linkIds": [8],
            "localized_name": "vae",
            "pos": [333, 464.5]
          }
        ],
        "outputs": [
          {
            "id": "a4705fa5-a5e6-4c4e-83c2-dfb875861466",
            "name": "IMAGE",
            "type": "IMAGE",
            "linkIds": [9],
            "localized_name": "IMAGE",
            "pos": [1514, 444.5]
          }
        ],
        "widgets": [],
        "nodes": [
          {
            "id": 5,
            "type": "EmptyLatentImage",
            "pos": [473.007643669922, 609.0214689174805],
            "size": [400, 200],
            "flags": {},
            "order": 0,
            "mode": 0,
            "inputs": [],
            "outputs": [
              { "localized_name": "LATENT", "name": "LATENT", "type": "LATENT", "slot_index": 0, "links": [2] }
            ],
            "properties": { "Node name for S&R": "EmptyLatentImage" },
            "widgets_values": [512, 512, 1]
          },
          {
            "id": 3,
            "type": "KSampler",
            "pos": [862.990643669922, 185.9853293300783],
            "size": [400, 317],
            "flags": {},
            "order": 2,
            "mode": 0,
            "inputs": [
              { "localized_name": "model", "name": "model", "type": "MODEL", "link": 1 },
              { "localized_name": "positive", "name": "positive", "type": "CONDITIONING", "link": 16 },
              { "localized_name": "negative", "name": "negative", "type": "CONDITIONING", "link": 15 },
              { "localized_name": "latent_image", "name": "latent_image", "type": "LATENT", "link": 2 }
            ],
            "outputs": [
              { "localized_name": "LATENT", "name": "LATENT", "type": "LATENT", "slot_index": 0, "links": [7] }
            ],
            "properties": { "Node name for S&R": "KSampler" },
            "widgets_values": [156680208700286, "randomize", 20, 8, "euler", "normal", 1]
          },
          {
            "id": 8,
            "type": "VAEDecode",
            "pos": [1209.0062878349609, 188.00400724755877],
            "size": [400, 200],
            "flags": {},
            "order": 3,
            "mode": 0,
            "inputs": [
              { "localized_name": "samples", "name": "samples", "type": "LATENT", "link": 7 },
              { "localized_name": "vae", "name": "vae", "type": "VAE", "link": 8 }
            ],
            "outputs": [
              { "localized_name": "IMAGE", "name": "IMAGE", "type": "IMAGE", "slot_index": 0, "links": [9] }
            ],
            "properties": { "Node name for S&R": "VAEDecode" },
            "widgets_values": []
          },
          {
            "id": 11,
            "type": "3b9b7fb9-a8f6-4b4e-ac13-b68156afe8f6",
            "pos": [485.5190761650391, 283.9247189174806],
            "size": [400, 237],
            "flags": {},
            "order": 1,
            "mode": 0,
            "inputs": [{ "localized_name": "clip", "name": "clip", "type": "CLIP", "link": 14 }],
            "outputs": [
              { "localized_name": "CONDITIONING", "name": "CONDITIONING", "type": "CONDITIONING", "links": [15] },
              { "localized_name": "CONDITIONING_1", "name": "CONDITIONING_1", "type": "CONDITIONING", "links": [16] }
            ],
            "properties": { "proxyWidgets": [["7", "text"], ["6", "text"]] },
            "widgets_values": []
          }
        ],
        "groups": [],
        "links": [
          { "id": 2, "origin_id": 5, "origin_slot": 0, "target_id": 3, "target_slot": 3, "type": "LATENT" },
          { "id": 7, "origin_id": 3, "origin_slot": 0, "target_id": 8, "target_slot": 0, "type": "LATENT" },
          { "id": 1, "origin_id": -10, "origin_slot": 1, "target_id": 3, "target_slot": 0, "type": "MODEL" },
          { "id": 8, "origin_id": -10, "origin_slot": 2, "target_id": 8, "target_slot": 1, "type": "VAE" },
          { "id": 9, "origin_id": 8, "origin_slot": 0, "target_id": -20, "target_slot": 0, "type": "IMAGE" },
          { "id": 14, "origin_id": -10, "origin_slot": 0, "target_id": 11, "target_slot": 0, "type": "CLIP" },
          { "id": 15, "origin_id": 11, "origin_slot": 0, "target_id": 3, "target_slot": 2, "type": "CONDITIONING" },
          { "id": 16, "origin_id": 11, "origin_slot": 1, "target_id": 3, "target_slot": 1, "type": "CONDITIONING" }
        ],
        "extra": {}
      },
      {
        "id": "3b9b7fb9-a8f6-4b4e-ac13-b68156afe8f6",
        "version": 1,
        "state": { "lastGroupId": 0, "lastNodeId": 11, "lastLinkId": 16, "lastRerouteId": 0 },
        "revision": 0,
        "config": {},
        "name": "New Subgraph",
        "inputNode": { "id": -10, "bounding": [233.01228575000005, 332.7902770140076, 120, 60] },
        "outputNode": {
          "id": -20,
          "bounding": [898.2956109453125, 322.7902770140076, 138.31666564941406, 80]
        },
        "inputs": [
          {
            "id": "e5074a9c-3b33-4998-b569-0638817e81e7",
            "name": "clip",
            "type": "CLIP",
            "linkIds": [5, 3],
            "localized_name": "clip",
            "pos": [55, 20]
          }
        ],
        "outputs": [
          {
            "id": "5fd778da-7ff1-4a0b-9282-d11a2e332e15",
            "name": "CONDITIONING",
            "type": "CONDITIONING",
            "linkIds": [6],
            "localized_name": "CONDITIONING",
            "pos": [20, 20]
          },
          {
            "id": "1e02089f-6491-45fa-aa0a-24458100f8ae",
            "name": "CONDITIONING_1",
            "type": "CONDITIONING",
            "linkIds": [4],
            "localized_name": "CONDITIONING_1",
            "pos": [20, 40]
          }
        ],
        "widgets": [],
        "nodes": [
          {
            "id": 7,
            "type": "CLIPTextEncode",
            "pos": [413.01228575000005, 388.98593823266606],
            "size": [425, 200],
            "flags": {},
            "order": 1,
            "mode": 0,
            "inputs": [{ "localized_name": "clip", "name": "clip", "type": "CLIP", "link": 5 }],
            "outputs": [
              { "localized_name": "CONDITIONING", "name": "CONDITIONING", "type": "CONDITIONING", "slot_index": 0, "links": [6] }
            ],
            "properties": { "Node name for S&R": "CLIPTextEncode" },
            "widgets_values": ["text, watermark"]
          },
          {
            "id": 6,
            "type": "CLIPTextEncode",
            "pos": [414.99053247091683, 185.9946096918335],
            "size": [423, 200],
            "flags": {},
            "order": 0,
            "mode": 0,
            "inputs": [{ "localized_name": "clip", "name": "clip", "type": "CLIP", "link": 3 }],
            "outputs": [
              { "localized_name": "CONDITIONING", "name": "CONDITIONING", "type": "CONDITIONING", "slot_index": 0, "links": [4] }
            ],
            "properties": { "Node name for S&R": "CLIPTextEncode" },
            "widgets_values": [
              "beautiful scenery nature glass bottle landscape, , purple galaxy bottle,"
            ]
          }
        ],
        "groups": [],
        "links": [
          { "id": 5, "origin_id": -10, "origin_slot": 0, "target_id": 7, "target_slot": 0, "type": "CLIP" },
          { "id": 3, "origin_id": -10, "origin_slot": 0, "target_id": 6, "target_slot": 0, "type": "CLIP" },
          { "id": 6, "origin_id": 7, "origin_slot": 0, "target_id": -20, "target_slot": 0, "type": "CONDITIONING" },
          { "id": 4, "origin_id": 6, "origin_slot": 0, "target_id": -20, "target_slot": 1, "type": "CONDITIONING" }
        ],
        "extra": {}
      }
    ]
  },
  "config": {},
  "extra": {
    "ds": {
      "scale": 0.6830134553650709,
      "offset": [-203.70966200000038, 259.92420099999975]
    },
    "frontendVersion": "1.43.2"
  },
  "version": 0.4
}
```
````diff
@@ -5,7 +5,7 @@ import type {
   Page
 } from '@playwright/test'
 import { test as base, expect } from '@playwright/test'
-import dotenv from 'dotenv'
+import { config as dotenvConfig } from 'dotenv'
 
 import { TestIds } from './selectors'
 import { NodeBadgeMode } from '../../src/types/nodeSource'
@@ -40,7 +40,7 @@ import { WorkflowHelper } from './helpers/WorkflowHelper'
 import type { NodeReference } from './utils/litegraphUtils'
 import type { WorkspaceStore } from '../types/globals'
 
-dotenv.config()
+dotenvConfig()
 
 class ComfyPropertiesPanel {
   readonly root: Locator
````
@@ -25,13 +25,15 @@ export class DragDropHelper {
       url?: string
       dropPosition?: Position
       waitForUpload?: boolean
+      preserveNativePropagation?: boolean
     } = {}
   ): Promise<void> {
     const {
       dropPosition = { x: 100, y: 100 },
       fileName,
       url,
-      waitForUpload = false
+      waitForUpload = false,
+      preserveNativePropagation = false
     } = options

     if (!fileName && !url)
@@ -43,7 +45,8 @@ export class DragDropHelper {
       fileType?: string
       buffer?: Uint8Array | number[]
       url?: string
-    } = { dropPosition }
+      preserveNativePropagation: boolean
+    } = { dropPosition, preserveNativePropagation }

     if (fileName) {
       const filePath = this.assetPath(fileName)
@@ -115,15 +118,17 @@ export class DragDropHelper {
       )
     }

-    Object.defineProperty(dropEvent, 'preventDefault', {
-      value: () => {},
-      writable: false
-    })
+    if (!params.preserveNativePropagation) {
+      Object.defineProperty(dropEvent, 'preventDefault', {
+        value: () => {},
+        writable: false
+      })

-    Object.defineProperty(dropEvent, 'stopPropagation', {
-      value: () => {},
-      writable: false
-    })
+      Object.defineProperty(dropEvent, 'stopPropagation', {
+        value: () => {},
+        writable: false
+      })
+    }

     targetElement.dispatchEvent(dragOverEvent)
     targetElement.dispatchEvent(dropEvent)
@@ -154,7 +159,10 @@ export class DragDropHelper {

   async dragAndDropURL(
     url: string,
-    options: { dropPosition?: Position } = {}
+    options: {
+      dropPosition?: Position
+      preserveNativePropagation?: boolean
+    } = {}
   ): Promise<void> {
     return this.dragAndDropExternalResource({ url, ...options })
   }
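The gate above can be exercised outside Playwright. A minimal sketch of the same idea, assuming a runtime with a WHATWG `Event` global (Node 15+ or a browser); `makeDropEvent` is a hypothetical helper name, not part of `DragDropHelper`:

```typescript
// Sketch: build a synthetic drop event and stub preventDefault /
// stopPropagation only when native propagation should be suppressed.
function makeDropEvent(preserveNativePropagation: boolean): Event {
  const dropEvent = new Event('drop', { bubbles: true, cancelable: true })
  if (!preserveNativePropagation) {
    // Freeze the methods so page code cannot cancel the synthetic event.
    Object.defineProperty(dropEvent, 'preventDefault', {
      value: () => {},
      writable: false
    })
    Object.defineProperty(dropEvent, 'stopPropagation', {
      value: () => {},
      writable: false
    })
  }
  return dropEvent
}
```

With `preserveNativePropagation: true` the event keeps its real `preventDefault`, so page-level drop handlers (like the workflow-from-URL loader exercised in the new test) see an ordinary cancellable event.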
@@ -23,6 +23,7 @@ export interface PerfMeasurement {
   layoutDurationMs: number
   taskDurationMs: number
   heapDeltaBytes: number
+  heapUsedBytes: number
   domNodes: number
   jsHeapTotalBytes: number
   scriptDurationMs: number
@@ -190,6 +191,7 @@ export class PerformanceHelper {
   layoutDurationMs: delta('LayoutDuration') * 1000,
   taskDurationMs: delta('TaskDuration') * 1000,
   heapDeltaBytes: delta('JSHeapUsedSize'),
+  heapUsedBytes: after.JSHeapUsedSize,
   domNodes: delta('Nodes'),
   jsHeapTotalBytes: delta('JSHeapTotalSize'),
   scriptDurationMs: delta('ScriptDuration') * 1000,
@@ -62,6 +62,8 @@ export const TestIds = {
     colorRed: 'red'
   },
   widgets: {
+    container: 'node-widgets',
+    widget: 'node-widget',
     decrement: 'decrement',
     increment: 'increment',
     domWidgetTextarea: 'dom-widget-textarea',
@@ -1,11 +1,10 @@
-import type { FullConfig } from '@playwright/test'
-import dotenv from 'dotenv'
+import { config as dotenvConfig } from 'dotenv'

 import { backupPath } from './utils/backupUtils'

-dotenv.config()
+dotenvConfig()

-export default function globalSetup(_config: FullConfig) {
+export default function globalSetup() {
   if (!process.env.CI) {
     if (process.env.TEST_COMFYUI_DIR) {
       backupPath([process.env.TEST_COMFYUI_DIR, 'user'])
@@ -1,12 +1,11 @@
-import type { FullConfig } from '@playwright/test'
-import dotenv from 'dotenv'
+import { config as dotenvConfig } from 'dotenv'

 import { writePerfReport } from './helpers/perfReporter'
 import { restorePath } from './utils/backupUtils'

-dotenv.config()
+dotenvConfig()

-export default function globalTeardown(_config: FullConfig) {
+export default function globalTeardown() {
   writePerfReport()

   if (!process.env.CI && process.env.TEST_COMFYUI_DIR) {
@@ -67,5 +67,44 @@ test.describe(
         )
       })
     })
+
+    test('Load workflow from URL dropped onto Vue node', async ({
+      comfyPage
+    }) => {
+      const fakeUrl = 'https://example.com/workflow.png'
+      await comfyPage.page.route(fakeUrl, (route) =>
+        route.fulfill({
+          path: comfyPage.assetPath('workflowInMedia/workflow_itxt.png')
+        })
+      )
+
+      await comfyPage.settings.setSetting('Comfy.VueNodes.Enabled', true)
+      await comfyPage.vueNodes.waitForNodes()
+
+      const initialNodeCount = await comfyPage.nodeOps.getGraphNodesCount()
+
+      const node = comfyPage.vueNodes.getNodeByTitle('KSampler')
+      const box = await node.boundingBox()
+      expect(box).not.toBeNull()
+
+      const dropPosition = {
+        x: box!.x + box!.width / 2,
+        y: box!.y + box!.height / 2
+      }
+
+      await comfyPage.dragDrop.dragAndDropURL(fakeUrl, {
+        dropPosition,
+        preserveNativePropagation: true
+      })
+
+      await comfyPage.page.waitForFunction(
+        (prevCount) => window.app!.graph.nodes.length !== prevCount,
+        initialNodeCount,
+        { timeout: 10000 }
+      )
+
+      const newNodeCount = await comfyPage.nodeOps.getGraphNodesCount()
+      expect(newNodeCount).not.toBe(initialNodeCount)
+    })
   }
 )
browser_tests/tests/subgraphNestedStaleProxyWidgets.spec.ts (new file, 51 lines)
@@ -0,0 +1,51 @@
+import {
+  comfyPageFixture as test,
+  comfyExpect as expect
+} from '../fixtures/ComfyPage'
+import { TestIds } from '../fixtures/selectors'
+
+const WORKFLOW = 'subgraphs/nested-subgraph-stale-proxy-widgets'
+
+/**
+ * Regression test for nested subgraph packing leaving stale proxyWidgets
+ * on the outer SubgraphNode.
+ *
+ * When two CLIPTextEncode nodes (ids 6, 7) inside the outer subgraph are
+ * packed into a nested subgraph (node 11), the outer SubgraphNode (id 10)
+ * must drop the now-stale ["7","text"] and ["6","text"] proxy entries.
+ * Only ["3","seed"] (KSampler) should remain.
+ *
+ * Stale entries render as "Disconnected" placeholder widgets (type "button").
+ *
+ * See: https://github.com/Comfy-Org/ComfyUI_frontend/pull/10390
+ */
+test.describe(
+  'Nested subgraph stale proxyWidgets',
+  { tag: ['@subgraph', '@widget'] },
+  () => {
+    test.beforeEach(async ({ comfyPage }) => {
+      await comfyPage.settings.setSetting('Comfy.VueNodes.Enabled', true)
+    })
+
+    test('Outer subgraph node has no stale proxyWidgets after nested packing', async ({
+      comfyPage
+    }) => {
+      await comfyPage.workflow.loadWorkflow(WORKFLOW)
+      await comfyPage.vueNodes.waitForNodes()
+
+      const outerNode = comfyPage.vueNodes.getNodeLocator('10')
+      await expect(outerNode).toBeVisible()
+
+      const widgets = outerNode.getByTestId(TestIds.widgets.widget)
+
+      // Only the KSampler seed widget should be present — no stale
+      // "Disconnected" placeholders from the packed CLIPTextEncode nodes.
+      await expect(widgets).toHaveCount(1)
+      await expect(widgets.first()).toBeVisible()
+
+      // Verify the seed widget is present via its label
+      const seedWidget = outerNode.getByLabel('seed', { exact: true })
+      await expect(seedWidget).toBeVisible()
+    })
+  }
+)
@@ -27,6 +27,17 @@ const config: KnipConfig = {
   },
   'packages/ingest-types': {
     project: ['src/**/*.{js,ts}']
   },
+  'apps/website': {
+    entry: [
+      'src/pages/**/*.astro',
+      'src/layouts/**/*.astro',
+      'src/components/**/*.vue',
+      'src/styles/global.css',
+      'astro.config.ts'
+    ],
+    project: ['src/**/*.{astro,vue,ts}', '*.{js,ts,mjs}'],
+    ignoreDependencies: ['@comfyorg/design-system', '@vercel/analytics']
+  }
 },
 ignoreBinaries: ['python3'],
packages/design-system/src/css/base.css (new file, 46 lines)
@@ -0,0 +1,46 @@
+/*
+ * Design System Base — Brand tokens + fonts only.
+ * For marketing sites that don't use PrimeVue or the node editor.
+ * Import the full style.css instead for the desktop app.
+ */
+
+@import './fonts.css';
+
+@theme {
+  /* Font Families */
+  --font-inter: 'Inter', sans-serif;
+
+  /* Palette Colors */
+  --color-charcoal-100: #55565e;
+  --color-charcoal-200: #494a50;
+  --color-charcoal-300: #3c3d42;
+  --color-charcoal-400: #313235;
+  --color-charcoal-500: #2d2e32;
+  --color-charcoal-600: #262729;
+  --color-charcoal-700: #202121;
+  --color-charcoal-800: #171718;
+
+  --color-neutral-550: #636363;
+
+  --color-ash-300: #bbbbbb;
+  --color-ash-500: #828282;
+  --color-ash-800: #444444;
+
+  --color-smoke-100: #f3f3f3;
+  --color-smoke-200: #e9e9e9;
+  --color-smoke-300: #e1e1e1;
+  --color-smoke-400: #d9d9d9;
+  --color-smoke-500: #c5c5c5;
+  --color-smoke-600: #b4b4b4;
+  --color-smoke-700: #a0a0a0;
+  --color-smoke-800: #8a8a8a;
+
+  --color-white: #ffffff;
+  --color-black: #000000;
+
+  /* Brand Colors */
+  --color-electric-400: #f0ff41;
+  --color-sapphire-700: #172dd7;
+  --color-brand-yellow: var(--color-electric-400);
+  --color-brand-blue: var(--color-sapphire-700);
+}
pnpm-lock.yaml (generated, 1665 lines)
File diff suppressed because it is too large
@@ -4,6 +4,7 @@ packages:

 catalog:
   '@alloc/quick-lru': ^5.2.0
+  '@astrojs/vue': ^5.0.0
   '@comfyorg/comfyui-electron-types': 0.6.2
   '@eslint/js': ^9.39.1
   '@formkit/auto-animate': ^0.9.0
@@ -50,6 +51,7 @@ catalog:
   '@types/node': ^24.1.0
   '@types/semver': ^7.7.0
   '@types/three': ^0.169.0
+  '@vercel/analytics': ^2.0.1
   '@vitejs/plugin-vue': ^6.0.0
   '@vitest/coverage-v8': ^4.0.16
   '@vitest/ui': ^4.0.16
@@ -58,6 +60,7 @@ catalog:
   '@vueuse/integrations': ^14.2.0
   '@webgpu/types': ^0.1.66
   algoliasearch: ^5.21.0
+  astro: ^5.10.0
   axios: ^1.13.5
   cross-env: ^10.1.0
   cva: 1.0.0-beta.4
@@ -22,6 +22,7 @@ interface PerfMeasurement {
   layoutDurationMs: number
   taskDurationMs: number
   heapDeltaBytes: number
+  heapUsedBytes: number
   domNodes: number
   jsHeapTotalBytes: number
   scriptDurationMs: number
@@ -43,22 +44,46 @@ const HISTORY_DIR = 'temp/perf-history'

 type MetricKey =
   | 'styleRecalcs'
   | 'styleRecalcDurationMs'
   | 'layouts'
   | 'layoutDurationMs'
   | 'taskDurationMs'
   | 'domNodes'
   | 'scriptDurationMs'
   | 'eventListeners'
   | 'totalBlockingTimeMs'
   | 'frameDurationMs'
-
-const REPORTED_METRICS: { key: MetricKey; label: string; unit: string }[] = [
-  { key: 'styleRecalcs', label: 'style recalcs', unit: '' },
-  { key: 'layouts', label: 'layouts', unit: '' },
+  | 'heapUsedBytes'
+
+interface MetricDef {
+  key: MetricKey
+  label: string
+  unit: string
+  /** Minimum absolute delta to consider meaningful (effect size gate) */
+  minAbsDelta?: number
+}
+
+const REPORTED_METRICS: MetricDef[] = [
   { key: 'layoutDurationMs', label: 'layout duration', unit: 'ms' },
   {
     key: 'styleRecalcDurationMs',
     label: 'style recalc duration',
     unit: 'ms'
   },
+  { key: 'layouts', label: 'layout count', unit: '', minAbsDelta: 5 },
+  {
+    key: 'styleRecalcs',
+    label: 'style recalc count',
+    unit: '',
+    minAbsDelta: 5
+  },
   { key: 'taskDurationMs', label: 'task duration', unit: 'ms' },
-  { key: 'domNodes', label: 'DOM nodes', unit: '' },
   { key: 'scriptDurationMs', label: 'script duration', unit: 'ms' },
-  { key: 'eventListeners', label: 'event listeners', unit: '' },
   { key: 'totalBlockingTimeMs', label: 'TBT', unit: 'ms' },
-  { key: 'frameDurationMs', label: 'frame duration', unit: 'ms' }
+  { key: 'frameDurationMs', label: 'frame duration', unit: 'ms' },
+  { key: 'heapUsedBytes', label: 'heap used', unit: 'bytes' },
+  { key: 'domNodes', label: 'DOM nodes', unit: '', minAbsDelta: 5 },
+  { key: 'eventListeners', label: 'event listeners', unit: '', minAbsDelta: 5 }
 ]

 function groupByName(
@@ -134,7 +159,9 @@ function computeCV(stats: MetricStats): number {
 }

 function formatValue(value: number, unit: string): string {
-  return unit === 'ms' ? `${value.toFixed(0)}ms` : `${value.toFixed(0)}`
+  if (unit === 'ms') return `${value.toFixed(0)}ms`
+  if (unit === 'bytes') return formatBytes(value)
+  return `${value.toFixed(0)}`
 }

 function formatDelta(pct: number | null): string {
@@ -159,6 +186,21 @@ function meanMetric(samples: PerfMeasurement[], key: MetricKey): number | null {
   return values.reduce((sum, v) => sum + v, 0) / values.length
 }

+function medianMetric(
+  samples: PerfMeasurement[],
+  key: MetricKey
+): number | null {
+  const values = samples
+    .map((s) => getMetricValue(s, key))
+    .filter((v): v is number => v !== null)
+    .sort((a, b) => a - b)
+  if (values.length === 0) return null
+  const mid = Math.floor(values.length / 2)
+  return values.length % 2 === 0
+    ? (values[mid - 1] + values[mid]) / 2
+    : values[mid]
+}
+
 function formatBytes(bytes: number): string {
   if (Math.abs(bytes) < 1024) return `${bytes} B`
   if (Math.abs(bytes) < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`
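The even/odd median logic added here can be sketched standalone, which also shows why the reporter moves from mean to median: a single outlier CI run no longer skews the PR value. `median` below is an illustrative plain-array version of `medianMetric`, not code from the PR:

```typescript
// Median of a list of samples; mirrors the even/odd handling in medianMetric.
function median(values: number[]): number | null {
  const sorted = [...values].sort((a, b) => a - b)
  if (sorted.length === 0) return null
  const mid = Math.floor(sorted.length / 2)
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid]
}

// One outlier run (420ms) drags the mean up to ~148ms, while the
// median stays at the typical 12ms.
const runs = [11, 12, 420]
const mean = runs.reduce((sum, v) => sum + v, 0) / runs.length
console.log({ mean, median: median(runs) })
```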
@@ -173,7 +215,7 @@ function renderFullReport(
   const lines: string[] = []
   const baselineGroups = groupByName(baseline.measurements)
   const tableHeader = [
-    '| Metric | Baseline | PR (n=3) | Δ | Sig |',
+    '| Metric | Baseline | PR (median) | Δ | Sig |',
     '|--------|----------|----------|---|-----|'
   ]
@@ -183,36 +225,38 @@ function renderFullReport(
   for (const [testName, prSamples] of prGroups) {
     const baseSamples = baselineGroups.get(testName)

-    for (const { key, label, unit } of REPORTED_METRICS) {
-      const prMean = meanMetric(prSamples, key)
-      if (prMean === null) continue
+    for (const { key, label, unit, minAbsDelta } of REPORTED_METRICS) {
+      // Use median for PR values — robust to outlier runs in CI
+      const prVal = medianMetric(prSamples, key)
+      if (prVal === null) continue
       const histStats = getHistoricalStats(historical, testName, key)
       const cv = computeCV(histStats)

       if (!baseSamples?.length) {
         allRows.push(
-          `| ${testName}: ${label} | — | ${formatValue(prMean, unit)} | new | — |`
+          `| ${testName}: ${label} | — | ${formatValue(prVal, unit)} | new | — |`
         )
         continue
       }

-      const baseVal = meanMetric(baseSamples, key)
+      const baseVal = medianMetric(baseSamples, key)
       if (baseVal === null) {
         allRows.push(
-          `| ${testName}: ${label} | — | ${formatValue(prMean, unit)} | new | — |`
+          `| ${testName}: ${label} | — | ${formatValue(prVal, unit)} | new | — |`
         )
         continue
       }
+      const absDelta = prVal - baseVal
       const deltaPct =
         baseVal === 0
-          ? prMean === 0
+          ? prVal === 0
             ? 0
             : null
-          : ((prMean - baseVal) / baseVal) * 100
-      const z = zScore(prMean, histStats)
-      const sig = classifyChange(z, cv)
+          : ((prVal - baseVal) / baseVal) * 100
+      const z = zScore(prVal, histStats)
+      const sig = classifyChange(z, cv, absDelta, minAbsDelta)

-      const row = `| ${testName}: ${label} | ${formatValue(baseVal, unit)} | ${formatValue(prMean, unit)} | ${formatDelta(deltaPct)} | ${formatSignificance(sig, z)} |`
+      const row = `| ${testName}: ${label} | ${formatValue(baseVal, unit)} | ${formatValue(prVal, unit)} | ${formatDelta(deltaPct)} | ${formatSignificance(sig, z)} |`
       allRows.push(row)
       if (isNoteworthy(sig)) {
         flaggedRows.push(row)
@@ -299,7 +343,7 @@ function renderColdStartReport(
   const lines: string[] = []
   const baselineGroups = groupByName(baseline.measurements)
   lines.push(
-    `> ℹ️ Collecting baseline variance data (${historicalCount}/5 runs). Significance will appear after 2 main branch runs.`,
+    `> ℹ️ Collecting baseline variance data (${historicalCount}/15 runs). Significance will appear after 2 main branch runs.`,
     '',
     '| Metric | Baseline | PR | Δ |',
     '|--------|----------|-----|---|'
@@ -309,31 +353,31 @@ function renderColdStartReport(
     const baseSamples = baselineGroups.get(testName)

     for (const { key, label, unit } of REPORTED_METRICS) {
-      const prMean = meanMetric(prSamples, key)
-      if (prMean === null) continue
+      const prVal = medianMetric(prSamples, key)
+      if (prVal === null) continue

       if (!baseSamples?.length) {
         lines.push(
-          `| ${testName}: ${label} | — | ${formatValue(prMean, unit)} | new |`
+          `| ${testName}: ${label} | — | ${formatValue(prVal, unit)} | new |`
         )
         continue
       }

-      const baseVal = meanMetric(baseSamples, key)
+      const baseVal = medianMetric(baseSamples, key)
       if (baseVal === null) {
         lines.push(
-          `| ${testName}: ${label} | — | ${formatValue(prMean, unit)} | new |`
+          `| ${testName}: ${label} | — | ${formatValue(prVal, unit)} | new |`
         )
         continue
       }
       const deltaPct =
         baseVal === 0
-          ? prMean === 0
+          ? prVal === 0
             ? 0
             : null
-          : ((prMean - baseVal) / baseVal) * 100
+          : ((prVal - baseVal) / baseVal) * 100
       lines.push(
-        `| ${testName}: ${label} | ${formatValue(baseVal, unit)} | ${formatValue(prMean, unit)} | ${formatDelta(deltaPct)} |`
+        `| ${testName}: ${label} | ${formatValue(baseVal, unit)} | ${formatValue(prVal, unit)} | ${formatDelta(deltaPct)} |`
       )
     }
   }
@@ -352,14 +396,10 @@ function renderNoBaselineReport(
   )
   for (const [testName, prSamples] of prGroups) {
     for (const { key, label, unit } of REPORTED_METRICS) {
-      const prMean = meanMetric(prSamples, key)
-      if (prMean === null) continue
-      lines.push(`| ${testName}: ${label} | ${formatValue(prMean, unit)} |`)
+      const prVal = medianMetric(prSamples, key)
+      if (prVal === null) continue
+      lines.push(`| ${testName}: ${label} | ${formatValue(prVal, unit)} |`)
     }
-    const heapMean =
-      prSamples.reduce((sum, s) => sum + (s.heapDeltaBytes ?? 0), 0) /
-      prSamples.length
-    lines.push(`| ${testName}: heap delta | ${formatBytes(heapMean)} |`)
   }
   return lines
 }
@@ -99,6 +99,21 @@ describe('classifyChange', () => {
     expect(classifyChange(2, 10)).toBe('neutral')
     expect(classifyChange(-2, 10)).toBe('neutral')
   })
+
+  it('returns neutral when absDelta below minAbsDelta despite high z', () => {
+    // z=7.2 but only 1 unit change with minAbsDelta=5
+    expect(classifyChange(7.2, 10, 1, 5)).toBe('neutral')
+    expect(classifyChange(-7.2, 10, -1, 5)).toBe('neutral')
+  })
+
+  it('returns regression when absDelta meets minAbsDelta', () => {
+    expect(classifyChange(3, 10, 10, 5)).toBe('regression')
+  })
+
+  it('ignores effect size gate when minAbsDelta not provided', () => {
+    expect(classifyChange(3, 10)).toBe('regression')
+    expect(classifyChange(3, 10, 1)).toBe('regression')
+  })
 })

 describe('formatSignificance', () => {
@@ -31,12 +31,28 @@ export function zScore(value: number, stats: MetricStats): number | null {

 export type Significance = 'regression' | 'improvement' | 'neutral' | 'noisy'

+/**
+ * Classify a metric change as regression/improvement/neutral/noisy.
+ *
+ * Uses both statistical significance (z-score) and practical significance
+ * (effect size gate via minAbsDelta) to reduce false positives from
+ * integer-quantized metrics with near-zero variance.
+ */
 export function classifyChange(
   z: number | null,
-  historicalCV: number
+  historicalCV: number,
+  absDelta?: number,
+  minAbsDelta?: number
 ): Significance {
   if (historicalCV > 50) return 'noisy'
   if (z === null) return 'neutral'
+
+  // Effect size gate: require minimum absolute change for count metrics
+  // to avoid flagging e.g. 11→12 style recalcs as z=7.2 regression.
+  if (minAbsDelta !== undefined && absDelta !== undefined) {
+    if (Math.abs(absDelta) < minAbsDelta) return 'neutral'
+  }
+
   if (z > 2) return 'regression'
   if (z < -2) return 'improvement'
   return 'neutral'
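The combined z-score plus effect-size gate can be exercised in isolation. The sketch below reproduces the function body shown in the hunk for illustration, with the same thresholds (CV > 50 is noisy, |z| > 2 is significant):

```typescript
type Significance = 'regression' | 'improvement' | 'neutral' | 'noisy'

function classifyChange(
  z: number | null,
  historicalCV: number,
  absDelta?: number,
  minAbsDelta?: number
): Significance {
  if (historicalCV > 50) return 'noisy' // baseline too unstable to judge
  if (z === null) return 'neutral'
  // Effect size gate: a statistically "significant" 11 -> 12 count change
  // is still practically meaningless, so require a minimum absolute delta.
  if (minAbsDelta !== undefined && absDelta !== undefined) {
    if (Math.abs(absDelta) < minAbsDelta) return 'neutral'
  }
  if (z > 2) return 'regression'
  if (z < -2) return 'improvement'
  return 'neutral'
}
```

A quantized metric with near-zero historical variance can produce a huge z-score from a one-unit change; the gate keeps such rows out of the flagged section.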
src/composables/useCopyToClipboard.test.ts (new file, 94 lines)
@@ -0,0 +1,94 @@
+import { computed, ref } from 'vue'
+import { beforeEach, describe, expect, it, vi } from 'vitest'
+
+const mockCopy = vi.fn()
+const mockToastAdd = vi.fn()
+
+vi.mock('@vueuse/core', () => ({
+  useClipboard: vi.fn(() => ({
+    copy: mockCopy,
+    copied: ref(false),
+    isSupported: computed(() => true)
+  }))
+}))
+
+vi.mock('primevue/usetoast', () => ({
+  useToast: vi.fn(() => ({
+    add: mockToastAdd
+  }))
+}))
+
+vi.mock('@/i18n', () => ({
+  t: (key: string) => key
+}))
+
+import { useClipboard } from '@vueuse/core'
+
+import { useCopyToClipboard } from '@/composables/useCopyToClipboard'
+
+describe('useCopyToClipboard', () => {
+  beforeEach(() => {
+    vi.resetAllMocks()
+    vi.mocked(useClipboard).mockReturnValue({
+      copy: mockCopy,
+      copied: ref(false),
+      isSupported: computed(() => true),
+      text: ref('')
+    })
+  })
+
+  it('shows success toast when modern clipboard succeeds', async () => {
+    mockCopy.mockResolvedValue(undefined)
+
+    const { copyToClipboard } = useCopyToClipboard()
+    await copyToClipboard('hello')
+
+    expect(mockCopy).toHaveBeenCalledWith('hello')
+    expect(mockToastAdd).toHaveBeenCalledWith(
+      expect.objectContaining({ severity: 'success' })
+    )
+  })
+
+  it('falls back to legacy when modern clipboard fails', async () => {
+    mockCopy.mockRejectedValue(new Error('Not allowed'))
+    document.execCommand = vi.fn(() => true)
+
+    const { copyToClipboard } = useCopyToClipboard()
+    await copyToClipboard('hello')
+
+    expect(document.execCommand).toHaveBeenCalledWith('copy')
+    expect(mockToastAdd).toHaveBeenCalledWith(
+      expect.objectContaining({ severity: 'success' })
+    )
+  })
+
+  it('shows error toast when both modern and legacy fail', async () => {
+    mockCopy.mockRejectedValue(new Error('Not allowed'))
+    document.execCommand = vi.fn(() => false)
+
+    const { copyToClipboard } = useCopyToClipboard()
+    await copyToClipboard('hello')
+
+    expect(mockToastAdd).toHaveBeenCalledWith(
+      expect.objectContaining({ severity: 'error' })
+    )
+  })
+
+  it('falls through to legacy when isSupported is false', async () => {
+    vi.mocked(useClipboard).mockReturnValue({
+      copy: mockCopy,
+      copied: ref(false),
+      isSupported: computed(() => false),
+      text: ref('')
+    })
+    document.execCommand = vi.fn(() => true)
+
+    const { copyToClipboard } = useCopyToClipboard()
+    await copyToClipboard('hello')
+
+    expect(mockCopy).not.toHaveBeenCalled()
+    expect(document.execCommand).toHaveBeenCalledWith('copy')
+    expect(mockToastAdd).toHaveBeenCalledWith(
+      expect.objectContaining({ severity: 'success' })
+    )
+  })
+})
@@ -3,34 +3,60 @@ import { useToast } from 'primevue/usetoast'

 import { t } from '@/i18n'

+function legacyCopy(text: string): boolean {
+  const textarea = document.createElement('textarea')
+  textarea.setAttribute('readonly', '')
+  textarea.value = text
+  textarea.style.position = 'fixed'
+  textarea.style.left = '-9999px'
+  textarea.style.top = '-9999px'
+  document.body.appendChild(textarea)
+  textarea.select()
+  try {
+    return document.execCommand('copy')
+  } finally {
+    textarea.remove()
+  }
+}
+
 export function useCopyToClipboard() {
-  const { copy, copied } = useClipboard({ legacy: true })
+  const { copy, isSupported } = useClipboard()
   const toast = useToast()

   async function copyToClipboard(text: string) {
+    let success = false
+
     try {
-      await copy(text)
-      if (copied.value) {
-        toast.add({
-          severity: 'success',
-          summary: t('g.success'),
-          detail: t('clipboard.successMessage'),
-          life: 3000
-        })
-      } else {
-        toast.add({
-          severity: 'error',
-          summary: t('g.error'),
-          detail: t('clipboard.errorMessage')
-        })
+      if (isSupported.value) {
+        await copy(text)
+        success = true
       }
     } catch {
-      toast.add({
-        severity: 'error',
-        summary: t('g.error'),
-        detail: t('clipboard.errorMessage')
-      })
+      // Modern clipboard API failed, fall through to legacy
     }
+
+    if (!success) {
+      try {
+        success = legacyCopy(text)
+      } catch {
+        // Legacy also failed
+      }
+    }
+
+    toast.add(
+      success
+        ? {
+            severity: 'success',
+            summary: t('g.success'),
+            detail: t('clipboard.successMessage'),
+            life: 3000
+          }
+        : {
+            severity: 'error',
+            summary: t('g.error'),
+            detail: t('clipboard.errorMessage')
+          }
+    )
   }

   return {
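The rewrite above reduces to a two-tier attempt: try the modern async Clipboard API when supported, fall back to the legacy `execCommand` path, and report one outcome at the end. A framework-free sketch of that control flow; `copyWithFallback` and its parameters are illustrative names, not the composable's API:

```typescript
// Try the modern async clipboard first, then a legacy mechanism;
// return a single success/failure outcome for the caller to report.
async function copyWithFallback(
  modern: (text: string) => Promise<void>,
  legacy: (text: string) => boolean,
  isSupported: boolean,
  text: string
): Promise<boolean> {
  let success = false
  try {
    if (isSupported) {
      await modern(text)
      success = true
    }
  } catch {
    // modern API failed; fall through to legacy
  }
  if (!success) {
    try {
      success = legacy(text)
    } catch {
      // legacy also failed; success stays false
    }
  }
  return success
}
```

Keeping the toast out of the try/catch, as the PR does, means exactly one notification fires per call regardless of which tier succeeded.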
@@ -751,7 +751,7 @@ describe('SubgraphNode.widgets getter', () => {
   ])
 })

-test('full linked coverage does not prune unresolved independent fallback promotions', () => {
+test('full linked coverage prunes promotions referencing non-existent nodes', () => {
   const subgraph = createTestSubgraph({
     inputs: [{ name: 'widgetA', type: '*' }]
   })
@@ -776,9 +776,9 @@ describe('SubgraphNode.widgets getter', () => {
     subgraphNode.rootGraph.id,
     subgraphNode.id
   )
+  // Node 9999 does not exist in the subgraph, so its entry is pruned
   expect(promotions).toStrictEqual([
-    { sourceNodeId: String(liveNode.id), sourceWidgetName: 'widgetA' },
-    { sourceNodeId: '9999', sourceWidgetName: 'widgetA' }
+    { sourceNodeId: String(liveNode.id), sourceWidgetName: 'widgetA' }
   ])
 })
@@ -958,3 +958,69 @@ describe('SubgraphNode promotion view keys', () => {
     expect(firstKey).not.toBe(secondKey)
   })
 })
+
+describe('SubgraphNode label propagation', () => {
+  it('should preserve input labels from configure path', () => {
+    const subgraph = createTestSubgraph({
+      inputs: [{ name: 'steps', type: 'number' }]
+    })
+    subgraph.inputs[0].label = 'Steps Count'
+
+    const subgraphNode = createTestSubgraphNode(subgraph)
+
+    expect(subgraphNode.inputs[0].label).toBe('Steps Count')
+    expect(subgraphNode.inputs[0].name).toBe('steps')
+  })
+
+  it('should preserve output labels from configure path', () => {
+    const subgraph = createTestSubgraph({
+      outputs: [{ name: 'result', type: 'number' }]
+    })
+    subgraph.outputs[0].label = 'Final Result'
+
+    const subgraphNode = createTestSubgraphNode(subgraph)
+
+    expect(subgraphNode.outputs[0].label).toBe('Final Result')
+    expect(subgraphNode.outputs[0].name).toBe('result')
+  })
+
+  it('should propagate label via renaming-input event', () => {
+    const subgraph = createTestSubgraph()
+    const subgraphNode = createTestSubgraphNode(subgraph)
+
+    subgraph.addInput('steps', 'number')
+    expect(subgraphNode.inputs[0].label).toBeUndefined()
+
+    subgraph.renameInput(subgraph.inputs[0], 'Steps Count')
+
+    expect(subgraphNode.inputs[0].label).toBe('Steps Count')
+    expect(subgraphNode.inputs[0].name).toBe('steps')
+  })
+
+  it('should propagate label via renaming-output event', () => {
+    const subgraph = createTestSubgraph()
+    const subgraphNode = createTestSubgraphNode(subgraph)
+
+    subgraph.addOutput('result', 'number')
+    expect(subgraphNode.outputs[0].label).toBeUndefined()
+
+    subgraph.renameOutput(subgraph.outputs[0], 'Final Result')
+
+    expect(subgraphNode.outputs[0].label).toBe('Final Result')
+    expect(subgraphNode.outputs[0].name).toBe('result')
+  })
+
+  it('should preserve localized_name from configure path', () => {
+    const subgraph = createTestSubgraph({
+      inputs: [{ name: 'steps', type: 'number' }],
+      outputs: [{ name: 'result', type: 'number' }]
+    })
+    subgraph.inputs[0].localized_name = 'ステップ'
+    subgraph.outputs[0].localized_name = '結果'
+
+    const subgraphNode = createTestSubgraphNode(subgraph)
+
+    expect(subgraphNode.inputs[0].localized_name).toBe('ステップ')
+    expect(subgraphNode.outputs[0].localized_name).toBe('結果')
+  })
+})
@@ -515,6 +515,8 @@ export class SubgraphNode extends LGraphNode implements BaseLGraph {
   const prunedEntries: PromotedWidgetSource[] = []

   for (const entry of fallbackStoredEntries) {
+    if (!this.subgraph.getNodeById(entry.sourceNodeId)) continue
+
     const concreteKey = this._resolveConcretePromotionEntryKey(entry)
     if (concreteKey && linkedConcreteKeys.has(concreteKey)) continue

@@ -1063,6 +1065,7 @@ export class SubgraphNode extends LGraphNode implements BaseLGraph {
     }
     return null
   }
+  if (!this.subgraph.getNodeById(nodeId)) return null
   const entry: PromotedWidgetSource = {
     sourceNodeId: nodeId,
     sourceWidgetName: widgetName,
@@ -1074,8 +1077,8 @@ export class SubgraphNode extends LGraphNode implements BaseLGraph {

   store.setPromotions(this.rootGraph.id, this.id, entries)

-  // Write back resolved entries so legacy -1 format doesn't persist
-  if (raw.some(([id]) => id === '-1')) {
+  // Write back resolved entries so legacy or stale entries don't persist
+  if (entries.length !== raw.length) {
     this.properties.proxyWidgets = this._serializeEntries(entries)
   }
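Both hunks enforce the same invariant: a promotion entry is only valid while its source node still exists in the subgraph. A minimal sketch of that filter, with a hypothetical `getNodeById` lookup standing in for `this.subgraph.getNodeById`:

```typescript
interface PromotedWidgetSource {
  sourceNodeId: string
  sourceWidgetName: string
}

// Keep only promotion entries whose source node still exists in the
// subgraph; entries left behind when nodes are packed into a nested
// subgraph resolve to nothing and are dropped.
function pruneStaleEntries(
  entries: PromotedWidgetSource[],
  getNodeById: (id: string) => object | null
): PromotedWidgetSource[] {
  return entries.filter((entry) => Boolean(getNodeById(entry.sourceNodeId)))
}
```

The write-back condition changing from "any legacy `-1` id" to `entries.length !== raw.length` is what lets this pruning persist: whenever the filter removed something, the serialized `proxyWidgets` property is refreshed.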
@@ -8,6 +8,7 @@ import type {
   TWidgetType
 } from '@/lib/litegraph/src/litegraph'
 import { BaseWidget, LGraphNode } from '@/lib/litegraph/src/litegraph'
+import { isPromotedWidgetView } from '@/core/graph/subgraph/promotedWidgetTypes'

 import {
   createEventCapture,
@@ -263,6 +264,62 @@ describe('SubgraphWidgetPromotion', () => {
   })
 })

+describe('Nested Subgraph Widget Promotion', () => {
+  it('should prune proxyWidgets referencing nodes not in subgraph on configure', () => {
+    // Reproduces the bug where packing nodes into a nested subgraph leaves
+    // stale proxyWidgets on the outer subgraph node referencing grandchild
+    // node IDs that no longer exist directly in the outer subgraph.
+    // Uses 3 inputs with only 1 having a linked widget entry, matching the
+    // real workflow structure where model/vae inputs don't resolve widgets.
+    const subgraph = createTestSubgraph({
+      inputs: [
+        { name: 'clip', type: 'CLIP' },
+        { name: 'model', type: 'MODEL' },
+        { name: 'vae', type: 'VAE' }
+      ]
+    })
+
+    const { node: samplerNode } = createNodeWithWidget(
+      'Sampler',
+      'number',
+      42,
+      'number'
+    )
+    subgraph.add(samplerNode)
+    subgraph.inputNode.slots[1].connect(samplerNode.inputs[0], samplerNode)
+
+    // Add nodes without widget-connected inputs for the other slots
+    const modelNode = new LGraphNode('ModelNode')
+    modelNode.addInput('model', 'MODEL')
+    subgraph.add(modelNode)
+
+    const vaeNode = new LGraphNode('VAENode')
+    vaeNode.addInput('vae', 'VAE')
+    subgraph.add(vaeNode)
+
+    const outerNode = createTestSubgraphNode(subgraph)
+
+    // Inject stale proxyWidgets referencing nodes that don't exist in
+    // this subgraph (they were packed into a nested subgraph)
+    outerNode.properties.proxyWidgets = [
+      ['999', 'text'],
+      ['998', 'text'],
+      [String(samplerNode.id), 'widget']
+    ]
+
+    outerNode.configure(outerNode.serialize())
+
+    // Check widgets getter — stale entries should not produce views
+    const widgetSourceIds = outerNode.widgets
+      .filter(isPromotedWidgetView)
+      .filter((w) => !w.name.startsWith('$$'))
+      .map((w) => w.sourceNodeId)
+
+    expect(widgetSourceIds).not.toContain('999')
+    expect(widgetSourceIds).not.toContain('998')
+  })
+})
+
 describe('Tooltip Promotion', () => {
   it('should preserve widget tooltip when promoting', () => {
     const subgraph = createTestSubgraph({
@@ -24,7 +24,13 @@ vi.mock('@/platform/telemetry/topupTracker', () => ({

const hoisted = vi.hoisted(() => ({
  mockNodeDefsByName: {} as Record<string, unknown>,
-  mockNodes: [] as Pick<LGraphNode, 'type' | 'isSubgraphNode'>[]
+  mockNodes: [] as Pick<LGraphNode, 'type' | 'isSubgraphNode'>[],
+  mockActiveWorkflow: null as null | {
+    filename: string
+    fullFilename: string
+  },
+  mockKnownTemplateNames: new Set<string>(),
+  mockTemplateByName: null as null | { sourceModule?: string }
}))

vi.mock('@/stores/nodeDefStore', () => ({
@@ -35,7 +41,9 @@ vi.mock('@/stores/nodeDefStore', () => ({

vi.mock('@/platform/workflow/management/stores/workflowStore', () => ({
  useWorkflowStore: () => ({
-    activeWorkflow: null
+    get activeWorkflow() {
+      return hoisted.mockActiveWorkflow
+    }
  })
}))

@@ -43,7 +51,11 @@ vi.mock(
  '@/platform/workflow/templates/repositories/workflowTemplatesStore',
  () => ({
    useWorkflowTemplatesStore: () => ({
-      knownTemplateNames: new Set()
+      get knownTemplateNames() {
+        return hoisted.mockKnownTemplateNames
+      },
+      getTemplateByName: (_name: string) => hoisted.mockTemplateByName,
+      getEnglishMetadata: () => null
    })
  })
)
@@ -85,6 +97,9 @@ describe('getExecutionContext', () => {
    for (const key of Object.keys(hoisted.mockNodeDefsByName)) {
      delete hoisted.mockNodeDefsByName[key]
    }
+    hoisted.mockActiveWorkflow = null
+    hoisted.mockKnownTemplateNames = new Set()
+    hoisted.mockTemplateByName = null
  })

  it('returns has_toolkit_nodes false when no toolkit nodes are present', () => {
@@ -175,4 +190,50 @@ describe('getExecutionContext', () => {
    expect(context.has_toolkit_nodes).toBe(true)
    expect(context.toolkit_node_names).toEqual(['ImageCrop'])
  })

  describe('template detection', () => {
    it('detects a regular template by name', () => {
      hoisted.mockKnownTemplateNames = new Set(['flux-dev'])
      hoisted.mockTemplateByName = { sourceModule: 'default' }
      hoisted.mockActiveWorkflow = {
        filename: 'flux-dev',
        fullFilename: 'flux-dev.json'
      }

      const context = getExecutionContext()

      expect(context.is_template).toBe(true)
      expect(context.workflow_name).toBe('flux-dev')
    })

    it('detects an app mode template whose name ends with .app', () => {
      hoisted.mockKnownTemplateNames = new Set([
        'templates-qwen_multiangle.app'
      ])
      hoisted.mockTemplateByName = { sourceModule: 'default' }
      // getFilenameDetails strips ".app.json" as a compound extension, yielding
      // filename = "templates-qwen_multiangle" — the previous code would fail here.
      hoisted.mockActiveWorkflow = {
        filename: 'templates-qwen_multiangle',
        fullFilename: 'templates-qwen_multiangle.app.json'
      }

      const context = getExecutionContext()

      expect(context.is_template).toBe(true)
      expect(context.workflow_name).toBe('templates-qwen_multiangle.app')
    })

    it('does not flag a non-template workflow as a template', () => {
      hoisted.mockKnownTemplateNames = new Set(['flux-dev'])
      hoisted.mockActiveWorkflow = {
        filename: 'my-custom-workflow',
        fullFilename: 'my-custom-workflow.json'
      }

      const context = getExecutionContext()

      expect(context.is_template).toBe(false)
    })
  })
})

@@ -80,20 +80,21 @@ export function getExecutionContext(): ExecutionContext {
  )

  if (activeWorkflow?.filename) {
-    const isTemplate = templatesStore.knownTemplateNames.has(
-      activeWorkflow.filename
-    )
+    // Use fullFilename minus .json to reconstruct the template name, which
+    // preserves compound suffixes like ".app" (e.g. "foo.app.json" → "foo.app").
+    // Using just `filename` strips ".app.json" entirely (e.g. "foo"), which
+    // won't match knownTemplateNames entries like "foo.app".
+    const templateName = activeWorkflow.fullFilename.replace(/\.json$/i, '')
+    const isTemplate = templatesStore.knownTemplateNames.has(templateName)

    if (isTemplate) {
-      const template = templatesStore.getTemplateByName(activeWorkflow.filename)
+      const template = templatesStore.getTemplateByName(templateName)

-      const englishMetadata = templatesStore.getEnglishMetadata(
-        activeWorkflow.filename
-      )
+      const englishMetadata = templatesStore.getEnglishMetadata(templateName)

      return {
        is_template: true,
-        workflow_name: activeWorkflow.filename,
+        workflow_name: templateName,
        template_source: template?.sourceModule,
        template_category: englishMetadata?.category ?? template?.category,
        template_tags: englishMetadata?.tags ?? template?.tags,

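The fix above hinges on how compound extensions are stripped. A standalone sketch of why matching on `fullFilename` minus `.json` works where the bare `filename` does not (names here are illustrative, not the repo's actual helpers):

```typescript
// Hypothetical, self-contained illustration of the ".app.json" template-name
// lookup fixed above. `templateNameFrom` mirrors the new replace(/\.json$/i, '')
// logic; the Set stands in for knownTemplateNames.
const knownTemplateNames = new Set([
  'flux-dev',
  'templates-qwen_multiangle.app'
])

function templateNameFrom(fullFilename: string): string {
  // Strip only the trailing ".json", preserving compound suffixes like ".app"
  return fullFilename.replace(/\.json$/i, '')
}

// "foo.app.json" keeps its ".app" suffix and therefore matches the Set entry;
// comparing the fully stripped filename ("foo") would miss it.
console.log(templateNameFrom('flux-dev.json')) // "flux-dev"
console.log(templateNameFrom('templates-qwen_multiangle.app.json')) // "templates-qwen_multiangle.app"
```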
@@ -13,9 +13,21 @@ import { useCanvasStore } from '@/renderer/core/canvas/canvasStore'
import { setActivePinia } from 'pinia'

const mockData = vi.hoisted(() => ({
-  mockExecuting: false
+  mockExecuting: false,
+  mockLgraphNode: null as Record<string, unknown> | null
}))

+vi.mock('@/utils/graphTraversalUtil', async (importOriginal) => {
+  const actual = (await importOriginal()) as Record<string, unknown>
+  return {
+    ...actual,
+    getLocatorIdFromNodeData: vi.fn(() => 'test-node-123'),
+    getNodeByLocatorId: vi.fn(
+      () => mockData.mockLgraphNode ?? { isSubgraphNode: () => false }
+    )
+  }
+})

vi.mock('@/renderer/core/layout/transform/useTransformState', () => {
  return {
    useTransformState: () => ({
@@ -49,16 +61,6 @@ vi.mock('@/scripts/app', () => ({
  }
}))

-vi.mock('@/utils/graphTraversalUtil', async (importOriginal) => {
-  const actual = (await importOriginal()) as Record<string, unknown>
-  return {
-    ...actual,
-    getNodeByLocatorId: vi.fn(() => ({
-      isSubgraphNode: () => false
-    }))
-  }
-})

vi.mock('@/composables/useErrorHandling', () => ({
  useErrorHandling: () => ({
    toastErrorHandler: vi.fn()
@@ -293,4 +295,52 @@ describe('LGraphNode', () => {
      expect(wrapper.find('[role="button"][aria-label]').exists()).toBe(true)
    })
  })

  describe('handleDrop', () => {
    it('should stop propagation when onDragDrop returns true', async () => {
      const onDragDrop = vi.fn().mockReturnValue(true)
      mockData.mockLgraphNode = {
        onDragDrop,
        onDragOver: vi.fn(),
        isSubgraphNode: () => false
      }

      const wrapper = mountLGraphNode({ nodeData: mockNodeData })

      const parentListener = vi.fn()
      const parent = wrapper.element.parentElement
      expect(parent).not.toBeNull()
      parent!.addEventListener('drop', parentListener)

      wrapper.element.dispatchEvent(
        new Event('drop', { bubbles: true, cancelable: true })
      )

      expect(onDragDrop).toHaveBeenCalled()
      expect(parentListener).not.toHaveBeenCalled()
    })

    it('should not stop propagation when onDragDrop returns false', async () => {
      const onDragDrop = vi.fn().mockReturnValue(false)
      mockData.mockLgraphNode = {
        onDragDrop,
        onDragOver: vi.fn(),
        isSubgraphNode: () => false
      }

      const wrapper = mountLGraphNode({ nodeData: mockNodeData })

      const parentListener = vi.fn()
      const parent = wrapper.element.parentElement
      expect(parent).not.toBeNull()
      parent!.addEventListener('drop', parentListener)

      wrapper.element.dispatchEvent(
        new Event('drop', { bubbles: true, cancelable: true })
      )

      expect(onDragDrop).toHaveBeenCalled()
      expect(parentListener).toHaveBeenCalled()
    })
  })
})

@@ -33,7 +33,7 @@
    @contextmenu="handleContextMenu"
    @dragover.prevent="handleDragOver"
    @dragleave="handleDragLeave"
-    @drop.stop.prevent="handleDrop"
+    @drop.prevent="handleDrop"
  >
    <!-- Selection/Execution Outline Overlay -->
    <AppOutput
@@ -827,15 +827,13 @@ function handleDragLeave() {
  isDraggingOver.value = false
}

-async function handleDrop(event: DragEvent) {
+function handleDrop(event: DragEvent) {
  isDraggingOver.value = false

  const node = lgraphNode.value
-  if (!node || !node.onDragDrop) {
-    return
-  }
+  if (!node?.onDragDrop) return

-  // Forward the drop event to the litegraph node's onDragDrop callback
-  await node.onDragDrop(event)
+  const handled = node.onDragDrop(event)
+  if (handled === true) event.stopPropagation()
}
</script>

@@ -4,6 +4,7 @@
  </div>
  <div
    v-else
+    data-testid="node-widgets"
    :class="
      cn(
        'lg-node-widgets grid grid-cols-[min-content_minmax(80px,min-content)_minmax(125px,1fr)] gap-y-1 pr-3',
@@ -24,6 +25,7 @@
  <template v-for="widget in processedWidgets" :key="widget.renderKey">
    <div
      v-if="!widget.hidden && (!widget.advanced || showAdvanced)"
+      data-testid="node-widget"
      class="lg-node-widget group col-span-full grid grid-cols-subgrid items-stretch"
    >
      <!-- Widget Input Slot Dot -->

104  src/renderer/extensions/vueNodes/components/OutputSlot.test.ts  Normal file
@@ -0,0 +1,104 @@
import { mount } from '@vue/test-utils'
import { describe, expect, it, vi } from 'vitest'
import { defineComponent } from 'vue'
import { createI18n } from 'vue-i18n'

import type { INodeSlot } from '@/lib/litegraph/src/litegraph'
import enMessages from '@/locales/en/main.json' with { type: 'json' }

import OutputSlot from './OutputSlot.vue'

vi.mock('@/composables/useErrorHandling', () => ({
  useErrorHandling: () => ({ toastErrorHandler: vi.fn() })
}))

vi.mock('@/renderer/core/canvas/links/slotLinkDragUIState', () => ({
  useSlotLinkDragUIState: () => ({
    state: { active: false, compatible: new Map() }
  })
}))

vi.mock('@/renderer/extensions/vueNodes/composables/useNodeTooltips', () => ({
  useNodeTooltips: () => ({
    getOutputSlotTooltip: () => '',
    createTooltipConfig: (text: string) => ({ value: text })
  })
}))

vi.mock(
  '@/renderer/extensions/vueNodes/composables/useSlotElementTracking',
  () => ({ useSlotElementTracking: vi.fn() })
)

vi.mock(
  '@/renderer/extensions/vueNodes/composables/useSlotLinkInteraction',
  () => ({
    useSlotLinkInteraction: () => ({ onPointerDown: vi.fn() })
  })
)

vi.mock('@/renderer/core/layout/slots/slotIdentifier', () => ({
  getSlotKey: () => 'mock-key'
}))

const SlotConnectionDotStub = defineComponent({
  name: 'SlotConnectionDot',
  template: '<div class="stub-dot" />'
})

const i18n = createI18n({
  legacy: false,
  locale: 'en',
  messages: { en: enMessages }
})

function mountOutputSlot(slotData: Partial<INodeSlot>, index = 0) {
  return mount(OutputSlot, {
    props: {
      slotData: { type: '*', ...slotData } as INodeSlot,
      index,
      nodeId: 'test-node'
    },
    global: {
      plugins: [i18n],
      directives: { tooltip: {} },
      stubs: { SlotConnectionDot: SlotConnectionDotStub }
    }
  })
}

describe('OutputSlot', () => {
  it('renders label when present on slotData', () => {
    const wrapper = mountOutputSlot({
      name: 'internal_name',
      localized_name: 'Localized Name',
      label: 'My Custom Label'
    })

    expect(wrapper.text()).toContain('My Custom Label')
    expect(wrapper.text()).not.toContain('internal_name')
    expect(wrapper.text()).not.toContain('Localized Name')
  })

  it('falls back to localized_name when label is absent', () => {
    const wrapper = mountOutputSlot({
      name: 'internal_name',
      localized_name: 'Localized Name'
    })

    expect(wrapper.text()).toContain('Localized Name')
    expect(wrapper.text()).not.toContain('internal_name')
  })

  it('falls back to name when label and localized_name are absent', () => {
    const wrapper = mountOutputSlot({ name: 'internal_name' })

    expect(wrapper.text()).toContain('internal_name')
  })

  it('falls back to "Output N" when all names are absent', () => {
    const wrapper = mountOutputSlot({ name: undefined }, 2)

    expect(wrapper.text()).toContain('Output 2')
  })
})

@@ -65,7 +65,6 @@ describe('ComfyApp.getNodeDefs', () => {

    const result = await comfyApp.getNodeDefs()

-    // When display_name is empty, should fall back to name
    expect(result.TestNode.display_name).toBe('TestNode')
  })

@@ -84,7 +83,6 @@ describe('ComfyApp.getNodeDefs', () => {
    }

    vi.mocked(api.getNodeDefs).mockResolvedValue(mockNodeDefs)
-    // Mock st to return a translation instead of fallback
    vi.mocked(st).mockReturnValue('Translated Display Name')

    const result = await comfyApp.getNodeDefs()
@@ -123,12 +123,13 @@ export const useCustomerEventsService = () => {

  function formatJsonValue(value: unknown) {
    if (typeof value === 'number') {
      // Format numbers with commas and decimals if needed
      return value.toLocaleString()
    }
-    if (typeof value === 'string' && value.match(/^\d{4}-\d{2}-\d{2}/)) {
-      // Format dates nicely
-      return new Date(value).toLocaleString()
+    if (typeof value === 'string') {
+      const date = new Date(value)
+      if (!Number.isNaN(date.getTime()) && /^\d{4}-\d{2}-\d{2}T/.test(value)) {
+        return d(date, { dateStyle: 'medium', timeStyle: 'short' })
+      }
    }
    return value
  }

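The new guard in `formatJsonValue` validates both the string's shape and the parse result before formatting. The same check, sketched standalone with a hypothetical helper name:

```typescript
// Standalone sketch of the date guard above: a string is treated as a date
// only if it both parses to a valid Date and looks like an ISO timestamp
// with a time part ("YYYY-MM-DDT..."), avoiding false positives such as
// plain "2024-05-01" or arbitrary text.
function isIsoDateTimeString(value: string): boolean {
  const date = new Date(value)
  return !Number.isNaN(date.getTime()) && /^\d{4}-\d{2}-\d{2}T/.test(value)
}

console.log(isIsoDateTimeString('2024-05-01T12:00:00Z')) // true
console.log(isIsoDateTimeString('2024-05-01')) // false: no time part
console.log(isIsoDateTimeString('not a date')) // false: invalid Date
```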
@@ -75,6 +75,49 @@ import { getOrderedInputSpecs } from '@/workbench/utils/nodeDefOrderingUtil'
import { useExtensionService } from './extensionService'
import { useMaskEditor } from '@/composables/maskeditor/useMaskEditor'

async function reencodeAsPngBlob(
  blob: Blob,
  width: number,
  height: number
): Promise<Blob> {
  const canvas = $el('canvas', { width, height }) as HTMLCanvasElement
  const ctx = canvas.getContext('2d')
  if (!ctx) throw new Error('Could not get canvas context')

  let image: ImageBitmap | HTMLImageElement
  if (typeof window.createImageBitmap === 'undefined') {
    const img = new Image()
    const loaded = new Promise<void>((resolve, reject) => {
      img.onload = () => resolve()
      img.onerror = () => reject(new Error('Image load failed'))
    })
    img.src = URL.createObjectURL(blob)
    try {
      await loaded
    } finally {
      URL.revokeObjectURL(img.src)
    }
    image = img
  } else {
    image = await createImageBitmap(blob)
  }

  try {
    ctx.drawImage(image, 0, 0)
  } finally {
    if ('close' in image && typeof image.close === 'function') {
      image.close()
    }
  }

  return new Promise<Blob>((resolve, reject) => {
    canvas.toBlob((result) => {
      if (result) resolve(result)
      else reject(new Error('PNG conversion failed'))
    }, 'image/png')
  })
}

export interface HasInitialMinSize {
  _initialMinSize: { width: number; height: number }
}
@@ -605,58 +648,24 @@ export const useLitegraphService = () => {
      const url = new URL(img.src)
      url.searchParams.delete('preview')

-      // @ts-expect-error fixme ts strict error
-      const writeImage = async (blob) => {
-        await navigator.clipboard.write([
-          new ClipboardItem({
-            [blob.type]: blob
-          })
-        ])
-      }

      try {
        const data = await fetch(url)
        const blob = await data.blob()
        try {
-          await writeImage(blob)
+          await navigator.clipboard.write([
+            new ClipboardItem({ [blob.type]: blob })
+          ])
        } catch (error) {
          // Chrome seems to only support PNG on write, convert and try again
          if (blob.type !== 'image/png') {
-            const canvas = $el('canvas', {
-              width: img.naturalWidth,
-              height: img.naturalHeight
-            }) as HTMLCanvasElement
-            const ctx = canvas.getContext('2d')
-            // @ts-expect-error fixme ts strict error
-            let image
-            if (typeof window.createImageBitmap === 'undefined') {
-              image = new Image()
-              const p = new Promise((resolve, reject) => {
-                // @ts-expect-error fixme ts strict error
-                image.onload = resolve
-                // @ts-expect-error fixme ts strict error
-                image.onerror = reject
-              }).finally(() => {
-                // @ts-expect-error fixme ts strict error
-                URL.revokeObjectURL(image.src)
-              })
-              image.src = URL.createObjectURL(blob)
-              await p
-            } else {
-              image = await createImageBitmap(blob)
-            }
-            try {
-              // @ts-expect-error fixme ts strict error
-              ctx.drawImage(image, 0, 0)
-              canvas.toBlob(writeImage, 'image/png')
-            } finally {
-              // @ts-expect-error fixme ts strict error
-              if (typeof image.close === 'function') {
-                // @ts-expect-error fixme ts strict error
-                image.close()
-              }
-            }

+            const pngBlob = await reencodeAsPngBlob(
+              blob,
+              img.naturalWidth,
+              img.naturalHeight
+            )
+            await navigator.clipboard.write([
+              new ClipboardItem({ 'image/png': pngBlob })
+            ])
+            return
          }
          throw error

@@ -1,8 +1,18 @@
import { extractFilesFromDragEvent } from '@/utils/eventUtils'
-import { describe, expect, it } from 'vitest'
+import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'

describe('eventUtils', () => {
  describe('extractFilesFromDragEvent', () => {
+    let fetchSpy: ReturnType<typeof vi.fn>
+
+    beforeEach(() => {
+      fetchSpy = vi.fn()
+      vi.stubGlobal('fetch', fetchSpy)
+    })
+
+    afterEach(() => {
+      vi.unstubAllGlobals()
+    })
    it('should return empty array when no dataTransfer', async () => {
      const actual = await extractFilesFromDragEvent(new FakeDragEvent('drop'))
      expect(actual).toEqual([])
@@ -98,19 +108,56 @@ describe('eventUtils', () => {
      expect(actual).toEqual([file1, file2])
    })

-    // Skip until we can setup MSW
-    it.skip('should handle drops with URLs', async () => {
-      const urlWithWorkflow = 'https://fakewebsite.notreal/fake_workflow.json'
+    it('should fetch URI and return as File when text/uri-list is present', async () => {
+      const uri = 'https://example.com/api/view?filename=test.png&type=input'
+      const imageBlob = new Blob([new Uint8Array([0x89, 0x50])], {
+        type: 'image/png'
+      })
+      fetchSpy.mockResolvedValue(new Response(imageBlob))

      const dataTransfer = new DataTransfer()
-      dataTransfer.setData('text/uri-list', urlWithWorkflow)
-      dataTransfer.setData('text/x-moz-url', urlWithWorkflow)
+      dataTransfer.setData('text/uri-list', uri)

      const actual = await extractFilesFromDragEvent(
        new FakeDragEvent('drop', { dataTransfer })
      )
-      expect(actual.length).toBe(1)

+      expect(fetchSpy).toHaveBeenCalledOnce()
+      expect(actual).toHaveLength(1)
+      expect(actual[0]).toBeInstanceOf(File)
+      expect(actual[0].type).toBe('image/png')
    })

+    it('should handle text/x-moz-url type', async () => {
+      const uri = 'https://example.com/api/view?filename=test.png&type=input'
+      const imageBlob = new Blob([new Uint8Array([0x89, 0x50])], {
+        type: 'image/png'
+      })
+      fetchSpy.mockResolvedValue(new Response(imageBlob))
+
+      const dataTransfer = new DataTransfer()
+      dataTransfer.setData('text/x-moz-url', uri)
+
+      const actual = await extractFilesFromDragEvent(
+        new FakeDragEvent('drop', { dataTransfer })
+      )
+
+      expect(fetchSpy).toHaveBeenCalledOnce()
+      expect(actual).toHaveLength(1)
+    })
+
+    it('should return empty array when URI fetch fails', async () => {
+      const uri = 'https://example.com/api/view?filename=test.png&type=input'
+      fetchSpy.mockRejectedValue(new TypeError('Failed to fetch'))
+
+      const dataTransfer = new DataTransfer()
+      dataTransfer.setData('text/uri-list', uri)
+
+      const actual = await extractFilesFromDragEvent(
+        new FakeDragEvent('drop', { dataTransfer })
+      )
+
+      expect(actual).toEqual([])
+    })
  })
})

@@ -20,9 +20,13 @@ export async function extractFilesFromDragEvent(
  const uri = event.dataTransfer.getData(match)?.split('\n')?.[0]
  if (!uri) return []

-  const response = await fetch(uri)
-  const blob = await response.blob()
-  return [new File([blob], uri, { type: blob.type })]
+  try {
+    const response = await fetch(uri)
+    const blob = await response.blob()
+    return [new File([blob], uri, { type: blob.type })]
+  } catch {
+    return []
+  }
}

export function hasImageType({ type }: File): boolean {