Mirror of https://github.com/Comfy-Org/ComfyUI_frontend.git (synced 2026-05-05 21:54:50 +00:00)

@@ -9,13 +9,18 @@ Cherry-pick backport management for Comfy-Org/ComfyUI_frontend stable release br

## Quick Start

1. **Discover** — Collect candidates from Slack bot + git log gap, then **reconcile both lists** (`reference/discovery.md`)
2. **Pre-filter by path** — Auto-skip PRs whose changed files are entirely under `apps/website/`, `browser_tests/`, `.github/`, `packages/design-system/`, `packages/{cloud,registry}-types/`, `.claude/`, `docs/`. Don't read PR bodies for these — they don't ship to core ComfyUI users (`reference/analysis.md`)
3. **Verify target file existence** — For each surviving candidate, run `git cat-file -e origin/$TARGET:$path` for the primary changed files. If they don't exist on the target, auto-mark SKIP with reason `feature-not-on-branch`
4. **Tiered triage** — Bucket into **Tier 1 (core editor must-haves)**, **Tier 2 (cloud-distribution only)**, **Tier 3 (skip)** before reviewing individually (`reference/analysis.md`)
5. **Analyze** — Categorize remaining MUST/SHOULD, check deps (`reference/analysis.md`)
6. **Human Review** — Present candidates in batches for interactive approval, with tier context attached (see Interactive Approval Flow)
7. **Plan** — Order by dependency (leaf fixes first), group into waves per branch
8. **Test-then-resolve dry-run** — Classify clean vs conflict before committing time (`reference/execution.md`)
9. **Execute** — Label-driven automation for clean PRs → worktree fallback for conflicts (`reference/execution.md`)
10. **Public-API conflict review** — If conflict resolution touches a public LiteGraph callback, extension API, or `node.*` method, consult oracle for compat-regression review BEFORE pushing (`reference/execution.md`)
11. **Verify** — Per-PR validation (typecheck + targeted tests + lint on changed files) AND per-wave verification (full typecheck + test:unit on branch HEAD)
12. **Log & Report** — Generate session report + author accountability report + Slack status update (`reference/logging.md`)

## System Context

@@ -107,6 +112,35 @@ Husky hooks fail in worktrees (can't find lint-staged config). Always use `git p

In the 2026-04-06 session: core/1.42 got 18/26 auto-PRs, cloud/1.42 got only 1/25. The cloud branch has more divergence. **Always plan for manual fallback** — don't assume automation will handle most PRs.

### Cherry-Picked Tests Can Reference Files Added By Earlier Unbackported PRs

A common conflict: PR A on main modifies a test file that was _added_ on main by an earlier PR B (not backported to the target). The cherry-pick of A reports "modify/delete" on B's test file because the file doesn't exist on the target. Adding the new file would smuggle in B's test scaffolding without B's runtime changes.

**Detection:** Conflict says `deleted in HEAD and modified in <PR>`. Verify with:

```bash
git log --diff-filter=A --oneline origin/main -- path/to/test.ts
```

If the introducing commit is **not** on the target branch, the test file isn't a real prerequisite for the runtime fix.

**Fix:** `git rm` the test file (drop it from the backport). Document in the commit body which PR introduced it on main and why dropping it is safe. The runtime fix itself usually doesn't depend on these tests — coverage exists at the integration layer.
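The ancestor test above can be scripted; a minimal sketch (the helper name and argument layout are illustrative, not part of this skill):

```shell
# Hypothetical helper: succeeds when the commit that ADDED $path on $main_ref
# is reachable from $target_ref, i.e. the file is a genuine prerequisite there.
intro_on_target() {
  local main_ref=$1 path=$2 target_ref=$3 intro
  intro=$(git log --diff-filter=A --format=%H "$main_ref" -- "$path" | tail -1)
  [ -n "$intro" ] && git merge-base --is-ancestor "$intro" "$target_ref"
}

# usage: intro_on_target origin/main path/to/test.ts "origin/$TARGET" || git rm path/to/test.ts
```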

### Backport-Only Compatibility Shims

When a PR's _mechanism_ relies on upstream changes that aren't on the older branch, a literal cherry-pick can recreate the original bug for any consumer still using the old contract. This is most dangerous for **public LiteGraph callbacks, extension APIs, and `node.*` methods** that custom-node packages depend on.

**Real example (#11541, core/1.43 backport):** The PR removed `LGraphNode.vue`'s legacy `handled === true` sync-return check from `handleDrop`, replacing it with `await node.onDragDrop(event, true)`. Safe on `main` because all in-repo `onDragDrop` handlers had migrated to participate in the new `claimEvent` flag. On `core/1.43`, `onDragDrop` is a public callback — custom-node packages with a synchronous `onDragDrop` returning `true` would no longer have their event claimed, recreating the duplicate-node-creation bug the PR was fixing.

**Detection:** The PR's diff modifies a file that is part of a public extension API surface. Look for:

- `node.onXxx` callback assignments
- Methods on `LGraphNode`, `LGraphCanvas`, `LGraph`, `Subgraph`
- Public exports from `src/lib/litegraph/`
- Type changes affecting `litegraph-augmentation.d.ts`

**Fix:** Add a backport-only compatibility shim that preserves the old contract while keeping the new fix. Document it explicitly in the commit body under a `## Backport-only compatibility fix` heading. Consult oracle for review before pushing — a bad shim is worse than no fix.

## Conflict Triage

**Always categorize before deciding to skip. High conflict count ≠ hard conflicts.**

@@ -147,6 +181,26 @@ Skip these without discussion:

- **Features not on target branch** — e.g., Painter, GLSLShader, appModeStore on core/1.40
- **Cloud-only PRs on core/\* branches** — Team workspaces, cloud queue, cloud-only login. (Note: app mode and Firebase auth are NOT cloud-only — see Branch Scope Rules)

### Path Pre-Filter (run BEFORE reading PR bodies)

For 50+ candidate PRs, classify by changed paths first to skip the unproductive ones without spending time on triage. Run `git show --stat $SHA` (or `gh pr view --json files`) and bucket:

| Path prefix | Bucket | Reason |
| --- | --- | --- |
| `apps/website/` | SKIP | Marketing/platform site, not core ComfyUI bundle |
| `apps/desktop-ui/` | SKIP for `core/*` | Desktop app, separate release cadence |
| `browser_tests/` only (no `src/`) | SKIP | Test-only |
| `.github/workflows/` only | SKIP | CI/release infra |
| `packages/design-system/` only | SKIP | Design tokens, not core |
| `packages/{cloud,registry,ingest}-types/` only | SKIP | Generated types |
| `.claude/`, `.agents/`, `docs/` | SKIP | Agent / documentation |
| `*.stories.ts` only | SKIP | Storybook only |
| `src/` (core editor) | KEEP — analyze further | Runtime/editor code that requires full triage |

If a PR touches multiple paths, keep it when **any** changed file is under `src/` (or another core path) and run normal analysis. Auto-skip is conservative — only skip when _all_ paths match the SKIP buckets.

This filter alone removes ~30-50% of candidates in a typical session, leaving only the PRs that need real triage.
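The bucketing rule can be expressed as a small shell predicate; a sketch, where `skip_pr` is a hypothetical helper and the prefix list is abridged from the table above:

```shell
# Returns success only when EVERY changed path matches a SKIP bucket;
# any core path (e.g. src/) keeps the PR for full triage.
skip_pr() {
  local path
  for path in "$@"; do
    case "$path" in
      apps/website/*|browser_tests/*|.github/workflows/*|packages/design-system/*|.claude/*|.agents/*|docs/*|*.stories.ts) ;;
      *) return 1 ;;
    esac
  done
  return 0
}

skip_pr apps/website/index.ts docs/readme.md && echo "SKIP"   # all paths match a SKIP bucket
skip_pr apps/website/index.ts src/app.ts || echo "KEEP"       # src/ path present, keep for triage
```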

## Wave Verification

After merging each wave of PRs to a target branch, verify branch integrity before proceeding:
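The exact commands live in `reference/execution.md`; as a hedged sketch, using the script names from the per-wave verification step in the Quick Start (`verify_wave` itself is illustrative):

```shell
# Per-wave check on branch HEAD: full typecheck + unit tests.
# Set RUN=echo for a dry run that only prints the commands.
verify_wave() {
  local target=$1
  $RUN git fetch origin "$target" || return 1
  $RUN git checkout "$target" || return 1
  $RUN pnpm typecheck || return 1
  $RUN pnpm test:unit || return 1
}

RUN=echo verify_wave core/1.42   # dry run: prints the four commands
```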

@@ -39,6 +39,89 @@ Check before backporting — these don't exist on older branches:

- **App builder** — check per branch
- **appModeStore.ts** — not on core/1.40

### Verify Target File Existence (Run Before Cherry-Pick)

Before cherry-picking any PR, confirm the files it modifies actually exist on the target branch. If they don't, the PR's runtime fix is for a feature that hasn't been added yet — skip cleanly without attempting the cherry-pick:

```bash
# For each file the PR changes
for f in $(gh pr view $PR --json files --jq '.files[].path' | grep -v "^browser_tests/\|\.test\."); do
  if ! git cat-file -e origin/$TARGET:$f 2>/dev/null; then
    echo "MISSING on $TARGET: $f"
  fi
done
```

If the _primary_ changed files (the runtime ones, not tests) are missing, mark the PR `SKIP / feature-not-on-branch`. This is faster than letting the cherry-pick fail with modify/delete conflicts and gives a clean signal.

This check runs immediately after the path pre-filter and BEFORE you spend time reading PR descriptions.

## Tiered Triage (Recommended for 30+ Candidates)

Before the interactive Y/N approval flow, bucket all surviving candidates into three tiers. This surfaces release-engineering decisions that a flat MUST/SHOULD list obscures:

### Tier 1 — Core Editor Must-Haves

User-facing bugs, crashes, data corruption, or security issues in code paths that exist on the target branch. These are the strongest backport candidates.

Indicators:

- `fix:` prefix and the bug is reproducible on the target branch
- Crash guards, runtime null checks, race-condition fixes
- Data-loss bugs (state not persisted, duplicates, drops)
- Security hardening (CSRF, XSS, auth)
- Vue Nodes 2.0 regression cluster (if the target ships Vue Nodes 2.0)
- Subgraph correctness fixes
- Public-API extension callback fixes

Recommend `Y` to user.

### Tier 2 — Cloud-Distribution Only

Bugs that only manifest on cloud-hosted distributions (Secrets panel, subscription flows, cloud signup, workspace tracking, etc.). Whether to backport depends on whether cloud ships from the target `core/*` branch in your release matrix.

Indicators:

- Files under `src/platform/secrets/`, `src/platform/subscription/`, signup flows
- PR description mentions cloud staging issues
- Fix gated behind cloud feature flags

Default: ask the cloud release rotation owner. If unsure, defer.

### Tier 3 — Skip

The path pre-filter caught most of these. The rest are PRs where the diff _touches_ `src/` but the practical impact is non-user-facing or scoped to features the target doesn't ship.

Indicators:

- All changes are in test files, even when those test files live under `src/`
- Storybook stories only
- Lint config / lint rule additions
- Documentation comments
- Internal refactors with no behavior change

### Presentation Format

When showing tier results to the user, format as:

```text
Tier 1 (N PRs) — strong backport candidates
- #11541 fix: stop duplicate node creation when dropping image on Vue nodes
  Why: Vue Nodes 2.0 regression — async onDragDrop bypassed handled-check, drops bubble to document, spawns extra LoadImage nodes
- #10849 fix: store promoted widget values per SubgraphNode instance
  Why: Multiple instances overwriting each other's promoted widget values — data loss

Tier 2 (N PRs) — cloud-distribution release rotation should decide
- #11636 fix: enable Chrome password autofill on signup form
- ...

Tier 3 (N PRs) — skip recommended
- #11586 fix: website polish (apps/website/ only)
- ...
```

Then run interactive Y/N over Tier 1 and Tier 2; Tier 3 gets confirmed-skip without per-PR review.

## Dep Refresh PRs

Always SKIP on stable branches. Risk of transitive dependency regressions outweighs audit cleanup benefit. If a specific CVE fix is needed, cherry-pick that individual fix instead.
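Finding that individual fix can use the "(#NNNN)" suffix that merge/squash commits carry in their subject; a sketch (`pr_commit` is a hypothetical helper, the PR number is illustrative):

```shell
# Prints the commit on ref $1 whose subject references PR #$2.
pr_commit() {
  git log --format="%H %s" "$1" | grep -m1 -F "(#$2)" | cut -d' ' -f1
}

# usage: git cherry-pick -x "$(pr_commit origin/main 12345)"
```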

@@ -1,5 +1,11 @@

# Discovery — Candidate Collection

**Run all sources, then reconcile.** No single source is authoritative:

- Slack bot may flag PRs that have already been backported (false positive)
- Git gap may include PRs that don't need backport (test-only, design-system, website)
- Bot can also miss PRs that landed without the right labels

## Source 1: Slack Backport-Checker Bot

Use the `slackdump` skill to export the `#frontend-releases` channel (C09K9TPU2G7):

@@ -36,7 +42,43 @@ gh pr view $PR --json mergeCommit,title --jq '"Title: \(.title)\nMerge: \(.merge

gh pr view $PR --json files --jq '.files[].path'
```

## Source 4: Already-Backported PRs (cross-reference)

When the target branch already has some cherry-picks on it (e.g., partway through a release window), extract the originals to avoid re-backporting:

```bash
# Get all original PR numbers already backported to TARGET since the last release tag
git log --format="%H%n%B" $LAST_TAG..origin/$TARGET \
  | grep -oiE "(backport of|cherry.picked) #?[0-9]+" \
  | grep -oE "[0-9]+" \
  | sort -un > /tmp/already-backported.txt
```

Subtract this list from your candidates.

## Reconciliation Workflow

```bash
# 1. Slack bot list (parse from export)
# /tmp/bot-flagged.txt — one PR# per line, sorted

# 2. Git gap, fix/perf only
MB=$(git merge-base origin/main origin/$TARGET)
git log --format="%h|%s" $MB..origin/main \
  | grep -iE "^[a-f0-9]+\|(fix|perf)" \
  | grep -oE "#[0-9]+\)" | grep -oE "[0-9]+" \
  | sort -un > /tmp/gap-fixes.txt

# 3. Already backported (Source 4 above)

# 4. Candidates = (gap-fixes ∪ bot-flagged) − already-backported
# comm needs lexicographic order, so re-sort the numerically sorted list
sort -u /tmp/gap-fixes.txt /tmp/bot-flagged.txt > /tmp/union.txt
comm -23 /tmp/union.txt <(sort /tmp/already-backported.txt) > /tmp/candidates.txt
```

The result is the input to the path pre-filter (`SKILL.md` Quick Start step 2).

## Output: candidate_list.md

Table per target branch:

| PR# | Title | Source (bot/gap/both) | Path bucket | Tier | Decision |

@@ -6,6 +6,43 @@

2. Medium gap next (quick win)
3. Largest gap last (main effort)

## Step 0: Test-Then-Resolve Pre-Pass (Recommended)

Before triggering label-driven automation, run a dry-run cherry-pick loop to classify candidates. This is much faster than discovering conflicts after the fact across automation, manual cherry-picks, and CI failures.

```bash
git fetch origin TARGET_BRANCH
git worktree add /tmp/dryrun-TARGET origin/TARGET_BRANCH
cd /tmp/dryrun-TARGET

CLEAN=()
CONFLICT=()
for pr in "${CANDIDATES[@]}"; do
  SHA=$(gh pr view $pr --json mergeCommit --jq '.mergeCommit.oid')
  git checkout -b dryrun-$pr origin/TARGET_BRANCH 2>/dev/null
  # -m 1 assumes a merge commit; drop it if the repo squash-merges PRs
  if git cherry-pick -m 1 $SHA 2>/dev/null; then
    CLEAN+=($pr)
  else
    CONFLICT+=($pr)
    git cherry-pick --abort
  fi
  git checkout --detach HEAD 2>/dev/null
  git branch -D dryrun-$pr 2>/dev/null
done

echo "CLEAN (${#CLEAN[@]}): ${CLEAN[*]}"
echo "CONFLICT (${#CONFLICT[@]}): ${CONFLICT[*]}"

cd -
git worktree remove /tmp/dryrun-TARGET --force
```

Use the result to:

- Send CLEAN PRs through label-driven automation (Step 1) — they'll typically self-merge
- Reserve manual worktree time (Step 3) for CONFLICT PRs only
- Surface PRs likely to need backport-only compat shims (CONFLICT files in `src/lib/litegraph/` or `src/scripts/app.ts`)

## Step 1: Label-Driven Automation (Batch)

```bash
@@ -88,6 +125,39 @@ for PR in ${CONFLICT_PRS[@]}; do

git add .
GIT_EDITOR=true git cherry-pick --continue

# ── Public-API conflict review (REQUIRED for extension-API surfaces) ──
# If the conflict resolution touched any of these surfaces, consult oracle
# BEFORE pushing. A bad shim is worse than no fix:
# - node.onXxx callback assignments (onDragDrop, onConnectionsChange, onRemoved, onConfigure, etc.)
# - Methods on LGraphNode, LGraphCanvas, LGraph, Subgraph
# - Public exports from src/lib/litegraph/
# - Type changes in litegraph-augmentation.d.ts
# If a public callback's signature/contract changed: add a backport-only
# compatibility shim that preserves the OLD contract while keeping the
# new fix. Document it in the commit body under
# "## Backport-only compatibility fix". See SKILL.md gotcha section.
# ───────────────────────────────────────────────────────────────────────

# Per-PR validation BEFORE push (catches issues earlier than wave verification).
# Guard each targeted command against empty file lists — running `pnpm test:unit -- run`
# with no arg matchers would run the full suite, and `pnpm exec eslint` with no args errors.
pnpm typecheck

mapfile -t TEST_FILES < <(git diff --name-only HEAD~1 | grep -E '\.test\.ts$' || true)
if [ ${#TEST_FILES[@]} -gt 0 ]; then
  pnpm test:unit -- run "${TEST_FILES[@]}"
else
  echo "No changed test files — skipping targeted unit tests"
fi

mapfile -t CODE_FILES < <(git diff --name-only HEAD~1 | grep -E '\.(ts|vue)$' || true)
if [ ${#CODE_FILES[@]} -gt 0 ]; then
  pnpm exec eslint "${CODE_FILES[@]}"
  pnpm exec oxfmt --check "${CODE_FILES[@]}"
else
  echo "No changed ts/vue files — skipping targeted lint/format"
fi

git push origin backport-$PR-to-TARGET --no-verify
NEW_PR=$(gh pr create --base TARGET_BRANCH --head backport-$PR-to-TARGET \
  --title "[backport TARGET] TITLE (#$PR)" \

@@ -243,6 +313,9 @@ gh pr checks $PR --watch --fail-fast && gh pr merge $PR --squash --admin

16. **Use `--no-verify` in worktrees** — husky hooks fail in `/tmp/` worktrees. Always push/commit with `--no-verify`.
17. **Automation success varies by branch** — core/1.42 got 18/26 auto-PRs (69%), cloud/1.42 got 1/25 (4%). Cloud branches diverge more. Plan for manual fallback.
18. **Test-then-resolve pattern** — for branches with low automation success, run a dry-run loop to classify clean vs conflict PRs before processing. This is much faster than resolving conflicts serially.
19. **Public-API conflict resolutions need oracle review** — when a conflict touches `node.onXxx` callbacks, `LGraphNode`/`LGraphCanvas`/`LGraph`/`Subgraph` methods, or types in `litegraph-augmentation.d.ts`, consult oracle BEFORE pushing. Custom-node packages depend on these contracts. A literal cherry-pick of a refactor-style fix can silently break extensions still using the old contract — sometimes recreating the very bug the PR was fixing. Document any backport-only compatibility shim explicitly in the commit body.
20. **Cherry-picked tests can require unbackported test scaffolding** — when a PR modifies a test file that was _added_ on main by an earlier unbackported PR, the cherry-pick reports modify/delete on that file. Drop it from the backport (`git rm`) and document which PR introduced it. Don't smuggle in test infrastructure without its runtime prerequisites.
21. **Per-PR validation catches issues earlier than wave verification** — for high-stakes branches, run `pnpm typecheck && pnpm exec eslint <changed files> && pnpm exec oxfmt --check` per PR before pushing. Wave verification still matters (it catches cross-PR interactions), but per-PR makes attribution trivial when something fails.

## CI Failure Triage

@@ -268,3 +341,40 @@ Common failure categories:

| Type error | Interface changed on main but not branch | May need manual adaptation |

**Never assume a failure is safe to skip.** Present all failures to the user with analysis.
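Collecting the failures to present can be done mechanically; a sketch that assumes `gh pr checks --json name,state,link` output (the field names are my assumption about the `gh` JSON shape — verify against your `gh` version):

```shell
# Reads the gh JSON array on stdin, prints one "name: link" line per failure.
failing_checks() {
  jq -r '.[] | select(.state == "FAILURE") | "\(.name): \(.link)"'
}

# usage: gh pr checks "$PR" --json name,state,link | failing_checks
```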

## PR Body Template (Manual Cherry-Picks)

Manual cherry-pick PRs need detail beyond the automation's terse default. Use this template — reviewers will look here before re-deriving conflict-resolution logic from the diff.

```markdown
Manual backport of #ORIG to `TARGET` for inclusion in `vX.Y.Z`.

Cherry-picked from upstream merge commit `SHORT_SHA`.

## Why

[1-2 sentences from the original PR's "Summary" — what bug, what fix mechanism]

## Conflict resolution

- **`path/to/file`** — [what conflicted on this branch] → [resolution chosen + why]
- **`path/to/dropped-test.test.ts`** — added on main by unrelated PR #XXXX (not backported). Dropped from this backport; runtime fix intact.
- [...]

## Backport-only compatibility fix (if applicable)

[If you added a shim that wasn't in the upstream PR, document it here — what extension surface, what contract, what the shim preserves, why the upstream version would have regressed it]

## Validation

- `pnpm typecheck` ✅
- `pnpm test:unit -- run <targeted suites>` ✅ (N/N passing)
- `pnpm exec eslint <changed files>` ✅ (0 errors)
- `pnpm exec oxfmt --check` ✅ (clean)

[If manual e2e was skipped, explain why — e.g., requires live backend, headless not feasible. State that source is byte-identical to upstream + how long it's been baking on main.]

Original PR: #ORIG / Original commit: `FULL_SHA`
```

The conflict-resolution section is non-negotiable — every conflict you resolved by hand needs a one-liner. This makes archaeology trivial six months later when someone asks "why does this look slightly different from main?"

@@ -1,695 +0,0 @@

---
name: bug-dump-ingest
description: 'Syncs the #bug-dump Slack channel into Linear as the system of record AND auto-fixes verified real bugs via red-green-fix. Every Linear operation (create, search, link, label) is performed by posting an @Linear mention in the bug-dump thread — no Linear MCP, no API key. Flow: fetch → mandatory dedupe gate (@Linear search + gh PR search) → false-defect verification → post @Linear create in thread (tool call) → parse bot card for FE-NNNN + URL → post :white_check_mark: confirmation reply → if candidate is a verified real bug with no dedupe hit and no open PR, invoke red-green-fix automatically to produce failing test + fix + PR. Respects team emoji scheme (:white_check_mark: ticket created, :pr-open: PR open, :question: needs context, :repeat: duplicate). Use when asked to sync #bug-dump to Linear, triage slack bugs, run a bug-dump sweep, or ingest bug reports. Triggers on: bug-dump, sync bug-dump, ingest bugs, triage slack bugs, bug sweep.'
---

# Bug Dump Ingest

**Primary job: sync `#bug-dump` (Slack: `C0A4XMHANP3`) into Linear as the source of truth, then auto-fix the verified real bugs.** Linear is where status, labels, and follow-up triage happen — this skill gets every bug into Linear with enough context that a downstream agent or human can work from Linear alone. **Every Linear action is performed by mentioning `@Linear` in the bug-dump thread**; there is no Linear MCP and no API key path. When pre-flight verification confirms a candidate is a real bug (not a dedupe, not already in a PR, not out of scope), the skill then invokes `red-green-fix` automatically.

```text
fetch → pre-flight dedupe gate (@Linear search + gh) → verify false defects → present approvals
→ POST "@Linear create ..." thread reply via slack_send_message (mandatory tool call)
→ poll slack_read_thread → parse Linear bot card for FE-NNNN + URL
→ POST :white_check_mark: confirmation thread reply via slack_send_message
→ if verification = "real bug" AND no dedupe AND no open PR:
    invoke Skill(skill="red-green-fix") → POST :pr-open: thread reply
```
### Non-negotiable rules

1. **Linear actions are Slack tool calls.** The skill MUST drive Linear by calling `mcp__plugin_slack_slack__slack_send_message` with `thread_ts` set and text that mentions `@Linear`. There is no MCP-direct path and no API-key path. Printing `@Linear create ...` into the Claude CLI response is NOT a substitute — the Slack thread reply is what triggers the Linear bot, and its card is the canonical receipt.
2. **Dedupe is a gate, not a suggestion.** No candidate is proposed for creation until `@Linear search` AND `gh pr` search have been run and recorded. A hit short-circuits creation to `L` (link) or `pr-open`.
3. **Auto-fix real bugs.** When the dedupe gate is clean AND false-defect verification is clean AND the candidate isn't on the handoff-exclusion list (see § Handoff conditions), after Linear creation the skill invokes `red-green-fix` via the `Skill` tool — without waiting for an extra human prompt.

### What the skill cannot do

The Slack MCP exposes no `reactions.add` tool, so the skill cannot put a `:white_check_mark:` reaction on the parent message. The thread reply with the leading `:white_check_mark:` emoji is the skill's canonical marker; a human can additionally add the parent reaction for channel visibility (see § Parent reaction — optional visibility nudge). Both are respected by Processed Detection.

## Team emoji scheme

| Emoji | Meaning | Who adds it | Skill behavior |
| --- | --- | --- | --- |
| `:white_check_mark:` | Ticket created | Human on parent (after skill files); also in bot reply | Skip in future sweeps |
| `:pr-open:` | PR open | Human | Skip creation; include PR link in approval row |
| `:question:` | Needs more context | Human | Skip creation; agent may ask for clarification |
| `:repeat:` | Duplicate | Human | Skip creation; link existing Linear issue |

## Design Priority

Optimize for **coverage, label quality, and proven fixes** over fix-path cleverness. Linear is the downstream triage surface — once every bug is there with status, labels, and context, agents and humans can work from Linear alone. A Linear ticket with a wrong severity is cheap to fix; a Slack-only bug is invisible to downstream tooling; a "filed but not fixed" real regression wastes a human turn that the skill could have spent on a red-green PR.

## Quick Start

1. **Scope** — default window: messages in the last 48h. Override with `--since YYYY-MM-DD` or a Slack permalink list.
2. **Fetch** — `slack_read_channel` for `C0A4XMHANP3`; `slack_read_thread` per message with replies.
3. **Filter** — drop already-processed (see Processed Detection).
4. **Classify** — bug / discussion / meta (see Classification Rules).
5. **Pre-flight dedupe gate (MANDATORY)** — for every bug candidate, run `@Linear search` AND `gh pr` search BEFORE proposing (see § Pre-flight Dedupe Gate). A hit means the candidate goes into the batch as `L` (link) or `pr-open`, not as a new create.
6. **Verify false defects** — per candidate, run quick checks before proposing (see False-Defect Verification).
7. **Extract** — normalize to ticket schema (see Ticket Schema).
8. **Human approval** — batch table, collect Y/N/?/S/L/R per candidate (see Interactive Approval). Default recommendation for clean candidates is `Y` (file + auto-fix).
9. **Post `@Linear create` thread reply — MANDATORY TOOL CALL** — for each approved `Y`/`L` row, call `mcp__plugin_slack_slack__slack_send_message` with `channel_id=C0A4XMHANP3`, `thread_ts=<parent-ts>`, and text starting with `@Linear create` (see § Linear Slack Bot Integration). Do NOT print the command into chat as a substitute.
10. **Capture the Linear bot card** — poll `slack_read_thread` up to 3× with ~3s spacing, parse the first Linear-app reply for the `FE-NNNN` identifier and `https://linear.app/...` URL. No URL = not ingested; never fabricate one.
11. **Post `:white_check_mark:` confirmation reply — MANDATORY TOOL CALL** — call `slack_send_message` again with text starting with `:white_check_mark: Filed to Linear: <URL>` so future sweeps can detect the marker via `has::white_check_mark: from:me`. Record both `ts` values in the session log.
12. **Auto-fix (clean candidates only)** — if the dedupe gate is clean AND false-defect verification is clean AND the candidate isn't on the Handoff-Exclusion list, immediately invoke the `red-green-fix` skill via the `Skill` tool. See § Fix Workflow for the exact call contract.
13. **Log** — append to session log; update `processed.json`.
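The shape of the create message in step 9 can be sketched; only the field names (Team, Status, Labels) are specified by this skill — the field values here are illustrative and the exact grammar the Linear bot accepts is an assumption to verify against its documentation:

```shell
# All values below are illustrative placeholders, not real bug data.
TITLE="Canvas crash when deleting a subgraph node"
LABELS="source:bug-dump,area:node-system,env:cloud-prod,sev:high,reporter:jane-doe"
printf '@Linear create %s\nTeam: Frontend Engineering\nStatus: Triage\nLabels: %s\n' \
  "$TITLE" "$LABELS"
```

The resulting text is what goes into the `slack_send_message` thread reply, never into the CLI response.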

## System Context

| Item | Value |
| --- | --- |
| Source channel | `#bug-dump` (`C0A4XMHANP3`) |
| Destination | Linear `Frontend Engineering` team, via the Linear Slack app (`@Linear`). Team is named in every `@Linear create` message. |
| Default state | `Triage` — every `@Linear create` message includes `Status: Triage` |
| State dir | `~/temp/bug-dump-ingest/` |
| Processed registry | `~/temp/bug-dump-ingest/processed.json` |
| Session log | `~/temp/bug-dump-ingest/session-YYYY-MM-DD.md` |
| Drafts (failure) | `~/temp/bug-dump-ingest/drafts/*.md` — written only when `@Linear` never replies, so the human can retry manually |
||||
## Label Taxonomy
|
||||
|
||||
Every created Linear issue MUST get the following labels, passed as a comma-separated list in the `Labels:` line of the `@Linear create` message. The Linear Slack app creates missing labels on first use:
|
||||
|
||||
| Label kind | Values | Source |
|
||||
| ------------ | ------------------------------------------------------------------------------ | ------------------------- |
|
||||
| `source:` | `source:bug-dump` | Always (marks Slack sync) |
|
||||
| `area:` | `area:ui`, `area:node-system`, `area:workflow`, `area:cloud`, `area:templates` | Area Heuristics |
|
||||
| `env:` | `env:cloud-prod`, `env:cloud-dev`, `env:local`, `env:electron` | Env Heuristics |
|
||||
| `severity:` | `sev:high`, `sev:medium`, `sev:low` | Severity Heuristics |
|
||||
| `reporter:` | `reporter:<slack-handle>` (kebab-case) | From message author |
|
||||
| Status flags | `needs-repro`, `needs-backend`, `regression`, `pr-open` | When applicable |
|
||||
|
||||
Label rules:
|
||||
|
||||
- Always include `source:bug-dump`, exactly one `area:`, at least one `env:` (or `env:unknown`), exactly one `severity:`, exactly one `reporter:`.
|
||||
- `needs-repro` — set when repro steps were ambiguous; signals "human should confirm before fix".
|
||||
- `needs-backend` — set when fix is clearly in ComfyUI backend, not this frontend repo.
|
||||
- `regression` — set when the bug mentions a version/upgrade correlation.
|
||||
- `pr-open` — set instead of creating a fresh ticket when a fix PR already exists; the Linear issue becomes a tracker.
|
||||
|
||||
Labels are the primary affordance for downstream triage — invest in getting them right, not just in the title.
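
The `Labels:` line can be derived mechanically from the normalized ticket (§ Ticket Schema). A minimal sketch in TypeScript; the `Ticket` shape and helper names are illustrative, not part of the skill contract:

```typescript
// Illustrative helper: builds the `Labels:` line for an `@Linear create` message.
interface Ticket {
  area: string; // "ui" | "node-system" | "workflow" | "cloud" | "templates"
  env: string[]; // e.g. ["cloud prod"]; empty falls back to env:unknown
  severity: string; // "low" | "medium" | "high"
  reporter: string; // display name, e.g. "Ali Ranjah (wavey)"
  flags?: string[]; // e.g. ["needs-repro", "regression"]
}

// Kebab-case a free-form handle: "Ali Ranjah (wavey)" -> "ali-ranjah-wavey"
function kebab(s: string): string {
  return s
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-|-$/g, '')
}

function labelsLine(t: Ticket): string {
  const envs = t.env.length > 0 ? t.env : ['unknown']
  const labels = [
    'source:bug-dump',
    `area:${t.area}`,
    ...envs.map((e) => `env:${kebab(e)}`),
    `sev:${t.severity}`,
    `reporter:${kebab(t.reporter)}`,
    ...(t.flags ?? [])
  ]
  return `Labels: ${labels.join(', ')}`
}
```

Env tags are kebab-cased the same way as reporter handles, so `"cloud prod"` becomes `env:cloud-prod`.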

## Processed Detection

A top-level message is considered already-handled (skip creation) if ANY of:

- Its timestamp appears in `processed.json`.
- It carries a `:white_check_mark:` reaction on the parent — ticket already created.
- It carries a `:pr-open:` reaction — fix PR is open; skill records the PR link in the session log rather than creating a fresh Linear issue.
- It carries a `:repeat:` reaction — duplicate; skill attempts to find the original Linear issue and link it in the session log.
- It carries a `:question:` reaction — needs more context; skill skips creation and records for follow-up.
- Its thread contains a reply with a `https://linear.app/` URL (fetch via `slack_read_thread`).
- Its thread contains a reply starting with `:white_check_mark:` from the skill's bot user.
- It is a system/meta message (`has joined the channel`, bot-only message).
- Its thread already contains resolution confirmation (`"solved"`, `"resolved"`, `:done:` reaction from the reporter) AND has no fix PR referenced — treat as "resolved without ticket, skip".

Never re-ingest a message already marked in any of the above ways.
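
The registry lookup in the first bullet is a set-membership check. A sketch, assuming `processed.json` stores a flat JSON array of parent-message timestamps; the actual file shape may differ, so adapt to whatever the registry really holds:

```typescript
import { existsSync, readFileSync } from 'node:fs'

// Assumed shape: processed.json is a flat array of Slack ts strings.
function loadProcessed(path: string): Set<string> {
  if (!existsSync(path)) return new Set()
  return new Set(JSON.parse(readFileSync(path, 'utf8')) as string[])
}

function isProcessed(ts: string, registry: Set<string>): boolean {
  return registry.has(ts)
}
```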

Filter query for Slack search-based sweeps:

```text
in:<#C0A4XMHANP3> -has::white_check_mark: -has::pr-open: -has::repeat: -has::question: after:YYYY-MM-DD
```

## False-Defect Verification

Before a candidate hits the approval batch, run cheap checks to demote obvious non-bugs. Goal: keep the approval table high-signal. This is not a full repro — just fast heuristics that catch the top false-positive classes.

| Check                                     | Command / Signal                                                 | Demote-to  |
| ----------------------------------------- | ---------------------------------------------------------------- | ---------- |
| Reporter self-resolved in same msg        | "no action needed", "solved", "nvm", "fixed it"                  | `resolved` |
| Reporter self-resolved in thread          | `slack_read_thread` → reporter's last reply contains "solved"    | `resolved` |
| Fix PR merged on main                     | `gh search prs "in:title <keyword>" --state merged --limit 3`    | `fixed`    |
| Fix PR open (already-filed)               | `gh search prs "<keyword>" --state open --limit 3`               | `pr-open`  |
| Linear issue exists (open)                | `@Linear search` on title keywords → any open match              | `dedupe`   |
| Behavior is documented / intended         | grep `docs/` and `src/locales/en/*.json` for the feature         | `expected` |
| Not reproducible — feature doesn't exist  | grep `src/` for mentioned component/feature → 0 hits             | `stale`    |
| Env drift only (local setup issue)        | Thread contains "my machine", "my setup", "proxy" without others | `env`      |

For each demoted candidate, record the demotion reason in the approval table as `Verify: <tag>` so the human can override if they disagree. Never hard-skip based on verification alone — always show the row with the demotion.

### Recommended verify commands

```bash
# 1. Search recent PRs for the feature in question
gh search prs "<keyword>" --repo Comfy-Org/ComfyUI_frontend --limit 5

# 2. Grep for the feature / component mentioned
rg -l "<ComponentOrFeatureName>" src/ apps/

# 3. Check if it's a known i18n / documented setting
rg "<setting-key>" src/locales/en/ docs/
```

Keep verification under ~30s per candidate. If it takes longer, propose a ticket and let the human decide — don't let verification become the bottleneck.

## Classification Rules

For each unprocessed top-level message, decide:

| Class             | Signal                                                                                                      | Action                       |
| ----------------- | ----------------------------------------------------------------------------------------------------------- | ---------------------------- |
| **bug**           | Describes unexpected behavior, visual glitch, error, regression, crash. Usually has repro steps or media.   | Propose Linear ticket        |
| **discussion**    | Design question, rollout thoughts, team chatter, PR planning (e.g. "how about we make a PR to do...")       | Skip                         |
| **question**      | User asking if something is expected or known                                                               | Skip unless answered = bug   |
| **meta**          | Channel joins, bot messages, cross-posts without content                                                    | Skip                         |
| **already-filed** | Thread shows PR already open OR existing Linear link                                                        | Skip, log with existing link |

When ambiguous, default to **bug** and let the human decide in the approval batch.

## Ticket Schema

Normalize each bug to this shape before presenting:

```json
{
  "slack_ts": "1776639963.837519",
  "slack_permalink": "https://comfy-organization.slack.com/archives/C0A4XMHANP3/p1776639963837519",
  "reporter": "Ali Ranjah (wavey)",
  "title": "Unet model dropdown missing selected model",
  "description": "Body with repro steps, env, attachments list, thread summary",
  "env": ["cloud prod"],
  "severity": "low | medium | high",
  "area": "ui | node-system | workflow | cloud | templates | unknown",
  "attachments": [{ "name": "...", "id": "F...", "type": "image/png" }],
  "thread_resolution": "solved | open | none"
}
```

Keep descriptions copy-paste friendly: lead with repro bullets, then env, then "See Slack: <permalink>". Attach thread summary only if it adds context beyond the top-level message.

### Severity Heuristics

- **high** — crash, data loss, blocks a template or core feature, affects paying users broadly (e.g. "job ends in 30m on Pro", "widget values reset").
- **medium** — visible regression, template error, wrong pricing, broken UX on a common path.
- **low** — cosmetic, single-template edge case, minor tooltip/boundary issue.

When unsure, mark `medium` and flag for human in the approval batch.

### Area Heuristics

- `ui` — visual glitches, palette issues, popover clipping, dropdown styling.
- `node-system` — canvas perf, reroute, node drag, widget rendering, undo.
- `workflow` — template failures, save/load, refresh regressions.
- `cloud` — jobs, pricing, assets, auth, queue.
- `templates` — specific template errors.

## Pre-flight Dedupe Gate (MANDATORY)

Before any candidate enters the approval table, run BOTH checks below and record the result in the row's `Dedup` and `PR` columns. This is a hard gate — no candidate may be proposed for creation without a verdict.

### Check 1 — Open Linear issues (via `@Linear search`)

Extract 3-5 keyword terms from the proposed title (strip stopwords). Post a search command to the bug-dump thread — use a scratch thread if no parent `ts` is available yet, but prefer the candidate's own parent thread so the search card becomes part of that thread's audit trail:

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear search <keyword-1> <keyword-2>\nTeam: Frontend Engineering\nStatus: open"
})
```

Poll `slack_read_thread` for up to 10s; parse the Linear app's card reply for `FE-NNNN` identifiers and URLs. Run the search twice with different keyword subsets if the first returns zero hits — reworded titles are the top false-negative class.

If `@Linear search` is not supported by the workspace's Linear app version, fall back to a Slack search for prior `@Linear` card replies in the channel:

```text
mcp__plugin_slack_slack__slack_search_public({
  query: "in:<#C0A4XMHANP3> from:@Linear <keyword-1> <keyword-2>"
})
```

This scans past Linear bot replies in the channel — any reply containing a matching `FE-NNNN` URL is a candidate duplicate. Record which dedupe path was used in the session log.

Treat a hit as a duplicate if any of:

- Title overlap ≥ 80% (after lowercasing + stopword removal)
- Same reporter + same component reference in description
- Same stack trace or error code

**Verdict:** set `Dedup: FE-NNNN` and default recommendation to `L` (link, don't create). The human may still override to `Y` to file a separate ticket.
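
The 80% title-overlap rule can be approximated as token-set overlap after the same lowercasing and stopword stripping. A rough sketch; the stopword list and the choice of denominator are assumptions, not a spec:

```typescript
// Illustrative stopword list; the real list is a judgment call.
const STOPWORDS = new Set(['a', 'an', 'the', 'is', 'in', 'on', 'of', 'to', 'for', 'and', 'not', 'with'])

function titleTokens(title: string): Set<string> {
  return new Set(
    title
      .toLowerCase()
      .split(/[^a-z0-9]+/)
      .filter((w) => w.length > 0 && !STOPWORDS.has(w))
  )
}

// Overlap = shared tokens / tokens in the smaller set; >= 0.8 counts as a duplicate.
function titleOverlap(a: string, b: string): number {
  const ta = titleTokens(a)
  const tb = titleTokens(b)
  if (ta.size === 0 || tb.size === 0) return 0
  let shared = 0
  for (const w of ta) if (tb.has(w)) shared++
  return shared / Math.min(ta.size, tb.size)
}
```

Using the smaller set as the denominator keeps a reworded-but-shorter title from dodging the threshold.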

### Check 2 — Open or merged fix PRs on GitHub

```bash
# Open PRs matching title keywords
gh pr list --repo Comfy-Org/ComfyUI_frontend --state open \
  --search "<keyword-1> <keyword-2>" --limit 5 \
  --json number,title,url,createdAt

# Recent merged fixes (last 30d) — catches "already fixed, waiting to ship"
gh pr list --repo Comfy-Org/ComfyUI_frontend --state merged \
  --search "<keyword-1> <keyword-2> merged:>=<YYYY-MM-DD>" --limit 5 \
  --json number,title,url,mergedAt
```

Treat a hit as a match if the PR title/body mentions the same component or bug phrase and the PR is unmerged or merged within the window covering the reporter's observation.

**Verdict:**

- Open PR match → set `PR: #NNNN (open)`, recommendation `pr-open` (file Linear with `pr-open` label linking the PR, skip auto-fix).
- Merged PR match → set `PR: #NNNN (merged)`, recommendation `fixed` (demote in verify, usually skip; human can override if the reporter claims the fix didn't land).

### Failure handling

If either check errors (Linear Slack app silent or not in channel, `gh` auth expired), DO NOT proceed to proposal — stop the sweep, report the failure to the user, and let them decide whether to re-run or manually dedupe. A silent skip of dedupe is never acceptable; it's the single biggest source of duplicate tickets.

Log each dedupe query + top hits in `~/temp/bug-dump-ingest/session-YYYY-MM-DD.md` under a per-candidate `Dedup trace:` block so the human can audit.

## Interactive Approval

Present candidates in batches of 5-10. Table format (10 columns):

```text
 # | Slack (author, time)   | Proposed title                          | Env        | Sev  | Area        | Dedup   | PR            | Verify   | Rec
---+------------------------+-----------------------------------------+------------+------+-------------+---------+---------------+----------+--------
 1 | wavey, 04-20 08:06     | Unet dropdown missing selected model    | cloud prod | low  | ui          | -       | -             | resolved | N
 2 | Denys, 04-18 05:45     | Pro plan jobs end at 30 minutes         | cloud prod | high | cloud       | -       | -             | clean    | Y
 3 | Terry Jia, 04-18 12:52 | Nodes 2.0 canvas lag on large workflows | -          | high | node-system | FE-4521 | -             | clean    | L
 4 | Pablo, 04-17 08:52     | Multi-asset delete popup shows hashes   | cloud prod | low  | ui          | -       | #11402 (open) | clean    | pr-open
```

Each row MUST show: Slack author + date, proposed title, env tags, severity, area, **dedupe status from the Pre-flight Dedupe Gate**, **open/merged PR hit from the Pre-flight Dedupe Gate**, verify tag (from False-Defect Verification), and agent recommendation.

### Default recommendation logic

The skill computes `Rec` deterministically from the gate results:

- `L` — Dedupe hit on open Linear issue.
- `pr-open` — Open GitHub PR hit.
- `fixed` — Merged PR hit within the reporter's observation window.
- `N` — Verify tag is `resolved`, `expected`, `stale`, or `env` only.
- `?` — Repro incomplete or classification ambiguous.
- `Y` — Everything clean AND candidate is not on the § Handoff-Exclusion list. This is the "file + auto-fix" path.
- `Y (file-only)` — Clean but on the handoff-exclusion list (e.g. touches LGraphNode, needs backend). File Linear, skip auto-fix.
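
The precedence above collapses to a single fall-through. A sketch, assuming the gate results have already been normalized into the flags shown (field names are illustrative):

```typescript
// Normalized gate results for one candidate; names are illustrative.
interface GateResult {
  dedupHit: string | null // e.g. "FE-4521" from the Pre-flight Dedupe Gate
  prOpen: string | null // e.g. "#11402"
  prMerged: string | null
  verify: string // "clean" | "resolved" | "expected" | "stale" | "env"
  reproComplete: boolean
  handoffExcluded: boolean // § Handoff-Exclusion list
}

// Order matters: dedupe beats PR hits beats verify demotion beats repro doubt.
function recommend(g: GateResult): string {
  if (g.dedupHit) return 'L'
  if (g.prOpen) return 'pr-open'
  if (g.prMerged) return 'fixed'
  if (g.verify !== 'clean') return 'N'
  if (!g.reproComplete) return '?'
  return g.handoffExcluded ? 'Y (file-only)' : 'Y'
}
```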

### Response format

- `Y` — default path: create Linear ticket, post `:white_check_mark:` thread reply, AND if the candidate is eligible (dedupe clean, verify clean, not on handoff-exclusion list), immediately invoke `red-green-fix` via the `Skill` tool. See § Fix Workflow.
- `S` — **skip auto-fix** for this row: create Linear ticket + thread reply only, do NOT run red-green-fix. Use when the human knows a specific person is already investigating or wants to batch fixes.
- `N` — skip entirely (log reason in session file).
- `?` — mark as needs-context; skill posts a thread reply asking for repro details and prompts the human to add `:question:` to the parent.
- `L` — link to existing Linear issue instead of creating (skill asks which one if the Pre-flight Dedupe Gate didn't return an exact match).
- `R` — duplicate of another bug-dump message; skill links the two and prompts the human for `:repeat:` on the parent.
- `E` — edit proposed title/description before creating (skill shows draft for inline tweaks).
- Bulk responses accepted: `1 N, 2 Y, 3 L FE-4521, 4 pr-open #11402, 5 ?` — any row omitted from the response is treated as its computed `Rec` default.

Do not post any `@Linear create` messages until all candidates in the batch have a terminal decision. Auto-fix invocations run sequentially AFTER every `@Linear create` has produced a parsed `FE-NNNN`, so every `red-green-fix` call has a `Fixes FE-NNNN` to put in the PR body.

## Linear Slack Bot Integration (@Linear)

Every Linear action — create, search, link, label, status change — is performed by posting a message to the candidate's thread in `#bug-dump` that mentions `@Linear`. The Linear Slack app parses the mention and responds with a card in the same thread. There is no Linear MCP path and no `LINEAR_API_KEY` path; see `reference/linear-api.md` § "Why no direct API path" for the rationale.

### Prerequisites

- The Comfy Slack workspace already has the Linear Slack app installed (this is how humans add `@Linear` mentions today).
- Channel `C0A4XMHANP3` is connected to the `Frontend Engineering` Linear team.
- No per-machine setup. If a `@Linear` invocation produces no bot reply, the app is not in the channel — surface to the human, do NOT retry silently.

### Create an issue

For each approved `Y` candidate, call:

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear create\nTeam: Frontend Engineering\nTitle: <title>\nStatus: Triage\nLabels: source:bug-dump, area:<area>, env:<env>, sev:<severity>, reporter:<handle>\n\n<description>\n\nSource: <slack-permalink>"
})
```

Rules:

- First line MUST be `@Linear create` — this is the command token.
- `Team: Frontend Engineering` is required on every create — without it the bot falls back to the workspace default, which may route to a different team.
- `Status: Triage` pins the initial state (per § System Context).
- `Labels:` — comma-separated, full `source:bug-dump, area:*, env:*, sev:*, reporter:*` set per § Label Taxonomy. Missing labels are auto-created by the Linear Slack app on first use.
- Description body is markdown — see `reference/linear-api.md` § "Description body template" and `reference/schema.md` for per-field extraction.
- Use real newlines (not literal `\n`) when constructing the text.

After the tool call returns, poll `slack_read_thread` for the Linear app's reply card (up to 3× with ~3s spacing). Parse the card for:

- An `FE-NNNN` identifier
- A `https://linear.app/<org>/issue/FE-NNNN` URL

The URL is the ingested receipt. The skill then posts the `:white_check_mark:` confirmation reply (§ Slack Thread Reply).
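
Parsing the card reduces to two regex pulls over the reply text. A best-effort sketch; the exact card layout varies, so treat the patterns as assumptions:

```typescript
// Best-effort extraction of identifier and URL from a Linear bot card reply.
function parseLinearCard(text: string): { id: string; url: string } | null {
  const id = text.match(/\bFE-\d+\b/)?.[0]
  const url = text.match(/https:\/\/linear\.app\/\S+\/issue\/FE-\d+\S*/)?.[0]
  if (!id || !url) return null // missing either => failure path
  return { id, url }
}
```

Returning `null` feeds the failure path: never fabricate the identifier or the URL.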

### Search (dedupe)

See § Pre-flight Dedupe Gate § Check 1 for the search command shape and handling of the bot's reply. The search is a tool call in the candidate's thread — not a chat aside.

### Link an existing issue (`L` response)

When the human picks `L FE-4521` for a row, do NOT post `@Linear create`. Instead:

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear link FE-4521"
})
```

The bot replies with the linked issue card. Then post the `:white_check_mark:` confirmation reply (adjusted to say `Linked to Linear:` rather than `Filed to Linear:`) so Processed Detection still matches.

### Label / status updates

When a later sweep needs to flip a ticket (e.g. a PR opened after initial ingest, so add `pr-open` and link):

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear FE-4521 add-labels pr-open"
})
```

Status changes are rarely driven by this skill directly — Linear auto-moves issues to `In Review` when a PR with `Fixes FE-NNNN` is opened, and the `red-green-fix` skill handles that PR body.

### Captured fields per create

Every successful create must produce, via the Linear bot's reply card:

- `identifier` — e.g. `FE-4710`, used in `Fixes <LIN-ID>` references and session log
- `url` — `https://linear.app/.../issue/FE-4710`, included verbatim in the `:white_check_mark:` reply
- `ts` of the Linear bot's card reply — recorded in session log for audit

If the card is missing the URL or identifier, fall through to the failure path below — do NOT fabricate either value.

### Failure path

If the Linear bot does not reply within the poll window, OR replies with a parse error (`couldn't parse`, `no team matched`, `failed`):

1. Write a draft markdown file to `~/temp/bug-dump-ingest/drafts/NN-short-slug.md` containing the full `@Linear create` text that was sent plus any partial bot reply.
2. Post a thread reply that is explicit about the failure — do NOT include `:white_check_mark:` or a fake Linear URL:

   ```text
   :warning: bug-dump-ingest: @Linear did not respond. Drafted at ~/temp/bug-dump-ingest/drafts/<slug>.md — please file manually and reply with the FE-NNNN.
   ```

3. Skip auto-fix for this candidate (no Linear ID = no `Fixes` reference).
4. Log the failure in the session log.

Never invent a Linear URL. Never post `:white_check_mark: Filed to Linear: ...` without a real URL parsed from a real Linear bot card.

## Slack Thread Reply (Ingested Marker) — MANDATORY TOOL CALL

Every approved candidate produces **two** mandatory `slack_send_message` calls in the parent thread:

1. The `@Linear create` (or `@Linear link`) command — see § Linear Slack Bot Integration.
2. The `:white_check_mark:` confirmation reply described below, posted after a real `FE-NNNN` + URL have been parsed from the Linear bot's card.

The second reply is what future sweeps grep for via `has::white_check_mark: from:me`. Even though the Linear bot's own card already contains the URL, the `:white_check_mark:` prefix is the canonical Processed Detection marker — without it, a future sweep may re-ingest the same bug.

The skill is not done with a candidate until BOTH calls have succeeded. If either fails, do not claim the candidate is ingested.

### Required call shape

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-message-ts>", // dotted form, e.g. "1776714531.990509"
  text: ":white_check_mark: Filed to Linear: <LINEAR_URL>\nReporter: <@USER_ID>\nSev: <severity> • Area: <area>"
})
```

Rules:

- `thread_ts` MUST be the parent message ts — never the channel ts, never omitted. An omitted `thread_ts` posts at channel level, which pollutes `#bug-dump` and breaks Processed Detection.
- The text MUST start with `:white_check_mark:` followed by a space and `Filed to Linear:`. This exact prefix is what future sweeps grep for via `has::white_check_mark: from:me`.
- The Linear URL MUST be present. No URL = not ingested; future sweeps will re-file the same bug.
- Plain text only — no markdown tables, no bold, no code fences. Slack renders the emoji shortcode into a real `:white_check_mark:` only when the message is plain text.
- Capture the returned `ts` and record it in the session log for audit.

### NEVER-do list (common failure mode)

- **Do NOT** print `@Linear create ...` or `:white_check_mark: Filed to Linear: <URL>` into the Claude CLI chat response as a substitute for calling `slack_send_message`. The CLI output is not seen by Slack. If you find yourself typing either into a plain assistant message, stop and issue the tool call instead.
- **Do NOT** claim the thread reply was posted until the `slack_send_message` tool call has returned a success with a `ts`. If the tool call errors, surface the error and halt the batch — do not fabricate a reply.
- **Do NOT** use any other tool (e.g. `slack_schedule_message`, `slack_send_message_draft`) as a substitute. Only an immediate `slack_send_message` with `thread_ts` set counts — the Linear Slack app does not trigger on scheduled/draft messages.
- **Do NOT** substitute any direct Linear API call (MCP, GraphQL, curl) for the `@Linear` mention. The Slack thread is intentionally the single audit trail.

### Fix-path reply (after red-green-fix opens a PR)

When `red-green-fix` returns a PR URL for an auto-fixed candidate, the skill MUST post a second thread reply on the same parent — again via `slack_send_message`:

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<same parent ts>",
  text: ":pr-open: Fix PR: <PR_URL>\nRed-green verified: <unit|e2e> test proves the regression.\nFixes <LIN-ID>"
})
```

Same "tool call, not chat output" rule applies.

### Parent reaction — optional visibility nudge (not on critical path)

The Slack MCP does not expose `reactions.add`, so the skill cannot set a `:white_check_mark:` reaction on the parent. The thread reply above is sufficient for Processed Detection; the parent reaction is a human-only "visible in channel" nudge. At the end of the run, the skill MAY print a compact list for the human:

```text
Optional: add :white_check_mark: to parent messages for in-channel visibility.
FE-4710 → <permalink>
FE-4711 → <permalink>
```

This is a convenience, not a deliverable — a missing parent reaction does not cause re-ingestion.

## Fix Workflow (auto-invoke red-green-fix)

For every `Y` row whose `Rec` resolved to auto-fix (dedupe clean, verify clean, not on handoff-exclusion list), the skill MUST — after Linear creation and the `:white_check_mark:` thread reply — invoke the `red-green-fix` skill via the `Skill` tool. This is a real tool call, not a narrative handoff.

### Required Skill tool call

```text
Skill({
  skill: "red-green-fix",
  args: "<composed prompt — see below>"
})
```

Compose `args` as a single self-contained prompt so the sub-invocation has everything it needs without re-reading the Linear issue:

```text
Bug: <title>
Linear: <LIN-ID> (<LINEAR_URL>)
Source: Slack <permalink>
Reporter: <display-name>
Env: <env tags>
Area: <area>
Branch: fix/<lin-id-lowercase>-<short-slug>

Repro:
1. <step>
2. <step>

Expected: <expected behavior>
Actual: <actual behavior>

Test layer (inferred from area):
- ui → Vitest colocated + Playwright e2e tagged @regression
- node-system → Playwright e2e primarily
- workflow / templates → Playwright e2e
- cloud → Vitest if client-side; otherwise STOP and label the Linear issue "needs-backend"

Test naming:
- describe('<LIN-ID>: <one-line bug summary>', ...)
- Playwright test title must include the LIN-ID.

PR body must include:
- "Fixes <LIN-ID>"
- "Source: Slack <permalink>"

Follow the red-green-fix two-commit sequence exactly. Do NOT skip the red commit.
```

The skill MUST wait for `red-green-fix` to return before moving to the next candidate. Process one auto-fix at a time so branch state is deterministic.

### Verifying the invocation ran

After the `Skill` call returns, the skill MUST confirm at least one of:

1. A new git branch named `fix/<lin-id>-*` exists (`git branch --list "fix/<lin-id>-*"`).
2. A PR URL is present in `red-green-fix`'s return payload.

If neither is true, the invocation silently no-op'd. Log the failure to the session log as `auto-fix skipped: invocation returned without branch or PR` and continue — do NOT post the `:pr-open:` thread reply.

### Inputs summary

- **Bug description** — the Linear description (includes repro, env, source permalink).
- **Linear ID** — inserted into the PR body as `Fixes <LIN-ID>`.
- **Branch name** — `fix/<lin-id>-<short-slug>` (e.g. `fix/fe-4711-pro-plan-30min-timeout`).
- **Test layer** — inferred from `area`:
  - `ui` → unit (Vitest) + e2e (Playwright)
  - `node-system` → e2e primarily; unit if isolable
  - `workflow` / `templates` → e2e
  - `cloud` → unit if client-side logic, otherwise flag "backend — out of scope for this repo"
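
The branch name can be derived mechanically from the identifier and title. A sketch; truncating the slug to four words is an arbitrary choice, and a human may pick a more descriptive slug:

```typescript
// fix/<lin-id-lowercase>-<short-slug>, slug truncated to the first ~4 words.
function fixBranch(linearId: string, title: string): string {
  const slug = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-|-$/g, '')
    .split('-')
    .slice(0, 4)
    .join('-')
  return `fix/${linearId.toLowerCase()}-${slug}`
}
```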

### Handoff-Exclusion list (do NOT auto-invoke red-green-fix)

These rows still get a Linear ticket + `:white_check_mark:` thread reply, but the skill MUST skip the `Skill(skill="red-green-fix")` call and instead post a thread nudge explaining why:

- Repro steps are incomplete (no clear numbered steps, no env) — reply in thread: "Need clearer repro before I can write a failing test. What's the shortest path to reproduce?"
- Fix requires backend / ComfyUI repo changes (not frontend) — label Linear `needs-backend`.
- Linear ticket was dedupe-linked rather than newly created — existing owner may already be fixing.
- Severity is cosmetic AND reporter hasn't asked for a fix — file ticket only.
- Fix would touch `LGraphNode`, `LGraphCanvas`, `LGraph`, or `Subgraph` god-objects (ADR-0003/0008 — always human decision).
- Pre-flight Dedupe Gate found an open PR (`pr-open`) or a matching merged PR (`fixed`).

When a row is excluded, record the reason in the session log under `auto-fix excluded: <reason>`.

### Test authoring rules

Both tests MUST be written in the "red" commit BEFORE any fix code (per red-green-fix). Rules specific to bug-dump ingestion:

- **Unit test (Vitest)** — colocated next to the implementation, `<file>.test.ts`. Exercise the specific logic path reproduced by the reporter. One `describe` block named after the Linear ID:

  ```typescript
  // src/components/node/UnetDropdown.test.ts
  describe('FE-4710: unet dropdown missing selected model', () => {
    it('includes the currently-selected model in the list even when not in available models', () => {
      // ...
    })
  })
  ```

- **E2E test (Playwright)** — under `browser_tests/tests/`, follow the `writing-playwright-tests` skill. Tag with `@regression` and include the Linear ID in the test title:

  ```typescript
  test.describe(
    'FE-4710 unet dropdown regression',
    { tag: ['@regression'] },
    () => {
      test('keeps selected model visible in the dropdown', async ({
        comfyPage
      }) => {
        // ...
      })
    }
  )
  ```

- **Mock data types** — follow `docs/guidance/playwright.md`: mock responses typed from `packages/ingest-types`, `packages/registry-types`, `src/schemas/` — never `as any`.

(The Handoff-Exclusion list above governs when `red-green-fix` is NOT invoked.)

### PR body template

The red-green-fix skill's PR template is extended with a `Source` line:

```markdown
## Summary

<Root cause>

- Fixes FE-NNNN
- Source: Slack <permalink>

## Red-Green Verification

| Commit                                     | CI Status            | Purpose                         |
| ------------------------------------------ | -------------------- | ------------------------------- |
| `test: FE-NNNN add failing test for <bug>` | :red_circle: Red     | Proves the test catches the bug |
| `fix: <bug summary>`                       | :green_circle: Green | Proves the fix resolves the bug |

## Test Plan

- [ ] Unit regression test passes locally
- [ ] E2E regression test passes locally (if UI)
- [ ] Manual repro no longer reproduces
- [ ] Linear ticket linked
```

After the PR merges, post the second thread reply on Slack (see Slack Thread Reply § Fix-path reply).

## Emoji Reaction Hints (read-only)

The agent cannot add reactions, but respects human-set reactions when filtering. The canonical team scheme (primary):

| Reaction             | Meaning            | Action                                                   |
| -------------------- | ------------------ | -------------------------------------------------------- |
| `:white_check_mark:` | Ticket created     | Skip — already ingested                                  |
| `:pr-open:`          | PR open            | Skip creation; record PR link in session log             |
| `:question:`         | Needs more context | Skip creation; agent may post a thread reply asking      |
| `:repeat:`           | Duplicate          | Skip creation; link existing Linear issue in session log |

Incidental reactions observed in the channel — treat as soft hints only, do NOT skip solely on these:

| Reaction | Meaning             | Action                                             |
| -------- | ------------------- | -------------------------------------------------- |
| `:eyes:` | Someone is triaging | Still ingestable                                   |
| `:done:` | Reporter resolved   | Demote to `resolved` in verify, but still show row |
| `:+1:`   | Acknowledged        | Ignore                                             |

Approval-table response code `R` (new) corresponds to `:repeat:` — if you pick `R`, the skill treats it as duplicate and asks for the target Linear ID.

## Session Log

Append to `~/temp/bug-dump-ingest/session-YYYY-MM-DD.md`:

```text
Bug Dump Ingest Session -- 2026-04-20 11:40 KST

Window: 2026-04-18 00:00 — 2026-04-20 12:00 KST
Scanned: 28 top-level messages
Skipped (meta/discussion/processed): 14
Proposed: 14
Approved: 11
Created in Linear: 10
Draft-only (creation failed): 1
Linked-only (dedupe): 1
Thread replies posted: 11

Created:
- LIN-4710 Unet model dropdown missing selected model -- wavey -- low/ui
- LIN-4711 Pro plan jobs end at 30 minutes -- Denys -- high/cloud
- ...

Skipped with reason:
- 1776592837.616399 -- design discussion in thread, not a bug
- ...
```

## Gotchas

### Thread summaries, not raw dumps

Pulling the full thread often adds noise. Summarize replies down to: (a) confirmed reproductions by other users, (b) env/version details added in replies, (c) links to related PRs/commits. Drop emoji-only replies, joined-channel notifications, and off-topic chatter.

### Cross-posts are not bugs

When the top-level message is just a link to a Slack message in another channel (e.g. "X posting" with a URL and nothing else), follow the link to the original source and ingest from there — do NOT create a ticket from the cross-post itself.

### Resolved-in-thread messages

If the reporter replies `"No action needed, this is solved"` (see wavey 2026-04-20 08:06), mark the ticket as SKIP in the approval table rather than auto-skipping it. The human may still want a regression-test ticket.

### Permalinks

Construct Slack permalinks as:

```text
https://comfy-organization.slack.com/archives/{CHANNEL_ID}/p{TS_WITH_DOT_REMOVED}
```

E.g. `1776510375.473579` → `p1776510375473579`.
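The transformation is mechanical. As a sketch (a hypothetical TypeScript helper, not part of the skill runtime; the workspace URL and timestamp format are the real values from this doc):

```typescript
// Hypothetical helper illustrating the permalink rule above.
// Drops the dot from the message ts and prefixes it with "p".
function slackPermalink(channelId: string, ts: string): string {
  // "1776510375.473579" -> "p1776510375473579"
  return `https://comfy-organization.slack.com/archives/${channelId}/p${ts.replace('.', '')}`
}
```

For example, `slackPermalink('C0A4XMHANP3', '1776510375.473579')` yields the `.../p1776510375473579` form shown above.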

### Attachment handling

Slack file IDs (e.g. `F0AT...`) are private. Do NOT link them directly in Linear. Instead, list the filename and type in the Linear description and include the Slack permalink — anyone with Slack access can see the attachments from the thread.

### No auto-create without approval

Never create Linear issues without a human `Y`. This is a hard rule — the skill exists to reduce human toil, not to replace triage judgment.

## Reference Files

- `reference/linear-api.md` — `@Linear` Slack bot command reference (create, search, link, labels, status).
- `reference/schema.md` — full ticket schema with field-by-field extraction notes.
- `reference/examples.md` — worked examples drawn from real #bug-dump messages.
- `reference/verify-commands.md` — cookbook of false-defect verification commands per bug class.

## Related Skills

- `red-green-fix` — auto-invoked via the `Skill` tool for every eligible `Y` candidate to produce a failing test + fix + PR with the red-green CI proof.
- `writing-playwright-tests` — used by red-green-fix when an e2e test is needed.
- `hardening-flaky-e2e-tests` — if the e2e test added in the fix PR starts flaking, jump to this skill.
@@ -1,123 +0,0 @@
# Worked Examples

Real #bug-dump messages (2026-04-17 → 2026-04-20) normalized through the skill.

## Example 1 — Clean bug with repro

**Source message** (wavey, 2026-04-20 08:06):

> unet model dropdown doesnt display all available models, think this is part of a larger issue with model dropdowns..
>
> • open flux.2 klein 4b image edit template
> • open unet drop down --> notice selected model isnt present in the list, even though its selected
> • execute (to check if it flags the model as missing) --> notice it still runs
> No action needed, this is solved

**Thread resolution**: "No action needed, this is solved" — the reporter resolved it in the same message.

**Classification**: bug, but `thread_resolution = solved`. Flag for human.

**Approval row**:

```text
1 | wavey, 04-20 08:06 | Unet dropdown missing selected model | cloud | low | ui | N | N (reporter marked solved)
```

Default recommendation: `N`. If the human overrides to `Y`, file with a "Regression test" label so QA still tracks it.

---

## Example 2 — Clear high-severity cloud bug

**Source message** (Denys Puziak, 2026-04-18 05:45):

> I see two reports about jobs ending in 30 minutes while the user is on the Pro plan
> cc @Hunter
> https://discord.com/channels/.../1494078128971055145

**Classification**: bug, `env: [cloud prod]` (Pro plan = cloud), `severity: high` (paying users), `area: cloud`.

**Proposed title**: `Pro plan jobs end at 30 minutes`

**Description** (excerpt):

```markdown
**Reporter:** Denys Puziak
**Env:** cloud prod
**Severity (proposed):** high
**Area:** cloud

## Repro

1. User on Pro plan submits a job
2. Job ends at 30 minutes instead of the Pro plan limit

## Notes

- Two user reports aggregated by Denys
- cc'd @Hunter

## Source

Slack: <permalink>
Discord thread: https://discord.com/channels/.../1494078128971055145
```

---

## Example 3 — Not a bug (discussion)

**Source message** (Christian Byrne, 2026-04-19 19:00):

> @Glary-Bot okay option A is clearly superior and I feel embarrassed I didn't see that line myself...

**Classification**: discussion (design-review chatter). Skip. Log the reason in the session file.

---

## Example 4 — Meta-action / PR planning

**Source message** (Christian Byrne, 2026-04-19 09:30):

> @Glary-Bot how about we make a PR to do:
>
> 1. Audit the rest of the codebase...
> 2. Create a helper in src/base...

**Classification**: discussion (PR-plan proposal). Skip.

---

## Example 5 — Performance regression

**Source message** (Terry Jia, 2026-04-18 12:52):

> With Nodes 2.0, large workflows (hundreds of nodes) make the canvas extremely laggy and unusable for actual work — switching tabs takes several seconds or more. Switching back to Litegraph, performance is significantly better.

**Classification**: bug, `area: node-system`, `severity: high`.

**Dedupe**: Post `@Linear search nodes 2.0 performance canvas lag` (Team: Frontend Engineering, Status: open) in the candidate's thread. Likely matches exist — flag `Dedup? ?` and ask the human which ticket to link to.

---

## Example 6 — Reporter says it's a question, not a report

**Source message** (Luke, 2026-04-17 08:27):

> Is NodeInfo supposed to show information or docs about the node? It just brings up the node sidebar

**Classification**: question → ambiguous. Read the thread. If replies confirm "that's unexpected, should show docs", upgrade to bug. If "yes, that's intended", skip.

Default recommendation in the approval batch: `?` (needs expansion).

---

## Example 7 — Bug with PR already in flight

**Source message** (Pablo, 2026-04-17 08:52):

> when deleting multiple assets on cloud -> the confirmation popup still has the assets hashes as names instead of the display name

**Reaction**: `pr-open (1)` — someone has opened a PR.

**Classification**: `already-filed` branch. Skip creation; in the session log, note "PR already open". If the human wants a tracking Linear ticket anyway, it is still fileable with a link to the PR.
@@ -1,160 +0,0 @@
# Linear Slack Bot (@Linear) Reference

The skill drives Linear exclusively through the Linear Slack app (`@Linear`). **There is no Linear MCP, no `LINEAR_API_KEY`, no GraphQL.** Every Linear read/write happens as a Slack message that mentions `@Linear` in the `#bug-dump` thread; the Linear Slack app performs the action and posts a reply card containing the issue URL.

## Why Slack-only

- The `#bug-dump` thread is already the source of truth; keeping the entire lifecycle (report → ticket → PR → resolution) in one thread means Processed Detection can grep the thread instead of a separate registry.
- No API key rotation, no MCP server install, no OAuth browser flow — works on any machine that already has the Slack MCP configured.
- The Linear Slack app's reply card (with issue URL, title, status, and assignee) IS the canonical receipt; the skill records its `ts` in the session log.

## Prerequisites (one-time, per workspace)

The Comfy Slack workspace must already have the Linear Slack app installed (it is — that's how humans use `@Linear` reactions today), and `#bug-dump` (channel `C0A4XMHANP3`) must have Linear enabled for the `Frontend Engineering` team. Nothing else to configure. If a `@Linear` invocation silently does nothing, the bot isn't present in the channel — surface that to the human rather than retrying.

## Supported operations

Every operation is a `mcp__plugin_slack_slack__slack_send_message` call with `channel_id=C0A4XMHANP3` and `thread_ts=<parent-ts>`. The `text` is a natural-language instruction to the Linear bot. Keep the text concise — Linear parses the first line as the command intent.

### 1. Create an issue from the thread

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear create\nTeam: Frontend Engineering\nTitle: <title>\nStatus: Triage\nLabels: source:bug-dump, area:<area>, env:<env>, sev:<severity>, reporter:<handle>\n\n<description body>\n\nSource: <slack-permalink>"
})
```

Rules:

- Start with `@Linear create` on its own line — this is the command token the bot keys on.
- Always specify `Team: Frontend Engineering`. Without it, the bot falls back to the Slack workspace's default team, which may not be FE.
- `Status: Triage` pins the initial workflow state.
- `Labels:` — comma-separated. If a label doesn't exist yet in Linear, the bot creates it on first use (verified in Linear workspace settings). Keep the taxonomy exactly as SKILL.md § Label Taxonomy.
- `<description body>` — markdown per `reference/schema.md` Description Template. Use real newlines, not literal `\n`.
- End with `Source: <slack-permalink>` so the Linear issue body links back even if the auto-attachment of the parent message fails.

The Linear bot replies in the same thread with a card that contains:

- The Linear URL (`https://linear.app/comfy-org/issue/FE-NNNN`)
- Status, assignee (initially unassigned), and applied labels
- A "View in Linear" button

Parse the URL out of the bot's reply text (or attachments). If no card reply appears within ~10s of polling `slack_read_thread`, treat it as a creation failure — do NOT proceed to the `:white_check_mark:` confirmation reply.

### 2. Search existing open issues (dedupe)

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear search <keyword-1> <keyword-2>\nTeam: Frontend Engineering\nStatus: open"
})
```

The bot replies with a card listing up to ~5 matching open issues. Parse the identifier (`FE-NNNN`) and URL per row. Treat a hit as a duplicate per SKILL.md § Pre-flight Dedupe Gate § Check 1.

If `@Linear search` is not supported in the installed Slack app version, fall back to Slack-native search across the `#bug-dump` thread replies (previous `@Linear` cards contain title + URL — grep those for the same keywords). Record which path was used in the session log so the human can see dedupe coverage.

### 3. Link an existing issue (dedupe: `L` response)

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear link FE-4521"
})
```

The bot replies with the linked issue card. The skill then posts its own `:white_check_mark: Linked to Linear: <URL>` confirmation reply (see SKILL.md § Slack Thread Reply).

### 4. Add labels to an existing issue

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear FE-4521 add-labels pr-open"
})
```

Used when an open PR is discovered after ticket creation and the Linear issue should flip to `pr-open`.

### 5. Change status

```text
mcp__plugin_slack_slack__slack_send_message({
  channel_id: "C0A4XMHANP3",
  thread_ts: "<parent-ts>",
  text: "@Linear FE-4521 status In Progress"
})
```

Rarely used by the skill directly — status changes usually come from the `red-green-fix` PR lifecycle (Linear auto-moves to `In Review` when a PR references `Fixes FE-4521`).

## Description body template

The text that follows the command headers is rendered verbatim as the Linear issue description (markdown). Use this template — see `reference/schema.md` for field-by-field extraction notes:

```markdown
**Reporter:** <slack-display-name>
**Env:** cloud prod / local / electron / ...
**Severity (proposed):** high/medium/low
**Area:** ui / node-system / workflow / cloud / templates

## Repro

1. ...
2. ...

## Expected

...

## Actual

...

## Attachments (in Slack thread)

- image.png (png, 315 KB)
- Screen Recording.mov (mov, 37 MB)

## Source

Slack: <permalink>
Thread summary: <1-3 bullets if thread adds context>
```

The Slack permalink is load-bearing — it's the canonical route to attachments, the reporter, and any follow-up discussion. Do NOT embed Slack file IDs (`F0AT...`) directly; they're permissioned.

## Parsing the bot's reply

After each `slack_send_message` that mentions `@Linear`, poll `slack_read_thread` (with `channel_id=C0A4XMHANP3`, `thread_ts=<parent-ts>`) up to 3 times, ~3s apart. Scan replies authored by the Linear Slack app user for:

- Any `https://linear.app/<org>/issue/FE-\d+` URL → capture as the issue URL.
- The `FE-NNNN` identifier pattern → capture as the issue identifier.
- An error phrase (`couldn't`, `failed`, `not found`, `no team matched`) → treat as failure; surface the full bot text to the human.

Record the bot reply's `ts` alongside the captured URL and identifier in the session log.
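The scan above reduces to a small parser. A minimal TypeScript sketch (the URL pattern and error phrases come from this section; the function and type names are illustrative, not part of the skill runtime):

```typescript
// Minimal sketch of the reply scan described above. `text` is the text of one
// reply authored by the Linear Slack app.
interface ParsedLinearReply {
  url?: string
  identifier?: string
  failed: boolean
}

function parseLinearReply(text: string): ParsedLinearReply {
  // Issue URL, e.g. https://linear.app/comfy-org/issue/FE-4521
  const url = text.match(/https:\/\/linear\.app\/[\w-]+\/issue\/FE-\d+/)?.[0]
  // Bare identifier, e.g. FE-4521
  const identifier = text.match(/FE-\d+/)?.[0]
  // Error phrases listed above signal a failed operation
  const failed = /couldn't|failed|not found|no team matched/i.test(text)
  return { url, identifier, failed }
}
```

A reply that yields `failed: true` or no URL should be surfaced to the human rather than confirmed with `:white_check_mark:`.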

## Failure modes & handling

| Symptom | Likely cause | Handling |
| ------- | ------------ | -------- |
| No bot reply within 10s | Linear app not in channel, or bot outage | Halt the batch, surface to the human, and do NOT fabricate a Linear URL. Remaining approved candidates stay queued for re-run. |
| Bot replies with "no team matched" | Team name typo or Linear workspace drift | Re-send with the exact team name from the Linear workspace (default: `Frontend Engineering`). If it still fails, ask the human to verify. |
| Bot replies with "couldn't parse labels" | One of the labels has syntax the bot rejects | Drop the offending label and re-send; log the partial-label failure so the human can patch it afterward. |
| Bot creates the issue but the reply lacks the URL | Rare bot format change | Re-fetch the thread after ~5s; if the URL is still absent, run `@Linear search <title>` and recover the identifier + URL. |
| Multiple `@Linear` replies match (duplicate card) | The skill retried without polling first | Keep the earliest card's URL; log the extras. Never re-issue `@Linear create` for the same candidate without confirming the first card failed. |

Never retry `@Linear create` without first running `@Linear search` for the same title keywords — a duplicate card is worse than an initial failure, because the human has to close one of them manually.

## Why no direct API path

- The Linear MCP (official or community) would require either OAuth setup or `LINEAR_API_KEY` in env — both are per-machine hurdles the skill should not depend on.
- Direct GraphQL against `api.linear.app` has the same key-management cost and bypasses the Slack thread as the audit trail.
- Routing every action through `@Linear` in the thread gives humans full visibility in the channel (the bot's card is the receipt), and Processed Detection becomes a simple Slack thread read.

If a future need requires capabilities the `@Linear` Slack app doesn't expose (bulk operations, private field edits, webhooks), stop and surface the limitation to the human rather than quietly adding an API-key path — the "Slack-only" constraint is intentional.
@@ -1,94 +0,0 @@
# Ticket Schema — Extraction Notes

Field-by-field guidance for normalizing a Slack #bug-dump message into a ticket.

## `slack_ts`

The top-level message timestamp from the `slack_read_channel` response (`Message TS:` field). Always store the dotted form (`1776510375.473579`). This is the ingestion identity used in `processed.json`.

## `slack_permalink`

Construct:

```text
https://comfy-organization.slack.com/archives/C0A4XMHANP3/p<ts-without-dot>
```

Example: `1776510375.473579` → `.../p1776510375473579`.

## `reporter`

The display name plus the parenthetical nickname if present. Examples from the channel:

- `Ali Ranjah (wavey)`
- `Denys Puziak`
- `Christian Byrne`

Do NOT use the Slack user ID (`U087MJCDHHC`) in Linear — names are more readable.

## `title`

Rules:

- Start with a verb or noun phrase describing the observed defect, not the reporter.
- ≤ 80 chars.
- Include an env qualifier ("cloud prod", "local dev", "electron") only if ambiguous.
- Strip emoji and reactions from the original message when extracting.

Transformations:

| Slack message (excerpt)                                                  | Title                                               |
| ------------------------------------------------------------------------ | --------------------------------------------------- |
| "unet model dropdown doesnt display all available models..."              | Unet dropdown missing selected model                |
| "Dates are broken on Settings -> Secrets. Cloud Prod"                     | Settings → Secrets dates broken on cloud prod       |
| "LTX-2: Audio to VIdeo template results in the "RuntimeError..." error"   | LTX-2 Audio-to-Video template RuntimeError on cloud |
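The mechanical parts of these rules (emoji stripping and the length cap) can be sketched as follows; picking the defect phrase itself remains a judgment call. The helper name and emoji pattern are illustrative, not the skill's actual implementation:

```typescript
// Illustrative only: strips :emoji: shortcodes, collapses whitespace, and
// enforces the 80-char cap from the title rules above.
function normalizeTitle(raw: string): string {
  const stripped = raw
    .replace(/:[a-z0-9_+-]+:/gi, '') // drop :emoji: shortcodes
    .replace(/\s+/g, ' ')
    .trim()
  return stripped.length <= 80 ? stripped : `${stripped.slice(0, 77)}...`
}
```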

## `description`

Structure — see `linear-api.md` § "Description body template". Key rules:

- Lead with a **Repro** numbered list. Extract it from the message body; if no steps are given, write "Repro: [Slack message body quoted verbatim]" and flag for the human in approval.
- Preserve the reporter's own words in the Repro section when they include "step 1 / step 2" markers.
- Collapse multi-paragraph asides into "Notes" at the end.

## `env`

Detect from the message text using these terms:

| Text in message            | Tag                    |
| -------------------------- | ---------------------- |
| `cloud prod`, `prod cloud` | `cloud prod`           |
| `cloud dev`                | `cloud dev`            |
| `cloud`                    | `cloud` (unqual.)      |
| `local`, `localhost`       | `local`                |
| `electron`, `desktop`      | `electron`             |
| `nodes 2.0`, `LG`          | (feature tag, not env) |

A message can have multiple env tags. If none are detectable, set `env: []` and flag "env unclear" in the approval row.
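The table above reduces to a small matcher. A sketch (illustrative function, not the skill's implementation; the only subtlety is checking the qualified phrases before the bare `cloud`):

```typescript
// Sketch of the env-tag table above. Qualified "cloud prod" / "cloud dev"
// phrases are checked first; the bare "cloud" tag only applies as a fallback.
function detectEnvTags(text: string): string[] {
  const t = text.toLowerCase()
  const tags: string[] = []
  if (/cloud prod|prod cloud/.test(t)) tags.push('cloud prod')
  if (/cloud dev/.test(t)) tags.push('cloud dev')
  if (tags.length === 0 && /cloud/.test(t)) tags.push('cloud')
  if (/local|localhost/.test(t)) tags.push('local')
  if (/electron|desktop/.test(t)) tags.push('electron')
  return tags
}
```

An empty result maps to `env: []` with the "env unclear" flag in the approval row.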

## `severity`

Heuristics are in SKILL.md. When uncertain, mark `medium` and note it in the approval table: `Sev: medium (flag)`.

## `area`

Single tag. Use the one that best fits; tiebreak toward the more actionable team:

- `cloud` > `workflow` when the reported behavior is specific to cloud-hosted features (billing, queue, jobs)
- `node-system` > `ui` when the defect is canvas interaction, not just visual
- `templates` only when a named template is the subject

## `attachments`

From the `slack_read_channel` message `Files:` field. Parse the name, ID, and type. Never include the Slack file ID in the Linear description — those are permissioned — just the filename and type.

## `thread_resolution`

Fetch via `slack_read_thread`. Scan replies for:

- `solved`, `resolved`, `fixed`, `no action needed` → `solved`
- A `:done:` reaction from the reporter → `solved`
- A `https://github.com/Comfy-Org/ComfyUI_frontend/pull/` URL in a reply → `pr-open` (keep, but note it in the description)
- Otherwise → `open`

If `solved` and no PR merged, flag it in the approval table: reporter marked solved — confirm before filing.
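The text-reply part of that scan can be sketched as a classifier (illustrative names; the `:done:` reaction branch is omitted here because reactions arrive as separate metadata, not reply text):

```typescript
// Illustrative classifier for the thread_resolution rules above.
// Precedence follows the list order: solved phrases win over a PR link.
type ThreadResolution = 'solved' | 'pr-open' | 'open'

function classifyThreadResolution(replies: string[]): ThreadResolution {
  const joined = replies.join('\n').toLowerCase()
  if (/\b(solved|resolved|fixed|no action needed)\b/.test(joined)) return 'solved'
  if (joined.includes('github.com/comfy-org/comfyui_frontend/pull/')) return 'pr-open'
  return 'open'
}
```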
@@ -1,99 +0,0 @@
# Verify Commands Cookbook

One-shot commands for each False-Defect Verification class. Keep each under ~30s.

## 1. Check for existing fix PR

```bash
# By keyword in title
gh search prs --repo Comfy-Org/ComfyUI_frontend "<keyword>" --state merged --limit 5

# By keyword in body
gh pr list --repo Comfy-Org/ComfyUI_frontend --search "<keyword>" --state all --limit 5

# Recent closing PRs near the reported date
gh pr list --repo Comfy-Org/ComfyUI_frontend --state merged \
  --search "merged:>=<YYYY-MM-DD> <keyword>" --limit 10
```

Verify tag: `fixed` if a merged PR explicitly matches; `pr-open` if an open PR matches.

## 2. Check for existing open Linear issue

```text
# Primary: @Linear search in the candidate's bug-dump thread
# mcp__plugin_slack_slack__slack_send_message({
#   channel_id: "C0A4XMHANP3",
#   thread_ts: "<parent-ts>",
#   text: "@Linear search <keyword-1> <keyword-2>\nTeam: Frontend Engineering\nStatus: open"
# })
# → poll slack_read_thread, parse the Linear app's reply card for FE-NNNN matches.
#
# Fallback: grep past @Linear bot replies in the channel for prior ingested titles
# mcp__plugin_slack_slack__slack_search_public({
#   query: "in:<#C0A4XMHANP3> from:@Linear <keyword-1> <keyword-2>"
# })
```

Verify tag: `dedupe` with the `FE-NNNN` identifier in the approval row. See `reference/linear-api.md` § "Search existing open issues (dedupe)" for full handling.

## 3. Feature actually exists in codebase

```bash
# Find the component / feature mentioned
rg -l "<ComponentOrFeatureName>" src/ apps/ --type vue --type ts

# Find a setting key
rg "<setting-key>" src/locales/en/ src/stores/settingStore.ts

# Find a store action
rg "<actionName>" src/stores/ --type ts
```

Verify tag: `stale` if there are 0 hits AND the feature name is specific (not a generic word).

## 4. Intended behavior check

```bash
# Check docs and release notes
rg -l "<feature keyword>" docs/ CHANGELOG.md

# Check if the behavior is asserted in an existing test (green today)
rg "<observed behavior>" src/**/*.test.ts browser_tests/
```

Verify tag: `expected` if docs describe this as the intended behavior, or a test asserts it.

## 5. Reporter self-resolution

Already gathered via `slack_read_thread`. Look for the reporter's own replies containing:

- "solved", "resolved", "fixed", "no action needed", "nvm", "my bad"
- A `:done:` reaction from the reporter
- A `:white_check_mark:` reaction

Verify tag: `resolved`.

## 6. Env-specific / local setup

If the message mentions "my machine", "my proxy", "my docker", "my cache" AND no other reporter has confirmed in-thread:

```bash
# Check thread for cross-user confirmations
# slack_read_thread → count distinct users replying with "same", "repro'd", "+1"
```

Verify tag: `env` if only the reporter is affected.

## 7. Cross-post (X posting)

If the top-level message is just a link + "X posting":

```bash
# Follow the link — use slack_search_public to find the original thread
# slack_search_public({ query: "<in:channel from:@reporter> <before:date>" })
```

If the original is already ingestable, ingest from the original's permalink. If it's a GitHub issue, prefer linking that GitHub issue to the Linear ticket instead of creating two entries.

Verify tag: `cross-post` with the resolved source permalink.
@@ -114,7 +114,7 @@ await expect(async () => {
 ## CI Debugging

 1. Download artifacts from failed CI run
-2. Extract and view trace: `npx playwright show-trace trace.zip`
+2. Extract and view trace: `pnpm dlx playwright show-trace trace.zip`
 3. CI deploys HTML report to Cloudflare Pages (link in PR comment)
 4. Reproduce CI: `CI=true pnpm test:browser`
 5. Local runs: `pnpm test:browser:local`
@@ -76,7 +76,7 @@ const executeTask = async (task: MaintenanceTask) => {

     message = t('maintenance.error.taskFailed')
   } catch (error) {
-    message = (error as Error)?.message
+    message = error instanceof Error ? error.message : undefined
   }

   toast.add({
@@ -66,7 +66,7 @@ class MaintenanceTaskRunner {
     this.error = undefined
     return true
   } catch (error) {
-    this.error = (error as Error)?.message
+    this.error = error instanceof Error ? error.message : String(error)
     throw error
   } finally {
     this.executing = false
@@ -3,6 +3,23 @@ import sitemap from '@astrojs/sitemap'
 import vue from '@astrojs/vue'
 import tailwindcss from '@tailwindcss/vite'

+const LOCALES = ['en', 'zh-CN'] as const
+const DEFAULT_LOCALE = 'en'
+const PAYMENT_STATUSES = ['success', 'failed'] as const
+const LOCALE_PREFIXES = LOCALES.map((locale) =>
+  locale === DEFAULT_LOCALE ? '' : `/${locale}`
+)
+const SITEMAP_EXCLUDED_PATHNAMES = new Set(
+  LOCALE_PREFIXES.flatMap((prefix) =>
+    PAYMENT_STATUSES.map((status) => `${prefix}/payment/${status}`)
+  )
+)
+
+function isExcludedFromSitemap(page: string): boolean {
+  const pathname = new URL(page).pathname.replace(/\/$/, '')
+  return SITEMAP_EXCLUDED_PATHNAMES.has(pathname)
+}
+
 export default defineConfig({
   site: 'https://comfy.org',
   output: 'static',
@@ -17,7 +34,12 @@ export default defineConfig({
     assets: '_website'
   },
   devToolbar: { enabled: !process.env.NO_TOOLBAR },
-  integrations: [vue(), sitemap()],
+  integrations: [
+    vue(),
+    sitemap({
+      filter: (page) => !isExcludedFromSitemap(page)
+    })
+  ],
   vite: {
     plugins: [tailwindcss()],
     server: {
@@ -27,8 +49,8 @@ export default defineConfig({
     }
   },
   i18n: {
-    locales: ['en', 'zh-CN'],
-    defaultLocale: 'en',
+    locales: [...LOCALES],
+    defaultLocale: DEFAULT_LOCALE,
     routing: {
       prefixDefaultLocale: false
     }
44
apps/website/e2e/demos.spec.ts
Normal file
@@ -0,0 +1,44 @@
import { expect, test } from '@playwright/test'

test.describe('Demo pages @smoke', () => {
  test('demo detail page renders hero and embed', async ({ page }) => {
    await page.goto('/demos/image-to-video')
    await expect(page.getByRole('heading', { level: 1 })).toBeVisible()
    await expect(page.getByRole('heading', { level: 1 })).toContainText(
      'Create a Video from an Image'
    )
    const iframe = page.locator('iframe[title*="Interactive demo"]')
    await expect(iframe).toBeAttached()
  })

  test('demo detail page has transcript section', async ({ page }) => {
    await page.goto('/demos/image-to-video')
    await expect(
      page.getByRole('button', { name: /demo transcript/i })
    ).toBeVisible()
  })

  test('demo detail page has next demo navigation', async ({ page }) => {
    await page.goto('/demos/image-to-video')
    await expect(page.getByText(/what's next/i)).toBeVisible()
  })

  test('demo library page renders', async ({ page }) => {
    await page.goto('/demos')
    await expect(page.getByText('Coming Soon')).toBeVisible()
  })

  test('non-existent demo returns 404', async ({ page }) => {
    const response = await page.goto('/demos/nonexistent')
    expect(response?.status()).toBe(404)
  })

  test('zh-CN demo page renders localized content', async ({ page }) => {
    await page.goto('/zh-CN/demos/image-to-video')
    await expect(page.getByRole('heading', { level: 1 })).toContainText(
      '从图片创建视频'
    )
    const nextDemoLink = page.locator('a[href*="/zh-CN/demos/"]').first()
    await expect(nextDemoLink).toBeAttached()
  })
})
@@ -46,7 +46,7 @@ test.describe('Download page @smoke', () => {
   await expect(githubBtn).toBeVisible()
   await expect(githubBtn).toHaveAttribute(
     'href',
-    'https://github.com/Comfy-Org/ComfyUI'
+    'https://github.com/Comfy-Org/ComfyUI#installing'
   )

   await context.close()
115
apps/website/e2e/payment.spec.ts
Normal file
@@ -0,0 +1,115 @@
import type { Page } from '@playwright/test'
import { expect } from '@playwright/test'

import { externalLinks } from '../src/config/routes'
import { test } from './fixtures/blockExternalMedia'

const CLOUD_URL = externalLinks.cloud
const PLATFORM_USAGE_URL = externalLinks.platformUsage
const SUPPORT_URL = externalLinks.support
const DOCS_SUBSCRIPTION_URL = externalLinks.docsSubscription

async function expectNoIndex(page: Page) {
  await expect(page.locator('meta[name="robots"]')).toHaveAttribute(
    'content',
    'noindex, nofollow'
  )
}

test.describe('Payment success page @smoke', () => {
  test.beforeEach(async ({ page }) => {
    await page.goto('/payment/success')
  })

  test('has correct title and is noindex', async ({ page }) => {
    await expect(page).toHaveTitle('Payment Successful — Comfy')
    await expectNoIndex(page)
  })

  test('shows success heading and subtitle', async ({ page }) => {
    await expect(
      page.getByRole('heading', { name: /Payment successful/i, level: 1 })
    ).toBeVisible()
    await expect(page.getByText(/Thanks for your purchase/i)).toBeVisible()
  })

  test('primary CTA links to Comfy Cloud', async ({ page }) => {
    const cta = page.getByRole('link', { name: /CONTINUE TO COMFY CLOUD/i })
    await expect(cta).toBeVisible()
    await expect(cta).toHaveAttribute('href', CLOUD_URL)
  })

  test('secondary CTA links to platform usage & payments page', async ({
    page
  }) => {
    const cta = page.getByRole('link', { name: /VIEW USAGE & PAYMENTS/i })
    await expect(cta).toBeVisible()
    await expect(cta).toHaveAttribute('href', PLATFORM_USAGE_URL)
  })
})

test.describe('Payment failed page @smoke', () => {
  test.beforeEach(async ({ page }) => {
    await page.goto('/payment/failed')
  })

  test('has correct title and is noindex', async ({ page }) => {
    await expect(page).toHaveTitle('Payment Failed — Comfy')
    await expectNoIndex(page)
  })

  test('shows failure heading and subtitle', async ({ page }) => {
    await expect(
      page.getByRole('heading', {
        name: /Payment was not completed/i,
        level: 1
      })
    ).toBeVisible()
    await expect(page.getByText(/payment didn't go through/i)).toBeVisible()
  })

  test('primary CTA links to support help center', async ({ page }) => {
    const cta = page.getByRole('link', { name: /CONTACT SUPPORT/i })
    await expect(cta).toBeVisible()
    await expect(cta).toHaveAttribute('href', SUPPORT_URL)
  })

  test('secondary CTA links to subscription docs', async ({ page }) => {
    const cta = page.getByRole('link', { name: /READ SUBSCRIPTION DOCS/i })
    await expect(cta).toBeVisible()
    await expect(cta).toHaveAttribute('href', DOCS_SUBSCRIPTION_URL)
  })
})

test.describe('Payment pages zh-CN @smoke', () => {
  test('zh-CN success page renders and links correctly', async ({ page }) => {
    await page.goto('/zh-CN/payment/success')
    await expect(page).toHaveTitle('支付成功 — Comfy')
    await expectNoIndex(page)
    await expect(
      page.getByRole('heading', { name: '支付成功', level: 1 })
    ).toBeVisible()
    await expect(
      page.getByRole('link', { name: '前往 COMFY CLOUD' })
    ).toHaveAttribute('href', CLOUD_URL)
    await expect(
      page.getByRole('link', { name: '查看用量与支付' })
    ).toHaveAttribute('href', PLATFORM_USAGE_URL)
  })

  test('zh-CN failed page renders and links correctly', async ({ page }) => {
    await page.goto('/zh-CN/payment/failed')
    await expect(page).toHaveTitle('支付失败 — Comfy')
    await expectNoIndex(page)
    await expect(
      page.getByRole('heading', { name: '支付未完成', level: 1 })
    ).toBeVisible()
    await expect(page.getByRole('link', { name: '联系支持' })).toHaveAttribute(
      'href',
      SUPPORT_URL
    )
    await expect(
      page.getByRole('link', { name: '查看订阅文档' })
    ).toHaveAttribute('href', DOCS_SUBSCRIPTION_URL)
  })
})
BIN apps/website/public/images/demos/image-to-video-og.png (new file; binary not shown)
BIN apps/website/public/images/demos/image-to-video-thumb.webp (new file; binary not shown)
BIN apps/website/public/images/demos/workflow-templates-og.png (new file; binary not shown)
BIN apps/website/public/images/demos/workflow-templates-thumb.webp (new file; binary not shown)
@@ -29,5 +29,30 @@ Allow: /
 Disallow: /_astro/
 Disallow: /_website/
 Disallow: /_vercel/
+Disallow: /payment/
+
+User-agent: GPTBot
+Allow: /
+
+User-agent: OAI-SearchBot
+Allow: /
+
+User-agent: ChatGPT-User
+Allow: /
+
+User-agent: ClaudeBot
+Allow: /
+
+User-agent: Claude-User
+Allow: /
+
+User-agent: Claude-SearchBot
+Allow: /
+
+User-agent: PerplexityBot
+Allow: /
+
+User-agent: Google-Extended
+Allow: /

 Sitemap: https://comfy.org/sitemap-index.xml
@@ -88,7 +88,7 @@ const contactColumn = {
     { label: t('footer.sales', locale), href: routes.contact },
     {
       label: t('footer.support', locale),
-      href: externalLinks.discord,
+      href: externalLinks.support,
       external: true
     },
     { label: t('footer.press', locale), href: 'mailto:press@comfy.org' }
apps/website/src/components/demos/ArcadeEmbed.vue (new file, 67 lines)
@@ -0,0 +1,67 @@
<script setup lang="ts">
import type { Locale } from '../../i18n/translations'

import { ref } from 'vue'

import { t } from '../../i18n/translations'

const {
  arcadeId,
  title,
  locale = 'en'
} = defineProps<{
  arcadeId: string
  title: string
  locale?: Locale
}>()

const loaded = ref(false)
</script>

<template>
  <section
    class="px-4 py-8 lg:px-20 lg:py-16"
    :aria-label="t('demos.embed.label', locale)"
  >
    <div
      class="relative mx-auto aspect-video max-w-6xl overflow-hidden rounded-4xl border border-white/10"
    >
      <div
        v-if="!loaded"
        aria-hidden="true"
        class="absolute inset-0 flex flex-col items-center justify-center bg-black/50"
      >
        <div
          class="border-primary-comfy-canvas/60 mb-4 size-10 animate-pulse rounded-full border-2"
        />
        <p class="text-primary-warm-gray text-sm">
          {{ t('demos.loading', locale) }}
        </p>
      </div>

      <iframe
        class="size-full"
        :src="`https://demo.arcade.software/${arcadeId}?embed&show_title=0`"
        :title="`${t('demos.embed.label', locale)}: ${title}`"
        loading="lazy"
        allow="clipboard-write"
        referrerpolicy="strict-origin-when-cross-origin"
        @load="loaded = true"
      />
    </div>

    <noscript>
      <p class="text-primary-warm-gray mt-4 text-sm">
        {{ t('demos.noscript', locale) }}
        <a
          class="text-primary-comfy-yellow ml-2 underline"
          :href="`https://demo.arcade.software/${arcadeId}`"
          rel="noopener noreferrer"
          target="_blank"
        >
          {{ t('demos.noscript.link', locale) }}
        </a>
      </p>
    </noscript>
  </section>
</template>
apps/website/src/components/demos/DemoHeroSection.vue (new file, 60 lines)
@@ -0,0 +1,60 @@
<script setup lang="ts">
import type { Locale, TranslationKey } from '../../i18n/translations'

import { t } from '../../i18n/translations'

const {
  label,
  title,
  description,
  difficulty,
  estimatedTime,
  locale = 'en'
} = defineProps<{
  label: string
  title: string
  description: string
  difficulty: 'beginner' | 'intermediate' | 'advanced'
  estimatedTime: string
  locale?: Locale
}>()

const difficultyKey = `demos.difficulty.${difficulty}` as TranslationKey
</script>

<template>
  <section class="pt-16 lg:px-20 lg:pt-40 lg:pb-8">
    <div class="mx-auto flex max-w-4xl flex-col items-center text-center">
      <span
        class="text-primary-comfy-yellow text-xs font-semibold tracking-widest uppercase"
      >
        {{ label }}
      </span>

      <h1
        class="text-primary-comfy-canvas mt-4 text-3xl/tight font-light lg:text-5xl/tight"
      >
        {{ title }}
      </h1>

      <p
        class="text-primary-warm-gray mt-6 max-w-xl text-sm/relaxed lg:text-base/relaxed"
      >
        {{ description }}
      </p>

      <div class="mt-6 flex flex-wrap justify-center gap-3">
        <span
          class="bg-transparency-white-t4 text-primary-comfy-canvas rounded-full px-3 py-1 text-xs font-semibold tracking-wide uppercase"
        >
          {{ t(difficultyKey, locale) }}
        </span>
        <span
          class="bg-transparency-white-t4 text-primary-comfy-canvas rounded-full px-3 py-1 text-xs font-semibold"
        >
          {{ t(estimatedTime as TranslationKey, locale) }}
        </span>
      </div>
    </div>
  </section>
</template>
apps/website/src/components/demos/DemoNavSection.vue (new file, 59 lines)
@@ -0,0 +1,59 @@
<script setup lang="ts">
import type { Locale, TranslationKey } from '../../i18n/translations'

import { t } from '../../i18n/translations'

const {
  nextTitle,
  nextSlug,
  nextThumbnail,
  locale = 'en'
} = defineProps<{
  nextTitle: string
  nextSlug: string
  nextThumbnail: string
  locale?: Locale
}>()

const localePrefix = locale === 'en' ? '' : `/${locale}`
const nextHref = `${localePrefix}/demos/${nextSlug}`
</script>

<template>
  <section class="px-4 py-16 lg:px-20 lg:py-24">
    <h2 class="text-primary-comfy-canvas mb-10 text-2xl font-light lg:text-3xl">
      {{ t('demos.nav.nextDemo' as TranslationKey, locale) }}
    </h2>

    <div
      class="bg-transparency-white-t4 rounded-5xl mx-auto flex flex-col gap-8 p-2 lg:max-w-237.5 lg:flex-row lg:items-center"
    >
      <a :href="nextHref" class="shrink-0 lg:w-1/2">
        <img
          :src="nextThumbnail"
          :alt="nextTitle"
          class="w-full rounded-4xl object-cover"
        />
      </a>

      <div class="flex flex-col gap-6">
        <h3 class="text-primary-comfy-canvas text-xl font-light lg:text-2xl">
          {{ nextTitle }}
        </h3>

        <a :href="nextHref" class="flex items-center gap-3">
          <span
            class="bg-primary-comfy-yellow text-primary-comfy-ink flex size-10 items-center justify-center rounded-full"
          >
            <span class="text-lg font-bold">›</span>
          </span>
          <span
            class="text-primary-comfy-canvas ppformula-text-center text-sm font-semibold tracking-wider uppercase"
          >
            {{ t('demos.nav.viewDemo' as TranslationKey, locale) }}
          </span>
        </a>
      </div>
    </div>
  </section>
</template>
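The locale-prefix pattern in the component above (empty prefix for English, a `/zh-CN`-style prefix otherwise) can be sketched as a standalone helper. This is an illustrative reduction, not code from the diff; the `Locale` union and `demoHref` name are assumptions matching the strings seen here:

```typescript
type Locale = 'en' | 'zh-CN'

// English pages live at the site root; every other locale is
// path-prefixed, e.g. /zh-CN/demos/<slug>.
function demoHref(slug: string, locale: Locale = 'en'): string {
  const localePrefix = locale === 'en' ? '' : `/${locale}`
  return `${localePrefix}/demos/${slug}`
}
```

The same prefix logic also explains why the zh-CN e2e tests navigate to `/zh-CN/demos/...` while the English tests use bare `/demos/...` paths.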
apps/website/src/components/demos/DemoTranscript.vue (new file, 50 lines)
@@ -0,0 +1,50 @@
<script setup lang="ts">
import type { Locale } from '../../i18n/translations'

import { cn } from '@comfyorg/tailwind-utils'
import { ref } from 'vue'

import { t } from '../../i18n/translations'

const { transcript, locale = 'en' } = defineProps<{
  transcript: string
  locale?: Locale
}>()

const expanded = ref(false)
</script>

<template>
  <section
    class="px-4 py-8 lg:px-20 lg:py-12"
    :aria-label="t('demos.transcript.label', locale)"
  >
    <div class="mx-auto max-w-4xl">
      <button
        type="button"
        class="text-primary-comfy-canvas text-left"
        :aria-expanded="expanded"
        @click="expanded = !expanded"
      >
        <span class="text-sm font-semibold tracking-wide uppercase">
          {{ t('demos.transcript.label', locale) }}
        </span>
        <span class="text-primary-warm-gray ml-2 text-xs">
          {{ t('demos.transcript.note', locale) }}
        </span>
      </button>

      <div
        role="region"
        :aria-label="t('demos.transcript.label', locale)"
        :class="
          cn(
            expanded ? 'mt-4' : 'sr-only',
            'text-primary-warm-gray text-sm/relaxed'
          )
        "
        v-html="transcript"
      />
    </div>
  </section>
</template>
@@ -106,6 +106,11 @@ function onNavKeydown(event: KeyboardEvent) {
   navButtons()?.[next]?.focus({ preventScroll: true })
 }

+function onCategoryHover(index: number) {
+  if (isEnabled.value) return
+  activeCategory.value = index
+}
+
 function travelRange(el: HTMLElement) {
   if (window.matchMedia('(min-width: 1024px)').matches) return 150

@@ -116,31 +121,29 @@ function travelRange(el: HTMLElement) {
 }

 const pinScrubEnd = `+=${categories.length * VH_PER_ITEM}%`
+const parallaxMediaQuery = '(max-width: 1023px)'
 useParallax([rightImgRef], {
   trigger: sectionRef,
   fromY: (el) => -travelRange(el),
   y: (el) => travelRange(el),
   start: 'top top',
-  end: pinScrubEnd
+  end: pinScrubEnd,
+  mediaQuery: parallaxMediaQuery
 })
 useParallax([leftImgRef], {
   trigger: sectionRef,
   fromY: (el) => travelRange(el),
   y: (el) => -travelRange(el),
   start: 'top top',
-  end: pinScrubEnd
+  end: pinScrubEnd,
+  mediaQuery: parallaxMediaQuery
 })
 </script>

 <template>
   <section
     ref="sectionRef"
-    class="bg-primary-comfy-ink relative isolate overflow-x-clip pt-20 lg:h-[calc(100vh+60px)] lg:py-24"
+    :class="
+      cn(
+        'bg-primary-comfy-ink relative isolate overflow-x-clip pt-20 lg:py-24',
+        isEnabled && 'lg:h-[calc(100vh+60px)]'
+      )
+    "
   >
     <svg class="absolute size-0" width="0" height="0" aria-hidden="true">
       <defs>

@@ -202,6 +205,8 @@ useParallax([leftImgRef], {
       "
       :aria-current="index === activeCategory ? 'true' : undefined"
       @click="scrollToIndex(index)"
+      @mouseenter="onCategoryHover(index)"
+      @focus="onCategoryHover(index)"
     >
       {{ category.label }}
     </button>
apps/website/src/components/payment/PaymentStatusSection.vue (new file, 101 lines)
@@ -0,0 +1,101 @@
<script setup lang="ts">
import { cn } from '@comfyorg/tailwind-utils'

import { externalLinks } from '../../config/routes'
import type { Locale } from '../../i18n/translations'
import { t } from '../../i18n/translations'
import BrandButton from '../common/BrandButton.vue'
import SectionLabel from '../common/SectionLabel.vue'

// Display-only thank-you / failure pages: payment state is verified
// server-side via Stripe webhooks (see comfy-api). These pages exist
// solely as the redirect target for Stripe Checkout.

type Status = 'success' | 'failed'

const { status, locale = 'en' } = defineProps<{
  status: Status
  locale?: Locale
}>()

const primaryHref =
  status === 'success' ? externalLinks.cloud : externalLinks.support
const secondaryHref =
  status === 'success'
    ? externalLinks.platformUsage
    : externalLinks.docsSubscription

const iconRingClass =
  status === 'success'
    ? 'border-primary-comfy-yellow text-primary-comfy-yellow'
    : 'border-secondary-mauve text-secondary-mauve'
</script>

<template>
  <section
    class="flex min-h-[calc(100dvh-12rem)] items-center justify-center px-6 py-16 lg:py-24"
  >
    <div class="flex max-w-2xl flex-col items-center gap-6 text-center">
      <div
        :class="
          cn(
            'flex size-20 items-center justify-center rounded-full border-2',
            iconRingClass
          )
        "
        aria-hidden="true"
      >
        <svg
          v-if="status === 'success'"
          viewBox="0 0 24 24"
          fill="none"
          stroke="currentColor"
          stroke-width="2.5"
          stroke-linecap="round"
          stroke-linejoin="round"
          class="size-10"
        >
          <path d="M5 12.5l4.5 4.5L19 7.5" />
        </svg>
        <svg
          v-else
          viewBox="0 0 24 24"
          fill="none"
          stroke="currentColor"
          stroke-width="2.5"
          stroke-linecap="round"
          stroke-linejoin="round"
          class="size-10"
        >
          <path d="M6 6l12 12" />
          <path d="M18 6L6 18" />
        </svg>
      </div>

      <SectionLabel>{{ t(`payment.${status}.label`, locale) }}</SectionLabel>

      <h1
        class="text-primary-comfy-canvas text-4xl/tight font-light md:text-5xl/tight lg:text-6xl/tight"
      >
        {{ t(`payment.${status}.title`, locale) }}
      </h1>

      <p
        class="text-primary-comfy-canvas/80 max-w-xl text-base font-light lg:text-lg"
      >
        {{ t(`payment.${status}.subtitle`, locale) }}
      </p>

      <div
        class="mt-2 flex flex-col items-stretch gap-3 sm:flex-row sm:items-center sm:justify-center"
      >
        <BrandButton :href="primaryHref" variant="solid" size="nav">
          {{ t(`payment.${status}.primaryCta`, locale) }}
        </BrandButton>
        <BrandButton :href="secondaryHref" variant="outline" size="nav">
          {{ t(`payment.${status}.secondaryCta`, locale) }}
        </BrandButton>
      </div>
    </div>
  </section>
</template>
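The status-to-CTA mapping in the component above (success routes users onward, failure routes them to help) can be isolated as a small pure function. A minimal sketch, assuming a simplified `links` object with placeholder URLs standing in for `externalLinks` from `src/config/routes`:

```typescript
type Status = 'success' | 'failed'

// Placeholder URLs; the real values live in src/config/routes.
const links = {
  cloud: 'https://example.com/cloud',
  support: 'https://example.com/support',
  platformUsage: 'https://example.com/usage',
  docsSubscription: 'https://example.com/docs'
}

// Success: primary CTA continues to the product, secondary shows usage.
// Failure: primary CTA contacts support, secondary opens the docs.
function ctaHrefs(status: Status): { primary: string; secondary: string } {
  return {
    primary: status === 'success' ? links.cloud : links.support,
    secondary:
      status === 'success' ? links.platformUsage : links.docsSubscription
  }
}
```

Keeping this selection in the script block (rather than in the template) is what lets the e2e specs assert exact `href` values per page.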
@@ -28,7 +28,11 @@ const { locale = 'en' } = defineProps<{ locale?: Locale }>()
     <!-- CTA buttons -->
     <div class="mt-10 flex flex-col gap-4 lg:flex-row">
       <DownloadLocalButton :locale />
-      <BrandButton :href="externalLinks.github" variant="outline" size="lg">
+      <BrandButton
+        :href="externalLinks.githubInstall"
+        variant="outline"
+        size="lg"
+      >
         <span class="inline-flex items-center gap-2">
           <i
             class="icon-mask size-5 -translate-y-px mask-[url('/icons/social/github.svg')]"
@@ -323,7 +323,7 @@ onUnmounted(() => {
     <div class="mt-8 flex flex-col gap-4 lg:flex-row">
       <DownloadLocalButton :locale class="lg:min-w-60 lg:p-4" />
       <BrandButton
-        :href="externalLinks.github"
+        :href="externalLinks.githubInstall"
         variant="outline"
         size="lg"
         class="lg:min-w-60 lg:p-4"
@@ -20,6 +20,9 @@ interface PinScrubOptions {
 /** Viewport-height percentage each category occupies in the scroll distance. */
 export const VH_PER_ITEM = 20

+/** Pin/scrub is mobile-only — desktop uses hover-based category switching. */
+const PIN_SCRUB_MEDIA_QUERY = '(max-width: 1023px)'
+
 function interpolateY(
   index: number,
   buttonCenters: number[],

@@ -66,7 +69,8 @@ export function usePinScrub(refs: PinScrubRefs, options: PinScrubOptions) {
     !refs.section.value ||
     !refs.content.value ||
     !refs.nav.value ||
-    prefersReducedMotion()
+    prefersReducedMotion() ||
+    !window.matchMedia(PIN_SCRUB_MEDIA_QUERY).matches
   )
     return
   const section: HTMLElement = refs.section.value
apps/website/src/config/demos.ts (new file, 68 lines)
@@ -0,0 +1,68 @@
import type { TranslationKey } from '../i18n/translations'

interface Demo {
  readonly slug: string
  readonly arcadeId: string
  readonly category: TranslationKey
  readonly title: TranslationKey
  readonly description: TranslationKey
  readonly ogImage: string
  readonly thumbnail: string
  readonly estimatedTime: TranslationKey
  readonly durationIso: string
  readonly difficulty: 'beginner' | 'intermediate' | 'advanced'
  readonly tags: readonly string[]
  readonly transcript?: TranslationKey
  readonly publishedDate: string
  readonly modifiedDate: string
}

export const demos: readonly Demo[] = [
  {
    slug: 'image-to-video',
    arcadeId: 'F3CTalnGnR4R0qJIVMNX',
    category: 'demos.category.templates',
    title: 'demos.image-to-video.title',
    description: 'demos.image-to-video.description',
    transcript: 'demos.image-to-video.transcript',
    ogImage: '/images/demos/image-to-video-og.png',
    thumbnail: '/images/demos/image-to-video-thumb.webp',
    estimatedTime: 'demos.duration.2min',
    durationIso: 'PT2M',
    difficulty: 'beginner',
    tags: ['templates', 'image', 'video'],
    publishedDate: '2026-04-19',
    modifiedDate: '2026-04-19'
  },
  {
    slug: 'workflow-templates',
    arcadeId: 'KhqcXDElnFWklo7ACBqE',
    category: 'demos.category.gettingStarted',
    title: 'demos.workflow-templates.title',
    description: 'demos.workflow-templates.description',
    transcript: 'demos.workflow-templates.transcript',
    ogImage: '/images/demos/workflow-templates-og.png',
    thumbnail: '/images/demos/workflow-templates-thumb.webp',
    estimatedTime: 'demos.duration.2min',
    durationIso: 'PT2M',
    difficulty: 'beginner',
    tags: ['getting-started', 'templates', 'workflow'],
    publishedDate: '2026-04-19',
    modifiedDate: '2026-04-19'
  }
]

export function getDemoBySlug(slug: string): Demo | undefined {
  return demos.find((demo) => demo.slug === slug)
}

export function getNextDemo(slug: string): Demo {
  if (demos.length === 0) {
    throw new Error('No demos configured')
  }
  const index = demos.findIndex((demo) => demo.slug === slug)
  if (index === -1) {
    throw new Error(`Unknown demo slug: ${slug}`)
  }
  return demos[(index + 1) % demos.length]
}
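The modular index in `getNextDemo` above means the last demo's "next" link wraps back to the first demo, so the nav section never dead-ends. A minimal standalone sketch of just that rotation, using a trimmed `Demo` shape (for illustration only) and the two slugs registered in this diff:

```typescript
interface Demo {
  readonly slug: string
}

// Two entries, matching the slugs registered in demos.ts.
const demos: readonly Demo[] = [
  { slug: 'image-to-video' },
  { slug: 'workflow-templates' }
]

// Modular indexing: the demo after the last one wraps to the first.
function getNextDemo(slug: string): Demo {
  const index = demos.findIndex((demo) => demo.slug === slug)
  if (index === -1) throw new Error(`Unknown demo slug: ${slug}`)
  return demos[(index + 1) % demos.length]
}
```

This wrap-around is why the zh-CN e2e test can assert that every demo page has a next-demo link.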
@@ -11,6 +11,7 @@ const baseRoutes = {
   about: '/about',
   careers: '/careers',
   customers: '/customers',
+  demos: '/demos',
   termsOfService: '/terms-of-service',
   privacyPolicy: '/privacy-policy',
   contact: '/contact'

@@ -33,8 +34,12 @@ export const externalLinks = {
   discord: 'https://discord.com/invite/comfyorg',
   docs: 'https://docs.comfy.org/',
   docsApi: 'https://docs.comfy.org/api-reference/cloud',
+  docsSubscription: 'https://docs.comfy.org/support/subscription/subscribing',
   github: 'https://github.com/Comfy-Org/ComfyUI',
+  githubInstall: 'https://github.com/Comfy-Org/ComfyUI#installing',
   platform: 'https://platform.comfy.org',
+  platformUsage: 'https://platform.comfy.org/profile/usage',
+  support: 'https://support.comfy.org/hc/en-us',
   workflows: 'https://comfy.org/workflows',
   youtube: 'https://www.youtube.com/@ComfyOrg'
 } as const
@@ -3542,6 +3542,80 @@ const translations = {
     'zh-CN': '我们会为您处理请求。'
   },

+  'demos.category.templates': { en: 'TEMPLATES', 'zh-CN': '模板' },
+  'demos.category.gettingStarted': { en: 'GETTING STARTED', 'zh-CN': '入门' },
+
+  'demos.image-to-video.title': {
+    en: 'Create a Video from an Image',
+    'zh-CN': '从图片创建视频'
+  },
+  'demos.image-to-video.description': {
+    en: 'Learn how to use the Image to Video workflow template in ComfyUI to generate short video clips from a single image.',
+    'zh-CN':
+      '了解如何使用 ComfyUI 中的图片转视频工作流模板,从单张图片生成短视频。'
+  },
+  'demos.image-to-video.transcript': {
+    en: '<ol><li><strong>Open ComfyUI</strong> — Launch the application and you\'ll see the node-based workflow canvas where all your AI pipelines are built.</li><li><strong>Browse templates</strong> — Click the workflow templates button in the sidebar to browse available starting points.</li><li><strong>Select Image to Video</strong> — Find and select the "Image to Video" template from the list to load it onto your canvas.</li><li><strong>Upload your image</strong> — Click the image upload node and select the source image you want to animate.</li><li><strong>Run the workflow</strong> — Click the "Queue" button to execute the workflow and generate your video output.</li></ol>',
+    'zh-CN':
+      '<ol><li><strong>打开 ComfyUI</strong> — 启动应用程序,您将看到基于节点的工作流画布。</li><li><strong>浏览模板</strong> — 点击侧栏中的工作流模板按钮,浏览可用模板。</li><li><strong>选择图片转视频</strong> — 从列表中找到并选择"图片转视频"模板。</li><li><strong>上传图片</strong> — 点击图片上传节点,选择要动画化的源图片。</li><li><strong>运行工作流</strong> — 点击"排队"按钮执行工作流并生成视频输出。</li></ol>'
+  },
+
+  'demos.workflow-templates.title': {
+    en: 'Browse Workflow Templates',
+    'zh-CN': '浏览工作流模板'
+  },
+  'demos.workflow-templates.description': {
+    en: "Explore ComfyUI's built-in workflow templates to quickly get started with common AI generation tasks.",
+    'zh-CN': '探索 ComfyUI 内置的工作流模板,快速开始常见的 AI 生成任务。'
+  },
+  'demos.workflow-templates.transcript': {
+    en: '<ol><li><strong>Open the template browser</strong> — Click the templates icon in the ComfyUI sidebar to open the template library.</li><li><strong>Browse categories</strong> — Templates are organized by task: image generation, video, upscaling, and more.</li><li><strong>Preview a template</strong> — Hover over any template to see a preview of its workflow and expected output.</li><li><strong>Load and customize</strong> — Click to load a template, then modify parameters to fit your needs.</li></ol>',
+    'zh-CN':
+      '<ol><li><strong>打开模板浏览器</strong> — 点击 ComfyUI 侧栏中的模板图标。</li><li><strong>浏览分类</strong> — 模板按任务分类:图像生成、视频、放大等。</li><li><strong>预览模板</strong> — 将鼠标悬停在模板上查看预览。</li><li><strong>加载并自定义</strong> — 点击加载模板,然后修改参数。</li></ol>'
+  },
+
+  'demos.nav.nextDemo': { en: "What's Next", 'zh-CN': '下一个演示' },
+  'demos.nav.viewDemo': { en: 'View Demo', 'zh-CN': '查看演示' },
+  'demos.nav.allDemos': { en: 'All Demos', 'zh-CN': '所有演示' },
+  'demos.transcript.label': { en: 'Demo transcript', 'zh-CN': '演示文字记录' },
+  'demos.transcript.note': {
+    en: '(for accessibility & search)',
+    'zh-CN': '(无障碍和搜索)'
+  },
+  'demos.loading': {
+    en: 'Loading interactive demo…',
+    'zh-CN': '正在加载互动演示…'
+  },
+  'demos.noscript': {
+    en: 'This interactive demo requires JavaScript.',
+    'zh-CN': '此互动演示需要 JavaScript。'
+  },
+  'demos.noscript.link': {
+    en: 'View on Arcade →',
+    'zh-CN': '在 Arcade 上查看 →'
+  },
+  'demos.duration.2min': { en: '~2 min', 'zh-CN': '~2 分钟' },
+  'demos.difficulty.beginner': { en: 'Beginner', 'zh-CN': '入门' },
+  'demos.difficulty.intermediate': {
+    en: 'Intermediate',
+    'zh-CN': '中级'
+  },
+  'demos.difficulty.advanced': { en: 'Advanced', 'zh-CN': '高级' },
+  'demos.embed.label': {
+    en: 'Interactive demo',
+    'zh-CN': '互动演示'
+  },
+  'demos.comingSoon.title': {
+    en: 'Coming Soon',
+    'zh-CN': '即将推出'
+  },
+  'demos.comingSoon.body': {
+    en: 'This page is being redesigned. Check back soon.',
+    'zh-CN': '此页面正在重新设计中,请稍后再来。'
+  },
+  'demos.breadcrumb.home': { en: 'Home', 'zh-CN': '首页' },
+  'demos.breadcrumb.demos': { en: 'Demos', 'zh-CN': '演示' },
+
   'customers.story.whatsNext': {
     en: "What's next?",
     'zh-CN': '接下来看什么?'
@@ -3592,6 +3666,49 @@ const translations = {
   'customers.feedback.role3': {
     en: 'Head of AI at Creative Studios',
     'zh-CN': 'Creative Studios AI 负责人'
   },
+
+  // Payment status pages
+  'payment.success.label': {
+    en: 'PAYMENT',
+    'zh-CN': '支付'
+  },
+  'payment.success.title': {
+    en: 'Payment successful',
+    'zh-CN': '支付成功'
+  },
+  'payment.success.subtitle': {
+    en: "Thanks for your purchase. Your account has been credited and you're ready to keep building.",
+    'zh-CN': '感谢您的购买。您的账户已充值完成,可以继续创作了。'
+  },
+  'payment.success.primaryCta': {
+    en: 'CONTINUE TO COMFY CLOUD',
+    'zh-CN': '前往 COMFY CLOUD'
+  },
+  'payment.success.secondaryCta': {
+    en: 'VIEW USAGE & PAYMENTS',
+    'zh-CN': '查看用量与支付'
+  },
+  'payment.failed.label': {
+    en: 'PAYMENT',
+    'zh-CN': '支付'
+  },
+  'payment.failed.title': {
+    en: 'Payment was not completed',
+    'zh-CN': '支付未完成'
+  },
+  'payment.failed.subtitle': {
+    en: "Your payment didn't go through and you have not been charged. Reach out to support or read the subscription docs if you need help.",
+    'zh-CN':
+      '您的支付未能完成,未发生扣款。如需帮助,请联系支持或查阅订阅文档。'
+  },
+  'payment.failed.primaryCta': {
+    en: 'CONTACT SUPPORT',
+    'zh-CN': '联系支持'
+  },
+  'payment.failed.secondaryCta': {
+    en: 'READ SUBSCRIPTION DOCS',
+    'zh-CN': '查看订阅文档'
+  }
 } as const satisfies Record<string, Record<Locale, string>>
@@ -109,6 +109,7 @@ const websiteJsonLd = {
   )}

   <ClientRouter />
+  <slot name="head" />
 </head>
 <body class="bg-primary-comfy-ink text-white font-formula antialiased overflow-x-clip">
   {gtmEnabled && (
apps/website/src/pages/demos/[slug].astro (new file, 139 lines)
@@ -0,0 +1,139 @@
---
import type { GetStaticPaths } from 'astro'
import BaseLayout from '../../layouts/BaseLayout.astro'
import DemoHeroSection from '../../components/demos/DemoHeroSection.vue'
import ArcadeEmbed from '../../components/demos/ArcadeEmbed.vue'
import DemoTranscript from '../../components/demos/DemoTranscript.vue'
import DemoNavSection from '../../components/demos/DemoNavSection.vue'
import { demos, getDemoBySlug, getNextDemo } from '../../config/demos'
import { t } from '../../i18n/translations'

export const getStaticPaths: GetStaticPaths = () => {
  return demos.map((demo) => ({
    params: { slug: demo.slug }
  }))
}

const { slug } = Astro.params
const demo = getDemoBySlug(slug as string)!
const nextDemo = getNextDemo(slug as string)
const title = t(demo.title)
const description = t(demo.description)
const canonicalURL = new URL(`/demos/${demo.slug}`, Astro.site)

const howToJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'HowTo',
  name: title,
  description,
  image: new URL(demo.ogImage, Astro.site).href,
  totalTime: demo.durationIso,
  datePublished: demo.publishedDate,
  dateModified: demo.modifiedDate,
  author: {
    '@type': 'Organization',
    name: 'Comfy Org',
    url: 'https://comfy.org'
  }
}

const learningResourceJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'LearningResource',
  name: title,
  description,
  learningResourceType: 'interactive tutorial',
  interactivityType: 'active',
  educationalLevel:
    demo.difficulty === 'beginner'
      ? 'Beginner'
      : demo.difficulty === 'intermediate'
        ? 'Intermediate'
        : 'Advanced',
  url: canonicalURL.href,
  datePublished: demo.publishedDate,
  dateModified: demo.modifiedDate,
  author: {
    '@type': 'Organization',
    name: 'Comfy Org',
    url: 'https://comfy.org'
  }
}

const breadcrumbJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'BreadcrumbList',
  itemListElement: [
    {
      '@type': 'ListItem',
      position: 1,
      name: t('demos.breadcrumb.home'),
      item: 'https://comfy.org'
    },
    {
      '@type': 'ListItem',
      position: 2,
      name: t('demos.breadcrumb.demos'),
      item: 'https://comfy.org/demos'
    },
    {
      '@type': 'ListItem',
      position: 3,
      name: title
    }
  ]
}
---

<BaseLayout
  title={`${title} — Comfy`}
  description={description}
  ogImage={demo.ogImage}
>
  <Fragment slot="head">
    <meta property="article:published_time" content={demo.publishedDate} />
    <meta property="article:modified_time" content={demo.modifiedDate} />
    <script
      is:inline
      type="application/ld+json"
      set:html={JSON.stringify(howToJsonLd)}
    />
    <script
      is:inline
      type="application/ld+json"
      set:html={JSON.stringify(learningResourceJsonLd)}
    />
    <script
      is:inline
      type="application/ld+json"
      set:html={JSON.stringify(breadcrumbJsonLd)}
    />
    <link rel="preconnect" href="https://demo.arcade.software" />
  </Fragment>

  <DemoHeroSection
    label={t(demo.category)}
    title={title}
    description={description}
    difficulty={demo.difficulty}
    estimatedTime={demo.estimatedTime}
  />

  <ArcadeEmbed
    arcadeId={demo.arcadeId}
    title={title}
    client:load
  />

  {demo.transcript && (
    <DemoTranscript
      transcript={t(demo.transcript)}
      client:visible
    />
  )}

  <DemoNavSection
    nextTitle={t(nextDemo.title)}
    nextSlug={nextDemo.slug}
    nextThumbnail={nextDemo.thumbnail}
|
||||
/>
|
||||
</BaseLayout>
|
||||
8 apps/website/src/pages/demos/index.astro Normal file
@@ -0,0 +1,8 @@
---
import BaseLayout from '../../layouts/BaseLayout.astro'
import ComingSoon from '../../components/common/ComingSoon.astro'
---

<BaseLayout title="Demos — Comfy" description="Interactive demos and tutorials for ComfyUI.">
  <ComingSoon />
</BaseLayout>
12 apps/website/src/pages/payment/failed.astro Normal file
@@ -0,0 +1,12 @@
---
import BaseLayout from '../../layouts/BaseLayout.astro'
import PaymentStatusSection from '../../components/payment/PaymentStatusSection.vue'
---

<BaseLayout
  title="Payment Failed — Comfy"
  description="Your payment was not completed."
  noindex
>
  <PaymentStatusSection status="failed" />
</BaseLayout>
12 apps/website/src/pages/payment/success.astro Normal file
@@ -0,0 +1,12 @@
---
import BaseLayout from '../../layouts/BaseLayout.astro'
import PaymentStatusSection from '../../components/payment/PaymentStatusSection.vue'
---

<BaseLayout
  title="Payment Successful — Comfy"
  description="Your payment was processed successfully."
  noindex
>
  <PaymentStatusSection status="success" />
</BaseLayout>
143 apps/website/src/pages/zh-CN/demos/[slug].astro Normal file
@@ -0,0 +1,143 @@
---
import type { GetStaticPaths } from 'astro'
import BaseLayout from '../../../layouts/BaseLayout.astro'
import DemoHeroSection from '../../../components/demos/DemoHeroSection.vue'
import ArcadeEmbed from '../../../components/demos/ArcadeEmbed.vue'
import DemoTranscript from '../../../components/demos/DemoTranscript.vue'
import DemoNavSection from '../../../components/demos/DemoNavSection.vue'
import { demos, getDemoBySlug, getNextDemo } from '../../../config/demos'
import { t } from '../../../i18n/translations'

export const getStaticPaths: GetStaticPaths = () => {
  return demos.map((demo) => ({
    params: { slug: demo.slug }
  }))
}

const { slug } = Astro.params
const demo = getDemoBySlug(slug as string)!
const nextDemo = getNextDemo(slug as string)
const title = t(demo.title, 'zh-CN')
const description = t(demo.description, 'zh-CN')
const canonicalURL = new URL(`/zh-CN/demos/${demo.slug}`, Astro.site)

const howToJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'HowTo',
  name: title,
  description,
  image: new URL(demo.ogImage, Astro.site).href,
  totalTime: demo.durationIso,
  datePublished: demo.publishedDate,
  dateModified: demo.modifiedDate,
  author: {
    '@type': 'Organization',
    name: 'Comfy Org',
    url: 'https://comfy.org'
  }
}

const learningResourceJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'LearningResource',
  name: title,
  description,
  learningResourceType: 'interactive tutorial',
  interactivityType: 'active',
  educationalLevel: demo.difficulty === 'beginner'
    ? 'Beginner'
    : demo.difficulty === 'intermediate'
      ? 'Intermediate'
      : 'Advanced',
  url: canonicalURL.href,
  datePublished: demo.publishedDate,
  dateModified: demo.modifiedDate,
  author: {
    '@type': 'Organization',
    name: 'Comfy Org',
    url: 'https://comfy.org'
  }
}

const breadcrumbJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'BreadcrumbList',
  itemListElement: [
    {
      '@type': 'ListItem',
      position: 1,
      name: t('demos.breadcrumb.home', 'zh-CN'),
      item: 'https://comfy.org/zh-CN'
    },
    {
      '@type': 'ListItem',
      position: 2,
      name: t('demos.breadcrumb.demos', 'zh-CN'),
      item: 'https://comfy.org/zh-CN/demos'
    },
    {
      '@type': 'ListItem',
      position: 3,
      name: title
    }
  ]
}
---

<BaseLayout
  title={`${title} — Comfy`}
  description={description}
  ogImage={demo.ogImage}
>
  <Fragment slot="head">
    <meta property="article:published_time" content={demo.publishedDate} />
    <meta property="article:modified_time" content={demo.modifiedDate} />
    <script
      is:inline
      type="application/ld+json"
      set:html={JSON.stringify(howToJsonLd)}
    />
    <script
      is:inline
      type="application/ld+json"
      set:html={JSON.stringify(learningResourceJsonLd)}
    />
    <script
      is:inline
      type="application/ld+json"
      set:html={JSON.stringify(breadcrumbJsonLd)}
    />
    <link rel="preconnect" href="https://demo.arcade.software" />
  </Fragment>

  <DemoHeroSection
    label={t(demo.category, 'zh-CN')}
    title={title}
    description={description}
    difficulty={demo.difficulty}
    estimatedTime={demo.estimatedTime}
    locale="zh-CN"
  />

  <ArcadeEmbed
    arcadeId={demo.arcadeId}
    title={title}
    locale="zh-CN"
    client:load
  />

  {demo.transcript && (
    <DemoTranscript
      transcript={t(demo.transcript, 'zh-CN')}
      locale="zh-CN"
      client:visible
    />
  )}

  <DemoNavSection
    nextTitle={t(nextDemo.title, 'zh-CN')}
    nextSlug={nextDemo.slug}
    nextThumbnail={nextDemo.thumbnail}
    locale="zh-CN"
  />
</BaseLayout>
17 apps/website/src/pages/zh-CN/demos/index.astro Normal file
@@ -0,0 +1,17 @@
---
import BaseLayout from '../../../layouts/BaseLayout.astro'
import { t } from '../../../i18n/translations'
---

<BaseLayout title="演示 — Comfy" description="ComfyUI 的互动演示和教程。">
  <section class="flex min-h-[60vh] items-center justify-center px-6">
    <div class="text-center">
      <h1 class="text-primary-comfy-canvas text-4xl font-light">
        {t('demos.comingSoon.title', 'zh-CN')}
      </h1>
      <p class="text-primary-warm-gray mt-4 text-sm">
        {t('demos.comingSoon.body', 'zh-CN')}
      </p>
    </div>
  </section>
</BaseLayout>
8 apps/website/src/pages/zh-CN/payment/failed.astro Normal file
@@ -0,0 +1,8 @@
---
import BaseLayout from '../../../layouts/BaseLayout.astro'
import PaymentStatusSection from '../../../components/payment/PaymentStatusSection.vue'
---

<BaseLayout title="支付失败 — Comfy" description="您的支付未能完成。" noindex>
  <PaymentStatusSection status="failed" locale="zh-CN" />
</BaseLayout>
8 apps/website/src/pages/zh-CN/payment/success.astro Normal file
@@ -0,0 +1,8 @@
---
import BaseLayout from '../../../layouts/BaseLayout.astro'
import PaymentStatusSection from '../../../components/payment/PaymentStatusSection.vue'
---

<BaseLayout title="支付成功 — Comfy" description="您的支付已成功完成。" noindex>
  <PaymentStatusSection status="success" locale="zh-CN" />
</BaseLayout>
68 browser_tests/assets/missing/node_replacement_multi.json Normal file
@@ -0,0 +1,68 @@
{
  "last_node_id": 4,
  "last_link_id": 2,
  "nodes": [
    {
      "id": 1,
      "type": "E2E_OldSampler",
      "pos": [100, 100],
      "size": [400, 262],
      "flags": {},
      "order": 0,
      "mode": 0,
      "inputs": [
        { "name": "model", "type": "MODEL", "link": null },
        { "name": "positive", "type": "CONDITIONING", "link": null },
        { "name": "negative", "type": "CONDITIONING", "link": null },
        { "name": "latent_image", "type": "LATENT", "link": null }
      ],
      "outputs": [
        {
          "name": "LATENT",
          "type": "LATENT",
          "links": [],
          "slot_index": 0
        }
      ],
      "properties": { "Node name for S&R": "E2E_OldSampler" },
      "widgets_values": [42, 20, 7, "euler", "normal"]
    },
    {
      "id": 2,
      "type": "E2E_OldUpscaler",
      "pos": [500, 100],
      "size": [400, 200],
      "flags": {},
      "order": 1,
      "mode": 0,
      "inputs": [{ "name": "image", "type": "IMAGE", "link": null }],
      "outputs": [
        {
          "name": "IMAGE",
          "type": "IMAGE",
          "links": [2],
          "slot_index": 0
        }
      ],
      "properties": { "Node name for S&R": "E2E_OldUpscaler" },
      "widgets_values": ["lanczos", 1.5]
    },
    {
      "id": 3,
      "type": "SaveImage",
      "pos": [900, 100],
      "size": [400, 200],
      "flags": {},
      "order": 2,
      "mode": 0,
      "inputs": [{ "name": "images", "type": "IMAGE", "link": 2 }],
      "properties": { "Node name for S&R": "SaveImage" },
      "widgets_values": ["ComfyUI"]
    }
  ],
  "links": [[2, 2, 0, 3, 0, "IMAGE"]],
  "groups": [],
  "config": {},
  "extra": { "ds": { "scale": 1, "offset": [0, 0] } },
  "version": 0.4
}
59 browser_tests/assets/missing/node_replacement_simple.json Normal file
@@ -0,0 +1,59 @@
{
  "last_node_id": 3,
  "last_link_id": 1,
  "nodes": [
    {
      "id": 1,
      "type": "E2E_OldSampler",
      "pos": [100, 100],
      "size": [400, 262],
      "flags": {},
      "order": 0,
      "mode": 0,
      "inputs": [
        { "name": "model", "type": "MODEL", "link": null },
        { "name": "positive", "type": "CONDITIONING", "link": null },
        { "name": "negative", "type": "CONDITIONING", "link": null },
        { "name": "latent_image", "type": "LATENT", "link": null }
      ],
      "outputs": [
        {
          "name": "LATENT",
          "type": "LATENT",
          "links": [1],
          "slot_index": 0
        }
      ],
      "properties": { "Node name for S&R": "E2E_OldSampler" },
      "widgets_values": [42, 20, 7, "euler", "normal"]
    },
    {
      "id": 2,
      "type": "VAEDecode",
      "pos": [500, 100],
      "size": [400, 200],
      "flags": {},
      "order": 1,
      "mode": 0,
      "inputs": [
        { "name": "samples", "type": "LATENT", "link": 1 },
        { "name": "vae", "type": "VAE", "link": null }
      ],
      "outputs": [
        {
          "name": "IMAGE",
          "type": "IMAGE",
          "links": [],
          "slot_index": 0
        }
      ],
      "properties": { "Node name for S&R": "VAEDecode" },
      "widgets_values": []
    }
  ],
  "links": [[1, 1, 0, 2, 0, "LATENT"]],
  "groups": [],
  "config": {},
  "extra": { "ds": { "scale": 1, "offset": [0, 0] } },
  "version": 0.4
}
@@ -505,6 +505,7 @@ export const comfyPageFixture = base.extend<{
      'Comfy.userId': userId,
      // Set tutorial completed to true to avoid loading the tutorial workflow.
      'Comfy.TutorialCompleted': true,
      'Comfy.Queue.MaxHistoryItems': 64,
      'Comfy.SnapToGrid.GridSize': testComfySnapToGridGridSize,
      'Comfy.VueNodes.AutoScaleLayout': false,
      // Disable toast warning about version compatibility, as they may or
47 browser_tests/fixtures/data/nodeReplacements.ts Normal file
@@ -0,0 +1,47 @@
import type { NodeReplacementResponse } from '@/platform/nodeReplacement/types'

/**
 * Mock node replacement mappings for e2e tests.
 *
 * Maps fake "missing" node types (E2E_OldSampler, E2E_OldUpscaler) to real
 * core node types that are always available in the test server.
 */
export const mockNodeReplacements: NodeReplacementResponse = {
  E2E_OldSampler: [
    {
      new_node_id: 'KSampler',
      old_node_id: 'E2E_OldSampler',
      old_widget_ids: ['seed', 'steps', 'cfg', 'sampler_name', 'scheduler'],
      input_mapping: [
        { new_id: 'model', old_id: 'model' },
        { new_id: 'positive', old_id: 'positive' },
        { new_id: 'negative', old_id: 'negative' },
        { new_id: 'latent_image', old_id: 'latent_image' },
        { new_id: 'seed', old_id: 'seed' },
        { new_id: 'steps', old_id: 'steps' },
        { new_id: 'cfg', old_id: 'cfg' },
        { new_id: 'sampler_name', old_id: 'sampler_name' },
        { new_id: 'scheduler', old_id: 'scheduler' }
      ],
      output_mapping: [{ new_idx: 0, old_idx: 0 }]
    }
  ],
  E2E_OldUpscaler: [
    {
      new_node_id: 'ImageScaleBy',
      old_node_id: 'E2E_OldUpscaler',
      old_widget_ids: ['upscale_method', 'scale_by'],
      input_mapping: [
        { new_id: 'image', old_id: 'image' },
        { new_id: 'upscale_method', old_id: 'upscale_method' },
        { new_id: 'scale_by', old_id: 'scale_by' }
      ],
      output_mapping: [{ new_idx: 0, old_idx: 0 }]
    }
  ]
}

/** Subset containing only the E2E_OldSampler replacement. */
export const mockNodeReplacementsSingle: NodeReplacementResponse = {
  E2E_OldSampler: mockNodeReplacements.E2E_OldSampler
}
136 browser_tests/fixtures/helpers/MaskEditorHelper.ts Normal file
@@ -0,0 +1,136 @@
import type { Locator } from '@playwright/test'
import { expect } from '@playwright/test'

import { comfyPageFixture } from '@e2e/fixtures/ComfyPage'
import type { ComfyPage } from '@e2e/fixtures/ComfyPage'

const MASK_CANVAS_INDEX = 2
const RGB_CANVAS_INDEX = 1

export type BrushSliderLabel = 'thickness'

export class MaskEditorHelper {
  constructor(private comfyPage: ComfyPage) {}

  private get page() {
    return this.comfyPage.page
  }

  async loadImageOnNode() {
    await this.comfyPage.workflow.loadWorkflow('widgets/load_image_widget')

    const loadImageNode = (
      await this.comfyPage.nodeOps.getNodeRefsByType('LoadImage')
    )[0]
    const { x, y } = await loadImageNode.getPosition()

    await this.comfyPage.dragDrop.dragAndDropFile('image64x64.webp', {
      dropPosition: { x, y }
    })

    const imagePreview = this.page.locator('.image-preview')
    await expect(imagePreview).toBeVisible()
    await expect(imagePreview.locator('img')).toBeVisible()
    await expect(imagePreview).toContainText('x')

    return {
      imagePreview,
      nodeId: String(loadImageNode.id)
    }
  }

  async openDialog(): Promise<Locator> {
    const { imagePreview } = await this.loadImageOnNode()

    await imagePreview.getByRole('region').hover()
    await this.page.getByLabel('Edit or mask image').click()

    const dialog = this.page.locator('.mask-editor-dialog')
    await expect(dialog).toBeVisible()
    await expect(
      dialog.getByRole('heading', { name: 'Mask Editor' })
    ).toBeVisible()

    const canvasContainer = dialog.locator('#maskEditorCanvasContainer')
    await expect(canvasContainer).toBeVisible()
    await expect(canvasContainer.locator('canvas')).toHaveCount(4)

    return dialog
  }

  async drawStrokeOnPointerZone(dialog: Locator) {
    const pointerZone = dialog.getByTestId('pointer-zone')
    await expect(pointerZone).toBeVisible()

    const box = await pointerZone.boundingBox()
    if (!box) throw new Error('Pointer zone bounding box not found')

    const startX = box.x + box.width * 0.3
    const startY = box.y + box.height * 0.5
    const endX = box.x + box.width * 0.7
    const endY = box.y + box.height * 0.5

    await this.page.mouse.move(startX, startY)
    await this.page.mouse.down()
    await this.page.mouse.move(endX, endY, { steps: 10 })
    await this.page.mouse.up()

    return { startX, startY, endX, endY, box }
  }

  async drawStrokeAndExpectPixels(dialog: Locator) {
    await this.drawStrokeOnPointerZone(dialog)
    await expect.poll(() => this.pollMaskPixelCount()).toBeGreaterThan(0)
  }

  getCanvasPixelData(canvasIndex: number) {
    return this.page.evaluate((idx) => {
      const canvases = document.querySelectorAll(
        '#maskEditorCanvasContainer canvas'
      )
      const canvas = canvases[idx] as HTMLCanvasElement | undefined
      if (!canvas) return null
      const ctx = canvas.getContext('2d')
      if (!ctx) return null
      const data = ctx.getImageData(0, 0, canvas.width, canvas.height)
      let nonTransparentPixels = 0
      for (let i = 3; i < data.data.length; i += 4) {
        if (data.data[i] > 0) nonTransparentPixels++
      }
      return { nonTransparentPixels, totalPixels: data.data.length / 4 }
    }, canvasIndex)
  }

  pollMaskPixelCount(): Promise<number> {
    return this.getCanvasPixelData(MASK_CANVAS_INDEX).then(
      (d) => d?.nonTransparentPixels ?? 0
    )
  }

  pollRgbPixelCount(): Promise<number> {
    return this.getCanvasPixelData(RGB_CANVAS_INDEX).then(
      (d) => d?.nonTransparentPixels ?? 0
    )
  }

  getCanvasSnapshot(canvasIndex: number): Promise<string> {
    return this.page.evaluate((idx) => {
      const canvas = document.querySelectorAll(
        '#maskEditorCanvasContainer canvas'
      )[idx] as HTMLCanvasElement | undefined
      return canvas?.toDataURL() ?? ''
    }, canvasIndex)
  }

  brushInput(dialog: Locator, label: BrushSliderLabel): Locator {
    return dialog.getByTestId(`brush-${label}-input`)
  }
}

export const maskEditorTest = comfyPageFixture.extend<{
  maskEditor: MaskEditorHelper
}>({
  maskEditor: async ({ comfyPage }, use) => {
    await use(new MaskEditorHelper(comfyPage))
  }
})
93 browser_tests/fixtures/helpers/NodeReplacementHelper.ts Normal file
@@ -0,0 +1,93 @@
import type { Locator, Page } from '@playwright/test'

import type { ComfyPage } from '@e2e/fixtures/ComfyPage'
import { TestIds } from '@e2e/fixtures/selectors'
import type { NodeReplacementResponse } from '@/platform/nodeReplacement/types'

/**
 * Mock `/api/node_replacements` and enable the node replacement feature.
 *
 * Unlike features that only consult settings (e.g. shareWorkflowDialog,
 * managerDialog), node replacement gates on `api.serverFeatureFlags`. The
 * server sends a `feature_flags` WS message that wholesale replaces
 * `serverFeatureFlags`, racing with any test-side override done via
 * `page.evaluate`. To make the flow deterministic across CI shards, this
 * helper patches `WebSocket.prototype` so every incoming `feature_flags`
 * message has `node_replacements: true` injected before the api's WS
 * handler sees it. Reload the page so the patched WebSocket and persisted
 * settings apply to a fresh app boot, then wait for the resulting
 * `/api/node_replacements` fetch before returning.
 */
export async function setupNodeReplacement(
  comfyPage: ComfyPage,
  replacements: NodeReplacementResponse
): Promise<void> {
  await comfyPage.page.route('**/api/node_replacements', (route) =>
    route.fulfill({ json: replacements })
  )

  await comfyPage.settings.setSetting(
    'Comfy.RightSidePanel.ShowErrorsTab',
    true
  )
  await comfyPage.settings.setSetting('Comfy.NodeReplacement.Enabled', true)

  await comfyPage.page.addInitScript(() => {
    const proto = window.WebSocket.prototype
    const originalAdd = proto.addEventListener
    proto.addEventListener = function patchedAdd(
      this: WebSocket,
      type: string,
      listener: EventListenerOrEventListenerObject | null,
      options?: AddEventListenerOptions | boolean
    ) {
      if (type === 'message' && typeof listener === 'function') {
        const wrapped = function (this: WebSocket, event: Event) {
          const msgEvent = event as MessageEvent
          if (typeof msgEvent.data === 'string') {
            try {
              const msg = JSON.parse(msgEvent.data)
              if (
                msg &&
                msg.type === 'feature_flags' &&
                msg.data &&
                typeof msg.data === 'object'
              ) {
                msg.data.node_replacements = true
                const patched = new MessageEvent('message', {
                  data: JSON.stringify(msg),
                  origin: msgEvent.origin,
                  lastEventId: msgEvent.lastEventId
                })
                return (listener as EventListener).call(this, patched)
              }
            } catch {
              // not JSON or not a feature_flags message - pass through
            }
          }
          return (listener as EventListener).call(this, event)
        }
        return originalAdd.call(this, type, wrapped as EventListener, options)
      }
      return originalAdd.call(
        this,
        type,
        listener as EventListenerOrEventListenerObject,
        options
      )
    }
  })

  const fetchPromise = comfyPage.page.waitForResponse(
    (response) =>
      response.url().includes('/api/node_replacements') && response.ok(),
    { timeout: 10000 }
  )

  await comfyPage.workflow.reloadAndWaitForApp()
  await fetchPromise
}

export function getSwapNodesGroup(page: Page): Locator {
  return page.getByTestId(TestIds.dialogs.swapNodesGroup)
}
@@ -64,6 +64,7 @@ export const TestIds = {
    missingModelRefresh: 'missing-model-refresh',
    missingModelImportUnsupported: 'missing-model-import-unsupported',
    missingMediaGroup: 'error-group-missing-media',
    swapNodesGroup: 'error-group-swap-nodes',
    missingMediaRow: 'missing-media-row',
    missingMediaUploadDropzone: 'missing-media-upload-dropzone',
    missingMediaLibrarySelect: 'missing-media-library-select',
34 browser_tests/fixtures/utils/groupHelpers.ts Normal file
@@ -0,0 +1,34 @@
import type { ComfyPage } from '@e2e/fixtures/ComfyPage'

const GROUP_TITLE_CLICK_OFFSET_X = 50
const GROUP_TITLE_CLICK_OFFSET_Y = 15

/**
 * Returns the client-space position of a group's title bar (for clicking).
 */
export async function getGroupTitlePosition(
  comfyPage: ComfyPage,
  title: string
): Promise<{ x: number; y: number }> {
  const pos = await comfyPage.page.evaluate(
    ({ title, offsetX, offsetY }) => {
      const app = window.app!
      const group = app.graph.groups.find(
        (g: { title: string }) => g.title === title
      )
      if (!group) return null
      const clientPos = app.canvasPosToClientPos([
        group.pos[0] + offsetX,
        group.pos[1] + offsetY
      ])
      return { x: clientPos[0], y: clientPos[1] }
    },
    {
      title,
      offsetX: GROUP_TITLE_CLICK_OFFSET_X,
      offsetY: GROUP_TITLE_CLICK_OFFSET_Y
    }
  )
  if (!pos) throw new Error(`Group "${title}" not found`)
  return pos
}
20 browser_tests/fixtures/utils/selectionToolbox.ts Normal file
@@ -0,0 +1,20 @@
import type { Locator } from '@playwright/test'

import { comfyExpect as expect } from '@e2e/fixtures/ComfyPage'
import type { ComfyPage } from '@e2e/fixtures/ComfyPage'

/**
 * Opens the selection toolbox "More Options" menu and returns the menu
 * locator so callers can scope follow-up queries to it.
 */
export async function openMoreOptions(comfyPage: ComfyPage): Promise<Locator> {
  await expect(comfyPage.selectionToolbox).toBeVisible()

  const moreOptionsBtn = comfyPage.page.getByTestId('more-options-button')
  await expect(moreOptionsBtn).toBeVisible()
  await moreOptionsBtn.click()

  const menu = comfyPage.page.locator('.p-contextmenu')
  await expect(menu.getByText('Copy', { exact: true })).toBeVisible()
  return menu
}
@@ -1,6 +1,7 @@
import { expect } from '@playwright/test'

import { comfyPageFixture as test } from '@e2e/fixtures/ComfyPage'
import { getGroupTitlePosition } from '@e2e/fixtures/utils/groupHelpers'

test.describe('Group Copy Paste', { tag: ['@canvas'] }, () => {
  test.afterEach(async ({ comfyPage }) => {
@@ -12,15 +13,7 @@ test.describe('Group Copy Paste', { tag: ['@canvas'] }, () => {
  }) => {
    await comfyPage.workflow.loadWorkflow('groups/single_group_only')

    const titlePos = await comfyPage.page.evaluate(() => {
      const app = window.app!
      const group = app.graph.groups[0]
      const clientPos = app.canvasPosToClientPos([
        group.pos[0] + 50,
        group.pos[1] + 15
      ])
      return { x: clientPos[0], y: clientPos[1] }
    })
    const titlePos = await getGroupTitlePosition(comfyPage, 'Group')
    await comfyPage.canvas.click({ position: titlePos })
    await comfyPage.nextFrame()
@@ -2,29 +2,7 @@ import { expect } from '@playwright/test'

import { comfyPageFixture as test } from '@e2e/fixtures/ComfyPage'
import type { ComfyPage } from '@e2e/fixtures/ComfyPage'

/**
 * Returns the client-space position of a group's title bar (for clicking).
 */
async function getGroupTitlePosition(
  comfyPage: ComfyPage,
  title: string
): Promise<{ x: number; y: number }> {
  const pos = await comfyPage.page.evaluate((title) => {
    const app = window.app!
    const group = app.graph.groups.find(
      (g: { title: string }) => g.title === title
    )
    if (!group) return null
    const clientPos = app.canvasPosToClientPos([
      group.pos[0] + 50,
      group.pos[1] + 15
    ])
    return { x: clientPos[0], y: clientPos[1] }
  }, title)
  if (!pos) throw new Error(`Group "${title}" not found`)
  return pos
}
import { getGroupTitlePosition } from '@e2e/fixtures/utils/groupHelpers'

/**
 * Returns {selectedNodeCount, selectedGroupCount, selectedItemCount}
@@ -13,45 +13,35 @@ test.describe('Keyboard shortcut actions', { tag: '@keyboard' }, () => {
    await comfyPage.setup()
  })

  test('Ctrl+Z undoes the last graph change', async ({ comfyPage }) => {
  test('Ctrl+Z undoes and Ctrl+Shift+Z redoes the last graph change', async ({
    comfyPage
  }) => {
    const initialNodeCount = await comfyPage.nodeOps.getNodeCount()

    await comfyPage.page.evaluate(() => {
      const node = window.LiteGraph!.createNode('Note')
      window.app!.graph!.add(node)
    await test.step('Ctrl+Z undoes the last graph change', async () => {
      await comfyPage.page.evaluate(() => {
        const node = window.LiteGraph!.createNode('Note')
        window.app!.graph!.add(node)
      })
      await comfyPage.nextFrame()
      await expect
        .poll(() => comfyPage.nodeOps.getNodeCount())
        .toBe(initialNodeCount + 1)

      await comfyPage.canvas.click()
      await comfyPage.page.keyboard.press('ControlOrMeta+z')

      await expect
        .poll(() => comfyPage.nodeOps.getNodeCount())
        .toBe(initialNodeCount)
    })
    await comfyPage.nextFrame()
    await expect
      .poll(() => comfyPage.nodeOps.getNodeCount())
      .toBe(initialNodeCount + 1)

    await comfyPage.canvas.click()
    await comfyPage.page.keyboard.press('ControlOrMeta+z')

    await expect
      .poll(() => comfyPage.nodeOps.getNodeCount())
      .toBe(initialNodeCount)
  })

  test('Ctrl+Shift+Z redoes after undo', async ({ comfyPage }) => {
    const initialNodeCount = await comfyPage.nodeOps.getNodeCount()

    await comfyPage.page.evaluate(() => {
      const node = window.LiteGraph!.createNode('Note')
      window.app!.graph!.add(node)
    await test.step('Ctrl+Shift+Z redoes after undo', async () => {
      await comfyPage.page.keyboard.press('ControlOrMeta+Shift+z')
      await expect
        .poll(() => comfyPage.nodeOps.getNodeCount())
        .toBe(initialNodeCount + 1)
    })
    await comfyPage.nextFrame()

    await comfyPage.canvas.click()
    await comfyPage.page.keyboard.press('ControlOrMeta+z')
    await expect
      .poll(() => comfyPage.nodeOps.getNodeCount())
      .toBe(initialNodeCount)

    await comfyPage.page.keyboard.press('ControlOrMeta+Shift+z')
    await expect
      .poll(() => comfyPage.nodeOps.getNodeCount())
      .toBe(initialNodeCount + 1)
  })

  test('Ctrl+S opens save dialog', async ({ comfyPage }) => {
@@ -62,25 +52,23 @@ test.describe('Keyboard shortcut actions', { tag: '@keyboard' }, () => {
    await expect(saveDialog).toBeVisible()
  })

  test('Ctrl+, opens settings dialog', async ({ comfyPage }) => {
    await comfyPage.page.keyboard.down('ControlOrMeta')
    await comfyPage.page.keyboard.press(',')
    await comfyPage.page.keyboard.up('ControlOrMeta')

  test('Ctrl+, opens and Escape closes settings dialog', async ({
    comfyPage
  }) => {
    const settingsDialog = comfyPage.page.getByTestId('settings-dialog')
    await expect(settingsDialog).toBeVisible()
  })

  test('Escape closes settings dialog', async ({ comfyPage }) => {
    await comfyPage.page.keyboard.down('ControlOrMeta')
    await comfyPage.page.keyboard.press(',')
    await comfyPage.page.keyboard.up('ControlOrMeta')
    await test.step('Ctrl+, opens settings dialog', async () => {
      await comfyPage.page.keyboard.down('ControlOrMeta')
      await comfyPage.page.keyboard.press(',')
      await comfyPage.page.keyboard.up('ControlOrMeta')

      const settingsDialog = comfyPage.page.getByTestId('settings-dialog')
      await expect(settingsDialog).toBeVisible()
      await expect(settingsDialog).toBeVisible()
    })

    await comfyPage.page.keyboard.press('Escape')
    await expect(settingsDialog).toBeHidden()
    await test.step('Escape closes settings dialog', async () => {
      await comfyPage.page.keyboard.press('Escape')
      await expect(settingsDialog).toBeHidden()
    })
  })

  test('Delete key removes selected nodes', async ({ comfyPage }) => {
32
browser_tests/tests/load3d/load3dLod.spec.ts
Normal file
@@ -0,0 +1,32 @@
import { expect } from '@playwright/test'

import { load3dTest as test } from '@e2e/fixtures/helpers/Load3DFixtures'

test.describe('Load3D LOD', () => {
test(
'canvas pixel dimensions scale with ComfyUI canvas zoom level',
{ tag: '@smoke' },
async ({ comfyPage, load3d }) => {
await expect(load3d.canvas).toBeVisible()

await expect
.poll(() => load3d.canvas.evaluate((el: HTMLCanvasElement) => el.width))
.toBeGreaterThan(0)

const initialWidth = await load3d.canvas.evaluate(
(el: HTMLCanvasElement) => el.width
)

await comfyPage.page.evaluate(() => {
const node = window.app!.graph!.nodes[0]
window.app!.canvas.ds.scale = 2.0
node.onResize?.(node.size)
})
await comfyPage.nextFrame()

await expect
.poll(() => load3d.canvas.evaluate((el: HTMLCanvasElement) => el.width))
.toBeGreaterThan(initialWidth)
}
)
})
@@ -1,117 +1,13 @@
import type { Page } from '@playwright/test'
import { expect } from '@playwright/test'

import type { ComfyPage } from '@e2e/fixtures/ComfyPage'
import { comfyPageFixture as test } from '@e2e/fixtures/ComfyPage'
import { maskEditorTest as test } from '@e2e/fixtures/helpers/MaskEditorHelper'

test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
async function loadImageOnNode(comfyPage: ComfyPage) {
await comfyPage.workflow.loadWorkflow('widgets/load_image_widget')

const loadImageNode = (
await comfyPage.nodeOps.getNodeRefsByType('LoadImage')
)[0]
const { x, y } = await loadImageNode.getPosition()

await comfyPage.dragDrop.dragAndDropFile('image64x64.webp', {
dropPosition: { x, y }
})

const imagePreview = comfyPage.page.locator('.image-preview')
await expect(imagePreview).toBeVisible()
await expect(imagePreview.locator('img')).toBeVisible()
await expect(imagePreview).toContainText('x')

return {
imagePreview,
nodeId: String(loadImageNode.id)
}
}

async function openMaskEditorDialog(comfyPage: ComfyPage) {
const { imagePreview } = await loadImageOnNode(comfyPage)

await imagePreview.getByRole('region').hover()
await comfyPage.page.getByLabel('Edit or mask image').click()

const dialog = comfyPage.page.locator('.mask-editor-dialog')
await expect(dialog).toBeVisible()
await expect(
dialog.getByRole('heading', { name: 'Mask Editor' })
).toBeVisible()

const canvasContainer = dialog.locator('#maskEditorCanvasContainer')
await expect(canvasContainer).toBeVisible()
await expect(canvasContainer.locator('canvas')).toHaveCount(4)

return dialog
}

async function getMaskCanvasPixelData(page: Page) {
return page.evaluate(() => {
const canvases = document.querySelectorAll(
'#maskEditorCanvasContainer canvas'
)
// The mask canvas is the 3rd canvas (index 2, z-30)
const maskCanvas = canvases[2] as HTMLCanvasElement
if (!maskCanvas) return null
const ctx = maskCanvas.getContext('2d')
if (!ctx) return null
const data = ctx.getImageData(0, 0, maskCanvas.width, maskCanvas.height)
let nonTransparentPixels = 0
for (let i = 3; i < data.data.length; i += 4) {
if (data.data[i] > 0) nonTransparentPixels++
}
return { nonTransparentPixels, totalPixels: data.data.length / 4 }
})
}

function pollMaskPixelCount(page: Page): Promise<number> {
return getMaskCanvasPixelData(page).then(
(d) => d?.nonTransparentPixels ?? 0
)
}

async function drawStrokeOnPointerZone(
page: Page,
dialog: ReturnType<typeof page.locator>
) {
const pointerZone = dialog.locator(
'.maskEditor-ui-container [class*="w-[calc"]'
)
await expect(pointerZone).toBeVisible()

const box = await pointerZone.boundingBox()
if (!box) throw new Error('Pointer zone bounding box not found')

const startX = box.x + box.width * 0.3
const startY = box.y + box.height * 0.5
const endX = box.x + box.width * 0.7
const endY = box.y + box.height * 0.5

await page.mouse.move(startX, startY)
await page.mouse.down()
await page.mouse.move(endX, endY, { steps: 10 })
await page.mouse.up()

return { startX, startY, endX, endY, box }
}

async function drawStrokeAndExpectPixels(
comfyPage: ComfyPage,
dialog: ReturnType<typeof comfyPage.page.locator>
) {
await drawStrokeOnPointerZone(comfyPage.page, dialog)
await expect
.poll(() => pollMaskPixelCount(comfyPage.page))
.toBeGreaterThan(0)
}

test(
'opens mask editor from image preview button',
{ tag: ['@smoke', '@screenshot'] },
async ({ comfyPage }) => {
const { imagePreview } = await loadImageOnNode(comfyPage)
async ({ comfyPage, maskEditor }) => {
const { imagePreview } = await maskEditor.loadImageOnNode()

// Hover over the image panel to reveal action buttons
await imagePreview.getByRole('region').hover()
@@ -139,8 +35,8 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
test(
'opens mask editor from context menu',
{ tag: ['@smoke', '@screenshot'] },
async ({ comfyPage }) => {
const { nodeId } = await loadImageOnNode(comfyPage)
async ({ comfyPage, maskEditor }) => {
const { nodeId } = await maskEditor.loadImageOnNode()

const nodeHeader = comfyPage.vueNodes
.getNodeLocator(nodeId)
@@ -166,63 +62,61 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
}
)

test('draws a brush stroke on the mask canvas', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('draws a brush stroke on the mask canvas', async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

const dataBefore = await getMaskCanvasPixelData(comfyPage.page)
const dataBefore = await maskEditor.getCanvasPixelData(2)
expect(dataBefore).not.toBeNull()
expect(dataBefore!.nonTransparentPixels).toBe(0)

await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)
})

test('undo reverts a brush stroke', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('undo reverts a brush stroke', async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)

const undoButton = dialog.locator('button[title="Undo"]')
await expect(undoButton).toBeVisible()
await undoButton.click()

await expect.poll(() => pollMaskPixelCount(comfyPage.page)).toBe(0)
await expect.poll(() => maskEditor.pollMaskPixelCount()).toBe(0)
})

test('redo restores an undone stroke', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('redo restores an undone stroke', async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)

const undoButton = dialog.locator('button[title="Undo"]')
await undoButton.click()

await expect.poll(() => pollMaskPixelCount(comfyPage.page)).toBe(0)
await expect.poll(() => maskEditor.pollMaskPixelCount()).toBe(0)

const redoButton = dialog.locator('button[title="Redo"]')
await expect(redoButton).toBeVisible()
await redoButton.click()

await expect
.poll(() => pollMaskPixelCount(comfyPage.page))
.toBeGreaterThan(0)
await expect.poll(() => maskEditor.pollMaskPixelCount()).toBeGreaterThan(0)
})

test('clear button removes all mask content', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('clear button removes all mask content', async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)

const clearButton = dialog.getByRole('button', { name: 'Clear' })
await expect(clearButton).toBeVisible()
await clearButton.click()

await expect.poll(() => pollMaskPixelCount(comfyPage.page)).toBe(0)
await expect.poll(() => maskEditor.pollMaskPixelCount()).toBe(0)
})

test('cancel closes the dialog without saving', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('cancel closes the dialog without saving', async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)

const cancelButton = dialog.getByRole('button', { name: 'Cancel' })
await cancelButton.click()
@@ -230,10 +124,10 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
await expect(dialog).toBeHidden()
})

test('invert button inverts the mask', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('invert button inverts the mask', async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

const dataBefore = await getMaskCanvasPixelData(comfyPage.page)
const dataBefore = await maskEditor.getCanvasPixelData(2)
expect(dataBefore).not.toBeNull()
const pixelsBefore = dataBefore!.nonTransparentPixels

@@ -242,26 +136,29 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
await invertButton.click()

await expect
.poll(() => pollMaskPixelCount(comfyPage.page))
.poll(() => maskEditor.pollMaskPixelCount())
.toBeGreaterThan(pixelsBefore)
})

test('keyboard shortcut Ctrl+Z triggers undo', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('keyboard shortcut Ctrl+Z triggers undo', async ({
comfyPage,
maskEditor
}) => {
const dialog = await maskEditor.openDialog()

await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)

const modifier = process.platform === 'darwin' ? 'Meta+z' : 'Control+z'
await comfyPage.page.keyboard.press(modifier)

await expect.poll(() => pollMaskPixelCount(comfyPage.page)).toBe(0)
await expect.poll(() => maskEditor.pollMaskPixelCount()).toBe(0)
})

test(
'tool panel shows all five tools',
{ tag: ['@smoke'] },
async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

const toolPanel = dialog.locator('.maskEditor-ui-container')
await expect(toolPanel).toBeVisible()
@@ -279,9 +176,9 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
)

test('switching tools updates the selected indicator', async ({
comfyPage
maskEditor
}) => {
const dialog = await openMaskEditorDialog(comfyPage)
const dialog = await maskEditor.openDialog()

const toolEntries = dialog.locator('.maskEditor_toolPanelContainer')
await expect(toolEntries).toHaveCount(5)
@@ -300,9 +197,9 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
})

test('brush settings panel is visible with thickness controls', async ({
comfyPage
maskEditor
}) => {
const dialog = await openMaskEditorDialog(comfyPage)
const dialog = await maskEditor.openDialog()

// The side panel should show brush settings by default
const thicknessLabel = dialog.getByText('Thickness')
@@ -315,8 +212,11 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
await expect(hardnessLabel).toBeVisible()
})

test('save uploads all layers and closes dialog', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('save uploads all layers and closes dialog', async ({
comfyPage,
maskEditor
}) => {
const dialog = await maskEditor.openDialog()

let maskUploadCount = 0
let imageUploadCount = 0
@@ -359,8 +259,8 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
).toBeGreaterThan(0)
})

test('save failure keeps dialog open', async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
test('save failure keeps dialog open', async ({ comfyPage, maskEditor }) => {
const dialog = await maskEditor.openDialog()

// Fail all upload routes
await comfyPage.page.route('**/upload/mask', (route) =>
@@ -380,23 +280,23 @@ test.describe('Mask Editor', { tag: '@vue-nodes' }, () => {
test(
'eraser tool removes mask content',
{ tag: ['@screenshot'] },
async ({ comfyPage }) => {
const dialog = await openMaskEditorDialog(comfyPage)
async ({ maskEditor }) => {
const dialog = await maskEditor.openDialog()

// Draw a stroke with the mask pen (default tool)
await drawStrokeAndExpectPixels(comfyPage, dialog)
await maskEditor.drawStrokeAndExpectPixels(dialog)

const pixelsAfterDraw = await getMaskCanvasPixelData(comfyPage.page)
const pixelsAfterDraw = await maskEditor.getCanvasPixelData(2)

// Switch to eraser tool (3rd tool, index 2)
const toolEntries = dialog.locator('.maskEditor_toolPanelContainer')
await toolEntries.nth(2).click()

// Draw over the same area with the eraser
await drawStrokeOnPointerZone(comfyPage.page, dialog)
await maskEditor.drawStrokeOnPointerZone(dialog)

await expect
.poll(() => pollMaskPixelCount(comfyPage.page))
.poll(() => maskEditor.pollMaskPixelCount())
.toBeLessThan(pixelsAfterDraw!.nonTransparentPixels)
}
)
100
browser_tests/tests/maskEditorBrushLayers.spec.ts
Normal file
@@ -0,0 +1,100 @@
import { expect } from '@playwright/test'

import { maskEditorTest as test } from '@e2e/fixtures/helpers/MaskEditorHelper'

const RGB_PAINT_TOOL_INDEX = 1 // RGB / color paint tool
const ERASER_TOOL_INDEX = 2 // Eraser tool

test.describe(
'Mask Editor brush adjustment and layer management',
{ tag: '@vue-nodes' },
() => {
test.describe('Brush settings interaction', () => {
test('Adjusting brush thickness slider changes stroke output', async ({
comfyPage,
maskEditor
}) => {
const dialog = await maskEditor.openDialog()
const thicknessInput = maskEditor.brushInput(dialog, 'thickness')

// Thin brush
await thicknessInput.fill('2')
await expect(thicknessInput).toHaveValue('2')

await maskEditor.drawStrokeOnPointerZone(dialog)
await expect
.poll(() => maskEditor.pollMaskPixelCount())
.toBeGreaterThan(0)
const thinPixels = await maskEditor.pollMaskPixelCount()

await comfyPage.page.keyboard.press('Control+z')
await expect.poll(() => maskEditor.pollMaskPixelCount()).toBe(0)

// Thick brush
await thicknessInput.fill('200')
await expect(thicknessInput).toHaveValue('200')

await maskEditor.drawStrokeOnPointerZone(dialog)
await expect
.poll(() => maskEditor.pollMaskPixelCount())
.toBeGreaterThan(thinPixels)
})
})

test.describe('Layer management', () => {
test('Drawing on different tools produces independent mask data', async ({
maskEditor
}) => {
const dialog = await maskEditor.openDialog()

await maskEditor.drawStrokeOnPointerZone(dialog)
await expect
.poll(() => maskEditor.pollMaskPixelCount())
.toBeGreaterThan(0)
const maskSnapshotAfterPen = await maskEditor.getCanvasSnapshot(2)

const toolEntries = dialog.locator('.maskEditor_toolPanelContainer')
await expect(toolEntries).toHaveCount(5)
await toolEntries.nth(RGB_PAINT_TOOL_INDEX).click()
await expect(toolEntries.nth(RGB_PAINT_TOOL_INDEX)).toHaveClass(
/Selected/
)

await maskEditor.drawStrokeOnPointerZone(dialog)
await expect
.poll(() => maskEditor.pollRgbPixelCount())
.toBeGreaterThan(0)

await expect
.poll(() => maskEditor.getCanvasSnapshot(2))
.toBe(maskSnapshotAfterPen)
})

test("Switching between tools preserves previous tool's mask data", async ({
maskEditor
}) => {
const dialog = await maskEditor.openDialog()

await maskEditor.drawStrokeOnPointerZone(dialog)
await expect
.poll(() => maskEditor.pollMaskPixelCount())
.toBeGreaterThan(0)

const maskSnapshot = await maskEditor.getCanvasSnapshot(2)

const toolEntries = dialog.locator('.maskEditor_toolPanelContainer')
await expect(toolEntries).toHaveCount(5)

await toolEntries.nth(ERASER_TOOL_INDEX).click()
await expect(toolEntries.nth(ERASER_TOOL_INDEX)).toHaveClass(/Selected/)

await toolEntries.nth(0).click()
await expect(toolEntries.nth(0)).toHaveClass(/Selected/)

await expect
.poll(() => maskEditor.getCanvasSnapshot(2))
.toBe(maskSnapshot)
})
})
}
)
168
browser_tests/tests/nodeReplacement.spec.ts
Normal file
@@ -0,0 +1,168 @@
import {
comfyPageFixture as test,
comfyExpect as expect
} from '@e2e/fixtures/ComfyPage'
import {
mockNodeReplacements,
mockNodeReplacementsSingle
} from '@e2e/fixtures/data/nodeReplacements'
import { loadWorkflowAndOpenErrorsTab } from '@e2e/fixtures/helpers/ErrorsTabHelper'
import {
getSwapNodesGroup,
setupNodeReplacement
} from '@e2e/fixtures/helpers/NodeReplacementHelper'

const renderModes = [
{ name: 'vue nodes', vueNodesEnabled: true },
{ name: 'litegraph', vueNodesEnabled: false }
] as const

test.describe('Node replacement', { tag: ['@node', '@ui'] }, () => {
for (const mode of renderModes) {
test.describe(`(${mode.name})`, () => {
test.describe('Single replacement', () => {
test.beforeEach(async ({ comfyPage }) => {
await comfyPage.settings.setSetting(
'Comfy.VueNodes.Enabled',
mode.vueNodesEnabled
)
await setupNodeReplacement(comfyPage, mockNodeReplacementsSingle)
await loadWorkflowAndOpenErrorsTab(
comfyPage,
'missing/node_replacement_simple'
)
})

test('Swap Nodes group appears in errors tab for replaceable nodes', async ({
comfyPage
}) => {
const swapGroup = getSwapNodesGroup(comfyPage.page)
await expect(swapGroup).toBeVisible()
await expect(swapGroup).toContainText('E2E_OldSampler')
await expect(
swapGroup.getByRole('button', { name: 'Replace All', exact: true })
).toBeVisible()
})

test('Replace Node replaces a single group in-place', async ({
comfyPage
}) => {
const swapGroup = getSwapNodesGroup(comfyPage.page)
await swapGroup.getByRole('button', { name: /replace node/i }).click()
await expect(swapGroup).toBeHidden()

const workflow = await comfyPage.workflow.getExportedWorkflow()
expect(
workflow.nodes,
'Node count should be unchanged after in-place replacement'
).toHaveLength(2)

const nodeTypes = workflow.nodes.map((n) => n.type)
expect(nodeTypes).not.toContain('E2E_OldSampler')
expect(nodeTypes).toContain('KSampler')

const ksampler = workflow.nodes.find((n) => n.type === 'KSampler')
expect(
ksampler?.id,
'Replaced node should keep the original id'
).toBe(1)

const linkFromReplacedToDecode = workflow.links?.find(
(l) => l[1] === 1 && l[3] === 2
)
expect(
linkFromReplacedToDecode,
'Output link from replaced node to VAEDecode should be preserved'
).toBeDefined()
})

test('Widget values are preserved after replacement', async ({
comfyPage
}) => {
await getSwapNodesGroup(comfyPage.page)
.getByRole('button', { name: /replace node/i })
.click()

const workflow = await comfyPage.workflow.getExportedWorkflow()
const ksampler = workflow.nodes.find((n) => n.type === 'KSampler')

expect(ksampler?.widgets_values).toBeDefined()
const widgetValues = ksampler!.widgets_values as unknown[]
expect(widgetValues).toEqual([
42,
'randomize',
20,
7,
'euler',
'normal',
1
])
})

test('Success toast is shown after replacement', async ({
comfyPage
}) => {
await getSwapNodesGroup(comfyPage.page)
.getByRole('button', { name: /replace node/i })
.click()

await expect(comfyPage.visibleToasts.first()).toContainText(
/replaced|swapped/i
)
})
})

test.describe('Multi-type replacement', () => {
test.beforeEach(async ({ comfyPage }) => {
await comfyPage.settings.setSetting(
'Comfy.VueNodes.Enabled',
mode.vueNodesEnabled
)
await setupNodeReplacement(comfyPage, mockNodeReplacements)
await loadWorkflowAndOpenErrorsTab(
comfyPage,
'missing/node_replacement_multi'
)
})

test('Replace All replaces all groups across multiple types', async ({
comfyPage
}) => {
const swapGroup = getSwapNodesGroup(comfyPage.page)
await expect(swapGroup).toBeVisible()
await expect(swapGroup).toContainText('E2E_OldSampler')
await expect(swapGroup).toContainText('E2E_OldUpscaler')

await swapGroup
.getByRole('button', { name: 'Replace All', exact: true })
.click()
await expect(swapGroup).toBeHidden()

const workflow = await comfyPage.workflow.getExportedWorkflow()
const nodeTypes = workflow.nodes.map((n) => n.type)
expect(nodeTypes).not.toContain('E2E_OldSampler')
expect(nodeTypes).not.toContain('E2E_OldUpscaler')
expect(nodeTypes).toContain('KSampler')
expect(nodeTypes).toContain('ImageScaleBy')
})

test('Output connections are preserved across replacement with output mapping', async ({
comfyPage
}) => {
await getSwapNodesGroup(comfyPage.page)
.getByRole('button', { name: 'Replace All', exact: true })
.click()

const replacedNodeOutputLinkCount = await comfyPage.page.evaluate(
() =>
window.app!.graph!.getNodeById(2)?.outputs[0]?.links?.length ?? 0
)
expect(
replacedNodeOutputLinkCount,
'Replaced upscaler should still drive its downstream consumer'
).toBeGreaterThan(0)
})
})
})
}
})
@@ -15,6 +15,7 @@ import { webSocketFixture } from '@e2e/fixtures/ws'
const test = mergeTests(comfyPageFixture, webSocketFixture)

const TOTAL_MOCK_JOBS = 20
const MAX_HISTORY_ITEMS_SETTING = 'Comfy.Queue.MaxHistoryItems'
const overflowJobsListRoutePattern = '**/api/jobs?*'

function isHistoryJobsRequest(url: string): boolean {
@@ -59,7 +60,7 @@ test.describe('Queue settings', { tag: '@canvas' }, () => {
}) => {
const TARGET_LIMIT = 6
await comfyPage.settings.setSetting(
'Comfy.Queue.MaxHistoryItems',
MAX_HISTORY_ITEMS_SETTING,
TARGET_LIMIT
)

@@ -106,7 +107,7 @@ test.describe('Queue settings', { tag: '@canvas' }, () => {

const VISIBLE_LIMIT = 6
await comfyPage.settings.setSetting(
'Comfy.Queue.MaxHistoryItems',
MAX_HISTORY_ITEMS_SETTING,
VISIBLE_LIMIT
)
const exec = new ExecutionHelper(comfyPage, await getWebSocket())

@@ -2,21 +2,7 @@ import {
comfyExpect as expect,
comfyPageFixture as test
} from '@e2e/fixtures/ComfyPage'
import type { ComfyPage } from '@e2e/fixtures/ComfyPage'

async function openMoreOptions(comfyPage: ComfyPage) {
await expect(comfyPage.selectionToolbox).toBeVisible()

const moreOptionsBtn = comfyPage.page.getByTestId('more-options-button')
await expect(moreOptionsBtn).toBeVisible()
await moreOptionsBtn.click()
await comfyPage.nextFrame()

// Wait for the context menu to appear by checking for 'Copy', which is
// always present regardless of single or multi-node selection.
const menu = comfyPage.page.locator('.p-contextmenu')
await expect(menu.getByText('Copy', { exact: true })).toBeVisible()
}
import { openMoreOptions } from '@e2e/fixtures/utils/selectionToolbox'

test.describe('Selection Toolbox - More Options', { tag: '@ui' }, () => {
test.describe('Single node actions', () => {
@@ -34,14 +20,14 @@ test.describe('Selection Toolbox - More Options', { tag: '@ui' }, () => {

await expect(nodeRef).not.toBePinned()

await openMoreOptions(comfyPage)
await comfyPage.page.getByText('Pin', { exact: true }).click()
let menu = await openMoreOptions(comfyPage)
await menu.getByText('Pin', { exact: true }).click()
await comfyPage.nextFrame()

await expect(nodeRef).toBePinned()

await openMoreOptions(comfyPage)
await comfyPage.page.getByText('Unpin', { exact: true }).click()
menu = await openMoreOptions(comfyPage)
await menu.getByText('Unpin', { exact: true }).click()
await comfyPage.nextFrame()

await expect(nodeRef).not.toBePinned()
@@ -57,14 +43,14 @@ test.describe('Selection Toolbox - More Options', { tag: '@ui' }, () => {

await expect(nodeRef).not.toBeCollapsed()

await openMoreOptions(comfyPage)
await comfyPage.page.getByText('Minimize Node', { exact: true }).click()
let menu = await openMoreOptions(comfyPage)
await menu.getByText('Minimize Node', { exact: true }).click()
await comfyPage.nextFrame()

await expect(nodeRef).toBeCollapsed()

await openMoreOptions(comfyPage)
await comfyPage.page.getByText('Expand Node', { exact: true }).click()
menu = await openMoreOptions(comfyPage)
await menu.getByText('Expand Node', { exact: true }).click()
await comfyPage.nextFrame()

await expect(nodeRef).not.toBeCollapsed()
@@ -78,8 +64,8 @@ test.describe('Selection Toolbox - More Options', { tag: '@ui' }, () => {

const initialCount = await comfyPage.nodeOps.getGraphNodesCount()

await openMoreOptions(comfyPage)
await comfyPage.page.getByText('Copy', { exact: true }).click()
const menu = await openMoreOptions(comfyPage)
await menu.getByText('Copy', { exact: true }).click()
await comfyPage.nextFrame()

// Paste the copied node
@@ -99,8 +85,8 @@ test.describe('Selection Toolbox - More Options', { tag: '@ui' }, () => {

const initialCount = await comfyPage.nodeOps.getGraphNodesCount()

await openMoreOptions(comfyPage)
await comfyPage.page.getByText('Duplicate', { exact: true }).click()
const menu = await openMoreOptions(comfyPage)
await menu.getByText('Duplicate', { exact: true }).click()
await comfyPage.nextFrame()

await expect
96
browser_tests/tests/selectionToolboxRename.spec.ts
Normal file
@@ -0,0 +1,96 @@
+import {
+  comfyExpect as expect,
+  comfyPageFixture as test
+} from '@e2e/fixtures/ComfyPage'
+import { getGroupTitlePosition } from '@e2e/fixtures/utils/groupHelpers'
+import { openMoreOptions } from '@e2e/fixtures/utils/selectionToolbox'
+
+test.describe('Selection toolbox rename', { tag: '@ui' }, () => {
+  test.beforeEach(async ({ comfyPage }) => {
+    await comfyPage.settings.setSetting('Comfy.Canvas.SelectionToolbox', true)
+    await comfyPage.workflow.loadWorkflow('default')
+    await comfyPage.nextFrame()
+  })
+
+  test.describe('Single rename', () => {
+    test('Rename via More Options opens title editor for single node', async ({
+      comfyPage
+    }) => {
+      const nodeRef = (
+        await comfyPage.nodeOps.getNodeRefsByTitle('KSampler')
+      )[0]
+      await comfyPage.nodeOps.selectNodeWithPan(nodeRef)
+
+      const menu = await openMoreOptions(comfyPage)
+      await menu.getByText('Rename', { exact: true }).click()
+
+      await expect(comfyPage.page.getByTestId('node-title-input')).toHaveValue(
+        'KSampler'
+      )
+    })
+
+    test('Rename shows prompt dialog for group', async ({ comfyPage }) => {
+      await comfyPage.settings.setSetting(
+        'LiteGraph.Group.SelectChildrenOnClick',
+        false
+      )
+      await comfyPage.workflow.loadWorkflow('groups/nested-groups-1-inner-node')
+      await comfyPage.nextFrame()
+
+      const outerGroupPos = await getGroupTitlePosition(
+        comfyPage,
+        'Outer Group'
+      )
+      await comfyPage.canvas.click({ position: outerGroupPos })
+
+      const menu = await openMoreOptions(comfyPage)
+      await menu.getByText('Rename', { exact: true }).click()
+
+      await expect(comfyPage.nodeOps.promptDialogInput).toBeVisible()
+      await comfyPage.nodeOps.promptDialogInput.fill('Renamed Group')
+      await comfyPage.page.keyboard.press('Enter')
+      await expect(comfyPage.nodeOps.promptDialogInput).toBeHidden()
+
+      await expect
+        .poll(() =>
+          comfyPage.page.evaluate(() => {
+            return window.app!.graph.groups.some(
+              (g) => g.title === 'Renamed Group'
+            )
+          })
+        )
+        .toBe(true)
+    })
+  })
+
+  test.describe('Batch rename', () => {
+    test('Batch rename multiple selected nodes', async ({ comfyPage }) => {
+      const ksampler = (
+        await comfyPage.nodeOps.getNodeRefsByTitle('KSampler')
+      )[0]
+      const emptyLatent = (
+        await comfyPage.nodeOps.getNodeRefsByTitle('Empty Latent Image')
+      )[0]
+
+      await comfyPage.nodeOps.selectNodes(['KSampler', 'Empty Latent Image'])
+
+      const menu = await openMoreOptions(comfyPage)
+      await menu.getByText('Rename', { exact: true }).click()
+
+      await expect(comfyPage.nodeOps.promptDialogInput).toBeVisible()
+      await comfyPage.nodeOps.promptDialogInput.fill('TestNode')
+      await comfyPage.page.keyboard.press('Enter')
+      await expect(comfyPage.nodeOps.promptDialogInput).toBeHidden()
+
+      await expect
+        .poll(async () => {
+          const titles = await Promise.all([
+            ksampler.getProperty<string>('title'),
+            emptyLatent.getProperty<string>('title')
+          ])
+          return [...titles].sort()
+        })
+        .toEqual(['TestNode 1', 'TestNode 2'])
+    })
+  })
+})
@@ -3,6 +3,7 @@ import { expect } from '@playwright/test'
 import { comfyPageFixture as test, comfyExpect } from '@e2e/fixtures/ComfyPage'
 import { SubgraphHelper } from '@e2e/fixtures/helpers/SubgraphHelper'
 import { TestIds } from '@e2e/fixtures/selectors'
+import { getPromotedWidgets } from '@e2e/fixtures/utils/promotedWidgets'

 test.describe('Nested Subgraphs', { tag: ['@subgraph'] }, () => {
   test.describe('Nested subgraph configure order', () => {
@@ -190,4 +191,106 @@ test.describe('Nested Subgraphs', { tag: ['@subgraph'] }, () => {
       })
     }
   )
+
+  test.describe(
+    'Nested subgraph input target resolution',
+    { tag: ['@widget', '@vue-nodes'] },
+    () => {
+      const WORKFLOW = 'subgraphs/subgraph-nested-promotion'
+      const OUTER_NODE_ID = '5'
+      const INNER_SUBGRAPH_NODE_ID = '6'
+
+      test('Nested SubgraphNode promoted widgets render without resolution failures', async ({
+        comfyPage
+      }) => {
+        const { warnings, dispose } = SubgraphHelper.collectConsoleWarnings(
+          comfyPage.page,
+          ['No link found', 'Failed to resolve legacy -1']
+        )
+
+        try {
+          await comfyPage.workflow.loadWorkflow(WORKFLOW)
+          await comfyPage.vueNodes.waitForNodes()
+
+          const outerNode = comfyPage.vueNodes.getNodeLocator(OUTER_NODE_ID)
+          await comfyExpect(outerNode).toBeVisible()
+
+          const widgets = outerNode.getByTestId(TestIds.widgets.widget)
+          await comfyExpect(
+            widgets,
+            'asset has 4 promoted widgets on outer subgraph node'
+          ).toHaveCount(4)
+
+          expect(warnings).toEqual([])
+        } finally {
+          dispose()
+        }
+      })
+
+      test('Promoted widgets from inner SubgraphNode are visible with correct values', async ({
+        comfyPage
+      }) => {
+        await comfyPage.workflow.loadWorkflow(WORKFLOW)
+        await comfyPage.vueNodes.waitForNodes()
+
+        const outerNode = comfyPage.vueNodes.getNodeLocator(OUTER_NODE_ID)
+        await comfyExpect(outerNode).toBeVisible()
+
+        const widgets = outerNode.getByTestId(TestIds.widgets.widget)
+        await comfyExpect(widgets).toHaveCount(4)
+
+        const valueWidget = outerNode
+          .getByRole('textbox', { name: 'value' })
+          .first()
+        await comfyExpect(valueWidget).toBeVisible()
+        await comfyExpect(valueWidget).toHaveValue(/Inner 1/)
+      })
+
+      test('Promoted widgets from inner SubgraphNode carry correct source identity', async ({
+        comfyPage
+      }) => {
+        await comfyPage.workflow.loadWorkflow(WORKFLOW)
+        await comfyPage.vueNodes.waitForNodes()
+
+        await expect
+          .poll(async () => {
+            const widgets = await getPromotedWidgets(comfyPage, OUTER_NODE_ID)
+            return widgets
+              .filter(
+                ([sourceNodeId]) => sourceNodeId === INNER_SUBGRAPH_NODE_ID
+              )
+              .map(([, sourceWidgetName]) => sourceWidgetName)
+          })
+          .toContain('value')
+      })
+
+      test('Serialize and reload preserves nested promoted widget visibility', async ({
+        comfyPage
+      }) => {
+        await comfyPage.workflow.loadWorkflow(WORKFLOW)
+        await comfyPage.vueNodes.waitForNodes()
+
+        const outerNode = comfyPage.vueNodes.getNodeLocator(OUTER_NODE_ID)
+        const widgets = outerNode.getByTestId(TestIds.widgets.widget)
+        await comfyExpect(
+          widgets,
+          'asset has 4 promoted widgets on outer subgraph node'
+        ).toHaveCount(4)
+        const initialCount = await widgets.count()
+
+        await comfyPage.subgraph.serializeAndReload()
+        await comfyPage.vueNodes.waitForNodes()
+
+        const outerNodeAfter = comfyPage.vueNodes.getNodeLocator(OUTER_NODE_ID)
+        const widgetsAfter = outerNodeAfter.getByTestId(TestIds.widgets.widget)
+        await comfyExpect(widgetsAfter).toHaveCount(initialCount)
+
+        const valueWidget = outerNodeAfter
+          .getByRole('textbox', { name: 'value' })
+          .first()
+        await comfyExpect(valueWidget).toBeVisible()
+        await comfyExpect(valueWidget).toHaveValue(/Inner 1/)
+      })
+    }
+  )
 })
@@ -22,44 +22,35 @@ test.describe('Topbar menu commands', { tag: '@ui' }, () => {
     await expect.poll(() => topbar.getTabNames()).toHaveLength(2)
   })

-  test('Edit > Undo undoes the last action', async ({ comfyPage }) => {
+  test('Edit > Undo undoes and Edit > Redo restores the last action', async ({
+    comfyPage
+  }) => {
     const initialNodeCount = await comfyPage.nodeOps.getNodeCount()

-    await comfyPage.page.evaluate(() => {
-      const node = window.LiteGraph!.createNode('Note')
-      window.app!.graph!.add(node)
+    await test.step('Edit > Undo undoes the last action', async () => {
+      await comfyPage.page.evaluate(() => {
+        const node = window.LiteGraph!.createNode('Note')
+        window.app!.graph!.add(node)
+      })
+      await comfyPage.nextFrame()
+
+      await expect
+        .poll(() => comfyPage.nodeOps.getNodeCount())
+        .toBe(initialNodeCount + 1)
+
+      await comfyPage.menu.topbar.triggerTopbarCommand(['Edit', 'Undo'])
+
+      await expect
+        .poll(() => comfyPage.nodeOps.getNodeCount())
+        .toBe(initialNodeCount)
     })
-    await comfyPage.nextFrame()
-
-    await expect
-      .poll(() => comfyPage.nodeOps.getNodeCount())
-      .toBe(initialNodeCount + 1)
-
-    await comfyPage.menu.topbar.triggerTopbarCommand(['Edit', 'Undo'])
-
-    await expect
-      .poll(() => comfyPage.nodeOps.getNodeCount())
-      .toBe(initialNodeCount)
-  })
-
-  test('Edit > Redo restores an undone action', async ({ comfyPage }) => {
-    const initialNodeCount = await comfyPage.nodeOps.getNodeCount()
-
-    await comfyPage.page.evaluate(() => {
-      const node = window.LiteGraph!.createNode('Note')
-      window.app!.graph!.add(node)
+    await test.step('Edit > Redo restores an undone action', async () => {
+      await comfyPage.menu.topbar.triggerTopbarCommand(['Edit', 'Redo'])
+      await expect
+        .poll(() => comfyPage.nodeOps.getNodeCount())
+        .toBe(initialNodeCount + 1)
     })
-    await comfyPage.nextFrame()
-
-    await comfyPage.menu.topbar.triggerTopbarCommand(['Edit', 'Undo'])
-    await expect
-      .poll(() => comfyPage.nodeOps.getNodeCount())
-      .toBe(initialNodeCount)
-
-    await comfyPage.menu.topbar.triggerTopbarCommand(['Edit', 'Redo'])
-    await expect
-      .poll(() => comfyPage.nodeOps.getNodeCount())
-      .toBe(initialNodeCount + 1)
   })

   test('File > Save opens save dialog', async ({ comfyPage }) => {
49  browser_tests/tests/workflowDeleteSettings.spec.ts  Normal file
@@ -0,0 +1,49 @@
+import {
+  comfyExpect as expect,
+  comfyPageFixture as test
+} from '@e2e/fixtures/ComfyPage'
+import type { ComfyPage } from '@e2e/fixtures/ComfyPage'
+
+const WORKFLOW_NAME = 'test-confirm-delete'
+
+async function startDeletingFromSidebar(comfyPage: ComfyPage) {
+  const { workflowsTab } = comfyPage.menu
+  await workflowsTab.open()
+  await workflowsTab.getPersistedItem(WORKFLOW_NAME).click({ button: 'right' })
+  await comfyPage.contextMenu.clickMenuItem('Delete')
+}
+
+test.describe('Comfy.Workflow.ConfirmDelete', () => {
+  test.beforeEach(async ({ comfyPage }) => {
+    await comfyPage.menu.topbar.saveWorkflowAs(WORKFLOW_NAME)
+  })
+
+  test.afterEach(async ({ comfyPage }) => {
+    await comfyPage.workflow.setupWorkflowsDirectory({})
+  })
+
+  test('on (default): right-click → Delete prompts the confirm dialog', async ({
+    comfyPage
+  }) => {
+    await comfyPage.settings.setSetting('Comfy.Workflow.ConfirmDelete', true)
+
+    await startDeletingFromSidebar(comfyPage)
+
+    await expect(comfyPage.confirmDialog.root).toBeVisible()
+    await expect(comfyPage.confirmDialog.delete).toBeVisible()
+  })
+
+  test('off: right-click → Delete bypasses the confirm dialog', async ({
+    comfyPage
+  }) => {
+    await comfyPage.settings.setSetting('Comfy.Workflow.ConfirmDelete', false)
+
+    await startDeletingFromSidebar(comfyPage)
+
+    const { workflowsTab } = comfyPage.menu
+    await expect(comfyPage.confirmDialog.root).toBeHidden()
+    await expect
+      .poll(() => workflowsTab.getTopLevelSavedWorkflowNames())
+      .not.toContain(WORKFLOW_NAME)
+  })
+})
@@ -147,7 +147,7 @@ it('should subscribe to logs API', () => {
 })
 ```

-## Mocking Lodash Functions
+## Mocking Utility Functions

 Mocking utility functions like debounce:
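The doc's actual debounce-mocking snippet is elided by this diff. As an illustration of the idea only (names and shape here are assumptions, not the doc's example), the usual trick is to replace `debounce` with a pass-through so the wrapped function fires immediately and tests need no fake timers:

```typescript
// A debounced function normally delays its callback; in unit tests that
// delay is noise, so a module mock swaps `debounce` for a pass-through.
type AnyFn = (...args: unknown[]) => void

// Stand-in for what a mock factory (e.g. vi.mock) would return; the real
// mock shape depends on the project's setup.
const mockDebounce = (fn: AnyFn, _waitMs: number): AnyFn => fn

let calls = 0
const increment: AnyFn = () => {
  calls += 1
}

// Code under test would call `debounce(increment, 100)`; with the mock in
// place it gets the immediate version.
const debounced = mockDebounce(increment, 100)
debounced()
debounced()
console.log(calls) // 2
```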
@@ -230,6 +230,37 @@ export default defineConfig([
       ]
     }
   },
+  {
+    name: 'comfy/no-unsafe-error-assertion',
+    files: [
+      'src/**/*.ts',
+      'src/**/*.tsx',
+      'src/**/*.vue',
+      'apps/*/src/**/*.ts',
+      'apps/*/src/**/*.tsx',
+      'apps/*/src/**/*.vue'
+    ],
+    ignores: ['**/*.test.ts', '**/*.spec.ts'],
+    rules: {
+      'no-restricted-syntax': [
+        'error',
+        {
+          // Bans `value as Error` and `value as Error & { ... }`.
+          // Use `error instanceof Error` narrowing or `toError()` from
+          // @/utils/errorUtil instead — see issue #11429.
+          selector: "TSAsExpression TSTypeReference[typeName.name='Error']",
+          message:
+            'Do not use Error type assertions. Use `instanceof Error` narrowing or `toError()` from @/utils/errorUtil instead. See issue #11429.'
+        },
+        {
+          // Bans `<Error>value` and `<Error & { ... }>value`.
+          selector: "TSTypeAssertion TSTypeReference[typeName.name='Error']",
+          message:
+            'Do not use Error type assertions. Use `instanceof Error` narrowing or `toError()` from @/utils/errorUtil instead. See issue #11429.'
+        }
+      ]
+    }
+  },
   {
     files: ['**/*.spec.ts'],
     ignores: ['browser_tests/tests/**/*.spec.ts', 'apps/*/e2e/**/*.spec.ts'],
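For context on the pattern the rule above pushes toward: the real `toError()` lives in `@/utils/errorUtil` and its implementation is not part of this diff, so the stand-in below is a sketch of the shape only. The point is to normalize an unknown catch value instead of asserting `e as Error`:

```typescript
// Hedged sketch: normalize an unknown thrown value into a real Error,
// instead of the banned `value as Error` assertion (which lies to the
// type checker when something other than an Error was thrown).
function toError(value: unknown): Error {
  if (value instanceof Error) return value
  return new Error(typeof value === 'string' ? value : JSON.stringify(value))
}

let message = ''
try {
  throw 'plain string failure' // non-Error throws are legal in JS
} catch (e) {
  // Banned: const err = e as Error  (e.message would be undefined here)
  const err = toError(e)
  message = err.message
}
console.log(message) // "plain string failure"
```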
@@ -1,6 +1,6 @@
 {
   "name": "@comfyorg/comfyui-frontend",
-  "version": "1.44.15",
+  "version": "1.44.16",
   "private": true,
   "description": "Official front-end implementation of ComfyUI",
   "homepage": "https://comfy.org",
@@ -29,6 +29,17 @@ export type {
   BillingStatus,
   BillingStatusResponse,
   BindingErrorResponse,
+  BulkRevokeApiKeysResponse,
+  BulkRevokeWorkspaceMemberApiKeysData,
+  BulkRevokeWorkspaceMemberApiKeysError,
+  BulkRevokeWorkspaceMemberApiKeysErrors,
+  BulkRevokeWorkspaceMemberApiKeysResponse,
+  BulkRevokeWorkspaceMemberApiKeysResponses,
+  CancelJobData,
+  CancelJobError,
+  CancelJobErrors,
+  CancelJobResponse,
+  CancelJobResponses,
   CancelSubscriptionData,
   CancelSubscriptionError,
   CancelSubscriptionErrors,
@@ -307,6 +318,28 @@ export type {
   GetJwksData,
   GetJwksResponse,
   GetJwksResponses,
+  GetLegacyAssetContentData,
+  GetLegacyAssetContentErrors,
+  GetLegacyHistoryByIdData,
+  GetLegacyHistoryByIdErrors,
+  GetLegacyHistoryData,
+  GetLegacyHistoryErrors,
+  GetLegacyJobByIdData,
+  GetLegacyJobByIdErrors,
+  GetLegacyJobOutputsData,
+  GetLegacyJobOutputsErrors,
+  GetLegacyModelsByFolderData,
+  GetLegacyModelsByFolderErrors,
+  GetLegacyModelsData,
+  GetLegacyModelsErrors,
+  GetLegacyObjectInfoByNodeClassData,
+  GetLegacyObjectInfoByNodeClassErrors,
+  GetLegacyPromptByIdData,
+  GetLegacyPromptByIdErrors,
+  GetLegacyUserdataV2Data,
+  GetLegacyUserdataV2Errors,
+  GetLegacyViewMetadataData,
+  GetLegacyViewMetadataErrors,
   GetLogsData,
   GetLogsError,
   GetLogsErrors,
@@ -505,6 +538,7 @@ export type {
   InterruptJobError,
   InterruptJobErrors,
   InterruptJobResponses,
+  JobCancelResponse,
   JobDetailResponse,
   JobEntry,
   JobsListResponse,
@@ -719,6 +753,13 @@ export type {
   SubscribeResponses,
   SubscriptionDuration,
   SubscriptionTier,
+  SyncApiKeyData,
+  SyncApiKeyError,
+  SyncApiKeyErrors,
+  SyncApiKeyRequest,
+  SyncApiKeyResponse,
+  SyncApiKeyResponse2,
+  SyncApiKeyResponses,
   SystemStatsResponse,
   TagInfo,
   TagsModificationResponse,
785  packages/ingest-types/src/types.gen.ts  generated
  File diff suppressed because it is too large
497  packages/ingest-types/src/zod.gen.ts  generated
  File diff suppressed because it is too large
134  packages/registry-types/src/comfyRegistryTypes.ts  generated
@@ -4014,6 +4014,26 @@ export interface paths {
         patch?: never;
         trace?: never;
     };
+    "/proxy/seedance/visual-validate/groups": {
+        parameters: {
+            query?: never;
+            header?: never;
+            path?: never;
+            cookie?: never;
+        };
+        /**
+         * List the caller's completed visual-validation groups
+         * @description Returns the caller's completed visual-validation groups (real-person H5 verification). Used to power the group selector in client UIs. Excludes virtual-library (AIGC) groups, which are not part of the public API surface.
+         */
+        get: operations["seedanceListVisualValidationGroups"];
+        put?: never;
+        post?: never;
+        delete?: never;
+        options?: never;
+        head?: never;
+        patch?: never;
+        trace?: never;
+    };
     "/proxy/seedance/visual-validate/sessions/{session_id}": {
         parameters: {
             query?: never;
@@ -4037,7 +4057,11 @@ export interface paths {
             path?: never;
             cookie?: never;
         };
-        get?: never;
+        /**
+         * List the caller's assets across all owned groups
+         * @description Fans out to BytePlus ListAssets across the caller's completed verification groups, denormalizes the group label into each row, and returns a single flat list. Result is post-filtered by asset_type. Optional group_id narrows to one group. Hard caps: 5 pages × 100 assets per group, 1000 total assets.
+         */
+        get: operations["seedanceListUserAssets"];
         put?: never;
         post: operations["seedanceCreateAsset"];
         delete?: never;
@@ -13569,7 +13593,7 @@ export interface components {
             stream: boolean | null;
         };
         /** @enum {string} */
-        OpenAIModels: "gpt-4" | "gpt-4-0314" | "gpt-4-0613" | "gpt-4-32k" | "gpt-4-32k-0314" | "gpt-4-32k-0613" | "gpt-4-0125-preview" | "gpt-4-turbo" | "gpt-4-turbo-2024-04-09" | "gpt-4-turbo-preview" | "gpt-4-1106-preview" | "gpt-4-vision-preview" | "gpt-3.5-turbo" | "gpt-3.5-turbo-16k" | "gpt-3.5-turbo-0301" | "gpt-3.5-turbo-0613" | "gpt-3.5-turbo-1106" | "gpt-3.5-turbo-0125" | "gpt-3.5-turbo-16k-0613" | "gpt-4.1" | "gpt-4.1-mini" | "gpt-4.1-nano" | "gpt-4.1-2025-04-14" | "gpt-4.1-mini-2025-04-14" | "gpt-4.1-nano-2025-04-14" | "o1" | "o1-mini" | "o1-preview" | "o1-pro" | "o1-2024-12-17" | "o1-preview-2024-09-12" | "o1-mini-2024-09-12" | "o1-pro-2025-03-19" | "o3" | "o3-mini" | "o3-2025-04-16" | "o3-mini-2025-01-31" | "o4-mini" | "o4-mini-2025-04-16" | "gpt-4o" | "gpt-4o-mini" | "gpt-4o-2024-11-20" | "gpt-4o-2024-08-06" | "gpt-4o-2024-05-13" | "gpt-4o-mini-2024-07-18" | "gpt-4o-audio-preview" | "gpt-4o-audio-preview-2024-10-01" | "gpt-4o-audio-preview-2024-12-17" | "gpt-4o-mini-audio-preview" | "gpt-4o-mini-audio-preview-2024-12-17" | "gpt-4o-search-preview" | "gpt-4o-mini-search-preview" | "gpt-4o-search-preview-2025-03-11" | "gpt-4o-mini-search-preview-2025-03-11" | "computer-use-preview" | "computer-use-preview-2025-03-11" | "gpt-5" | "gpt-5-mini" | "gpt-5-nano" | "chatgpt-4o-latest";
+        OpenAIModels: "gpt-4" | "gpt-4-0314" | "gpt-4-0613" | "gpt-4-32k" | "gpt-4-32k-0314" | "gpt-4-32k-0613" | "gpt-4-0125-preview" | "gpt-4-turbo" | "gpt-4-turbo-2024-04-09" | "gpt-4-turbo-preview" | "gpt-4-1106-preview" | "gpt-4-vision-preview" | "gpt-3.5-turbo" | "gpt-3.5-turbo-16k" | "gpt-3.5-turbo-0301" | "gpt-3.5-turbo-0613" | "gpt-3.5-turbo-1106" | "gpt-3.5-turbo-0125" | "gpt-3.5-turbo-16k-0613" | "gpt-4.1" | "gpt-4.1-mini" | "gpt-4.1-nano" | "gpt-4.1-2025-04-14" | "gpt-4.1-mini-2025-04-14" | "gpt-4.1-nano-2025-04-14" | "o1" | "o1-mini" | "o1-preview" | "o1-pro" | "o1-2024-12-17" | "o1-preview-2024-09-12" | "o1-mini-2024-09-12" | "o1-pro-2025-03-19" | "o3" | "o3-mini" | "o3-2025-04-16" | "o3-mini-2025-01-31" | "o4-mini" | "o4-mini-2025-04-16" | "gpt-4o" | "gpt-4o-mini" | "gpt-4o-2024-11-20" | "gpt-4o-2024-08-06" | "gpt-4o-2024-05-13" | "gpt-4o-mini-2024-07-18" | "gpt-4o-audio-preview" | "gpt-4o-audio-preview-2024-10-01" | "gpt-4o-audio-preview-2024-12-17" | "gpt-4o-mini-audio-preview" | "gpt-4o-mini-audio-preview-2024-12-17" | "gpt-4o-search-preview" | "gpt-4o-mini-search-preview" | "gpt-4o-search-preview-2025-03-11" | "gpt-4o-mini-search-preview-2025-03-11" | "computer-use-preview" | "computer-use-preview-2025-03-11" | "gpt-5" | "gpt-5-mini" | "gpt-5-nano" | "gpt-5.5" | "gpt-5.5-pro" | "chatgpt-4o-latest";
         MoonvalleyTextToVideoInferenceParams: {
             /**
              * @description Height of the generated video in pixels
@@ -14442,6 +14466,10 @@ export interface components {
                 total_tokens?: number;
             };
         };
+        SeedanceCreateVisualValidateSessionRequest: {
+            /** @description Optional human-readable label for the asset group that will be created by this verification. Stored locally and returned by seedanceListVisualValidationGroups so users can identify their groups in selectors. */
+            name?: string;
+        };
         SeedanceCreateVisualValidateSessionResponse: {
             /**
              * Format: uuid
@@ -14451,6 +14479,37 @@ export interface components {
             /** @description BytePlus-issued H5 liveness link. Open in a browser with camera access. Valid for ~120 seconds. */
             h5_link: string;
         };
+        SeedanceListVisualValidationGroupsResponse: {
+            groups: components["schemas"]["SeedanceVisualValidationGroup"][];
+        };
+        SeedanceListUserAssetsResponse: {
+            assets: components["schemas"]["SeedanceUserAsset"][];
+            /** @description True if the global per-request asset cap was hit and older results were dropped. */
+            truncated: boolean;
+        };
+        SeedanceUserAsset: {
+            asset_id: string;
+            name?: string | null;
+            /** @description BytePlus access URL (~12h validity). Refreshed on each list call. */
+            url?: string | null;
+            group_id: string;
+            /** @description Display label of the source group, denormalized for client-side search. */
+            group_name: string;
+            /** @enum {string} */
+            asset_type: "Image" | "Video" | "Audio";
+            /** @enum {string} */
+            status: "Active" | "Processing" | "Failed";
+            /** Format: date-time */
+            create_time: string;
+        };
+        SeedanceVisualValidationGroup: {
+            /** @description BytePlus-issued asset group id. */
+            group_id: string;
+            /** @description Display label. Caller-supplied at creation time when available; otherwise a server-generated fallback derived from the creation date. */
+            name: string;
+            /** Format: date-time */
+            created_at: string;
+        };
         SeedanceGetVisualValidateSessionResponse: {
             /** Format: uuid */
             session_id: string;
@@ -14458,6 +14517,8 @@ export interface components {
             status: "pending" | "completed" | "failed";
             /** @description Populated only when status == completed. This is the BytePlus Asset Group ID the user will upload assets into. */
             group_id?: string | null;
+            /** @description Optional human-readable label provided when the session was created. */
+            name?: string | null;
             error_code?: string | null;
             error_message?: string | null;
         };
@@ -30275,7 +30336,11 @@ export interface operations {
             path?: never;
             cookie?: never;
         };
-        requestBody?: never;
+        requestBody?: {
+            content: {
+                "application/json": components["schemas"]["SeedanceCreateVisualValidateSessionRequest"];
+            };
+        };
         responses: {
             /** @description Verification session created */
             201: {
@@ -30297,6 +30362,35 @@ export interface operations {
             };
         };
     };
+    seedanceListVisualValidationGroups: {
+        parameters: {
+            query?: never;
+            header?: never;
+            path?: never;
+            cookie?: never;
+        };
+        requestBody?: never;
+        responses: {
+            /** @description Visual-validation groups owned by the caller */
+            200: {
+                headers: {
+                    [name: string]: unknown;
+                };
+                content: {
+                    "application/json": components["schemas"]["SeedanceListVisualValidationGroupsResponse"];
+                };
+            };
+            /** @description Error 4xx/5xx */
+            default: {
+                headers: {
+                    [name: string]: unknown;
+                };
+                content: {
+                    "application/json": components["schemas"]["ErrorResponse"];
+                };
+            };
+        };
+    };
     seedanceGetVisualValidateSession: {
         parameters: {
             query?: never;
@@ -30329,6 +30423,40 @@ export interface operations {
             };
         };
     };
+    seedanceListUserAssets: {
+        parameters: {
+            query: {
+                /** @description Asset type to return. */
+                asset_type: "Image" | "Video";
+                /** @description Narrow the listing to one group. Caller must own it. */
+                group_id?: string;
+            };
+            header?: never;
+            path?: never;
+            cookie?: never;
+        };
+        requestBody?: never;
+        responses: {
+            /** @description Assets owned by the caller */
+            200: {
+                headers: {
+                    [name: string]: unknown;
+                };
+                content: {
+                    "application/json": components["schemas"]["SeedanceListUserAssetsResponse"];
+                };
+            };
+            /** @description Error 4xx/5xx */
+            default: {
+                headers: {
+                    [name: string]: unknown;
+                };
+                content: {
+                    "application/json": components["schemas"]["ErrorResponse"];
+                };
+            };
+        };
+    };
     seedanceCreateAsset: {
         parameters: {
             query?: never;
119  src/base/webviewDetection.test.ts  Normal file
@@ -0,0 +1,119 @@
+import { afterEach, describe, expect, it, vi } from 'vitest'
+
+import { isEmbeddedWebView } from '@/base/webviewDetection'
+
+describe('isEmbeddedWebView', () => {
+  afterEach(() => {
+    vi.unstubAllGlobals()
+  })
+
+  describe('Android WebView', () => {
+    it('detects Android WebView with wv token', () => {
+      const ua =
+        'Mozilla/5.0 (Linux; Android 13; SM-G991B; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/120.0.0.0 Mobile Safari/537.36'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+
+    it('does not flag regular Chrome on Android', () => {
+      const ua =
+        'Mozilla/5.0 (Linux; Android 13; SM-G991B) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+  })
+
+  describe('iOS WKWebView', () => {
+    it('detects iOS WKWebView (AppleWebKit without Safari/)', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+
+    it('does not flag regular Safari on iOS', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Mobile/15E148 Safari/604.1'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+
+    it('does not flag Chrome on iOS (CriOS)', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/120.0.0.0 Mobile/15E148'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+
+    it('does not flag Firefox on iOS (FxiOS)', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) FxiOS/120.0 Mobile/15E148'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+  })
+
+  describe('social app in-app browsers', () => {
+    it('detects Facebook (FBAN)', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 [FBAN/FBIOS;FBAV/400.0]'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+
+    it('detects Instagram', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 Instagram 300.0'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+
+    it('detects TikTok', () => {
+      const ua =
+        'Mozilla/5.0 (Linux; Android 13) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 TikTok/30.0'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+
+    it('detects Line', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 Line/13.0'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+
+    it('detects Snapchat', () => {
+      const ua =
+        'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 Snapchat/12.0'
+      expect(isEmbeddedWebView(ua)).toBe(true)
+    })
+  })
+
+  describe('regular desktop browsers', () => {
+    it('does not flag Chrome desktop', () => {
+      const ua =
+        'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+
+    it('does not flag Firefox desktop', () => {
+      const ua =
+        'Mozilla/5.0 (X11; Linux x86_64; rv:120.0) Gecko/20100101 Firefox/120.0'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+
+    it('does not flag Safari desktop', () => {
+      const ua =
+        'Mozilla/5.0 (Macintosh; Intel Mac OS X 14_0) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Safari/605.1.15'
+      expect(isEmbeddedWebView(ua)).toBe(false)
+    })
+  })
+
+  describe('edge cases', () => {
+    it('handles empty string', () => {
+      expect(isEmbeddedWebView('')).toBe(false)
+    })
+  })
+
+  describe('JS bridge detection', () => {
+    it('detects webkit.messageHandlers bridge', () => {
+      vi.stubGlobal('webkit', { messageHandlers: {} })
+      expect(isEmbeddedWebView('')).toBe(true)
+    })
+
+    it('detects ReactNativeWebView bridge', () => {
+      vi.stubGlobal('ReactNativeWebView', { postMessage: vi.fn() })
+      expect(isEmbeddedWebView('')).toBe(true)
+    })
+  })
+})
72  src/base/webviewDetection.ts  Normal file
@@ -0,0 +1,72 @@
|
||||
/**
|
||||
* Detects whether the app is running inside an embedded webview.
|
||||
*
|
||||
* Google blocks OAuth via `signInWithPopup` in embedded webviews,
|
||||
* returning a 403 `disallowed_useragent` error (policy since 2021).
|
||||
* This utility is used to hide the Google SSO button in those contexts.
|
||||
*
|
||||
* Detection covers:
|
||||
* • Android WebView (`wv` token in UA)
|
||||
* • iOS WKWebView (has `AppleWebKit` but lacks `Safari/`)
|
||||
* • Social app in-app browsers (Facebook, Instagram, TikTok, etc.)
|
||||
* • JS bridge objects (`window.webkit.messageHandlers`, `ReactNativeWebView`)
|
||||
*/
|
||||
|
||||
const SOCIAL_APP_PATTERNS =
|
||||
/FBAN|FBAV|Instagram|Line\/|Snapchat|TikTok|musical_ly/i
|
||||
|
||||
function isAndroidWebView(ua: string): boolean {
|
||||
return /\bwv\b/.test(ua) && /Android/.test(ua)
|
||||
}
|
||||
|
||||
function isIOSWebView(ua: string): boolean {
|
||||
if (!/AppleWebKit/i.test(ua)) return false
|
||||
if (/Safari\//i.test(ua)) return false
|
||||
if (/CriOS|FxiOS|OPiOS|EdgiOS/i.test(ua)) return false
|
||||
return true
|
||||
}
|
||||
|
||||
function isSocialAppBrowser(ua: string): boolean {
|
||||
return SOCIAL_APP_PATTERNS.test(ua)
|
||||
}
|
||||
|
||||
function hasWebViewBridge(): boolean {
|
||||
try {
|
||||
const win = globalThis as Record<string, unknown>
|
||||
if (
|
||||
typeof win.webkit === 'object' &&
|
||||
win.webkit !== null &&
|
||||
typeof (win.webkit as Record<string, unknown>).messageHandlers ===
|
||||
'object'
|
||||
) {
|
||||
return true
|
||||
}
|
||||
if (win.ReactNativeWebView != null) return true
|
||||
} catch {
|
||||
// Access to bridge objects may throw in sandboxed contexts
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
export function isEmbeddedWebView(ua: string = navigator.userAgent): boolean {
|
||||
if (isSocialAppBrowser(ua)) return true
|
||||
if (isAndroidWebView(ua)) return true
|
||||
if (isIOSWebView(ua)) return true
|
||||
if (hasWebViewBridge()) return true
|
||||
return false
|
||||
}
|
||||
|
||||
/**
|
||||
* Reason why Google SSO is blocked in the current environment, or `null` if it
|
||||
* is available. Modeled as a discriminated string so call sites read as
|
||||
* "if blocked, here's why" rather than an opaque boolean. Extend this union
|
||||
* (e.g. `'unauthorized-host'`) as new blocking conditions are detected.
|
||||
*/
|
||||
type GoogleSsoBlockedReason = 'embedded-webview' | null
|
||||
|
||||
export function getGoogleSsoBlockedReason(
|
||||
ua: string = navigator.userAgent
|
||||
): GoogleSsoBlockedReason {
|
||||
if (isEmbeddedWebView(ua)) return 'embedded-webview'
|
||||
return null
|
||||
}
|
||||
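As a quick sanity check, the pure UA-string path of the detection above can be exercised standalone. The sketch below re-inlines the regex checks from `webviewDetection.ts` (the `hasWebViewBridge()` branch is omitted because it needs a real `window`); the sample user-agent strings are illustrative, not taken from the diff:

```typescript
// Standalone sketch of the UA-string checks from webviewDetection.ts.
// Bridge-object detection is omitted — it requires a real browser window.
const SOCIAL_APP_PATTERNS =
  /FBAN|FBAV|Instagram|Line\/|Snapchat|TikTok|musical_ly/i

function isEmbeddedWebView(ua: string): boolean {
  if (SOCIAL_APP_PATTERNS.test(ua)) return true // social in-app browser
  if (/\bwv\b/.test(ua) && /Android/.test(ua)) return true // Android WebView
  if (
    /AppleWebKit/i.test(ua) &&
    !/Safari\//i.test(ua) &&
    !/CriOS|FxiOS|OPiOS|EdgiOS/i.test(ua)
  ) {
    return true // iOS WKWebView: AppleWebKit without a Safari/ token
  }
  return false
}

// Regular desktop Chrome is not embedded:
console.log(
  isEmbeddedWebView(
    'Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36'
  )
) // → false

// Instagram's in-app browser is:
console.log(
  isEmbeddedWebView('Mozilla/5.0 (iPhone; CPU iPhone OS 17_0) Instagram 300.0')
) // → true
```

The ordering matters: social-app tokens are checked first, so an Android in-app browser that also carries the `wv` token is still reported as embedded either way.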
@@ -49,6 +49,7 @@
   <div class="flex flex-col gap-6">
     <template v-if="ssoAllowed">
       <Button
+        v-if="!googleSsoBlockedReason"
         type="button"
         class="h-10"
         variant="secondary"
@@ -157,6 +158,7 @@ import type { SignInData, SignUpData } from '@/schemas/signInSchema'
 import { isCloud } from '@/platform/distribution/types'
 import { isHostWhitelisted, normalizeHost } from '@/utils/hostWhitelist'
 import { isInChina } from '@/utils/networkUtil'
+import { getGoogleSsoBlockedReason } from '@/base/webviewDetection'

 import ApiKeyForm from './signin/ApiKeyForm.vue'
 import SignInForm from './signin/SignInForm.vue'
@@ -172,6 +174,7 @@ const isSecureContext = window.isSecureContext
 const isSignIn = ref(true)
 const showApiKeyForm = ref(false)
 const ssoAllowed = isHostWhitelisted(normalizeHost(window.location.hostname))
+const googleSsoBlockedReason = getGoogleSsoBlockedReason()
 const comfyPlatformBaseUrl = computed(() =>
   configValueOrDefault(
     remoteConfig.value,
@@ -2,7 +2,7 @@
   <span class="flex flex-row gap-0.5">
     <template v-for="(sequence, index) in keySequences" :key="index">
       <Tag
-        class="min-w-6 justify-center gap-1 bg-interface-menu-keybind-surface-default text-center font-normal text-base-foreground uppercase"
+        class="min-w-6 justify-center gap-1 bg-interface-menu-keybind-surface-default text-center font-normal text-base-foreground capitalize"
         :severity="isModified ? 'info' : 'secondary'"
       >
         {{ sequence }}
@@ -64,6 +64,7 @@
   </span>
   <input
     v-model.number="brushSize"
+    data-testid="brush-thickness-input"
     type="number"
     class="border-p-form-field-border-color text-input-text w-16 rounded-md border bg-comfy-menu-bg px-2 py-1 text-center text-sm"
     :min="1"
@@ -76,6 +76,7 @@ import SidebarShortcutsToggleButton from '@/components/sidebar/SidebarShortcutsT
 import { isCloud, isDesktop, isNightly } from '@/platform/distribution/types'
 import { useSettingStore } from '@/platform/settings/settingStore'
 import { useTelemetry } from '@/platform/telemetry'
+import { usePreventFocusLoss } from '@/composables/usePreventFocusLoss'
 import { useCanvasStore } from '@/renderer/core/canvas/canvasStore'
 import { useCommandStore } from '@/stores/commandStore'
 import { useKeybindingStore } from '@/platform/keybindings/keybindingStore'
@@ -103,6 +104,8 @@ const userStore = useUserStore()
 const commandStore = useCommandStore()
 const canvasStore = useCanvasStore()
+const sideToolbarRef = ref<HTMLElement>()
+usePreventFocusLoss(sideToolbarRef)

 const topToolbarRef = ref<HTMLElement>()
 const bottomToolbarRef = ref<HTMLElement>()

@@ -117,6 +117,7 @@ import Button from '@/components/ui/button/Button.vue'
 import { useCurrentUser } from '@/composables/auth/useCurrentUser'
 import { useFeatureFlags } from '@/composables/useFeatureFlags'
 import { useOverflowObserver } from '@/composables/element/useOverflowObserver'
+import { usePreventFocusLoss } from '@/composables/usePreventFocusLoss'
 import { useSettingStore } from '@/platform/settings/settingStore'
 import { buildFeedbackUrl } from '@/platform/support/config'
 import { useWorkflowService } from '@/platform/workflow/core/services/workflowService'
@@ -157,6 +158,8 @@ function openFeedback() {
 }

+const containerRef = ref<HTMLElement | null>(null)
+usePreventFocusLoss(containerRef)

 const showOverflowArrows = ref(false)
 const leftArrowEnabled = ref(false)
 const rightArrowEnabled = ref(false)

124  src/composables/maskeditor/gpuUtils.test.ts  Normal file
@@ -0,0 +1,124 @@
import { describe, expect, it } from 'vitest'

import { buildStrokePoints, clampDirtyRect } from './gpuUtils'

const uninit = {
  minX: Infinity,
  minY: Infinity,
  maxX: -Infinity,
  maxY: -Infinity
}

describe('clampDirtyRect', () => {
  it('returns full canvas when dirty rect is uninitialised', () => {
    expect(clampDirtyRect(uninit, 100, 200)).toEqual({
      dx: 0,
      dy: 0,
      dw: 100,
      dh: 200
    })
  })

  it('returns the clamped rect when fully inside canvas bounds', () => {
    const rect = { minX: 10, minY: 20, maxX: 60, maxY: 90 }
    expect(clampDirtyRect(rect, 100, 200)).toEqual({
      dx: 10,
      dy: 20,
      dw: 50,
      dh: 70
    })
  })

  it('clamps rect that extends beyond canvas edges', () => {
    const rect = { minX: -5, minY: -10, maxX: 120, maxY: 250 }
    expect(clampDirtyRect(rect, 100, 200)).toEqual({
      dx: 0,
      dy: 0,
      dw: 100,
      dh: 200
    })
  })

  it('returns full canvas when the clamped area has zero width', () => {
    const rect = { minX: 50, minY: 10, maxX: 50, maxY: 80 }
    expect(clampDirtyRect(rect, 100, 200)).toEqual({
      dx: 0,
      dy: 0,
      dw: 100,
      dh: 200
    })
  })

  it('returns full canvas when the clamped area has zero height', () => {
    const rect = { minX: 10, minY: 50, maxX: 80, maxY: 50 }
    expect(clampDirtyRect(rect, 100, 200)).toEqual({
      dx: 0,
      dy: 0,
      dw: 100,
      dh: 200
    })
  })

  it('floors dx/dy and ceils the far edges', () => {
    const rect = { minX: 10.7, minY: 20.3, maxX: 59.2, maxY: 89.9 }
    const result = clampDirtyRect(rect, 100, 200)
    expect(result.dx).toBe(10)
    expect(result.dy).toBe(20)
    expect(result.dw).toBe(60 - 10) // ceil(59.2)=60, dx=10
    expect(result.dh).toBe(90 - 20) // ceil(89.9)=90, dy=20
  })
})

describe('buildStrokePoints', () => {
  it('returns input points as-is when skipResampling is true', () => {
    const points = [
      { x: 0, y: 0 },
      { x: 100, y: 100 }
    ]
    const result = buildStrokePoints(points, true, 10)
    expect(result).toHaveLength(2)
    expect(result[0]).toEqual({ x: 0, y: 0, pressure: 1.0 })
    expect(result[1]).toEqual({ x: 100, y: 100, pressure: 1.0 })
  })

  it('returns empty array for empty input', () => {
    expect(buildStrokePoints([], false, 10)).toHaveLength(0)
    expect(buildStrokePoints([], true, 10)).toHaveLength(0)
  })

  it('returns empty array for a single point (no segments to interpolate)', () => {
    expect(buildStrokePoints([{ x: 5, y: 5 }], false, 10)).toHaveLength(0)
  })

  it('interpolates a horizontal segment into multiple evenly-spaced points', () => {
    const points = [
      { x: 0, y: 0 },
      { x: 30, y: 0 }
    ]
    const result = buildStrokePoints(points, false, 10)
    // 30px distance / 10 stepSize = 3 steps → 4 points (s=0,1,2,3)
    expect(result).toHaveLength(4)
    expect(result[0]).toMatchObject({ x: 0, y: 0 })
    expect(result[3]).toMatchObject({ x: 30, y: 0 })
    result.forEach((p) => expect(p.pressure).toBe(1.0))
  })

  it('uses at least one step when points are very close together', () => {
    const points = [
      { x: 0, y: 0 },
      { x: 0.1, y: 0 }
    ]
    // distance 0.1 < stepSize 10 → steps=1 → 2 points
    const result = buildStrokePoints(points, false, 10)
    expect(result).toHaveLength(2)
  })

  it('interpolates all pressure values to 1.0', () => {
    const points = [
      { x: 0, y: 0 },
      { x: 50, y: 50 }
    ]
    const result = buildStrokePoints(points, false, 10)
    result.forEach((p) => expect(p.pressure).toBe(1.0))
  })
})
60  src/composables/maskeditor/gpuUtils.ts  Normal file
@@ -0,0 +1,60 @@
import type { Point } from '@/extensions/core/maskeditor/types'

import type { DirtyRect } from './brushDrawingUtils'

/**
 * Computes the clamped dirty-rect coordinates for a putImageData call.
 *
 * Returns the full canvas dimensions when the dirty rect is uninitialised
 * (Infinity sentinels) or the resulting area has zero/negative size.
 */
export function clampDirtyRect(
  rect: DirtyRect,
  canvasWidth: number,
  canvasHeight: number
): { dx: number; dy: number; dw: number; dh: number } {
  const full = { dx: 0, dy: 0, dw: canvasWidth, dh: canvasHeight }
  if (rect.minX === Infinity || rect.maxX === -Infinity) return full

  const dx = Math.floor(Math.max(0, rect.minX))
  const dy = Math.floor(Math.max(0, rect.minY))
  const dw = Math.ceil(Math.min(canvasWidth, rect.maxX)) - dx
  const dh = Math.ceil(Math.min(canvasHeight, rect.maxY)) - dy

  return dw > 0 && dh > 0 ? { dx, dy, dw, dh } : full
}

/**
 * Linearly interpolates a sequence of points at a fixed step size,
 * returning GPU-ready stroke points with pressure=1.
 *
 * When skipResampling is true the input points are returned as-is (used
 * during live preview where the caller has already handled spacing).
 */
export function buildStrokePoints(
  points: Point[],
  skipResampling: boolean,
  stepSize: number
): { x: number; y: number; pressure: number }[] {
  if (skipResampling) {
    return points.map((p) => ({ x: p.x, y: p.y, pressure: 1.0 }))
  }
  const result: { x: number; y: number; pressure: number }[] = []
  for (let i = 0; i < points.length - 1; i++) {
    const p1 = points[i]
    const p2 = points[i + 1]
    const steps = Math.max(
      1,
      Math.ceil(Math.hypot(p2.x - p1.x, p2.y - p1.y) / stepSize)
    )
    for (let s = 0; s <= steps; s++) {
      const t = s / steps
      result.push({
        x: p1.x + (p2.x - p1.x) * t,
        y: p1.y + (p2.y - p1.y) * t,
        pressure: 1.0
      })
    }
  }
  return result
}
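For a concrete feel of the two helpers in `gpuUtils.ts`, here is a standalone sketch that re-inlines both functions (the `DirtyRect` type from `brushDrawingUtils` is inlined as an assumption so the snippet runs without the project's path aliases):

```typescript
// Standalone copies of clampDirtyRect and buildStrokePoints from gpuUtils.ts,
// with the project-local DirtyRect type inlined so the sketch is self-contained.
interface DirtyRect {
  minX: number
  minY: number
  maxX: number
  maxY: number
}

function clampDirtyRect(
  rect: DirtyRect,
  canvasWidth: number,
  canvasHeight: number
): { dx: number; dy: number; dw: number; dh: number } {
  const full = { dx: 0, dy: 0, dw: canvasWidth, dh: canvasHeight }
  if (rect.minX === Infinity || rect.maxX === -Infinity) return full
  const dx = Math.floor(Math.max(0, rect.minX))
  const dy = Math.floor(Math.max(0, rect.minY))
  const dw = Math.ceil(Math.min(canvasWidth, rect.maxX)) - dx
  const dh = Math.ceil(Math.min(canvasHeight, rect.maxY)) - dy
  return dw > 0 && dh > 0 ? { dx, dy, dw, dh } : full
}

function buildStrokePoints(
  points: { x: number; y: number }[],
  skipResampling: boolean,
  stepSize: number
): { x: number; y: number; pressure: number }[] {
  if (skipResampling) {
    return points.map((p) => ({ x: p.x, y: p.y, pressure: 1.0 }))
  }
  const result: { x: number; y: number; pressure: number }[] = []
  for (let i = 0; i < points.length - 1; i++) {
    const p1 = points[i]
    const p2 = points[i + 1]
    // At least one step per segment, so touching points still yield a pair
    const steps = Math.max(
      1,
      Math.ceil(Math.hypot(p2.x - p1.x, p2.y - p1.y) / stepSize)
    )
    for (let s = 0; s <= steps; s++) {
      const t = s / steps
      result.push({
        x: p1.x + (p2.x - p1.x) * t,
        y: p1.y + (p2.y - p1.y) * t,
        pressure: 1.0
      })
    }
  }
  return result
}

// A fractional dirty rect is floored/ceiled and stays inside the canvas:
console.log(
  clampDirtyRect({ minX: 10.7, minY: 20.3, maxX: 59.2, maxY: 89.9 }, 100, 200)
) // → { dx: 10, dy: 20, dw: 50, dh: 70 }

// A 30px horizontal segment at stepSize 10 yields 4 resampled points:
console.log(buildStrokePoints([{ x: 0, y: 0 }, { x: 30, y: 0 }], false, 10).length) // → 4
```

These worked values match the cases exercised in `gpuUtils.test.ts` above.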
295  src/composables/maskeditor/useBrushDrawing.test.ts  Normal file
@@ -0,0 +1,295 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { effectScope, ref } from 'vue'
import type { EffectScope } from 'vue'

// vi.hoisted runs before imports — only vi.fn() is safe here (no Vue)
const saveStateSpy = vi.hoisted(() => vi.fn())

const mockStoreDef = vi.hoisted(() => ({
  brushSettings: {
    size: 20,
    hardness: 0.9,
    opacity: 1,
    stepSize: 5,
    type: 'arc' as string
  },
  currentTool: 'pen' as string,
  activeLayer: 'mask' as string,
  maskCanvas: null as HTMLCanvasElement | null,
  maskCtx: null as CanvasRenderingContext2D | null,
  rgbCanvas: null as HTMLCanvasElement | null,
  rgbCtx: null as CanvasRenderingContext2D | null,
  maskBlendMode: 'black',
  maskOpacity: 0.8,
  maskColor: { r: 0, g: 0, b: 0 },
  rgbColor: '#FF0000',
  canvasHistory: { saveState: saveStateSpy }
}))

// vi.mock factory runs after hoisting — ref/computed from Vue are available
vi.mock('./useGPUResources', () => {
  // Singletons shared across all calls to useGPUResources() in this test file
  const isSavingHistory = ref(false)
  const dirtyRect = ref({
    minX: Infinity,
    minY: Infinity,
    maxX: -Infinity,
    maxY: -Infinity
  })
  const hasRenderer = ref(false)
  const previewCanvas = ref<HTMLCanvasElement | null>(null)
  const prepareStroke = vi.fn()
  const clearPreview = vi.fn()
  const compositeStroke = vi.fn()
  const copyGpuToCanvas = vi
    .fn()
    .mockResolvedValue({ maskData: undefined, rgbData: undefined })
  return {
    useGPUResources: () => ({
      isSavingHistory,
      dirtyRect,
      hasRenderer,
      previewCanvas,
      prepareStroke,
      clearPreview,
      compositeStroke,
      copyGpuToCanvas,
      gpuRender: vi.fn(),
      gpuDrawPoint: vi.fn(),
      clearGPU: vi.fn(),
      destroy: vi.fn(),
      initGPUResources: vi.fn().mockResolvedValue(undefined),
      initPreviewCanvas: vi.fn()
    })
  }
})

vi.mock('./useCoordinateTransform', () => ({
  useCoordinateTransform: () => ({
    screenToCanvas: vi.fn(({ x, y }: { x: number; y: number }) => ({ x, y }))
  })
}))

vi.mock('./useBrushPersistence', () => ({
  useBrushPersistence: () => ({ loadAndApply: vi.fn(), save: vi.fn() })
}))

vi.mock('./useBrushAdjustment', () => ({
  useBrushAdjustment: () => ({
    startBrushAdjustment: vi.fn(),
    handleBrushAdjustment: vi.fn()
  })
}))

vi.mock('@/stores/maskEditorStore', () => ({
  useMaskEditorStore: vi.fn(() => mockStoreDef)
}))

vi.mock('@/scripts/app', () => ({
  app: { registerExtension: vi.fn() }
}))

import { useGPUResources } from './useGPUResources'
import { useBrushDrawing } from './useBrushDrawing'

function makePointerEvent(
  x: number,
  y: number,
  opts: { buttons?: number; shiftKey?: boolean } = {}
): PointerEvent {
  return {
    offsetX: x,
    offsetY: y,
    buttons: opts.buttons ?? 1,
    shiftKey: opts.shiftKey ?? false,
    preventDefault: vi.fn()
  } as unknown as PointerEvent
}

function makeMockCtx(): CanvasRenderingContext2D {
  const gradient = { addColorStop: vi.fn() }
  return {
    beginPath: vi.fn(),
    fill: vi.fn(),
    rect: vi.fn(),
    arc: vi.fn(),
    fillStyle: '',
    drawImage: vi.fn(),
    createRadialGradient: vi.fn(() => gradient),
    globalCompositeOperation: 'source-over'
  } as unknown as CanvasRenderingContext2D
}

let scope: EffectScope | null = null

function setup() {
  scope = effectScope()
  return scope.run(() => useBrushDrawing())!
}

beforeEach(() => {
  vi.clearAllMocks()

  const mockCtx = makeMockCtx()
  const mockCanvas = {
    width: 200,
    height: 200,
    style: { opacity: '' }
  } as unknown as HTMLCanvasElement

  mockStoreDef.maskCanvas = mockCanvas
  mockStoreDef.maskCtx = mockCtx
  mockStoreDef.rgbCanvas = mockCanvas
  mockStoreDef.rgbCtx = mockCtx
  mockStoreDef.currentTool = 'pen'
  mockStoreDef.activeLayer = 'mask'

  const gpu = useGPUResources()
  gpu.isSavingHistory.value = false
  gpu.hasRenderer.value = false
  gpu.previewCanvas.value = null
  gpu.dirtyRect.value = {
    minX: Infinity,
    minY: Infinity,
    maxX: -Infinity,
    maxY: -Infinity
  }
})

afterEach(() => {
  scope?.stop()
  scope = null
})

describe('startDrawing', () => {
  it('calls prepareStroke on the GPU resources', async () => {
    const { startDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50))
    expect(useGPUResources().prepareStroke).toHaveBeenCalledOnce()
  })

  it('sets DestinationOut composition when tool is eraser', async () => {
    mockStoreDef.currentTool = 'eraser'
    const { startDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50))
    expect(mockStoreDef.maskCtx!.globalCompositeOperation).toBe(
      'destination-out'
    )
  })

  it('sets SourceOver composition when tool is mask pen', async () => {
    const { startDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50))
    expect(mockStoreDef.maskCtx!.globalCompositeOperation).toBe('source-over')
  })

  it('sets DestinationOut composition when right mouse button is used', async () => {
    const { startDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50, { buttons: 2 }))
    expect(mockStoreDef.maskCtx!.globalCompositeOperation).toBe(
      'destination-out'
    )
  })
})

describe('startDrawing error handling', () => {
  it('catches initShape errors and resets drawing state', async () => {
    mockStoreDef.maskCtx = null
    const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
    const { startDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50))
    expect(consoleSpy).toHaveBeenCalledWith(
      '[useBrushDrawing] Failed to start drawing:',
      expect.any(Error)
    )
    expect(mockStoreDef.maskCtx).toBeNull()
    consoleSpy.mockRestore()
  })
})

describe('startDrawing shift+click', () => {
  it('draws a line from the previous point when shift is held', async () => {
    const { startDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await startDrawing(makePointerEvent(100, 50, { shiftKey: true }))
    expect(
      (mockStoreDef.maskCtx as unknown as ReturnType<typeof makeMockCtx>)
        .beginPath
    ).toHaveBeenCalled()
  })
})

describe('handleDrawing', () => {
  it('updates smoothingLastDrawTime after each move event', async () => {
    const rafSpy = vi
      .spyOn(window, 'requestAnimationFrame')
      .mockImplementation((cb) => {
        cb(0)
        return 0
      })
    const { startDrawing, handleDrawing } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await handleDrawing(makePointerEvent(55, 55))
    expect(rafSpy).toHaveBeenCalled()
    rafSpy.mockRestore()
  })
})

describe('drawEnd canvas visibility', () => {
  it('restores rgb canvas opacity when activeLayer is rgb', async () => {
    mockStoreDef.activeLayer = 'rgb'
    const mockRgbCanvas = {
      width: 200,
      height: 200,
      style: { opacity: '' }
    } as unknown as HTMLCanvasElement
    mockStoreDef.rgbCanvas = mockRgbCanvas
    const { startDrawing, drawEnd } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await drawEnd(makePointerEvent(60, 60))
    expect(mockRgbCanvas.style.opacity).toBe('1')
  })

  it('restores preview canvas opacity to 1 after drawEnd', async () => {
    const gpu = useGPUResources()
    const mockPreviewCanvas = {
      style: { opacity: '' }
    } as unknown as HTMLCanvasElement
    gpu.previewCanvas.value = mockPreviewCanvas
    const { startDrawing, drawEnd } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await drawEnd(makePointerEvent(60, 60))
    expect(mockPreviewCanvas.style.opacity).toBe('1')
  })
})

describe('drawEnd', () => {
  it('calls compositeStroke indicating the active layer and erasing state', async () => {
    const { startDrawing, drawEnd } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await drawEnd(makePointerEvent(60, 60))
    expect(useGPUResources().compositeStroke).toHaveBeenCalledOnce()
    expect(useGPUResources().compositeStroke).toHaveBeenCalledWith(false, false)
  })

  it('calls clearPreview to clean up the GPU overlay', async () => {
    const { startDrawing, drawEnd } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await drawEnd(makePointerEvent(60, 60))
    expect(useGPUResources().clearPreview).toHaveBeenCalledOnce()
  })

  it('saves canvas history on stroke completion', async () => {
    const { startDrawing, drawEnd } = setup()
    await startDrawing(makePointerEvent(50, 50))
    await drawEnd(makePointerEvent(60, 60))
    expect(saveStateSpy).toHaveBeenCalledOnce()
  })

  it('is a no-op when drawing was never started', async () => {
    const { drawEnd } = setup()
    await drawEnd(makePointerEvent(60, 60))
    expect(useGPUResources().compositeStroke).not.toHaveBeenCalled()
    expect(saveStateSpy).not.toHaveBeenCalled()
  })
})
File diff suppressed because it is too large
192  src/composables/maskeditor/useGPUResources.test.ts  Normal file
@@ -0,0 +1,192 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { effectScope, nextTick, reactive } from 'vue'
import type { EffectScope } from 'vue'

vi.mock('typegpu', () => ({
  tgpu: {
    init: vi.fn().mockRejectedValue(new Error('WebGPU not supported'))
  }
}))

vi.mock('./gpu/GPUBrushRenderer', () => ({
  GPUBrushRenderer: vi.fn()
}))

const mockStore = reactive({
  tgpuRoot: null as unknown,
  maskCanvas: null as HTMLCanvasElement | null,
  rgbCanvas: null as HTMLCanvasElement | null,
  maskCtx: null as CanvasRenderingContext2D | null,
  rgbCtx: null as CanvasRenderingContext2D | null,
  clearTrigger: 0,
  canvasHistory: { currentStateIndex: 0 },
  gpuTexturesNeedRecreation: false,
  gpuTextureWidth: 0,
  gpuTextureHeight: 0,
  pendingGPUMaskData: null as null,
  pendingGPURgbData: null as null,
  brushSettings: {
    size: 20,
    hardness: 0.9,
    opacity: 1,
    stepSize: 5,
    type: 'arc'
  },
  activeLayer: 'mask',
  currentTool: 'pen',
  maskColor: { r: 0, g: 0, b: 0 },
  rgbColor: '#FF0000'
})

vi.mock('@/stores/maskEditorStore', () => ({
  useMaskEditorStore: vi.fn(() => mockStore)
}))

import { resetDirtyRect } from './brushDrawingUtils'
import { useGPUResources } from './useGPUResources'

let scope: EffectScope | null = null

function setup() {
  scope = effectScope()
  return scope.run(() => useGPUResources())!
}

beforeEach(() => {
  vi.clearAllMocks()
  mockStore.tgpuRoot = null
  mockStore.maskCanvas = null
  mockStore.rgbCanvas = null
  mockStore.maskCtx = null
  mockStore.rgbCtx = null
  mockStore.clearTrigger = 0
  mockStore.canvasHistory.currentStateIndex = 0
  mockStore.gpuTexturesNeedRecreation = false
})

afterEach(() => {
  scope?.stop()
  scope = null
})

describe('initial reactive state', () => {
  it('hasRenderer is false when no renderer exists', () => {
    const { hasRenderer } = setup()
    expect(hasRenderer.value).toBe(false)
  })

  it('isSavingHistory is false initially', () => {
    const { isSavingHistory } = setup()
    expect(isSavingHistory.value).toBe(false)
  })

  it('previewCanvas is null initially', () => {
    const { previewCanvas } = setup()
    expect(previewCanvas.value).toBeNull()
  })

  it('dirtyRect starts with uninitialised sentinel values', () => {
    const { dirtyRect } = setup()
    expect(dirtyRect.value).toEqual(resetDirtyRect())
  })
})

describe('no-op when GPU is not initialised', () => {
  it('prepareStroke does not throw', () => {
    const { prepareStroke } = setup()
    expect(() => prepareStroke()).not.toThrow()
  })

  it('clearPreview does not throw', () => {
    const { clearPreview } = setup()
    expect(() => clearPreview()).not.toThrow()
  })

  it('clearGPU does not throw', () => {
    const { clearGPU } = setup()
    expect(() => clearGPU()).not.toThrow()
  })

  it('destroy does not throw', () => {
    const { destroy } = setup()
    expect(() => destroy()).not.toThrow()
  })

  it('gpuRender does not throw with empty or non-empty point arrays', () => {
    const { gpuRender } = setup()
    expect(() => gpuRender([])).not.toThrow()
    expect(() => gpuRender([{ x: 10, y: 20 }])).not.toThrow()
  })

  it('compositeStroke does not throw for any combination of flags', () => {
    const { compositeStroke } = setup()
    expect(() => compositeStroke(false, false)).not.toThrow()
    expect(() => compositeStroke(true, true)).not.toThrow()
  })
})

describe('initGPUResources', () => {
  it('leaves hasRenderer false when TypeGPU initialisation fails', async () => {
    const { initGPUResources, hasRenderer } = setup()
    await initGPUResources()
    expect(hasRenderer.value).toBe(false)
  })
})

describe('copyGpuToCanvas', () => {
  it('rejects with a descriptive error when GPU resources are not ready', async () => {
    const { copyGpuToCanvas } = setup()
    await expect(copyGpuToCanvas()).rejects.toThrow('GPU resources not ready')
  })
})

describe('watchers', () => {
  it('clearTrigger watcher calls clearGPU without throwing', async () => {
    setup()
    mockStore.clearTrigger++
    await nextTick()
  })

  it('currentStateIndex watcher short-circuits when isSavingHistory is true', async () => {
    const { isSavingHistory } = setup()
    isSavingHistory.value = true
    mockStore.canvasHistory.currentStateIndex++
    await nextTick()
  })

  it('currentStateIndex watcher calls updateGPUFromCanvas when not saving history', async () => {
    setup()
    mockStore.canvasHistory.currentStateIndex++
    await nextTick()
  })

  it('gpuTexturesNeedRecreation watcher returns early when device is not initialised', async () => {
    setup()
    mockStore.gpuTexturesNeedRecreation = true
    await nextTick()
  })
})

describe('initGPUResources with pre-existing tgpuRoot', () => {
  it('returns early with a warning when canvas contexts are not ready', async () => {
    const { initGPUResources, hasRenderer } = setup()
    mockStore.tgpuRoot = { device: {} } as unknown
    await initGPUResources()
    expect(hasRenderer.value).toBe(false)
  })
})

describe('initPreviewCanvas', () => {
  it('returns early when device is not initialised', () => {
    const { initPreviewCanvas } = setup()
    const canvas = document.createElement('canvas')
    expect(() => initPreviewCanvas(canvas)).not.toThrow()
  })
})

describe('gpuDrawPoint', () => {
  it('resolves immediately when renderer is not initialised', async () => {
    const { gpuDrawPoint } = setup()
    await expect(gpuDrawPoint({ x: 10, y: 20 })).resolves.toBeUndefined()
  })
})
624  src/composables/maskeditor/useGPUResources.ts  Normal file
@@ -0,0 +1,624 @@
/// <reference types="@webgpu/types" />
import { onUnmounted, ref, watch } from 'vue'
import { tgpu } from 'typegpu'

import { BrushShape } from '@/extensions/core/maskeditor/types'
import type { Point } from '@/extensions/core/maskeditor/types'
import { useMaskEditorStore } from '@/stores/maskEditorStore'
import { parseToRgb } from '@/utils/colorUtil'

import type { DirtyRect } from './brushDrawingUtils'
import {
  premultiplyData,
  resetDirtyRect,
  updateDirtyRect
} from './brushDrawingUtils'
import { getEffectiveBrushSize, getEffectiveHardness } from './brushUtils'
import { GPUBrushRenderer } from './gpu/GPUBrushRenderer'
import { buildStrokePoints, clampDirtyRect } from './gpuUtils'

export function useGPUResources() {
  const store = useMaskEditorStore()

  // GPU state — plain variables, not reactive, as Vue doesn't need to track them
  let maskTexture: GPUTexture | null = null
  let rgbTexture: GPUTexture | null = null
  let device: GPUDevice | null = null
  let renderer: GPUBrushRenderer | null = null
  let previewContext: GPUCanvasContext | null = null

  // Readback buffers
  let readbackStorageMask: GPUBuffer | null = null
  let readbackStorageRgb: GPUBuffer | null = null
  let readbackStagingMask: GPUBuffer | null = null
  let readbackStagingRgb: GPUBuffer | null = null
  let currentBufferSize = 0

  // Reactive state shared with useBrushDrawing
  const previewCanvas = ref<HTMLCanvasElement | null>(null)
  const isSavingHistory = ref(false)
  const dirtyRect = ref<DirtyRect>(resetDirtyRect())

  const hasRenderer = ref(false)

  const isRecreatingTextures = ref(false)

  // ── Watchers ────────────────────────────────────────────────────────────────

  watch(
    () => store.clearTrigger,
    () => clearGPU()
  )

  watch(
    () => store.canvasHistory.currentStateIndex,
    async () => {
      if (isSavingHistory.value) return
      await updateGPUFromCanvas()
      if (renderer && previewContext) renderer.clearPreview(previewContext)
    }
  )

  watch(
    () => store.gpuTexturesNeedRecreation,
    async (needsRecreation) => {
      if (
        !needsRecreation ||
        !device ||
        !store.maskCanvas ||
        isRecreatingTextures.value
      )
        return

      /* c8 ignore start */
      isRecreatingTextures.value = true

      const width = store.gpuTextureWidth
      const height = store.gpuTextureHeight

      try {
        maskTexture?.destroy()
        maskTexture = null
        rgbTexture?.destroy()
        rgbTexture = null

        maskTexture = createTexture(device, width, height)
        rgbTexture = createTexture(device, width, height)

        if (store.pendingGPUMaskData && store.pendingGPURgbData) {
          device.queue.writeTexture(
            { texture: maskTexture },
            store.pendingGPUMaskData,
            { bytesPerRow: width * 4 },
            { width, height }
          )
          device.queue.writeTexture(
            { texture: rgbTexture },
            store.pendingGPURgbData,
            { bytesPerRow: width * 4 },
            { width, height }
          )
        } else {
          await updateGPUFromCanvas()
        }

        if (previewCanvas.value && renderer) {
          previewCanvas.value.width = width
          previewCanvas.value.height = height
        }

        resizeReadbackBuffers(device, width, height)
      } catch (error) {
        console.error(
          '[useGPUResources] Failed to recreate GPU textures:',
          error
        )
      } finally {
        store.gpuTexturesNeedRecreation = false
        store.gpuTextureWidth = 0
        store.gpuTextureHeight = 0
        store.pendingGPUMaskData = null
        store.pendingGPURgbData = null
        isRecreatingTextures.value = false
      }
      /* c8 ignore stop */
    }
  )

  onUnmounted(() => {
    // c8 ignore start
    renderer?.destroy()
    renderer = null
    hasRenderer.value = false
    maskTexture?.destroy()
    maskTexture = null
    rgbTexture?.destroy()
    rgbTexture = null
    readbackStorageMask?.destroy()
    readbackStorageMask = null
    readbackStorageRgb?.destroy()
    readbackStorageRgb = null
    readbackStagingMask?.destroy()
    readbackStagingMask = null
    readbackStagingRgb?.destroy()
    readbackStagingRgb = null
    previewContext = null
    previewCanvas.value = null
    dirtyRect.value = resetDirtyRect()
    // Device is managed by TGPU root; do not destroy it here
    // c8 ignore stop
  })

  // ── Helpers ─────────────────────────────────────────────────────────────────

  /* c8 ignore start — requires a live GPUDevice */
  function createTexture(
    gpuDevice: GPUDevice,
    width: number,
    height: number
  ): GPUTexture {
    return gpuDevice.createTexture({
      size: [width, height],
      format: 'rgba8unorm',
      usage:
        GPUTextureUsage.TEXTURE_BINDING |
        GPUTextureUsage.STORAGE_BINDING |
        GPUTextureUsage.RENDER_ATTACHMENT |
        GPUTextureUsage.COPY_DST |
        GPUTextureUsage.COPY_SRC
    })
  }

  function resizeReadbackBuffers(
    gpuDevice: GPUDevice,
    width: number,
    height: number
  ): void {
    const bufferSize = width * height * 4
    if (currentBufferSize === bufferSize) return
|
||||
|
||||
readbackStorageMask?.destroy()
|
||||
readbackStorageRgb?.destroy()
|
||||
readbackStagingMask?.destroy()
|
||||
readbackStagingRgb?.destroy()
|
||||
|
||||
readbackStorageMask = gpuDevice.createBuffer({
|
||||
size: bufferSize,
|
||||
usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC
|
||||
})
|
||||
readbackStorageRgb = gpuDevice.createBuffer({
|
||||
size: bufferSize,
|
||||
usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC
|
||||
})
|
||||
readbackStagingMask = gpuDevice.createBuffer({
|
||||
size: bufferSize,
|
||||
usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ
|
||||
})
|
||||
readbackStagingRgb = gpuDevice.createBuffer({
|
||||
size: bufferSize,
|
||||
usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ
|
||||
})
|
||||
currentBufferSize = bufferSize
|
||||
}
|
||||
/* c8 ignore stop */
|
||||
|
||||
// ── Internal functions ───────────────────────────────────────────────────────
|
||||
|
||||
async function initTypeGPU(): Promise<void> {
|
||||
if (store.tgpuRoot) {
|
||||
/* c8 ignore start */
|
||||
device = store.tgpuRoot.device
|
||||
return
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
try {
|
||||
/* c8 ignore start — requires functional WebGPU hardware */
|
||||
const root = await tgpu.init()
|
||||
store.tgpuRoot = root
|
||||
device = root.device
|
||||
console.warn('✅ TypeGPU initialized! Root:', root)
|
||||
console.warn('Device info:', root.device.limits)
|
||||
/* c8 ignore stop */
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : String(error)
|
||||
console.warn('Failed to initialize TypeGPU:', message)
|
||||
}
|
||||
}
|
||||
|
||||
async function updateGPUFromCanvas(): Promise<void> {
|
||||
if (
|
||||
!device ||
|
||||
!maskTexture ||
|
||||
!rgbTexture ||
|
||||
!store.maskCanvas ||
|
||||
!store.maskCtx ||
|
||||
!store.rgbCtx
|
||||
)
|
||||
return
|
||||
|
||||
/* c8 ignore start — requires live GPU device and textures */
|
||||
const w = store.maskCanvas.width
|
||||
const h = store.maskCanvas.height
|
||||
|
||||
const maskData = store.maskCtx.getImageData(0, 0, w, h)
|
||||
premultiplyData(maskData.data)
|
||||
device.queue.writeTexture(
|
||||
{ texture: maskTexture },
|
||||
maskData.data,
|
||||
{ bytesPerRow: w * 4 },
|
||||
{ width: w, height: h }
|
||||
)
|
||||
|
||||
const rgbData = store.rgbCtx.getImageData(0, 0, w, h)
|
||||
premultiplyData(rgbData.data)
|
||||
device.queue.writeTexture(
|
||||
{ texture: rgbTexture },
|
||||
rgbData.data,
|
||||
{ bytesPerRow: w * 4 },
|
||||
{ width: w, height: h }
|
||||
)
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
// ── Public API ───────────────────────────────────────────────────────────────
|
||||
|
||||
async function initGPUResources(): Promise<void> {
|
||||
await initTypeGPU()
|
||||
|
||||
if (!store.tgpuRoot || !device) {
|
||||
console.warn('TypeGPU not initialized, skipping GPU resource setup')
|
||||
return
|
||||
}
|
||||
if (
|
||||
!store.maskCanvas ||
|
||||
!store.rgbCanvas ||
|
||||
!store.maskCtx ||
|
||||
!store.rgbCtx
|
||||
) {
|
||||
console.warn('Canvas contexts not ready, skipping GPU resource setup')
|
||||
return
|
||||
}
|
||||
|
||||
const w = store.maskCanvas.width
|
||||
const h = store.maskCanvas.height
|
||||
|
||||
/* c8 ignore start — requires functional WebGPU hardware */
|
||||
try {
|
||||
console.warn(`🎨 Initializing GPU resources for ${w}x${h} canvas`)
|
||||
maskTexture = createTexture(device, w, h)
|
||||
rgbTexture = createTexture(device, w, h)
|
||||
await updateGPUFromCanvas()
|
||||
console.warn('✅ GPU resources initialized successfully')
|
||||
renderer = new GPUBrushRenderer(
|
||||
device,
|
||||
navigator.gpu.getPreferredCanvasFormat()
|
||||
)
|
||||
hasRenderer.value = true
|
||||
console.warn('✅ Brush renderer initialized')
|
||||
} catch (error) {
|
||||
console.error('Failed to initialize GPU resources:', error)
|
||||
maskTexture = null
|
||||
rgbTexture = null
|
||||
}
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
function initPreviewCanvas(canvas: HTMLCanvasElement): void {
|
||||
if (!device) return
|
||||
/* c8 ignore start — requires live GPUDevice and WebGPU canvas context */
|
||||
const ctx = canvas.getContext('webgpu')
|
||||
if (!ctx) return
|
||||
ctx.configure({
|
||||
device,
|
||||
format: navigator.gpu.getPreferredCanvasFormat(),
|
||||
alphaMode: 'premultiplied'
|
||||
})
|
||||
previewContext = ctx
|
||||
previewCanvas.value = canvas
|
||||
console.warn('✅ Preview Canvas Initialized')
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
function clearGPU(): void {
|
||||
if (!device || !maskTexture || !rgbTexture || !store.maskCanvas) return
|
||||
/* c8 ignore start — requires live GPUDevice and textures */
|
||||
const w = store.maskCanvas.width
|
||||
const h = store.maskCanvas.height
|
||||
const zeros = new Uint8Array(w * h * 4)
|
||||
device.queue.writeTexture(
|
||||
{ texture: maskTexture },
|
||||
zeros,
|
||||
{ bytesPerRow: w * 4 },
|
||||
{ width: w, height: h }
|
||||
)
|
||||
device.queue.writeTexture(
|
||||
{ texture: rgbTexture },
|
||||
zeros,
|
||||
{ bytesPerRow: w * 4 },
|
||||
{ width: w, height: h }
|
||||
)
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
function destroy(): void {
|
||||
renderer?.destroy()
|
||||
maskTexture?.destroy()
|
||||
rgbTexture?.destroy()
|
||||
readbackStorageMask?.destroy()
|
||||
readbackStorageRgb?.destroy()
|
||||
readbackStagingMask?.destroy()
|
||||
readbackStagingRgb?.destroy()
|
||||
renderer = null
|
||||
hasRenderer.value = false
|
||||
maskTexture = null
|
||||
rgbTexture = null
|
||||
readbackStorageMask = null
|
||||
readbackStorageRgb = null
|
||||
readbackStagingMask = null
|
||||
readbackStagingRgb = null
|
||||
currentBufferSize = 0
|
||||
previewContext = null
|
||||
previewCanvas.value = null
|
||||
dirtyRect.value = resetDirtyRect()
|
||||
/* c8 ignore next — tgpuRoot only exists after successful GPU init */
|
||||
if (store.tgpuRoot) {
|
||||
store.tgpuRoot.destroy()
|
||||
store.tgpuRoot = null
|
||||
}
|
||||
device = null
|
||||
}
|
||||
|
||||
// ── Wrappers called by useBrushDrawing ──────────────────────────────────────
|
||||
|
||||
function prepareStroke(): void {
|
||||
if (!renderer || !store.maskCanvas) return
|
||||
/* c8 ignore next */
|
||||
renderer.prepareStroke(store.maskCanvas.width, store.maskCanvas.height)
|
||||
}
|
||||
|
||||
function clearPreview(): void {
|
||||
if (!renderer || !previewContext) return
|
||||
/* c8 ignore next */
|
||||
renderer.clearPreview(previewContext)
|
||||
}
|
||||
|
||||
function compositeStroke(isRgb: boolean, isErasing: boolean): void {
|
||||
if (!renderer || !maskTexture || !rgbTexture || !store.maskCanvas) return
|
||||
/* c8 ignore start — requires live renderer */
|
||||
const targetTex = isRgb ? rgbTexture : maskTexture
|
||||
const { size, hardness, opacity, type } = store.brushSettings
|
||||
const effectiveSize = getEffectiveBrushSize(size, hardness)
|
||||
const effectiveHardness = getEffectiveHardness(
|
||||
size,
|
||||
hardness,
|
||||
effectiveSize
|
||||
)
|
||||
const brushShape = type === BrushShape.Rect ? 1 : 0
|
||||
renderer.compositeStroke(targetTex.createView(), {
|
||||
opacity,
|
||||
color: [0, 0, 0],
|
||||
hardness: effectiveHardness,
|
||||
screenSize: [store.maskCanvas.width, store.maskCanvas.height],
|
||||
brushShape,
|
||||
isErasing
|
||||
})
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
async function copyGpuToCanvas(): Promise<{
|
||||
maskData: ImageData
|
||||
rgbData: ImageData
|
||||
}> {
|
||||
if (
|
||||
!device ||
|
||||
!maskTexture ||
|
||||
!rgbTexture ||
|
||||
!store.maskCanvas ||
|
||||
!store.rgbCanvas ||
|
||||
!store.maskCtx ||
|
||||
!store.rgbCtx ||
|
||||
!renderer
|
||||
)
|
||||
throw new Error('GPU resources not ready')
|
||||
|
||||
/* c8 ignore start — requires live GPU device, textures and renderer */
|
||||
const width = store.maskCanvas.width
|
||||
const height = store.maskCanvas.height
|
||||
|
||||
resizeReadbackBuffers(device, width, height)
|
||||
|
||||
renderer.prepareReadback(maskTexture, readbackStorageMask!)
|
||||
renderer.prepareReadback(rgbTexture, readbackStorageRgb!)
|
||||
|
||||
const encoder = device.createCommandEncoder()
|
||||
encoder.copyBufferToBuffer(
|
||||
readbackStorageMask!,
|
||||
0,
|
||||
readbackStagingMask!,
|
||||
0,
|
||||
currentBufferSize
|
||||
)
|
||||
encoder.copyBufferToBuffer(
|
||||
readbackStorageRgb!,
|
||||
0,
|
||||
readbackStagingRgb!,
|
||||
0,
|
||||
currentBufferSize
|
||||
)
|
||||
device.queue.submit([encoder.finish()])
|
||||
|
||||
await Promise.all([
|
||||
readbackStagingMask!.mapAsync(GPUMapMode.READ),
|
||||
readbackStagingRgb!.mapAsync(GPUMapMode.READ)
|
||||
])
|
||||
|
||||
const maskDataArr = new Uint8ClampedArray(
|
||||
readbackStagingMask!.getMappedRange().slice(0)
|
||||
)
|
||||
const rgbDataArr = new Uint8ClampedArray(
|
||||
readbackStagingRgb!.getMappedRange().slice(0)
|
||||
)
|
||||
readbackStagingMask!.unmap()
|
||||
readbackStagingRgb!.unmap()
|
||||
|
||||
const maskImageData = new ImageData(maskDataArr, width, height)
|
||||
const rgbImageData = new ImageData(rgbDataArr, width, height)
|
||||
|
||||
const { dx, dy, dw, dh } = clampDirtyRect(dirtyRect.value, width, height)
|
||||
store.maskCtx.putImageData(maskImageData, 0, 0, dx, dy, dw, dh)
|
||||
store.rgbCtx.putImageData(rgbImageData, 0, 0, dx, dy, dw, dh)
|
||||
|
||||
return { maskData: maskImageData, rgbData: rgbImageData }
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
function gpuRender(points: Point[], skipResampling = false): void {
|
||||
if (!renderer || !maskTexture || !rgbTexture) return
|
||||
|
||||
/* c8 ignore start — requires live renderer */
|
||||
const isRgb = store.activeLayer === 'rgb'
|
||||
const color = resolveColor(isRgb)
|
||||
const stepPercentage =
|
||||
Math.pow(100, store.brushSettings.stepSize / 100) / 100
|
||||
const gpuStepSize = Math.max(1.0, store.brushSettings.size * stepPercentage)
|
||||
const strokePoints = buildStrokePoints(points, skipResampling, gpuStepSize)
|
||||
|
||||
const { size, hardness } = store.brushSettings
|
||||
const effectiveSize = getEffectiveBrushSize(size, hardness)
|
||||
const effectiveHardness = getEffectiveHardness(
|
||||
size,
|
||||
hardness,
|
||||
effectiveSize
|
||||
)
|
||||
const brushShape = store.brushSettings.type === BrushShape.Rect ? 1 : 0
|
||||
|
||||
renderer.renderStrokeToAccumulator(strokePoints, {
|
||||
size: effectiveSize,
|
||||
opacity: 0.5,
|
||||
hardness: effectiveHardness,
|
||||
color,
|
||||
width: store.maskCanvas!.width,
|
||||
height: store.maskCanvas!.height,
|
||||
brushShape
|
||||
})
|
||||
|
||||
for (const p of strokePoints) {
|
||||
dirtyRect.value = updateDirtyRect(
|
||||
dirtyRect.value,
|
||||
p.x,
|
||||
p.y,
|
||||
effectiveSize
|
||||
)
|
||||
}
|
||||
|
||||
if (previewContext) {
|
||||
const isErasing =
|
||||
store.currentTool === 'eraser' ||
|
||||
store.maskCtx?.globalCompositeOperation === 'destination-out'
|
||||
const targetTex = isRgb ? rgbTexture : maskTexture
|
||||
renderer.blitToCanvas(
|
||||
previewContext,
|
||||
{
|
||||
opacity: store.brushSettings.opacity,
|
||||
color,
|
||||
hardness: effectiveHardness,
|
||||
screenSize: [store.maskCanvas!.width, store.maskCanvas!.height],
|
||||
brushShape,
|
||||
isErasing
|
||||
},
|
||||
targetTex ?? undefined
|
||||
)
|
||||
}
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
async function gpuDrawPoint(point: Point, opacity = 1): Promise<void> {
|
||||
if (!renderer) return
|
||||
|
||||
/* c8 ignore start — requires live renderer */
|
||||
const width = store.maskCanvas!.width
|
||||
const height = store.maskCanvas!.height
|
||||
const { size, hardness } = store.brushSettings
|
||||
const effectiveSize = getEffectiveBrushSize(size, hardness)
|
||||
const effectiveHardness = getEffectiveHardness(
|
||||
size,
|
||||
hardness,
|
||||
effectiveSize
|
||||
)
|
||||
const brushShape = store.brushSettings.type === BrushShape.Rect ? 1 : 0
|
||||
|
||||
dirtyRect.value = updateDirtyRect(
|
||||
dirtyRect.value,
|
||||
point.x,
|
||||
point.y,
|
||||
effectiveSize
|
||||
)
|
||||
|
||||
renderer.renderStrokeToAccumulator(
|
||||
[{ x: point.x, y: point.y, pressure: opacity }],
|
||||
{
|
||||
size: effectiveSize,
|
||||
opacity: 0.5,
|
||||
hardness: effectiveHardness,
|
||||
color: [1, 1, 1],
|
||||
width,
|
||||
height,
|
||||
brushShape
|
||||
}
|
||||
)
|
||||
|
||||
if (maskTexture && previewContext) {
|
||||
const isRgb = store.activeLayer === 'rgb'
|
||||
const isErasing =
|
||||
store.currentTool === 'eraser' ||
|
||||
store.maskCtx?.globalCompositeOperation === 'destination-out'
|
||||
renderer.blitToCanvas(
|
||||
previewContext,
|
||||
{
|
||||
opacity: store.brushSettings.opacity,
|
||||
color: resolveColor(isRgb),
|
||||
hardness: effectiveHardness,
|
||||
screenSize: [width, height],
|
||||
brushShape,
|
||||
isErasing
|
||||
},
|
||||
undefined
|
||||
)
|
||||
}
|
||||
/* c8 ignore stop */
|
||||
}
|
||||
|
||||
// ── Private helpers ─────────────────────────────────────────────────────────
|
||||
|
||||
/* c8 ignore start — only reachable after successful GPU init */
|
||||
function resolveColor(isRgb: boolean): [number, number, number] {
|
||||
if (isRgb) {
|
||||
const c = parseToRgb(store.rgbColor)
|
||||
return [c.r / 255, c.g / 255, c.b / 255]
|
||||
}
|
||||
const c = store.maskColor as { r: number; g: number; b: number }
|
||||
return [c.r / 255, c.g / 255, c.b / 255]
|
||||
}
|
||||
/* c8 ignore stop */
|
||||
|
||||
return {
|
||||
// Lifecycle — spread into useBrushDrawing's public return
|
||||
initGPUResources,
|
||||
initPreviewCanvas,
|
||||
clearGPU,
|
||||
destroy,
|
||||
// Rendering — called internally by useBrushDrawing
|
||||
gpuRender,
|
||||
gpuDrawPoint,
|
||||
copyGpuToCanvas,
|
||||
// Renderer wrappers — called internally by useBrushDrawing
|
||||
prepareStroke,
|
||||
clearPreview,
|
||||
compositeStroke,
|
||||
// Shared reactive state
|
||||
hasRenderer,
|
||||
previewCanvas,
|
||||
isSavingHistory,
|
||||
dirtyRect
|
||||
}
|
||||
}
|
||||
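The spacing math in `gpuRender` above maps the user-facing `stepSize` slider (0 to 100) onto an exponential spacing curve before resampling stroke points. A standalone sketch of just that formula (the helper name `brushStepPixels` is ours, for illustration; the formula itself is copied from `gpuRender`):

```typescript
// Sketch of gpuRender's brush-spacing math. stepSize 0 maps to 1% of the
// brush size, 100 maps to 100%, with an exponential ramp in between; the
// Math.max floor keeps the step from dropping below 1 px for tiny brushes.
function brushStepPixels(brushSize: number, stepSize: number): number {
  const stepPercentage = Math.pow(100, stepSize / 100) / 100
  return Math.max(1.0, brushSize * stepPercentage)
}
```

For example, a 40 px brush at stepSize 50 samples a dab every 4 px, while stepSize 100 places dabs a full brush width apart.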
@@ -481,6 +481,30 @@ describe('useImageCrop', () => {
    expect(vm.modelValue.x).toBe(50)
  })

  it('resizes from the top edge, moving y and shrinking height', async () => {
    const vm = await mountHarness()
    setupImageLayout(vm, 500, 500)
    vm.modelValue = { x: 50, y: 100, width: 120, height: 200 }

    const captureEl = document.createElement('div')
    captureEl.setPointerCapture = vi.fn()
    captureEl.releasePointerCapture = vi.fn()

    const resizeStart = vm.handleResizeStart as (
      e: PointerEvent,
      dir: string
    ) => void
    const resizeMove = vm.handleResizeMove as (e: PointerEvent) => void
    const resizeEnd = vm.handleResizeEnd as (e: PointerEvent) => void

    resizeStart(makePointerEvent('pointerdown', captureEl, 100, 100), 'top')
    resizeMove(makePointerEvent('pointermove', captureEl, 100, 150))
    resizeEnd(makePointerEvent('pointerup', captureEl, 100, 150))

    expect(vm.modelValue.y).toBeGreaterThan(100)
    expect(vm.modelValue.height).toBeLessThan(200)
  })

  it('applies a preset aspect ratio and clamps height to the image', async () => {
    const vm = await mountHarness()
    setupImageLayout(vm, 800, 500)

@@ -1,5 +1,7 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { nextTick, ref, shallowRef } from 'vue'
import { nextTick, reactive, ref, shallowRef } from 'vue'
import type { Pinia } from 'pinia'
import { getActivePinia } from 'pinia'

import { nodeToLoad3dMap, useLoad3d } from '@/composables/useLoad3d'
import Load3d from '@/extensions/core/load3d/Load3d'
@@ -9,6 +11,7 @@ import type { Size } from '@/lib/litegraph/src/interfaces'
import type { LGraph } from '@/lib/litegraph/src/LGraph'
import type { LGraphNode } from '@/lib/litegraph/src/LGraphNode'
import type { IWidget } from '@/lib/litegraph/src/types/widgets'
import { useCanvasStore } from '@/renderer/core/canvas/canvasStore'
import { useToastStore } from '@/platform/updates/common/toastStore'
import { api } from '@/scripts/api'
import {
@@ -59,6 +62,18 @@ vi.mock('@/i18n', () => ({
  t: vi.fn((key) => key)
}))

vi.mock('pinia', async (importOriginal) => {
  const actual = await importOriginal()
  return {
    ...(actual as Record<string, unknown>),
    getActivePinia: vi.fn(() => null)
  }
})

vi.mock('@/renderer/core/canvas/canvasStore', () => ({
  useCanvasStore: vi.fn()
}))

describe('useLoad3d', () => {
  let mockLoad3d: Partial<Load3d>
  let mockNode: LGraphNode
@@ -67,6 +82,7 @@ describe('useLoad3d', () => {
  beforeEach(() => {
    vi.clearAllMocks()
    nodeToLoad3dMap.clear()
    vi.mocked(getActivePinia).mockReturnValue(null as unknown as Pinia)

    mockNode = createMockLGraphNode({
      properties: {
@@ -334,6 +350,73 @@ describe('useLoad3d', () => {

    expect(composable.sceneConfig.value.backgroundColor).toBe('#000000')
  })

  it('passes getZoomScale callback to createLoad3d', async () => {
    const composable = useLoad3d(mockNode)
    const containerRef = document.createElement('div')

    await composable.initializeLoad3d(containerRef)

    expect(createLoad3d).toHaveBeenCalledWith(
      containerRef,
      expect.objectContaining({ getZoomScale: expect.any(Function) })
    )
  })
})

describe('zoom watcher', () => {
  it('calls load3d.handleResize after debounce when canvas appScalePercentage changes', async () => {
    vi.useFakeTimers()

    const canvasStore = reactive({ appScalePercentage: 100 })
    vi.mocked(getActivePinia).mockReturnValue({} as unknown as Pinia)
    vi.mocked(useCanvasStore).mockReturnValue(
      canvasStore as unknown as ReturnType<typeof useCanvasStore>
    )

    const composable = useLoad3d(mockNode)
    const containerRef = document.createElement('div')
    await composable.initializeLoad3d(containerRef)

    vi.mocked(mockLoad3d.handleResize!).mockClear()

    canvasStore.appScalePercentage = 200
    await nextTick()
    expect(mockLoad3d.handleResize).not.toHaveBeenCalled()

    vi.advanceTimersByTime(150)
    expect(mockLoad3d.handleResize).toHaveBeenCalledOnce()

    vi.useRealTimers()
  })

  it('debounces rapid zoom changes into a single handleResize call', async () => {
    vi.useFakeTimers()

    const canvasStore = reactive({ appScalePercentage: 100 })
    vi.mocked(getActivePinia).mockReturnValue({} as unknown as Pinia)
    vi.mocked(useCanvasStore).mockReturnValue(
      canvasStore as unknown as ReturnType<typeof useCanvasStore>
    )

    const composable = useLoad3d(mockNode)
    const containerRef = document.createElement('div')
    await composable.initializeLoad3d(containerRef)

    vi.mocked(mockLoad3d.handleResize!).mockClear()

    canvasStore.appScalePercentage = 150
    await nextTick()
    canvasStore.appScalePercentage = 200
    await nextTick()
    canvasStore.appScalePercentage = 250
    await nextTick()

    vi.advanceTimersByTime(150)
    expect(mockLoad3d.handleResize).toHaveBeenCalledOnce()

    vi.useRealTimers()
  })
})

describe('preserves existing node callbacks through initializeLoad3d', () => {

@@ -1,6 -1,6 @@
import type { MaybeRef } from 'vue'

import { toRef } from '@vueuse/core'
import { toRef, useDebounceFn } from '@vueuse/core'
import { getActivePinia } from 'pinia'
import { ref, toRaw, watch } from 'vue'

@@ -31,6 +31,7 @@ import type { LGraphNode } from '@/lib/litegraph/src/LGraphNode'
import { LiteGraph } from '@/lib/litegraph/src/litegraph'
import { useSettingStore } from '@/platform/settings/settingStore'
import { useToastStore } from '@/platform/updates/common/toastStore'
import { useCanvasStore } from '@/renderer/core/canvas/canvasStore'
import { api } from '@/scripts/api'
import { app } from '@/scripts/app'
import { useLoad3dService } from '@/services/load3dService'
@@ -44,6 +45,15 @@ export const useLoad3d = (nodeOrRef: MaybeRef<LGraphNode | null>) => {
  let load3d: Load3d | null = null
  let isFirstModelLoad = true

  const debouncedHandleResize = useDebounceFn(() => {
    load3d?.handleResize()
  }, 150)

  watch(
    () => (getActivePinia() ? useCanvasStore().appScalePercentage : 0),
    debouncedHandleResize
  )

  const sceneConfig = ref<SceneConfig>({
    showGrid: true,
    backgroundColor: '#000000',
@@ -132,6 +142,7 @@ export const useLoad3d = (nodeOrRef: MaybeRef<LGraphNode | null>) => {
          height: heightWidget.value as number
        })
      : undefined,
    getZoomScale: () => app.canvas?.ds?.scale ?? 1,
    onContextMenu: (event) => {
      const menuOptions = app.canvas.getNodeMenuOptions(node)
      new LiteGraph.ContextMenu(menuOptions, {
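The hunks above wire a debounced resize into the zoom watcher via `useDebounceFn` with a 150 ms window, and the tests assert that rapid changes collapse into a single `handleResize` call. A self-contained sketch of those trailing-edge debounce semantics (an illustrative reimplementation, not the @vueuse/core code):

```typescript
// Trailing-edge debounce: every call restarts the wait window, and the
// wrapped function fires once, `waitMs` after the last call in a burst.
function debounce(fn: () => void, waitMs: number): () => void {
  let timer: ReturnType<typeof setTimeout> | undefined
  return () => {
    if (timer !== undefined) clearTimeout(timer) // restart the window
    timer = setTimeout(() => {
      timer = undefined
      fn() // fire once, on the trailing edge
    }, waitMs)
  }
}
```

This is why the tests advance fake timers by exactly 150 ms: the callback must not have fired before the window closes, and three rapid `appScalePercentage` changes produce one call, not three.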
@@ -404,6 +404,23 @@ describe('useLoad3dViewer', () => {
        .intensity
    ).toBe(1)
  })

  it('should preserve unknown fields on Model Config when restoring', async () => {
    const viewer = useLoad3dViewer(mockNode)
    const containerRef = document.createElement('div')

    await viewer.initializeViewer(containerRef, mockSourceLoad3d as Load3d)
    ;(
      mockNode.properties!['Model Config'] as Record<string, unknown>
    ).futureField = 'preserve-me'

    viewer.restoreInitialState()

    expect(
      (mockNode.properties!['Model Config'] as Record<string, unknown>)
        .futureField
    ).toBe('preserve-me')
  })
})

describe('applyChanges', () => {
@@ -457,6 +474,23 @@ describe('useLoad3dViewer', () => {

    expect(result).toBe(false)
  })

  it('should preserve unknown fields on Model Config when applying', async () => {
    const viewer = useLoad3dViewer(mockNode)
    const containerRef = document.createElement('div')

    await viewer.initializeViewer(containerRef, mockSourceLoad3d as Load3d)
    ;(
      mockNode.properties!['Model Config'] as Record<string, unknown>
    ).futureField = 'preserve-me'

    await viewer.applyChanges()

    expect(
      (mockNode.properties!['Model Config'] as Record<string, unknown>)
        .futureField
    ).toBe('preserve-me')
  })
})

describe('refreshViewport', () => {

@@ -619,7 +619,11 @@ export const useLoad3dViewer = (node?: LGraphNode) => {
      intensity: initialState.value.lightIntensity
    }

    const existingModelConfig = nodeValue.properties['Model Config'] as
      | ModelConfig
      | undefined
    nodeValue.properties['Model Config'] = {
      ...existingModelConfig,
      upDirection: initialState.value.upDirection,
      materialMode: initialState.value.materialMode,
      gizmo: {
@@ -671,10 +675,13 @@ export const useLoad3dViewer = (node?: LGraphNode) => {
    }

    const gizmoTransform = load3d.getGizmoTransform()
    const existingModelConfig = nodeValue.properties['Model Config'] as
      | ModelConfig
      | undefined
    nodeValue.properties['Model Config'] = {
      ...existingModelConfig,
      upDirection: upDirection.value,
      materialMode: materialMode.value,
      showSkeleton: false,
      gizmo: {
        enabled: gizmoEnabled.value,
        mode: gizmoMode.value,

83
src/composables/usePreventFocusLoss.test.ts
Normal file
@@ -0,0 +1,83 @@
import { afterEach, describe, expect, it } from 'vitest'
import { effectScope, ref } from 'vue'

import { usePreventFocusLoss } from './usePreventFocusLoss'

function setup(container: HTMLElement) {
  const scope = effectScope()
  const containerRef = ref<HTMLElement | null>(container)
  scope.run(() => usePreventFocusLoss(containerRef))
  document.body.appendChild(container)
  return () => {
    scope.stop()
    container.remove()
  }
}

function fireMousedown(el: Element): MouseEvent {
  const event = new MouseEvent('mousedown', { bubbles: true, cancelable: true })
  el.dispatchEvent(event)
  return event
}

describe('usePreventFocusLoss', () => {
  let teardown: () => void

  afterEach(() => teardown?.())

  it('prevents default on mousedown for a plain div (stops focus theft)', () => {
    const container = document.createElement('div')
    const inner = document.createElement('div')
    container.appendChild(inner)
    teardown = setup(container)

    const event = fireMousedown(inner)

    expect(event.defaultPrevented).toBe(true)
  })

  it('prevents default when clicking a button (click still fires, focus stays on canvas)', () => {
    const container = document.createElement('div')
    const btn = document.createElement('button')
    container.appendChild(btn)
    teardown = setup(container)

    const event = fireMousedown(btn)

    expect(event.defaultPrevented).toBe(true)
  })

  it('does not prevent default when clicking an input', () => {
    const container = document.createElement('div')
    const input = document.createElement('input')
    container.appendChild(input)
    teardown = setup(container)

    const event = fireMousedown(input)

    expect(event.defaultPrevented).toBe(false)
  })

  it('does not prevent default when clicking a textarea', () => {
    const container = document.createElement('div')
    const textarea = document.createElement('textarea')
    container.appendChild(textarea)
    teardown = setup(container)

    const event = fireMousedown(textarea)

    expect(event.defaultPrevented).toBe(false)
  })

  it('does not prevent default when clicking a contenteditable element', () => {
    const container = document.createElement('div')
    const editable = document.createElement('div')
    editable.contentEditable = 'true'
    container.appendChild(editable)
    teardown = setup(container)

    const event = fireMousedown(editable)

    expect(event.defaultPrevented).toBe(false)
  })
})

22
src/composables/usePreventFocusLoss.ts
Normal file
@@ -0,0 +1,22 @@
import type { Ref } from 'vue'
import { useEventListener } from '@vueuse/core'

const FOCUS_ACCEPTING_SELECTOR =
  'input, textarea, select, [contenteditable="true"]'

/**
 * Prevents non-interactive areas of a container from stealing keyboard focus
 * away from the canvas. Call this on "passive" UI regions (tab bar, sidebar
 * icon strip) so that canvas keybindings remain active after the user clicks
 * within those regions.
 *
 * Focus is still allowed to move when the user clicks a genuine text-entry
 * element (input, textarea, contenteditable).
 */
export function usePreventFocusLoss(el: Ref<HTMLElement | null | undefined>) {
  useEventListener(el, 'mousedown', (event: MouseEvent) => {
    if (!(event.target as HTMLElement).closest(FOCUS_ACCEPTING_SELECTOR)) {
      event.preventDefault()
    }
  })
}
@@ -1,7 +1,9 @@
|
||||
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
|
||||
|
||||
import type Load3d from '@/extensions/core/load3d/Load3d'
|
||||
import Load3DConfiguration from '@/extensions/core/load3d/Load3DConfiguration'
|
||||
import Load3DConfiguration, {
|
||||
parseAnnotatedFilename
|
||||
} from '@/extensions/core/load3d/Load3DConfiguration'
|
||||
import Load3dUtils from '@/extensions/core/load3d/Load3dUtils'
|
||||
import type {
|
||||
GizmoConfig,
|
||||
@@ -249,3 +251,47 @@ describe('Load3DConfiguration.silentOnNotFound propagation', () => {
    })
  })
})

describe('parseAnnotatedFilename', () => {
  it('strips a [output] suffix and switches to the output folder', () => {
    expect(parseAnnotatedFilename('foo.glb [output]', 'input')).toEqual({
      filename: 'foo.glb',
      folder: 'output'
    })
  })

  it('strips a [input] suffix and switches to the input folder', () => {
    expect(parseAnnotatedFilename('sub/foo.glb [input]', 'output')).toEqual({
      filename: 'sub/foo.glb',
      folder: 'input'
    })
  })

  it('strips a [temp] suffix and switches to the temp folder', () => {
    expect(parseAnnotatedFilename('foo.glb [temp]', 'input')).toEqual({
      filename: 'foo.glb',
      folder: 'temp'
    })
  })

  it('returns the value unchanged with the fallback folder when unannotated', () => {
    expect(parseAnnotatedFilename('foo.glb', 'input')).toEqual({
      filename: 'foo.glb',
      folder: 'input'
    })
  })

  it('does not strip a non-folder annotation', () => {
    expect(parseAnnotatedFilename('foo.glb [draft]', 'input')).toEqual({
      filename: 'foo.glb [draft]',
      folder: 'input'
    })
  })

  it('only matches a trailing annotation, not one in the middle', () => {
    expect(parseAnnotatedFilename('foo [output] bar.glb', 'input')).toEqual({
      filename: 'foo [output] bar.glb',
      folder: 'input'
    })
  })
})
@@ -24,6 +24,20 @@ type Load3DConfigurationSettings = {
  silentOnNotFound?: boolean
}

const ANNOTATED_FILENAME_PATTERN = / \[(input|output|temp)\]$/

export function parseAnnotatedFilename(
  rawValue: string,
  fallbackFolder: string
): { filename: string; folder: string } {
  const match = ANNOTATED_FILENAME_PATTERN.exec(rawValue)
  if (!match) return { filename: rawValue, folder: fallbackFolder }
  return {
    filename: rawValue.slice(0, match.index),
    folder: match[1]
  }
}

class Load3DConfiguration {
  constructor(
    private load3d: Load3d,
@@ -268,14 +282,17 @@ class Load3DConfiguration {
    return async (value: string | number | boolean | object) => {
      if (!value) return

      const filename = value as string
      const { filename, folder } = parseAnnotatedFilename(
        value as string,
        loadFolder
      )

      this.setResourceFolder(filename)

      const modelUrl = api.apiURL(
        Load3dUtils.getResourceURL(
          ...Load3dUtils.splitFilePath(filename),
          loadFolder
          folder
        )
      )
@@ -280,7 +280,7 @@ describe('Load3d', () => {
    const sceneResize = vi.fn()

    Object.assign(ctx.load3d, {
      renderer: { domElement: canvas, setSize },
      renderer: { domElement: canvas, setSize, setPixelRatio: vi.fn() },
      targetWidth: 400,
      targetHeight: 200,
      targetAspectRatio: 2,
@@ -383,6 +383,70 @@ describe('Load3d', () => {
    expect(args[2]).toBe(800)
    expect(args[3]).toBe(400)
  })

  it('handleResize calls setPixelRatio with the value returned by getZoomScaleCallback', () => {
    delete (ctx.load3d as { handleResize?: unknown }).handleResize

    const parent = document.createElement('div')
    Object.defineProperty(parent, 'clientWidth', {
      value: 400,
      configurable: true
    })
    Object.defineProperty(parent, 'clientHeight', {
      value: 400,
      configurable: true
    })
    const canvas = document.createElement('canvas')
    parent.appendChild(canvas)

    const setPixelRatio = vi.fn()

    Object.assign(ctx.load3d, {
      renderer: { domElement: canvas, setSize: vi.fn(), setPixelRatio },
      getZoomScaleCallback: () => 2.5,
      targetWidth: 0,
      targetHeight: 0,
      isViewerMode: false,
      cameraManager: { ...ctx.cameraManager, handleResize: vi.fn() },
      sceneManager: { ...ctx.sceneManager, handleResize: vi.fn() }
    })

    ctx.load3d.handleResize()

    expect(setPixelRatio).toHaveBeenCalledWith(2.5)
  })

  it('handleResize defaults to pixelRatio 1 when no getZoomScaleCallback is provided', () => {
    delete (ctx.load3d as { handleResize?: unknown }).handleResize

    const parent = document.createElement('div')
    Object.defineProperty(parent, 'clientWidth', {
      value: 400,
      configurable: true
    })
    Object.defineProperty(parent, 'clientHeight', {
      value: 400,
      configurable: true
    })
    const canvas = document.createElement('canvas')
    parent.appendChild(canvas)

    const setPixelRatio = vi.fn()

    Object.assign(ctx.load3d, {
      renderer: { domElement: canvas, setSize: vi.fn(), setPixelRatio },
      getZoomScaleCallback: undefined,
      targetWidth: 0,
      targetHeight: 0,
      isViewerMode: false,
      cameraManager: { ...ctx.cameraManager, handleResize: vi.fn() },
      sceneManager: { ...ctx.sceneManager, handleResize: vi.fn() }
    })

    ctx.load3d.handleResize()

    expect(setPixelRatio).toHaveBeenCalledWith(1)
  })
})

describe('render loop wiring', () => {
@@ -102,6 +102,7 @@ class Load3d {

  private disposeContextMenuGuard: (() => void) | null = null
  private resizeObserver: ResizeObserver | null = null
  private getZoomScaleCallback: (() => number) | undefined

  constructor(
    container: Element | HTMLElement,
@@ -112,6 +113,7 @@ class Load3d {
    this.isViewerMode = options.isViewerMode || false
    this.onContextMenuCallback = options.onContextMenu
    this.getDimensionsCallback = options.getDimensions
    this.getZoomScaleCallback = options.getZoomScale

    if (options.width && options.height) {
      this.targetWidth = options.width
@@ -645,6 +647,11 @@ class Load3d {
    const containerWidth = parentElement.clientWidth
    const containerHeight = parentElement.clientHeight

    // Scale pixel density to match the graph zoom level so the 3D scene
    // renders at the correct resolution when the canvas is zoomed in or out.
    const zoomScale = this.getZoomScaleCallback?.() ?? 1
    this.renderer.setPixelRatio(Math.min(zoomScale, 3))

    if (this.getDimensionsCallback) {
      const dims = this.getDimensionsCallback()
      if (dims) {
@@ -23,8 +23,15 @@ import type {
 */
function isNotFoundError(error: unknown): boolean {
  if (!(error instanceof Error)) return false
  const withResponse = error as Error & { response?: { status?: number } }
  if (withResponse.response?.status === 404) return true
  if (
    'response' in error &&
    typeof error.response === 'object' &&
    error.response !== null &&
    'status' in error.response &&
    error.response.status === 404
  ) {
    return true
  }
  return /\b404\b/.test(error.message)
}
@@ -25,6 +25,37 @@ vi.mock('three', async (importOriginal) => {
  return { ...actual, TextureLoader: StubTextureLoader }
})

vi.mock('three/examples/jsm/controls/OrbitControls', () => {
  class OrbitControls {}
  return { OrbitControls }
})

function makeMockRenderer(pixelRatio = 1): THREE.WebGLRenderer {
  const domElement = {
    toDataURL: vi.fn().mockReturnValue('data:image/png;base64,abc'),
    clientWidth: 400,
    clientHeight: 300
  }
  return {
    domElement,
    outputColorSpace: THREE.SRGBColorSpace,
    toneMapping: THREE.ACESFilmicToneMapping,
    toneMappingExposure: 1,
    getSize: vi.fn((v: THREE.Vector2) => {
      v.set(400, 300)
      return v
    }),
    getPixelRatio: vi.fn().mockReturnValue(pixelRatio),
    getClearColor: vi.fn((c: THREE.Color) => c),
    getClearAlpha: vi.fn().mockReturnValue(0),
    setPixelRatio: vi.fn(),
    setSize: vi.fn(),
    setClearColor: vi.fn(),
    clear: vi.fn(),
    render: vi.fn()
  } as unknown as THREE.WebGLRenderer
}

function makeMockEventManager() {
  return {
    addEventListener: vi.fn(),
@@ -50,6 +81,12 @@ function makeRenderer() {
    domElement: canvas,
    setClearColor: vi.fn(),
    setSize: vi.fn(),
    getSize: vi.fn((v: THREE.Vector2) => {
      v.set(800, 600)
      return v
    }),
    getPixelRatio: vi.fn().mockReturnValue(1),
    setPixelRatio: vi.fn(),
    render: vi.fn(),
    clear: vi.fn(),
    getClearColor: vi.fn().mockReturnValue(new THREE.Color(0xffffff)),
@@ -544,3 +581,90 @@ describe('SceneManager', () => {
    })
  })
})

function makeSceneManager(
  pixelRatio = 1,
  cameraOverride?: THREE.PerspectiveCamera | THREE.OrthographicCamera
) {
  const renderer = makeMockRenderer(pixelRatio)
  const camera = cameraOverride ?? new THREE.PerspectiveCamera()
  const eventManager = makeMockEventManager()
  const manager = new SceneManager(
    renderer,
    () => camera,
    vi.fn() as unknown as () => InstanceType<
      typeof import('three/examples/jsm/controls/OrbitControls').OrbitControls
    >,
    eventManager
  )
  return { manager, renderer, camera, eventManager }
}

describe('SceneManager.captureScene', () => {
  beforeEach(() => {
    vi.clearAllMocks()
  })

  it('resolves with scene, mask, and normal data URLs', async () => {
    const { manager } = makeSceneManager()
    const result = await manager.captureScene(800, 600)
    expect(result.scene).toContain('data:image/png')
    expect(result.mask).toContain('data:image/png')
    expect(result.normal).toContain('data:image/png')
  })

  it('forces pixel ratio to 1 before rendering regardless of original value', async () => {
    const { manager, renderer } = makeSceneManager(2)
    await manager.captureScene(800, 600)
    expect(vi.mocked(renderer.setPixelRatio).mock.calls[0]).toEqual([1])
  })

  it('restores original pixel ratio after capture completes', async () => {
    const originalPixelRatio = 3
    const { manager, renderer } = makeSceneManager(originalPixelRatio)
    await manager.captureScene(800, 600)
    const calls = vi.mocked(renderer.setPixelRatio).mock.calls
    expect(calls.at(-1)).toEqual([originalPixelRatio])
  })

  it('renders at requested capture dimensions', async () => {
    const { manager, renderer } = makeSceneManager()
    await manager.captureScene(1920, 1080)
    expect(vi.mocked(renderer.setSize).mock.calls[0]).toEqual([1920, 1080])
  })

  it('restores original renderer size after capture', async () => {
    const { manager, renderer } = makeSceneManager()
    await manager.captureScene(1920, 1080)
    const calls = vi.mocked(renderer.setSize).mock.calls
    expect(calls.at(-1)).toEqual([400, 300])
  })

  it('restores perspective camera aspect after capture', async () => {
    const camera = new THREE.PerspectiveCamera(75, 1, 0.1, 1000)
    const { manager } = makeSceneManager(1, camera)
    const originalAspect = camera.aspect
    await manager.captureScene(800, 600)
    expect(camera.aspect).toBe(originalAspect)
  })

  it('restores orthographic camera bounds after capture', async () => {
    const camera = new THREE.OrthographicCamera(-5, 5, 5, -5, 0.1, 1000)
    const { manager } = makeSceneManager(1, camera)
    await manager.captureScene(800, 600)
    expect(camera.left).toBe(-5)
    expect(camera.right).toBe(5)
    expect(camera.top).toBe(5)
    expect(camera.bottom).toBe(-5)
  })

  it('disposes each temporary MeshNormalMaterial after the normal pass', async () => {
    const { manager } = makeSceneManager()
    manager.scene.add(
      new THREE.Mesh(new THREE.BoxGeometry(), new THREE.MeshBasicMaterial())
    )
    const disposeSpy = vi.spyOn(THREE.MeshNormalMaterial.prototype, 'dispose')
    await manager.captureScene(800, 600)
    expect(disposeSpy).toHaveBeenCalledOnce()
  })
})
@@ -332,120 +332,138 @@ export class SceneManager implements SceneManagerInterface {
    }
  }

  captureScene(
  async captureScene(
    width: number,
    height: number
  ): Promise<{ scene: string; mask: string; normal: string }> {
    return new Promise(async (resolve, reject) => {
      try {
        const originalWidth = this.renderer.domElement.width
        const originalHeight = this.renderer.domElement.height
        const originalClearColor = this.renderer.getClearColor(
          new THREE.Color()
        )
        const originalClearAlpha = this.renderer.getClearAlpha()
        const originalOutputColorSpace = this.renderer.outputColorSpace
    const originalSize = new THREE.Vector2()
    this.renderer.getSize(originalSize)
    const originalPixelRatio = this.renderer.getPixelRatio()
    const originalClearColor = this.renderer.getClearColor(new THREE.Color())
    const originalClearAlpha = this.renderer.getClearAlpha()
    const originalOutputColorSpace = this.renderer.outputColorSpace

        this.renderer.setSize(width, height)

        if (this.getActiveCamera() instanceof THREE.PerspectiveCamera) {
          const perspectiveCamera =
            this.getActiveCamera() as THREE.PerspectiveCamera

          perspectiveCamera.aspect = width / height
          perspectiveCamera.updateProjectionMatrix()
        } else {
          const orthographicCamera =
            this.getActiveCamera() as THREE.OrthographicCamera

          const frustumSize = 10
          const aspect = width / height

          orthographicCamera.left = (-frustumSize * aspect) / 2
          orthographicCamera.right = (frustumSize * aspect) / 2
          orthographicCamera.top = frustumSize / 2
          orthographicCamera.bottom = -frustumSize / 2

          orthographicCamera.updateProjectionMatrix()
        }

        if (
          this.backgroundTexture &&
          this.backgroundMesh &&
          this.currentBackgroundType === 'image'
        ) {
          this.updateBackgroundSize(
            this.backgroundTexture,
            this.backgroundMesh,
            width,
            height
          )
        }

        const originalMaterials = new Map<
          THREE.Mesh,
          THREE.Material | THREE.Material[]
        >()

        this.renderer.clear()
        this.renderBackground()
        this.renderer.render(this.scene, this.getActiveCamera())
        const sceneData = this.renderer.domElement.toDataURL('image/png')

        this.renderer.setClearColor(0x000000, 0)
        this.renderer.clear()
        this.renderer.render(this.scene, this.getActiveCamera())
        const maskData = this.renderer.domElement.toDataURL('image/png')

        this.scene.traverse((child) => {
          if (child instanceof THREE.Mesh) {
            originalMaterials.set(child, child.material)

            child.material = new THREE.MeshNormalMaterial({
              flatShading: false,
              side: THREE.DoubleSide,
              normalScale: new THREE.Vector2(1, 1)
            })
    const activeCamera = this.getActiveCamera()
    const savedCameraParams =
      activeCamera instanceof THREE.PerspectiveCamera
        ? { type: 'perspective' as const, aspect: activeCamera.aspect }
        : {
            type: 'orthographic' as const,
            left: (activeCamera as THREE.OrthographicCamera).left,
            right: (activeCamera as THREE.OrthographicCamera).right,
            top: (activeCamera as THREE.OrthographicCamera).top,
            bottom: (activeCamera as THREE.OrthographicCamera).bottom
          }
        })

        const gridVisible = this.gridHelper.visible
        this.gridHelper.visible = false
    const originalMaterials = new Map<
      THREE.Mesh,
      THREE.Material | THREE.Material[]
    >()
    const tempMaterials: THREE.MeshNormalMaterial[] = []
    const gridVisible = this.gridHelper.visible

        this.renderer.setClearColor(0x000000, 1)
        this.renderer.clear()
        this.renderer.render(this.scene, this.getActiveCamera())
        const normalData = this.renderer.domElement.toDataURL('image/png')
    try {
      // Capture at exactly the requested pixel dimensions, independent of
      // the current zoom-driven pixel ratio.
      this.renderer.setPixelRatio(1)
      this.renderer.setSize(width, height)

        this.scene.traverse((child) => {
          if (child instanceof THREE.Mesh) {
            const originalMaterial = originalMaterials.get(child)
            if (originalMaterial) {
              child.material = originalMaterial
            }
          }
        })
      if (activeCamera instanceof THREE.PerspectiveCamera) {
        activeCamera.aspect = width / height
        activeCamera.updateProjectionMatrix()
      } else {
        const orthographicCamera = activeCamera as THREE.OrthographicCamera

        this.renderer.setClearColor(0xffffff, 1)
        this.renderer.clear()
        const frustumSize = 10
        const aspect = width / height

        this.gridHelper.visible = gridVisible
        orthographicCamera.left = (-frustumSize * aspect) / 2
        orthographicCamera.right = (frustumSize * aspect) / 2
        orthographicCamera.top = frustumSize / 2
        orthographicCamera.bottom = -frustumSize / 2

        this.renderer.setClearColor(originalClearColor, originalClearAlpha)
        this.renderer.setSize(originalWidth, originalHeight)
        this.renderer.outputColorSpace = originalOutputColorSpace

        this.handleResize(originalWidth, originalHeight)

        resolve({
          scene: sceneData,
          mask: maskData,
          normal: normalData
        })
      } catch (error) {
        reject(error)
        orthographicCamera.updateProjectionMatrix()
      }
      })

      if (
        this.backgroundTexture &&
        this.backgroundMesh &&
        this.currentBackgroundType === 'image'
      ) {
        this.updateBackgroundSize(
          this.backgroundTexture,
          this.backgroundMesh,
          width,
          height
        )
      }

      this.renderer.clear()
      this.renderBackground()
      this.renderer.render(this.scene, activeCamera)
      const sceneData = this.renderer.domElement.toDataURL('image/png')

      this.renderer.setClearColor(0x000000, 0)
      this.renderer.clear()
      this.renderer.render(this.scene, activeCamera)
      const maskData = this.renderer.domElement.toDataURL('image/png')

      this.scene.traverse((child) => {
        if (child instanceof THREE.Mesh) {
          originalMaterials.set(child, child.material)

          const tempMaterial = new THREE.MeshNormalMaterial({
            flatShading: false,
            side: THREE.DoubleSide,
            normalScale: new THREE.Vector2(1, 1)
          })
          tempMaterials.push(tempMaterial)
          child.material = tempMaterial
        }
      })

      this.gridHelper.visible = false

      this.renderer.setClearColor(0x000000, 1)
      this.renderer.clear()
      this.renderer.render(this.scene, activeCamera)
      const normalData = this.renderer.domElement.toDataURL('image/png')

      this.renderer.setClearColor(0xffffff, 1)
      this.renderer.clear()

      return { scene: sceneData, mask: maskData, normal: normalData }
    } finally {
      this.scene.traverse((child) => {
        if (child instanceof THREE.Mesh) {
          const originalMaterial = originalMaterials.get(child)
          if (originalMaterial) {
            child.material = originalMaterial
          }
        }
      })
      for (const mat of tempMaterials) {
        mat.dispose()
      }
      this.gridHelper.visible = gridVisible
      if (savedCameraParams.type === 'perspective') {
        const persp = activeCamera as THREE.PerspectiveCamera
        persp.aspect = savedCameraParams.aspect
        persp.updateProjectionMatrix()
      } else {
        const ortho = activeCamera as THREE.OrthographicCamera
        ortho.left = savedCameraParams.left
        ortho.right = savedCameraParams.right
        ortho.top = savedCameraParams.top
        ortho.bottom = savedCameraParams.bottom
        ortho.updateProjectionMatrix()
      }
      this.renderer.setClearColor(originalClearColor, originalClearAlpha)
      this.renderer.setPixelRatio(originalPixelRatio)
      this.renderer.setSize(originalSize.x, originalSize.y)
      this.renderer.outputColorSpace = originalOutputColorSpace
      this.handleResize(originalSize.x, originalSize.y)
    }
  }

  reset(): void {}
@@ -211,20 +211,40 @@ describe('SceneModelManager', () => {
    expect(setupCamera).toHaveBeenCalled()
  })

  it('does not skip materialMode when it differs from original', async () => {
  it('reapplies non-original materialMode after snapshotting', async () => {
    const { manager } = createManager()
    const model = createMeshModel()

    // setupModel checks materialMode !== 'original' and calls
    // setMaterialMode, but the guard `mode === this.materialMode`
    // causes it to no-op. Then setupModelMaterials resets to 'original'.
    // setupModel calls setupModelMaterials first (which internally calls
    // setMaterialMode('original') to reset), then reapplies the stored mode.
    manager.materialMode = 'wireframe'
    const spy = vi.spyOn(manager, 'setMaterialMode')
    await manager.setupModel(model)

    // setMaterialMode is called with the stored mode and then 'original'
    expect(spy).toHaveBeenCalledWith('wireframe')
    expect(spy).toHaveBeenCalledWith('original')
    expect(spy).toHaveBeenCalledWith('wireframe')
    // The final material mode visible on the mesh should be wireframe.
    const mesh = model.children[0] as THREE.Mesh
    expect((mesh.material as THREE.MeshBasicMaterial).wireframe).toBe(true)
  })

  it('snapshots original materials before applying materialMode so restore is correct', async () => {
    const { manager } = createManager()
    const model = createMeshModel()
    const mesh = model.children[0] as THREE.Mesh
    const originalMat = mesh.material

    // Set a non-original mode before loading — this was the bug:
    // originalMaterials would capture the wireframe material instead of the real one.
    manager.materialMode = 'wireframe'
    await manager.setupModel(model)

    // The snapshot must hold the *pre-mutation* material.
    expect(manager.originalMaterials.get(mesh)).toBe(originalMat)

    // Restoring to 'original' must give back the true original, not wireframe.
    manager.setMaterialMode('original')
    expect(mesh.material).toBe(originalMat)
  })

  it('applies current up direction if not original', async () => {
@@ -679,6 +699,80 @@ describe('SceneModelManager', () => {
    })
  })

  describe('fitToViewer', () => {
    it('does nothing when no current model', () => {
      const { manager, setupCamera, setupGizmo } = createManager()

      manager.fitToViewer()

      expect(setupCamera).not.toHaveBeenCalled()
      expect(setupGizmo).not.toHaveBeenCalled()
    })

    it('reapplies currentUpDirection after fitting', async () => {
      const { manager, eventManager } = createManager()
      const model = createMeshModel()
      await manager.setupModel(model)

      manager.setUpDirection('+z')
      vi.mocked(eventManager.emitEvent).mockClear()

      manager.fitToViewer()

      // rotation.x should reflect +z direction (-PI/2) applied to the post-fit base (0,0,0)
      expect(model.rotation.x).toBeCloseTo(-Math.PI / 2)
      expect(eventManager.emitEvent).toHaveBeenCalledWith(
        'upDirectionChange',
        '+z'
      )
    })

    it('does not compound rotations when fitToViewer is called multiple times', async () => {
      const { manager } = createManager()
      const model = createMeshModel()
      await manager.setupModel(model)

      manager.setUpDirection('-x')

      manager.fitToViewer()
      const rotationAfterFirst = model.rotation.z

      manager.fitToViewer()
      expect(model.rotation.z).toBeCloseTo(rotationAfterFirst)
    })

    it('leaves rotation at zero when currentUpDirection is original', async () => {
      const { manager } = createManager()
      const model = createMeshModel()
      await manager.setupModel(model)

      manager.fitToViewer()

      expect(model.rotation.x).toBeCloseTo(0)
      expect(model.rotation.y).toBeCloseTo(0)
      expect(model.rotation.z).toBeCloseTo(0)
    })

    it('does not compound rotation when fitToViewer is called after manual rotation override', async () => {
      const { manager } = createManager()
      const model = createMeshModel()
      await manager.setupModel(model)

      // Set an up direction, then manually override originalRotation to simulate
      // a prior state where the base rotation was non-zero before fit
      manager.setUpDirection('+x')
      // Simulate that originalRotation was captured at a non-zero rotation
      manager.originalRotation = new THREE.Euler(0.5, 0.3, 0.1)

      manager.fitToViewer()

      // After fit, the rotation should be correct for +x direction applied to (0,0,0) base
      // Not compounded with the stale originalRotation
      expect(model.rotation.x).toBeCloseTo(0)
      expect(model.rotation.z).toBeCloseTo(-Math.PI / 2)
    })
  })

  describe('PLY mode switching', () => {
    function createPLYManager() {
      const ctx = createManager({
Some files were not shown because too many files have changed in this diff.