initial commit
extensions-builtin/sd_forge_controlnet/.gitignore | 185 (vendored, executable file)
@@ -0,0 +1,185 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea
*.pt
*.pth
*.ckpt
*.bin
*.safetensors

# Editor setting metadata
.idea/
.vscode/
detected_maps/
annotator/downloads/

# test results and expectations
web_tests/results/
web_tests/expectations/
tests/web_api/full_coverage/results/
tests/web_api/full_coverage/expectations/

*_diff.png

# Presets
presets/

# Ignore existing dir of hand refiner if exists.
annotator/hand_refiner_portable
extensions-builtin/sd_forge_controlnet/LICENSE | 674 (executable file)
@@ -0,0 +1,674 @@
                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works.  By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.  We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors.  You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price.  Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights.  Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received.  You must make sure that they, too, receive
or can get the source code.  And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software.  For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so.  This is fundamentally incompatible with the aim of
protecting users' freedom to change the software.  The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable.  Therefore, we
have designed this version of the GPL to prohibit the practice for those
products.  If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary.  To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License.  Each licensee is addressed as "you".  "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy.  The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy.  Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies.  Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License.  If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it.  "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form.  A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities.  However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work.  For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met.  This License explicitly affirms your unlimited
permission to run the unmodified Program.  The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work.  This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force.  You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright.  Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below.  Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7.  This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy.  This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged.  This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit.  Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source.  This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge.  You need not require recipients to copy the
    Corresponding Source along with the object code.  If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source.  Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling.  In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage.  For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product.  A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source.  The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information.  But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed.  Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law.  If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it.  (Additional permissions may be written to require their own
removal in certain cases when you modify the work.)  You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10.  If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term.  If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License.  Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License.  If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program.  Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance.  However,
nothing other than this License grants you permission to propagate or
modify any covered work.  These actions infringe copyright if you do
not accept this License.  Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License.  You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations.  If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License.  For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based.  The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version.  For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement).  To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
|
||||
available, or (2) arrange to deprive yourself of the benefit of the
|
||||
patent license for this particular work, or (3) arrange, in a manner
|
||||
consistent with the requirements of this License, to extend the patent
|
||||
license to downstream recipients. "Knowingly relying" means you have
|
||||
actual knowledge that, but for the patent license, your conveying the
|
||||
covered work in a country, or your recipient's use of the covered work
|
||||
in a country, would infringe one or more identifiable patents in that
|
||||
country that you have reason to believe are valid.
|
||||
|
||||
If, pursuant to or in connection with a single transaction or
|
||||
arrangement, you convey, or propagate by procuring conveyance of, a
|
||||
covered work, and grant a patent license to some of the parties
|
||||
receiving the covered work authorizing them to use, propagate, modify
|
||||
or convey a specific copy of the covered work, then the patent license
|
||||
you grant is automatically extended to all recipients of the covered
|
||||
work and works based on it.
|
||||
|
||||
A patent license is "discriminatory" if it does not include within
|
||||
the scope of its coverage, prohibits the exercise of, or is
|
||||
conditioned on the non-exercise of one or more of the rights that are
|
||||
specifically granted under this License. You may not convey a covered
|
||||
work if you are a party to an arrangement with a third party that is
|
||||
in the business of distributing software, under which you make payment
|
||||
to the third party based on the extent of your activity of conveying
|
||||
the work, and under which the third party grants, to any of the
|
||||
parties who would receive the covered work from you, a discriminatory
|
||||
patent license (a) in connection with copies of the covered work
|
||||
conveyed by you (or copies made from those copies), or (b) primarily
|
||||
for and in connection with specific products or compilations that
|
||||
contain the covered work, unless you entered into that arrangement,
|
||||
or that patent license was granted, prior to 28 March 2007.
|
||||
|
||||
Nothing in this License shall be construed as excluding or limiting
|
||||
any implied license or other defenses to infringement that may
|
||||
otherwise be available to you under applicable patent law.
|
||||
|
||||
12. No Surrender of Others' Freedom.
|
||||
|
||||
If conditions are imposed on you (whether by court order, agreement or
|
||||
otherwise) that contradict the conditions of this License, they do not
|
||||
excuse you from the conditions of this License. If you cannot convey a
|
||||
covered work so as to satisfy simultaneously your obligations under this
|
||||
License and any other pertinent obligations, then as a consequence you may
|
||||
not convey it at all. For example, if you agree to terms that obligate you
|
||||
to collect a royalty for further conveying from those to whom you convey
|
||||
the Program, the only way you could satisfy both those terms and this
|
||||
License would be to refrain entirely from conveying the Program.
|
||||
|
||||
13. Use with the GNU Affero General Public License.
|
||||
|
||||
Notwithstanding any other provision of this License, you have
|
||||
permission to link or combine any covered work with a work licensed
|
||||
under version 3 of the GNU Affero General Public License into a single
|
||||
combined work, and to convey the resulting work. The terms of this
|
||||
License will continue to apply to the part which is the covered work,
|
||||
but the special requirements of the GNU Affero General Public License,
|
||||
section 13, concerning interaction through a network will apply to the
|
||||
combination as such.
|
||||
|
||||
14. Revised Versions of this License.
|
||||
|
||||
The Free Software Foundation may publish revised and/or new versions of
|
||||
the GNU General Public License from time to time. Such new versions will
|
||||
be similar in spirit to the present version, but may differ in detail to
|
||||
address new problems or concerns.
|
||||
|
||||
Each version is given a distinguishing version number. If the
|
||||
Program specifies that a certain numbered version of the GNU General
|
||||
Public License "or any later version" applies to it, you have the
|
||||
option of following the terms and conditions either of that numbered
|
||||
version or of any later version published by the Free Software
|
||||
Foundation. If the Program does not specify a version number of the
|
||||
GNU General Public License, you may choose any version ever published
|
||||
by the Free Software Foundation.
|
||||
|
||||
If the Program specifies that a proxy can decide which future
|
||||
versions of the GNU General Public License can be used, that proxy's
|
||||
public statement of acceptance of a version permanently authorizes you
|
||||
to choose that version for the Program.
|
||||
|
||||
Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
15. Disclaimer of Warranty.
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
16. Limitation of Liability.
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
17. Interpretation of Sections 15 and 16.
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) <year> <name of author>
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
<program> Copyright (C) <year> <name of author>
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<https://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<https://www.gnu.org/licenses/why-not-lgpl.html>.
|
||||
66
extensions-builtin/sd_forge_controlnet/install.py
Executable file
@@ -0,0 +1,66 @@
import launch
import pkg_resources
import sys
import os
import shutil
import platform
from pathlib import Path
from typing import Tuple, Optional


repo_root = Path(__file__).parent
main_req_file = repo_root / "requirements.txt"


def comparable_version(version: str) -> Tuple:
    return tuple(map(int, version.split(".")))


def get_installed_version(package: str) -> Optional[str]:
    try:
        return pkg_resources.get_distribution(package).version
    except Exception:
        return None


def extract_base_package(package_string: str) -> str:
    base_package = package_string.split("@git")[0]
    return base_package


def install_requirements(req_file):
    with open(req_file) as file:
        for package in file:
            try:
                package = package.strip()
                if "==" in package:
                    package_name, package_version = package.split("==")
                    installed_version = get_installed_version(package_name)
                    if installed_version != package_version:
                        launch.run_pip(
                            f"install -U {package}",
                            f"sd-forge-controlnet requirement: changing {package_name} version from {installed_version} to {package_version}",
                        )
                elif ">=" in package:
                    package_name, package_version = package.split(">=")
                    installed_version = get_installed_version(package_name)
                    if not installed_version or comparable_version(
                        installed_version
                    ) < comparable_version(package_version):
                        launch.run_pip(
                            f"install -U {package}",
                            f"sd-forge-controlnet requirement: changing {package_name} version from {installed_version} to {package_version}",
                        )
                elif not launch.is_installed(extract_base_package(package)):
                    launch.run_pip(
                        f"install {package}",
                        f"sd-forge-controlnet requirement: {package}",
                    )
            except Exception as e:
                print(e)
                print(
                    f"Warning: Failed to install {package}, some preprocessors may not work."
                )


install_requirements(main_req_file)
403
extensions-builtin/sd_forge_controlnet/javascript/active_units.js
Executable file
@@ -0,0 +1,403 @@
/**
 * Give a badge on ControlNet Accordion indicating total number of active
 * units.
 * Make active unit's tab name green.
 * Append control type to tab name.
 * Disable resize mode selection when A1111 img2img input is used.
 */
(function () {
    const cnetAllAccordions = new Set();
    onUiUpdate(() => {
        const ImgChangeType = {
            NO_CHANGE: 0,
            REMOVE: 1,
            ADD: 2,
            SRC_CHANGE: 3,
        };

        function imgChangeObserved(mutationsList) {
            // Iterate over all mutations that just occurred
            for (let mutation of mutationsList) {
                // Check if the mutation is an addition or removal of a node
                if (mutation.type === 'childList') {
                    // Check if nodes were added
                    if (mutation.addedNodes.length > 0) {
                        for (const node of mutation.addedNodes) {
                            if (node.tagName === 'IMG') {
                                return ImgChangeType.ADD;
                            }
                        }
                    }

                    // Check if nodes were removed
                    if (mutation.removedNodes.length > 0) {
                        for (const node of mutation.removedNodes) {
                            if (node.tagName === 'IMG') {
                                return ImgChangeType.REMOVE;
                            }
                        }
                    }
                }
                // Check if the mutation is a change of an attribute
                else if (mutation.type === 'attributes') {
                    if (mutation.target.tagName === 'IMG' && mutation.attributeName === 'src') {
                        return ImgChangeType.SRC_CHANGE;
                    }
                }
            }
            return ImgChangeType.NO_CHANGE;
        }

        function childIndex(element) {
            // Get all child nodes of the parent
            let children = Array.from(element.parentNode.childNodes);

            // Filter out non-element nodes (like text nodes and comments)
            children = children.filter(child => child.nodeType === Node.ELEMENT_NODE);

            return children.indexOf(element);
        }

        function imageInputDisabledAlert() {
            alert('Inpaint control type must use a1111 input in img2img mode.');
        }

        class ControlNetUnitTab {
            constructor(tab, accordion) {
                this.tab = tab;
                this.tabOpen = false; // Whether the tab is open.
                this.accordion = accordion;
                this.isImg2Img = tab.querySelector('.cnet-mask-upload').id.includes('img2img');

                this.enabledAccordionCheckbox = tab.querySelector('.input-accordion-checkbox');
                this.enabledCheckbox = tab.querySelector('.cnet-unit-enabled input');
                this.inputImage = tab.querySelector('.cnet-input-image-group .cnet-image input[type="file"]');
                this.inputImageContainer = tab.querySelector('.cnet-input-image-group .cnet-image');
                this.generatedImageGroup = tab.querySelector('.cnet-generated-image-group');
                this.maskImageGroup = tab.querySelector('.cnet-mask-image-group');
                this.inputImageGroup = tab.querySelector('.cnet-input-image-group');
                this.controlTypeRadios = tab.querySelectorAll('.controlnet_control_type_filter_group input[type="radio"]');
                this.resizeModeRadios = tab.querySelectorAll('.controlnet_resize_mode_radio input[type="radio"]');
                this.runPreprocessorButton = tab.querySelector('.cnet-run-preprocessor');

                this.tabs = tab.parentNode;
                this.tabIndex = childIndex(tab);

                // By default the InputAccordion checkbox is linked with the
                // accordion's open/close state. To break this link, we can
                // simulate a click to check the checkbox and then uncheck it.
                this.enabledAccordionCheckbox.click();
                this.enabledAccordionCheckbox.click();

                this.sync_enabled_checkbox();
                this.attachEnabledButtonListener();
                this.attachControlTypeRadioListener();
                this.attachImageUploadListener();
                this.attachImageStateChangeObserver();
                this.attachA1111SendInfoObserver();
                this.attachAccordionStateObserver();
            }

            /**
             * Sync the states of enabledCheckbox and enabledAccordionCheckbox.
             */
            sync_enabled_checkbox() {
                this.enabledCheckbox.addEventListener("change", () => {
                    if (this.enabledAccordionCheckbox.checked != this.enabledCheckbox.checked) {
                        this.enabledAccordionCheckbox.click();
                    }
                });
                this.enabledAccordionCheckbox.addEventListener("change", () => {
                    if (this.enabledCheckbox.checked != this.enabledAccordionCheckbox.checked) {
                        this.enabledCheckbox.click();
                    }
                });
            }

            /**
             * Get the span that has text "Unit {X}".
             */
            getUnitHeaderTextElement() {
                return this.tab.querySelector(
                    `button > span:nth-child(1)`
                );
            }

            getActiveControlType() {
                for (let radio of this.controlTypeRadios) {
                    if (radio.checked) {
                        return radio.value;
                    }
                }
                return undefined;
            }

            updateActiveState() {
                const unitHeader = this.getUnitHeaderTextElement();
                if (!unitHeader) return;

                if (this.enabledCheckbox.checked) {
                    unitHeader.classList.add('cnet-unit-active');
                } else {
                    unitHeader.classList.remove('cnet-unit-active');
                }
            }

            updateActiveUnitCount() {
                function getActiveUnitCount(checkboxes) {
                    let activeUnitCount = 0;
                    for (const checkbox of checkboxes) {
                        if (checkbox.checked)
                            activeUnitCount++;
                    }
                    return activeUnitCount;
                }

                const checkboxes = this.accordion.querySelectorAll('.cnet-unit-enabled input');
                const span = this.accordion.querySelector('.label-wrap span');

                // Remove existing badge.
                if (span.childNodes.length !== 1) {
                    span.removeChild(span.lastChild);
                }
                // Add new badge if necessary.
                const activeUnitCount = getActiveUnitCount(checkboxes);
                if (activeUnitCount > 0) {
                    const div = document.createElement('div');
                    div.classList.add('cnet-badge');
                    div.classList.add('primary');
                    div.innerHTML = `${activeUnitCount} unit${activeUnitCount > 1 ? 's' : ''}`;
                    span.appendChild(div);
                }
            }

            /**
             * Add the active control type to tab displayed text.
             */
            updateActiveControlType() {
                const unitHeader = this.getUnitHeaderTextElement();
                if (!unitHeader) return;

                // Remove the control type suffix if it exists.
                const controlTypeSuffix = unitHeader.querySelector('.control-type-suffix');
                if (controlTypeSuffix) controlTypeSuffix.remove();

                // Add new suffix.
                const controlType = this.getActiveControlType();
                if (controlType === 'All') return;

                const span = document.createElement('span');
                span.innerHTML = `[${controlType}]`;
                span.classList.add('control-type-suffix');
                unitHeader.appendChild(span);
            }

            getInputImageSrc() {
                const img = this.inputImageGroup.querySelector('.cnet-image .forge-image');
                return (img && img.src.startsWith('data')) ? img.src : null;
            }

            getPreprocessorPreviewImageSrc() {
                const img = this.generatedImageGroup.querySelector('.cnet-image .forge-image');
                return (img && img.src.startsWith('data')) ? img.src : null;
            }

            getMaskImageSrc() {
                function isEmptyCanvas(canvas) {
                    if (!canvas) return true;
                    if (canvas.width == 0 || canvas.height == 0) return true;
                    const ctx = canvas.getContext('2d');
                    // Get the image data
                    const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
                    const data = imageData.data; // This is a Uint8ClampedArray
                    // Check each pixel
                    let isPureBlack = true;
                    for (let i = 0; i < data.length; i += 4) {
                        if (data[i] !== 0 || data[i + 1] !== 0 || data[i + 2] !== 0) { // Check RGB values
                            isPureBlack = false;
                            break;
                        }
                    }
                    return isPureBlack;
                }
                const maskImg = this.maskImageGroup.querySelector('.cnet-mask-image .forge-image');
                // Hand-drawn mask on mask upload.
                const handDrawnMaskCanvas = this.maskImageGroup.querySelector('.cnet-mask-image .forge-drawing-canvas');
                // Hand-drawn mask on input image upload.
                const inputImageHandDrawnMaskCanvas = this.inputImageGroup.querySelector('.cnet-image .forge-drawing-canvas');
                if (!isEmptyCanvas(handDrawnMaskCanvas)) {
                    return handDrawnMaskCanvas.toDataURL();
                } else if (maskImg && maskImg.src.startsWith('data')) {
                    return maskImg.src;
                } else if (!isEmptyCanvas(inputImageHandDrawnMaskCanvas)) {
                    return inputImageHandDrawnMaskCanvas.toDataURL();
                } else {
                    return null;
                }
            }

            setThumbnail(imgSrc, maskSrc) {
                if (!imgSrc) return;
                const unitHeader = this.getUnitHeaderTextElement();
                if (!unitHeader) return;
                const img = document.createElement('img');
                img.src = imgSrc;
                img.classList.add('cnet-thumbnail');
                unitHeader.appendChild(img);

                if (maskSrc) {
                    const mask = document.createElement('img');
                    mask.src = maskSrc;
                    mask.classList.add('cnet-thumbnail');
                    unitHeader.appendChild(mask);
                }
            }

            removeThumbnail() {
                const unitHeader = this.getUnitHeaderTextElement();
                if (!unitHeader) return;
                const imgs = unitHeader.querySelectorAll('.cnet-thumbnail');
                for (const img of imgs) {
                    img.remove();
                }
            }

            /**
             * When the accordion is folded, display a thumbnail of the input
             * image and mask on the accordion header.
             */
            updateInputImageThumbnail() {
                if (!opts.controlnet_input_thumbnail) return;
                if (this.tabOpen) {
                    this.removeThumbnail();
                } else {
                    this.setThumbnail(this.getInputImageSrc(), this.getMaskImageSrc());
                }
            }

            attachEnabledButtonListener() {
                this.enabledCheckbox.addEventListener('change', () => {
                    this.updateActiveState();
                    this.updateActiveUnitCount();
                });
            }

            attachControlTypeRadioListener() {
                for (const radio of this.controlTypeRadios) {
                    radio.addEventListener('change', () => {
                        this.updateActiveControlType();
                    });
                }
            }

            attachImageUploadListener() {
                // Automatically check the `enable` checkbox when an image is uploaded.
                this.inputImage.addEventListener('change', (event) => {
                    if (!event.target.files) return;
                    if (!this.enabledCheckbox.checked)
                        this.enabledCheckbox.click();
                });

                // Automatically check the `enable` checkbox when a JSON pose file is uploaded.
                this.tab.querySelector('.cnet-upload-pose input').addEventListener('change', (event) => {
                    if (!event.target.files) return;
                    if (!this.enabledCheckbox.checked)
                        this.enabledCheckbox.click();
                });
            }

            attachImageStateChangeObserver() {
                new MutationObserver((mutationsList) => {
                    const changeObserved = imgChangeObserved(mutationsList);

                    if (changeObserved === ImgChangeType.ADD) {
                        // Enable the run preprocessor button.
                        this.runPreprocessorButton.removeAttribute("disabled");
                        this.runPreprocessorButton.title = 'Run preprocessor';
                    }

                    if (changeObserved === ImgChangeType.REMOVE) {
                        // Disable the run preprocessor button.
                        this.runPreprocessorButton.setAttribute("disabled", true);
                        this.runPreprocessorButton.title = "No ControlNet input image available";
                    }
                }).observe(this.inputImageContainer, {
                    childList: true,
                    subtree: true,
                });
            }

            /**
             * Observe send PNG info buttons in A1111, as they can also directly
             * set states of ControlNetUnit.
             */
            attachA1111SendInfoObserver() {
                const pasteButtons = gradioApp().querySelectorAll('#paste');
                const pngButtons = gradioApp().querySelectorAll(
                    this.isImg2Img ?
                        '#img2img_tab, #inpaint_tab' :
                        '#txt2img_tab'
                );

                for (const button of [...pasteButtons, ...pngButtons]) {
                    button.addEventListener('click', () => {
                        // The paste/send img generation info feature goes
                        // through gradio, which is pretty slow. Ideally we should
                        // observe the event when gradio has done the job, but
                        // that is not an easy task.
                        // Here we just do a 2-second delay until the refresh.
                        setTimeout(() => {
                            this.updateActiveState();
                            this.updateActiveUnitCount();
                        }, 2000);
                    });
                }
            }

            /**
             * Observer that triggers when the ControlNetUnit's accordion (tab) closes.
             */
            attachAccordionStateObserver() {
                new MutationObserver((mutationsList) => {
                    for (const mutation of mutationsList) {
                        if (mutation.type === 'attributes' && mutation.attributeName === 'class') {
                            const newState = mutation.target.classList.contains('open');
                            if (this.tabOpen != newState) {
                                this.tabOpen = newState;
                                if (newState) {
                                    this.onAccordionOpen();
                                } else {
                                    this.onAccordionClose();
                                }
                            }
                        }
                    }
                }).observe(this.tab.querySelector('.label-wrap'), { attributes: true, attributeFilter: ['class'] });
            }

            onAccordionOpen() {
                this.updateInputImageThumbnail();
            }

            onAccordionClose() {
                this.updateInputImageThumbnail();
            }
        }

        gradioApp().querySelectorAll('#controlnet').forEach(accordion => {
            if (cnetAllAccordions.has(accordion)) return;
            const tabs = [...accordion.querySelectorAll('.input-accordion')]
                .map(tab => new ControlNetUnitTab(tab, accordion));

            // On open of the main extension accordion, if no unit is enabled,
            // open unit 0 for edit.
            const labelWrap = accordion.querySelector('.label-wrap');
            const observerAccordionOpen = new MutationObserver(function (mutations) {
                for (const mutation of mutations) {
                    if (mutation.target.classList.contains('open') &&
                        tabs.every(tab => !tab.enabledCheckbox.checked &&
                            !tab.tab.querySelector('.label-wrap').classList.contains('open'))
                    ) {
                        tabs[0].tab.querySelector('.label-wrap').click();
                    }
                }
            });
            observerAccordionOpen.observe(labelWrap, { attributes: true, attributeFilter: ['class'] });

            cnetAllAccordions.add(accordion);
        });
    });
})();
17
extensions-builtin/sd_forge_controlnet/javascript/canvas.js
Executable file
@@ -0,0 +1,17 @@
(function () {
    var hasApplied = false;
    onUiUpdate(function () {
        if (!hasApplied) {
            if (typeof window.applyZoomAndPanIntegration === "function") {
                hasApplied = true;
                window.applyZoomAndPanIntegration("#txt2img_controlnet", Array.from({ length: 20 }, (_, i) => `#txt2img_controlnet_ControlNet-${i}_input_image`));
                window.applyZoomAndPanIntegration("#img2img_controlnet", Array.from({ length: 20 }, (_, i) => `#img2img_controlnet_ControlNet-${i}_input_image`));
                window.applyZoomAndPanIntegration("#txt2img_controlnet", ["#txt2img_controlnet_ControlNet_input_image"]);
                window.applyZoomAndPanIntegration("#img2img_controlnet", ["#img2img_controlnet_ControlNet_input_image"]);
                //console.log("window.applyZoomAndPanIntegration applied.");
            } else {
                //console.log("window.applyZoomAndPanIntegration is not available.");
            }
        }
    });
})();
33
extensions-builtin/sd_forge_controlnet/javascript/modal.js
Executable file
@@ -0,0 +1,33 @@
(function () {
    const cnetModalRegisteredElements = new Set();
    onUiUpdate(() => {
        // Get all the buttons that open a modal
        const btns = gradioApp().querySelectorAll(".cnet-modal-open");

        // Get all the <span> elements that close a modal
        const spans = document.querySelectorAll(".cnet-modal-close");

        // For each button, add a click event listener that opens the corresponding modal
        btns.forEach((btn) => {
            if (cnetModalRegisteredElements.has(btn)) return;
            cnetModalRegisteredElements.add(btn);

            const modalId = btn.id.replace('cnet-modal-open-', '');
            const modal = document.getElementById("cnet-modal-" + modalId);
            btn.addEventListener('click', () => {
                modal.style.display = "block";
            });
        });

        // For each <span> element, add a click event listener that closes the corresponding modal
        spans.forEach((span) => {
            if (cnetModalRegisteredElements.has(span)) return;
            cnetModalRegisteredElements.add(span);

            const modal = span.parentNode;
            span.addEventListener('click', () => {
                modal.style.display = "none";
            });
        });
    });
})();
152
extensions-builtin/sd_forge_controlnet/javascript/openpose_editor.js
Executable file
@@ -0,0 +1,152 @@
(function () {
    async function checkEditorAvailable() {
        const LOCAL_EDITOR_PATH = '/openpose_editor_index';
        const REMOTE_EDITOR_PATH = 'https://huchenlei.github.io/sd-webui-openpose-editor/';

        async function testEditorPath(path) {
            const res = await fetch(path);
            return res.status === 200 ? path : null;
        }

        // Use the local editor if the user has the extension installed. Fall back
        // to the remote editor if the local editor is not ready yet.
        // See https://github.com/huchenlei/sd-webui-openpose-editor/issues/53
        // for more details.
        return await testEditorPath(LOCAL_EDITOR_PATH) || await testEditorPath(REMOTE_EDITOR_PATH);
    }

    const cnetOpenposeEditorRegisteredElements = new Set();
    let editorURL = null;
    function loadOpenposeEditor() {
        // Simulate an `input` DOM event for a Gradio Textbox component. Needed after editing
        // its contents in javascript; otherwise the edits will only be visible on the web page
        // and not sent to python.
        function updateInput(target) {
            let e = new Event("input", { bubbles: true });
            Object.defineProperty(e, "target", { value: target });
            target.dispatchEvent(e);
        }

        function navigateIframe(iframe, editorURL) {
            function getPathname(rawURL) {
                try {
                    return new URL(rawURL).pathname;
                } catch (e) {
                    return rawURL;
                }
            }

            return new Promise((resolve) => {
                const darkThemeParam = document.body.classList.contains('dark') ?
                    new URLSearchParams({ theme: 'dark' }).toString() :
                    '';

                window.addEventListener('message', (event) => {
                    const message = event.data;
                    if (message['ready']) resolve();
                }, { once: true });

                if ((editorURL.startsWith("http") ? iframe.src : getPathname(iframe.src)) !== editorURL) {
                    iframe.src = `${editorURL}?${darkThemeParam}`;
                    // By default assume 5 seconds is enough for the openpose editor to load.
                    setTimeout(resolve, 5000);
                } else {
                    // If no navigation is required, resolve immediately.
                    resolve();
                }
            });
        }

        const tabs = gradioApp().querySelectorAll('#controlnet .input-accordion');
        tabs.forEach(tab => {
            if (cnetOpenposeEditorRegisteredElements.has(tab)) return;
            cnetOpenposeEditorRegisteredElements.add(tab);

            const generatedImageGroup = tab.querySelector('.cnet-generated-image-group');
            const editButton = generatedImageGroup.querySelector('.cnet-edit-pose');

            editButton.addEventListener('click', async () => {
                const inputImageGroup = tab.querySelector('.cnet-input-image-group');
                const inputImage = inputImageGroup.querySelector('.cnet-image img');
                const downloadLink = generatedImageGroup.querySelector('.cnet-download-pose a');
                const modalId = editButton.id.replace('cnet-modal-open-', '');
                const modalIframe = generatedImageGroup.querySelector('.cnet-modal iframe');

                if (!editorURL) {
                    editorURL = await checkEditorAvailable();
                    if (!editorURL) {
                        alert("No openpose editor available.");
                        return;
                    }
                }

                await navigateIframe(modalIframe, editorURL);
                modalIframe.contentWindow.postMessage({
                    modalId,
                    imageURL: inputImage ? inputImage.src : undefined,
                    poseURL: downloadLink.href,
                }, '*');
                // Focus the iframe so that the focus is no longer on the `Edit` button.
                // Pressing space while the `Edit` button is focused would trigger the
                // click again and resend the frame message.
                modalIframe.contentWindow.focus();
            });
            /*
             * Writes the pose data URL to a link element in the input image group, then
             * clicks a hidden button to trigger a backend rendering of the pose JSON.
             *
             * The backend should:
             * - Set the rendered pose image as the preprocessor generated image.
             */
            function updatePreviewPose(poseURL) {
                const downloadLink = generatedImageGroup.querySelector('.cnet-download-pose a');
                const renderButton = generatedImageGroup.querySelector('.cnet-render-pose');
                const poseTextbox = generatedImageGroup.querySelector('.cnet-pose-json textarea');
                const allowPreviewCheckbox = tab.querySelector('.cnet-allow-preview input');

                if (!allowPreviewCheckbox.checked)
                    allowPreviewCheckbox.click();

                // Only set href when the download link exists and needs an update. `downloadLink`
                // can be null when the user closes the preview and clicks the `Upload JSON` button again.
                // https://github.com/Mikubill/sd-webui-controlnet/issues/2308
                if (downloadLink !== null)
                    downloadLink.href = poseURL;

                poseTextbox.value = poseURL;
                updateInput(poseTextbox);
                renderButton.click();
            }

            // Updates the preview image when editing is done.
            window.addEventListener('message', (event) => {
                const message = event.data;
                const modalId = editButton.id.replace('cnet-modal-open-', '');
                if (message.modalId !== modalId) return;
                updatePreviewPose(message.poseURL);

                const closeModalButton = generatedImageGroup.querySelector('.cnet-modal .cnet-modal-close');
                closeModalButton.click();
            });

            const inputImageGroup = tab.querySelector('.cnet-input-image-group');
            const uploadButton = inputImageGroup.querySelector('.cnet-upload-pose input');
            // Updates the preview image when a JSON file is uploaded.
            uploadButton.addEventListener('change', (event) => {
                const file = event.target.files[0];
                if (!file)
                    return;

                const reader = new FileReader();
                reader.onload = function (e) {
                    const contents = e.target.result;
                    const poseURL = `data:application/json;base64,${btoa(contents)}`;
                    updatePreviewPose(poseURL);
                };
                reader.readAsText(file);
                // Reset the file input value so that uploading the same file still triggers the callback.
                event.target.value = '';
            });
        });
    }

    onUiUpdate(loadOpenposeEditor);
})();
435
extensions-builtin/sd_forge_controlnet/javascript/photopea.js
Executable file
@@ -0,0 +1,435 @@
(function () {
    /*
    MIT LICENSE
    Copyright 2011 Jon Leighton
    Permission is hereby granted, free of charge, to any person obtaining a copy of this software and
    associated documentation files (the "Software"), to deal in the Software without restriction,
    including without limitation the rights to use, copy, modify, merge, publish, distribute,
    sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
    furnished to do so, subject to the following conditions:
    The above copyright notice and this permission notice shall be included in all copies or substantial
    portions of the Software.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
    PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
    CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
    */
    // From: https://gist.github.com/jonleighton/958841
    function base64ArrayBuffer(arrayBuffer) {
        var base64 = ''
        var encodings = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'

        var bytes = new Uint8Array(arrayBuffer)
        var byteLength = bytes.byteLength
        var byteRemainder = byteLength % 3
        var mainLength = byteLength - byteRemainder

        var a, b, c, d
        var chunk

        // Main loop deals with bytes in chunks of 3
        for (var i = 0; i < mainLength; i = i + 3) {
            // Combine the three bytes into a single integer
            chunk = (bytes[i] << 16) | (bytes[i + 1] << 8) | bytes[i + 2]

            // Use bitmasks to extract 6-bit segments from the triplet
            a = (chunk & 16515072) >> 18 // 16515072 = (2^6 - 1) << 18
            b = (chunk & 258048) >> 12 // 258048 = (2^6 - 1) << 12
            c = (chunk & 4032) >> 6 // 4032 = (2^6 - 1) << 6
            d = chunk & 63 // 63 = 2^6 - 1

            // Convert the raw binary segments to the appropriate ASCII encoding
            base64 += encodings[a] + encodings[b] + encodings[c] + encodings[d]
        }

        // Deal with the remaining bytes and padding
        if (byteRemainder == 1) {
            chunk = bytes[mainLength]

            a = (chunk & 252) >> 2 // 252 = (2^6 - 1) << 2

            // Set the 4 least significant bits to zero
            b = (chunk & 3) << 4 // 3 = 2^2 - 1

            base64 += encodings[a] + encodings[b] + '=='
        } else if (byteRemainder == 2) {
            chunk = (bytes[mainLength] << 8) | bytes[mainLength + 1]

            a = (chunk & 64512) >> 10 // 64512 = (2^6 - 1) << 10
            b = (chunk & 1008) >> 4 // 1008 = (2^6 - 1) << 4

            // Set the 2 least significant bits to zero
            c = (chunk & 15) << 2 // 15 = 2^4 - 1

            base64 += encodings[a] + encodings[b] + encodings[c] + '='
        }

        return base64
    }
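`base64ArrayBuffer` above hand-rolls standard base64 over an `ArrayBuffer`: pack 3 bytes into a 24-bit chunk, emit four 6-bit alphabet indices, and pad the 1- or 2-byte remainder. A quick Python cross-check of the same chunking against the stdlib (the port and its name are mine, not part of the extension):

```python
import base64

# Same alphabet as the `encodings` string in base64ArrayBuffer.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

def b64_chunked(data: bytes) -> str:
    out = []
    main = len(data) - len(data) % 3
    # Main loop: 3 bytes -> one 24-bit chunk -> four 6-bit indices.
    for i in range(0, main, 3):
        chunk = (data[i] << 16) | (data[i + 1] << 8) | data[i + 2]
        out.append(ALPHABET[(chunk >> 18) & 63])
        out.append(ALPHABET[(chunk >> 12) & 63])
        out.append(ALPHABET[(chunk >> 6) & 63])
        out.append(ALPHABET[chunk & 63])
    # Remainder handling mirrors the byteRemainder == 1 / == 2 branches.
    rem = len(data) - main
    if rem == 1:
        chunk = data[main]
        out.append(ALPHABET[(chunk & 252) >> 2])
        out.append(ALPHABET[(chunk & 3) << 4])
        out.append("==")
    elif rem == 2:
        chunk = (data[main] << 8) | data[main + 1]
        out.append(ALPHABET[(chunk & 64512) >> 10])
        out.append(ALPHABET[(chunk & 1008) >> 4])
        out.append(ALPHABET[(chunk & 15) << 2])
        out.append("=")
    return "".join(out)

# Matches the stdlib for every remainder case.
for payload in (b"", b"a", b"ab", b"abc", b"hello world!"):
    assert b64_chunked(payload) == base64.b64encode(payload).decode()
```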

    // Turn a base64 string into a blob.
    // From https://gist.github.com/gauravmehla/7a7dfd87dd7d1b13697b6e894426615f
    function b64toBlob(b64Data, contentType, sliceSize) {
        contentType = contentType || '';
        sliceSize = sliceSize || 512;
        var byteCharacters = atob(b64Data);
        var byteArrays = [];
        for (var offset = 0; offset < byteCharacters.length; offset += sliceSize) {
            var slice = byteCharacters.slice(offset, offset + sliceSize);
            var byteNumbers = new Array(slice.length);
            for (var i = 0; i < slice.length; i++) {
                byteNumbers[i] = slice.charCodeAt(i);
            }
            var byteArray = new Uint8Array(byteNumbers);
            byteArrays.push(byteArray);
        }
        return new Blob(byteArrays, { type: contentType });
    }

    function createBlackImageBase64(width, height) {
        // Create a canvas element
        var canvas = document.createElement('canvas');
        canvas.width = width;
        canvas.height = height;

        // Get the context of the canvas
        var ctx = canvas.getContext('2d');

        // Fill the canvas with black color
        ctx.fillStyle = 'black';
        ctx.fillRect(0, 0, width, height);

        // Get the base64 encoded string
        var base64Image = canvas.toDataURL('image/png');

        return base64Image;
    }
    // Functions to be called within the photopea context.
    // Start of photopea functions
    function pasteImage(base64image) {
        app.open(base64image, null, /* asSmart */ true);
        app.echoToOE("success");
    }

    function setLayerNames(names) {
        const layers = app.activeDocument.layers;
        if (layers.length !== names.length) {
            console.error("layer length does not match names length");
            app.echoToOE("error");
            return;
        }

        for (let i = 0; i < names.length; i++) {
            const layer = layers[i];
            layer.name = names[i];
        }
        app.echoToOE("success");
    }

    function removeLayersWithNames(names) {
        const layers = app.activeDocument.layers;
        for (let i = 0; i < layers.length; i++) {
            const layer = layers[i];
            if (names.includes(layer.name)) {
                layer.remove();
            }
        }
        app.echoToOE("success");
    }

    function getAllLayerNames() {
        const layers = app.activeDocument.layers;
        const names = [];
        for (let i = 0; i < layers.length; i++) {
            const layer = layers[i];
            names.push(layer.name);
        }
        app.echoToOE(JSON.stringify(names));
    }

    // Hides all layers except the current one, outputs the whole image, then restores the
    // previous layers state.
    function exportSelectedLayerOnly(format, layerName) {
        // Gets all layers recursively, including the ones inside folders.
        function getAllArtLayers(document) {
            let allArtLayers = [];

            for (let i = 0; i < document.layers.length; i++) {
                const currentLayer = document.layers[i];
                allArtLayers.push(currentLayer);
                if (currentLayer.typename === "LayerSet") {
                    allArtLayers = allArtLayers.concat(getAllArtLayers(currentLayer));
                }
            }
            return allArtLayers;
        }

        function makeLayerVisible(layer) {
            let currentLayer = layer;
            while (currentLayer != app.activeDocument) {
                currentLayer.visible = true;
                if (currentLayer.parent.typename != 'Document') {
                    currentLayer = currentLayer.parent;
                } else {
                    break;
                }
            }
        }

        const allLayers = getAllArtLayers(app.activeDocument);
        // Store each layer's initial visibility state.
        const layerStates = [];
        for (let i = 0; i < allLayers.length; i++) {
            const layer = allLayers[i];
            layerStates.push(layer.visible);
        }
        // Hide all layers to begin with.
        for (let i = 0; i < allLayers.length; i++) {
            const layer = allLayers[i];
            layer.visible = false;
        }
        // Make only the selected layer (and its ancestor folders) visible.
        for (let i = 0; i < allLayers.length; i++) {
            const layer = allLayers[i];
            const selected = layer.name === layerName;
            if (selected) {
                makeLayerVisible(layer);
            }
        }
        app.activeDocument.saveToOE(format);

        // Restore the initial visibility states.
        for (let i = 0; i < allLayers.length; i++) {
            const layer = allLayers[i];
            layer.visible = layerStates[i];
        }
    }

    function hasActiveDocument() {
        app.echoToOE(app.documents.length > 0 ? "true" : "false");
    }
    // End of photopea functions

    const MESSAGE_END_ACK = "done";
    const MESSAGE_ERROR = "error";
    const PHOTOPEA_URL = "https://www.photopea.com/";
    class PhotopeaContext {
        constructor(photopeaIframe) {
            this.photopeaIframe = photopeaIframe;
            this.timeout = 1000;
        }

        navigateIframe() {
            const iframe = this.photopeaIframe;
            const editorURL = PHOTOPEA_URL;

            return new Promise(async (resolve) => {
                if (iframe.src !== editorURL) {
                    iframe.src = editorURL;
                    // Stop waiting after 10s.
                    setTimeout(resolve, 10000);

                    // Test whether photopea is able to accept messages yet.
                    while (true) {
                        try {
                            await this.invoke(hasActiveDocument);
                            break;
                        } catch (e) {
                            console.log("Keep waiting for photopea to accept messages.");
                        }
                    }
                    this.timeout = 5000; // Restore a longer timeout for normal messaging.
                }
                resolve();
            });
        }

        // From https://github.com/huchenlei/stable-diffusion-ps-pea/blob/main/src/Photopea.ts
        postMessageToPhotopea(message) {
            return new Promise((resolve, reject) => {
                const responseDataPieces = [];
                let hasError = false;
                const photopeaMessageHandle = (event) => {
                    if (event.source !== this.photopeaIframe.contentWindow) {
                        return;
                    }
                    // Filter out the ping messages.
                    if (typeof event.data === 'string' && event.data.includes('MSFAPI#')) {
                        return;
                    }
                    // Ignore "done" when no data has been received. The "done" can come
                    // from an MSFAPI ping.
                    if (event.data === MESSAGE_END_ACK && responseDataPieces.length === 0) {
                        return;
                    }
                    if (event.data === MESSAGE_END_ACK) {
                        window.removeEventListener("message", photopeaMessageHandle);
                        if (hasError) {
                            reject('Photopea Error.');
                        } else {
                            resolve(responseDataPieces.length === 1 ? responseDataPieces[0] : responseDataPieces);
                        }
                    } else if (event.data === MESSAGE_ERROR) {
                        responseDataPieces.push(event.data);
                        hasError = true;
                    } else {
                        responseDataPieces.push(event.data);
                    }
                };

                window.addEventListener("message", photopeaMessageHandle);
                setTimeout(() => reject("Photopea message timeout"), this.timeout);
                this.photopeaIframe.contentWindow.postMessage(message, "*");
            });
        }

        // From https://github.com/huchenlei/stable-diffusion-ps-pea/blob/main/src/Photopea.ts
        async invoke(func, ...args) {
            await this.navigateIframe();
            const message = `${func.toString()} ${func.name}(${args.map(arg => JSON.stringify(arg)).join(',')});`;
            try {
                return await this.postMessageToPhotopea(message);
            } catch (e) {
                throw `Failed to invoke ${func.name}. ${e}.`;
            }
        }

        /**
         * Fetch detected maps from each ControlNet unit, create a new photopea
         * document, and add those detected maps to the created document.
         */
        async fetchFromControlNet(tabs) {
            if (tabs.length === 0) return;
            const isImg2Img = tabs[0].querySelector('.cnet-mask-upload').id.includes('img2img');
            const generationType = isImg2Img ? 'img2img' : 'txt2img';
            const width = gradioApp().querySelector(`#${generationType}_width input[type=number]`).value;
            const height = gradioApp().querySelector(`#${generationType}_height input[type=number]`).value;

            const layerNames = ["background"];
            await this.invoke(pasteImage, createBlackImageBase64(width, height));
            await new Promise(r => setTimeout(r, 200));
            for (const [i, tab] of tabs.entries()) {
                const generatedImage = tab.querySelector('.cnet-generated-image-group .cnet-image img');
                if (!generatedImage) continue;
                await this.invoke(pasteImage, generatedImage.src);
                // Wait 200ms for pasting to fully complete so that we do not end up
                // with 2 separate documents.
                await new Promise(r => setTimeout(r, 200));
                layerNames.push(`unit-${i}`);
            }
            await this.invoke(removeLayersWithNames, layerNames);
            await this.invoke(setLayerNames, layerNames.reverse());
        }

        /**
         * Send the images in the active photopea document back to each ControlNet unit.
         */
        async sendToControlNet(tabs) {
            // Gradio's image widgets are inputs. To set the image in one, we set the image
            // on the input and force it to refresh.
            function setImageOnInput(imageInput, file) {
                // Create a DataTransfer element to set as the data in the input.
                const dt = new DataTransfer();
                dt.items.add(file);
                const list = dt.files;

                // Actually set the image in the image widget.
                imageInput.files = list;

                // Force the image widget to update with the new image, after setting its source files.
                const event = new Event('change', {
                    'bubbles': true,
                    "composed": true
                });
                imageInput.dispatchEvent(event);
            }

            function sendToControlNetUnit(b64Image, index) {
                const tab = tabs[index];
                // Upload image to output image element.
                const outputImage = tab.querySelector('.cnet-photopea-output');
                const outputImageUpload = outputImage.querySelector('input[type="file"]');
                setImageOnInput(outputImageUpload, new File([b64toBlob(b64Image, "image/png")], "photopea_output.png"));

                // Make sure the `UsePreviewAsInput` checkbox is checked.
                const checkbox = tab.querySelector('.cnet-preview-as-input input[type="checkbox"]');
                if (!checkbox.checked) {
                    checkbox.click();
                }
            }

            const layerNames =
                JSON.parse(await this.invoke(getAllLayerNames))
                    .filter(name => /unit-\d+/.test(name));

            for (const layerName of layerNames) {
                const arrayBuffer = await this.invoke(exportSelectedLayerOnly, 'PNG', layerName);
                const b64Image = base64ArrayBuffer(arrayBuffer);
                const layerIndex = Number.parseInt(layerName.split('-')[1]);
                sendToControlNetUnit(b64Image, layerIndex);
            }
        }
    }
    let photopeaWarningShown = false;

    function firstTimeUserPrompt() {
        if (opts.controlnet_photopea_warning) {
            const photopeaPopupMsg = "You are about to connect to https://photopea.com\n" +
                "- Click OK: proceed.\n" +
                "- Click Cancel: abort.\n" +
                "Photopea integration can be disabled in Settings > ControlNet > Disable photopea edit.\n" +
                "This popup can be disabled in Settings > ControlNet > Photopea popup warning.";
            if (photopeaWarningShown || confirm(photopeaPopupMsg)) photopeaWarningShown = true;
            else return false;
        }
        return true;
    }

    const cnetRegisteredAccordions = new Set();
    function loadPhotopea() {
        function registerCallbacks(accordion) {
            const photopeaMainTrigger = accordion.querySelector('.cnet-photopea-main-trigger');
            // Photopea edit feature disabled.
            if (!photopeaMainTrigger) {
                console.log("ControlNet photopea edit disabled.");
                return;
            }

            const closeModalButton = accordion.querySelector('.cnet-photopea-edit .cnet-modal-close');
            const tabs = accordion.querySelectorAll('.controlnet .input-accordion');
            const photopeaIframe = accordion.querySelector('.photopea-iframe');
            const photopeaContext = new PhotopeaContext(photopeaIframe, tabs);

            tabs.forEach(tab => {
                const photopeaChildTrigger = tab.querySelector('.cnet-photopea-child-trigger');
                photopeaChildTrigger.addEventListener('click', async () => {
                    if (!firstTimeUserPrompt()) return;

                    photopeaMainTrigger.click();
                    if (await photopeaContext.invoke(hasActiveDocument) === "false") {
                        await photopeaContext.fetchFromControlNet(tabs);
                    }
                });
            });
            accordion.querySelector('.photopea-fetch').addEventListener('click', () => photopeaContext.fetchFromControlNet(tabs));
            accordion.querySelector('.photopea-send').addEventListener('click', () => {
                photopeaContext.sendToControlNet(tabs);
                closeModalButton.click();
            });
        }

        const accordions = gradioApp().querySelectorAll('#controlnet');
        accordions.forEach(accordion => {
            if (cnetRegisteredAccordions.has(accordion)) return;
            registerCallbacks(accordion);
            cnetRegisteredAccordions.add(accordion);
        });
    }

    onUiUpdate(loadPhotopea);
})();
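`PhotopeaContext.invoke` above serializes a whole function with `func.toString()` and appends a call expression with JSON-encoded arguments, so the resulting string can be evaluated inside the Photopea iframe. A rough Python sketch of just that message construction (the helper name and stand-in function source are mine, not from the extension):

```python
import json

def build_invocation(func_source: str, func_name: str, *args) -> str:
    # Mirror photopea.js invoke(): ship the function's source, then append a
    # call expression whose arguments are JSON-serialized, comma-joined.
    call_args = ",".join(json.dumps(a) for a in args)
    return f"{func_source} {func_name}({call_args});"

src = "function setLayerNames(names) { /* ... */ }"
msg = build_invocation(src, "setLayerNames", ["background", "unit-0"])
assert msg.endswith('setLayerNames(["background", "unit-0"]);')
```

JSON serialization is what lets arbitrary strings and arrays cross the postMessage boundary without manual escaping.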
138
extensions-builtin/sd_forge_controlnet/lib_controlnet/api.py
Executable file
@@ -0,0 +1,138 @@
from typing import List

import numpy as np
from fastapi import FastAPI, Body
from fastapi.exceptions import HTTPException
from PIL import Image
import gradio as gr

from modules.api import api
from .global_state import (
    get_all_preprocessor_names,
    get_all_controlnet_names,
    get_preprocessor,
    get_all_preprocessor_tags,
    select_control_type,
)
from .utils import judge_image_type
from .logging import logger


def encode_to_base64(image):
    if isinstance(image, str):
        return image
    elif not judge_image_type(image):
        return "Detect result is not image"
    elif isinstance(image, Image.Image):
        return api.encode_pil_to_base64(image)
    elif isinstance(image, np.ndarray):
        return encode_np_to_base64(image)
    else:
        logger.warning("Unable to encode image.")
        return ""


def encode_np_to_base64(image):
    pil = Image.fromarray(image)
    return api.encode_pil_to_base64(pil)


def controlnet_api(_: gr.Blocks, app: FastAPI):
    @app.get("/controlnet/model_list")
    async def model_list():
        up_to_date_model_list = get_all_controlnet_names()
        logger.debug(up_to_date_model_list)
        return {"model_list": up_to_date_model_list}

    @app.get("/controlnet/module_list")
    async def module_list():
        module_list = get_all_preprocessor_names()
        logger.debug(module_list)

        return {
            "module_list": module_list,
            # TODO: Add back module detail.
            # "module_detail": external_code.get_modules_detail(alias_names),
        }

    @app.get("/controlnet/control_types")
    async def control_types():
        def format_control_type(
            filtered_preprocessor_list,
            filtered_model_list,
            default_option,
            default_model,
        ):
            control_dict = {
                "module_list": filtered_preprocessor_list,
                "model_list": filtered_model_list,
                "default_option": default_option,
                "default_model": default_model,
            }

            return control_dict

        return {
            "control_types": {
                control_type: format_control_type(*select_control_type(control_type))
                for control_type in get_all_preprocessor_tags()
            }
        }

    @app.post("/controlnet/detect")
    async def detect(
        controlnet_module: str = Body("none", title="Controlnet Module"),
        controlnet_input_images: List[str] = Body([], title="Controlnet Input Images"),
        controlnet_processor_res: int = Body(
            512, title="Controlnet Processor Resolution"
        ),
        controlnet_threshold_a: float = Body(64, title="Controlnet Threshold a"),
        controlnet_threshold_b: float = Body(64, title="Controlnet Threshold b"),
    ):
        processor_module = get_preprocessor(controlnet_module)
        if processor_module is None:
            raise HTTPException(status_code=422, detail="Module not available")

        if len(controlnet_input_images) == 0:
            raise HTTPException(status_code=422, detail="No image selected")

        logger.debug(
            f"Detecting {len(controlnet_input_images)} images with the {controlnet_module} module."
        )

        results = []
        poses = []

        for input_image in controlnet_input_images:
            img = np.array(api.decode_base64_to_image(input_image)).astype('uint8')

            class JsonAcceptor:
                def __init__(self) -> None:
                    self.value = None

                def accept(self, json_dict: dict) -> None:
                    self.value = json_dict

            json_acceptor = JsonAcceptor()

            results.append(
                processor_module(
                    img,
                    resolution=controlnet_processor_res,
                    slider_1=controlnet_threshold_a,
                    slider_2=controlnet_threshold_b,
                    json_pose_callback=json_acceptor.accept,
                )
            )

            if "openpose" in controlnet_module:
                assert json_acceptor.value is not None
                poses.append(json_acceptor.value)

        results64 = [encode_to_base64(img) for img in results]
        res = {"images": results64, "info": "Success"}
        if poses:
            res["poses"] = poses

        return res
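The `/controlnet/detect` endpoint above takes base64-encoded images in `controlnet_input_images`. A minimal sketch of assembling a client payload for it (the helper name is mine; the exact base64 flavor the server accepts is whatever `api.decode_base64_to_image` supports):

```python
import base64
import json

def build_detect_payload(module: str, images: list, res: int = 512) -> str:
    """Assemble the JSON body for POST /controlnet/detect, matching the
    Body(...) parameters declared in the endpoint. Images (raw bytes)
    are sent base64-encoded."""
    return json.dumps({
        "controlnet_module": module,
        "controlnet_input_images": [base64.b64encode(i).decode() for i in images],
        "controlnet_processor_res": res,
        "controlnet_threshold_a": 64,
        "controlnet_threshold_b": 64,
    })

payload = json.loads(build_detect_payload("canny", [b"\x89PNG..."]))
assert payload["controlnet_module"] == "canny"
assert base64.b64decode(payload["controlnet_input_images"][0]).startswith(b"\x89PNG")
```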
File diff suppressed because it is too large
38
extensions-builtin/sd_forge_controlnet/lib_controlnet/controlnet_ui/modal.py
Executable file
@@ -0,0 +1,38 @@
import gradio as gr
from typing import List


class ModalInterface(gr.Interface):
    modal_id_counter = 0

    def __init__(
        self,
        html_content: str,
        open_button_text: str,
        open_button_classes: List[str] = [],
        open_button_extra_attrs: str = ''
    ):
        self.html_content = html_content
        self.open_button_text = open_button_text
        self.open_button_classes = open_button_classes
        self.open_button_extra_attrs = open_button_extra_attrs
        self.modal_id = ModalInterface.modal_id_counter
        ModalInterface.modal_id_counter += 1

    def __call__(self):
        return self.create_modal()

    def create_modal(self, visible=True):
        html_code = f"""
        <div id="cnet-modal-{self.modal_id}" class="cnet-modal">
            <span class="cnet-modal-close">×</span>
            <div class="cnet-modal-content">
                {self.html_content}
            </div>
        </div>
        <div id="cnet-modal-open-{self.modal_id}"
             class="cnet-modal-open {' '.join(self.open_button_classes)}"
             {self.open_button_extra_attrs}
        >{self.open_button_text}</div>
        """
        return gr.HTML(value=html_code, visible=visible)
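`create_modal` emits a paired id convention: `cnet-modal-{id}` for the container and `cnet-modal-open-{id}` for the trigger, which modal.js relies on when it strips the open prefix to find the container. A standalone sketch of that pairing (the function name is mine, for illustration only):

```python
def modal_html(modal_id: int, content: str, button_text: str) -> str:
    # Same id convention create_modal() emits; modal.js recovers the modal id
    # via btn.id.replace('cnet-modal-open-', '') and looks up "cnet-modal-" + id.
    return (
        f'<div id="cnet-modal-{modal_id}" class="cnet-modal">'
        f'<span class="cnet-modal-close">×</span>'
        f'<div class="cnet-modal-content">{content}</div></div>'
        f'<div id="cnet-modal-open-{modal_id}" class="cnet-modal-open">{button_text}</div>'
    )

html = modal_html(7, "<p>hi</p>", "Edit")
open_id = "cnet-modal-open-7"
assert f'id="{open_id}"' in html
# The JS-side lookup: strip the open prefix, prepend the modal prefix.
assert "cnet-modal-" + open_id.replace("cnet-modal-open-", "") == "cnet-modal-7"
```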
@@ -0,0 +1,154 @@
import base64
import gradio as gr
import json
from typing import List, Dict, Any, Tuple

from annotator.openpose import decode_json_as_poses, draw_poses
from annotator.openpose.animalpose import draw_animalposes
from lib_controlnet.controlnet_ui.modal import ModalInterface
from modules import shared
from lib_controlnet.logging import logger


def parse_data_url(data_url: str):
    # Split the URL at the comma
    media_type, data = data_url.split(",", 1)

    # Check if the data is base64-encoded
    assert ";base64" in media_type

    # Decode the base64 data
    return base64.b64decode(data)


def encode_data_url(json_string: str) -> str:
    base64_encoded_json = base64.b64encode(json_string.encode("utf-8")).decode("utf-8")
    return f"data:application/json;base64,{base64_encoded_json}"

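The two data-URL helpers above round-trip: `encode_data_url` output is exactly the shape `parse_data_url` expects. A self-contained copy for illustration (the pose payload below is a made-up minimal example, not a real preprocessor output):

```python
import base64
import json

def encode_data_url(json_string: str) -> str:
    # JSON text -> "data:application/json;base64,..." URL.
    b64 = base64.b64encode(json_string.encode("utf-8")).decode("utf-8")
    return f"data:application/json;base64,{b64}"

def parse_data_url(data_url: str) -> bytes:
    # Split at the first comma; everything after it is the base64 payload.
    media_type, data = data_url.split(",", 1)
    assert ";base64" in media_type
    return base64.b64decode(data)

pose = json.dumps({"people": [], "canvas_width": 512, "canvas_height": 512})
assert parse_data_url(encode_data_url(pose)).decode("utf-8") == pose
```

This is the same `data:application/json;base64,` prefix that `openpose_editor.js` builds with `btoa` on upload, so both sides agree on the wire format.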
class OpenposeEditor(object):
    # Filename used when the user clicks the download link.
    download_file = "pose.json"
    # URL the openpose editor is mounted on.
    editor_url = "/openpose_editor_index"

    def __init__(self) -> None:
        self.render_button = None
        self.pose_input = None
        self.download_link = None
        self.upload_link = None
        self.modal = None

    def render_edit(self):
        """Renders the buttons in the preview image control button group."""
        # The hidden button to trigger a re-render of the generated image.
        self.render_button = gr.Button(visible=False, elem_classes=["cnet-render-pose"])
        # The hidden element that stores the pose JSON for backend retrieval.
        # The front-end javascript will write the edited JSON data to the element.
        self.pose_input = gr.Textbox(visible=False, elem_classes=["cnet-pose-json"])

        self.modal = ModalInterface(
            # Use about:blank here as a placeholder so that the iframe does not
            # immediately navigate. Most ControlNet units do not need the
            # openpose editor active. Only navigate when the user first clicks
            # 'Edit'. The navigation logic is in `openpose_editor.js`.
            '<iframe src="about:blank"></iframe>',
            open_button_text="Edit",
            open_button_classes=["cnet-edit-pose"],
            open_button_extra_attrs=f'title="Send pose to {OpenposeEditor.editor_url} for edit."',
        ).create_modal(visible=False)
        self.download_link = gr.HTML(
            value=f"""<a href='' download='{OpenposeEditor.download_file}'>JSON</a>""",
            visible=False,
            elem_classes=["cnet-download-pose"],
        )

    def render_upload(self):
        """Renders the button in the input image control button group."""
        self.upload_link = gr.HTML(
            value="""
            <label>Upload JSON</label>
            <input type="file" accept=".json"/>
            """,
            visible=False,
            elem_classes=["cnet-upload-pose"],
        )

    def register_callbacks(
        self,
        generated_image: gr.Image,
        use_preview_as_input: gr.Checkbox,
        model: gr.Dropdown,
    ):
        def render_pose(pose_url: str) -> Tuple[Dict, Dict]:
            json_string = parse_data_url(pose_url).decode("utf-8")
            poses, animals, height, width = decode_json_as_poses(
                json.loads(json_string)
            )
            logger.info("Preview as input is enabled.")
            return (
                # Generated image.
                gr.update(
                    value=(
                        draw_poses(
                            poses,
                            height,
                            width,
                            draw_body=True,
                            draw_hand=True,
                            draw_face=True,
                        )
                        if poses
                        else draw_animalposes(animals, height, width)
                    ),
                    visible=True,
                ),
                # Use preview as input.
                gr.update(value=True),
                # Self content.
                *self.update(json_string),
            )

        self.render_button.click(
            fn=render_pose,
            inputs=[self.pose_input],
            outputs=[generated_image.background, use_preview_as_input, *self.outputs()],
        )

        def update_upload_link(model: str) -> Dict:
            return gr.update(visible="openpose" in model.lower())

        model.change(fn=update_upload_link, inputs=[model], outputs=[self.upload_link])

    def outputs(self) -> List[Any]:
        return [
            self.download_link,
            self.modal,
        ]

    def update(self, json_string: str) -> List[Dict]:
        """
        Called when there is a new JSON pose value generated by running the
        preprocessor.

        Args:
            json_string: The new JSON string generated by the preprocessor.

        Returns:
            A gr.update event.
        """
        hint = "Download the pose as a .json file"
        html = f"""<a href='{encode_data_url(json_string)}'
            download='{OpenposeEditor.download_file}' title="{hint}">
            JSON</a>"""
|
||||
|
||||
visible = json_string != ""
|
||||
return [
|
||||
# Download link update.
|
||||
gr.update(value=html, visible=visible),
|
||||
# Modal update.
|
||||
gr.update(
|
||||
visible=visible
|
||||
and not shared.opts.data.get("controlnet_disable_openpose_edit", False)
|
||||
),
|
||||
]
|
||||
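The download link above embeds the pose JSON directly in the anchor's `href` via `encode_data_url`, and `render_pose` reverses it with `parse_data_url` (note the `data:application/json;base64,` prefix returned earlier in this file). A minimal sketch of that round trip, assuming the helpers are thin base64 wrappers; the names match the calls above, but these bodies are illustrative, not the extension's actual implementations:

```python
import base64
import json

def encode_data_url(json_string: str) -> str:
    # Pack the pose JSON into a data URL usable as an <a download> href.
    b64 = base64.b64encode(json_string.encode("utf-8")).decode("ascii")
    return f"data:application/json;base64,{b64}"

def parse_data_url(data_url: str) -> bytes:
    # Inverse operation: strip the scheme/mediatype prefix and decode.
    _, _, payload = data_url.partition("base64,")
    return base64.b64decode(payload)

pose = json.dumps({"width": 512, "height": 512, "people": []})
assert encode_data_url(pose).startswith("data:application/json;base64,")
assert parse_data_url(encode_data_url(pose)).decode("utf-8") == pose
```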
182
extensions-builtin/sd_forge_controlnet/lib_controlnet/controlnet_ui/photopea.py
Executable file
@@ -0,0 +1,182 @@
import gradio as gr

from lib_controlnet.controlnet_ui.modal import ModalInterface

PHOTOPEA_LOGO = """
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
width="100%" viewBox="0 0 256 256" enable-background="new 0 0 256 256" xml:space="preserve"
style="width: 0.75rem; height: 0.75rem; margin-left: 2px;"
>
<path fill="#18A497" opacity="1.000000" stroke="none"
d="
M1.000000,228.000000
C1.000000,162.312439 1.000000,96.624878 1.331771,30.719650
C2.026278,30.171114 2.594676,29.904894 2.721949,29.500008
C6.913495,16.165672 15.629609,7.322631 28.880219,2.875538
C29.404272,2.699659 29.633436,1.645129 30.000000,1.000000
C95.687561,1.000000 161.375122,1.000000 227.258057,1.317018
C227.660217,1.893988 227.815079,2.296565 228.081207,2.393433
C241.304657,7.206383 250.980164,15.550970 255.215851,29.410040
C255.321625,29.756128 256.383850,29.809898 257.000000,30.000000
C257.000000,95.687561 257.000000,161.375122 256.682983,227.257858
C256.106049,227.659790 255.699371,227.815521 255.607178,228.080658
C250.953033,241.462830 242.292618,250.822968 228.591782,255.214935
C228.239929,255.327698 228.190491,256.383820 228.000000,257.000000
C175.312439,257.000000 122.624878,257.000000 69.468582,256.531342
C68.672188,244.948196 68.218323,233.835587 68.052299,222.718674
C67.885620,211.557587 67.886772,200.390717 68.027298,189.229050
C68.255180,171.129044 68.084618,152.997421 69.151917,134.942368
C70.148468,118.083969 77.974228,103.689308 89.758743,91.961365
C104.435837,77.354736 122.313736,69.841736 143.417328,69.901505
C168.662338,69.972984 186.981964,90.486633 187.961487,114.156334
C189.042435,140.277435 166.783981,163.607941 140.303482,160.823074
C137.092346,160.485382 133.490692,158.365784 131.192612,155.987366
C126.434669,151.063141 126.975357,144.720825 129.168777,138.834930
C131.533630,132.489014 137.260605,130.548050 143.413757,130.046677
C150.288467,129.486496 156.424942,123.757378 157.035324,117.320816
C157.953949,107.633820 150.959381,101.769096 145.533951,101.194389
C132.238846,99.786079 120.699944,104.963120 111.676735,114.167313
C102.105782,123.930222 97.469498,136.194061 99.003151,150.234955
C100.540352,164.308228 107.108505,175.507980 118.864334,183.311539
C128.454544,189.677597 138.866959,191.786957 150.657837,190.245651
C166.242554,188.208420 179.874283,182.443329 191.251801,172.056793
C209.355011,155.530380 217.848694,134.938721 216.116119,110.085892
C214.834335,91.699440 207.721039,76.015915 195.289444,62.978828
C175.658447,42.391735 150.833389,37.257801 123.833740,42.281937
C98.675804,46.963364 78.315033,60.084667 62.208153,80.157814
C46.645889,99.552216 39.305275,121.796379 39.149052,146.201981
C38.912663,183.131317 39.666767,220.067017 40.000000,257.000000
C36.969406,257.000000 33.938812,257.000000 30.705070,256.668213
C30.298622,256.078369 30.144913,255.669220 29.884926,255.583878
C16.317770,251.131058 7.127485,242.317780 2.778462,228.591797
C2.667588,228.241821 1.613958,228.190567 1.000000,228.000000
z"/>
<path fill="#000000" opacity="1.000000" stroke="none"
d="
M40.468658,257.000000
C39.666767,220.067017 38.912663,183.131317 39.149052,146.201981
C39.305275,121.796379 46.645889,99.552216 62.208153,80.157814
C78.315033,60.084667 98.675804,46.963364 123.833740,42.281937
C150.833389,37.257801 175.658447,42.391735 195.289444,62.978828
C207.721039,76.015915 214.834335,91.699440 216.116119,110.085892
C217.848694,134.938721 209.355011,155.530380 191.251801,172.056793
C179.874283,182.443329 166.242554,188.208420 150.657837,190.245651
C138.866959,191.786957 128.454544,189.677597 118.864334,183.311539
C107.108505,175.507980 100.540352,164.308228 99.003151,150.234955
C97.469498,136.194061 102.105782,123.930222 111.676735,114.167313
C120.699944,104.963120 132.238846,99.786079 145.533951,101.194389
C150.959381,101.769096 157.953949,107.633820 157.035324,117.320816
C156.424942,123.757378 150.288467,129.486496 143.413757,130.046677
C137.260605,130.548050 131.533630,132.489014 129.168777,138.834930
C126.975357,144.720825 126.434669,151.063141 131.192612,155.987366
C133.490692,158.365784 137.092346,160.485382 140.303482,160.823074
C166.783981,163.607941 189.042435,140.277435 187.961487,114.156334
C186.981964,90.486633 168.662338,69.972984 143.417328,69.901505
C122.313736,69.841736 104.435837,77.354736 89.758743,91.961365
C77.974228,103.689308 70.148468,118.083969 69.151917,134.942368
C68.084618,152.997421 68.255180,171.129044 68.027298,189.229050
C67.886772,200.390717 67.885620,211.557587 68.052299,222.718674
C68.218323,233.835587 68.672188,244.948196 68.999924,256.531342
C59.645771,257.000000 50.291542,257.000000 40.468658,257.000000
z"/>
<path fill="#000000" opacity="1.000000" stroke="none"
d="
M257.000000,29.531342
C256.383850,29.809898 255.321625,29.756128 255.215851,29.410040
C250.980164,15.550970 241.304657,7.206383 228.081207,2.393433
C227.815079,2.296565 227.660217,1.893988 227.726715,1.317018
C237.593155,1.000000 247.186295,1.000000 257.000000,1.000000
C257.000000,10.353075 257.000000,19.707878 257.000000,29.531342
z"/>
<path fill="#000000" opacity="1.000000" stroke="none"
d="
M228.468658,257.000000
C228.190491,256.383820 228.239929,255.327698 228.591782,255.214935
C242.292618,250.822968 250.953033,241.462830 255.607178,228.080658
C255.699371,227.815521 256.106049,227.659790 256.682983,227.726517
C257.000000,237.593155 257.000000,247.186295 257.000000,257.000000
C247.646927,257.000000 238.292114,257.000000 228.468658,257.000000
z"/>
<path fill="#000000" opacity="1.000000" stroke="none"
d="
M1.000000,228.468658
C1.613958,228.190567 2.667588,228.241821 2.778462,228.591797
C7.127485,242.317780 16.317770,251.131058 29.884926,255.583878
C30.144913,255.669220 30.298622,256.078369 30.250959,256.668213
C20.406853,257.000000 10.813705,257.000000 1.000000,257.000000
C1.000000,247.646927 1.000000,238.292114 1.000000,228.468658
z"/>
<path fill="#000000" opacity="1.000000" stroke="none"
d="
M29.531342,1.000000
C29.633436,1.645129 29.404272,2.699659 28.880219,2.875538
C15.629609,7.322631 6.913495,16.165672 2.721949,29.500008
C2.594676,29.904894 2.026278,30.171114 1.331771,30.250992
C1.000000,20.406855 1.000000,10.813709 1.000000,1.000000
C10.353074,1.000000 19.707878,1.000000 29.531342,1.000000
z"/>
</svg>"""


class Photopea(object):
    def __init__(self) -> None:
        self.modal = None
        self.triggers = []
        self.render_editor()

    def render_editor(self):
        """Render the editor modal."""
        with gr.Group(elem_classes=["cnet-photopea-edit"]):
            self.modal = ModalInterface(
                # Use about:blank as a placeholder so that the iframe does not
                # navigate immediately. Only navigate when the user first
                # clicks 'Edit'. The navigation logic is in `photopea.js`.
                """
                <div class="photopea-button-group">
                    <button class="photopea-button photopea-fetch">Fetch from ControlNet</button>
                    <button class="photopea-button photopea-send">Send to ControlNet</button>
                </div>
                <iframe class="photopea-iframe" src="about:blank"></iframe>
                """,
                open_button_text="Edit",
                open_button_classes=["cnet-photopea-main-trigger"],
                open_button_extra_attrs="hidden",
            ).create_modal(visible=True)

    def render_child_trigger(self):
        self.triggers.append(
            gr.HTML(
                f"""<div class="cnet-photopea-child-trigger">
                Edit {PHOTOPEA_LOGO}
                </div>"""
            )
        )

    def attach_photopea_output(self, generated_image: gr.Image):
        """Called in ControlNetUiGroup to attach the preprocessor preview image Gradio
        element as the photopea output. If the front-end directly changed the img HTML
        element's src to reflect the edited image result from photopea, the backend
        would not be notified.

        In this method we let the front-end upload the result image to an invisible
        gr.Image instance and mirror its value to the preprocessor preview gr.Image.
        This is necessary because the generated-image gr.Image instance is inferred to
        be an output image by Gradio and cannot accept image uploads directly.

        Arguments:
            generated_image: preprocessor result Gradio Image output element.

        Returns:
            None
        """
        output = gr.Image(
            visible=False,
            source="upload",
            type="numpy",
            elem_classes=["cnet-photopea-output"],
        )

        output.upload(
            fn=lambda img: img,
            inputs=[output],
            outputs=[generated_image],
        )
39
extensions-builtin/sd_forge_controlnet/lib_controlnet/enums.py
Executable file
@@ -0,0 +1,39 @@
from enum import Enum


class HiResFixOption(Enum):
    BOTH = "Both"
    LOW_RES_ONLY = "Low res only"
    HIGH_RES_ONLY = "High res only"

    @staticmethod
    def from_value(value) -> "HiResFixOption":
        if isinstance(value, str) and value.startswith("HiResFixOption."):
            _, field = value.split(".")
            return getattr(HiResFixOption, field)
        if isinstance(value, str):
            return HiResFixOption(value)
        elif isinstance(value, int):
            return list(HiResFixOption)[value]
        else:
            assert isinstance(value, HiResFixOption)
            return value

    @property
    def low_res_enabled(self) -> bool:
        return self in (HiResFixOption.BOTH, HiResFixOption.LOW_RES_ONLY)

    @property
    def high_res_enabled(self) -> bool:
        return self in (HiResFixOption.BOTH, HiResFixOption.HIGH_RES_ONLY)


class InputMode(Enum):
    # A single image to a single ControlNet unit.
    SIMPLE = "simple"
    # Input is a directory. N generations. Each generation takes 1 input image
    # from the directory.
    BATCH = "batch"
    # Input is a directory. 1 generation. The generation takes N input images
    # from the directory.
    MERGE = "merge"
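`HiResFixOption.from_value` accepts three historical encodings of the same setting: the legacy `repr`-style infotext string (`"HiResFixOption.BOTH"`), the display value (`"Both"`), and the dropdown index. A self-contained sketch of that dispatch (enum body copied from the file above):

```python
from enum import Enum

class HiResFixOption(Enum):
    BOTH = "Both"
    LOW_RES_ONLY = "Low res only"
    HIGH_RES_ONLY = "High res only"

    @staticmethod
    def from_value(value) -> "HiResFixOption":
        # Legacy infotext form, e.g. "HiResFixOption.BOTH".
        if isinstance(value, str) and value.startswith("HiResFixOption."):
            _, field = value.split(".")
            return getattr(HiResFixOption, field)
        if isinstance(value, str):       # Display value, e.g. "Both".
            return HiResFixOption(value)
        elif isinstance(value, int):     # Dropdown index.
            return list(HiResFixOption)[value]
        else:
            assert isinstance(value, HiResFixOption)
            return value

assert HiResFixOption.from_value("HiResFixOption.BOTH") is HiResFixOption.BOTH
assert HiResFixOption.from_value("Low res only") is HiResFixOption.LOW_RES_ONLY
assert HiResFixOption.from_value(2) is HiResFixOption.HIGH_RES_ONLY
```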
238
extensions-builtin/sd_forge_controlnet/lib_controlnet/external_code.py
Executable file
@@ -0,0 +1,238 @@
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional, Union, Dict, TypedDict
import numpy as np
from modules import shared
from lib_controlnet.logging import logger
from lib_controlnet.enums import InputMode, HiResFixOption
from modules.api import api


def get_api_version() -> int:
    return 2


class ControlMode(Enum):
    """
    The improved guess mode.
    """

    BALANCED = "Balanced"
    PROMPT = "My prompt is more important"
    CONTROL = "ControlNet is more important"


class BatchOption(Enum):
    DEFAULT = "All ControlNet units for all images in a batch"
    SEPARATE = "Each ControlNet unit for each image in a batch"


class ResizeMode(Enum):
    """
    Resize modes for ControlNet input images.
    """

    RESIZE = "Just Resize"
    INNER_FIT = "Crop and Resize"
    OUTER_FIT = "Resize and Fill"

    def int_value(self):
        if self == ResizeMode.RESIZE:
            return 0
        elif self == ResizeMode.INNER_FIT:
            return 1
        elif self == ResizeMode.OUTER_FIT:
            return 2
        assert False, "NOTREACHED"


resize_mode_aliases = {
    'Inner Fit (Scale to Fit)': 'Crop and Resize',
    'Outer Fit (Shrink to Fit)': 'Resize and Fill',
    'Scale to Fit (Inner Fit)': 'Crop and Resize',
    'Envelope (Outer Fit)': 'Resize and Fill',
}


def resize_mode_from_value(value: Union[str, int, ResizeMode]) -> ResizeMode:
    if isinstance(value, str):
        return ResizeMode(resize_mode_aliases.get(value, value))
    elif isinstance(value, int):
        assert value >= 0
        if value == 3:  # 'Just Resize (Latent upscale)'
            return ResizeMode.RESIZE

        if value >= len(ResizeMode):
            logger.warning(f'Unrecognized ResizeMode int value {value}. Falling back to RESIZE.')
            return ResizeMode.RESIZE

        return list(ResizeMode)[value]
    else:
        return value
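The alias table lets infotext written by older ControlNet versions resolve onto the current display strings before the enum lookup. A reduced sketch of `resize_mode_from_value` (logging and the latent-upscale special case omitted for brevity):

```python
from enum import Enum

class ResizeMode(Enum):
    RESIZE = "Just Resize"
    INNER_FIT = "Crop and Resize"
    OUTER_FIT = "Resize and Fill"

# Legacy UI labels map onto the current display strings first.
resize_mode_aliases = {
    'Inner Fit (Scale to Fit)': 'Crop and Resize',
    'Outer Fit (Shrink to Fit)': 'Resize and Fill',
}

def resize_mode_from_value(value):
    if isinstance(value, str):
        return ResizeMode(resize_mode_aliases.get(value, value))
    if isinstance(value, int):
        # Out-of-range dropdown indices fall back to plain resize.
        return list(ResizeMode)[value] if value < len(ResizeMode) else ResizeMode.RESIZE
    return value

assert resize_mode_from_value('Inner Fit (Scale to Fit)') is ResizeMode.INNER_FIT
assert resize_mode_from_value('Just Resize') is ResizeMode.RESIZE
assert resize_mode_from_value(2) is ResizeMode.OUTER_FIT
```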


def control_mode_from_value(value: Union[str, int, ControlMode]) -> ControlMode:
    if isinstance(value, str):
        return ControlMode(value)
    elif isinstance(value, int):
        return list(ControlMode)[value]
    else:
        return value


def visualize_inpaint_mask(img):
    if img.ndim == 3 and img.shape[2] == 4:
        result = img.copy()
        mask = result[:, :, 3]
        mask = 255 - mask // 2
        result[:, :, 3] = mask
        return np.ascontiguousarray(result.copy())
    return img


def pixel_perfect_resolution(
    image: np.ndarray,
    target_H: int,
    target_W: int,
    resize_mode: ResizeMode,
) -> int:
    """
    Calculate the estimated resolution for resizing an image while preserving aspect ratio.

    The function first calculates scaling factors for the height and width of the image based
    on the target height and width. Then, based on the chosen resize mode, it takes either the
    smaller or the larger scaling factor to estimate the new resolution.

    If the resize mode is OUTER_FIT, the function uses the smaller scaling factor, ensuring the
    whole image fits within the target dimensions, potentially leaving some empty space.

    If the resize mode is not OUTER_FIT, the function uses the larger scaling factor, ensuring
    the target dimensions are fully filled, potentially cropping the image.

    After calculating the estimated resolution, the function logs some debugging information.

    Args:
        image (np.ndarray): A 3D numpy array representing an image. The dimensions represent
            [height, width, channels].
        target_H (int): The target height for the image.
        target_W (int): The target width for the image.
        resize_mode (ResizeMode): The mode for resizing.

    Returns:
        int: The estimated resolution after resizing.
    """
    raw_H, raw_W, _ = image.shape

    k0 = float(target_H) / float(raw_H)
    k1 = float(target_W) / float(raw_W)

    if resize_mode == ResizeMode.OUTER_FIT:
        estimation = min(k0, k1) * float(min(raw_H, raw_W))
    else:
        estimation = max(k0, k1) * float(min(raw_H, raw_W))

    logger.debug("Pixel Perfect Computation:")
    logger.debug(f"resize_mode = {resize_mode}")
    logger.debug(f"raw_H = {raw_H}")
    logger.debug(f"raw_W = {raw_W}")
    logger.debug(f"target_H = {target_H}")
    logger.debug(f"target_W = {target_W}")
    logger.debug(f"estimation = {estimation}")

    return int(np.round(estimation))
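A worked numeric check of the two branches, with the `ResizeMode` comparison reduced to a boolean flag for brevity:

```python
import numpy as np

def pixel_perfect_resolution(image, target_H, target_W, outer_fit):
    # Mirror of the computation above: pick the limiting (outer fit) or
    # covering (inner fit / resize) scale factor, then apply it to the
    # image's shorter side.
    raw_H, raw_W, _ = image.shape
    k0 = target_H / raw_H
    k1 = target_W / raw_W
    k = min(k0, k1) if outer_fit else max(k0, k1)
    return int(np.round(k * min(raw_H, raw_W)))

image = np.zeros((512, 1024, 3), dtype=np.uint8)  # landscape source
# Fit inside 768x768: scale by min(768/512, 768/1024) = 0.75 -> 512 * 0.75 = 384.
assert pixel_perfect_resolution(image, 768, 768, outer_fit=True) == 384
# Fill 768x768: scale by max(1.5, 0.75) = 1.5 -> 512 * 1.5 = 768.
assert pixel_perfect_resolution(image, 768, 768, outer_fit=False) == 768
```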


class GradioImageMaskPair(TypedDict):
    """Represents the dict object from Gradio's image component when `tool="sketch"`
    is specified.
    {
        "image": np.ndarray,
        "mask": np.ndarray,
    }
    """
    image: np.ndarray
    mask: np.ndarray


@dataclass
class ControlNetUnit:
    input_mode: InputMode = InputMode.SIMPLE
    use_preview_as_input: bool = False
    batch_image_dir: str = ''
    batch_mask_dir: str = ''
    batch_input_gallery: Optional[List[str]] = None
    batch_mask_gallery: Optional[List[str]] = None
    generated_image: Optional[np.ndarray] = None
    mask_image: Optional[GradioImageMaskPair] = None
    mask_image_fg: Optional[GradioImageMaskPair] = None
    hr_option: Union[HiResFixOption, int, str] = HiResFixOption.BOTH
    enabled: bool = True
    module: str = "None"
    model: str = "None"
    weight: float = 1.0
    image: Optional[GradioImageMaskPair] = None
    image_fg: Optional[GradioImageMaskPair] = None
    resize_mode: Union[ResizeMode, int, str] = ResizeMode.INNER_FIT
    processor_res: int = -1
    threshold_a: float = -1
    threshold_b: float = -1
    guidance_start: float = 0.0
    guidance_end: float = 1.0
    pixel_perfect: bool = False
    control_mode: Union[ControlMode, int, str] = ControlMode.BALANCED
    save_detected_map: bool = True

    @staticmethod
    def infotext_fields():
        """Fields that should be included in infotext.
        You should define a Gradio element with the exact same name in
        ControlNetUiGroup as well, so that infotext can wire the value to the
        correct field when pasting infotext.
        """
        return (
            "module",
            "model",
            "weight",
            "resize_mode",
            "processor_res",
            "threshold_a",
            "threshold_b",
            "guidance_start",
            "guidance_end",
            "pixel_perfect",
            "control_mode",
            "hr_option",
        )

    @staticmethod
    def from_dict(d: Dict) -> "ControlNetUnit":
        """Create a ControlNetUnit from a dict. This is primarily used to convert
        an API json dict to a ControlNetUnit."""
        unit = ControlNetUnit(
            **{k: v for k, v in d.items() if k in vars(ControlNetUnit)}
        )
        if isinstance(unit.image, str):
            unit.image = np.array(api.decode_base64_to_image(unit.image)).astype('uint8')
        if isinstance(unit.mask_image, str):
            unit.mask_image = np.array(api.decode_base64_to_image(unit.mask_image)).astype('uint8')
        return unit


# Backward compatible alias.
UiControlNetUnit = ControlNetUnit


def to_base64_nparray(encoding: str):
    """
    Convert a base64 image into the image type the extension uses.
    """

    return np.array(api.decode_base64_to_image(encoding)).astype('uint8')


def get_max_models_num():
    """
    Fetch the maximum number of allowed ControlNet models.
    """

    max_models_num = shared.opts.data.get("control_net_unit_count", 3)
    return max_models_num
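`from_dict` relies on `vars(ControlNetUnit)` containing every dataclass field that has a default, so unknown API keys are silently dropped instead of raising `TypeError`. A reduced sketch with only a few of the fields above:

```python
from dataclasses import dataclass

@dataclass
class ControlNetUnit:
    enabled: bool = True
    module: str = "None"
    model: str = "None"
    weight: float = 1.0

    @staticmethod
    def from_dict(d):
        # vars(ControlNetUnit) includes the class attributes created for the
        # dataclass defaults, so unrecognized API keys are filtered out here.
        return ControlNetUnit(**{k: v for k, v in d.items() if k in vars(ControlNetUnit)})

unit = ControlNetUnit.from_dict({"module": "canny", "weight": 0.5, "bogus_key": 123})
assert unit.module == "canny" and unit.weight == 0.5
assert unit.enabled is True  # untouched fields keep their defaults
```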
171
extensions-builtin/sd_forge_controlnet/lib_controlnet/global_state.py
Executable file
@@ -0,0 +1,171 @@
import os.path
import stat
from collections import OrderedDict

from modules import shared, sd_models
from modules_forge.shared import controlnet_dir, supported_preprocessors

from typing import Dict, Tuple, List

CN_MODEL_EXTS = [".pt", ".pth", ".ckpt", ".safetensors", ".bin", ".patch"]


def traverse_all_files(curr_path, model_list):
    f_list = [
        (os.path.join(curr_path, entry.name), entry.stat())
        for entry in os.scandir(curr_path)
        if os.path.isdir(curr_path)
    ]
    for f_info in f_list:
        fname, fstat = f_info
        if os.path.splitext(fname)[1] in CN_MODEL_EXTS:
            model_list.append(f_info)
        elif stat.S_ISDIR(fstat.st_mode):
            model_list = traverse_all_files(fname, model_list)
    return model_list


def get_all_models(sort_by, filter_by, path):
    res = OrderedDict()
    fileinfos = traverse_all_files(path, [])
    filter_by = filter_by.strip(" ")
    if len(filter_by) != 0:
        fileinfos = [x for x in fileinfos if filter_by.lower()
                     in os.path.basename(x[0]).lower()]
    if sort_by == "name":
        fileinfos = sorted(fileinfos, key=lambda x: os.path.basename(x[0]))
    elif sort_by == "date":
        fileinfos = sorted(fileinfos, key=lambda x: -x[1].st_mtime)
    elif sort_by == "path name":
        fileinfos = sorted(fileinfos)

    for finfo in fileinfos:
        filename = finfo[0]
        name = os.path.splitext(os.path.basename(filename))[0]
        # Prevent a hypothetical "None.pt" from being listed.
        if name != "None":
            res[name + f" [{sd_models.model_hash(filename)}]"] = filename

    return res
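`traverse_all_files` recurses by hand over `os.scandir`; the same collection can be sketched with `os.walk`, filtering by `CN_MODEL_EXTS` and sorting by basename as `get_all_models` does with `sort_by="name"` (an illustrative equivalent, not the extension's code):

```python
import os
import tempfile

CN_MODEL_EXTS = [".pt", ".pth", ".ckpt", ".safetensors", ".bin", ".patch"]

def collect_models(root):
    # Walk the directory tree and keep files with recognized model extensions.
    found = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1] in CN_MODEL_EXTS:
                found.append(os.path.join(dirpath, name))
    return sorted(found, key=os.path.basename)

with tempfile.TemporaryDirectory() as root:
    for name in ("canny.safetensors", "notes.txt", "depth.pth"):
        open(os.path.join(root, name), "w").close()
    models = [os.path.basename(p) for p in collect_models(root)]

assert models == ["canny.safetensors", "depth.pth"]  # notes.txt filtered out
```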


controlnet_filename_dict = {'None': 'model.safetensors'}
controlnet_names = ['None']


def get_preprocessor(name):
    return supported_preprocessors.get(name, None)

def get_default_preprocessor(tag):
    ps = get_filtered_preprocessor_names(tag)
    assert len(ps) > 0
    return ps[0] if len(ps) == 1 else ps[1]

def get_sorted_preprocessors():
    preprocessors = [p for k, p in supported_preprocessors.items() if k != 'None']
    preprocessors = sorted(preprocessors, key=lambda x: str(x.sorting_priority).zfill(8) + x.name)[::-1]
    results = OrderedDict()
    results['None'] = supported_preprocessors['None']
    for p in preprocessors:
        results[p.name] = p
    return results


def get_all_controlnet_names():
    return controlnet_names


def get_controlnet_filename(controlnet_name):
    return controlnet_filename_dict[controlnet_name]


def get_all_preprocessor_names():
    return list(get_sorted_preprocessors().keys())


def get_all_preprocessor_tags():
    tags = []
    for k, p in supported_preprocessors.items():
        tags += p.tags
    tags = list(set(tags))
    tags = sorted(tags)
    return ['All'] + tags


def get_filtered_preprocessors(tag):
    if tag == 'All':
        return supported_preprocessors
    return {k: v for k, v in get_sorted_preprocessors().items() if tag in v.tags or k == 'None'}


def get_filtered_preprocessor_names(tag):
    return list(get_filtered_preprocessors(tag).keys())


def get_filtered_controlnet_names(tag):
    filtered_preprocessors = get_filtered_preprocessors(tag)
    model_filename_filters = []
    for p in filtered_preprocessors.values():
        model_filename_filters += p.model_filename_filters
    return [x for x in controlnet_names if x == 'None' or any(f.lower() in x.lower() for f in model_filename_filters)]


def update_controlnet_filenames():
    global controlnet_filename_dict, controlnet_names

    controlnet_filename_dict = {'None': 'model.safetensors'}
    controlnet_names = ['None']

    ext_dirs = (shared.opts.data.get("control_net_models_path", None), getattr(shared.cmd_opts, 'controlnet_dir', None))
    extra_lora_paths = (extra_lora_path for extra_lora_path in ext_dirs
                        if extra_lora_path is not None and os.path.exists(extra_lora_path))
    paths = [controlnet_dir, *extra_lora_paths]

    for path in paths:
        sort_by = shared.opts.data.get("control_net_models_sort_models_by", "name")
        filter_by = shared.opts.data.get("control_net_models_name_filter", "")
        found = get_all_models(sort_by, filter_by, path)
        controlnet_filename_dict.update(found)

    controlnet_names = list(controlnet_filename_dict.keys())
    return


def select_control_type(
    control_type: str,
) -> Tuple[List[str], List[str], str, str]:
    global controlnet_names

    pattern = control_type.lower()
    all_models = list(controlnet_names)

    if pattern == "all":
        preprocessors = get_sorted_preprocessors().values()
        return [
            [p.name for p in preprocessors],
            all_models,
            'none',  # default option
            "None"   # default model
        ]

    filtered_model_list = get_filtered_controlnet_names(control_type)

    if pattern == "none":
        filtered_model_list.append("None")

    assert len(filtered_model_list) > 0, "'None' model should always be available."
    if len(filtered_model_list) == 1:
        default_model = "None"
    else:
        default_model = filtered_model_list[1]
        for x in filtered_model_list:
            if "11" in x.split("[")[0]:
                default_model = x
                break

    return (
        get_filtered_preprocessor_names(control_type),
        filtered_model_list,
        get_default_preprocessor(control_type),
        default_model
    )
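The default-model preference in `select_control_type` (prefer the first model whose name, before the `[hash]` suffix, contains "11", matching the ControlNet 1.1 naming convention) can be isolated as a standalone sketch; `choose_default_model` is a hypothetical helper name introduced here for illustration:

```python
def choose_default_model(filtered_model_list):
    # Mirrors the selection above: with only "None" available, return it;
    # otherwise prefer a ControlNet 1.1-style name, falling back to the
    # first real model in the list.
    if len(filtered_model_list) == 1:
        return "None"
    default_model = filtered_model_list[1]
    for x in filtered_model_list:
        if "11" in x.split("[")[0]:
            return x
    return default_model

models = ["None", "control_sd15_canny [fef5e48e]", "control_v11p_sd15_canny [d14c016b]"]
assert choose_default_model(models) == "control_v11p_sd15_canny [d14c016b]"
assert choose_default_model(["None"]) == "None"
```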
121
extensions-builtin/sd_forge_controlnet/lib_controlnet/infotext.py
Executable file
@@ -0,0 +1,121 @@
from typing import List, Tuple, Union

import gradio as gr

from modules.processing import StableDiffusionProcessing

from lib_controlnet import external_code
from lib_controlnet.logging import logger


def field_to_displaytext(fieldname: str) -> str:
    return " ".join([word.capitalize() for word in fieldname.split("_")])


def displaytext_to_field(text: str) -> str:
    return "_".join([word.lower() for word in text.split(" ")])


def parse_value(value: str) -> Union[str, float, int, bool]:
    if value in ("True", "False"):
        return value == "True"
    try:
        return int(value)
    except ValueError:
        try:
            return float(value)
        except ValueError:
            return value  # Plain string.


def serialize_unit(unit: external_code.ControlNetUnit) -> str:
    log_value = {
        field_to_displaytext(field): getattr(unit, field)
        for field in external_code.ControlNetUnit.infotext_fields()
        if getattr(unit, field) != -1
        # Note: exclude hidden slider values.
    }
    if not all("," not in str(v) and ":" not in str(v) for v in log_value.values()):
        logger.error(f"Unexpected tokens encountered:\n{log_value}")
        return ""

    return ", ".join(f"{field}: {value}" for field, value in log_value.items())


def parse_unit(text: str) -> external_code.ControlNetUnit:
    return external_code.ControlNetUnit(
        enabled=True,
        **{
            displaytext_to_field(key): parse_value(value)
            for item in text.split(",")
            for (key, value) in (item.strip().split(": "),)
        },
    )
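`serialize_unit` and `parse_unit` are inverses over the `Field Name: value` infotext format. A sketch of the parsing half, reusing the same helper definitions from above on an example serialized string:

```python
def field_to_displaytext(fieldname):
    return " ".join(word.capitalize() for word in fieldname.split("_"))

def displaytext_to_field(text):
    return "_".join(word.lower() for word in text.split(" "))

def parse_value(value):
    # Try bool, then int, then float, and keep the plain string otherwise.
    if value in ("True", "False"):
        return value == "True"
    try:
        return int(value)
    except ValueError:
        try:
            return float(value)
        except ValueError:
            return value

serialized = "Module: canny, Model: None, Weight: 1.0, Pixel Perfect: True"
parsed = {
    displaytext_to_field(key): parse_value(value)
    for item in serialized.split(",")
    for key, value in (item.strip().split(": "),)
}
assert parsed == {"module": "canny", "model": "None", "weight": 1.0, "pixel_perfect": True}
assert field_to_displaytext("pixel_perfect") == "Pixel Perfect"  # round trip
```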


class Infotext(object):
    def __init__(self) -> None:
        self.infotext_fields: List[Tuple[gr.components.IOComponent, str]] = []
        self.paste_field_names: List[str] = []

    @staticmethod
    def unit_prefix(unit_index: int) -> str:
        return f"ControlNet {unit_index}"

    def register_unit(self, unit_index: int, uigroup) -> None:
        """Register the unit's UI group. By registering the unit, A1111 will be
        able to paste values from infotext to IOComponents.

        Args:
            unit_index: The index of the ControlNet unit.
            uigroup: The ControlNetUiGroup instance that contains all gradio
                     iocomponents.
        """
        unit_prefix = Infotext.unit_prefix(unit_index)
        for field in external_code.ControlNetUnit.infotext_fields():
            # Every field in ControlNetUnit should have a corresponding
            # IOComponent in ControlNetUiGroup.
            io_component = getattr(uigroup, field)
            component_locator = f"{unit_prefix} {field}"
            self.infotext_fields.append((io_component, component_locator))
            self.paste_field_names.append(component_locator)

    @staticmethod
    def write_infotext(
        units: List[external_code.ControlNetUnit], p: StableDiffusionProcessing
    ):
        """Write infotext to `p`."""
        p.extra_generation_params.update(
            {
                Infotext.unit_prefix(i): serialize_unit(unit)
                for i, unit in enumerate(units)
                if unit.enabled
            }
        )

    @staticmethod
    def on_infotext_pasted(infotext: str, results: dict) -> None:
        """Parse a ControlNet infotext string and write the result to the `results` dict."""
        updates = {}
        for k, v in results.items():
            if not k.startswith("ControlNet"):
                continue

            assert isinstance(v, str), f"Expected string but got {v}."
            try:
                for field, value in vars(parse_unit(v)).items():
                    if field == "image":
                        continue
                    if value is None:
                        logger.debug(f"InfoText: Skipping {field} because value is None.")
                        continue

                    component_locator = f"{k} {field}"
                    updates[component_locator] = value
                    logger.debug(f"InfoText: Setting {component_locator} = {value}")
            except Exception as e:
                logger.warning(
                    f"Failed to parse infotext; legacy format infotext is no longer supported:\n{v}\n{e}"
                )

        results.update(updates)
41
extensions-builtin/sd_forge_controlnet/lib_controlnet/logging.py
Executable file
41
extensions-builtin/sd_forge_controlnet/lib_controlnet/logging.py
Executable file
@@ -0,0 +1,41 @@
|
||||
import logging
|
||||
import copy
|
||||
import sys
|
||||
|
||||
from modules import shared
|
||||
|
||||
|
||||
class ColoredFormatter(logging.Formatter):
|
||||
COLORS = {
|
||||
"DEBUG": "\033[0;36m", # CYAN
|
||||
"INFO": "\033[0;32m", # GREEN
|
||||
"WARNING": "\033[0;33m", # YELLOW
|
||||
"ERROR": "\033[0;31m", # RED
|
||||
"CRITICAL": "\033[0;37;41m", # WHITE ON RED
|
||||
"RESET": "\033[0m", # RESET COLOR
|
||||
}
|
||||
|
||||
def format(self, record):
|
||||
colored_record = copy.copy(record)
|
||||
levelname = colored_record.levelname
|
||||
seq = self.COLORS.get(levelname, self.COLORS["RESET"])
|
||||
colored_record.levelname = f"{seq}{levelname}{self.COLORS['RESET']}"
|
||||
return super().format(colored_record)
|
||||
|
||||
|
||||
# Create a new logger
|
||||
logger = logging.getLogger("ControlNet")
|
||||
logger.propagate = False
|
||||
|
||||
# Add handler if we don't have one.
|
||||
if not logger.handlers:
|
||||
handler = logging.StreamHandler(sys.stdout)
|
||||
handler.setFormatter(
|
||||
ColoredFormatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
|
||||
)
|
||||
logger.addHandler(handler)
|
||||
|
||||
# Configure logger
|
||||
loglevel_string = getattr(shared.cmd_opts, "controlnet_loglevel", "INFO")
|
||||
loglevel = getattr(logging, loglevel_string.upper(), None)
|
||||
logger.setLevel(loglevel)
|
||||
88
extensions-builtin/sd_forge_controlnet/lib_controlnet/lvminthin.py
Executable file
88
extensions-builtin/sd_forge_controlnet/lib_controlnet/lvminthin.py
Executable file
@@ -0,0 +1,88 @@
|
||||
# High Quality Edge Thinning using Pure Python
|
||||
# Written by Lvmin Zhang
|
||||
# 2023 April
|
||||
# Stanford University
|
||||
# If you use this, please Cite "High Quality Edge Thinning using Pure Python", Lvmin Zhang, In Mikubill/sd-webui-controlnet.
|
||||
|
||||
|
||||
import cv2
|
||||
import numpy as np
|
||||
|
||||
|
||||
lvmin_kernels_raw = [
|
||||
np.array([
|
||||
[-1, -1, -1],
|
||||
[0, 1, 0],
|
||||
[1, 1, 1]
|
||||
], dtype=np.int32),
|
||||
np.array([
|
||||
[0, -1, -1],
|
||||
[1, 1, -1],
|
||||
[0, 1, 0]
|
||||
], dtype=np.int32)
|
||||
]
|
||||
|
||||
lvmin_kernels = []
|
||||
lvmin_kernels += [np.rot90(x, k=0, axes=(0, 1)) for x in lvmin_kernels_raw]
|
||||
lvmin_kernels += [np.rot90(x, k=1, axes=(0, 1)) for x in lvmin_kernels_raw]
|
||||
lvmin_kernels += [np.rot90(x, k=2, axes=(0, 1)) for x in lvmin_kernels_raw]
|
||||
lvmin_kernels += [np.rot90(x, k=3, axes=(0, 1)) for x in lvmin_kernels_raw]
|
||||
|
||||
lvmin_prunings_raw = [
|
||||
np.array([
|
||||
[-1, -1, -1],
|
||||
[-1, 1, -1],
|
||||
[0, 0, -1]
|
||||
], dtype=np.int32),
|
||||
np.array([
|
||||
[-1, -1, -1],
|
||||
[-1, 1, -1],
|
||||
[-1, 0, 0]
|
||||
], dtype=np.int32)
|
||||
]
|
||||
|
||||
lvmin_prunings = []
|
||||
lvmin_prunings += [np.rot90(x, k=0, axes=(0, 1)) for x in lvmin_prunings_raw]
|
||||
lvmin_prunings += [np.rot90(x, k=1, axes=(0, 1)) for x in lvmin_prunings_raw]
|
||||
lvmin_prunings += [np.rot90(x, k=2, axes=(0, 1)) for x in lvmin_prunings_raw]
|
||||
lvmin_prunings += [np.rot90(x, k=3, axes=(0, 1)) for x in lvmin_prunings_raw]
|
||||
|
||||
|
||||
def remove_pattern(x, kernel):
|
||||
objects = cv2.morphologyEx(x, cv2.MORPH_HITMISS, kernel)
|
||||
objects = np.where(objects > 127)
|
||||
x[objects] = 0
|
||||
return x, objects[0].shape[0] > 0
|
||||
|
||||
|
||||
def thin_one_time(x, kernels):
|
||||
y = x
|
||||
is_done = True
|
||||
for k in kernels:
|
||||
y, has_update = remove_pattern(y, k)
|
||||
if has_update:
|
||||
is_done = False
|
||||
return y, is_done
|
||||
|
||||
|
||||
def lvmin_thin(x, prunings=True):
|
||||
y = x
|
||||
for i in range(32):
|
||||
y, is_done = thin_one_time(y, lvmin_kernels)
|
||||
if is_done:
|
||||
break
|
||||
if prunings:
|
||||
y, _ = thin_one_time(y, lvmin_prunings)
|
||||
return y
|
||||
|
||||
|
||||
def nake_nms(x):
|
||||
f1 = np.array([[0, 0, 0], [1, 1, 1], [0, 0, 0]], dtype=np.uint8)
|
||||
f2 = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], dtype=np.uint8)
|
||||
f3 = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=np.uint8)
|
||||
f4 = np.array([[0, 0, 1], [0, 1, 0], [1, 0, 0]], dtype=np.uint8)
|
||||
y = np.zeros_like(x)
|
||||
for f in [f1, f2, f3, f4]:
|
||||
np.putmask(y, cv2.dilate(x, kernel=f) == x, x)
|
||||
return y
|
||||
|
||||
362
extensions-builtin/sd_forge_controlnet/lib_controlnet/utils.py
Executable file
362
extensions-builtin/sd_forge_controlnet/lib_controlnet/utils.py
Executable file
@@ -0,0 +1,362 @@
|
||||
from typing import Optional
|
||||
from modules import processing
|
||||
|
||||
from lib_controlnet import external_code
|
||||
|
||||
from modules_forge.utils import HWC3
|
||||
|
||||
from PIL import Image, ImageFilter, ImageOps
|
||||
from lib_controlnet.lvminthin import lvmin_thin, nake_nms
|
||||
|
||||
import torch
|
||||
import os
|
||||
import functools
|
||||
import time
|
||||
import base64
|
||||
import numpy as np
|
||||
import safetensors.torch
|
||||
import cv2
|
||||
import logging
|
||||
|
||||
from typing import Any, Callable, Dict, List
|
||||
from lib_controlnet.logging import logger
|
||||
|
||||
|
||||
def load_state_dict(ckpt_path, location="cpu"):
|
||||
_, extension = os.path.splitext(ckpt_path)
|
||||
if extension.lower() == ".safetensors":
|
||||
state_dict = safetensors.torch.load_file(ckpt_path, device=location)
|
||||
else:
|
||||
state_dict = torch.load(ckpt_path, map_location=torch.device(location))
|
||||
state_dict = get_state_dict(state_dict)
|
||||
logger.info(f"Loaded state_dict from [{ckpt_path}]")
|
||||
return state_dict
|
||||
|
||||
|
||||
def get_state_dict(d):
|
||||
return d.get("state_dict", d)
|
||||
|
||||
|
||||
def ndarray_lru_cache(max_size: int = 128, typed: bool = False):
|
||||
"""
|
||||
Decorator to enable caching for functions with numpy array arguments.
|
||||
Numpy arrays are mutable, and thus not directly usable as hash keys.
|
||||
|
||||
The idea here is to wrap the incoming arguments with type `np.ndarray`
|
||||
as `HashableNpArray` so that `lru_cache` can correctly handles `np.ndarray`
|
||||
arguments.
|
||||
|
||||
`HashableNpArray` functions exactly the same way as `np.ndarray` except
|
||||
having `__hash__` and `__eq__` overriden.
|
||||
"""
|
||||
|
||||
def decorator(func: Callable):
|
||||
"""The actual decorator that accept function as input."""
|
||||
|
||||
class HashableNpArray(np.ndarray):
|
||||
def __new__(cls, input_array):
|
||||
# Input array is an instance of ndarray.
|
||||
# The view makes the input array and returned array share the same data.
|
||||
obj = np.asarray(input_array).view(cls)
|
||||
return obj
|
||||
|
||||
def __eq__(self, other) -> bool:
|
||||
return np.array_equal(self, other)
|
||||
|
||||
def __hash__(self):
|
||||
# Hash the bytes representing the data of the array.
|
||||
return hash(self.tobytes())
|
||||
|
||||
@functools.lru_cache(maxsize=max_size, typed=typed)
|
||||
def cached_func(*args, **kwargs):
|
||||
"""This function only accepts `HashableNpArray` as input params."""
|
||||
return func(*args, **kwargs)
|
||||
|
||||
# Preserves original function.__name__ and __doc__.
|
||||
@functools.wraps(func)
|
||||
def decorated_func(*args, **kwargs):
|
||||
"""The decorated function that delegates the original function."""
|
||||
|
||||
def convert_item(item: Any):
|
||||
if isinstance(item, np.ndarray):
|
||||
return HashableNpArray(item)
|
||||
if isinstance(item, tuple):
|
||||
return tuple(convert_item(i) for i in item)
|
||||
return item
|
||||
|
||||
args = [convert_item(arg) for arg in args]
|
||||
kwargs = {k: convert_item(arg) for k, arg in kwargs.items()}
|
||||
return cached_func(*args, **kwargs)
|
||||
|
||||
return decorated_func
|
||||
|
||||
return decorator
|
||||
|
||||
|
||||
def timer_decorator(func):
|
||||
"""Time the decorated function and output the result to debug logger."""
|
||||
if logger.level != logging.DEBUG:
|
||||
return func
|
||||
|
||||
@functools.wraps(func)
|
||||
def wrapper(*args, **kwargs):
|
||||
start_time = time.time()
|
||||
result = func(*args, **kwargs)
|
||||
end_time = time.time()
|
||||
duration = end_time - start_time
|
||||
# Only report function that are significant enough.
|
||||
if duration > 1e-3:
|
||||
logger.debug(f"{func.__name__} ran in: {duration:.3f} sec")
|
||||
return result
|
||||
|
||||
return wrapper
|
||||
|
||||
|
||||
class TimeMeta(type):
|
||||
""" Metaclass to record execution time on all methods of the
|
||||
child class. """
|
||||
def __new__(cls, name, bases, attrs):
|
||||
for attr_name, attr_value in attrs.items():
|
||||
if callable(attr_value):
|
||||
attrs[attr_name] = timer_decorator(attr_value)
|
||||
return super().__new__(cls, name, bases, attrs)
|
||||
|
||||
|
||||
# svgsupports
|
||||
svgsupport = False
|
||||
try:
|
||||
import io
|
||||
from svglib.svglib import svg2rlg
|
||||
from reportlab.graphics import renderPM
|
||||
|
||||
svgsupport = True
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
|
||||
def svg_preprocess(inputs: Dict, preprocess: Callable):
|
||||
if not inputs:
|
||||
return None
|
||||
|
||||
if inputs["image"].startswith("data:image/svg+xml;base64,") and svgsupport:
|
||||
svg_data = base64.b64decode(
|
||||
inputs["image"].replace("data:image/svg+xml;base64,", "")
|
||||
)
|
||||
drawing = svg2rlg(io.BytesIO(svg_data))
|
||||
png_data = renderPM.drawToString(drawing, fmt="PNG")
|
||||
encoded_string = base64.b64encode(png_data)
|
||||
base64_str = str(encoded_string, "utf-8")
|
||||
base64_str = "data:image/png;base64," + base64_str
|
||||
inputs["image"] = base64_str
|
||||
return preprocess(inputs)
|
||||
|
||||
|
||||
def get_unique_axis0(data):
|
||||
arr = np.asanyarray(data)
|
||||
idxs = np.lexsort(arr.T)
|
||||
arr = arr[idxs]
|
||||
unique_idxs = np.empty(len(arr), dtype=np.bool_)
|
||||
unique_idxs[:1] = True
|
||||
unique_idxs[1:] = np.any(arr[:-1, :] != arr[1:, :], axis=-1)
|
||||
return arr[unique_idxs]
|
||||
|
||||
|
||||
def read_image(img_path: str) -> str:
|
||||
"""Read image from specified path and return a base64 string."""
|
||||
img = cv2.imread(img_path)
|
||||
_, bytes = cv2.imencode(".png", img)
|
||||
encoded_image = base64.b64encode(bytes).decode("utf-8")
|
||||
return encoded_image
|
||||
|
||||
|
||||
def read_image_dir(img_dir: str, suffixes=('.png', '.jpg', '.jpeg', '.webp')) -> List[str]:
|
||||
"""Try read all images in given img_dir."""
|
||||
images = []
|
||||
for filename in os.listdir(img_dir):
|
||||
if filename.endswith(suffixes):
|
||||
img_path = os.path.join(img_dir, filename)
|
||||
try:
|
||||
images.append(read_image(img_path))
|
||||
except IOError:
|
||||
logger.error(f"Error opening {img_path}")
|
||||
return images
|
||||
|
||||
|
||||
def align_dim_latent(x: int) -> int:
|
||||
""" Align the pixel dimension (w/h) to latent dimension.
|
||||
Stable diffusion 1:8 ratio for latent/pixel, i.e.,
|
||||
1 latent unit == 8 pixel unit."""
|
||||
return (x // 8) * 8
|
||||
|
||||
|
||||
def prepare_mask(
|
||||
mask: Image.Image, p: processing.StableDiffusionProcessing
|
||||
) -> Image.Image:
|
||||
"""
|
||||
Prepare an image mask for the inpainting process.
|
||||
|
||||
This function takes as input a PIL Image object and an instance of the
|
||||
StableDiffusionProcessing class, and performs the following steps to prepare the mask:
|
||||
|
||||
1. Convert the mask to grayscale (mode "L").
|
||||
2. If the 'inpainting_mask_invert' attribute of the processing instance is True,
|
||||
invert the mask colors.
|
||||
3. If the 'mask_blur' attribute of the processing instance is greater than 0,
|
||||
apply a Gaussian blur to the mask with a radius equal to 'mask_blur'.
|
||||
|
||||
Args:
|
||||
mask (Image.Image): The input mask as a PIL Image object.
|
||||
p (processing.StableDiffusionProcessing): An instance of the StableDiffusionProcessing class
|
||||
containing the processing parameters.
|
||||
|
||||
Returns:
|
||||
mask (Image.Image): The prepared mask as a PIL Image object.
|
||||
"""
|
||||
mask = mask.convert("L")
|
||||
if getattr(p, "inpainting_mask_invert", False):
|
||||
mask = ImageOps.invert(mask)
|
||||
|
||||
if hasattr(p, 'mask_blur_x'):
|
||||
if getattr(p, "mask_blur_x", 0) > 0:
|
||||
np_mask = np.array(mask)
|
||||
kernel_size = 2 * int(2.5 * p.mask_blur_x + 0.5) + 1
|
||||
np_mask = cv2.GaussianBlur(np_mask, (kernel_size, 1), p.mask_blur_x)
|
||||
mask = Image.fromarray(np_mask)
|
||||
if getattr(p, "mask_blur_y", 0) > 0:
|
||||
np_mask = np.array(mask)
|
||||
kernel_size = 2 * int(2.5 * p.mask_blur_y + 0.5) + 1
|
||||
np_mask = cv2.GaussianBlur(np_mask, (1, kernel_size), p.mask_blur_y)
|
||||
mask = Image.fromarray(np_mask)
|
||||
else:
|
||||
if getattr(p, "mask_blur", 0) > 0:
|
||||
mask = mask.filter(ImageFilter.GaussianBlur(p.mask_blur))
|
||||
|
||||
return mask
|
||||
|
||||
|
||||
def set_numpy_seed(p: processing.StableDiffusionProcessing) -> Optional[int]:
|
||||
"""
|
||||
Set the random seed for NumPy based on the provided parameters.
|
||||
|
||||
Args:
|
||||
p (processing.StableDiffusionProcessing): The instance of the StableDiffusionProcessing class.
|
||||
|
||||
Returns:
|
||||
Optional[int]: The computed random seed if successful, or None if an exception occurs.
|
||||
|
||||
This function sets the random seed for NumPy using the seed and subseed values from the given instance of
|
||||
StableDiffusionProcessing. If either seed or subseed is -1, it uses the first value from `all_seeds`.
|
||||
Otherwise, it takes the maximum of the provided seed value and 0.
|
||||
|
||||
The final random seed is computed by adding the seed and subseed values, applying a bitwise AND operation
|
||||
with 0xFFFFFFFF to ensure it fits within a 32-bit integer.
|
||||
"""
|
||||
try:
|
||||
tmp_seed = int(p.all_seeds[0] if p.seed == -1 else max(int(p.seed), 0))
|
||||
tmp_subseed = int(p.all_seeds[0] if p.subseed == -1 else max(int(p.subseed), 0))
|
||||
seed = (tmp_seed + tmp_subseed) & 0xFFFFFFFF
|
||||
np.random.seed(seed)
|
||||
return seed
|
||||
except Exception as e:
|
||||
logger.warning(e)
|
||||
logger.warning('Warning: Failed to use consistent random seed.')
|
||||
return None
|
||||
|
||||
|
||||
def safe_numpy(x):
|
||||
# A very safe method to make sure that Apple/Mac works
|
||||
y = x
|
||||
|
||||
# below is very boring but do not change these. If you change these Apple or Mac may fail.
|
||||
y = y.copy()
|
||||
y = np.ascontiguousarray(y)
|
||||
y = y.copy()
|
||||
return y
|
||||
|
||||
|
||||
def high_quality_resize(x, size):
|
||||
# Written by lvmin
|
||||
# Super high-quality control map up-scaling, considering binary, seg, and one-pixel edges
|
||||
|
||||
if x.shape[0] != size[1] or x.shape[1] != size[0]:
|
||||
new_size_is_smaller = (size[0] * size[1]) < (x.shape[0] * x.shape[1])
|
||||
new_size_is_bigger = (size[0] * size[1]) > (x.shape[0] * x.shape[1])
|
||||
unique_color_count = len(get_unique_axis0(x.reshape(-1, x.shape[2])))
|
||||
is_one_pixel_edge = False
|
||||
is_binary = False
|
||||
if unique_color_count == 2:
|
||||
is_binary = np.min(x) < 16 and np.max(x) > 240
|
||||
if is_binary:
|
||||
xc = x
|
||||
xc = cv2.erode(xc, np.ones(shape=(3, 3), dtype=np.uint8), iterations=1)
|
||||
xc = cv2.dilate(xc, np.ones(shape=(3, 3), dtype=np.uint8), iterations=1)
|
||||
one_pixel_edge_count = np.where(xc < x)[0].shape[0]
|
||||
all_edge_count = np.where(x > 127)[0].shape[0]
|
||||
is_one_pixel_edge = one_pixel_edge_count * 2 > all_edge_count
|
||||
|
||||
if 2 < unique_color_count < 200:
|
||||
interpolation = cv2.INTER_NEAREST
|
||||
elif new_size_is_smaller:
|
||||
interpolation = cv2.INTER_AREA
|
||||
else:
|
||||
interpolation = cv2.INTER_CUBIC # Must be CUBIC because we now use nms. NEVER CHANGE THIS
|
||||
|
||||
y = cv2.resize(x, size, interpolation=interpolation)
|
||||
|
||||
if is_binary:
|
||||
y = np.mean(y.astype(np.float32), axis=2).clip(0, 255).astype(np.uint8)
|
||||
if is_one_pixel_edge:
|
||||
y = nake_nms(y)
|
||||
_, y = cv2.threshold(y, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
|
||||
y = lvmin_thin(y, prunings=new_size_is_bigger)
|
||||
else:
|
||||
_, y = cv2.threshold(y, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
|
||||
y = np.stack([y] * 3, axis=2)
|
||||
else:
|
||||
y = x
|
||||
|
||||
return y
|
||||
|
||||
|
||||
def crop_and_resize_image(detected_map, resize_mode, h, w, fill_border_with_255=False):
|
||||
if resize_mode == external_code.ResizeMode.RESIZE:
|
||||
detected_map = high_quality_resize(detected_map, (w, h))
|
||||
detected_map = safe_numpy(detected_map)
|
||||
return detected_map
|
||||
|
||||
old_h, old_w, _ = detected_map.shape
|
||||
old_w = float(old_w)
|
||||
old_h = float(old_h)
|
||||
k0 = float(h) / old_h
|
||||
k1 = float(w) / old_w
|
||||
|
||||
safeint = lambda x: int(np.round(x))
|
||||
|
||||
if resize_mode == external_code.ResizeMode.OUTER_FIT:
|
||||
k = min(k0, k1)
|
||||
borders = np.concatenate([detected_map[0, :, :], detected_map[-1, :, :], detected_map[:, 0, :], detected_map[:, -1, :]], axis=0)
|
||||
high_quality_border_color = np.median(borders, axis=0).astype(detected_map.dtype)
|
||||
if fill_border_with_255:
|
||||
high_quality_border_color = np.zeros_like(high_quality_border_color) + 255
|
||||
high_quality_background = np.tile(high_quality_border_color[None, None], [h, w, 1])
|
||||
detected_map = high_quality_resize(detected_map, (safeint(old_w * k), safeint(old_h * k)))
|
||||
new_h, new_w, _ = detected_map.shape
|
||||
pad_h = max(0, (h - new_h) // 2)
|
||||
pad_w = max(0, (w - new_w) // 2)
|
||||
high_quality_background[pad_h:pad_h + new_h, pad_w:pad_w + new_w] = detected_map
|
||||
detected_map = high_quality_background
|
||||
detected_map = safe_numpy(detected_map)
|
||||
return detected_map
|
||||
else:
|
||||
k = max(k0, k1)
|
||||
detected_map = high_quality_resize(detected_map, (safeint(old_w * k), safeint(old_h * k)))
|
||||
new_h, new_w, _ = detected_map.shape
|
||||
pad_h = max(0, (new_h - h) // 2)
|
||||
pad_w = max(0, (new_w - w) // 2)
|
||||
detected_map = detected_map[pad_h:pad_h+h, pad_w:pad_w+w]
|
||||
detected_map = safe_numpy(detected_map)
|
||||
return detected_map
|
||||
|
||||
|
||||
def judge_image_type(img):
|
||||
return isinstance(img, np.ndarray) and img.ndim == 3 and int(img.shape[2]) in [3, 4]
|
||||
13
extensions-builtin/sd_forge_controlnet/preload.py
Executable file
13
extensions-builtin/sd_forge_controlnet/preload.py
Executable file
@@ -0,0 +1,13 @@
|
||||
def preload(parser):
|
||||
parser.add_argument(
|
||||
"--controlnet-loglevel",
|
||||
default="INFO",
|
||||
choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
|
||||
help="Set the log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
|
||||
)
|
||||
parser.add_argument(
|
||||
"--controlnet-tracemalloc",
|
||||
action="store_true",
|
||||
help="Enable memory tracing.",
|
||||
default=None,
|
||||
)
|
||||
5
extensions-builtin/sd_forge_controlnet/requirements.txt
Executable file
5
extensions-builtin/sd_forge_controlnet/requirements.txt
Executable file
@@ -0,0 +1,5 @@
|
||||
fvcore
|
||||
mediapipe
|
||||
onnxruntime
|
||||
opencv-python>=4.8.0
|
||||
svglib
|
||||
617
extensions-builtin/sd_forge_controlnet/scripts/controlnet.py
Executable file
617
extensions-builtin/sd_forge_controlnet/scripts/controlnet.py
Executable file
@@ -0,0 +1,617 @@
|
||||
import os
|
||||
from typing import Dict, Optional, Tuple, List, Union
|
||||
|
||||
import cv2
|
||||
import torch
|
||||
|
||||
import modules.scripts as scripts
|
||||
from modules import shared, script_callbacks, masking, images
|
||||
from modules.ui_components import InputAccordion
|
||||
from modules.api.api import decode_base64_to_image
|
||||
import gradio as gr
|
||||
|
||||
from lib_controlnet import global_state, external_code
|
||||
from lib_controlnet.external_code import ControlNetUnit
|
||||
from lib_controlnet.utils import align_dim_latent, set_numpy_seed, crop_and_resize_image, \
|
||||
prepare_mask, judge_image_type
|
||||
from lib_controlnet.controlnet_ui.controlnet_ui_group import ControlNetUiGroup
|
||||
from lib_controlnet.controlnet_ui.photopea import Photopea
|
||||
from lib_controlnet.logging import logger
|
||||
from modules.processing import StableDiffusionProcessingImg2Img, StableDiffusionProcessingTxt2Img, \
|
||||
StableDiffusionProcessing
|
||||
from lib_controlnet.infotext import Infotext
|
||||
from modules_forge.utils import HWC3, numpy_to_pytorch
|
||||
from lib_controlnet.enums import HiResFixOption
|
||||
from lib_controlnet.api import controlnet_api
|
||||
|
||||
import numpy as np
|
||||
import functools
|
||||
|
||||
from PIL import Image
|
||||
from modules_forge.shared import try_load_supported_control_model
|
||||
from modules_forge.supported_controlnet import ControlModelPatcher
|
||||
|
||||
# Gradio 3.32 bug fix
|
||||
import tempfile
|
||||
|
||||
gradio_tempfile_path = os.path.join(tempfile.gettempdir(), 'gradio')
|
||||
os.makedirs(gradio_tempfile_path, exist_ok=True)
|
||||
|
||||
global_state.update_controlnet_filenames()
|
||||
|
||||
|
||||
@functools.lru_cache(maxsize=shared.opts.data.get("control_net_model_cache_size", 5))
|
||||
def cached_controlnet_loader(filename):
|
||||
return try_load_supported_control_model(filename)
|
||||
|
||||
|
||||
class ControlNetCachedParameters:
|
||||
def __init__(self):
|
||||
self.preprocessor = None
|
||||
self.model = None
|
||||
self.control_cond = None
|
||||
self.control_cond_for_hr_fix = None
|
||||
self.control_mask = None
|
||||
self.control_mask_for_hr_fix = None
|
||||
|
||||
|
||||
class ControlNetForForgeOfficial(scripts.Script):
|
||||
sorting_priority = 10
|
||||
|
||||
def title(self):
|
||||
return "ControlNet"
|
||||
|
||||
def show(self, is_img2img):
|
||||
return scripts.AlwaysVisible
|
||||
|
||||
def ui(self, is_img2img):
|
||||
infotext = Infotext()
|
||||
ui_groups = []
|
||||
controls = []
|
||||
max_models = shared.opts.data.get("control_net_unit_count", 3)
|
||||
gen_type = "img2img" if is_img2img else "txt2img"
|
||||
elem_id_tabname = gen_type + "_controlnet"
|
||||
default_unit = ControlNetUnit(enabled=False, module="None", model="None")
|
||||
with gr.Group(elem_id=elem_id_tabname):
|
||||
with gr.Accordion(f"ControlNet Integrated", open=False, elem_id="controlnet",
|
||||
elem_classes=["controlnet"]):
|
||||
photopea = (
|
||||
Photopea()
|
||||
if not shared.opts.data.get("controlnet_disable_photopea_edit", False)
|
||||
else None
|
||||
)
|
||||
with gr.Row(elem_id=elem_id_tabname + "_accordions", elem_classes="accordions"):
|
||||
for i in range(max_models):
|
||||
with InputAccordion(
|
||||
value=False,
|
||||
label=f"ControlNet Unit {i}",
|
||||
elem_classes=["cnet-unit-enabled-accordion"], # Class on accordion
|
||||
):
|
||||
group = ControlNetUiGroup(is_img2img, default_unit, photopea)
|
||||
ui_groups.append(group)
|
||||
controls.append(group.render(f"ControlNet-{i}", elem_id_tabname))
|
||||
|
||||
for i, ui_group in enumerate(ui_groups):
|
||||
infotext.register_unit(i, ui_group)
|
||||
if shared.opts.data.get("control_net_sync_field_args", True):
|
||||
self.infotext_fields = infotext.infotext_fields
|
||||
self.paste_field_names = infotext.paste_field_names
|
||||
return tuple(controls)
|
||||
|
||||
def get_enabled_units(self, units):
|
||||
# Parse dict from API calls.
|
||||
units = [
|
||||
ControlNetUnit.from_dict(unit) if isinstance(unit, dict) else unit
|
||||
for unit in units
|
||||
]
|
||||
assert all(isinstance(unit, ControlNetUnit) for unit in units)
|
||||
enabled_units = [x for x in units if x.enabled]
|
||||
return enabled_units
|
||||
|
||||
@staticmethod
|
||||
def try_crop_image_with_a1111_mask(
|
||||
p: StableDiffusionProcessing,
|
||||
unit: ControlNetUnit,
|
||||
input_image: np.ndarray,
|
||||
resize_mode: external_code.ResizeMode,
|
||||
preprocessor
|
||||
) -> np.ndarray:
|
||||
a1111_mask_image: Optional[Image.Image] = getattr(p, "image_mask", None)
|
||||
is_only_masked_inpaint = (
|
||||
issubclass(type(p), StableDiffusionProcessingImg2Img) and
|
||||
p.inpaint_full_res and
|
||||
a1111_mask_image is not None
|
||||
)
|
||||
if (
|
||||
preprocessor.corp_image_with_a1111_mask_when_in_img2img_inpaint_tab
|
||||
and is_only_masked_inpaint
|
||||
):
|
||||
logger.info("Crop input image based on A1111 mask.")
|
||||
input_image = [input_image[:, :, i] for i in range(input_image.shape[2])]
|
||||
input_image = [Image.fromarray(x) for x in input_image]
|
||||
|
||||
mask = prepare_mask(a1111_mask_image, p)
|
||||
|
||||
crop_region = masking.get_crop_region(np.array(mask), p.inpaint_full_res_padding)
|
||||
crop_region = masking.expand_crop_region(crop_region, p.width, p.height, mask.width, mask.height)
|
||||
|
||||
input_image = [
|
||||
images.resize_image(resize_mode.int_value(), i, mask.width, mask.height)
|
||||
for i in input_image
|
||||
]
|
||||
|
||||
input_image = [x.crop(crop_region) for x in input_image]
|
||||
input_image = [
|
||||
images.resize_image(external_code.ResizeMode.OUTER_FIT.int_value(), x, p.width, p.height)
|
||||
for x in input_image
|
||||
]
|
||||
|
||||
input_image = [np.asarray(x)[:, :, 0] for x in input_image]
|
||||
input_image = np.stack(input_image, axis=2)
|
||||
return input_image
|
||||
|
||||
def get_input_data(self, p, unit, preprocessor, h, w):
|
||||
logger.info(f'ControlNet Input Mode: {unit.input_mode}')
|
||||
image_list = []
|
||||
resize_mode = external_code.resize_mode_from_value(unit.resize_mode)
|
||||
|
||||
if unit.input_mode == external_code.InputMode.MERGE:
|
||||
for idx, item in enumerate(unit.batch_input_gallery):
|
||||
img_path = item[0]
|
||||
logger.info(f'Try to read image: {img_path}')
|
||||
img = np.ascontiguousarray(cv2.imread(img_path)[:, :, ::-1]).copy()
|
||||
mask = None
|
||||
if unit.batch_mask_gallery is not None and len(unit.batch_mask_gallery) > 0:
|
||||
if len(unit.batch_mask_gallery) >= len(unit.batch_input_gallery):
|
||||
mask_path = unit.batch_mask_gallery[idx]['name']
|
||||
else:
|
||||
mask_path = unit.batch_mask_gallery[0]['name']
|
||||
mask = np.ascontiguousarray(cv2.imread(mask_path)[:, :, ::-1]).copy()
|
||||
if img is not None:
|
||||
image_list.append([img, mask])
|
||||
elif unit.input_mode == external_code.InputMode.BATCH:
|
||||
image_list = []
|
||||
image_extensions = ['.jpg', '.jpeg', '.png', '.bmp']
|
||||
batch_image_files = shared.listfiles(unit.batch_image_dir)
|
||||
for batch_modifier in getattr(unit, 'batch_modifiers', []):
|
||||
batch_image_files = batch_modifier(batch_image_files, p)
|
||||
for idx, filename in enumerate(batch_image_files):
|
||||
if any(filename.lower().endswith(ext) for ext in image_extensions):
|
||||
img_path = os.path.join(unit.batch_image_dir, filename)
|
||||
logger.info(f'Try to read image: {img_path}')
|
||||
img = np.ascontiguousarray(cv2.imread(img_path)[:, :, ::-1]).copy()
|
||||
mask = None
|
||||
if unit.batch_mask_dir:
|
||||
batch_mask_files = shared.listfiles(unit.batch_mask_dir)
|
||||
if len(batch_mask_files) >= len(batch_image_files):
|
||||
mask_path = batch_mask_files[idx]
|
||||
else:
|
||||
mask_path = batch_mask_files[0]
|
||||
mask_path = os.path.join(unit.batch_mask_dir, mask_path)
|
||||
mask = np.ascontiguousarray(cv2.imread(mask_path)[:, :, ::-1]).copy()
|
||||
if img is not None:
|
||||
image_list.append([img, mask])
|
||||
else:
|
||||
a1111_i2i_image = getattr(p, "init_images", [None])[0]
|
||||
a1111_i2i_mask = getattr(p, "image_mask", None)
|
||||
|
||||
using_a1111_data = False
|
||||
|
||||
unit_image = unit.image
|
||||
unit_image_fg = unit.image_fg[:, :, 3] if unit.image_fg is not None else None
|
||||
|
||||
if unit.use_preview_as_input and unit.generated_image is not None:
|
||||
image = unit.generated_image
|
||||
elif unit.image is None:
|
||||
resize_mode = external_code.resize_mode_from_value(p.resize_mode)
|
||||
image = HWC3(np.asarray(a1111_i2i_image))
|
||||
using_a1111_data = True
|
||||
elif (unit_image < 5).all() and (unit_image_fg > 5).any():
|
||||
image = unit_image_fg
|
||||
else:
|
||||
image = unit_image
|
||||
|
||||
if not isinstance(image, np.ndarray):
|
||||
raise ValueError("controlnet is enabled but no input image is given")
|
||||
|
||||
image = HWC3(image)
|
||||
|
||||
unit_mask_image = unit.mask_image
|
||||
unit_mask_image_fg = unit.mask_image_fg[:, :, 3] if unit.mask_image_fg is not None else None
|
||||
|
||||
if using_a1111_data:
|
||||
mask = HWC3(np.asarray(a1111_i2i_mask)) if a1111_i2i_mask is not None else None
|
||||
elif unit_mask_image_fg is not None and (unit_mask_image_fg > 5).any():
|
||||
mask = unit_mask_image_fg
|
||||
elif unit_mask_image is not None and (unit_mask_image > 5).any():
|
||||
mask = unit_mask_image
|
||||
elif unit_image_fg is not None and (unit_image_fg > 5).any():
|
||||
mask = unit_image_fg
|
||||
else:
|
||||
mask = None
|
||||
|
||||
image = self.try_crop_image_with_a1111_mask(p, unit, image, resize_mode, preprocessor)
|
||||
|
||||
if mask is not None:
|
||||
mask = cv2.resize(HWC3(mask), (image.shape[1], image.shape[0]), interpolation=cv2.INTER_NEAREST)
|
||||
mask = self.try_crop_image_with_a1111_mask(p, unit, mask, resize_mode, preprocessor)
|
||||
|
||||
image_list = [[image, mask]]
|
||||
|
||||
if resize_mode == external_code.ResizeMode.OUTER_FIT and preprocessor.expand_mask_when_resize_and_fill:
|
||||
new_image_list = []
|
||||
for input_image, input_mask in image_list:
|
||||
if input_mask is None:
|
||||
input_mask = np.zeros_like(input_image)
|
||||
input_mask = crop_and_resize_image(
|
||||
input_mask,
|
||||
external_code.ResizeMode.OUTER_FIT, h, w,
|
||||
fill_border_with_255=True,
|
||||
)
|
||||
input_image = crop_and_resize_image(
|
||||
input_image,
|
||||
external_code.ResizeMode.OUTER_FIT, h, w,
|
||||
fill_border_with_255=False,
|
||||
)
|
||||
new_image_list.append((input_image, input_mask))
|
||||
image_list = new_image_list
|
||||
|
||||
return image_list, resize_mode
|
||||
|
||||
@staticmethod
|
||||
def get_target_dimensions(p: StableDiffusionProcessing) -> Tuple[int, int, int, int]:
|
||||
"""Returns (h, w, hr_h, hr_w)."""
|
||||
h = align_dim_latent(p.height)
|
||||
w = align_dim_latent(p.width)
|
||||
|
||||
high_res_fix = (
|
||||
isinstance(p, StableDiffusionProcessingTxt2Img)
|
||||
and getattr(p, 'enable_hr', False)
|
||||
)
|
||||
if high_res_fix:
|
||||
if p.hr_resize_x == 0 and p.hr_resize_y == 0:
|
||||
hr_y = int(p.height * p.hr_scale)
|
||||
hr_x = int(p.width * p.hr_scale)
|
||||
else:
|
||||
hr_y, hr_x = p.hr_resize_y, p.hr_resize_x
|
||||
hr_y = align_dim_latent(hr_y)
|
||||
hr_x = align_dim_latent(hr_x)
|
||||
else:
|
||||
hr_y = h
|
||||
hr_x = w
|
||||
|
||||
return h, w, hr_y, hr_x

    @torch.no_grad()
    def process_unit_after_click_generate(self,
                                          p: StableDiffusionProcessing,
                                          unit: ControlNetUnit,
                                          params: ControlNetCachedParameters,
                                          *args, **kwargs):

        h, w, hr_y, hr_x = self.get_target_dimensions(p)

        has_high_res_fix = (
            isinstance(p, StableDiffusionProcessingTxt2Img)
            and getattr(p, 'enable_hr', False)
        )

        if unit.use_preview_as_input:
            unit.module = 'None'

        preprocessor = global_state.get_preprocessor(unit.module)

        input_list, resize_mode = self.get_input_data(p, unit, preprocessor, h, w)
        preprocessor_outputs = []
        control_masks = []
        preprocessor_output_is_image = False
        preprocessor_output = None

        def optional_tqdm(iterable, use_tqdm):
            from tqdm import tqdm
            return tqdm(iterable) if use_tqdm else iterable

        for input_image, input_mask in optional_tqdm(input_list, len(input_list) > 1):
            if unit.pixel_perfect:
                unit.processor_res = external_code.pixel_perfect_resolution(
                    input_image,
                    target_H=h,
                    target_W=w,
                    resize_mode=resize_mode,
                )

            seed = set_numpy_seed(p)
            logger.debug(f"Use numpy seed {seed}.")
            logger.info(f"Using preprocessor: {unit.module}")
            logger.info(f'preprocessor resolution = {unit.processor_res}')

            preprocessor_output = preprocessor(
                input_image=input_image,
                input_mask=input_mask,
                resolution=unit.processor_res,
                slider_1=unit.threshold_a,
                slider_2=unit.threshold_b,
            )

            preprocessor_outputs.append(preprocessor_output)

            preprocessor_output_is_image = judge_image_type(preprocessor_output)

            if input_mask is not None:
                control_masks.append(input_mask)

            if len(input_list) > 1 and not preprocessor_output_is_image:
                logger.info('Batch-wise input is only supported for ControlNet, Control-LoRA, and T2I adapters!')
                break

        if has_high_res_fix:
            hr_option = HiResFixOption.from_value(unit.hr_option)
        else:
            hr_option = HiResFixOption.BOTH

        alignment_indices = [i % len(preprocessor_outputs) for i in range(p.batch_size)]

        def attach_extra_result_image(img: np.ndarray, is_high_res: bool = False):
            if (
                (is_high_res and hr_option.high_res_enabled) or
                (not is_high_res and hr_option.low_res_enabled)
            ) and unit.save_detected_map:
                p.extra_result_images.append(img)

        if preprocessor_output_is_image:
            params.control_cond = []
            params.control_cond_for_hr_fix = []

            for preprocessor_output in preprocessor_outputs:
                control_cond = crop_and_resize_image(preprocessor_output, resize_mode, h, w)
                attach_extra_result_image(external_code.visualize_inpaint_mask(control_cond))
                params.control_cond.append(numpy_to_pytorch(control_cond).movedim(-1, 1))

            params.control_cond = torch.cat(params.control_cond, dim=0)[alignment_indices].contiguous()

            if has_high_res_fix:
                for preprocessor_output in preprocessor_outputs:
                    control_cond_for_hr_fix = crop_and_resize_image(preprocessor_output, resize_mode, hr_y, hr_x)
                    attach_extra_result_image(external_code.visualize_inpaint_mask(control_cond_for_hr_fix), is_high_res=True)
                    params.control_cond_for_hr_fix.append(numpy_to_pytorch(control_cond_for_hr_fix).movedim(-1, 1))
                params.control_cond_for_hr_fix = torch.cat(params.control_cond_for_hr_fix, dim=0)[alignment_indices].contiguous()
            else:
                params.control_cond_for_hr_fix = params.control_cond
        else:
            params.control_cond = preprocessor_output
            params.control_cond_for_hr_fix = preprocessor_output
            attach_extra_result_image(input_image)

        if len(control_masks) > 0:
            params.control_mask = []
            params.control_mask_for_hr_fix = []

            for input_mask in control_masks:
                fill_border = preprocessor.fill_mask_with_one_when_resize_and_fill
                control_mask = crop_and_resize_image(input_mask, resize_mode, h, w, fill_border)
                attach_extra_result_image(control_mask)
                control_mask = numpy_to_pytorch(control_mask).movedim(-1, 1)[:, :1]
                params.control_mask.append(control_mask)

                if has_high_res_fix:
                    control_mask_for_hr_fix = crop_and_resize_image(input_mask, resize_mode, hr_y, hr_x, fill_border)
                    attach_extra_result_image(control_mask_for_hr_fix, is_high_res=True)
                    control_mask_for_hr_fix = numpy_to_pytorch(control_mask_for_hr_fix).movedim(-1, 1)[:, :1]
                    params.control_mask_for_hr_fix.append(control_mask_for_hr_fix)

            params.control_mask = torch.cat(params.control_mask, dim=0)[alignment_indices].contiguous()
            if has_high_res_fix:
                params.control_mask_for_hr_fix = torch.cat(params.control_mask_for_hr_fix, dim=0)[alignment_indices].contiguous()
            else:
                params.control_mask_for_hr_fix = params.control_mask

        if preprocessor.do_not_need_model:
            model_filename = 'Not Needed'
            params.model = ControlModelPatcher()
        else:
            assert unit.model != 'None', 'You have not selected any control model!'
            model_filename = global_state.get_controlnet_filename(unit.model)
            params.model = cached_controlnet_loader(model_filename)
            assert params.model is not None, f"Recognizing Control Model failed: {model_filename}"

        params.preprocessor = preprocessor

        params.preprocessor.process_after_running_preprocessors(process=p, params=params, **kwargs)
        params.model.process_after_running_preprocessors(process=p, params=params, **kwargs)

        logger.info(f"Current ControlNet {type(params.model).__name__}: {model_filename}")
        return
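The `alignment_indices` idiom above tiles N preprocessor outputs across a batch of size B by indexing modulo N, so a single control image (or a short list of them) repeats to cover every batch sample. A minimal sketch with plain Python lists standing in for the concatenated tensors (`align_to_batch` is a hypothetical helper, not part of the repo):

```python
def align_to_batch(conds, batch_size):
    """Tile `conds` to `batch_size` entries by repeating indices modulo len(conds)."""
    alignment_indices = [i % len(conds) for i in range(batch_size)]
    return [conds[i] for i in alignment_indices]


print(align_to_batch(["cond_a", "cond_b"], 5))
# -> ['cond_a', 'cond_b', 'cond_a', 'cond_b', 'cond_a']
```

In the real code the same index list is applied to a `torch.cat` result, which broadcasts the conds to the batch without copying the Python list.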

    @torch.no_grad()
    def process_unit_before_every_sampling(self,
                                           p: StableDiffusionProcessing,
                                           unit: ControlNetUnit,
                                           params: ControlNetCachedParameters,
                                           *args, **kwargs):

        is_hr_pass = getattr(p, 'is_hr_pass', False)

        has_high_res_fix = (
            isinstance(p, StableDiffusionProcessingTxt2Img)
            and getattr(p, 'enable_hr', False)
        )

        if has_high_res_fix:
            hr_option = HiResFixOption.from_value(unit.hr_option)
        else:
            hr_option = HiResFixOption.BOTH

        if has_high_res_fix and is_hr_pass and (not hr_option.high_res_enabled):
            logger.info("ControlNet skipped high-res pass.")
            return

        if has_high_res_fix and (not is_hr_pass) and (not hr_option.low_res_enabled):
            logger.info("ControlNet skipped low-res pass.")
            return

        if is_hr_pass:
            cond = params.control_cond_for_hr_fix
            mask = params.control_mask_for_hr_fix
        else:
            cond = params.control_cond
            mask = params.control_mask

        kwargs.update(dict(
            unit=unit,
            params=params,
            cond_original=cond.clone() if isinstance(cond, torch.Tensor) else cond,
            mask_original=mask.clone() if isinstance(mask, torch.Tensor) else mask,
        ))

        params.model.strength = float(unit.weight)
        params.model.start_percent = float(unit.guidance_start)
        params.model.end_percent = float(unit.guidance_end)
        params.model.positive_advanced_weighting = None
        params.model.negative_advanced_weighting = None
        params.model.advanced_frame_weighting = None
        params.model.advanced_sigma_weighting = None

        soft_weighting = {
            'input': [0.09941396206337118, 0.12050177219802567, 0.14606275417942507, 0.17704576264172736,
                      0.214600924414215,
                      0.26012233262329093, 0.3152997971191405, 0.3821815722656249, 0.4632503906249999, 0.561515625,
                      0.6806249999999999, 0.825],
            'middle': [0.561515625] if p.sd_model.is_sdxl else [1.0],
            'output': [0.09941396206337118, 0.12050177219802567, 0.14606275417942507, 0.17704576264172736,
                       0.214600924414215,
                       0.26012233262329093, 0.3152997971191405, 0.3821815722656249, 0.4632503906249999, 0.561515625,
                       0.6806249999999999, 0.825]
        }

        zero_weighting = {
            'input': [0.0] * 12,
            'middle': [0.0],
            'output': [0.0] * 12
        }

        if unit.control_mode == external_code.ControlMode.CONTROL.value:
            params.model.positive_advanced_weighting = soft_weighting.copy()
            params.model.negative_advanced_weighting = zero_weighting.copy()

        if unit.control_mode == external_code.ControlMode.PROMPT.value:
            params.model.positive_advanced_weighting = soft_weighting.copy()
            params.model.negative_advanced_weighting = soft_weighting.copy()

        if is_hr_pass and params.preprocessor.use_soft_projection_in_hr_fix:
            params.model.positive_advanced_weighting = soft_weighting.copy()
            params.model.negative_advanced_weighting = soft_weighting.copy()

        cond, mask = params.preprocessor.process_before_every_sampling(p, cond, mask, *args, **kwargs)

        params.model.advanced_mask_weighting = mask

        params.model.process_before_every_sampling(p, cond, mask, *args, **kwargs)

        logger.info(f"ControlNet Method {params.preprocessor.name} patched.")
        return

    @staticmethod
    def bound_check_params(unit: ControlNetUnit) -> None:
        """
        Checks and corrects negative parameters in the given ControlNetUnit.

        The parameters 'processor_res', 'threshold_a' and 'threshold_b' are
        reset to their default values if negative.

        Args:
            unit (ControlNetUnit): The ControlNetUnit instance to check.
        """
        preprocessor = global_state.get_preprocessor(unit.module)

        if unit.processor_res < 0:
            unit.processor_res = int(preprocessor.slider_resolution.gradio_update_kwargs.get('value', 512))

        if unit.threshold_a < 0:
            unit.threshold_a = int(preprocessor.slider_1.gradio_update_kwargs.get('value', 1.0))

        if unit.threshold_b < 0:
            unit.threshold_b = int(preprocessor.slider_2.gradio_update_kwargs.get('value', 1.0))

        return

    @torch.no_grad()
    def process_unit_after_every_sampling(self,
                                          p: StableDiffusionProcessing,
                                          unit: ControlNetUnit,
                                          params: ControlNetCachedParameters,
                                          *args, **kwargs):

        params.preprocessor.process_after_every_sampling(p, params, *args, **kwargs)
        params.model.process_after_every_sampling(p, params, *args, **kwargs)
        return

    @torch.no_grad()
    def process(self, p, *args, **kwargs):
        self.current_params = {}
        enabled_units = self.get_enabled_units(args)
        Infotext.write_infotext(enabled_units, p)
        for i, unit in enumerate(enabled_units):
            self.bound_check_params(unit)
            params = ControlNetCachedParameters()
            self.process_unit_after_click_generate(p, unit, params, *args, **kwargs)
            self.current_params[i] = params
        return

    @torch.no_grad()
    def process_before_every_sampling(self, p, *args, **kwargs):
        for i, unit in enumerate(self.get_enabled_units(args)):
            self.process_unit_before_every_sampling(p, unit, self.current_params[i], *args, **kwargs)
        return

    @torch.no_grad()
    def postprocess_batch_list(self, p, pp, *args, **kwargs):
        for i, unit in enumerate(self.get_enabled_units(args)):
            self.process_unit_after_every_sampling(p, unit, self.current_params[i], pp, *args, **kwargs)
        return

    def postprocess(self, p, processed, *args):
        self.current_params = {}
        return


def on_ui_settings():
    section = ('control_net', "ControlNet")
    shared.opts.add_option("control_net_detectedmap_dir", shared.OptionInfo(
        "detected_maps", "Directory for detected maps auto saving", section=section))
    shared.opts.add_option("control_net_models_path", shared.OptionInfo(
        "", "Extra path to scan for ControlNet models (e.g. training output directory)", section=section))
    shared.opts.add_option("control_net_modules_path", shared.OptionInfo(
        "",
        "Path to directory containing annotator model directories (requires restart, overrides corresponding command line flag)",
        section=section))
    shared.opts.add_option("control_net_unit_count", shared.OptionInfo(
        3, "Multi-ControlNet: ControlNet unit number (requires restart)", gr.Slider,
        {"minimum": 1, "maximum": 10, "step": 1}, section=section))
    shared.opts.add_option("control_net_model_cache_size", shared.OptionInfo(
        5, "Model cache size (requires restart)", gr.Slider, {"minimum": 1, "maximum": 10, "step": 1}, section=section))
    shared.opts.add_option("control_net_no_detectmap", shared.OptionInfo(
        False, "Do not append detectmap to output", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("control_net_detectmap_autosaving", shared.OptionInfo(
        False, "Allow detectmap auto saving", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("control_net_allow_script_control", shared.OptionInfo(
        False, "Allow other scripts to control this extension", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("control_net_sync_field_args", shared.OptionInfo(
        True, "Paste ControlNet parameters in infotext", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("controlnet_show_batch_images_in_ui", shared.OptionInfo(
        False, "Show batch images in gradio gallery output", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("controlnet_increment_seed_during_batch", shared.OptionInfo(
        False, "Increment seed after each controlnet batch iteration", gr.Checkbox, {"interactive": True},
        section=section))
    shared.opts.add_option("controlnet_disable_openpose_edit", shared.OptionInfo(
        False, "Disable openpose edit", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("controlnet_disable_photopea_edit", shared.OptionInfo(
        False, "Disable photopea edit", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("controlnet_photopea_warning", shared.OptionInfo(
        True, "Photopea popup warning", gr.Checkbox, {"interactive": True}, section=section))
    shared.opts.add_option("controlnet_input_thumbnail", shared.OptionInfo(
        True, "Input image thumbnail on unit header", gr.Checkbox, {"interactive": True}, section=section))


script_callbacks.on_ui_settings(on_ui_settings)
script_callbacks.on_infotext_pasted(Infotext.on_infotext_pasted)
script_callbacks.on_after_component(ControlNetUiGroup.on_after_component)
script_callbacks.on_before_reload(ControlNetUiGroup.reset)
script_callbacks.on_app_started(controlnet_api)
449
extensions-builtin/sd_forge_controlnet/scripts/xyz_grid_support.py
Executable file
@@ -0,0 +1,449 @@
import re
import numpy as np

from modules import scripts, shared

try:
    from lib_controlnet.global_state import update_controlnet_filenames, cn_models_names, get_preprocessor_names
    from lib_controlnet.external_code import ResizeMode, ControlMode

except (ImportError, NameError):
    import_error = True
else:
    import_error = False

DEBUG_MODE = False


def debug_info(func):
    def debug_info_(*args, **kwargs):
        if DEBUG_MODE:
            print(f"Debug info: {func.__name__}, {args}")
        return func(*args, **kwargs)
    return debug_info_


def find_dict(dict_list, keyword, search_key="name", stop=False):
    result = next((d for d in dict_list if d[search_key] == keyword), None)
    if result or not stop:
        return result
    else:
        raise ValueError(f"Dictionary with value '{keyword}' in key '{search_key}' not found.")


def flatten(lst):
    result = []
    for element in lst:
        if isinstance(element, list):
            result.extend(flatten(element))
        else:
            result.append(element)
    return result


def is_all_included(target_list, check_list, allow_blank=False, stop=False):
    for element in flatten(target_list):
        if allow_blank and str(element) in ["None", ""]:
            continue
        elif element not in check_list:
            if not stop:
                return False
            else:
                raise ValueError(f"'{element}' is not included in the check list.")
    return True


class ListParser:
    """This class restores a broken list caused by the following process
    in the xyz_grid module:
        -> valslist = [x.strip() for x in chain.from_iterable(
                                              csv.reader(StringIO(vals)))]
    It also performs type conversion,
    adjusts the number of elements in the list, and other operations.

    This class directly modifies the received list.
    """
    numeric_pattern = {
        int: {
            "range": r"\s*([+-]?\s*\d+)\s*-\s*([+-]?\s*\d+)(?:\s*\(([+-]\d+)\s*\))?\s*",
            "count": r"\s*([+-]?\s*\d+)\s*-\s*([+-]?\s*\d+)(?:\s*\[(\d+)\s*\])?\s*"
        },
        float: {
            "range": r"\s*([+-]?\s*\d+(?:\.\d*)?)\s*-\s*([+-]?\s*\d+(?:\.\d*)?)(?:\s*\(([+-]\d+(?:\.\d*)?)\s*\))?\s*",
            "count": r"\s*([+-]?\s*\d+(?:\.\d*)?)\s*-\s*([+-]?\s*\d+(?:\.\d*)?)(?:\s*\[(\d+(?:\.\d*)?)\s*\])?\s*"
        }
    }

    ################################################
    #
    # Initialization methods from here.
    #
    ################################################

    def __init__(self, my_list, converter=None, allow_blank=True, exclude_list=None, run=True):
        self.my_list = my_list
        self.converter = converter
        self.allow_blank = allow_blank
        self.exclude_list = exclude_list
        self.re_bracket_start = None
        self.re_bracket_start_precheck = None
        self.re_bracket_end = None
        self.re_bracket_end_precheck = None
        self.re_range = None
        self.re_count = None
        self.compile_regex()
        if run:
            self.auto_normalize()

    def compile_regex(self):
        exclude_pattern = "|".join(self.exclude_list) if self.exclude_list else None
        if exclude_pattern is None:
            self.re_bracket_start = re.compile(r"^\[")
            self.re_bracket_end = re.compile(r"\]$")
        else:
            self.re_bracket_start = re.compile(fr"^\[(?!(?:{exclude_pattern})\])")
            self.re_bracket_end = re.compile(fr"(?<!\[(?:{exclude_pattern}))\]$")

        if self.converter not in self.numeric_pattern:
            return self
        # If the converter is either int or float.
        self.re_range = re.compile(self.numeric_pattern[self.converter]["range"])
        self.re_count = re.compile(self.numeric_pattern[self.converter]["count"])
        self.re_bracket_start_precheck = None
        self.re_bracket_end_precheck = self.re_count
        return self

    ################################################
    #
    # Public methods from here.
    #
    ################################################

    ################################################
    # This method is executed at the time of initialization.
    #
    def auto_normalize(self):
        if not self.has_list_notation():
            self.numeric_range_parser()
            self.type_convert()
            return self
        else:
            self.fix_structure()
            self.numeric_range_parser()
            self.type_convert()
            self.fill_to_longest()
            return self

    def has_list_notation(self):
        return any(self._search_bracket(s) for s in self.my_list)

    def numeric_range_parser(self, my_list=None, depth=0):
        if self.converter not in self.numeric_pattern:
            return self

        my_list = self.my_list if my_list is None else my_list
        result = []
        is_matched = False
        for s in my_list:
            if isinstance(s, list):
                result.extend(self.numeric_range_parser(s, depth + 1))
                continue

            match = self._numeric_range_to_list(s)
            if s != match:
                is_matched = True
                result.extend(match if not depth else [match])
                continue
            else:
                result.append(s)
                continue

        if depth:
            return self._transpose(result) if is_matched else [result]
        else:
            my_list[:] = result
            return self

    def type_convert(self, my_list=None):
        my_list = self.my_list if my_list is None else my_list
        for i, s in enumerate(my_list):
            if isinstance(s, list):
                self.type_convert(s)
            elif self.allow_blank and (str(s) in ["None", ""]):
                my_list[i] = None
            elif self.converter:
                my_list[i] = self.converter(s)
            else:
                my_list[i] = s
        return self

    def fix_structure(self):
        def is_same_length(list1, list2):
            return len(list1) == len(list2)

        start_indices, end_indices = [], []
        for i, s in enumerate(self.my_list):
            if is_same_length(start_indices, end_indices):
                replace_string = self._search_bracket(s, "[", replace="")
                if s != replace_string:
                    s = replace_string
                    start_indices.append(i)
            if not is_same_length(start_indices, end_indices):
                replace_string = self._search_bracket(s, "]", replace="")
                if s != replace_string:
                    s = replace_string
                    end_indices.append(i + 1)
            self.my_list[i] = s
        if not is_same_length(start_indices, end_indices):
            raise ValueError(f"Lengths of {start_indices} and {end_indices} are different.")
        # Restore the structure of the list.
        for i, j in zip(reversed(start_indices), reversed(end_indices)):
            self.my_list[i:j] = [self.my_list[i:j]]
        return self

    def fill_to_longest(self, my_list=None, value=None, index=None):
        my_list = self.my_list if my_list is None else my_list
        if not self.sublist_exists(my_list):
            return self
        max_length = max(len(sub_list) for sub_list in my_list if isinstance(sub_list, list))
        for i, sub_list in enumerate(my_list):
            if isinstance(sub_list, list):
                fill_value = value if index is None else sub_list[index]
                my_list[i] = sub_list + [fill_value] * (max_length - len(sub_list))
        return self

    def sublist_exists(self, my_list=None):
        my_list = self.my_list if my_list is None else my_list
        return any(isinstance(item, list) for item in my_list)

    def all_sublists(self, my_list=None):  # Unused method
        my_list = self.my_list if my_list is None else my_list
        return all(isinstance(item, list) for item in my_list)

    def get_list(self):  # Unused method
        return self.my_list

    ################################################
    #
    # Private methods from here.
    #
    ################################################

    def _search_bracket(self, string, bracket="[", replace=None):
        if bracket == "[":
            pattern = self.re_bracket_start
            precheck = self.re_bracket_start_precheck  # None
        elif bracket == "]":
            pattern = self.re_bracket_end
            precheck = self.re_bracket_end_precheck
        else:
            raise ValueError(f"Invalid argument provided. (bracket: {bracket})")

        if precheck and precheck.fullmatch(string):
            return None if replace is None else string
        elif replace is None:
            return pattern.search(string)
        else:
            return pattern.sub(replace, string)

    def _numeric_range_to_list(self, string):
        match = self.re_range.fullmatch(string)
        if match is not None:
            if self.converter == int:
                start = int(match.group(1))
                end = int(match.group(2)) + 1
                step = int(match.group(3)) if match.group(3) is not None else 1
                return list(range(start, end, step))
            else:  # float
                start = float(match.group(1))
                end = float(match.group(2))
                step = float(match.group(3)) if match.group(3) is not None else 1
                return np.arange(start, end + step, step).tolist()

        match = self.re_count.fullmatch(string)
        if match is not None:
            if self.converter == int:
                start = int(match.group(1))
                end = int(match.group(2))
                num = int(match.group(3)) if match.group(3) is not None else 1
                return [int(x) for x in np.linspace(start=start, stop=end, num=num).tolist()]
            else:  # float
                start = float(match.group(1))
                end = float(match.group(2))
                num = int(match.group(3)) if match.group(3) is not None else 1
                return np.linspace(start=start, stop=end, num=num).tolist()
        return string

    def _transpose(self, my_list=None):
        my_list = self.my_list if my_list is None else my_list
        my_list = [item if isinstance(item, list) else [item] for item in my_list]
        self.fill_to_longest(my_list, index=-1)
        return np.array(my_list, dtype=object).T.tolist()

    ################################################
    #
    # The methods of the ListParser class end here.
    #
    ################################################


################################################################
################################################################
#
# Starting the main process of this module.
#
# Functions are executed in this order:
#     find_module
#     add_axis_options
#         identity
#         enable_script_control
#         apply_field
#         confirm
#         bool_
#         choices_for
#         make_excluded_list
#         config lists for AxisOptions:
#             validation_data
#             extra_axis_options
################################################################
################################################################


def find_module(module_names):
    if isinstance(module_names, str):
        module_names = [s.strip() for s in module_names.split(",")]
    for data in scripts.scripts_data:
        if data.script_class.__module__ in module_names and hasattr(data, "module"):
            return data.module
    return None


def add_axis_options(xyz_grid):

    ################################################
    #
    # Define functions to pass to the AxisOption class from here.
    #
    ################################################

    ################################################
    # Set this function as the type attribute of the AxisOption class
    # to skip the following processing in the xyz_grid module:
    #     -> valslist = [opt.type(x) for x in valslist]
    # Type conversion is performed by the function
    # set on the confirm attribute instead.
    #
    def identity(x):
        return x

    def enable_script_control():
        shared.opts.data["control_net_allow_script_control"] = True

    def apply_field(field):
        @debug_info
        def apply_field_(p, x, xs):
            enable_script_control()
            setattr(p, field, x)

        return apply_field_

    ################################################
    # The confirm function defined in this module
    # enables list notation and performs type conversion.
    #
    # Example:
    #     any = [any, any, any, ...]
    #     [any] = [any, None, None, ...]
    #     [None, None, any] = [None, None, any]
    #     [,,any] = [None, None, any]
    #     any, [,any,] = [any, any, any, ...], [None, any, None]
    #
    # Enabled Only:
    #     any = [any] = [any, None, None, ...]
    #     (any and [any] are considered equivalent)
    #
    def confirm(func_or_str):
        @debug_info
        def confirm_(p, xs):
            if callable(func_or_str):  # func_or_str is a converter
                ListParser(xs, func_or_str, allow_blank=True)
                return

            elif isinstance(func_or_str, str):  # func_or_str is a keyword
                valid_data = find_dict(validation_data, func_or_str, stop=True)
                converter = valid_data["type"]
                exclude_list = valid_data["exclude"]() if valid_data["exclude"] else None
                check_list = valid_data["check"]()

                ListParser(xs, converter, allow_blank=True, exclude_list=exclude_list)
                is_all_included(xs, check_list, allow_blank=True, stop=True)
                return

            else:
                raise TypeError(f"Argument must be callable or str, not {type(func_or_str).__name__}.")

        return confirm_

    def bool_(string):
        string = str(string)
        if string in ["None", ""]:
            return None
        elif string.lower() in ["true", "1"]:
            return True
        elif string.lower() in ["false", "0"]:
            return False
        else:
            raise ValueError(f"Could not convert string to boolean: {string}")

    def choices_bool():
        return ["False", "True"]

    def choices_model():
        update_controlnet_filenames()
        return list(cn_models_names.values())

    def choices_control_mode():
        return [e.value for e in ControlMode]

    def choices_resize_mode():
        return [e.value for e in ResizeMode]

    def choices_preprocessor():
        return list(get_preprocessor_names())

    def make_excluded_list():
        pattern = re.compile(r"\[(\w+)\]")
        return [match.group(1) for s in choices_model()
                for match in pattern.finditer(s)]

    validation_data = [
        {"name": "model", "type": str, "check": choices_model, "exclude": make_excluded_list},
        {"name": "control_mode", "type": str, "check": choices_control_mode, "exclude": None},
        {"name": "resize_mode", "type": str, "check": choices_resize_mode, "exclude": None},
        {"name": "preprocessor", "type": str, "check": choices_preprocessor, "exclude": None},
    ]

    extra_axis_options = [
        xyz_grid.AxisOption("[ControlNet] Enabled", identity, apply_field("control_net_enabled"), confirm=confirm(bool_), choices=choices_bool),
        xyz_grid.AxisOption("[ControlNet] Model", identity, apply_field("control_net_model"), confirm=confirm("model"), choices=choices_model, cost=0.9),
        xyz_grid.AxisOption("[ControlNet] Weight", identity, apply_field("control_net_weight"), confirm=confirm(float)),
        xyz_grid.AxisOption("[ControlNet] Guidance Start", identity, apply_field("control_net_guidance_start"), confirm=confirm(float)),
        xyz_grid.AxisOption("[ControlNet] Guidance End", identity, apply_field("control_net_guidance_end"), confirm=confirm(float)),
        xyz_grid.AxisOption("[ControlNet] Control Mode", identity, apply_field("control_net_control_mode"), confirm=confirm("control_mode"), choices=choices_control_mode),
        xyz_grid.AxisOption("[ControlNet] Resize Mode", identity, apply_field("control_net_resize_mode"), confirm=confirm("resize_mode"), choices=choices_resize_mode),
        xyz_grid.AxisOption("[ControlNet] Preprocessor", identity, apply_field("control_net_module"), confirm=confirm("preprocessor"), choices=choices_preprocessor),
        xyz_grid.AxisOption("[ControlNet] Pre Resolution", identity, apply_field("control_net_pres"), confirm=confirm(int)),
        xyz_grid.AxisOption("[ControlNet] Pre Threshold A", identity, apply_field("control_net_pthr_a"), confirm=confirm(float)),
        xyz_grid.AxisOption("[ControlNet] Pre Threshold B", identity, apply_field("control_net_pthr_b"), confirm=confirm(float)),
    ]

    xyz_grid.axis_options.extend(extra_axis_options)


def run():
    xyz_grid = find_module("xyz_grid.py, xy_grid.py")
    if xyz_grid:
        add_axis_options(xyz_grid)


if not import_error:
    run()
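The two numeric notations that `ListParser._numeric_range_to_list` accepts, "start-end (+step)" and "start-end [count]", can be demonstrated with a standalone re-implementation of the int case. The regexes below are copied from `numeric_pattern`; `parse_int_notation` is a hypothetical helper written only for this sketch:

```python
import re
import numpy as np

# Int patterns from ListParser.numeric_pattern:
#   "1-10 (+3)" -> stepped range, "1-10 [4]" -> evenly spaced count.
RANGE = re.compile(r"\s*([+-]?\s*\d+)\s*-\s*([+-]?\s*\d+)(?:\s*\(([+-]\d+)\s*\))?\s*")
COUNT = re.compile(r"\s*([+-]?\s*\d+)\s*-\s*([+-]?\s*\d+)(?:\s*\[(\d+)\s*\])?\s*")


def parse_int_notation(s):
    m = RANGE.fullmatch(s)
    if m:
        start, end = int(m.group(1)), int(m.group(2)) + 1  # end is inclusive
        step = int(m.group(3)) if m.group(3) else 1
        return list(range(start, end, step))
    m = COUNT.fullmatch(s)
    if m:
        start, end = int(m.group(1)), int(m.group(2))
        num = int(m.group(3)) if m.group(3) else 1
        return [int(x) for x in np.linspace(start, end, num)]
    return s  # not range notation; returned unchanged


print(parse_int_notation("1-10 (+3)"))
# -> [1, 4, 7, 10]
print(parse_int_notation("1-10 [4]"))
# -> [1, 4, 7, 10]
```

The range pattern is tried first, so a bare "1-10" expands as an inclusive stepped range rather than a single-point linspace.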
251
extensions-builtin/sd_forge_controlnet/style.css
Executable file
@@ -0,0 +1,251 @@
/* InputAccordion alignment */
/* Flex container */
.controlnet .svelte-vt1mxs {
    display: flex;
    flex-wrap: wrap;
    flex-direction: row;
    gap: 10px; /* Adjusts the space between items */
}

.controlnet .input-accordion {
    flex: 1 1 calc(50% - 10px); /* Adjusts for the gap, default 2 columns */
    display: flex;
    align-items: center;
}

/* Media query for screens smaller than a specific width */
@media (max-width: 600px) {
    .controlnet .input-accordion {
        flex: 1 1 100%; /* Changes to 1 column when window width is ≤ 600px */
    }
}

/* Input image thumbnail */
.cnet-thumbnail {
    height: 3rem !important;
    border: 1px solid var(--button-secondary-border-color);
}

.controlnet .input-accordion .label-wrap>span:nth-child(1) {
    display: flex;
    flex-direction: row;
    align-items: center;
    gap: 5px;
}

.controlnet .input-accordion .icon {
    height: 1rem;
    width: 1rem;
}

.controlnet .input-accordion .label-wrap {
    align-items: center;
}

.cnet-modal {
    display: none; /* Hidden by default */
    position: fixed; /* Stay in place */
    z-index: 2147483647; /* Sit on top */
    left: 0;
    top: 0;
    width: 100%; /* Full width */
    height: 100%; /* Full height */
    overflow: auto; /* Enable scroll if needed */
    background-color: rgba(0, 0, 0, 0.4); /* Black with opacity */
    max-width: none !important; /* Fix sizing with SD.Next (vladmandic/automatic#2594) */
}

.cnet-modal-content {
    position: relative;
    background-color: var(--background-fill-primary);
    margin: 5vh auto; /* 15% from the top and centered */
    padding: 20px;
    border: 1px solid #888;
    width: 95%;
    height: 90vh; /* Could be more or less, depending on screen size */
    box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);
    animation-name: animatetop;
    animation-duration: 0.4s;
    max-width: none !important; /* Fix sizing with SD.Next (vladmandic/automatic#2594) */
}

.cnet-modal-content iframe {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    border: none;
}

.cnet-modal-content.alert {
    padding: var(--size-5);
}

.cnet-modal-content.alert ul {
    list-style-type: none;
}

.cnet-modal-close {
    color: white !important;
    right: 0.25em;
    top: 0;
    cursor: pointer;
    position: absolute;
    font-size: 56px;
    font-weight: bold;
}

@keyframes animatetop {
    from {
        top: -300px;
        opacity: 0;
    }

    to {
        top: 0;
        opacity: 1;
    }
}

.cnet-generated-image-control-group,
.cnet-upload-pose {
    display: flex;
    flex-direction: column;
    align-items: flex-end;

    position: absolute;
    right: var(--size-2);
    bottom: var(--size-2);
}

/* Gradio button style */
.cnet-download-pose a,
.cnet-close-preview,
.cnet-edit-pose,
.cnet-upload-pose,
.cnet-photopea-child-trigger {
    font-size: x-small !important;
    font-weight: bold !important;
    padding: 2px !important;
    box-shadow: var(--shadow-drop);
    border: 1px solid var(--button-secondary-border-color);
    border-radius: var(--radius-sm);
    background: var(--background-fill-primary);
    height: var(--size-5);
    color: var(--block-label-text-color) !important;
    display: flex;
    justify-content: center;
    cursor: pointer;
}

.cnet-download-pose:hover a,
.cnet-close-preview:hover a,
.cnet-edit-pose:hover,
.cnet-upload-pose:hover,
.cnet-photopea-child-trigger:hover {
    color: var(--block-label-text-color) !important;
}

.cnet-unit-active {
    color: green !important;
    font-weight: bold !important;
}

.dark .cnet-unit-active {
    color: greenyellow !important;
}

.cnet-badge {
    display: inline-block;
    padding: 0.25em 0.75em;
    font-size: 0.75em;
    font-weight: bold;
    color: white;
    border-radius: 0.5em;
    text-align: center;
    vertical-align: middle;
    margin-left: var(--size-2);
}

.cnet-badge.primary {
    background-color: green;
}

.cnet-a1111-badge {
    position: absolute;
    bottom: 0px;
    right: 0px;
}

.cnet-disabled-radio {
    opacity: 50%;
}

.controlnet_row {
    margin-top: 10px !important;
}

/* JSON pose upload button styling */
.cnet-upload-pose input[type=file] {
    position: absolute;
    left: 0;
    top: 0;
    opacity: 0;
    width: 100%;
    height: 100%;
}

/* Photopea integration styles */
.photopea-button-group {
    position: absolute;
    top: -30px; /* 20px modal padding + 10px margin */
}

.photopea-button {
    font-size: 3rem;
    font-weight: bold;
    padding: 2px !important;
    margin: 2px !important;
    box-shadow: var(--shadow-drop);
    border: 1px solid var(--button-secondary-border-color);
    border-radius: var(--radius-sm);
    background: var(--background-fill-primary);
    color: var(--block-label-text-color);
}

.controlnet_control_type_filter_group label {
    background: unset !important;
    border: unset !important;
    margin-left: -10px !important;
}

.controlnet_control_type_filter_group > span {
    display: none !important;
}

.controlnet_control_type_filter_group > .wrap {
    margin-top: -20px !important;
}

.cnet-toolbutton {
    background: unset !important;
    border: unset !important;
}

.range-slider {
    margin-top: -8px;
}