Compare commits


8 Commits
2.7.4 ... 2.8.0

Author SHA1 Message Date
DominikDoom
638c073f37 Merge branch 'feature-native-lora-config' into main 2023-07-26 15:05:37 +02:00
DominikDoom
d11b53083b Update README.md 2023-07-26 15:04:59 +02:00
DominikDoom
571072eea4 Update README.md 2023-07-26 15:03:52 +02:00
DominikDoom
acfdbf1ed4 Fix for loras in base folder 2023-07-26 14:53:03 +02:00
DominikDoom
2e271aea5c Support for new webui 1.5.0 lora features
Prefers trigger words over the model-keyword ones
Uses custom per-lora multiplier if set
2023-07-26 14:38:51 +02:00
DominikDoom
b28497764f Check keywords for .pt and .ckpt loras too
Especially for custom keywords, the preset list mostly uses safetensors
2023-07-23 11:27:02 +02:00
DominikDoom
0d9d5f1e44 Safety check & remove log 2023-07-23 11:08:29 +02:00
DominikDoom
de3380818e Quote lora filenames to handle commas in filenames
Fixes #206
2023-07-23 11:05:44 +02:00
9 changed files with 182 additions and 67 deletions

View File

@@ -124,23 +124,35 @@ Completion for these types is triggered by typing `<`. By default it will show t
- `<h:` or `<hypernet:` will only show Hypernetworks
### Lora / Lyco trigger word completion
This is an advanced feature that will try to add known trigger words on autocompleting a Lora/Lyco.
This feature will try to add known trigger words on autocompleting a Lora/Lyco.
It uses the list provided by the [model-keyword](https://github.com/mix1009/model-keyword/) extension, which thus needs to be installed to use this feature. The list is also regularly updated through it.
It primarily uses the list provided by the [model-keyword](https://github.com/mix1009/model-keyword/) extension, which thus needs to be installed to use this feature. The list is also regularly updated through it.
However, once installed, you can deactivate it if you want, since tag autocomplete only needs the local keyword lists it ships with, not the extension itself.
The used files are `lora-keywords.txt` and `lora-keywords-user.txt` in the model-keyword installation folder.
The used files are `lora-keyword.txt` and `lora-keyword-user.txt` in the model-keyword installation folder.
If the main file isn't found, the feature will simply deactivate itself, everything else should work normally.
To add custom mappings for unknown Loras, you can use the UI provided by model-keyword, it will automatically write it to the `lora-keywords-user.txt` for you (and create it if it doesn't exist).
The only issue is that it has no official support for the Lycoris extension and doesn't scan its folder for files, so to add them through the UI you will have to temporarily move them into the Lora model folder to be able to select them in model-keywords dropdown.
Some are already included in the default list though, so trying it out first is advisable.
<details>
<summary>Walkthrough to add custom keywords</summary>
#### Note:
As of [v1.5.0](https://github.com/AUTOMATIC1111/stable-diffusion-webui/commit/a3ddf464a2ed24c999f67ddfef7969f8291567be), the webui provides a native method to add activation keywords for Lora through the Extra networks config UI.
These trigger words will always be preferred over the model-keyword ones and can be used without needing to install the model-keyword extension. This will however, obviously, be limited to those manually added keywords. For automatic discovery of keywords, you will still need the big list provided by model-keyword.
![image](https://github.com/DominikDoom/a1111-sd-webui-tagcomplete/assets/34448969/4302c44e-c632-473d-a14a-76f164f966cb)
</details>
After having added your custom keywords, you will need to either restart the UI or use the "Refresh TAC temp files" setting button.
Custom trigger words can be added through two methods:
1. Using the extra networks UI (recommended):
- Only works with webui version v1.5.0 upwards, but much easier to use and works without the model-keyword extension
- This method requires no manual refresh
- <details>
<summary>Image example</summary>
![edit button](https://github.com/DominikDoom/a1111-sd-webui-tagcomplete/assets/34448969/22e95040-1d85-4b7e-a005-1918fafec807)
![lora_edit](https://github.com/DominikDoom/a1111-sd-webui-tagcomplete/assets/34448969/3e6c5245-d3bc-498d-8cd2-26eadf8882e7)
</details>
2. Through the model-keyword UI:
- One issue with this method is that it has no official support for the Lycoris extension and doesn't scan its folder for files, so to add them through the UI you will have to temporarily move them into the Lora model folder to be able to select them in the model-keyword dropdown. Some are already included in the default list though, so trying it out first is advisable.
- After having added your custom keywords, you will need to either restart the UI or use the "Refresh TAC temp files" setting button.
- <details>
<summary>Image example</summary>
![image](https://github.com/DominikDoom/a1111-sd-webui-tagcomplete/assets/34448969/4302c44e-c632-473d-a14a-76f164f966cb)
</details>
Sometimes the inserted keywords can be wrong due to a hash collision, however model-keyword and tag autocomplete take the name of the file into account too if the collision is known.
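For context, the native method described above stores its data in a sidecar JSON file next to the model file, which is also what the new `tacapi/v1/lora-info` endpoint further down serves. A hypothetical `MyLora.json` (file name and values invented, field names taken from the JavaScript changes below) could look roughly like this:

    {
        "activation text": "mylora trigger, ornate armor",
        "preferred weight": 0.8
    }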

View File

@@ -61,6 +61,26 @@ async function loadCSV(path) {
return parseCSV(text);
}
// Fetch API
async function fetchAPI(url, json = true, cache = false) {
if (!cache) {
const appendChar = url.includes("?") ? "&" : "?";
url += `${appendChar}${new Date().getTime()}`
}
let response = await fetch(url);
if (response.status != 200) {
console.error(`Error fetching API endpoint "${url}": ` + response.status, response.statusText);
return null;
}
if (json)
return await response.json();
else
return await response.text();
}
// Debounce function to prevent spamming the autocomplete function
var dbTimeOut;
const debounce = (func, wait = 300) => {
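A minimal usage sketch for the new helper (the lora name is made up): since `cache` defaults to `false`, the current timestamp is appended as a throwaway query parameter so the browser never answers from its HTTP cache, and the caller gets either parsed JSON or `null`:

    // actually requests e.g. "tacapi/v1/lora-info/MyLora.safetensors?1690380000000"
    let info = await fetchAPI("tacapi/v1/lora-info/MyLora.safetensors");
    if (info === null) { /* non-200 response, fall back to defaults */ }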

View File

@@ -29,18 +29,28 @@ class LoraParser extends BaseTagParser {
async function load() {
if (loras.length === 0) {
try {
loras = (await readFile(`${tagBasePath}/temp/lora.txt`)).split("\n")
.filter(x => x.trim().length > 0) // Remove empty lines
.map(x => x.trim().split(",")); // Remove carriage returns and padding if it exists, split into name, hash pairs
loras = (await loadCSV(`${tagBasePath}/temp/lora.txt`))
.filter(x => x[0]?.trim().length > 0) // Remove empty lines
.map(x => [x[0]?.trim(), x[1]]); // Trim filenames and return the name, hash pairs
} catch (e) {
console.error("Error loading lora.txt: " + e);
}
}
}
function sanitize(tagType, text) {
async function sanitize(tagType, text) {
if (tagType === ResultType.lora) {
return `<lora:${text}:${TAC_CFG.extraNetworksDefaultMultiplier}>`;
let multiplier = TAC_CFG.extraNetworksDefaultMultiplier;
let info = await fetchAPI(`tacapi/v1/lora-info/${text}`)
if (info && info["preferred weight"]) {
multiplier = info["preferred weight"];
}
const lastDot = text.lastIndexOf(".");
const lastSlash = text.lastIndexOf("/");
const name = text.substring(lastSlash + 1, lastDot);
return `<lora:${name}:${multiplier}>`;
}
return null;
}
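To illustrate the new sanitize behavior (name and weight are hypothetical): lora names now carry their subfolder and file extension, so both are stripped before the tag is built, and a per-lora "preferred weight" from the info endpoint overrides the global default multiplier. The Lyco parser below receives the identical treatment.

    // text = "styles/MyLora.safetensors", info["preferred weight"] = 0.8
    // name = text.substring(lastSlash + 1, lastDot) = "MyLora"
    // inserted tag: <lora:MyLora:0.8> (default multiplier if no preferred weight is set)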

View File

@@ -29,18 +29,28 @@ class LycoParser extends BaseTagParser {
async function load() {
if (lycos.length === 0) {
try {
lycos = (await readFile(`${tagBasePath}/temp/lyco.txt`)).split("\n")
.filter(x => x.trim().length > 0) // Remove empty lines
.map(x => x.trim().split(",")); // Remove carriage returns and padding if it exists, split into name, hash pairs
lycos = (await loadCSV(`${tagBasePath}/temp/lyco.txt`))
.filter(x => x[0]?.trim().length > 0) // Remove empty lines
.map(x => [x[0]?.trim(), x[1]]); // Trim filenames and return the name, hash pairs
} catch (e) {
console.error("Error loading lyco.txt: " + e);
}
}
}
function sanitize(tagType, text) {
async function sanitize(tagType, text) {
if (tagType === ResultType.lyco) {
return `<lyco:${text}:${TAC_CFG.extraNetworksDefaultMultiplier}>`;
let multiplier = TAC_CFG.extraNetworksDefaultMultiplier;
let info = await fetchAPI(`tacapi/v1/lyco-info/${text}`)
if (info && info["preferred weight"]) {
multiplier = info["preferred weight"];
}
const lastDot = text.lastIndexOf(".");
const lastSlash = text.lastIndexOf("/");
const name = text.substring(lastSlash + 1, lastDot);
return `<lyco:${name}:${multiplier}>`;
}
return null;
}

View File

@@ -5,21 +5,20 @@ async function load() {
if (modelKeywordPath.length > 0 && modelKeywordDict.size === 0) {
try {
let lines = [];
let csv_lines = [];
// Only add default keywords if wanted by the user
if (TAC_CFG.modelKeywordCompletion !== "Only user list")
lines = (await readFile(`${modelKeywordPath}/lora-keyword.txt`)).split("\n");
csv_lines = (await loadCSV(`${modelKeywordPath}/lora-keyword.txt`));
// Add custom user keywords if the file exists
if (customFileExists)
lines = lines.concat((await readFile(`${modelKeywordPath}/lora-keyword-user.txt`)).split("\n"));
csv_lines = csv_lines.concat((await loadCSV(`${modelKeywordPath}/lora-keyword-user.txt`)));
if (lines.length === 0) return;
if (csv_lines.length === 0) return;
csv_lines = csv_lines.filter(x => x[0].trim().length > 0 && x[0].trim()[0] !== "#") // Remove empty lines and comments
lines = lines.filter(x => x.trim().length > 0 && x.trim()[0] !== "#") // Remove empty lines and comments
// Add to the dict
lines.forEach(line => {
const parts = line.split(",");
csv_lines.forEach(parts => {
const hash = parts[0];
const keywords = parts[1].replaceAll("| ", ", ").replaceAll("|", ", ").trim();
const lastSepIndex = parts[2]?.lastIndexOf("/") + 1 || parts[2]?.lastIndexOf("\\") + 1 || 0;
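For reference, a hypothetical `lora-keyword-user.txt` entry as the rewritten loader above would consume it (hash, keywords and file name are invented):

    a12bc3d4, my trigger word|second trigger, subfolder/MyLora.safetensors

parts[0] is used as the hash key, the pipes in parts[1] become commas ("my trigger word, second trigger"), and a last-path-separator index is computed for parts[2] so the lookup can match on the file name itself even for loras sorted into subfolders. Because the file is now read through loadCSV, names containing commas can be quoted.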

View File

@@ -445,30 +445,45 @@ async function insertTextAtCursor(textArea, result, tagword, tabCompletedWithout
// Add lora/lyco keywords if enabled and found
let keywordsLength = 0;
if (TAC_CFG.modelKeywordCompletion !== "Never" && modelKeywordPath.length > 0 && (tagType === ResultType.lora || tagType === ResultType.lyco)) {
if (result.hash && result.hash !== "NOFILE" && result.hash.length > 0) {
let keywords = null;
if (TAC_CFG.modelKeywordCompletion !== "Never" && (tagType === ResultType.lora || tagType === ResultType.lyco)) {
let keywords = null;
// Check built-in activation words first
if (tagType === ResultType.lora || tagType === ResultType.lyco) {
let info = await fetchAPI(`tacapi/v1/lora-info/${result.text}`)
if (info && info["activation text"]) {
keywords = info["activation text"];
}
}
if (!keywords && modelKeywordPath.length > 0 && result.hash && result.hash !== "NOFILE" && result.hash.length > 0) {
let nameDict = modelKeywordDict.get(result.hash);
let name = result.text + ".safetensors";
let names = [result.text + ".safetensors", result.text + ".pt", result.text + ".ckpt"];
if (nameDict) {
if (nameDict.has(name))
keywords = nameDict.get(name);
else
let found = false;
names.forEach(name => {
if (!found && nameDict.has(name)) {
found = true;
keywords = nameDict.get(name);
}
});
if (!found)
keywords = nameDict.get("none");
}
}
if (keywords && keywords.length > 0) {
textBeforeKeywordInsertion = newPrompt;
newPrompt = `${keywords}, ${newPrompt}`; // Insert keywords
textAfterKeywordInsertion = newPrompt;
keywordInsertionUndone = false;
setTimeout(() => lastEditWasKeywordInsertion = true, 200)
keywordsLength = keywords.length + 2; // +2 for the comma and space
}
if (keywords && keywords.length > 0) {
textBeforeKeywordInsertion = newPrompt;
newPrompt = `${keywords}, ${newPrompt}`; // Insert keywords
textAfterKeywordInsertion = newPrompt;
keywordInsertionUndone = false;
setTimeout(() => lastEditWasKeywordInsertion = true, 200)
keywordsLength = keywords.length + 2; // +2 for the comma and space
}
}
@@ -566,6 +581,11 @@ function addResultsToList(textArea, results, tagword, resetList) {
if (!TAC_CFG.alias.onlyShowAlias && result.text !== bestAlias)
displayText += " ➝ " + result.text;
} else if (result.type === ResultType.lora || result.type === ResultType.lyco) {
let lastDot = result.text.lastIndexOf(".");
let lastSlash = result.text.lastIndexOf("/");
let name = result.text.substring(lastSlash + 1, lastDot);
displayText = escapeHTML(name);
} else { // No alias
displayText = escapeHTML(result.text);
}
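Sketching the new precedence with a made-up lora: the webui-native metadata is asked first through the lora-info/lyco-info endpoint, and its "activation text" wins if present; only when that comes back empty is the model-keyword hash dictionary consulted, which now also tries the `.pt` and `.ckpt` file name variants before falling back to the per-hash "none" entry.

    // e.g. keywords found = "mylora trigger"
    // newPrompt becomes "mylora trigger, " + the freshly completed prompt text
    // keywordsLength = keywords.length + 2 accounts for the added comma and space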

View File

@@ -1,5 +1,6 @@
# This file provides support for the model-keyword extension to add known lora keywords on completion
import csv
import hashlib
from pathlib import Path
@@ -16,8 +17,11 @@ hash_dict = {}
def load_hash_cache():
with open(known_hashes_file, "r", encoding="utf-8") as file:
for line in file:
name, hash, mtime = line.replace("\n", "").split(",")
reader = csv.reader(
file.readlines(), delimiter=",", quotechar='"', skipinitialspace=True
)
for line in reader:
name, hash, mtime = line
hash_dict[name] = (hash, mtime)
@@ -26,7 +30,7 @@ def update_hash_cache():
if file_needs_update:
with open(known_hashes_file, "w", encoding="utf-8") as file:
for name, (hash, mtime) in hash_dict.items():
file.write(f"{name},{hash},{mtime}\n")
file.write(f'"{name}",{hash},{mtime}\n')
# Copy of the fast inaccurate hash function from the extension
@@ -71,6 +75,6 @@ def write_model_keyword_path():
return True
else:
print(
"Tag Autocomplete: Could not locate model-keyword extension, LORA/LYCO trigger word completion will be unavailable."
"Tag Autocomplete: Could not locate model-keyword extension, Lora trigger word completion will be limited to those added through the extra networks menu."
)
return False
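A short illustration of the hash-cache quoting fix (path and values invented): a lora whose relative name contains a comma used to split into more than three fields and break the `name, hash, mtime` unpacking; written with quotes it now survives the round trip:

    "subfolder/my, fancy lora.safetensors",a1b2c3d4,1690370000.0

csv.reader with quotechar='"' and skipinitialspace=True hands this back as exactly three fields again.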

View File

@@ -13,13 +13,13 @@ except ImportError:
# Webui root path
FILE_DIR = Path().absolute()
# The extension base path
EXT_PATH = FILE_DIR.joinpath('extensions')
EXT_PATH = FILE_DIR.joinpath("extensions")
# Tags base path
TAGS_PATH = Path(scripts.basedir()).joinpath('tags')
TAGS_PATH = Path(scripts.basedir()).joinpath("tags")
# The path to the folder containing the wildcards and embeddings
WILDCARD_PATH = FILE_DIR.joinpath('scripts/wildcards')
WILDCARD_PATH = FILE_DIR.joinpath("scripts/wildcards")
EMB_PATH = Path(shared.cmd_opts.embeddings_dir)
HYP_PATH = Path(shared.cmd_opts.hypernetwork_dir)
@@ -27,15 +27,16 @@ try:
LORA_PATH = Path(shared.cmd_opts.lora_dir)
except AttributeError:
LORA_PATH = None
try:
LYCO_PATH = Path(shared.cmd_opts.lyco_dir)
except AttributeError:
LYCO_PATH = None
def find_ext_wildcard_paths():
"""Returns the path to the extension wildcards folder"""
found = list(EXT_PATH.glob('*/wildcards/'))
found = list(EXT_PATH.glob("*/wildcards/"))
return found
@@ -43,11 +44,12 @@ def find_ext_wildcard_paths():
WILDCARD_EXT_PATHS = find_ext_wildcard_paths()
# The path to the temporary files
STATIC_TEMP_PATH = FILE_DIR.joinpath('tmp') # In the webui root, on windows it exists by default, on linux it doesn't
TEMP_PATH = TAGS_PATH.joinpath('temp') # Extension specific temp files
# In the webui root, on windows it exists by default, on linux it doesn't
STATIC_TEMP_PATH = FILE_DIR.joinpath("tmp")
TEMP_PATH = TAGS_PATH.joinpath("temp") # Extension specific temp files
# Make sure these folders exist
if not TEMP_PATH.exists():
TEMP_PATH.mkdir()
if not STATIC_TEMP_PATH.exists():
STATIC_TEMP_PATH.mkdir()
STATIC_TEMP_PATH.mkdir()

View File

@@ -2,10 +2,13 @@
# to a temporary file to expose it to the javascript side
import glob
import json
from pathlib import Path
import gradio as gr
import yaml
from fastapi import FastAPI
from fastapi.responses import FileResponse
from modules import script_callbacks, sd_hijack, shared
from scripts.model_keyword_support import (get_lora_simple_hash,
@@ -142,16 +145,15 @@ def get_lora():
valid_loras = [lf for lf in lora_paths if lf.suffix in {".safetensors", ".ckpt", ".pt"}]
hashes = {}
for l in valid_loras:
name = l.name[:l.name.rfind('.')]
name = l.relative_to(LORA_PATH).as_posix()
if model_keyword_installed:
hashes[name] = get_lora_simple_hash(l)
else:
hashes[name] = ""
# Sort
sorted_loras = dict(sorted(hashes.items()))
# Add hashes and return
return [f"{name},{hash}" for name, hash in sorted_loras.items()]
return [f"\"{name}\",{hash}" for name, hash in sorted_loras.items()]
def get_lyco():
@@ -164,13 +166,16 @@ def get_lyco():
valid_lycos = [lyf for lyf in lyco_paths if lyf.suffix in {".safetensors", ".ckpt", ".pt"}]
hashes = {}
for ly in valid_lycos:
name = ly.name[:ly.name.rfind('.')]
hashes[name] = get_lora_simple_hash(ly)
name = ly.relative_to(LYCO_PATH).as_posix()
if model_keyword_installed:
hashes[name] = get_lora_simple_hash(ly)
else:
hashes[name] = ""
# Sort
sorted_lycos = dict(sorted(hashes.items()))
# Add hashes and return
return [f"{name},{hash}" for name, hash in sorted_lycos.items()]
return [f"\"{name}\",{hash}" for name, hash in sorted_lycos.items()]
def write_tag_base_path():
@@ -328,7 +333,7 @@ def on_ui_settings():
"tac_appendComma": shared.OptionInfo(True, "Append comma on tag autocompletion"),
"tac_appendSpace": shared.OptionInfo(True, "Append space on tag autocompletion").info("will append after comma if the above is enabled"),
"tac_alwaysSpaceAtEnd": shared.OptionInfo(True, "Always append space if inserting at the end of the textbox").info("takes precedence over the regular space setting for that position"),
"tac_modelKeywordCompletion": shared.OptionInfo("Never", "Try to add known trigger words for LORA/LyCO models", gr.Dropdown, lambda: {"interactive": model_keyword_installed, "choices": ["Never","Only user list","Always"]}).info("Requires the <a href=\"https://github.com/mix1009/model-keyword\" target=\"_blank\">model-keyword</a> extension to be installed, but will work with it disabled.").needs_restart(),
"tac_modelKeywordCompletion": shared.OptionInfo("Never", "Try to add known trigger words for LORA/LyCO models", gr.Dropdown, lambda: {"choices": ["Never","Only user list","Always"]}).info("Will use & prefer the native activation keywords settable in the extra networks UI. Other functionality requires the <a href=\"https://github.com/mix1009/model-keyword\" target=\"_blank\">model-keyword</a> extension to be installed, but will work with it disabled.").needs_restart(),
"tac_wildcardCompletionMode": shared.OptionInfo("To next folder level", "How to complete nested wildcard paths", gr.Dropdown, lambda: {"choices": ["To next folder level","To first difference","Always fully"]}).info("e.g. \"hair/colours/light/...\""),
# Alias settings
"tac_alias.searchByAlias": shared.OptionInfo(True, "Search by alias"),
@@ -401,3 +406,36 @@ def on_ui_settings():
shared.opts.add_option("tac_refreshTempFiles", shared.OptionInfo("Refresh TAC temp files", "Refresh internal temp files", gr.HTML, {}, refresh=refresh_temp_files, section=TAC_SECTION))
script_callbacks.on_ui_settings(on_ui_settings)
def api_tac(_: gr.Blocks, app: FastAPI):
async def get_json_info(path: Path):
if not path:
return json.dumps({})
try:
if path is not None and path.exists() and path.parent.joinpath(path.stem + ".json").exists():
return FileResponse(path.parent.joinpath(path.stem + ".json").as_posix())
except Exception as e:
return json.dumps({"error": e})
@app.get("/tacapi/v1/lora-info/{folder}/{lora_name}")
async def get_lora_info_subfolder(folder, lora_name):
if LORA_PATH is None:
return json.dumps({})
return await get_json_info(LORA_PATH.joinpath(folder).joinpath(lora_name))
@app.get("/tacapi/v1/lyco-info/{folder}/{lyco_name}")
async def get_lyco_info_subfolder(folder, lyco_name):
if LYCO_PATH is None:
return json.dumps({})
return await get_json_info(LYCO_PATH.joinpath(folder).joinpath(lyco_name))
@app.get("/tacapi/v1/lora-info/{lora_name}")
async def get_lora_info(lora_name):
return await get_lora_info_subfolder(".", lora_name)
@app.get("/tacapi/v1/lyco-info/{lyco_name}")
async def get_lyco_info(lyco_name):
return await get_lyco_info_subfolder(".", lyco_name)
script_callbacks.on_app_started(api_tac)
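A usage sketch for the new endpoints (lora and lyco names invented): they are called with the model's path relative to the Lora/LyCORIS folder and answer with the model's sidecar `.json` metadata, or an empty/null payload when no matching JSON exists. From the browser side this pairs with the fetchAPI helper added above:

    // single-level subfolders are covered by the {folder}/{name} route variant
    let info = await fetchAPI("tacapi/v1/lora-info/MyLora.safetensors");
    let sub = await fetchAPI("tacapi/v1/lyco-info/styles/MyLyco.safetensors");
    // info?.["activation text"] and info?.["preferred weight"] then drive the completion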