mirror of
https://github.com/ikawrakow/ik_llama.cpp.git
synced 2026-03-09 21:40:22 +00:00
webui update (#1003)
webui: add system message in export conversation, support upload conversation with system message
webui: show upload only when in new conversation
webui: add model name
webui: increase height of chat message window when clicking editing
webui: autoclose settings dialog dropdown and maximize screen width when zoomed in
webui: fix date issues and add more dates
webui: change error to toast.error
server: add n_past and slot_id in props_simple
webui: add cache tokens, context and prompt speed in chat
webui: modernize ui
webui: change welcome message
webui: change speed display
webui: change run python icon
webui: add config to use server defaults for sampler
webui: put speed on left and context on right

webui: recognize AsciiDoc files as valid text files (#16850)
* webui: recognize AsciiDoc files as valid text files
* webui: add an updated static webui build
* webui: add the updated dependency list
* webui: re-add an updated static webui build

Add a setting to display message generation statistics (#16901)
* feat: Add setting to display message generation statistics
* chore: build static webui output

webui: add HTML/JS preview support to MarkdownContent with sandboxed iframe (#16757)
* webui: add HTML/JS preview support to MarkdownContent with sandboxed iframe dialog
  Extended MarkdownContent to flag previewable code languages, add a preview button alongside copy controls, manage preview dialog state, and share styling for the new button group.
  Introduced CodePreviewDialog.svelte, a sandboxed iframe modal for rendering HTML/JS previews with consistent dialog controls.
* webui: fullscreen HTML preview dialog using bits-ui
* Update tools/server/webui/src/lib/components/app/misc/CodePreviewDialog.svelte
  Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
* Update tools/server/webui/src/lib/components/app/misc/MarkdownContent.svelte
  Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
* webui: pedantic style tweak for CodePreviewDialog close button
* webui: remove overengineered preview language logic
* chore: update webui static build
---------
Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

webui: auto-refresh /props on inference start to resync model metadata (#16784)
* webui: auto-refresh /props on inference start to resync model metadata
  - Add no-cache headers to /props and /slots
  - Throttle slot checks to 30s
  - Prevent concurrent fetches with promise guard
  - Trigger refresh from chat streaming for legacy and ModelSelector
  - Show dynamic serverWarning when using cached data
* fix: restore proper legacy behavior in webui by using unified /props refresh
  Updated assistant message bubbles to show each message's stored model when available, falling back to the current server model only when the per-message value is missing.
  When the model selector is disabled, now fetches /props and prioritizes that model name over chunk metadata, then persists it with the streamed message so legacy mode properly reflects the backend configuration.
* fix: detect first valid SSE chunk and refresh server props once
* fix: removed the slots availability throttle constant and state
* webui: purge ai-generated cruft
* chore: update webui static build

feat(webui): improve LaTeX rendering with currency detection (#16508)
* webui: revised LaTeX formula recognition
* webui: further examples containing amounts
* webui: vitest for maskInlineLaTeX
* webui: moved preprocessLaTeX to lib/utils
* webui: LaTeX in table cells
* chore: update webui build output (use theirs)
* webui: backslash in LaTeX preprocessing
* chore: update webui build output
* webui: look-behind backslash check
* chore: update webui build output
* Apply suggestions from code review
  Code maintenance (variable names, code formatting, string handling)
  Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
* webui: moved constants to lib/constants
* webui: package woff2 inside base64 data
* webui: LaTeX line break in display formula
* chore: update webui build output
* webui: bugfix (font embedding)
* webui: bugfix (font embedding)
* webui: vite embeds assets
* webui: don't suppress 404 (fonts)
* refactor: KaTeX integration with SCSS
  Moves KaTeX styling to SCSS for better customization and font embedding. This change includes:
  - Adding `sass` as a dev dependency.
  - Introducing a custom SCSS file to override KaTeX variables and disable TTF/WOFF fonts, relying solely on WOFF2 for embedding.
  - Adjusting the Vite configuration to resolve the `katex-fonts` alias and inject SCSS variables.
* fix: LaTeX processing within blockquotes
* webui: update webui build output
---------
Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

server: add props.model_alias (#16943)
* server: add props.model_alias

webui: fix keyboard shortcuts for new chat & edit chat title (#17007)

Better UX for handling multiple attachments in WebUI (#17246)

webui: add OAI-Compat Harmony tool-call streaming visualization and persistence in chat UI (#16618)
* webui: add OAI-Compat Harmony tool-call live streaming visualization and persistence in chat UI
  - Purely visual and diagnostic change, no effect on model context, prompt construction, or inference behavior
  - Captured assistant tool call payloads during streaming and non-streaming completions, and persisted them in chat state and storage for downstream use
  - Exposed parsed tool call labels beneath the assistant's model info line with graceful fallback when parsing fails
  - Added tool call badges beneath assistant responses that expose JSON tooltips and copy their payloads when clicked, matching the existing model badge styling
  - Added a user-facing setting to toggle tool call visibility to the Developer settings section directly under the model selector option
* webui: remove scroll listener causing unnecessary layout updates (model selector)
* Update tools/server/webui/src/lib/components/app/chat/ChatMessages/ChatMessageAssistant.svelte
  Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
* Update tools/server/webui/src/lib/components/app/chat/ChatMessages/ChatMessageAssistant.svelte
  Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
* chore: npm run format & update webui build output
* chore: update webui build output
---------
Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

webui: fix clickability around chat processing statistics UI (#17278)
* fix: better pointer-events handling in chat processing info elements
* chore: update webui build output

Fix merge error

webui: add a "Continue" action for assistant messages (#16971)
* feat: Add "Continue" action for assistant messages
* feat: Continuation logic & prompt improvements
* chore: update webui build output
* feat: Improve logic for continuing the assistant message
* chore: update webui build output
* chore: Linting
* chore: update webui build output
* fix: Remove synthetic prompt logic, use the prefill feature by sending the conversation payload ending with an assistant message
* chore: update webui build output
* feat: Enable "Continue" button based on config & non-reasoning model type
* chore: update webui build output
* chore: Update packages with `npm audit fix`
* fix: Remove redundant error
* chore: update webui build output
* chore: Update `.gitignore`
* fix: Add missing change
* feat: Add auto-resizing for Edit Assistant/User Message textareas
* chore: update webui build output

Improved file naming & structure for UI components (#17405)
* refactor: Component files naming & structure
* chore: update webui build output
* refactor: Dialog titles + components naming
* chore: update webui build output
* refactor: Imports
* chore: update webui build output

webui: hide border of button
webui: update
webui: update
webui: update
add vision

webui: minor settings reorganization and add disable-autoscroll option (#17452)
* webui: added a dedicated 'Display' settings section that groups visualization options
* webui: added a Display setting to toggle automatic chat scrolling
* chore: update webui build output

Co-authored-by: firecoperana <firecoperana>
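The "LaTeX rendering with currency detection" entries above reference a `maskInlineLaTeX` helper (it even gets a vitest) but its logic is not shown in this diff. As a rough illustration of the idea, not the actual implementation in `lib/utils`, a minimal heuristic could leave a `$...$` span alone when it looks like a money amount (starts with a digit) and treat it as inline math otherwise:

```typescript
// Hypothetical sketch of currency-vs-math disambiguation for inline `$...$`.
// Assumption: a `$` immediately followed by a digit is currency ("$5"),
// anything else delimits inline LaTeX. The real webui heuristic may differ.
function maskInlineLaTeX(text: string): string {
  return text.replace(/\$([^$\n]+)\$/g, (match, inner: string) =>
    /^\d/.test(inner) ? match : `\\(${inner}\\)` // rewrite math to \( ... \)
  );
}
```

With this rule, `$E = mc^2$` becomes `\(E = mc^2\)` while `costs $5 and $10` is left untouched, since the first `$` is followed by a digit.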
@@ -4,13 +4,14 @@ import Sidebar from './components/Sidebar';
import { AppContextProvider, useAppContext } from './utils/app.context';
import ChatScreen from './components/ChatScreen';
import SettingDialog from './components/SettingDialog';
import { Toaster } from 'react-hot-toast';
import { ModalProvider } from './components/ModalProvider';

function App() {
return (
<ModalProvider>
<HashRouter>
<div className="flex flex-row drawer lg:drawer-open">
<div className="flex flex-row drawer lg:drawer-open h-screen">
<AppContextProvider>
<Routes>
<Route element={<AppLayout />}>
@@ -30,19 +31,21 @@ function AppLayout() {
return (
<>
<Sidebar />
<div
<main
className="drawer-content grow flex flex-col h-screen mx-auto px-4 overflow-auto bg-base-100"
id="main-scroll"
>
<Header />
<Outlet />
</div>
</main>
{
<SettingDialog
show={showSettings}
onClose={() => setShowSettings(false)}
/>
}
<Toaster />
</>
);
}

@@ -13,8 +13,9 @@ export const CONFIG_DEFAULT = {
// Do not use nested objects, keep it single level. Prefix the key if you need to group them.
apiKey: '',
systemMessage: 'You are a helpful assistant.',
showTokensPerSecond: false,
showThoughtInProgress: false,
useServerDefaults: false, // don't send defaults
excludeThoughtOnReq: true,
pasteLongTextToFileLen: 2500,
pdfAsImage: false,
@@ -51,7 +52,7 @@ export const CONFIG_INFO: Record<string, string> = {
pasteLongTextToFileLen:
'On pasting long text, it will be converted to a file. You can control the file length by setting the value of this parameter. Value 0 means disable.',
samplers:
'The order at which samplers are applied, in simplified way. Default is "dkypmxt": dry->top_k->typ_p->top_p->min_p->xtc->top_sigma->temperature',
'The order at which samplers are applied, in simplified way. Default is "dkypmxnt": dry->top_k->typ_p->top_p->min_p->xtc->top_sigma->temperature',
temperature:
'Controls the randomness of the generated text by affecting the probability distribution of the output tokens. Higher = more random, lower = more focused.',
dynatemp_range:
@@ -87,6 +88,7 @@ export const CONFIG_INFO: Record<string, string> = {
dry_penalty_last_n:
'DRY sampling reduces repetition in generated text even across long contexts. This parameter sets DRY penalty for the last n tokens.',
max_tokens: 'The maximum number of token per output.',
useServerDefaults: 'When enabled, skip sending WebUI defaults (e.g., temperature) and use the server\'s default values instead.',
custom: '', // custom json-stringified object
};
// config keys having numeric value (i.e. temperature, top_k, top_p, etc)

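The `pasteLongTextToFileLen` description above defines a simple threshold rule (convert long pastes to a file, 0 disables). A standalone sketch of that decision, illustrative only since the webui performs it inside ChatInput's paste handler, could look like:

```typescript
// Sketch of the paste-to-file rule described by `pasteLongTextToFileLen`
// (illustrative helper, not the actual webui code path).
function shouldConvertPasteToFile(text: string, pasteLongTextToFileLen: number): boolean {
  // A threshold of 0 (or less) disables the conversion entirely.
  if (pasteLongTextToFileLen <= 0) return false;
  // Otherwise convert only when the pasted text exceeds the threshold.
  return text.length > pasteLongTextToFileLen;
}
```

With the default of 2500, a 3000-character paste becomes a file attachment while a short snippet stays inline.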
@@ -5,6 +5,7 @@ import { classNames } from '../utils/misc';
import MarkdownDisplay, { CopyButton } from './MarkdownDisplay';
import { ChevronLeftIcon, ChevronRightIcon, ArrowPathIcon, PencilSquareIcon } from '@heroicons/react/24/outline';
import ChatInputExtraContextItem from './ChatInputExtraContextItem';
import TextareaAutosize from 'react-textarea-autosize';

interface SplitMessage {
content: PendingMessage['content'];
@@ -34,7 +35,8 @@ export default function ChatMessage({
isPending?: boolean;
}) {
const { viewingChat, config } = useAppContext();
const [editingContent, setEditingContent] = useState<string | null>(null);

const timings = useMemo(
() =>
msg.timings
@@ -50,7 +52,6 @@ export default function ChatMessage({
);
const nextSibling = siblingLeafNodeIds[siblingCurrIdx + 1];
const prevSibling = siblingLeafNodeIds[siblingCurrIdx - 1];

// for reasoning model, we split the message into content and thought
// TODO: implement this as remark/rehype plugin in the future
const { content, thought, isThinking }: SplitMessage = useMemo(() => {
@@ -81,7 +82,8 @@ export default function ChatMessage({
}, [msg]);

if (!viewingChat) return null;

//const model_name = (timings?.model_name ??'')!== '' ? timings?.model_name: viewingChat.conv.model_name;
const model_name = viewingChat.conv.model_name;
return (
<div className="group"
id={id}
@@ -108,12 +110,14 @@ export default function ChatMessage({
{/* textarea for editing message */}
{editingContent !== null && (
<>
<textarea
<TextareaAutosize
dir="auto"
className="textarea textarea-bordered bg-base-100 text-base-content max-w-2xl w-[calc(90vw-8em)] h-24"
className="textarea textarea-bordered bg-base-100 text-base-content max-w-2xl w-[calc(90vw-8em)]"
value={editingContent}
onChange={(e) => setEditingContent(e.target.value)}
></textarea>
onChange={(e: React.ChangeEvent<HTMLTextAreaElement>) => setEditingContent(e.target.value)}
minRows={3}
maxRows={15}
/>
<br />
<button
className="btn btn-ghost mt-2 mr-2"
@@ -186,25 +190,48 @@ export default function ChatMessage({
)}
{/* render timings if enabled */}
{timings && config.showTokensPerSecond && (
<div className="dropdown dropdown-hover dropdown-top mt-2">
<div className="dropdown dropdown-hover dropdown-top ax-w-[900px] w-full mt-4">
<div
tabIndex={0}
role="button"
className="cursor-pointer font-semibold text-sm opacity-60"
>
Speed: {timings.predicted_per_second.toFixed(1)} t/s
<div className="font-bold text-xs">
{timings.n_ctx>0 && (
<div className="flex justify-between items-center">
<span className="whitespace-nowrap">
Token: {timings.predicted_per_second.toFixed(1)} t/s | Prompt: {timings.prompt_per_second.toFixed(1)} t/s
</span>
<span className="hidden lg:block pl-[200px] whitespace-nowrap">
Ctx: {timings.predicted_n+timings.prompt_n} / {timings.n_past} / {timings.n_ctx}
</span>
</div>
)}
{(timings.n_ctx==null || timings.n_ctx <=0) && (
<div>
Token: {timings.predicted_per_second.toFixed(1)} t/s | Prompt: {timings.prompt_per_second.toFixed(1)} t/s
</div>
)}
</div>
</div>
<div className="dropdown-content bg-base-100 z-10 w-64 p-2 shadow mt-4">
<p className="text-xs"><b>{model_name}</b></p>
<p className="text-sm">
<b>Prompt</b>
<br />- Tokens: {timings.prompt_n}
<br />- Time: {timings.prompt_ms} ms
<br />- Speed: {timings.prompt_per_second.toFixed(1)} t/s
<br />- Speed: {timings.prompt_per_second.toFixed(2)} t/s
<br />
<b>Generation</b>
<br />- Tokens: {timings.predicted_n}
<br />- Time: {timings.predicted_ms} ms
<br />- Speed: {timings.predicted_per_second.toFixed(1)} t/s
<br />- Speed: {timings.predicted_per_second.toFixed(2)} t/s
<br />
<b>Context</b>
<br />- n_ctx: {timings.n_ctx}
<br />- n_past: {timings.n_past}
<br />
</p>
</div>
</div>
)}
@@ -214,6 +241,13 @@ export default function ChatMessage({
</div>

{/* actions for each message */}
{msg.content !== null && !config.showTokensPerSecond && (
msg.role === 'assistant' &&(
<div className="badge border-none outline-none btn-mini show-on-hover mr-2">
<p className="text-xs">Model: {model_name}</p>
</div>
)
)}
{msg.content !== null && (
<div
className={classNames({
@@ -249,7 +283,7 @@ export default function ChatMessage({
{/* user message */}
{msg.role === 'user' && (
<button
className="badge btn-mini show-on-hover"
className="badge border-none outline-none btn-mini show-on-hover"
onClick={() => setEditingContent(msg.content)}
disabled={msg.content === null}
>
@@ -261,7 +295,7 @@ export default function ChatMessage({
<>
{!isPending && (
<button
className="badge btn-mini show-on-hover mr-2"
className="badge border-none outline-none btn-mini show-on-hover mr-2"
onClick={() => {
if (msg.content !== null) {
onRegenerateMessage(msg as Message);
@@ -274,7 +308,7 @@ export default function ChatMessage({
)}
{!isPending && (
<button
className="badge btn-mini show-on-hover"
className="badge border-none outline-none btn-mini show-on-hover"
onClick={() => setEditingContent(msg.content)}
disabled={msg.content === null}
>
@@ -284,7 +318,7 @@ export default function ChatMessage({
</>
)}
<CopyButton
className="badge btn-mini show-on-hover mr-2"
className="badge border-none outline-none btn-mini show-on-hover mr-2"
content={msg.content}
/>
</div>

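The ChatMessage hunk above notes "for reasoning model, we split the message into content and thought" but elides the actual `SplitMessage` computation. A self-contained sketch of that idea, assuming `<think>...</think>` delimiters (an assumption; the real split logic is not shown in this diff), could look like:

```typescript
// Hypothetical content/thought splitter for reasoning-model output.
// Assumes `<think>...</think>` delimiters; the webui's real SplitMessage
// useMemo is elided from the hunk above and may differ.
interface SplitResult {
  content: string;
  thought: string | null;
  isThinking: boolean; // true while the closing tag has not streamed in yet
}

function splitThought(raw: string): SplitResult {
  const open = raw.indexOf('<think>');
  if (open === -1) return { content: raw, thought: null, isThinking: false };
  const close = raw.indexOf('</think>', open);
  if (close === -1) {
    // Still streaming: everything after <think> is thought so far.
    return { content: raw.slice(0, open), thought: raw.slice(open + 7), isThinking: true };
  }
  return {
    content: raw.slice(0, open) + raw.slice(close + 8),
    thought: raw.slice(open + 7, close),
    isThinking: false,
  };
}
```

Keeping `isThinking` separate is what lets the UI render a live "thinking" indicator while the closing tag has not arrived.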
@@ -1,12 +1,14 @@
import { ClipboardEvent, useEffect, useMemo, useState } from 'react';
import { useEffect, useMemo, useRef, useState } from 'react';
import toast from 'react-hot-toast';
import { CallbackGeneratedChunk, useAppContext } from '../utils/app.context';
import ChatMessage from './ChatMessage';
import { CanvasType, Message, PendingMessage } from '../utils/types';
import { classNames, cleanCurrentUrl, throttle } from '../utils/misc';
import { classNames, cleanCurrentUrl } from '../utils/misc';
import CanvasPyInterpreter from './CanvasPyInterpreter';
import StorageUtils from '../utils/storage';
import { useVSCodeContext } from '../utils/llama-vscode';
import { useChatTextarea, ChatTextareaApi } from './useChatTextarea.ts';
import { scrollToBottom, useChatScroll } from './useChatScroll.tsx';
import {
ArrowUpIcon,
StopIcon,
@@ -82,23 +84,6 @@ function getListMessageDisplay(
return res;
}

const scrollToBottom = throttle(
(requiresNearBottom: boolean, delay: number = 80) => {
const mainScrollElem = document.getElementById('main-scroll');
if (!mainScrollElem) return;
const spaceToBottom =
mainScrollElem.scrollHeight -
mainScrollElem.scrollTop -
mainScrollElem.clientHeight;
if (!requiresNearBottom || spaceToBottom < 50) {
setTimeout(
() => mainScrollElem.scrollTo({ top: mainScrollElem.scrollHeight }),
delay
);
}
},
80
);

export default function ChatScreen() {
const {
@@ -116,9 +101,10 @@ export default function ChatScreen() {

const extraContext = useChatExtraContext();
useVSCodeContext(textarea, extraContext);
//const { extraContext, clearExtraContext } = useVSCodeContext(textarea);

const msgListRef = useRef<HTMLDivElement>(null);
useChatScroll(msgListRef);
// TODO: improve this when we have "upload file" feature

// keep track of leaf node for rendering
const [currNodeId, setCurrNodeId] = useState<number>(-1);
const messages: MessageDisplay[] = useMemo(() => {
@@ -141,32 +127,44 @@ export default function ChatScreen() {
if (currLeafNodeId) {
setCurrNodeId(currLeafNodeId);
}
scrollToBottom(true);
//useChatScroll will handle the auto scroll
};

const sendNewMessage = async () => {
const lastInpMsg = textarea.value();
if (lastInpMsg.trim().length === 0 || isGenerating(currConvId ?? ''))

const lastInpMsg = textarea.value();
try {
const generate = isGenerating(currConvId ?? '');
console.log('IsGenerating', generate);
if (lastInpMsg.trim().length === 0 || generate)
return;

textarea.setValue('');
scrollToBottom(false);
setCurrNodeId(-1);
// get the last message node
const lastMsgNodeId = messages.at(-1)?.msg.id ?? null;
if (
!(await sendMessage(
const successSendMsg=await sendMessage(
currConvId,
lastMsgNodeId,
lastInpMsg,
extraContext.items,
onChunk
))
) {
);
console.log('Send msg success:', successSendMsg);
if (!successSendMsg)
{
// restore the input message if failed
textarea.setValue(lastInpMsg);
}
// OK
extraContext.clearItems();
}
catch (err) {
//console.error('Error sending message:', error);
toast.error(err instanceof Error ? err.message : String(err));
textarea.setValue(lastInpMsg); // Restore input on error
}
};

const handleEditMessage = async (msg: Message, content: string) => {
@@ -182,6 +180,7 @@ export default function ChatScreen() {
);
setCurrNodeId(-1);
scrollToBottom(false);

};

const handleRegenerateMessage = async (msg: Message) => {
@@ -197,9 +196,10 @@ export default function ChatScreen() {
);
setCurrNodeId(-1);
scrollToBottom(false);

};

const handleContinueMessage = async (msg: Message, content: string) => {
if (!viewingChat || !continueMessageAndGenerate) return;
setCurrNodeId(msg.id);
scrollToBottom(false);
@@ -211,6 +211,7 @@ export default function ChatScreen() {
);
setCurrNodeId(-1);
scrollToBottom(false);

};

const hasCanvas = !!canvasData;
@@ -251,16 +252,29 @@ export default function ChatScreen() {
>
<div
className={classNames({
'flex flex-col w-full max-w-[900px] mx-auto': true,
'flex flex-col w-[75vw] mx-auto': true,
'hidden lg:flex': hasCanvas, // adapted for mobile
flex: !hasCanvas,
})}
>
<div className="flex items-center justify-center">
{viewingChat?.conv.model_name}
</div>
{/* chat messages */}
<div id="messages-list" className="grow">
<div id="messages-list" className="grow" ref={msgListRef}>
<div className="mt-auto flex justify-center">
{/* placeholder to shift the message to the bottom */}
{viewingChat ? '' : 'Send a message to start'}
<div>
{viewingChat ? '' : ''}
</div>
{viewingChat==null && (
<div className="w-full max-w-2xl px-4">
<div className="mb-8 text-center" >
<p className="text-1xl text-muted-foreground">How can I help you today?</p>
</div>
</div>
)}

</div>
{[...messages, ...pendingMsgDisplay].map((msgDisplay) => {
const actualMsgObject = msgDisplay.msg;
@@ -292,8 +306,7 @@ export default function ChatScreen() {
);
})}
</div>

{/* chat input */}
<ChatInput
textarea={textarea}
extraContext={extraContext}
@@ -301,7 +314,7 @@ export default function ChatScreen() {
onStop={() => stopGenerating(currConvId ?? '')}
isGenerating={isGenerating(currConvId ?? '')}
/>
</div>
</div>
<div className="w-full sticky top-[7em] h-[calc(100vh-9em)]">
{canvasData?.type === CanvasType.PY_INTERPRETER && (
<CanvasPyInterpreter />
@@ -311,38 +324,6 @@ export default function ChatScreen() {
);
}

// function ServerInfo() {
// const { serverProps } = useAppContext();
// const modalities = [];
// if (serverProps?.modalities?.audio) {
// modalities.push('audio');
// }
// if (serverProps?.modalities?.vision) {
// modalities.push('vision');
// }
// return (
// <div
// className="card card-sm shadow-sm border-1 border-base-content/20 text-base-content/70 mb-6"
// tabIndex={0}
// aria-description="Server information"
// >
// <div className="card-body">
// <b>Server Info</b>
// <p>
// <b>Model</b>: {serverProps?.model_path?.split(/(\\|\/)/).pop()}
// <br />
// {modalities.length > 0 ? (
// <>
// <b>Supported modalities:</b> {modalities.join(', ')}
// </>
// ) : (
// ''
// )}
// </p>
// </div>
// </div>
// );
// }

function ChatInput({
textarea,
@@ -384,7 +365,7 @@ function ChatInput({
className="flex flex-col rounded-xl border-1 border-base-content/30 p-3 w-full"
// when a file is pasted to the input, we handle it here
// if a text is pasted, and if it is long text, we will convert it to a file
onPasteCapture={(e: ClipboardEvent<HTMLInputElement>) => {
onPasteCapture={(e: React.ClipboardEvent<HTMLInputElement>) => {
const text = e.clipboardData.getData('text/plain');
if (
text.length > 0 &&

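The ChatScreen hunk above removes a `scrollToBottom` helper that wrapped its handler in `throttle(fn, 80)` from `../utils/misc` (the commit replaces it with the `useChatScroll` hook). The `throttle` helper itself is never shown in this diff; a plausible leading-edge implementation, for illustration only, is:

```typescript
// Sketch of a leading-edge throttle like the one the removed scrollToBottom
// relied on (the real helper lives in utils/misc and is not shown here).
function throttle<A extends unknown[]>(fn: (...args: A) => void, waitMs: number): (...args: A) => void {
  let last = 0;
  return (...args: A) => {
    const now = Date.now();
    if (now - last >= waitMs) {
      last = now;
      fn(...args); // fire immediately; later calls inside the window are dropped
    }
  };
}
```

Throttling the scroll handler this way caps layout-triggering `scrollTo` calls to one per window while tokens stream in rapidly.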
@@ -62,11 +62,6 @@ export default function Header() {
if (newName && newName.trim().length > 0) {
StorageUtils.updateConversationName(viewingChat?.conv.id ?? '', newName);
}
//const importedConv = await StorageUtils.updateConversationName();
//if (importedConv) {
//console.log('Successfully imported:', importedConv.name);
// Refresh UI or navigate to conversation
//}
};

// at the top of your file, alongside ConversationExport:
@@ -75,13 +70,30 @@ export default function Header() {
if (importedConv) {
console.log('Successfully imported:', importedConv.name);
// Refresh UI or navigate to conversation
navigate(`/chat/${importedConv.id}`);
}
};

const downloadConversation = () => {
if (isCurrConvGenerating || !viewingChat) return;
const convId = viewingChat?.conv.id;
const conversationJson = JSON.stringify(viewingChat, null, 2);

// Get the current system message from config
const systemMessage = StorageUtils.getConfig().systemMessage;

// Clone the viewingChat object to avoid modifying the original
const exportData = {
conv: { ...viewingChat.conv },
messages: viewingChat.messages.map(msg => ({ ...msg }))
};

// Find the root message and update its content
const rootMessage = exportData.messages.find(m => m.type === 'root');
if (rootMessage) {
rootMessage.content = systemMessage;
}

const conversationJson = JSON.stringify(exportData, null, 2);
const blob = new Blob([conversationJson], { type: 'application/json' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
@@ -116,9 +128,12 @@ export default function Header() {

{/* action buttons (top right) */}
<div className="flex items-center">
{viewingChat && (
{/* start */ }
{/*viewingChat && */ /* show options for new conversation as well */
(
<div className="dropdown dropdown-end">
{/* "..." button */}

<button
tabIndex={0}
role="button"

@@ -93,6 +93,7 @@ const CodeBlockButtons: React.ElementType<
);
};


export const CopyButton = ({
content,
className,

@@ -1,4 +1,4 @@
import { useState } from 'react';
import { useState, useRef} from 'react';
import { useAppContext } from '../utils/app.context';
import { CONFIG_DEFAULT, CONFIG_INFO } from '../Config';
import { isDev } from '../Config';
@@ -339,7 +339,7 @@ const SETTING_SECTIONS = (
{
type: SettingInputType.CHECKBOX,
label:
'Exclude thought process when sending requests to API (Recommended for DeepSeek-R1)',
'Exclude thought process when sending requests to API (Recommended for Reasoning Models like Deepseek R1)',
key: 'excludeThoughtOnReq',
},
],
@@ -361,7 +361,7 @@ const SETTING_SECTIONS = (
const demoConv = await res.json();
StorageUtils.remove(demoConv.id);
for (const msg of demoConv.messages) {
StorageUtils.appendMsg(demoConv.id, msg);
StorageUtils.appendMsg(demoConv.id, msg, msg.model_name);
}
};
return (
@@ -422,7 +422,7 @@ const SETTING_SECTIONS = (
toast.success('Import complete')
window.location.reload();
} catch (error) {
console.error('' + error);
//console.error('' + error);
toast.error('' + error);
}
};
@@ -452,9 +452,14 @@ const SETTING_SECTIONS = (

{
type: SettingInputType.CHECKBOX,
label: 'Show tokens per second',
label: 'Show generation stats (model name, context size, prompt and token per second)',
key: 'showTokensPerSecond',
},
{
type: SettingInputType.CHECKBOX,
label: 'Use server defaults for parameters (skip sending temp, top_k, top_p, min_p, typical p from WebUI)',
key: 'useServerDefaults',
},
{
type: SettingInputType.LONG_INPUT,
label: (
@@ -582,7 +587,8 @@ export default function SettingDialog({
return;
}
} else {
console.error(`Unknown default type for key ${key}`);
//console.error(`Unknown default type for key ${key}`);
toast.error(`Unknown default type for key ${key}`);
}
}
if (isDev) console.log('Saving config', newConfig);
@@ -595,6 +601,7 @@ export default function SettingDialog({
setLocalConfig({ ...localConfig, [key]: value });
};

const detailsRef = useRef<HTMLDetailsElement>(null); // <-- Add this line
return (
<dialog className={classNames({ modal: true, 'modal-open': show })}>
<div className="modal-box w-11/12 max-w-3xl">
@@ -619,7 +626,7 @@ export default function SettingDialog({

{/* Left panel, showing sections - Mobile version */}
<div className="md:hidden flex flex-row gap-2 mb-4">
<details className="dropdown">
<details className="dropdown" ref={detailsRef}>
<summary className="btn bt-sm w-full m-1">
{SETTING_SECTIONS_GENERATED[sectionIdx].title}
</summary>
@@ -631,7 +638,9 @@ export default function SettingDialog({
'btn btn-ghost justify-start font-normal': true,
'btn-active': sectionIdx === idx,
})}
onClick={() => setSectionIdx(idx)}
onClick={() => {setSectionIdx(idx);
detailsRef.current?.removeAttribute('open');
}}
dir="auto"
>
{section.title}

@@ -16,6 +16,7 @@ import { BtnWithTooltips } from '../utils/common';
 import { useAppContext } from '../utils/app.context';
 import toast from 'react-hot-toast';
 import { useModals } from './ModalProvider';
+import {DateTime} from 'luxon'

 // at the top of your file, alongside ConversationExport:
 async function importConversation() {
@@ -114,7 +115,7 @@ export default function Sidebar() {
         aria-label="New conversation"
       >
         <PencilSquareIcon className="w-5 h-5" />
-        New conversation
+        New Conversations
       </button>

       {/* list of conversations */}
@@ -251,11 +252,11 @@ function ConversationItem({
         true,
         'btn-soft': isCurrConv,
       })}
       onClick={onSelect}
     >
       <button
-        key={conv.id}
-        className="w-full overflow-hidden truncate text-start"
+        onClick={onSelect}
+        className="w-full overflow-hidden truncate text-start"
         dir="auto"
       >
         {conv.name}
@@ -265,7 +266,7 @@ function ConversationItem({
       // on mobile, we always show the ellipsis icon
       // on desktop, we only show it when the user hovers over the conversation item
       // we use opacity instead of hidden to avoid layout shift
-      className="cursor-pointer opacity-100 md:opacity-0 group-hover:opacity-100"
+      className="cursor-pointer opacity-100 xl:opacity-0 group-hover:opacity-100"
      onClick={() => {}}
      tooltipsContent="More"
    >
@@ -318,23 +319,26 @@ export interface GroupedConversations {

 // TODO @ngxson : add test for this function
 // Group conversations by date
 // - Yesterday
 // - "Previous 7 Days"
 // - "Previous 30 Days"
 // - "Month Year" (e.g., "April 2023")
 export function groupConversationsByDate(
   conversations: Conversation[]
 ): GroupedConversations[] {
-  const now = new Date();
-  const today = new Date(now.getFullYear(), now.getMonth(), now.getDate()); // Start of today
+  const today=DateTime.now().startOf('day');
+  const yesterday = today.minus({ days: 1 });
+  const yesterday2 = today.minus({ days: 2});

-  const sevenDaysAgo = new Date(today);
-  sevenDaysAgo.setDate(today.getDate() - 7);
+  const sevenDaysAgo = today.minus({ days: 7 });

-  const thirtyDaysAgo = new Date(today);
-  thirtyDaysAgo.setDate(today.getDate() - 30);
+  const thirtyDaysAgo = today.minus({ days: 30 });
   const groups: { [key: string]: Conversation[] } = {
     Today: [],
     Yesterday: [],
+    'Previous 2 Days': [],
     'Previous 7 Days': [],
     'Previous 30 Days': [],
   };
@@ -347,17 +351,20 @@ export function groupConversationsByDate(
   );

   for (const conv of sortedConversations) {
-    const convDate = new Date(conv.lastModified);
+    const convDate=DateTime.fromMillis(conv.lastModified).setZone('America/Chicago');
     if (convDate >= today) {
       groups['Today'].push(conv);
     } else if (convDate >= yesterday) {
       groups['Yesterday'].push(conv);
+    } else if (convDate >= yesterday2) {
+      groups['Previous 2 Days'].push(conv);
     } else if (convDate >= sevenDaysAgo) {
       groups['Previous 7 Days'].push(conv);
     } else if (convDate >= thirtyDaysAgo) {
       groups['Previous 30 Days'].push(conv);
     } else {
-      const monthName = convDate.toLocaleString('default', { month: 'long' });
-      const year = convDate.getFullYear();
+      const monthName = convDate.monthLong;
+      const year = convDate.year;
       const monthYearKey = `${monthName} ${year}`;
       if (!monthlyGroups[monthYearKey]) {
         monthlyGroups[monthYearKey] = [];
@@ -374,20 +381,23 @@ export function groupConversationsByDate(
       conversations: groups['Today'],
     });
   }
+  const timeRanges = [
+    { key: 'Yesterday', display: 'Yesterday'},
+    { key: 'Previous 2 Days', display: 'Previous 2 Days'},
+    { key: 'Previous 7 Days', display: 'Previous 7 Days' },
+    { key: 'Previous 30 Days', display: 'Previous 30 Days' },
+    // Add more ranges here if needed, e.g., 'Previous 90 Days'
+  ];

-  if (groups['Previous 7 Days'].length > 0) {
+  for (const range of timeRanges) {
+    if (groups[range.key]?.length > 0) {
       result.push({
-        title: 'Previous 7 Days',
-        conversations: groups['Previous 7 Days'],
+        title: range.display,
+        conversations: groups[range.key]
       });
     }
-  }
-
-  if (groups['Previous 30 Days'].length > 0) {
-    result.push({
-      title: 'Previous 30 Days',
-      conversations: groups['Previous 30 Days'],
-    });
   }

   // Sort monthly groups by date (most recent month first)
   const sortedMonthKeys = Object.keys(monthlyGroups).sort((a, b) => {
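The bucket logic above can be sketched independently of luxon and the UI, using plain millisecond arithmetic. This is an illustrative stand-in, not the webui's actual helper; the threshold ordering mirrors the patch's if/else chain:

```typescript
const DAY_MS = 24 * 60 * 60 * 1000;

// Given a conversation's lastModified timestamp and the start-of-today
// timestamp, pick the same group labels used by groupConversationsByDate.
function bucketFor(lastModified: number, startOfToday: number): string {
  if (lastModified >= startOfToday) return 'Today';
  if (lastModified >= startOfToday - 1 * DAY_MS) return 'Yesterday';
  if (lastModified >= startOfToday - 2 * DAY_MS) return 'Previous 2 Days';
  if (lastModified >= startOfToday - 7 * DAY_MS) return 'Previous 7 Days';
  if (lastModified >= startOfToday - 30 * DAY_MS) return 'Previous 30 Days';
  // Anything older falls through to a "Month Year" group.
  const d = new Date(lastModified);
  return `${d.toLocaleString('default', { month: 'long' })} ${d.getFullYear()}`;
}
```

Because each comparison is against a fixed offset from start-of-day, the first matching range wins, so a conversation can never land in two buckets.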
@@ -231,7 +231,7 @@ async function convertPDFToImage(file: File): Promise<string[]> {
     if (!ctx) {
       throw new Error('Failed to get 2D context from canvas');
     }
-    const task = page.render({ canvasContext: ctx, viewport: viewport });
+    const task = page.render({ canvasContext: ctx, canvas: canvas, viewport: viewport });
     pages.push(
       task.promise.then(() => {
         return canvas.toDataURL();
examples/server/webui/src/components/useChatScroll.tsx (new file, 34 lines)
@@ -0,0 +1,34 @@
+import React, { useEffect } from 'react';
+import { throttle } from '../utils/misc';
+
+export const scrollToBottom = (requiresNearBottom: boolean, delay?: number) => {
+  const mainScrollElem = document.getElementById('main-scroll');
+  if (!mainScrollElem) return;
+  const spaceToBottom =
+    mainScrollElem.scrollHeight -
+    mainScrollElem.scrollTop -
+    mainScrollElem.clientHeight;
+  if (!requiresNearBottom || spaceToBottom < 100) {
+    setTimeout(
+      () => mainScrollElem.scrollTo({ top: mainScrollElem.scrollHeight }),
+      delay ?? 80
+    );
+  }
+};
+
+const scrollToBottomThrottled = throttle(scrollToBottom, 80);
+
+export function useChatScroll(msgListRef: React.RefObject<HTMLDivElement>) {
+  useEffect(() => {
+    if (!msgListRef.current) return;
+
+    const resizeObserver = new ResizeObserver((_) => {
+      scrollToBottomThrottled(true, 10);
+    });
+
+    resizeObserver.observe(msgListRef.current);
+    return () => {
+      resizeObserver.disconnect();
+    };
+  }, [msgListRef]);
+}
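The hook above wraps `scrollToBottom` in a `throttle` helper imported from `../utils/misc`, so a burst of ResizeObserver callbacks produces at most one scroll per 80 ms window. The real helper is not shown in this diff; a minimal leading-edge throttle with the assumed shape could look like:

```typescript
// Leading-edge throttle sketch: the first call in each window fires
// immediately, later calls inside the window are dropped.
// This is an assumed implementation; the webui's actual `throttle` may differ.
function throttle<T extends unknown[]>(
  fn: (...args: T) => void,
  delayMs: number
): (...args: T) => void {
  let lastCall = 0;
  return (...args: T) => {
    const now = Date.now();
    if (now - lastCall >= delayMs) {
      lastCall = now;
      fn(...args);
    }
  };
}
```

Leading-edge behavior matters here: the chat view starts scrolling on the first resize event rather than waiting out the whole delay.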
@@ -9,6 +9,7 @@ html {
   scrollbar-gutter: auto;
 }

+
 .markdown {
   h1,
   h2,
@@ -31,6 +32,7 @@ html {
   /* TODO: fix markdown table */
 }

+
 .show-on-hover {
   @apply md:opacity-0 md:group-hover:opacity-100;
 }
@@ -42,12 +44,16 @@ html {
 }

 .chat-bubble {
+  overflow-wrap: break-word;
+  word-break: break-word;
   @apply break-words;
 }

 .chat-bubble-base-300 {
   --tw-bg-opacity: 1;
   --tw-text-opacity: 1;
+  overflow-wrap: break-word;
+  word-break: break-word;
   @apply break-words bg-base-300 text-base-content;
 }
@@ -13,7 +13,7 @@ import {
   filterThoughtFromMsgs,
   normalizeMsgsForAPI,
   getSSEStreamAsync,
-  getServerProps
+  getServerProps,
 } from './misc';
 import { BASE_URL, CONFIG_DEFAULT, isDev } from '../Config';
 import { matchPath, useLocation, useNavigate } from 'react-router';
@@ -110,8 +110,7 @@ export const AppContextProvider = ({
       setServerProps(props);
     })
     .catch((err) => {
-      console.error(err);
-      toast.error('Failed to fetch server props');
+      console.error(err);
     });
   // eslint-disable-next-line
 }, []);
@@ -216,6 +215,7 @@ export const AppContextProvider = ({
       content: null,
       parent: leafNodeId,
       children: [],
+      model_name: '',
     };
     setPending(convId, pendingMsg as PendingMessage);
   }
@@ -240,13 +240,8 @@ export const AppContextProvider = ({
       cache_prompt: true,
       reasoning_format: config.reasoning_format===''?'auto':config.reasoning_format,
       samplers: config.samplers,
-      temperature: config.temperature,
       dynatemp_range: config.dynatemp_range,
       dynatemp_exponent: config.dynatemp_exponent,
-      top_k: config.top_k,
-      top_p: config.top_p,
-      min_p: config.min_p,
-      typical_p: config.typical_p,
       xtc_probability: config.xtc_probability,
       xtc_threshold: config.xtc_threshold,
       top_n_sigma: config.top_n_sigma,
@@ -260,6 +255,13 @@ export const AppContextProvider = ({
       dry_penalty_last_n: config.dry_penalty_last_n,
       max_tokens: config.max_tokens,
       timings_per_token: !!config.showTokensPerSecond,
+      ...(config.useServerDefaults ? {} :{
+        temperature: config.temperature,
+        top_k: config.top_k,
+        top_p: config.top_p,
+        min_p: config.min_p,
+        typical_p: config.typical_p,
+      }),
       ...(config.custom.length ? JSON.parse(config.custom) : {}),
     };

@@ -322,6 +324,8 @@ export const AppContextProvider = ({
           prompt_ms: timings.prompt_ms,
           predicted_n: timings.predicted_n,
           predicted_ms: timings.predicted_ms,
+          n_ctx: timings.n_ctx,
+          n_past: timings.n_past,
         };
       }
       setPending(convId, pendingMsg as PendingMessage);
@@ -333,10 +337,7 @@ export const AppContextProvider = ({
       // user stopped the generation via stopGeneration() function
       // we can safely ignore this error
     } else {
-      console.error(err);
-      // eslint-disable-next-line @typescript-eslint/no-explicit-any
-      alert((err as any)?.message ?? 'Unknown error');
-      //throw err; // rethrow
+      toast.error(err instanceof Error ? err.message : String(err));
     }
   }
   finally {
@@ -344,7 +345,7 @@ export const AppContextProvider = ({
     if (isContinuation) {
       await StorageUtils.updateMessage(pendingMsg as Message);
     } else if (pendingMsg.content.trim().length > 0) {
-      await StorageUtils.appendMsg(pendingMsg as Message, leafNodeId);
+      await StorageUtils.appendMsg(pendingMsg as Message, leafNodeId, '');
     }
   }
 }
@@ -375,6 +376,16 @@ export const AppContextProvider = ({
   const now = Date.now()+Timer.timercount;
   Timer.timercount=Timer.timercount + 2;
   const currMsgId = now;

+  let model_name:string='';
+  await getServerProps(BASE_URL)
+    .then((props) => {
+      console.debug('Server props:', props);
+      model_name = props.model_name;
+    })
+    .catch((err) => {
+      console.error(err);
+    });
   StorageUtils.appendMsg(
     {
       id: currMsgId,
@@ -383,11 +394,13 @@ export const AppContextProvider = ({
       convId,
       role: 'user',
       content,
+      model_name: model_name,
       extra,
       parent: leafNodeId,
       children: [],
     },
-    leafNodeId
+    leafNodeId,
+    model_name
   );
   onChunk(currMsgId);

@@ -415,9 +428,20 @@ export const AppContextProvider = ({
   ) => {
     if (isGenerating(convId)) return;

     if (content !== null) {
       const now = Date.now();
       const currMsgId = now;

+      let model_name:string='';
+      await getServerProps(BASE_URL)
+        .then((props) => {
+          console.debug('Server props:', props);
+          model_name = props.model_name;
+        })
+        .catch((err) => {
+          console.error(err);
+        });
+
       StorageUtils.appendMsg(
         {
           id: currMsgId,
@@ -426,11 +450,13 @@ export const AppContextProvider = ({
           convId,
           role: 'user',
           content,
+          model_name:model_name,
           extra,
           parent: parentNodeId,
           children: [],
         },
-        parentNodeId
+        parentNodeId,
+        model_name
       );
       parentNodeId = currMsgId;
     }
@@ -452,9 +478,9 @@ export const AppContextProvider = ({
       messageIdToContinue
     );
     if (!existingMessage || existingMessage.role !== 'assistant') {
-      console.error(
-        'Cannot continue non-assistant message or message not found'
-      );
+      // console.error(
+      //   'Cannot continue non-assistant message or message not found'
+      // );
+      toast.error(
+        'Failed to continue message: Not an assistant message or not found.'
+      );
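The `useServerDefaults` change above relies on a conditional object spread: spreading `{}` omits the sampler fields from the request body entirely, so the server applies its own defaults instead of receiving explicit values. A self-contained sketch of the pattern (field names mirror the patch; the config object itself is made up for illustration):

```typescript
// Hypothetical config standing in for the webui's CONFIG object.
const config = { useServerDefaults: true, temperature: 0.8, top_k: 40 };

// When useServerDefaults is true, temperature/top_k are absent from the
// request body, not just undefined, so the server falls back to its defaults.
const params = {
  max_tokens: 512,
  ...(config.useServerDefaults
    ? {}
    : { temperature: config.temperature, top_k: config.top_k }),
};
```

This is different from sending `temperature: undefined`, which would still serialize the key in some JSON encoders; the empty spread never adds the key at all.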
@@ -4,7 +4,6 @@ import { APIMessage, Message, LlamaCppServerProps, APIMessageContentPart } from

 // ponyfill for missing ReadableStream asyncIterator on Safari
 import { asyncIterator } from '@sec-ant/readable-stream/ponyfill/asyncIterator';

 // eslint-disable-next-line @typescript-eslint/no-explicit-any
 export const isString = (x: any) => !!x.toLowerCase;
 // eslint-disable-next-line @typescript-eslint/no-explicit-any
@@ -177,19 +176,20 @@ export const getServerProps = async (
   apiKey?: string
 ): Promise<LlamaCppServerProps> => {
   try {
-    const response = await fetch(`${baseUrl}/props`, {
+    const response = await fetch(`${baseUrl}/v1/props`, {
       headers: {
         'Content-Type': 'application/json',
         ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
       },
     });
     if (!response.ok) {
-      throw new Error('Failed to fetch server props');
+      //throw new Error('Failed to fetch server props');
     }
     const data = await response.json();
     return data as LlamaCppServerProps;
   } catch (error) {
-    console.error('Error fetching server props:', error);
+    //console.error('Error fetching server props:', error);
+    //toast.error('Error fetching server props:' +error);
     throw error;
   }
 };
@@ -1,11 +1,12 @@
 // coversations is stored in localStorage
 // format: { [convId]: { id: string, lastModified: number, messages: [...] } }

-import { CONFIG_DEFAULT } from '../Config';
+//import { useState } from 'react';
+import {BASE_URL, CONFIG_DEFAULT } from '../Config';
 import { Conversation, Message, TimingReport, SettingsPreset } from './types';
 import Dexie, { Table } from 'dexie';
+import {getServerProps} from './misc'
 import { exportDB as exportDexieDB } from 'dexie-export-import';

 import toast from 'react-hot-toast';
 const event = new EventTarget();

 type CallbackConversationChanged = (convId: string) => void;
@@ -31,6 +32,12 @@ db.version(1).stores({
   messages: '&id, convId, [convId+id], timestamp',
 });

+db.version(2).stores({
+  // Unlike SQL, you don’t need to specify all properties but only the one you wish to index.
+  conversations: '&id, lastModified, model_name',
+  messages: '&id, convId, [convId+id], timestamp',
+});
+
 // convId is a string prefixed with 'conv-'
 const StorageUtils = {

@@ -118,11 +125,22 @@ const StorageUtils = {
   async createConversation(name: string): Promise<Conversation> {
     const now = Date.now();
     const msgId = now;
+    let model_name:string = '';
+    //window.alert(BASE_URL);
+    await getServerProps(BASE_URL)
+      .then((props) => {
+        console.debug('Server props:', props);
+        model_name = props.model_name;
+      })
+      .catch((err) => {
+        console.error(err);
+      });
     const conv: Conversation = {
       id: `conv-${now}`,
       lastModified: now,
       currNode: msgId,
       name,
+      model_name:model_name,
     };
     await db.conversations.add(conv);
     // create a root node
@@ -133,6 +151,7 @@ const StorageUtils = {
       timestamp: now,
       role: 'system',
       content: '',
+      model_name:conv.model_name,
       parent: -1,
       children: [],
     });
@@ -143,7 +162,8 @@ const StorageUtils = {
   */
   async appendMsg(
     msg: Exclude<Message, 'parent' | 'children'>,
-    parentNodeId: Message['id']
+    parentNodeId: Message['id'],
+    model_name:string,
   ): Promise<void> {
     if (msg.content === null) return;
     const { convId } = msg;
@@ -161,9 +181,11 @@ const StorageUtils = {
         `Parent message ID ${parentNodeId} does not exist in conversation ${convId}`
       );
     }
+    model_name = model_name!==''?model_name:conv.model_name;
     await db.conversations.update(convId, {
       lastModified: Date.now(),
       currNode: msg.id,
+      model_name: model_name,
     });
     // update parent
     await db.messages.update(parentNodeId, {
@@ -191,10 +213,10 @@ const StorageUtils = {

   // event listeners
   onConversationChanged(callback: CallbackConversationChanged) {
-    const fn = (e: Event) => callback((e as CustomEvent).detail.convId);
-    onConversationChangedHandlers.push([callback, fn]);
-    event.addEventListener('conversationChange', fn);
-  },
+    const fn = (e: Event) => callback((e as CustomEvent).detail.convId);
+    onConversationChangedHandlers.push([callback, fn]);
+    event.addEventListener('conversationChange', fn);
+  },
   offConversationChanged(callback: CallbackConversationChanged) {
     const fn = onConversationChangedHandlers.find(([cb, _]) => cb === callback);
     if (fn) {
@@ -295,7 +317,6 @@ async importConversation(importedData: {

     // Refresh the page to apply changes
     window.location.reload();
-
     return conversation;
   },
   /**
@@ -329,7 +350,7 @@ async importConversation(importedData: {
       const conversation = await StorageUtils.importConversation(jsonData);
       resolve(conversation);
     } catch (error) {
-      console.error('Import failed:', error);
+      toast.error('Import failed:' +error);
       alert(`Import failed: ${error instanceof Error ? error.message : 'Unknown error'}`);
       resolve(null);
     } finally {
@@ -367,7 +388,8 @@ async importConversation(importedData: {
     try {
       return JSON.parse(presetsJson);
     } catch (e) {
-      console.error('Failed to parse presets', e);
+      toast.error('Failed to parse presets: '+ e);
+
       return [];
     }
   },
@@ -444,6 +466,7 @@ async function migrationLStoIDB() {
       lastModified,
       currNode: lastMsg.id,
       name,
+      model_name:'migrate_name'
     });
     const rootId = messages[0].id - 2;
     await db.messages.add({
@@ -454,6 +477,7 @@ async function migrationLStoIDB() {
       role: 'system',
       content: '',
       parent: -1,
+      model_name:'migrate_name',
       children: [firstMsg.id],
     });
     for (let i = 0; i < messages.length; i++) {
@@ -465,6 +489,7 @@ async function migrationLStoIDB() {
       timestamp: msg.id,
       parent: i === 0 ? rootId : messages[i - 1].id,
       children: i === messages.length - 1 ? [] : [messages[i + 1].id],
+      model_name:'',
     });
   }
   migratedCount++;
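The `appendMsg` change above carries a small fallback rule: an explicitly passed `model_name` wins, otherwise the conversation's previously stored name is reused (this is why `sendMessage` can pass `''` for regenerated messages). The rule in isolation, as an illustrative helper rather than the webui's actual code:

```typescript
// Mirrors `model_name = model_name !== '' ? model_name : conv.model_name;`
// from the patch. Both arguments are plain strings; '' means "not provided".
function resolveModelName(passed: string, convModelName: string): string {
  return passed !== '' ? passed : convModelName;
}
```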
@@ -3,6 +3,8 @@ export interface TimingReport {
   prompt_ms: number;
   predicted_n: number;
   predicted_ms: number;
+  n_ctx: number;
+  n_past: number;
 }

 /**
@@ -42,6 +44,7 @@ export interface Message {
   role: 'user' | 'assistant' | 'system';
   content: string;
   timings?: TimingReport;
+  model_name:string;
   extra?: MessageExtra[];
   // node based system for branching
   parent: Message['id'];
@@ -103,6 +106,7 @@ export interface Conversation {
   lastModified: number; // timestamp from Date.now()
   currNode: Message['id']; // the current message node being viewed
   name: string;
+  model_name: string;
 }

 export interface ViewingChat {
@@ -136,6 +140,7 @@ export interface SettingsPreset {
 // a non-complete list of props, only contains the ones we need
 export interface LlamaCppServerProps {
   model_path: string;
+  model_name: string;
   n_ctx: number;
   modalities?: {
     vision: boolean;
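The new `TimingReport` fields feed the chat speed display mentioned in the changelog. From `predicted_n` tokens generated in `predicted_ms` milliseconds, the generation speed is tokens divided by seconds; a sketch of that conversion (the helper name is illustrative, not from the patch):

```typescript
// Convert a (token count, elapsed milliseconds) pair into tokens/second,
// guarding against a zero elapsed time on very short generations.
function tokensPerSecond(predicted_n: number, predicted_ms: number): number {
  return predicted_ms > 0 ? (predicted_n / predicted_ms) * 1000 : 0;
}
```

The same formula applies to prompt processing speed using the prompt-side timings, while `n_past` and `n_ctx` let the UI show how much of the context window is occupied.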