webui update (#1003)

webui: add the system message when exporting a conversation; support uploading conversations that include a system message
Webui: show the upload option only in a new conversation
Webui: Add model name
webui: increase the height of the chat message window when editing
Webui: auto-close the settings dialog dropdown and maximize screen width when zoomed in
webui: fix date issues and add more dates
webui: surface errors via toast.error
server: add n_past and slot_id in props_simple
webui: show cached tokens, context usage, and prompt speed in chat
webui: modernize ui
webui: change welcome message
webui: change speed display
webui: change run python icon
webui: add config to use server defaults for sampler
webui: put speed on left and context on right
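The "use server defaults for sampler" config above can be sketched as a request builder that simply omits the client-side sampler fields so the server falls back to its own values. This is an illustrative sketch, not the actual webui code; `buildRequestParams`, `SamplerConfig`, and `useServerDefaults` are hypothetical names:

```typescript
// Hypothetical sketch: strip client-side sampler fields when the user opts
// into server defaults, so the completion endpoint uses the server's values.
interface SamplerConfig {
  temperature?: number;
  top_k?: number;
  top_p?: number;
  min_p?: number;
  typical_p?: number;
  [key: string]: unknown;
}

const SAMPLER_KEYS = ['temperature', 'top_k', 'top_p', 'min_p', 'typical_p'] as const;

function buildRequestParams(config: SamplerConfig, useServerDefaults: boolean): SamplerConfig {
  if (!useServerDefaults) return { ...config };
  // Copy everything except the sampler keys; omitted keys let the server decide.
  const out: SamplerConfig = {};
  for (const [k, v] of Object.entries(config)) {
    if (!(SAMPLER_KEYS as readonly string[]).includes(k)) out[k] = v;
  }
  return out;
}
```

Leaving the keys out entirely (rather than sending `null`) is what allows the server-side defaults to apply.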

webui: recognize AsciiDoc files as valid text files (#16850)

* webui: recognize AsciiDoc files as valid text files

* webui: add an updated static webui build

* webui: add the updated dependency list

* webui: re-add an updated static webui build

Add a setting to display message generation statistics (#16901)

* feat: Add setting to display message generation statistics

* chore: build static webui output

webui: add HTML/JS preview support to MarkdownContent with sandboxed iframe (#16757)

* webui: add HTML/JS preview support to MarkdownContent with sandboxed iframe dialog

Extended MarkdownContent to flag previewable code languages,
add a preview button alongside copy controls, manage preview
dialog state, and share styling for the new button group

Introduced CodePreviewDialog.svelte, a sandboxed iframe modal
for rendering HTML/JS previews with consistent dialog controls
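A minimal sketch of what such a sandboxed preview might feed its iframe, assuming the dialog renders code via `srcdoc`; `buildPreviewSrcdoc` and `SANDBOX_FLAGS` are hypothetical names, not the actual CodePreviewDialog implementation:

```typescript
// Hypothetical sketch of the srcdoc a sandboxed preview iframe could render.
// With sandbox="allow-scripts" (and no allow-same-origin), the embedded code
// runs in an opaque origin and cannot reach the parent page or its storage.
function buildPreviewSrcdoc(code: string, language: 'html' | 'javascript'): string {
  if (language === 'html') return code; // HTML snippets render as-is
  // JS snippets get wrapped in a minimal document that executes them.
  return [
    '<!DOCTYPE html>',
    '<html><body>',
    `<script>${code}<\/script>`, // escaped so the literal never closes a host tag
    '</body></html>',
  ].join('\n');
}

const SANDBOX_FLAGS = 'allow-scripts'; // deliberately NOT allow-same-origin
```

Keeping `allow-same-origin` out of the sandbox list is the key isolation choice: scripts run, but against an opaque origin.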

* webui: fullscreen HTML preview dialog using bits-ui

* Update tools/server/webui/src/lib/components/app/misc/CodePreviewDialog.svelte

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* Update tools/server/webui/src/lib/components/app/misc/MarkdownContent.svelte

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* webui: pedantic style tweak for CodePreviewDialog close button

* webui: remove overengineered preview language logic

* chore: update webui static build

---------

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

webui: auto-refresh /props on inference start to resync model metadata (#16784)

* webui: auto-refresh /props on inference start to resync model metadata

- Add no-cache headers to /props and /slots
- Throttle slot checks to 30s
- Prevent concurrent fetches with promise guard
- Trigger refresh from chat streaming for legacy and ModelSelector
- Show dynamic serverWarning when using cached data
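The promise-guard bullet above can be sketched as follows. This is an illustrative pattern, not the actual webui code; `makeGuardedRefresh` is a hypothetical name:

```typescript
// Hypothetical sketch of a promise guard around a /props refresh: while one
// fetch is in flight, every additional caller receives the same promise
// instead of firing a duplicate request.
type PropsFetcher = () => Promise<unknown>;

function makeGuardedRefresh(fetcher: PropsFetcher) {
  let inFlight: Promise<unknown> | null = null;
  return function refresh(): Promise<unknown> {
    if (inFlight) return inFlight; // reuse the pending request
    inFlight = fetcher().finally(() => {
      inFlight = null; // allow the next refresh once this one settles
    });
    return inFlight;
  };
}

// A real fetcher would look something like:
//   () => fetch('/props', { headers: { 'Cache-Control': 'no-cache' } }).then(r => r.json())
```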

* fix: restore proper legacy behavior in webui by using unified /props refresh

Updated assistant message bubbles to show each message's stored model when available,
falling back to the current server model only when the per-message value is missing

When the model selector is disabled, now fetches /props and prioritizes that model name
over chunk metadata, then persists it with the streamed message so legacy mode properly
reflects the backend configuration

* fix: detect first valid SSE chunk and refresh server props once

* fix: removed the slots availability throttle constant and state

* webui: purge ai-generated cruft

* chore: update webui static build

feat(webui): improve LaTeX rendering with currency detection (#16508)

* webui : Revised LaTeX formula recognition

* webui : Further examples containing amounts

* webui : vitest for maskInlineLaTeX

* webui: Moved preprocessLaTeX to lib/utils

* webui: LaTeX in table-cells

* chore: update webui build output (use theirs)

* webui: backslash in LaTeX-preprocessing

* chore: update webui build output

* webui: look-behind backslash-check

* chore: update webui build output

* Apply suggestions from code review

Code maintenance (variable names, code formatting, string handling)

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* webui: Moved constants to lib/constants.

* webui: package woff2 inside base64 data

* webui: LaTeX-line-break in display formula

* chore: update webui build output

* webui: Bugfix (font embedding)

* webui: Bugfix (font embedding)

* webui: vite embeds assets

* webui: don't suppress 404 (fonts)

* refactor: KaTeX integration with SCSS

Moves KaTeX styling to SCSS for better customization and font embedding.

This change includes:
- Adding `sass` as a dev dependency.
- Introducing a custom SCSS file to override KaTeX variables and disable TTF/WOFF fonts, relying solely on WOFF2 for embedding.
- Adjusting the Vite configuration to resolve `katex-fonts` alias and inject SCSS variables.

* fix: LaTeX processing within blockquotes

* webui: update webui build output

---------

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
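The currency-detection idea in the LaTeX commits above can be sketched like this. It is an illustrative stand-in, not the actual `maskInlineLaTeX`; `maskCurrencyDollars` is a hypothetical name:

```typescript
// Illustrative sketch: escape `$` signs that look like currency so the
// Markdown/LaTeX pipeline doesn't pair them up as inline-math delimiters.
function maskCurrencyDollars(text: string): string {
  // `$` immediately followed by a digit (e.g. "$5", "$1,200.50") is treated
  // as currency; the look-behind keeps already-escaped `\$` untouched.
  return text.replace(/(?<!\\)\$(?=\d)/g, '\\$$');
}
```

Amounts like "$5 and $10" then survive as literal text, while genuine `$x+y$` formulas are left for the math renderer.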

server : add props.model_alias (#16943)

* server : add props.model_alias

webui: fix keyboard shortcuts for new chat & edit chat title (#17007)

Better UX for handling multiple attachments in WebUI (#17246)

webui: add OAI-Compat Harmony tool-call streaming visualization and persistence in chat UI (#16618)

* webui: add OAI-Compat Harmony tool-call live streaming visualization and persistence in chat UI

- Purely visual and diagnostic change, no effect on model context, prompt
  construction, or inference behavior

- Captured assistant tool call payloads during streaming and non-streaming
  completions, and persisted them in chat state and storage for downstream use

- Exposed parsed tool call labels beneath the assistant's model info line
  with graceful fallback when parsing fails

- Added tool call badges beneath assistant responses that expose JSON tooltips
  and copy their payloads when clicked, matching the existing model badge styling

- Added a user-facing setting to toggle tool call visibility to the Developer
  settings section directly under the model selector option
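The tool-call capture described above can be sketched as accumulating OAI-compatible streaming deltas; this is an illustrative sketch under the assumption that chunks follow the usual indexed-delta shape, and `accumulateToolCalls` is a hypothetical name:

```typescript
// Hypothetical sketch: each streamed delta carries an index plus partial
// name/arguments fragments, which are concatenated into complete records.
interface ToolCallDelta {
  index: number;
  function?: { name?: string; arguments?: string };
}
interface ToolCall {
  name: string;
  arguments: string;
}

function accumulateToolCalls(deltas: ToolCallDelta[]): ToolCall[] {
  const calls: ToolCall[] = [];
  for (const d of deltas) {
    const call = (calls[d.index] ??= { name: '', arguments: '' });
    if (d.function?.name) call.name += d.function.name;
    if (d.function?.arguments) call.arguments += d.function.arguments;
  }
  return calls;
}
```

Once a stream completes, the accumulated records can be persisted with the message and rendered as badges, as the bullets describe.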

* webui: remove scroll listener causing unnecessary layout updates (model selector)

* Update tools/server/webui/src/lib/components/app/chat/ChatMessages/ChatMessageAssistant.svelte

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* Update tools/server/webui/src/lib/components/app/chat/ChatMessages/ChatMessageAssistant.svelte

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* chore: npm run format & update webui build output

* chore: update webui build output

---------

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

webui: Fix clickability around chat processing statistics UI (#17278)

* fix: Better pointer events handling in chat processing info elements

* chore: update webui build output

Fix merge error

webui: Add a "Continue" Action for Assistant Message (#16971)

* feat: Add "Continue" action for assistant messages

* feat: Continuation logic & prompt improvements

* chore: update webui build output

* feat: Improve logic for continuing the assistant message

* chore: update webui build output

* chore: Linting

* chore: update webui build output

* fix: Remove synthetic prompt logic, use the prefill feature by sending the conversation payload ending with assistant message

* chore: update webui build output

* feat: Enable "Continue" button based on config & non-reasoning model type

* chore: update webui build output

* chore: Update packages with `npm audit fix`

* fix: Remove redundant error

* chore: update webui build output

* chore: Update `.gitignore`

* fix: Add missing change

* feat: Add auto-resizing for Edit Assistant/User Message textareas

* chore: update webui build output
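The prefill approach mentioned above (sending the conversation payload ending with the assistant message) can be sketched as a simple payload builder; `buildContinuationPayload` is a hypothetical name, not the actual implementation:

```typescript
// Hypothetical sketch of the "Continue" action's payload: instead of a
// synthetic "please continue" prompt, the conversation is sent ending with
// the partial assistant message, letting the server prefill and extend it.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildContinuationPayload(history: ChatMessage[], partial: string): ChatMessage[] {
  // The trailing assistant turn is the text to be continued verbatim.
  return [...history, { role: 'assistant', content: partial }];
}
```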

Improved file naming & structure for UI components (#17405)

* refactor: Component files naming & structure

* chore: update webui build output

* refactor: Dialog titles + components naming

* chore: update webui build output

* refactor: Imports

* chore: update webui build output

webui: hide border of button

webui: update

webui: update

webui: update

add vision

webui: minor settings reorganization and add disable autoscroll option (#17452)

* webui: added a dedicated 'Display' settings section that groups visualization options

* webui: added a Display setting to toggle automatic chat scrolling

* chore: update webui build output

Co-authored-by: firecoperana <firecoperana>
This commit is contained in:
firecoperana, 2025-11-24 00:03:45 -06:00, committed by GitHub
parent 920f424929, commit 07d08e15ad
91 changed files with 13517 additions and 11849 deletions


@@ -5,6 +5,7 @@ import { classNames } from '../utils/misc';
import MarkdownDisplay, { CopyButton } from './MarkdownDisplay';
import { ChevronLeftIcon, ChevronRightIcon, ArrowPathIcon, PencilSquareIcon } from '@heroicons/react/24/outline';
import ChatInputExtraContextItem from './ChatInputExtraContextItem';
import TextareaAutosize from 'react-textarea-autosize';
interface SplitMessage {
content: PendingMessage['content'];
@@ -34,7 +35,8 @@ export default function ChatMessage({
isPending?: boolean;
}) {
const { viewingChat, config } = useAppContext();
const [editingContent, setEditingContent] = useState<string | null>(null);
const [editingContent, setEditingContent] = useState<string | null>(null);
const timings = useMemo(
() =>
msg.timings
@@ -50,7 +52,6 @@ export default function ChatMessage({
);
const nextSibling = siblingLeafNodeIds[siblingCurrIdx + 1];
const prevSibling = siblingLeafNodeIds[siblingCurrIdx - 1];
// for reasoning model, we split the message into content and thought
// TODO: implement this as remark/rehype plugin in the future
const { content, thought, isThinking }: SplitMessage = useMemo(() => {
@@ -81,7 +82,8 @@ export default function ChatMessage({
}, [msg]);
if (!viewingChat) return null;
//const model_name = (timings?.model_name ??'')!== '' ? timings?.model_name: viewingChat.conv.model_name;
const model_name = viewingChat.conv.model_name;
return (
<div className="group"
id={id}
@@ -108,12 +110,14 @@ export default function ChatMessage({
{/* textarea for editing message */}
{editingContent !== null && (
<>
<textarea
<TextareaAutosize
dir="auto"
className="textarea textarea-bordered bg-base-100 text-base-content max-w-2xl w-[calc(90vw-8em)] h-24"
className="textarea textarea-bordered bg-base-100 text-base-content max-w-2xl w-[calc(90vw-8em)]"
value={editingContent}
onChange={(e) => setEditingContent(e.target.value)}
></textarea>
onChange={(e: React.ChangeEvent<HTMLTextAreaElement>) => setEditingContent(e.target.value)}
minRows={3}
maxRows={15}
/>
<br />
<button
className="btn btn-ghost mt-2 mr-2"
@@ -186,25 +190,48 @@ export default function ChatMessage({
)}
{/* render timings if enabled */}
{timings && config.showTokensPerSecond && (
<div className="dropdown dropdown-hover dropdown-top mt-2">
<div className="dropdown dropdown-hover dropdown-top max-w-[900px] w-full mt-4">
<div
tabIndex={0}
role="button"
className="cursor-pointer font-semibold text-sm opacity-60"
>
Speed: {timings.predicted_per_second.toFixed(1)} t/s
<div className="font-bold text-xs">
{timings.n_ctx>0 && (
<div className="flex justify-between items-center">
<span className="whitespace-nowrap">
Token: {timings.predicted_per_second.toFixed(1)} t/s | Prompt: {timings.prompt_per_second.toFixed(1)} t/s
</span>
<span className="hidden lg:block pl-[200px] whitespace-nowrap">
Ctx: {timings.predicted_n+timings.prompt_n} / {timings.n_past} / {timings.n_ctx}
</span>
</div>
)}
{(timings.n_ctx==null || timings.n_ctx <=0) && (
<div>
Token: {timings.predicted_per_second.toFixed(1)} t/s | Prompt: {timings.prompt_per_second.toFixed(1)} t/s
</div>
)}
</div>
</div>
<div className="dropdown-content bg-base-100 z-10 w-64 p-2 shadow mt-4">
<p className="text-xs"><b>{model_name}</b></p>
<p className="text-sm">
<b>Prompt</b>
<br />- Tokens: {timings.prompt_n}
<br />- Time: {timings.prompt_ms} ms
<br />- Speed: {timings.prompt_per_second.toFixed(1)} t/s
<br />- Speed: {timings.prompt_per_second.toFixed(2)} t/s
<br />
<b>Generation</b>
<br />- Tokens: {timings.predicted_n}
<br />- Time: {timings.predicted_ms} ms
<br />- Speed: {timings.predicted_per_second.toFixed(1)} t/s
<br />- Speed: {timings.predicted_per_second.toFixed(2)} t/s
<br />
<b>Context</b>
<br />- n_ctx: {timings.n_ctx}
<br />- n_past: {timings.n_past}
<br />
</p>
</div>
</div>
)}
@@ -214,6 +241,13 @@ export default function ChatMessage({
</div>
{/* actions for each message */}
{msg.content !== null && !config.showTokensPerSecond && (
msg.role === 'assistant' &&(
<div className="badge border-none outline-none btn-mini show-on-hover mr-2">
<p className="text-xs">Model: {model_name}</p>
</div>
)
)}
{msg.content !== null && (
<div
className={classNames({
@@ -249,7 +283,7 @@ export default function ChatMessage({
{/* user message */}
{msg.role === 'user' && (
<button
className="badge btn-mini show-on-hover"
className="badge border-none outline-none btn-mini show-on-hover"
onClick={() => setEditingContent(msg.content)}
disabled={msg.content === null}
>
@@ -261,7 +295,7 @@ export default function ChatMessage({
<>
{!isPending && (
<button
className="badge btn-mini show-on-hover mr-2"
className="badge border-none outline-none btn-mini show-on-hover mr-2"
onClick={() => {
if (msg.content !== null) {
onRegenerateMessage(msg as Message);
@@ -274,7 +308,7 @@ export default function ChatMessage({
)}
{!isPending && (
<button
className="badge btn-mini show-on-hover"
className="badge border-none outline-none btn-mini show-on-hover"
onClick={() => setEditingContent(msg.content)}
disabled={msg.content === null}
>
@@ -284,7 +318,7 @@ export default function ChatMessage({
</>
)}
<CopyButton
className="badge btn-mini show-on-hover mr-2"
className="badge border-none outline-none btn-mini show-on-hover mr-2"
content={msg.content}
/>
</div>


@@ -1,12 +1,14 @@
import { ClipboardEvent, useEffect, useMemo, useState } from 'react';
import { useEffect, useMemo, useRef, useState } from 'react';
import toast from 'react-hot-toast';
import { CallbackGeneratedChunk, useAppContext } from '../utils/app.context';
import ChatMessage from './ChatMessage';
import { CanvasType, Message, PendingMessage } from '../utils/types';
import { classNames, cleanCurrentUrl, throttle } from '../utils/misc';
import { classNames, cleanCurrentUrl } from '../utils/misc';
import CanvasPyInterpreter from './CanvasPyInterpreter';
import StorageUtils from '../utils/storage';
import { useVSCodeContext } from '../utils/llama-vscode';
import { useChatTextarea, ChatTextareaApi } from './useChatTextarea.ts';
import { scrollToBottom, useChatScroll } from './useChatScroll.tsx';
import {
ArrowUpIcon,
StopIcon,
@@ -82,23 +84,6 @@ function getListMessageDisplay(
return res;
}
const scrollToBottom = throttle(
(requiresNearBottom: boolean, delay: number = 80) => {
const mainScrollElem = document.getElementById('main-scroll');
if (!mainScrollElem) return;
const spaceToBottom =
mainScrollElem.scrollHeight -
mainScrollElem.scrollTop -
mainScrollElem.clientHeight;
if (!requiresNearBottom || spaceToBottom < 50) {
setTimeout(
() => mainScrollElem.scrollTo({ top: mainScrollElem.scrollHeight }),
delay
);
}
},
80
);
export default function ChatScreen() {
const {
@@ -116,9 +101,10 @@ export default function ChatScreen() {
const extraContext = useChatExtraContext();
useVSCodeContext(textarea, extraContext);
//const { extraContext, clearExtraContext } = useVSCodeContext(textarea);
const msgListRef = useRef<HTMLDivElement>(null);
useChatScroll(msgListRef);
// TODO: improve this when we have "upload file" feature
// keep track of leaf node for rendering
const [currNodeId, setCurrNodeId] = useState<number>(-1);
const messages: MessageDisplay[] = useMemo(() => {
@@ -141,32 +127,44 @@ export default function ChatScreen() {
if (currLeafNodeId) {
setCurrNodeId(currLeafNodeId);
}
scrollToBottom(true);
//useChatScroll will handle the auto scroll
};
const sendNewMessage = async () => {
const lastInpMsg = textarea.value();
if (lastInpMsg.trim().length === 0 || isGenerating(currConvId ?? ''))
const lastInpMsg = textarea.value();
try {
const generate = isGenerating(currConvId ?? '');
console.log('IsGenerating', generate);
if (lastInpMsg.trim().length === 0 || generate)
return;
textarea.setValue('');
scrollToBottom(false);
setCurrNodeId(-1);
// get the last message node
const lastMsgNodeId = messages.at(-1)?.msg.id ?? null;
if (
!(await sendMessage(
const successSendMsg=await sendMessage(
currConvId,
lastMsgNodeId,
lastInpMsg,
extraContext.items,
onChunk
))
) {
);
console.log('Send msg success:', successSendMsg);
if (!successSendMsg)
{
// restore the input message if failed
textarea.setValue(lastInpMsg);
}
// OK
extraContext.clearItems();
}
catch (err) {
//console.error('Error sending message:', error);
toast.error(err instanceof Error ? err.message : String(err));
textarea.setValue(lastInpMsg); // Restore input on error
}
};
const handleEditMessage = async (msg: Message, content: string) => {
@@ -182,6 +180,7 @@ export default function ChatScreen() {
);
setCurrNodeId(-1);
scrollToBottom(false);
};
const handleRegenerateMessage = async (msg: Message) => {
@@ -197,9 +196,10 @@ export default function ChatScreen() {
);
setCurrNodeId(-1);
scrollToBottom(false);
};
const handleContinueMessage = async (msg: Message, content: string) => {
const handleContinueMessage = async (msg: Message, content: string) => {
if (!viewingChat || !continueMessageAndGenerate) return;
setCurrNodeId(msg.id);
scrollToBottom(false);
@@ -211,6 +211,7 @@ export default function ChatScreen() {
);
setCurrNodeId(-1);
scrollToBottom(false);
};
const hasCanvas = !!canvasData;
@@ -251,16 +252,29 @@ export default function ChatScreen() {
>
<div
className={classNames({
'flex flex-col w-full max-w-[900px] mx-auto': true,
'flex flex-col w-[75vw] mx-auto': true,
'hidden lg:flex': hasCanvas, // adapted for mobile
flex: !hasCanvas,
})}
>
<div className="flex items-center justify-center">
{viewingChat?.conv.model_name}
</div>
{/* chat messages */}
<div id="messages-list" className="grow">
<div id="messages-list" className="grow" ref={msgListRef}>
<div className="mt-auto flex justify-center">
{/* placeholder to shift the message to the bottom */}
{viewingChat ? '' : 'Send a message to start'}
<div>
{viewingChat ? '' : ''}
</div>
{viewingChat==null && (
<div className="w-full max-w-2xl px-4">
<div className="mb-8 text-center" >
<p className="text-1xl text-muted-foreground">How can I help you today?</p>
</div>
</div>
)}
</div>
{[...messages, ...pendingMsgDisplay].map((msgDisplay) => {
const actualMsgObject = msgDisplay.msg;
@@ -292,8 +306,7 @@ export default function ChatScreen() {
);
})}
</div>
{/* chat input */}
{/* chat input */}
<ChatInput
textarea={textarea}
extraContext={extraContext}
@@ -301,7 +314,7 @@ export default function ChatScreen() {
onStop={() => stopGenerating(currConvId ?? '')}
isGenerating={isGenerating(currConvId ?? '')}
/>
</div>
</div>
<div className="w-full sticky top-[7em] h-[calc(100vh-9em)]">
{canvasData?.type === CanvasType.PY_INTERPRETER && (
<CanvasPyInterpreter />
@@ -311,38 +324,6 @@ export default function ChatScreen() {
);
}
// function ServerInfo() {
// const { serverProps } = useAppContext();
// const modalities = [];
// if (serverProps?.modalities?.audio) {
// modalities.push('audio');
// }
// if (serverProps?.modalities?.vision) {
// modalities.push('vision');
// }
// return (
// <div
// className="card card-sm shadow-sm border-1 border-base-content/20 text-base-content/70 mb-6"
// tabIndex={0}
// aria-description="Server information"
// >
// <div className="card-body">
// <b>Server Info</b>
// <p>
// <b>Model</b>: {serverProps?.model_path?.split(/(\\|\/)/).pop()}
// <br />
// {modalities.length > 0 ? (
// <>
// <b>Supported modalities:</b> {modalities.join(', ')}
// </>
// ) : (
// ''
// )}
// </p>
// </div>
// </div>
// );
// }
function ChatInput({
textarea,
@@ -384,7 +365,7 @@ function ChatInput({
className="flex flex-col rounded-xl border-1 border-base-content/30 p-3 w-full"
// when a file is pasted to the input, we handle it here
// if a text is pasted, and if it is long text, we will convert it to a file
onPasteCapture={(e: ClipboardEvent<HTMLInputElement>) => {
onPasteCapture={(e: React.ClipboardEvent<HTMLInputElement>) => {
const text = e.clipboardData.getData('text/plain');
if (
text.length > 0 &&


@@ -62,11 +62,6 @@ export default function Header() {
if (newName && newName.trim().length > 0) {
StorageUtils.updateConversationName(viewingChat?.conv.id ?? '', newName);
}
//const importedConv = await StorageUtils.updateConversationName();
//if (importedConv) {
//console.log('Successfully imported:', importedConv.name);
// Refresh UI or navigate to conversation
//}
};
// at the top of your file, alongside ConversationExport:
@@ -75,13 +70,30 @@ export default function Header() {
if (importedConv) {
console.log('Successfully imported:', importedConv.name);
// Refresh UI or navigate to conversation
navigate(`/chat/${importedConv.id}`);
}
};
const downloadConversation = () => {
if (isCurrConvGenerating || !viewingChat) return;
const convId = viewingChat?.conv.id;
const conversationJson = JSON.stringify(viewingChat, null, 2);
// Get the current system message from config
const systemMessage = StorageUtils.getConfig().systemMessage;
// Clone the viewingChat object to avoid modifying the original
const exportData = {
conv: { ...viewingChat.conv },
messages: viewingChat.messages.map(msg => ({ ...msg }))
};
// Find the root message and update its content
const rootMessage = exportData.messages.find(m => m.type === 'root');
if (rootMessage) {
rootMessage.content = systemMessage;
}
const conversationJson = JSON.stringify(exportData, null, 2);
const blob = new Blob([conversationJson], { type: 'application/json' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
@@ -116,9 +128,12 @@ export default function Header() {
{/* action buttons (top right) */}
<div className="flex items-center">
{viewingChat && (
{/* start */ }
{/*viewingChat && */ /* show options for new conversation as well */
(
<div className="dropdown dropdown-end">
{/* "..." button */}
<button
tabIndex={0}
role="button"


@@ -93,6 +93,7 @@ const CodeBlockButtons: React.ElementType<
);
};
export const CopyButton = ({
content,
className,


@@ -1,4 +1,4 @@
import { useState } from 'react';
import { useState, useRef} from 'react';
import { useAppContext } from '../utils/app.context';
import { CONFIG_DEFAULT, CONFIG_INFO } from '../Config';
import { isDev } from '../Config';
@@ -339,7 +339,7 @@ const SETTING_SECTIONS = (
{
type: SettingInputType.CHECKBOX,
label:
'Exclude thought process when sending requests to API (Recommended for DeepSeek-R1)',
'Exclude thought process when sending requests to API (Recommended for Reasoning Models like Deepseek R1)',
key: 'excludeThoughtOnReq',
},
],
@@ -361,7 +361,7 @@ const SETTING_SECTIONS = (
const demoConv = await res.json();
StorageUtils.remove(demoConv.id);
for (const msg of demoConv.messages) {
StorageUtils.appendMsg(demoConv.id, msg);
StorageUtils.appendMsg(demoConv.id, msg, msg.model_name);
}
};
return (
@@ -422,7 +422,7 @@ const SETTING_SECTIONS = (
toast.success('Import complete')
window.location.reload();
} catch (error) {
console.error('' + error);
//console.error('' + error);
toast.error('' + error);
}
};
@@ -452,9 +452,14 @@ const SETTING_SECTIONS = (
{
type: SettingInputType.CHECKBOX,
label: 'Show tokens per second',
label: 'Show generation stats (model name, context size, prompt and token per second)',
key: 'showTokensPerSecond',
},
{
type: SettingInputType.CHECKBOX,
label: 'Use server defaults for parameters (skip sending temp, top_k, top_p, min_p, typical p from WebUI)',
key: 'useServerDefaults',
},
{
type: SettingInputType.LONG_INPUT,
label: (
@@ -582,7 +587,8 @@ export default function SettingDialog({
return;
}
} else {
console.error(`Unknown default type for key ${key}`);
//console.error(`Unknown default type for key ${key}`);
toast.error(`Unknown default type for key ${key}`);
}
}
if (isDev) console.log('Saving config', newConfig);
@@ -595,6 +601,7 @@ export default function SettingDialog({
setLocalConfig({ ...localConfig, [key]: value });
};
const detailsRef = useRef<HTMLDetailsElement>(null); // <-- Add this line
return (
<dialog className={classNames({ modal: true, 'modal-open': show })}>
<div className="modal-box w-11/12 max-w-3xl">
@@ -619,7 +626,7 @@ export default function SettingDialog({
{/* Left panel, showing sections - Mobile version */}
<div className="md:hidden flex flex-row gap-2 mb-4">
<details className="dropdown">
<details className="dropdown" ref={detailsRef}>
<summary className="btn bt-sm w-full m-1">
{SETTING_SECTIONS_GENERATED[sectionIdx].title}
</summary>
@@ -631,7 +638,9 @@ export default function SettingDialog({
'btn btn-ghost justify-start font-normal': true,
'btn-active': sectionIdx === idx,
})}
onClick={() => setSectionIdx(idx)}
onClick={() => {setSectionIdx(idx);
detailsRef.current?.removeAttribute('open');
}}
dir="auto"
>
{section.title}


@@ -16,6 +16,7 @@ import { BtnWithTooltips } from '../utils/common';
import { useAppContext } from '../utils/app.context';
import toast from 'react-hot-toast';
import { useModals } from './ModalProvider';
import {DateTime} from 'luxon'
// at the top of your file, alongside ConversationExport:
async function importConversation() {
@@ -114,7 +115,7 @@ export default function Sidebar() {
aria-label="New conversation"
>
<PencilSquareIcon className="w-5 h-5" />
New conversation
New Conversations
</button>
{/* list of conversations */}
@@ -251,11 +252,11 @@ function ConversationItem({
true,
'btn-soft': isCurrConv,
})}
onClick={onSelect}
>
<button
key={conv.id}
className="w-full overflow-hidden truncate text-start"
onClick={onSelect}
className="w-full overflow-hidden truncate text-start"
dir="auto"
>
{conv.name}
@@ -265,7 +266,7 @@ function ConversationItem({
// on mobile, we always show the ellipsis icon
// on desktop, we only show it when the user hovers over the conversation item
// we use opacity instead of hidden to avoid layout shift
className="cursor-pointer opacity-100 md:opacity-0 group-hover:opacity-100"
className="cursor-pointer opacity-100 xl:opacity-0 group-hover:opacity-100"
onClick={() => {}}
tooltipsContent="More"
>
@@ -318,23 +319,26 @@ export interface GroupedConversations {
// TODO @ngxson : add test for this function
// Group conversations by date
// - Yesterday
// - "Previous 7 Days"
// - "Previous 30 Days"
// - "Month Year" (e.g., "April 2023")
export function groupConversationsByDate(
conversations: Conversation[]
): GroupedConversations[] {
const now = new Date();
const today = new Date(now.getFullYear(), now.getMonth(), now.getDate()); // Start of today
const today=DateTime.now().startOf('day');
const yesterday = today.minus({ days: 1 });
const sevenDaysAgo = new Date(today);
sevenDaysAgo.setDate(today.getDate() - 7);
const yesterday2 = today.minus({ days: 2});
const thirtyDaysAgo = new Date(today);
thirtyDaysAgo.setDate(today.getDate() - 30);
const sevenDaysAgo = today.minus({ days: 7 });
const thirtyDaysAgo = today.minus({ days: 30 });
const groups: { [key: string]: Conversation[] } = {
Today: [],
Yesterday: [],
'Previous 2 Days': [],
'Previous 7 Days': [],
'Previous 30 Days': [],
};
@@ -347,17 +351,20 @@ export function groupConversationsByDate(
);
for (const conv of sortedConversations) {
const convDate = new Date(conv.lastModified);
const convDate=DateTime.fromMillis(conv.lastModified).setZone('America/Chicago');
if (convDate >= today) {
groups['Today'].push(conv);
} else if (convDate >= yesterday) {
groups['Yesterday'].push(conv);
} else if (convDate >= yesterday2) {
groups['Previous 2 Days'].push(conv);
} else if (convDate >= sevenDaysAgo) {
groups['Previous 7 Days'].push(conv);
} else if (convDate >= thirtyDaysAgo) {
groups['Previous 30 Days'].push(conv);
} else {
const monthName = convDate.toLocaleString('default', { month: 'long' });
const year = convDate.getFullYear();
const monthName = convDate.monthLong;
const year = convDate.year;
const monthYearKey = `${monthName} ${year}`;
if (!monthlyGroups[monthYearKey]) {
monthlyGroups[monthYearKey] = [];
@@ -374,20 +381,23 @@ export function groupConversationsByDate(
conversations: groups['Today'],
});
}
if (groups['Previous 7 Days'].length > 0) {
result.push({
title: 'Previous 7 Days',
conversations: groups['Previous 7 Days'],
});
}
if (groups['Previous 30 Days'].length > 0) {
result.push({
title: 'Previous 30 Days',
conversations: groups['Previous 30 Days'],
});
}
const timeRanges = [
{ key: 'Yesterday', display: 'Yesterday' },
{ key: 'Previous 2 Days', display: 'Previous 2 Days' },
{ key: 'Previous 7 Days', display: 'Previous 7 Days' },
{ key: 'Previous 30 Days', display: 'Previous 30 Days' },
// Add more ranges here if needed, e.g., 'Previous 90 Days'
];
for (const range of timeRanges) {
if (groups[range.key]?.length > 0) {
result.push({
title: range.display,
conversations: groups[range.key]
});
}
}
// Sort monthly groups by date (most recent month first)
const sortedMonthKeys = Object.keys(monthlyGroups).sort((a, b) => {


@@ -231,7 +231,7 @@ async function convertPDFToImage(file: File): Promise<string[]> {
if (!ctx) {
throw new Error('Failed to get 2D context from canvas');
}
const task = page.render({ canvasContext: ctx, viewport: viewport });
const task = page.render({ canvasContext: ctx, canvas: canvas, viewport: viewport });
pages.push(
task.promise.then(() => {
return canvas.toDataURL();


@@ -0,0 +1,34 @@
import React, { useEffect } from 'react';
import { throttle } from '../utils/misc';
export const scrollToBottom = (requiresNearBottom: boolean, delay?: number) => {
const mainScrollElem = document.getElementById('main-scroll');
if (!mainScrollElem) return;
const spaceToBottom =
mainScrollElem.scrollHeight -
mainScrollElem.scrollTop -
mainScrollElem.clientHeight;
if (!requiresNearBottom || spaceToBottom < 100) {
setTimeout(
() => mainScrollElem.scrollTo({ top: mainScrollElem.scrollHeight }),
delay ?? 80
);
}
};
const scrollToBottomThrottled = throttle(scrollToBottom, 80);
export function useChatScroll(msgListRef: React.RefObject<HTMLDivElement>) {
useEffect(() => {
if (!msgListRef.current) return;
const resizeObserver = new ResizeObserver((_) => {
scrollToBottomThrottled(true, 10);
});
resizeObserver.observe(msgListRef.current);
return () => {
resizeObserver.disconnect();
};
}, [msgListRef]);
}