
server : (webui) migrate project to ReactJS with typescript #11688

Open · wants to merge 10 commits into base: master
Conversation

ngxson
Collaborator

@ngxson ngxson commented Feb 5, 2025

Close #11663 #10915 #9608

Alongside the move to ReactJS, this PR also adds some subtle changes that would be (almost) impossible to do with VueJS:

  1. Markdown is now rendered as DOM (via react-markdown) instead of as innerHTML (via markdown-it). This improves rendering performance on long markdown input, while also enabling selection of the text content while it's being generated, as described in Bug: llama-server web UI resets the text selection during inference on every token update #9608
  2. Each conversation now has its own address, for example http://localhost:5173/#/chat/conv-1738775907156
  3. The "Copy" button on code blocks now changes its text to "Copied" on click. A very small change, but now possible thanks to react-markdown
  4. Users can switch between conversations while text generation is still in progress. This mostly matches the behavior of Claude / ChatGPT / DeepSeek / etc., but also demonstrates that the application's data flow is now decoupled.

This change will be transparent for end-users and requires no migrations.
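The per-conversation addresses in item 2 follow a hash-route scheme of the form `#/chat/conv-<timestamp>`. A minimal sketch of how such ids and routes could be produced and parsed (function names are hypothetical, not the actual webui code):

```typescript
// Hypothetical sketch of the "#/chat/conv-<timestamp>" hash-route scheme.
// Names are illustrative; the real webui implementation may differ.

function newConvId(now: number = Date.now()): string {
  // Conversation id derived from a millisecond timestamp
  return `conv-${now}`;
}

function convRoute(id: string): string {
  // Hash route for a conversation, e.g. "#/chat/conv-1738775907156"
  return `#/chat/${id}`;
}

function parseConvRoute(hash: string): string | null {
  // Recover the conversation id from a hash route, or null if it's not one
  const m = hash.match(/^#\/chat\/(conv-\d+)$/);
  return m ? m[1] : null;
}
```

With a scheme like this, `http://localhost:5173/#/chat/conv-1738775907156` maps directly to one stored conversation, so reloading or sharing the URL reopens the same chat.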

@ngxson ngxson requested a review from ggerganov February 5, 2025 22:40
@github-actions github-actions bot added the examples, devops (improvements to build systems and github actions), and server labels Feb 5, 2025
@woof-dog
Contributor

woof-dog commented Feb 6, 2025

I like that I can copy anytime now! But please be very careful:

This change breaks LaTeX formatting for some math equations!

Existing:
[screenshot]

This branch:
[screenshot]

It's very odd, sometimes it works:
[screenshot]

Maybe it's only broken when the math starts with a number!

@woof-dog
Contributor

woof-dog commented Feb 6, 2025

My only other comment: while the data flow is no longer coupled, the textarea still is. I cannot send two prompts in two conversations in parallel, which would be nice when I want to ask two separate long questions and have both answers by the time I am back from my coffee break. With this PR I still have to use two tabs, even though as a user I would expect to send the second prompt from the same tab in a different conversation. If I start a new conversation, the "Stop" button remains the same.

But this is a nitpick, not as important as the other things; just something for your future consideration.
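Decoupling the "Stop" state per conversation could be done by keying in-flight generations by conversation id. A minimal sketch of that idea, assuming one `AbortController` per conversation (this is illustrative, not the actual webui state management):

```typescript
// Hypothetical sketch: per-conversation generation state, so that the
// "Stop" button of one chat does not affect another. Each conversation id
// owns at most one in-flight generation, tracked by an AbortController.

class GenerationRegistry {
  private controllers = new Map<string, AbortController>();

  start(convId: string): AbortSignal {
    this.stop(convId); // at most one active generation per conversation
    const ctrl = new AbortController();
    this.controllers.set(convId, ctrl);
    return ctrl.signal; // pass to fetch() for the streaming request
  }

  stop(convId: string): void {
    this.controllers.get(convId)?.abort();
    this.controllers.delete(convId);
  }

  isGenerating(convId: string): boolean {
    return this.controllers.has(convId);
  }
}
```

With this shape, the "Stop" button for `conv-A` calls `stop('conv-A')` and leaves `conv-B`'s stream untouched.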

Comment on lines 301 to 329
export function preprocessLaTeX(content: string): string {
  // Step 1: Protect code blocks
  const codeBlocks: string[] = [];
  content = content.replace(/(```[\s\S]*?```|`[^`\n]+`)/g, (_, code) => {
    codeBlocks.push(code);
    return `<<CODE_BLOCK_${codeBlocks.length - 1}>>`;
  });

  // Step 2: Protect existing LaTeX expressions
  const latexExpressions: string[] = [];
  content = content.replace(
    /(\$\$[\s\S]*?\$\$|\\\[[\s\S]*?\\\]|\\\(.*?\\\))/g,
    (match) => {
      latexExpressions.push(match);
      return `<<LATEX_${latexExpressions.length - 1}>>`;
    }
  );

  // Step 3: Escape dollar signs that are likely currency indicators
  content = content.replace(/\$(?=\d)/g, '\\$');

  // Step 4: Restore LaTeX expressions
  content = content.replace(
    /<<LATEX_(\d+)>>/g,
    (_, index) => latexExpressions[parseInt(index)]
  );

  // Step 5: Restore code blocks
  content = content.replace(
    /<<CODE_BLOCK_(\d+)>>/g,
    (_, index) => codeBlocks[parseInt(index)]
  );

  return content;
}
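The breakage woof-dog reports is visible in Step 3: Step 2 only protects `$$...$$`, `\[...\]`, and `\(...\)`, so single-dollar inline math is left exposed, and the currency-escape regex fires whenever such math starts with a digit. A self-contained illustration of just that step:

```typescript
// Self-contained illustration of Step 3 of preprocessLaTeX above.
// Single-dollar math like "$2x + 1$" is not protected by Step 2, so the
// currency escape \$(?=\d) rewrites its opening dollar sign whenever the
// math begins with a digit, breaking the LaTeX delimiter pair.

function escapeCurrency(content: string): string {
  return content.replace(/\$(?=\d)/g, '\\$');
}

// "$2x + 1$"  -> "\$2x + 1$"  (broken: opening delimiter escaped)
// "$x + 1$"   -> "$x + 1$"    (untouched: "x" is not a digit)
```

This matches the observation above that the formatting only breaks when the math starts with a number.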
Contributor


This is like going back to the old version of the UI and almost as hacky as my pull request 😢

Collaborator Author


Agreed that this is not the best solution; I copied it from the internet, see the comment above this function.

The proper way would be to implement a custom remark/rehype plugin for that; I'll see if I can make it quickly.
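The core of such a plugin would split raw text into plain-text and inline-math segments rather than regex-rewriting the whole markdown string. A dependency-free sketch of that tokenization step (the segment shape and the deliberately simplistic single-dollar grammar are assumptions, not the actual plugin):

```typescript
// Hypothetical sketch of the tokenization a custom remark/rehype plugin
// would perform: split text into plain-text and inline-math segments so
// math is handled as structured nodes, not string rewriting.

type Segment = { type: 'text' | 'inlineMath'; value: string };

function splitInlineMath(text: string): Segment[] {
  const segments: Segment[] = [];
  // "$...$" with no newline or "$" inside; deliberately simplistic.
  const re = /\$([^$\n]+)\$/g;
  let last = 0;
  for (let m = re.exec(text); m !== null; m = re.exec(text)) {
    if (m.index > last) {
      segments.push({ type: 'text', value: text.slice(last, m.index) });
    }
    segments.push({ type: 'inlineMath', value: m[1] });
    last = m.index + m[0].length;
  }
  if (last < text.length) {
    segments.push({ type: 'text', value: text.slice(last) });
  }
  return segments;
}
```

Because math becomes its own node type here, a later currency-escaping pass would never see the math content, avoiding the digit-led breakage discussed above.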

@ggerganov
Owner

> while also enable selecting the text content while it's being generated,

Hm, this still does not work for me - the selection is being cleared on each new token. Tried Vivaldi and Safari on macOS.

Owner

@ggerganov ggerganov left a comment


I'm not familiar enough with web technologies to make a proper review. Running some tests, everything seems to work, so for me it's good enough 👍

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

> I still have to use two tabs with this PR even though as a user I would expect to send the second from the same tab, but different conversation. So if I start a new conversation the "Stop" remains the same.

OK, I tested again on Claude and it works as you described. I'll see if I can do a quick hack to make it work that way.

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

@woof-dog Multiple conversations can now be generated at the same time; remember to start the server with -np ... to enable multiple slots. Can you please test it?
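For reference, a minimal sketch of such an invocation (the model path and slot count are placeholders; only the `-np` flag is taken from the comment above):

```shell
# Launch llama-server with 4 parallel slots so up to 4 conversations
# can generate simultaneously. Model path is a placeholder.
./llama-server -m ./models/my-model.gguf -np 4
```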

@woof-dog
Contributor

woof-dog commented Feb 6, 2025

> @woof-dog Multiple conversations can now be generated at the same time, remember to start the server with -np ... to enable multiple slots. Can you please test it?

Yes, this is excellent 😄 works just as expected. This is going to make using the llama.cpp server so much better for me. Thank you

Labels: devops (improvements to build systems and github actions), examples, server

Successfully merging this pull request may close these issues:

Feature Request: move server webui from vuejs to reactjs (with typescript)

3 participants