server : (webui) migrate project to ReactJS with typescript #11688
base: master
Conversation
My only other comment: although the data flow is no longer coupled, the textarea still is. I cannot send two prompts in two conversations in parallel, which would be nice when I want to ask two separate long questions and get two answers by the time I am back from my coffee break. With this PR I still have to use two tabs, even though as a user I would expect to send the second prompt from the same tab in a different conversation. If I start a new conversation, the "Stop" state remains the same. This is a nitpick, not as important as the other things, but something for future consideration.
````ts
export function preprocessLaTeX(content: string): string {
  // Step 1: Protect code blocks
  const codeBlocks: string[] = [];
  content = content.replace(/(```[\s\S]*?```|`[^`\n]+`)/g, (_, code) => {
    codeBlocks.push(code);
    return `<<CODE_BLOCK_${codeBlocks.length - 1}>>`;
  });

  // Step 2: Protect existing LaTeX expressions
  const latexExpressions: string[] = [];
  content = content.replace(
    /(\$\$[\s\S]*?\$\$|\\\[[\s\S]*?\\\]|\\\(.*?\\\))/g,
    (match) => {
      latexExpressions.push(match);
      return `<<LATEX_${latexExpressions.length - 1}>>`;
    }
  );

  // Step 3: Escape dollar signs that are likely currency indicators
  content = content.replace(/\$(?=\d)/g, '\\$');

  // Step 4: Restore LaTeX expressions
  content = content.replace(
    /<<LATEX_(\d+)>>/g,
    (_, index) => latexExpressions[parseInt(index)]
  );

  // Step 5: Restore code blocks
  content = content.replace(
    /<<CODE_BLOCK_(\d+)>>/g,
    (_, index) => codeBlocks[parseInt(index)]
  );

  return content;
}
````
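A quick usage sketch of the function above, assuming the final restore step mirrors step 4 (the diff is truncated at that point). It shows the intended behavior: currency dollars are escaped, while inline code and display math pass through untouched.

````typescript
// Copy of preprocessLaTeX from the diff above; the code-block restore
// step is assumed to be symmetric with the LaTeX restore step.
function preprocessLaTeX(content: string): string {
  // Protect code blocks from the currency-escaping pass.
  const codeBlocks: string[] = [];
  content = content.replace(/(```[\s\S]*?```|`[^`\n]+`)/g, (_, code) => {
    codeBlocks.push(code);
    return `<<CODE_BLOCK_${codeBlocks.length - 1}>>`;
  });
  // Protect existing LaTeX expressions.
  const latexExpressions: string[] = [];
  content = content.replace(
    /(\$\$[\s\S]*?\$\$|\\\[[\s\S]*?\\\]|\\\(.*?\\\))/g,
    (match) => {
      latexExpressions.push(match);
      return `<<LATEX_${latexExpressions.length - 1}>>`;
    }
  );
  // Escape dollar signs that look like currency (a digit follows).
  content = content.replace(/\$(?=\d)/g, '\\$');
  // Restore LaTeX, then code blocks.
  content = content.replace(
    /<<LATEX_(\d+)>>/g,
    (_, i) => latexExpressions[parseInt(i, 10)]
  );
  content = content.replace(
    /<<CODE_BLOCK_(\d+)>>/g,
    (_, i) => codeBlocks[parseInt(i, 10)]
  );
  return content;
}

// Currency is escaped; code spans and math survive unchanged.
console.log(preprocessLaTeX('Costs $5, code `$x`, math $$a+b$$'));
// → Costs \$5, code `$x`, math $$a+b$$
````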
This is like going back to the old version of the UI and almost as hacky as my pull request 😢
Agreed, this is not the best solution; admittedly I copied this from the internet, see the comment above this function.
The proper way would be to implement a custom remark/rehype plugin for this; I'll see if I can make it quickly.
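To illustrate the "proper way" mentioned above, here is a minimal sketch of a remark-style plugin, with a hand-rolled visitor instead of `unist-util-visit` so it is self-contained. All names (`remarkEscapeCurrency`, `visitText`, the node shapes) are illustrative assumptions, not the PR's actual code; the point is that escaping happens only in `text` nodes, so code and math nodes can never be corrupted.

```typescript
// Minimal mdast-like node shape: leaf nodes carry `value`,
// container nodes carry `children`.
interface MdNode {
  type: string;
  value?: string;
  children?: MdNode[];
}

// Walk the tree and apply `fn` to text nodes only; inlineCode,
// code, and math nodes are left alone by construction.
function visitText(node: MdNode, fn: (n: MdNode) => void): void {
  if (node.type === 'text') fn(node);
  for (const child of node.children ?? []) visitText(child, fn);
}

// A remark plugin is a function that returns a tree transformer.
function remarkEscapeCurrency() {
  return (tree: MdNode) => {
    visitText(tree, (node) => {
      // Same heuristic as the string-based version: a digit after `$`
      // means currency, so escape it.
      node.value = node.value!.replace(/\$(?=\d)/g, '\\$');
    });
  };
}

const tree: MdNode = {
  type: 'root',
  children: [
    { type: 'text', value: 'Costs $5' },
    { type: 'inlineCode', value: '$x' },
  ],
};
remarkEscapeCurrency()(tree);
console.log(tree.children![0].value); // Costs \$5
console.log(tree.children![1].value); // $x (untouched)
```

Because the transform operates on the parsed tree rather than the raw markdown string, the placeholder `<<CODE_BLOCK_n>>` dance becomes unnecessary.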
Hm, this still does not work for me: the selection is cleared on each new token. Tried Vivaldi and Safari on macOS.
I'm not familiar enough with web technologies to do a proper review. Running some tests, everything seems to work, so for me it's good enough 👍
OK, I tested again on Claude and it works as you described. I'll see if I can do a quick hack to make it work that way.
@woof-dog Multiple conversations can now be generated at the same time; remember to start the server with
Yes, this is excellent 😄 it works just as expected. This is going to make using the llama.cpp server so much better for me. Thank you!
Closes #11663 #10915 #9608
Alongside the move to ReactJS, this PR also adds some subtle changes that would be (almost) impossible to do with VueJS:

- Markdown is rendered as React components (via `react-markdown`) instead of as `innerHTML` (via `markdown-it`). This improves performance when rendering long markdown input, while also enabling selection of the text content while it is being generated, as described in #9608 (Bug: `llama-server` web UI resets the text selection during inference on every token update).
- Each conversation has its own URL, e.g. `http://localhost:5173/#/chat/conv-1738775907156`

This change will be transparent for end-users and requires no migrations.
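The selection-preserving behavior in the first bullet comes down to how the two rendering strategies update the DOM. A dependency-free model of the difference (illustrative only, not the PR's code): an `innerHTML`-style update recreates every node on each token, invalidating the browser's selection anchors, while a reconciling renderer like `react-markdown` reuses nodes whose content did not change, so selection anchors inside them stay valid.

```typescript
// A node in our toy virtual DOM; identity (===) stands in for a real
// DOM node that the browser's text selection is anchored to.
type VNode = { key: string; text: string };

// innerHTML-style update: every node is recreated on every render.
function renderInnerHtml(tokens: string[]): VNode[] {
  return tokens.map((t, i) => ({ key: `n${i}`, text: t }));
}

// React-style reconciliation: a node with unchanged content is reused,
// so its identity (and any selection anchored to it) survives the update.
function reconcile(prev: VNode[], tokens: string[]): VNode[] {
  return tokens.map((t, i) =>
    prev[i] && prev[i].text === t ? prev[i] : { key: `n${i}`, text: t }
  );
}

const first = reconcile([], ['Hello']);
const second = reconcile(first, ['Hello', 'world']); // one token appended

// The "Hello" node keeps its identity across the streaming update...
console.log(second[0] === first[0]); // true
// ...whereas innerHTML-style rendering recreates it every time.
console.log(renderInnerHtml(['Hello', 'world'])[0] === first[0]); // false
```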