
[Feature Request] system prompt editing textarea UI #6

Open
MGdesigner opened this issue Sep 18, 2024 · 5 comments
@MGdesigner

When we are writing different articles or documents, we need different writing styles. When using ctrl+q, we may get unsuitable outputs. This could be changed with a system prompt instead of using a different LLM.

The system prompt may be something like "You are a professional detective and you will calmly describe the details step by step", or "You are a litterateur and the words you use are very poetic.", etc.

@balisujohn
Owner

I can add a settings dialog for a persistent system prompt. In the meantime, you can write your desired system prompt above the text you want to extend and include it in your selection before ctrl-q, then delete it afterwards.

@balisujohn
Owner

Please see the newest release for the ability to set a system prompt: https://github.com/balisujohn/localwriter/releases/tag/v0.0.5

@MGdesigner
Author

MGdesigner commented Sep 24, 2024

Please see the newest release for the ability to set a system prompt: https://github.com/balisujohn/localwriter/releases/tag/v0.0.5

Thanks for the update, but after testing, I found that "Extend Selection System Prompt" is not working. The LLM still responds according to its original system prompt.

@balisujohn
Owner

It creates a system prompt by always prepending the desired prompt to the selected text, in the following format:

prompt = "SYSTEM PROMPT\n" + self.get_config("extend_selection_system_prompt", "") + "\nEND SYSTEM PROMPT\n" + text_range.getString()

A good way to confirm that it’s working is to use "translate to Italian" as the system prompt for the edit selection. Then, simply say "translate" in the normal prompt. If it responds in Italian, the system prompt is functioning correctly.

For the extend selection, you can use "Write as if you are a pirate" or something similar in the extend selection system prompt to verify that it’s working.

I have confirmed that this works with both Ollama and Text Generation WebUI. However, if Ollama provides its own system prompt, the configured prompt is appended alongside it rather than overriding it.
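To make the append-versus-override distinction concrete, here is a rough sketch against Ollama's /api/generate endpoint (this is not localwriter's actual request code; the endpoint, field names, and model tag are my assumptions about Ollama's API, so verify against the Ollama docs):

import requests

# Append-style (roughly what happens today): the "system prompt" travels inside the
# user prompt, so a SYSTEM line baked into the model's Modelfile still applies on top.
appended = requests.post("http://localhost:11434/api/generate", json={
    "model": "phi3.5",
    "prompt": "SYSTEM PROMPT\nWrite as if you are a pirate\nEND SYSTEM PROMPT\nThe ship left the harbor.",
    "stream": False,
})

# Override-style: pass the text in the request's "system" field, which Ollama uses
# in place of the system prompt defined in the Modelfile for this request.
overridden = requests.post("http://localhost:11434/api/generate", json={
    "model": "phi3.5",
    "system": "Write as if you are a pirate",
    "prompt": "The ship left the harbor.",
    "stream": False,
})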

I have been testing this with Phi 3.5 and previously with OpenChat 3.5. Phi 3.5 Mini 3.8b Q8 seems to work well in Text Generation WebUI. However, the Q4 quant provided by Ollama appears a bit too degraded to consistently follow the system prompt (though it often does).

@MGdesigner
Author

Some customized LLMs come with their own special system prompt, so the original system prompt causes Localwriter to output text like: "I am just an assistant, not a real person.", or "Harry Potter is a famous fantasy novel series.... You know I am Shakespeare. Time's glory is to calm contending kings, To unmask falsehood, and bring truth to light." (When rewriting any article, the LLM always appends the deeds of the characters it plays.)

Maybe adding a new radio button to switch between override and append modes would be better. Yes, I know that switching to another LLM without a system prompt could also solve the problem, but that wastes more disk space.
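A minimal sketch of what such a toggle could look like on the extension side (the config key, mode names, and request shape here are hypothetical, not localwriter's actual code):

def build_request(config, selected_text):
    # Hypothetical flag backing the proposed radio button:
    # "append" keeps the current behaviour, "override" replaces the model's own system prompt.
    mode = config.get("system_prompt_mode", "append")
    system_text = config.get("extend_selection_system_prompt", "")

    if mode == "override":
        # Send the text as a request-level system prompt so it replaces
        # whatever system prompt the model ships with.
        return {"system": system_text, "prompt": selected_text}

    # Current behaviour: wrap the text inside the prompt itself; the model's
    # built-in system prompt is still layered on top of it.
    return {"prompt": "SYSTEM PROMPT\n" + system_text + "\nEND SYSTEM PROMPT\n" + selected_text}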
