Issues: simonw/llm
-f/--fragment option for passing in longer context fragments,...
#617, opened Nov 6, 2024 by simonw (Open, 47 comments)

Issues list
Use condense_json() to make response logging more efficient
enhancement (New feature or request)
#760, opened Feb 18, 2025 by simonw

--short should show the end of the prompt too
enhancement (New feature or request)
#759, opened Feb 17, 2025 by simonw

Decouple Python API from the io.datasette.llm user directory
design, pre-1.0
#754, opened Feb 16, 2025 by simonw

request: autocomplete paths for -a
question (Further information is requested)
#743, opened Feb 11, 2025 by einarmagnus

add --name TEXT option to 'llm prompt' command to set conversation_name
#735, opened Feb 2, 2025 by rdslw

Having plugins installed slows even basic llm --help from 1.3s to 10.7s
#732, opened Feb 2, 2025 by mcint

OpenAI o1 models require max_completion_tokens instead of max_tokens
#724, opened Jan 28, 2025 by archer-eric (a parameter sketch follows this list)

[feature request] Add support for Chain of Thought (CoT) from deepseek-r1
#722, opened Jan 27, 2025 by Rianico

Design annotations abstraction for responses that are not just a stream of plain text
attachments, design
#716, opened Jan 24, 2025 by simonw

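Issue #724 above concerns a concrete OpenAI API change: o1-series models reject the legacy max_tokens parameter and expect max_completion_tokens instead. A minimal sketch of the difference, using the official openai Python SDK (a recent 1.x release is assumed; the model names and token limits are illustrative, not taken from the issue):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# Older chat models accept the legacy max_tokens cap on output length.
legacy = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
    max_tokens=64,
)

# o1-series models reject max_tokens; they expect max_completion_tokens,
# which also counts the hidden reasoning tokens the model generates.
reasoning = client.chat.completions.create(
    model="o1-mini",
    messages=messages,
    max_completion_tokens=1024,
)

print(legacy.choices[0].message.content)
print(reasoning.choices[0].message.content)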