Experimenting with building a memory system for conversational AI.

Requires a GPT4All model in a models/ subfolder. Currently using a gpt4all-lora-unfiltered-quantized.bin model that has been converted.

You can obtain the unfiltered model from here:

You will need to convert it per the instructions found here: https://github.com/nomic-ai/pyllamacpp#gpt4all

Place the converted model in the models/ folder and name it gpt4all-lora-unfiltered-quantized-converted.bin.
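As a sanity check before running the project, the expected model location from the steps above can be verified with a small sketch (the path and filename are taken from this README; the `model_ready` helper is illustrative, not part of the codebase):

```python
from pathlib import Path

# Expected location per the setup steps above (filename is the README's convention)
MODEL_PATH = Path("models") / "gpt4all-lora-unfiltered-quantized-converted.bin"

def model_ready(path: Path = MODEL_PATH) -> bool:
    """Return True if the converted model file is in place."""
    return path.is_file()

if not model_ready():
    print(f"Model not found at {MODEL_PATH}; download and convert it first.")
```

Running this before loading the model gives a clearer error than letting the loader fail on a missing file.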