Replies: 2 comments 3 replies
-
I found a similar issue that might help you resolve your problem. The error you're encountering is likely due to a type mismatch when passing the chat message history to `RunnableWithMessageHistory`. Here's an example of how you can do this using an in-memory chat message history:

```typescript
import {
  RunnableWithMessageHistory,
  RunnableSequence,
  RunnablePassthrough,
} from '@langchain/core/runnables';
import { InMemoryChatMessageHistory } from '@langchain/core/chat_history';
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from '@langchain/core/prompts';
import { ChatOpenAI } from '@langchain/openai';
import { StringOutputParser } from '@langchain/core/output_parsers';
import type { Document } from '@langchain/core/documents';

const localModel = new ChatOpenAI({
  modelName: 'gpt-4o-2024-08-06',
  temperature: 0.0,
});

const REPHRASE_QUESTION_SYSTEM_TEMPLATE = `Given the following conversation and a follow up question,
rephrase the follow up question to be a standalone question.`;

const rephraseQuestionChainPrompt = ChatPromptTemplate.fromMessages([
  ['system', REPHRASE_QUESTION_SYSTEM_TEMPLATE],
  new MessagesPlaceholder('history'),
  [
    'human',
    'Rephrase the following question as a standalone question:\n{question}',
  ],
]);

const rephraseQuestionChain = RunnableSequence.from([
  rephraseQuestionChainPrompt,
  localModel,
  new StringOutputParser(),
]);

const ANSWER_CHAIN_SYSTEM_TEMPLATE = `You are an experienced researcher,
expert at interpreting and answering questions based on provided sources.
Using the below provided context and chat history,
answer the user's question to the best of your ability using only the resources provided. Be verbose!

<context>
{context}
</context>`;

const answerGenerationChainPrompt = ChatPromptTemplate.fromMessages([
  ['system', ANSWER_CHAIN_SYSTEM_TEMPLATE],
  new MessagesPlaceholder('history'),
  [
    'human',
    'Now, answer this question using the previous context and chat history:\n{standalone_question}',
  ],
]);

// Flatten retrieved documents into a single string for the prompt.
const convertDocsToString = (documents: Document[]): string => {
  return documents
    .map((document) => `<doc>\n${document.pageContent}\n</doc>`)
    .join('\n');
};

const documentRetrievalChain = RunnableSequence.from([
  (input: { question: string }) => input.question,
  async (standaloneQ) => {
    // `retriever`, `botId`, and `model` are defined elsewhere in your code.
    return await retriever(standaloneQ, botId, model);
  },
  convertDocsToString,
]);

const conversationalRetrievalChain = RunnableSequence.from([
  RunnablePassthrough.assign({ standalone_question: rephraseQuestionChain }),
  RunnablePassthrough.assign({ context: documentRetrievalChain }),
  answerGenerationChainPrompt,
  localModel,
  new StringOutputParser(),
]);

const messageHistory = new InMemoryChatMessageHistory();

const finalRetrievalChain = new RunnableWithMessageHistory({
  runnable: conversationalRetrievalChain,
  getMessageHistory: () => messageHistory,
  historyMessagesKey: 'history',
  inputMessagesKey: 'question',
});
```

In this example, `RunnableWithMessageHistory` wraps the conversational retrieval chain, and `getMessageHistory` returns an `InMemoryChatMessageHistory` instance that persists the conversation between invocations.
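Note that returning one shared `InMemoryChatMessageHistory` for every caller mixes all conversations together. Here's a minimal, framework-free sketch of the per-session pattern `getMessageHistory` is usually expected to follow: keep one history per session id and return the same instance on every call. The `SimpleMessageHistory` class and `histories` map below are illustrative stand-ins, not LangChain APIs.

```typescript
// Illustrative stand-in for a chat message history store; not a LangChain class.
type StoredMessage = { role: 'human' | 'ai'; content: string };

class SimpleMessageHistory {
  messages: StoredMessage[] = [];
  addMessage(message: StoredMessage): void {
    this.messages.push(message);
  }
}

// One history per session id, created lazily on first access.
const histories = new Map<string, SimpleMessageHistory>();

function getMessageHistory(sessionId: string): SimpleMessageHistory {
  let history = histories.get(sessionId);
  if (history === undefined) {
    history = new SimpleMessageHistory();
    histories.set(sessionId, history);
  }
  return history;
}

// Two turns in session "a" accumulate; session "b" remains independent.
getMessageHistory('a').addMessage({ role: 'human', content: 'hello' });
getMessageHistory('a').addMessage({ role: 'ai', content: 'hi there' });
```

With the real API you'd pass a function of this shape as `getMessageHistory` and supply the session id at call time through the invoke config, e.g. `{ configurable: { sessionId: 'a' } }`.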
-
@dosu,
-
Description

So I've been trying to use a `RunnableWithMessageHistory` and am trying to pass an empty `ChatMessageHistory` for now (ideally it'd load previous messages from a db), but even this is not working; it keeps complaining about a type mismatch.
System Info
none