So I am creating a stream of responses from `createRetrievalChain` in LangChain JS. The whole code is:

```js
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { createRetrievalChain } from "langchain/chains/retrieval";
import { RunnableSequence, RunnablePick } from "@langchain/core/runnables";
import { HttpResponseOutputParser } from "langchain/output_parsers";

// Chain that stuffs the retrieved documents into the prompt and calls the LLM.
const chain = await createStuffDocumentsChain({
  llm: model,
  prompt: prompt,
});

// Retrieval chain: fetch documents with the retriever, then answer with `chain`.
const conversationChain = await createRetrievalChain({
  combineDocsChain: chain,
  retriever: retrieverChain,
});

// check for question each time
const outputChain = RunnableSequence.from([
  conversationChain,
  new RunnablePick({ keys: "answer" }),
  new HttpResponseOutputParser({ contentType: "text/plain" }),
]);

// Stream the answer as plain-text bytes for the HTTP response.
const stream = await outputChain.stream({
  chat_history: chatHistory,
  input: question,
});
```

On the frontend I am using the Vercel AI SDK in Next.js to stream the response. Everything looks good and works properly, but I also want to get the full textual response from this stream, because I have to store it in my database. How can I do this with LangChain or the Vercel AI SDK?