Replies: 4 comments 1 reply
-
Setting the chat session doesn't actually feed anything to the model; you need to call `.generate` in order to actually put things in its memory.
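To illustrate the point above: context stored on a session object is inert until each prior turn is actually pushed through `.generate`. The sketch below mimics that replay loop with a stand-in `generate` callable; the function name `replay_history` and the `{"role", "content"}` message format are assumptions for illustration, not part of any library API.

```python
from typing import Callable, Dict, List

def replay_history(history: List[Dict[str, str]],
                   generate: Callable[[str], str]) -> None:
    """Push each prior user turn through the model's generate call.

    Merely storing `history` on a session object does not populate the
    model's context; the turns must be fed through `generate` so the
    model actually processes them.
    """
    for turn in history:
        if turn["role"] == "user":
            # Re-submitting the old prompt puts it (and the model's
            # fresh reply) into the model's working context.
            generate(turn["content"])

# Usage with a stub that records what the "model" saw:
seen: List[str] = []

def stub_generate(prompt: str) -> str:
    seen.append(prompt)
    return ""

replay_history(
    [{"role": "user", "content": "My name is Alice."},
     {"role": "assistant", "content": "Nice to meet you, Alice."},
     {"role": "user", "content": "Remember that."}],
    stub_generate,
)
# seen == ["My name is Alice.", "Remember that."]
```

With a real binding, `stub_generate` would be replaced by the model's own generate method, called inside an active chat session so the replies accumulate in context.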
-
Try this out. It's simple, and you can wrap it in a class:
-
Could we possibly get an answer to this? This still appears to be an issue, and I'm facing it as well.
-
This feature request is tracked at #1959.
-
I hope that I am using the right platform for asking this question. Using the Python bindings, I would like to provide a previous conversation to the large language model, but I cannot figure out how to do that in my code. Here is how I prompt the model in my class:
In the above code, I have `self._set_context`. That is the function that feeds in data from a chat history. For example, `self.chat_session` is extracted from a previous conversation and formatted like this:
And my `self._set_context` function looks like this:
But for some reason, the language model answers my next prompt as if there were no prior conversation. I verified that by asking about things from the previous conversation. I would appreciate any help in resolving this issue. Thanks!