Assume we have a conversation like the following:
I want to use BlenderBot to continue generating a response that follows the phrase "I am working as ". What's the best way to do this?
One way to do this in a script would be to put BB's "prompt" message in a `temp_history` key in the observed message:

```python
from parlai.core.agents import create_agent_from_model_file

# The zoo path below is an example; substitute your own BlenderBot checkpoint.
blenderbot = create_agent_from_model_file('zoo:blender/blender_90M/model')
blenderbot.observe({
    'text': 'Can you tell me about your job?',
    'temp_history': "\nSure, I'm happy to talk about it. I am working as",
    'episode_done': False,
})
response = blenderbot.act()['text']
```

The issue here is that we'll still have an EOS token at the end of the full contextual sentence. A more complex way to go about this, then, is to implement an agent that implements `get_prefix_tokens`; see e.g. this test agent, which forces the model's generation to begin with …

Incorporating something like a "prompt" into the top-level message would be useful, however. Presumably, the agent would set its prefix tokens to whatever is in the …
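To make the `get_prefix_tokens` idea concrete, here is a minimal, library-free sketch of what prefix-forced greedy decoding amounts to: while the output is shorter than the prefix, the decoder emits the next prefix token unconditionally, and only afterwards falls back to choosing the model's highest-scoring token. Everything here (the `generate_with_prefix` helper and the toy `toy_scores` "model") is hypothetical illustration, not ParlAI's actual API.

```python
from typing import Callable, List


def generate_with_prefix(
    next_token_scores: Callable[[List[int]], List[float]],
    prefix_tokens: List[int],
    eos_token: int,
    max_len: int = 20,
) -> List[int]:
    """Greedy decoding that forces the output to begin with prefix_tokens.

    At step t, if t < len(prefix_tokens) the prefix token is emitted
    unconditionally; otherwise the argmax token is chosen, until EOS
    or max_len. This mirrors what a get_prefix_tokens-style hook does.
    """
    output: List[int] = []
    while len(output) < max_len:
        if len(output) < len(prefix_tokens):
            token = prefix_tokens[len(output)]  # forced prefix step
        else:
            scores = next_token_scores(output)
            token = max(range(len(scores)), key=scores.__getitem__)
        output.append(token)
        # Only allow stopping once the whole prefix has been emitted.
        if token == eos_token and len(output) >= len(prefix_tokens):
            break
    return output


# Toy "model" over vocabulary {0..4}: prefers token 3 early on, then
# strongly prefers EOS (token 4) once the context is 4 tokens long.
def toy_scores(context: List[int]) -> List[float]:
    return [0.0, 0.1, 0.2, 0.9, 1.0 if len(context) >= 4 else 0.0]


print(generate_with_prefix(toy_scores, prefix_tokens=[1, 2], eos_token=4))
# → [1, 2, 3, 3, 4]
```

Note that the forced prefix steps bypass the model's scores entirely, which is why this approach sidesteps the EOS problem described above: the prefix is guaranteed to appear verbatim at the start of the generation.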