Appending to infer_slots_data before sequence_pool causes a core dump during inference, but appending after sequence_pool does not. What is the reason?
for data in slots:
    emb = fluid.layers.embedding(
        input=data,
        size=[self._dict_dim, self._emb_dim],
        is_sparse=True,
        is_distributed=True,
        param_attr=fluid.ParamAttr(name="embedding"))
    self.infer_slots_data.append(emb)
    bow = fluid.layers.sequence_pool(input=emb, pool_type='sum')
    # self.infer_slots_data.append(emb)
Core dump backtrace:
#0 0x00007f1fba52422b in paddle::operators::ConcatOp::GetExpectedKernelType(paddle::framework::ExecutionContext const&) const ()
What is infer_slots_data? Could you paste the relevant code?
It is the feeded_var_names argument passed when saving the model:

fluid.io.save_inference_model(
    dirname=model_name,
    feeded_var_names=feeded_var_names,
    target_vars=target_vars,
    executor=executor,
    # main_program=program.clone(),
    main_program=prog,
    params_filename="params")
The issue was finally resolved by setting config.switch_ir_optim(False) to disable the inference engine's IR optimization. Paddle version = 1.8.5.
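For reference, disabling IR optimization on the inference config might look like the following minimal sketch. The model path and file names here are hypothetical (the "params" file name matches the params_filename used at save time); the AnalysisConfig interface shown is the Paddle 1.8-era native inference API:

```python
from paddle.fluid.core import AnalysisConfig, create_paddle_predictor

# Hypothetical paths to the saved inference model and its combined params file.
config = AnalysisConfig("inference_model/__model__", "inference_model/params")

# Disable the IR graph-optimization passes of the inference engine;
# this is the workaround that avoided the ConcatOp crash above.
config.switch_ir_optim(False)

predictor = create_paddle_predictor(config)
```

Turning off IR optimization skips graph-fusion passes, so it may cost some inference speed; it is a workaround rather than a root-cause fix.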