not enough values to unpack in server.py #87

Closed
thetuxedo opened this issue Dec 5, 2018 · 21 comments

@thetuxedo

thetuxedo commented Dec 5, 2018

Hello, I tried to run example5.py on a single-GPU EC2 instance. I set the number of workers to 1.

After a few hours of running, it pops up the following error:

[screenshot: server-side error traceback]

On the client (same instance, another session):

[screenshot: client-side error]

I tried twice; it happens every time. Do you have any suggestions? Thanks.

Also, how long does it usually take to finish training example5.py on a single-GPU machine?
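For context, Python raises this error whenever a tuple-unpacking assignment receives fewer items than it expects; in a server like this, that can happen when a multipart message arrives with fewer frames than the code unpacks. A minimal sketch of the error class (the function and frame names below are hypothetical, not bert-as-service internals):

```python
# Minimal illustration of the "not enough values to unpack" error class.
# unpack_request() and its frame names are hypothetical, for illustration only.
def unpack_request(frames):
    # Expects exactly three frames: client id, message body, request id.
    client_id, msg, req_id = frames
    return client_id, msg, req_id

try:
    # A truncated message (e.g. a dropped frame) triggers the error:
    unpack_request([b"client-1", b"hello"])
except ValueError as e:
    err = str(e)
    print(err)  # not enough values to unpack (expected 3, got 2)
```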

@hanxiao hanxiao self-assigned this Dec 5, 2018
@hanxiao hanxiao added the bug Something isn't working label Dec 5, 2018
@hanxiao
Member

hanxiao commented Dec 5, 2018

I believe it's a bug; I'll check.

@wanghm92

wanghm92 commented Dec 5, 2018

I seem to have run into a similar problem:

[screenshot of a similar error]

Any updates on the bug fix? Thanks!

@hanxiao
Member

hanxiao commented Dec 7, 2018

pip install the latest version and try again. https://github.com/hanxiao/bert-as-service#install

@thetuxedo
Author

I tried again; there is a new error:

[screenshot of the new error]

@mooninrain

With the latest pip version, the same error appears again:

[screenshot of the error]

@hanxiao hanxiao removed the bug Something isn't working label Dec 8, 2018
@hanxiao
Member

hanxiao commented Dec 8, 2018

Sorry, but I can't reproduce the error on example5.py.

Some general suggestions:

  • Update to the latest version (1.4.9): pip install -U bert-serving-server bert-serving-client
  • Remove .prefetch(20) if you are running the server side on CPU; it is not necessary there and is prone to race conditions
  • Set num_parallel_calls=1
  • Or please provide a concrete environment and the steps for reproducing the error
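The race-condition concern behind the last two suggestions can be shown with a toy sketch in plain Python (not tf.data, and not bert-as-service code): parallel pipeline stages that touch shared mutable state can interleave their side effects, while a serial stage, analogous to num_parallel_calls=1, is fully deterministic.

```python
# Toy sketch of why parallel pipeline stages with shared state invite races.
from concurrent.futures import ThreadPoolExecutor

shared = []  # mutable state shared across workers

def process(item):
    shared.append(item)   # unsynchronized side effect: completion order varies
    return item * 2

# Serial map, analogous to num_parallel_calls=1: fully deterministic.
serial = [process(i) for i in range(8)]

shared.clear()
# Parallel map: ex.map still returns results in input order, but the side
# effects on `shared` may happen in any interleaving.
with ThreadPoolExecutor(max_workers=4) as ex:
    parallel = list(ex.map(process, range(8)))

print(serial == parallel)  # True; the race is in the shared state, not results
```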

@hanxiao
Member

hanxiao commented Dec 10, 2018

Closing due to inactivity.

@hanxiao hanxiao closed this as completed Dec 10, 2018
@felixhao28

[screenshot of a similar error]

We ran into a similar issue on version 1.6.4. It seemingly appears at random and is not reliably reproducible.

@felixhao28

Also, the client hangs forever when the server crashes, even when timeout=5000 is set.

@hanxiao
Member

hanxiao commented Jan 8, 2019

@felixhao28 please do pip install -U bert-serving-server bert-serving-client and use the latest version.

It would be better to start a new issue and fill in the issue form; that helps me locate the problem faster.

Note that BertClient.encode() is not thread-safe, whereas BertClient itself is thread-safe. Therefore, if you are using BertClient in a multithreaded environment, please be careful and refer to #29 (comment).
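One common way to respect that constraint is to give each thread its own client instance via threading.local. A real BertClient needs a running server, so the sketch below uses a stand-in FakeClient; only the one-instance-per-thread pattern is the point here.

```python
# Pattern sketch: one client instance per thread via threading.local.
# FakeClient is a hypothetical stand-in for BertClient, which needs a server.
import threading

class FakeClient:
    def __init__(self):
        self.owner = threading.get_ident()
    def encode(self, texts):
        # By construction, each thread only calls encode() on its own instance,
        # so the non-thread-safe method is never shared across threads.
        assert self.owner == threading.get_ident()
        return [len(t) for t in texts]

_local = threading.local()

def get_client():
    if not hasattr(_local, "client"):
        _local.client = FakeClient()   # lazily create one instance per thread
    return _local.client

results = {}
def worker(name):
    results[name] = get_client().encode(["hello", name])

threads = [threading.Thread(target=worker, args=(f"t{i}",)) for i in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(results["t0"])  # [5, 2]
```

Alternatively, a single shared client with a lock around every encode() call also works, at the cost of serializing requests.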

@hanxiao
Member

hanxiao commented Jan 8, 2019

The timeout is another issue and will be addressed separately. Please always create a new issue if you are reporting something new. Thanks!
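Until the built-in timeout is reliable, a generic way to avoid a client hanging forever on a dead server is to run the blocking call in a worker and stop waiting after a deadline. The sketch below uses a hypothetical blocking_encode() in place of bc.encode(); note this only stops waiting, it does not cancel the underlying call.

```python
# Generic timeout guard around a blocking call (stand-in for bc.encode()).
import concurrent.futures
import time

def blocking_encode(texts):
    time.sleep(1.0)          # simulates a call to a crashed/unresponsive server
    return texts

executor = concurrent.futures.ThreadPoolExecutor(max_workers=1)
future = executor.submit(blocking_encode, ["hello"])
try:
    result = future.result(timeout=0.2)    # give up after 200 ms
except concurrent.futures.TimeoutError:
    result = None                          # treat as server-down
executor.shutdown(wait=False)
print(result)  # None
```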

@wanghm92

wanghm92 commented Jan 8, 2019

Like felixhao28 said, this issue is kind of random, which is probably why you could not reproduce it, at least not easily.

However, it does happen every time for sure, at least for me; it's just a matter of how long it holds out before crashing. I use one server hosted on one GPU and let multiple programs query it at random frequencies. You may want to reopen this issue and hope for more input from other users.

I've not changed the default timeout, so I guess that's a separate issue.

Thanks!

@hanxiao
Member

hanxiao commented Jan 8, 2019

@wanghm92 could you open a new issue and fill in the issue form? Knowing the version and system info helps me understand the problem much faster.

@hanxiao
Member

hanxiao commented Jan 8, 2019

are you all encountering this problem on example5.py?

hanxiao added a commit that referenced this issue Jan 8, 2019
@hanxiao
Member

hanxiao commented Jan 8, 2019

FYI @felixhao28, the timeout issue is solved in #181 and the fix is available since 1.6.7. Please run

pip install -U bert-serving-server bert-serving-client

for the update.

@felixhao28

Thanks, we will try version 1.6.7 and report back to you later if the issue persists.

@yanzp0214

[screenshot of the error]
My running BERT server throws this error every once in a while, whether or not I have used the server.

@hanxiao
Member

hanxiao commented Jan 14, 2019

could you please be more specific?

Prerequisites

Please fill in by replacing [ ] with [x].

System information

Some of this information can be collected via this script.

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
  • TensorFlow installed from (source or binary):
  • TensorFlow version:
  • Python version:
  • bert-as-service version:
  • GPU model and memory:
  • CPU model and memory:

Description

Please replace YOUR_SERVER_ARGS and YOUR_CLIENT_ARGS accordingly. You can also write your own description for reproducing the issue.

I'm using this command to start the server:

bert-serving-start YOUR_SERVER_ARGS

and calling the server via:

bc = BertClient(YOUR_CLIENT_ARGS)
bc.encode()

Then this issue shows up:

...

@yanzp0214

My bert-as-service version is 1.6.9, and the latest occurrence of the bug was several hours after I started the BERT server. I never sent a request to the server.

@hanxiao hanxiao reopened this Jan 14, 2019
@hanxiao
Member

hanxiao commented Jan 14, 2019

huh? @yanzp0214 no request to the server and then it's down?

I'm reopening this issue.

@yanzp0214

Yeah~
My test result is that whether or not the BERT server receives any requests, it throws this exception after a period of time.

hanxiao pushed a commit that referenced this issue Jan 15, 2019