
BigQuery: Widen range for 'google-cloud-core'. #7969

Merged: 2 commits from bigquery-1.11.3-release into googleapis:bigquery-1.11-back, May 15, 2019
Conversation

@tseaver (Contributor) commented on May 14, 2019:

See #7968, "Prep Releases".

@tseaver added the 'api: bigquery' and 'packaging' labels on May 14, 2019
@tseaver requested review from @tswast and a team on May 14, 2019 18:26
@googlebot added the 'cla: yes' label on May 14, 2019
@tseaver (Contributor, Author) commented on May 14, 2019:

There is a unit test failure on the PR branch which is already fixed on master in #7849. I'd say we don't backport a test fix into the bigquery-1.11-back branch.

@busunkim96 added the 'kokoro:force-run' label on May 14, 2019
@tswast (Contributor) left a review comment:

Thanks. Yeah, sorry that I released something with a test failure. No need to backport the test fix.

@tseaver merged commit 3fca366 into googleapis:bigquery-1.11-back on May 15, 2019
@tseaver deleted the bigquery-1.11.3-release branch on May 15, 2019 at 16:33
@tseaver (Contributor, Author) commented on May 15, 2019:

Release tagged.

Distributions pushed to PyPI.

@cgparkinson commented on May 16, 2019:

Probably the fault of my setup, but a nightly test of my code failed last night with the error:
ImportError: No module named client_info

This error turned up no results on Google, so I thought I'd post it here.

I fixed the error by adding google-cloud-core<1.0.0 to my dependencies.

It would probably be better to update all my dependencies rather than relying on old versions, though.

In case it's useful, my dependencies now read:

pandas-gbq
google-cloud-bigquery>=1.9.0
google-cloud-core<1.0.0
google-cloud-storage
parmap
google-api-python-client
google-cloud-pubsub>=0.28.1
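
In case it helps with debugging similar setups, here is a quick way to log which versions pip actually resolved (a minimal sketch using pkg_resources; the package names are taken from the requirements above, and it runs on both Python 2 and 3):

import pkg_resources

# Print the resolved version of each relevant distribution, or note its absence.
for name in ("google-cloud-bigquery", "google-cloud-core", "google-api-core",
             "google-cloud-storage", "google-cloud-pubsub"):
    try:
        print("%s==%s" % (name, pkg_resources.get_distribution(name).version))
    except pkg_resources.DistributionNotFound:
        print("%s is not installed" % name)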

@tswast (Contributor) commented on May 16, 2019:

Probably the fault of my setup, but a nightly test of my code failed last night with the error ImportError: No module named client_info

@cgparkinson did you save more of the stack trace? We didn't intend for there to be any breaking changes in google-cloud-core 1.0.

@tseaver (Contributor, Author) commented on May 16, 2019:

@cgparkinson google-cloud-core 1.0.0 depends on google-api-core >= 1.11.0: did you somehow get the one updated but not the other?
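
A quick check for exactly this skew, as a minimal sketch (the client_info module appears to have been added in google-api-core 1.11.0, the minimum that google-cloud-core 1.0.0 requires):

import pkg_resources

try:
    # client_info ships with google-api-core >= 1.11.0.
    from google.api_core.client_info import ClientInfo  # noqa: F401
    print("google-api-core is new enough for google-cloud-core 1.0.0")
except ImportError:
    print("google-api-core %s is too old; google-cloud-core 1.0.0 needs >= 1.11.0"
          % pkg_resources.get_distribution("google-api-core").version)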

@tswast added a commit to regro-cf-autotick-bot/google-cloud-bigquery-feedstock that referenced this pull request on May 16, 2019
@cgparkinson commented on May 17, 2019:

I don't have logs of exactly what got installed from my requirements.txt (this runs on Dataflow), but I do have the stack trace.

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 554, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 555, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 556, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 216, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 503, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 508, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 258, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 317, in loads
    return load(file, ignore)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 305, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 828, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/<REDACTED>", line 33, in <module>
    from google.cloud import bigquery
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigquery/__init__.py", line 35, in <module>
    from google.cloud.bigquery.client import Client
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigquery/client.py", line 44, in <module>
    from google.cloud.bigquery._http import Connection
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigquery/_http.py", line 17, in <module>
    from google.cloud import _http
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 24, in <module>
    from google.api_core.client_info import ClientInfo
ImportError: No module named client_info

@tswast (Contributor) commented on May 17, 2019:

Thanks.

File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 24, in
from google.api_core.client_info import ClientInfo

This does confirm you have an older version of google-api-core with the latest version of google-cloud-core.

Perhaps you caught us in the window when the patch release for BigQuery was out but the releases for Storage and Pub/Sub were not yet. Does this still happen when you remove the pin on google-cloud-core?

@cgparkinson commented on May 18, 2019:

I tried removing the pin, but I got the same error.

Now, with the pin, I get this error instead:

Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 383, in run_background_function
    _function_handler.invoke_user_function(event_object)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 217, in invoke_user_function
    return call_user_function(request_or_event)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 214, in call_user_function
    event_context.Context(**request_or_event.context))
  File "/user_code/<redacted>.py", line 121, in <redacted>
    <redacted>
  File "/user_code/<redacted>.py", line 401, in <redacted>
    <redacted>
  File "/user_code/<redacted>.py", line 345, in <redacted>
    bq_client = bigquery.Client()
  File "/env/local/lib/python3.7/site-packages/google/cloud/bigquery/client.py", line 161, in __init__
    self._connection = Connection(self, client_info=client_info)
  File "/env/local/lib/python3.7/site-packages/google/cloud/bigquery/_http.py", line 33, in __init__
    super(Connection, self).__init__(client, client_info)
TypeError: __init__() takes 2 positional arguments but 3 were given

Looks like I should be specifying the latest version of all of these libraries?

edit: specifying latest versions

pandas-gbq
google-cloud-bigquery>=1.12.0
google-cloud-core>=1.0.0
google-cloud-storage>=1.16.0
parmap
google-api-core>=1.11.0
google-api-python-client>=1.7.8
google-cloud-pubsub>=0.41.0

seemed to fix it.
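
For context, the TypeError above looks like skew in the opposite direction: google-cloud-bigquery 1.12 forwards a client_info argument that the Connection base class in google-cloud-core < 1.0.0 does not accept. A minimal sketch of the mismatch, with simplified stand-in classes rather than the actual library code:

class OldCoreConnection(object):
    # Stand-in for google.cloud._http.Connection in google-cloud-core < 1.0.0,
    # whose __init__ accepted only the client.
    def __init__(self, client):
        self.client = client

class NewBigQueryConnection(OldCoreConnection):
    # Stand-in for google.cloud.bigquery._http.Connection in bigquery >= 1.12.0,
    # which forwards client_info to the base class.
    def __init__(self, client, client_info=None):
        super(NewBigQueryConnection, self).__init__(client, client_info)

# On Python 3 this raises:
# TypeError: __init__() takes 2 positional arguments but 3 were given
NewBigQueryConnection(client=object())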

@tswast (Contributor) commented on May 20, 2019:

Does pinning all to latest work? That's what all our system tests use, so it should.

If starting from a fresh virtual env, pinning to latest should have the same effect. If using an existing environment, whether pip upgrades already-installed dependencies varies with the pip version. It's probably safest to pin everything.

@cgparkinson commented:
Yep - sorry I edited my comment to that effect. Pinning all to latest does work 👍

Thanks!

Labels: api: bigquery, autorelease: published, cla: yes, kokoro:force-run, packaging