
Can't import google.protobuf.timestamp_pb2 from App Engine #1436

Closed
barakcoh opened this issue Feb 1, 2016 · 77 comments

@barakcoh

barakcoh commented Feb 1, 2016

I'm getting ImportError: No module named protobuf on the line from google.protobuf import timestamp_pb2, because the google package resolves to the App Engine SDK's copy.

This is a well-known issue that's been around for years. Maybe you could bundle protobuf directly inside gcloud-python instead of pulling it in as an external dependency?

@dhermes
Contributor

dhermes commented Feb 1, 2016

Thanks for reporting and for upgrading!

I am curious about a few things:

  1. How do you vendor in gcloud-python for App Engine?
  2. Do you vendor in protobuf? (AFAIK it isn't part of the GAE runtime)
  3. Which services / gcloud-python subpackages do you use? (I'm curious which imports / import paths your code has been using.)
  4. Are you modifying the gcloud module object so that it acts like a namespace package? If not, that is the solution to your problem (a rough sketch of the idea follows below). I made an app that shows how to do this; the app is now out of date, but the file that handles protobuf imports is still valid.
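
A rough sketch of that fix-up in appengine_config.py (assuming a vendored folder named lib; illustrative only, not the exact file from that app):

import os

from google.appengine.ext import vendor

vendor.add('lib')  # folder populated with `pip install -t lib -r requirements.txt`

import google

# The App Engine SDK also owns a `google` package, so extend its search path
# to include the vendored copy; this lets `from google.protobuf import
# timestamp_pb2` resolve to the protobuf shipped in lib/.
google.__path__.append(os.path.join(os.path.dirname(__file__), 'lib', 'google'))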

@theacodes
Contributor

Are you modifying the gcloud module object so that it acts like a namespace package? If not, that is the solution to your problem. I made an app that shows how to do this; the app is now out of date, but the file that handles protobuf imports is still valid.

I don't think this should be necessary anymore; I'm interested to see if that's somehow changed.

@dhermes
Contributor

dhermes commented Feb 1, 2016

Me too. Thanks for chiming in, @jonparrott; I was going to ping you.

@theacodes
Contributor

I should really write an App Engine test case.

@dhermes
Contributor

dhermes commented Feb 9, 2016

@SaMnCo also reported this import error. It may be that:

  • we don't understand our requirements as well as we think we do,
  • pip doesn't upgrade the deps declared in setup.py the way I expect, or
  • it's a PYTHONPATH issue.

@dhermes
Contributor

dhermes commented Feb 11, 2016

@SaMnCo can you run pip show protobuf and pip show gcloud in the environment where this is failing, and then make sure the same versions are imported:

$ python -c 'import google.protobuf; print(google.protobuf.__version__)'
$ python -c 'import gcloud; print(gcloud.__version__)'

@kvdb

kvdb commented Feb 18, 2016

@dhermes I think I've got a similar problem as the reporter of the issue. Perhaps I can help by providing my answers to your questions:

How do you vendor in gcloud-python for App Engine?

In appengine_config.py:

from google.appengine.ext import vendor                                         
vendor.add('libs')                                                              
import google.protobuf; print(google.protobuf.__version__)                      
import gcloud; print(gcloud.__version__) 

Results in:

3.0.0b2
0.10.0

Do you vendor in protobuf? (AFAIK it isn't part of the GAE runtime)

It's in libs, so yes.

Which services / gcloud-python subpackages do you use? (I'm curious which imports / import paths your code has been using.)

  from gcloud import storage
File "/x/libs/gcloud/storage/__init__.py", line 43, in <module>
  from gcloud.storage.blob import Blob
File "/x/libs/gcloud/storage/blob.py", line 27, in <module>
  from gcloud._helpers import _rfc3339_to_datetime
File "/x/libs/gcloud/_helpers.py", line 26, in <module>
  from google.protobuf import timestamp_pb2
ImportError: No module named protobuf

@dhermes
Contributor

dhermes commented Feb 18, 2016

@kvdb Thanks a lot. I'm wondering now if it may be related to Darth-Vendor and namespace packages not being the best of friends. I need to dig in and reproduce. Can you confirm that's what you used to vendor in your package?

@theacodes
Contributor

Darth is part of the SDK (from google.appengine.ext import vendor), so that's definitely what's being used. It explicitly has support for namespace packages, so I'm also unsure what's going on.
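
For reference, vendor.add is roughly equivalent to registering the folder as a site directory, which is what makes .pth-backed namespace packages importable (a simplified sketch of the idea, not the SDK's actual code):

import os
import site

def add(folder):
    # Simplified idea behind google.appengine.ext.vendor.add: treat an
    # app-relative folder as a site directory so the packages inside it --
    # including namespace packages wired up through .pth files -- become importable.
    site.addsitedir(os.path.join(os.path.dirname(__file__), folder))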

@dhermes
Contributor

dhermes commented Feb 19, 2016

Maybe it's an issue with old versions of pip? The module certainly shows up:

$ mkdir lib
$ pip install -t lib gcloud
$ cd lib/google/protobuf/
$ ls -1 *pb2.py | egrep -v unittest
any_pb2.py
api_pb2.py
descriptor_pb2.py
duration_pb2.py
empty_pb2.py
field_mask_pb2.py
source_context_pb2.py
struct_pb2.py
timestamp_pb2.py
type_pb2.py
wrappers_pb2.py

Going to run a test app in dev_appserver to verify the import works.

@dhermes
Contributor

dhermes commented Feb 19, 2016

OK I just confirmed this works in dev_appserver as well.

@kvdb Have you tried clearing out lib and starting fresh? What version of pip are you using?

@dhermes changed the title from "AppEngine support broken in 0.9.0" to "Can't import google.protobuf.timestamp_pb2 from App Engine" on Feb 20, 2016
@kvdb

kvdb commented Feb 21, 2016

pip 8.0.2
Yes, I cleared my virtualenv/libs before reporting.

@dhermes
Contributor

dhermes commented Feb 21, 2016

Very strange! Does the timestamp_pb2.py file exist and only the import fails, or does the file never show up?

@ernestoalejo

I was having this error outside App Engine when using an old version of pip and not using the full upgrade command to install:

pip install --upgrade gcloud

I was using pip install -r requirements.txt instead of pip install --upgrade -r requirements.txt, and the timestamp_pb2.py file was not present.

@ernestoalejo

By "outside App Engine" I mean inside Docker, using the google/debian:wheezy image.

@dhermes
Contributor

dhermes commented Feb 22, 2016

Thanks for the report @ernestoalejo! @kvdb any thoughts?

@dhermes
Contributor

dhermes commented Mar 2, 2016

@kvdb Does the timestamp_pb2.py file exist and only the import fails, or does the file never show up?


I'm closing for now since I can't reproduce, and @ernestoalejo's answer indicates --upgrade resolves the issue. @kvdb I'm happy to re-open if it still comes up.

@dhermes
Contributor

dhermes commented Mar 18, 2016

@vsubramani can you give some info on your install:

from google import protobuf
print(protobuf.__version__)
print(protobuf.__path__)
print(protobuf.__path__[0])
import os
pb_files = sorted([fi for fi in os.listdir(protobuf.__path__[0])
                   if fi.endswith('.py')])
print('\n'.join(pb_files))

@jjangsangy

jjangsangy commented Apr 18, 2016

So I had the same issue: running from gcloud import storage inside a Docker container produced an error.

In [3]: from gcloud import storage
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-3-a2d587421817> in <module>()
----> 1 from gcloud import storage

/venv/lib/python2.7/site-packages/gcloud/storage/__init__.py in <module>()
     41
     42 from gcloud.storage.batch import Batch
---> 43 from gcloud.storage.blob import Blob
     44 from gcloud.storage.bucket import Bucket
     45 from gcloud.storage.client import Client

/venv/lib/python2.7/site-packages/gcloud/storage/blob.py in <module>()
     25 from six.moves.urllib.parse import quote
     26
---> 27 from gcloud._helpers import _rfc3339_to_datetime
     28 from gcloud.credentials import generate_signed_url
     29 from gcloud.exceptions import NotFound

/venv/lib/python2.7/site-packages/gcloud/_helpers.py in <module>()
     25 from threading import local as Local
     26
---> 27 from google.protobuf import timestamp_pb2
     28 import six
     29 from six.moves.http_client import HTTPConnection

ImportError: cannot import name timestamp_pb2

Running

>>> from google import protobuf
>>> print(protobuf.__version__)
>>> print(protobuf.__path__)
>>> print(protobuf.__path__[0])
>>> import os
>>> pb_files = sorted([fi for fi in os.listdir(protobuf.__path__[0])
                   if fi.endswith('.py')])
>>> print('\n'.join(pb_files))
__init__.py
descriptor.py
descriptor_database.py
descriptor_pb2.py
descriptor_pool.py
message.py
message_factory.py
proto_builder.py
reflection.py
service.py
service_reflection.py
symbol_database.py
text_encoding.py
text_format.py

This was fixed by adding this line to my Dockerfile:

RUN pip install --upgrade gcloud

It looks like you have to force the gcloud installation to upgrade. This was the output while building my Docker container:

Step 13 : RUN pip install --upgrade gcloud
 ---> Running in 4670a504231c
Requirement already up-to-date: gcloud in /venv/lib/python2.7/site-packages
Requirement already up-to-date: pyOpenSSL in /venv/lib/python2.7/site-packages (from gcloud)
Requirement already up-to-date: six in /venv/lib/python2.7/site-packages (from gcloud)
Requirement already up-to-date: httplib2>=0.9.1 in /venv/lib/python2.7/site-packages (from gcloud)
Collecting protobuf!=3.0.0.b2.post1,>=3.0.0b2 (from gcloud)
  Downloading protobuf-3.0.0b2.post2-py2-none-any.whl (331kB)
Collecting oauth2client>=2.0.1 (from gcloud)
  Downloading oauth2client-2.0.2.tar.gz (68kB)
Requirement already up-to-date: googleapis-common-protos in /venv/lib/python2.7/site-packages (from gcloud)
Requirement already up-to-date: cryptography>=1.3 in /venv/lib/python2.7/site-packages (from pyOpenSSL->gcloud)
Requirement already up-to-date: setuptools in /venv/lib/python2.7/site-packages (from protobuf!=3.0.0.b2.post1,>=3.0.0b2->gcloud)
Requirement already up-to-date: pyasn1>=0.1.7 in /venv/lib/python2.7/site-packages (from oauth2client>=2.0.1->gcloud)
Requirement already up-to-date: pyasn1-modules>=0.0.5 in /venv/lib/python2.7/site-packages (from oauth2client>=2.0.1->gcloud)
Collecting rsa>=3.1.4 (from oauth2client>=2.0.1->gcloud)
  Downloading rsa-3.4.2-py2.py3-none-any.whl (46kB)
Collecting enum34 (from cryptography>=1.3->pyOpenSSL->gcloud)
  Downloading enum34-1.1.3-py2.py3-none-any.whl (61kB)
Requirement already up-to-date: ipaddress in /venv/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->gcloud)
Requirement already up-to-date: idna>=2.0 in /venv/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->gcloud)
Requirement already up-to-date: cffi>=1.4.1 in /venv/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->gcloud)
Requirement already up-to-date: pycparser in /venv/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->gcloud)
Building wheels for collected packages: oauth2client
  Running setup.py bdist_wheel for oauth2client: started
  Running setup.py bdist_wheel for oauth2client: finished with status 'done'
  Stored in directory: /root/.cache/pip/wheels/3f/1b/4b/7d18513fd8ed10976cf4bb881bd0a2b549d4d22d245d0ff1a6
Successfully built oauth2client
Installing collected packages: protobuf, rsa, oauth2client, enum34
  Found existing installation: protobuf 3.0.0a3
    Uninstalling protobuf-3.0.0a3:
      Successfully uninstalled protobuf-3.0.0a3
  Found existing installation: rsa 3.3
    Uninstalling rsa-3.3:
      Successfully uninstalled rsa-3.3
  Found existing installation: oauth2client 1.5.2
    Uninstalling oauth2client-1.5.2:
      Successfully uninstalled oauth2client-1.5.2
  Found existing installation: enum34 1.1.2
    Uninstalling enum34-1.1.2:
      Successfully uninstalled enum34-1.1.2
Successfully installed enum34-1.1.3 oauth2client-2.0.2 protobuf-3.0.0b2.post2 rsa-3.4.2

@dhermes
Contributor

dhermes commented Apr 18, 2016

Thanks for the note. Unfortunately there's not much we can do about the way Python packaging works beyond declaring the minimum versions we require.

@kvdb

kvdb commented May 6, 2016

The problems are still present using the latest gcloud.
In setup.py, I see the protobuf dependency is defined as:

'protobuf >= 3.0.0b2, != 3.0.0.b2.post1'

I think I had the problematic version 3.0.0b2 cached, which is why it kept being used with gcloud.
After upgrading protobuf to 3.0.0b2.post2, it works.
Therefore, I'd recommend defining the dependency in setup.py as follows:

'protobuf >= 3.0.0b2.post2'
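
In other words, the suggestion is to raise the floor of the protobuf requirement so a cached, older pre-release can no longer satisfy it. An illustrative install_requires excerpt (a sketch, not gcloud's actual setup.py):

from setuptools import setup, find_packages

setup(
    name='gcloud',
    packages=find_packages(),
    install_requires=[
        # Require at least the .post2 release so the older pre-releases
        # discussed above can never satisfy the dependency.
        'protobuf >= 3.0.0b2.post2',
    ],
)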

@dhermes
Contributor

dhermes commented May 6, 2016

What do you mean by a "problematic version 3.0.0b2"? I just did a fresh install and timestamp_pb2 is present:

$ virtualenv venv
$ source venv/bin/activate
(venv) $ pip install 'protobuf == 3.0.0b2.post2'
(venv) $ ls venv/lib/python2.7/site-packages/google/protobuf/timestamp* -1
venv/lib/python2.7/site-packages/google/protobuf/timestamp_pb2.py
venv/lib/python2.7/site-packages/google/protobuf/timestamp_pb2.pyc

@dhermes
Contributor

dhermes commented May 6, 2016

Ahhh hold on:

(venv) $ pip show protobuf
---
Metadata-Version: 2.0
Name: protobuf
Version: 3.0.0b2.post2
...

@dgaedcke

ok...the pain continues....
I tried: pip install -r requirements.txt -t lib
and I get:
Exception:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/pip/basecommand.py", line 209, in main
    status = self.run(options, args)
  File "/usr/local/lib/python2.7/site-packages/pip/commands/install.py", line 317, in run
    prefix=options.prefix_path,
  File "/usr/local/lib/python2.7/site-packages/pip/req/req_set.py", line 732, in install
    **kwargs
  File "/usr/local/lib/python2.7/site-packages/pip/req/req_install.py", line 835, in install
    self.move_wheel_files(self.source_dir, root=root, prefix=prefix)
  File "/usr/local/lib/python2.7/site-packages/pip/req/req_install.py", line 1030, in move_wheel_files
    isolated=self.isolated,
  File "/usr/local/lib/python2.7/site-packages/pip/wheel.py", line 247, in move_wheel_files
    prefix=prefix,
  File "/usr/local/lib/python2.7/site-packages/pip/locations.py", line 153, in distutils_scheme
    i.finalize_options()
  File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/install.py", line 264, in finalize_options
    "must supply either home or prefix/exec-prefix -- not both"
DistutilsOptionError: must supply either home or prefix/exec-prefix -- not both

such a drag to go round & round with this as many times as I have.....

@dgaedcke

Here is another guy promoting and publishing the symlink approach; his are the newest and most consolidated instructions I've found.
Meanwhile, the "vendor" link on Google's own page seems to be broken.

@dgaedcke

User error (at least in part): my "venv" folder was part of my skip_files config, so I guess it would have been deployed to the production VM if that line was not in there. But it seems quite weird and inefficient to be uploading the WHOLE venv directory rather than just the site-packages subfolder.
I think I'm going to just sit tight until someone points me to the valid official Google instructions for local dev setup.

@theacodes
Contributor

Okay, lots of stuff to respond to here.

my "venv" folder was part of my skip_files config

I suspected as much; glad you figured it out.

but it seem quite weird and inefficient to be loading the WHOLE venv directory rather than just the site-packages subfolder

This is one of many reasons we don't recommend using a virtualenv for vendoring. You can set up skip_files to only upload the site-packages folder.

I think I'm going to just sit tight until someone points me to the valid official google instructions for local dev setup....

The instructions are here. The vendoring link is broken, but it is supposed to link to this heading on the same page.

In summary:

  1. Add vendor.add('lib') to appengine_config.py.
  2. Use pip install -t lib -r requirements.txt.

These are the steps used in all of our samples for App Engine standard that require third-party packages.

From what I understand, this is where things deviated for you:

  1. You initially used a virtualenv and a symlinked lib folder, neither of which are documented or really recommended by us, for various reasons.
  2. You had your virtualenv ignored in app.yaml.

From what I can gather, you chose this route because pip install -t failed for you. The error you pasted above seems to implicate this known issue with homebrew Python. They have a workaround in their documentation.

It seems my actions here to ameliorate this are:

  1. Fix the vendor link in the docs.
  2. Add a note about homebrew Python not working with pip install -t and link to their workaround.

I feel your pain here; I've personally been working on improving this situation. Please let me know if there's anything more I can do at this time to make this easier.

@dgaedcke

Hey Jon..... thanks so much for your diligent responses & help!

A few clarifications here:
You link to this Homebrew/Python post as the "workaround" to the pip -t lib problem, but it's not clear to me what that post is suggesting, because it's related to --user, not -t lib. I'm sure someone with more experience would see the connection instantly, but how to apply this workaround to my issue is lost on me.

Even though you say I could "set up skip_files to only upload the site-packages folder", you are specifically recommending AGAINST that approach, correct?
But you still think it's OK to have a venv in parallel to the -t lib/ approach, used only for automated testing purposes? Or is it better to forego that also and somehow force the tests to use the GAE Python and its respective paths? Essentially running all tests in the exact same environment as the local dev server? If so, can you link to a tutorial on how to do that?

@theacodes
Contributor

pip install -t is just another form of --user internally. Creating the ~/.pydistutils.cfg as described will fix the issue. You can remove it after vendoring is done.

you are specifically recommending AGAINST that approach......correct?

Yes, I do not recommend using virtualenvs for vendoring on App Engine.

But you still think it's ok to have a VENV in parallel to the -t lib/ approach to use only for automated testing purposes?

I'm not sure what you mean here. I don't think I've recommended that approach elsewhere.

Or is it better to forego that also and somehow force the tests to use the GAE Python and it's respective paths??

Our documentation shows how to use a simple test runner to do this here. You can also use NoseGAE. If you're extra curious, for testing our own samples we use py.test and a custom set of py.test hooks, as well as nox with a complicated noxfile. You probably don't need all that.
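
For the curious, that simple test runner boils down to a pattern like this (a sketch along the lines of the linked docs; the SDK path and test directory are placeholders you'd adjust):

import sys
import unittest


def main(sdk_path, test_path):
    # Make the App Engine SDK importable, then let it fix up sys.path so the
    # SDK's bundled libraries are available to the tests.
    sys.path.insert(0, sdk_path)
    import dev_appserver
    dev_appserver.fix_sys_path()
    suite = unittest.TestLoader().discover(test_path)
    unittest.TextTestRunner(verbosity=2).run(suite)


if __name__ == '__main__':
    main('/path/to/google_appengine', 'tests')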

@dgaedcke

ok...that makes sense....
So after creating an empty "prefix" entry in the ~/.pydistutils.cfg file, I ran this:
pip install --upgrade --requirement requirements.txt --target lib/
It put both the raw .py files and the matching "xxx.dist-info" directory for each dependency in the lib folder. That part is good and I should be able to delete my venv now. And I guess I could delete those dist-info directories too to reduce my upload time?

Sadly, it also created _cffi_backend.so in my lib directory and I know that's not allowed, which is weird because I've not seen App Engine complain when I've been pushing up there.
Anyway, I've got to figure out which of my dependencies is pulling that one in.

@theacodes
Contributor

...and I guess I could delete those dist-info directories too to reduce my upload time?

If you really want to, but they're tiny.

Sadly, it also created _cffi_backend.so in my lib directory and I know that's not allowed, which is weird because I've not seen App Engine complain when I've been pushing up there.

App Engine will just ignore it, so don't worry about it.

@dgaedcke

I must really be missing something because your "don't worry" comment totally worries me ;-)
How can I be certain that dependency is not in use on my localhost, and thus causing something to break when it DOES NOT get pushed to the server???

@theacodes
Contributor

Our import hooks prevent loading any .so module that isn't explicitly whitelisted.

@dgaedcke

Yes, I'm totally clear that it won't get loaded because it won't get pushed by the update; that's not my concern. My concern is that the .so IS THERE in my lib/ directory when testing on localhost, and it WON'T be there on the server. So how can I trust my local testing to be definitive? In other words, what if some other Python package depends upon it?

@theacodes
Contributor

I should've specified: the import hooks I mentioned are part of dev_appserver.

@dgaedcke

oh....all the worry evaporated ;-) Thank you!

@dguaraglia

dguaraglia commented Jun 24, 2016

For what it's worth, if anyone else is still having this issue after reading the thread: I solved it in my case by making sure the system-wide Python path was clean of any google.* and gcloud.* related modules, and then re-installing in my virtualenv.

To clarify, the output of python -c "import sys; print sys.path" looked like this (formatted):

['/Users/dguaraglia/Projects/project/.env/lib/python2.7/site-packages',
 '',
 '/Users/dguaraglia/.homebrew/bin',
 '/Library/Python/2.7/site-packages',
 '/Users/dguaraglia/.homebrew/lib/python2.7/site-packages/gax_google_pubsub_v1-0.7.9-py2.7.egg',
 '/Users/dguaraglia/.homebrew/lib/python2.7/site-packages/google_gax-0.12.0-py2.7.egg',
 '/Users/dguaraglia/.homebrew/lib/python2.7/site-packages/_pdbpp_path_hack',
 '/Users/dguaraglia/.homebrew/lib/python2.7/site-packages/six-1.10.0-py2.7.egg',
  ...
]

After I manually removed the gax_google_pubsub and google_gax packages (simply rm'd the files) and re-installed gcloud in my virtualenv, things went back to working.
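
If you want to check the same thing on your machine, a quick diagnostic along these lines shows every directory contributing to the google namespace and whether it actually contains protobuf (illustrative snippet):

import os
import google

# Each entry in google.__path__ is a directory contributing to the namespace;
# in the broken setups described above, the entry that actually ships
# google/protobuf was missing or shadowed.
for path in google.__path__:
    has_protobuf = os.path.isdir(os.path.join(path, 'protobuf'))
    print('%s -> %s' % (path, 'protobuf present' if has_protobuf else 'no protobuf'))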

@hohaichi

hohaichi commented Jul 1, 2016

Thanks, dguaraglia. I got over the issue by renaming all the gax eggs on sys.path, after making sure that timestamp_pb2.py exists in one of the paths on sys.path:

/usr/local/lib/python2.7/dist-packages/google_gax-0.12.1-py2.7.egg
/usr/local/lib/python2.7/dist-packages/gax_google_pubsub_v1-0.7.10-py2.7.egg
/usr/local/lib/python2.7/dist-packages/gax_google_logging_v2-0.7.10-py2.7.egg

@yanik-ai

yanik-ai commented Aug 20, 2016

I have the same issue.
When running
import google.protobuf; print(google.protobuf.__version__)  # raises:
ImportError: No module named protobuf

import gcloud; print(gcloud.__version__)  # returns 0.18.1

UPD: just installing protobuf does not help 🎱
(pip install 'protobuf == 3.0.0b2' -t ./sitepackages/ --upgrade)

Traceback:

Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "/..../django_gcloud_pubsub/utils.py", line 4, in <module>
    from gcloud import pubsub
  File "/.... /sitepackages/gcloud/pubsub/__init__.py", line 26, in <module>
    from gcloud.pubsub.client import Client
  File "/ .... sitepackages/gcloud/pubsub/client.py", line 19, in <module>
    from gcloud.client import JSONClient
  File "/..../sitepackages/gcloud/client.py", line 20, in <module>
    from gcloud._helpers import _determine_default_project
  File "/..../sitepackages/gcloud/_helpers.py", line 28, in <module>
    from google.protobuf import timestamp_pb2
ImportError: No module named protobuf

@tseaver
Contributor

tseaver commented Aug 23, 2016

@yanikkoval Some version of some package has stomped on the google namespace. I'm pretty sure that if you had all the packages up-to-date, you'd be fine. Can you report the output of pip freeze here?

@dhermes
Contributor

dhermes commented Aug 23, 2016

@tseaver It's App Engine weirdness, actually. I try to stay away and let @jonparrott weigh in on such issues.

@theacodes
Contributor

@yanikkoval how are you installing packages? What's in your appengine_config.py?

@didiercabrera

didiercabrera commented Sep 3, 2016

For anyone still having issues, this is how I solved it:

protocolbuffers/protobuf#1153

@yanik-ai

yanik-ai commented Sep 5, 2016

@tseaver, @jonparrott installing it with pip puts the package in the wrong directory, and App Engine expects it in its own directory, google_appengine/google.

@theacodes
Contributor

@yanikkoval can you give us more details? What exact pip command are you running? What's in your appengine_config.py?

@ensonic

ensonic commented Feb 21, 2017

It is also happening on my local machine running Ubuntu 14.04 LTS. From pip freeze:

gapic-google-cloud-datastore-v1==0.14.1
gapic-google-cloud-logging-v2==0.90.1
gapic-google-cloud-pubsub-v1==0.14.1
gcloud==0.18.3
google-cloud==0.22.0
google-cloud-bigquery==0.22.1
google-cloud-bigtable==0.22.0
google-cloud-core==0.22.1
google-cloud-datastore==0.22.1
google-cloud-dns==0.22.0
google-cloud-error-reporting==0.22.0
google-cloud-happybase==0.22.0
google-cloud-language==0.22.2
google-cloud-logging==0.22.0
google-cloud-monitoring==0.22.0
google-cloud-pubsub==0.22.0
google-cloud-resource-manager==0.22.0
google-cloud-runtimeconfig==0.22.0
google-cloud-storage==0.22.0
google-cloud-translate==0.22.0
google-cloud-vision==0.22.0
grpc-google-cloud-datastore-v1==0.14.0
grpc-google-cloud-logging-v2==0.90.0
grpc-google-cloud-pubsub-v1==0.14.0
Traceback (most recent call last):
  File "./src/main.py", line 9, in <module>
    from google.cloud import logging, pubsub
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/__init__.py", line 18, in <module>
    from google.cloud.logging.client import Client
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/client.py", line 32, in <module>
    from google.cloud.client import JSONClient
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/client.py", line 21, in <module>
    from google.cloud._helpers import _determine_default_project
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/_helpers.py", line 30, in <module>
    from google.protobuf import timestamp_pb2
ImportError: cannot import name timestamp_pb2

@dhermes
Contributor

dhermes commented Feb 21, 2017

@ensonic Run import google; print(google.__path__) to see what the problem is. The usual suspect here is that you have a "bad" package breaking the google namespace. (The fact that you have the very old gcloud co-tenant with google-cloud isn't a good sign.)

@ensonic

ensonic commented Feb 21, 2017

@dhermes

python -c "import google; print(google.__path__)"
['/usr/local/lib/python2.7/dist-packages/google']

Which ones should I get rid of: the grpc-google-* packages or the gapic-google-* ones?

@dhermes
Contributor

dhermes commented Feb 21, 2017

Neither needs to be removed, AFAIK. From there you'd want to poke around in /usr/local/lib/python2.7/dist-packages/google/__init__.py and see if it is a namespace package or otherwise (you also need to look around for possibly bad .nspkg files in your site-packages).
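
For reference, a namespace-style google/__init__.py typically contains nothing but a declaration along these lines (an illustrative sketch; the exact contents on a given machine may differ):

# Typical contents of a namespace package's __init__.py:
try:
    import pkg_resources
    pkg_resources.declare_namespace(__name__)
except ImportError:
    import pkgutil
    __path__ = pkgutil.extend_path(__path__, __name__)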
