With (I guess) update to scipy 1.4.0 test_dynamic_module and test_load_dynamic_module_in_grandchild_process break #316

Closed
mcepl opened this issue Dec 19, 2019 · 3 comments · Fixed by #325

mcepl commented Dec 19, 2019

When building packages for openSUSE I observe this:

  • With Python 3.7.3, scipy 1.3.1, and cloudpickle 1.2.2, the test suite passes without any problems

  • With the same Python and the same cloudpickle but scipy 1.4.0, CloudPickleTest::test_dynamic_module, CloudPickleTest::test_load_dynamic_module_in_grandchild_process, Protocol2CloudPickleTest::test_dynamic_module, and Protocol2CloudPickleTest::test_load_dynamic_module_in_grandchild_process fail:

[   40s] =================================== FAILURES ===================================
[   40s] _____________________ CloudPickleTest.test_dynamic_module ______________________
[   40s] 
[   40s] self = <tests.cloudpickle_test.CloudPickleTest testMethod=test_dynamic_module>
[   40s] 
[   40s]     def test_dynamic_module(self):
[   40s]         mod = types.ModuleType('mod')
[   40s]         code = '''
[   40s]         x = 1
[   40s]         def f(y):
[   40s]             return x + y
[   40s]     
[   40s]         class Foo:
[   40s]             def method(self, x):
[   40s]                 return f(x)
[   40s]         '''
[   40s]         exec(textwrap.dedent(code), mod.__dict__)
[   40s] >       mod2 = pickle_depickle(mod, protocol=self.protocol)
[   40s] 
[   40s] tests/cloudpickle_test.py:474: 
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] tests/cloudpickle_test.py:67: in pickle_depickle
[   40s]     return pickle.loads(cloudpickle.dumps(obj, protocol=protocol))
[   40s] cloudpickle/cloudpickle.py:1125: in dumps
[   40s]     cp.dump(obj)
[   40s] cloudpickle/cloudpickle.py:482: in dump
[   40s]     return Pickler.dump(self, obj)
[   40s] /usr/lib64/python3.7/pickle.py:437: in dump
[   40s]     self.save(obj)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] cloudpickle/cloudpickle.py:507: in save_module
[   40s]     obj=obj)
[   40s] /usr/lib64/python3.7/pickle.py:638: in save_reduce
[   40s]     save(args)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:771: in save_tuple
[   40s]     save(element)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] 
[   40s] self = <cloudpickle.cloudpickle.CloudPickler object at 0x7f697bee6a20>
[   40s] obj = <capsule object NULL at 0x7f697cf9d660>, save_persistent_id = True
[   40s] 
[   40s]     def save(self, obj, save_persistent_id=True):
[   40s]         self.framer.commit_frame()
[   40s]     
[   40s]         # Check for persistent id (defined by a subclass)
[   40s]         pid = self.persistent_id(obj)
[   40s]         if pid is not None and save_persistent_id:
[   40s]             self.save_pers(pid)
[   40s]             return
[   40s]     
[   40s]         # Check the memo
[   40s]         x = self.memo.get(id(obj))
[   40s]         if x is not None:
[   40s]             self.write(self.get(x[0]))
[   40s]             return
[   40s]     
[   40s]         # Check the type dispatch table
[   40s]         t = type(obj)
[   40s]         f = self.dispatch.get(t)
[   40s]         if f is not None:
[   40s]             f(self, obj) # Call unbound method with explicit self
[   40s]             return
[   40s]     
[   40s]         # Check private dispatch table if any, or else copyreg.dispatch_table
[   40s]         reduce = getattr(self, 'dispatch_table', dispatch_table).get(t)
[   40s]         if reduce is not None:
[   40s]             rv = reduce(obj)
[   40s]         else:
[   40s]             # Check for a class with a custom metaclass; treat as regular class
[   40s]             try:
[   40s]                 issc = issubclass(t, type)
[   40s]             except TypeError: # t is not a class (old Boost; see SF #502085)
[   40s]                 issc = False
[   40s]             if issc:
[   40s]                 self.save_global(obj)
[   40s]                 return
[   40s]     
[   40s]             # Check for a __reduce_ex__ method, fall back to __reduce__
[   40s]             reduce = getattr(obj, "__reduce_ex__", None)
[   40s]             if reduce is not None:
[   40s] >               rv = reduce(self.proto)
[   40s] E               TypeError: can't pickle PyCapsule objects
[   40s] 
[   40s] /usr/lib64/python3.7/pickle.py:524: TypeError
[   40s] ________ CloudPickleTest.test_load_dynamic_module_in_grandchild_process ________
[   40s] 
[   40s] self = <tests.cloudpickle_test.CloudPickleTest testMethod=test_load_dynamic_module_in_grandchild_process>
[   40s] 
[   40s]     def test_load_dynamic_module_in_grandchild_process(self):
[   40s]         # Make sure that when loaded, a dynamic module preserves its dynamic
[   40s]         # property. Otherwise, this will lead to an ImportError if pickled in
[   40s]         # the child process and reloaded in another one.
[   40s]     
[   40s]         # We create a new dynamic module
[   40s]         mod = types.ModuleType('mod')
[   40s]         code = '''
[   40s]         x = 1
[   40s]         '''
[   40s]         exec(textwrap.dedent(code), mod.__dict__)
[   40s]     
[   40s]         # This script will be ran in a separate child process. It will import
[   40s]         # the pickled dynamic module, and then re-pickle it under a new name.
[   40s]         # Finally, it will create a child process that will load the re-pickled
[   40s]         # dynamic module.
[   40s]         parent_process_module_file = os.path.join(
[   40s]             self.tmpdir, 'dynamic_module_from_parent_process.pkl')
[   40s]         child_process_module_file = os.path.join(
[   40s]             self.tmpdir, 'dynamic_module_from_child_process.pkl')
[   40s]         child_process_script = '''
[   40s]             import pickle
[   40s]             import textwrap
[   40s]     
[   40s]             import cloudpickle
[   40s]             from testutils import assert_run_python_script
[   40s]     
[   40s]     
[   40s]             child_of_child_process_script = {child_of_child_process_script}
[   40s]     
[   40s]             with open('{parent_process_module_file}', 'rb') as f:
[   40s]                 mod = pickle.load(f)
[   40s]     
[   40s]             with open('{child_process_module_file}', 'wb') as f:
[   40s]                 cloudpickle.dump(mod, f, protocol={protocol})
[   40s]     
[   40s]             assert_run_python_script(textwrap.dedent(child_of_child_process_script))
[   40s]             '''
[   40s]     
[   40s]         # The script ran by the process created by the child process
[   40s]         child_of_child_process_script = """ '''
[   40s]                 import pickle
[   40s]                 with open('{child_process_module_file}','rb') as fid:
[   40s]                     mod = pickle.load(fid)
[   40s]                 ''' """
[   40s]     
[   40s]         # Filling the two scripts with the pickled modules filepaths and,
[   40s]         # for the first child process, the script to be executed by its
[   40s]         # own child process.
[   40s]         child_of_child_process_script = child_of_child_process_script.format(
[   40s]                 child_process_module_file=child_process_module_file)
[   40s]     
[   40s]         child_process_script = child_process_script.format(
[   40s]             parent_process_module_file=_escape(parent_process_module_file),
[   40s]             child_process_module_file=_escape(child_process_module_file),
[   40s]             child_of_child_process_script=_escape(child_of_child_process_script),
[   40s]             protocol=self.protocol)
[   40s]     
[   40s]         try:
[   40s]             with open(parent_process_module_file, 'wb') as fid:
[   40s] >               cloudpickle.dump(mod, fid, protocol=self.protocol)
[   40s] 
[   40s] tests/cloudpickle_test.py:591: 
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] cloudpickle/cloudpickle.py:1109: in dump
[   40s]     CloudPickler(file, protocol=protocol).dump(obj)
[   40s] cloudpickle/cloudpickle.py:482: in dump
[   40s]     return Pickler.dump(self, obj)
[   40s] /usr/lib64/python3.7/pickle.py:437: in dump
[   40s]     self.save(obj)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] cloudpickle/cloudpickle.py:507: in save_module
[   40s]     obj=obj)
[   40s] /usr/lib64/python3.7/pickle.py:638: in save_reduce
[   40s]     save(args)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:771: in save_tuple
[   40s]     save(element)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] 
[   40s] self = <cloudpickle.cloudpickle.CloudPickler object at 0x7f697ba60550>
[   40s] obj = <capsule object NULL at 0x7f697cf9d660>, save_persistent_id = True
[   40s] 
[   40s]     def save(self, obj, save_persistent_id=True):
[   40s]         self.framer.commit_frame()
[   40s]     
[   40s]         # Check for persistent id (defined by a subclass)
[   40s]         pid = self.persistent_id(obj)
[   40s]         if pid is not None and save_persistent_id:
[   40s]             self.save_pers(pid)
[   40s]             return
[   40s]     
[   40s]         # Check the memo
[   40s]         x = self.memo.get(id(obj))
[   40s]         if x is not None:
[   40s]             self.write(self.get(x[0]))
[   40s]             return
[   40s]     
[   40s]         # Check the type dispatch table
[   40s]         t = type(obj)
[   40s]         f = self.dispatch.get(t)
[   40s]         if f is not None:
[   40s]             f(self, obj) # Call unbound method with explicit self
[   40s]             return
[   40s]     
[   40s]         # Check private dispatch table if any, or else copyreg.dispatch_table
[   40s]         reduce = getattr(self, 'dispatch_table', dispatch_table).get(t)
[   40s]         if reduce is not None:
[   40s]             rv = reduce(obj)
[   40s]         else:
[   40s]             # Check for a class with a custom metaclass; treat as regular class
[   40s]             try:
[   40s]                 issc = issubclass(t, type)
[   40s]             except TypeError: # t is not a class (old Boost; see SF #502085)
[   40s]                 issc = False
[   40s]             if issc:
[   40s]                 self.save_global(obj)
[   40s]                 return
[   40s]     
[   40s]             # Check for a __reduce_ex__ method, fall back to __reduce__
[   40s]             reduce = getattr(obj, "__reduce_ex__", None)
[   40s]             if reduce is not None:
[   40s] >               rv = reduce(self.proto)
[   40s] E               TypeError: can't pickle PyCapsule objects
[   40s] 
[   40s] /usr/lib64/python3.7/pickle.py:524: TypeError
[   40s] _________________ Protocol2CloudPickleTest.test_dynamic_module _________________
[   40s] 
[   40s] self = <tests.cloudpickle_test.Protocol2CloudPickleTest testMethod=test_dynamic_module>
[   40s] 
[   40s]     def test_dynamic_module(self):
[   40s]         mod = types.ModuleType('mod')
[   40s]         code = '''
[   40s]         x = 1
[   40s]         def f(y):
[   40s]             return x + y
[   40s]     
[   40s]         class Foo:
[   40s]             def method(self, x):
[   40s]                 return f(x)
[   40s]         '''
[   40s]         exec(textwrap.dedent(code), mod.__dict__)
[   40s] >       mod2 = pickle_depickle(mod, protocol=self.protocol)
[   40s] 
[   40s] tests/cloudpickle_test.py:474: 
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] tests/cloudpickle_test.py:67: in pickle_depickle
[   40s]     return pickle.loads(cloudpickle.dumps(obj, protocol=protocol))
[   40s] cloudpickle/cloudpickle.py:1125: in dumps
[   40s]     cp.dump(obj)
[   40s] cloudpickle/cloudpickle.py:482: in dump
[   40s]     return Pickler.dump(self, obj)
[   40s] /usr/lib64/python3.7/pickle.py:437: in dump
[   40s]     self.save(obj)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] cloudpickle/cloudpickle.py:507: in save_module
[   40s]     obj=obj)
[   40s] /usr/lib64/python3.7/pickle.py:638: in save_reduce
[   40s]     save(args)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:771: in save_tuple
[   40s]     save(element)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] 
[   40s] self = <cloudpickle.cloudpickle.CloudPickler object at 0x7f697be7ef98>
[   40s] obj = <capsule object NULL at 0x7f697cf9d660>, save_persistent_id = True
[   40s] 
[   40s]     def save(self, obj, save_persistent_id=True):
[   40s]         self.framer.commit_frame()
[   40s]     
[   40s]         # Check for persistent id (defined by a subclass)
[   40s]         pid = self.persistent_id(obj)
[   40s]         if pid is not None and save_persistent_id:
[   40s]             self.save_pers(pid)
[   40s]             return
[   40s]     
[   40s]         # Check the memo
[   40s]         x = self.memo.get(id(obj))
[   40s]         if x is not None:
[   40s]             self.write(self.get(x[0]))
[   40s]             return
[   40s]     
[   40s]         # Check the type dispatch table
[   40s]         t = type(obj)
[   40s]         f = self.dispatch.get(t)
[   40s]         if f is not None:
[   40s]             f(self, obj) # Call unbound method with explicit self
[   40s]             return
[   40s]     
[   40s]         # Check private dispatch table if any, or else copyreg.dispatch_table
[   40s]         reduce = getattr(self, 'dispatch_table', dispatch_table).get(t)
[   40s]         if reduce is not None:
[   40s]             rv = reduce(obj)
[   40s]         else:
[   40s]             # Check for a class with a custom metaclass; treat as regular class
[   40s]             try:
[   40s]                 issc = issubclass(t, type)
[   40s]             except TypeError: # t is not a class (old Boost; see SF #502085)
[   40s]                 issc = False
[   40s]             if issc:
[   40s]                 self.save_global(obj)
[   40s]                 return
[   40s]     
[   40s]             # Check for a __reduce_ex__ method, fall back to __reduce__
[   40s]             reduce = getattr(obj, "__reduce_ex__", None)
[   40s]             if reduce is not None:
[   40s] >               rv = reduce(self.proto)
[   40s] E               TypeError: can't pickle PyCapsule objects
[   40s] 
[   40s] /usr/lib64/python3.7/pickle.py:524: TypeError
[   40s] ___ Protocol2CloudPickleTest.test_load_dynamic_module_in_grandchild_process ____
[   40s] 
[   40s] self = <tests.cloudpickle_test.Protocol2CloudPickleTest testMethod=test_load_dynamic_module_in_grandchild_process>
[   40s] 
[   40s]     def test_load_dynamic_module_in_grandchild_process(self):
[   40s]         # Make sure that when loaded, a dynamic module preserves its dynamic
[   40s]         # property. Otherwise, this will lead to an ImportError if pickled in
[   40s]         # the child process and reloaded in another one.
[   40s]     
[   40s]         # We create a new dynamic module
[   40s]         mod = types.ModuleType('mod')
[   40s]         code = '''
[   40s]         x = 1
[   40s]         '''
[   40s]         exec(textwrap.dedent(code), mod.__dict__)
[   40s]     
[   40s]         # This script will be ran in a separate child process. It will import
[   40s]         # the pickled dynamic module, and then re-pickle it under a new name.
[   40s]         # Finally, it will create a child process that will load the re-pickled
[   40s]         # dynamic module.
[   40s]         parent_process_module_file = os.path.join(
[   40s]             self.tmpdir, 'dynamic_module_from_parent_process.pkl')
[   40s]         child_process_module_file = os.path.join(
[   40s]             self.tmpdir, 'dynamic_module_from_child_process.pkl')
[   40s]         child_process_script = '''
[   40s]             import pickle
[   40s]             import textwrap
[   40s]     
[   40s]             import cloudpickle
[   40s]             from testutils import assert_run_python_script
[   40s]     
[   40s]     
[   40s]             child_of_child_process_script = {child_of_child_process_script}
[   40s]     
[   40s]             with open('{parent_process_module_file}', 'rb') as f:
[   40s]                 mod = pickle.load(f)
[   40s]     
[   40s]             with open('{child_process_module_file}', 'wb') as f:
[   40s]                 cloudpickle.dump(mod, f, protocol={protocol})
[   40s]     
[   40s]             assert_run_python_script(textwrap.dedent(child_of_child_process_script))
[   40s]             '''
[   40s]     
[   40s]         # The script ran by the process created by the child process
[   40s]         child_of_child_process_script = """ '''
[   40s]                 import pickle
[   40s]                 with open('{child_process_module_file}','rb') as fid:
[   40s]                     mod = pickle.load(fid)
[   40s]                 ''' """
[   40s]     
[   40s]         # Filling the two scripts with the pickled modules filepaths and,
[   40s]         # for the first child process, the script to be executed by its
[   40s]         # own child process.
[   40s]         child_of_child_process_script = child_of_child_process_script.format(
[   40s]                 child_process_module_file=child_process_module_file)
[   40s]     
[   40s]         child_process_script = child_process_script.format(
[   40s]             parent_process_module_file=_escape(parent_process_module_file),
[   40s]             child_process_module_file=_escape(child_process_module_file),
[   40s]             child_of_child_process_script=_escape(child_of_child_process_script),
[   40s]             protocol=self.protocol)
[   40s]     
[   40s]         try:
[   40s]             with open(parent_process_module_file, 'wb') as fid:
[   40s] >               cloudpickle.dump(mod, fid, protocol=self.protocol)
[   40s] 
[   40s] tests/cloudpickle_test.py:591: 
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] cloudpickle/cloudpickle.py:1109: in dump
[   40s]     CloudPickler(file, protocol=protocol).dump(obj)
[   40s] cloudpickle/cloudpickle.py:482: in dump
[   40s]     return Pickler.dump(self, obj)
[   40s] /usr/lib64/python3.7/pickle.py:437: in dump
[   40s]     self.save(obj)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] cloudpickle/cloudpickle.py:507: in save_module
[   40s]     obj=obj)
[   40s] /usr/lib64/python3.7/pickle.py:638: in save_reduce
[   40s]     save(args)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:771: in save_tuple
[   40s]     save(element)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] /usr/lib64/python3.7/pickle.py:504: in save
[   40s]     f(self, obj) # Call unbound method with explicit self
[   40s] /usr/lib64/python3.7/pickle.py:856: in save_dict
[   40s]     self._batch_setitems(obj.items())
[   40s] /usr/lib64/python3.7/pickle.py:882: in _batch_setitems
[   40s]     save(v)
[   40s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   40s] 
[   40s] self = <cloudpickle.cloudpickle.CloudPickler object at 0x7f697ba60978>
[   40s] obj = <capsule object NULL at 0x7f697cf9d660>, save_persistent_id = True
[   40s] 
[   40s]     def save(self, obj, save_persistent_id=True):
[   40s]         self.framer.commit_frame()
[   40s]     
[   40s]         # Check for persistent id (defined by a subclass)
[   40s]         pid = self.persistent_id(obj)
[   40s]         if pid is not None and save_persistent_id:
[   40s]             self.save_pers(pid)
[   40s]             return
[   40s]     
[   40s]         # Check the memo
[   40s]         x = self.memo.get(id(obj))
[   40s]         if x is not None:
[   40s]             self.write(self.get(x[0]))
[   40s]             return
[   40s]     
[   40s]         # Check the type dispatch table
[   40s]         t = type(obj)
[   40s]         f = self.dispatch.get(t)
[   40s]         if f is not None:
[   40s]             f(self, obj) # Call unbound method with explicit self
[   40s]             return
[   40s]     
[   40s]         # Check private dispatch table if any, or else copyreg.dispatch_table
[   40s]         reduce = getattr(self, 'dispatch_table', dispatch_table).get(t)
[   40s]         if reduce is not None:
[   40s]             rv = reduce(obj)
[   40s]         else:
[   40s]             # Check for a class with a custom metaclass; treat as regular class
[   40s]             try:
[   40s]                 issc = issubclass(t, type)
[   40s]             except TypeError: # t is not a class (old Boost; see SF #502085)
[   40s]                 issc = False
[   40s]             if issc:
[   40s]                 self.save_global(obj)
[   40s]                 return
[   40s]     
[   40s]             # Check for a __reduce_ex__ method, fall back to __reduce__
[   40s]             reduce = getattr(obj, "__reduce_ex__", None)
[   40s]             if reduce is not None:
[   40s] >               rv = reduce(self.proto)
[   40s] E               TypeError: can't pickle PyCapsule objects
[   40s] 
[   40s] /usr/lib64/python3.7/pickle.py:524: TypeError
[   40s] =================== 4 failed, 174 passed, 7 skipped in 8.99s ===================

Full build log is attached as well.

mcepl changed the title from "With (I guess) update to scipy 1.4.0 CloudPickleTest.test_dynamic_module breaks" to "With (I guess) update to scipy 1.4.0 test_dynamic_module and test_load_dynamic_module_in_grandchild_process break" on Dec 19, 2019

ogrisel commented Dec 20, 2019

Indeed, I can also reproduce this with scipy 1.4.1, while it works with scipy 1.3.1. Thanks for the report.


ogrisel commented Jan 24, 2020

For the record, the PyCapsule instance that causes the problem is named __pybind11_internals_v3_gcc_libstdcpp_cxxabi1002__ in the __builtins__ dict.

Not sure why it's there or what in scipy adds it to the __builtins__ dict.
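
A quick way to list such entries from a live session (an illustrative sketch; the capsule name is toolchain-specific, so the exact key will vary, and the `scipy.fft` import as the trigger is an assumption based on the next comment):

```python
import builtins
import scipy.fft  # noqa: F401 -- suspected trigger, see the next comment

# Enumerate every PyCapsule stored in the builtins dict; pybind11 keeps its
# internals capsule there under a version/ABI-tagged name.
capsules = {name: value for name, value in vars(builtins).items()
            if type(value).__name__ == 'PyCapsule'}
print(capsules)
```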


ogrisel commented Jan 24, 2020

The scipy components that rely on pybind11 seem to have been introduced in PR scipy/scipy#10238 (the new scipy.fft module based on pocketfft).
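
Putting the pieces together, a plausible standalone reproduction (a sketch, not taken from the original report; it assumes scipy 1.4.x and cloudpickle 1.2.2 as in the build log above):

```python
import types

import scipy.fft  # noqa: F401 -- pybind11-based as of scipy 1.4.0
import cloudpickle

# exec() gives mod.__dict__ a __builtins__ entry, which now contains the
# PyCapsule that the scipy.fft import caused pybind11 to register.
mod = types.ModuleType('mod')
exec('x = 1', mod.__dict__)

cloudpickle.dumps(mod)  # TypeError: can't pickle PyCapsule objects
```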

wip-sync pushed a commit to NetBSD/pkgsrc-wip that referenced this issue on Apr 4, 2022; its commit message (the cloudpickle changelog) follows:
2.0.0
=====

- Python 3.5 is no longer supported.

- Support for registering modules to be serialised by value. This allows code
  defined in local modules to be serialised and executed remotely without those
  local modules being installed on the remote machine; see the usage sketch
  after this list.
  ([PR #417](cloudpipe/cloudpickle#417))

- Fix a side effect altering dynamic modules at pickling time.
  ([PR #426](cloudpipe/cloudpickle#426))

- Support for pickling type annotations on Python 3.10 as per [PEP 563](
  https://www.python.org/dev/peps/pep-0563/)
  ([PR #400](cloudpipe/cloudpickle#400))

- Stricter parametrized type detection heuristics in
  _is_parametrized_type_hint to limit false positives.
  ([PR #409](cloudpipe/cloudpickle#409))

- Support pickling / depickling of OrderedDict KeysView, ValuesView, and
  ItemsView, following a similar strategy to the one used for vanilla Python
  dictionaries.
  ([PR #423](cloudpipe/cloudpickle#423))

- Suppress a source of non-determinism when pickling dynamically defined
  functions and handle the deprecation of co_lnotab in Python 3.10+.
  ([PR #428](cloudpipe/cloudpickle#428))
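
As an illustration of the by-value registration entry above, a minimal usage sketch (`my_local_module` and `some_function` are hypothetical placeholders):

```python
import cloudpickle
import my_local_module  # hypothetical module absent on the remote machine

# Serialise the module's code by value rather than by reference, so the
# payload can be unpickled where my_local_module is not importable.
cloudpickle.register_pickle_by_value(my_local_module)
payload = cloudpickle.dumps(my_local_module.some_function)
cloudpickle.unregister_pickle_by_value(my_local_module)
```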

1.6.0
=====

- `cloudpickle`'s pickle.Pickler subclass (currently defined as
  `cloudpickle.cloudpickle_fast.CloudPickler`) can and should now be accessed
  as `cloudpickle.Pickler`. This is the only officially supported way of
  accessing it; see the sketch after this list.
  ([issue #366](cloudpipe/cloudpickle#366))

- `cloudpickle` now supports pickling `dict_keys`, `dict_items` and
  `dict_values`.
  ([PR #384](cloudpipe/cloudpickle#384))
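
A minimal sketch of the supported access path mentioned above; it behaves like a vanilla `pickle.Pickler`, except that it can serialise dynamic objects such as lambdas:

```python
import io
import pickle

import cloudpickle

buf = io.BytesIO()
cloudpickle.Pickler(buf, protocol=pickle.DEFAULT_PROTOCOL).dump(lambda x: x + 1)

# The output is a standard pickle stream, so plain pickle can load it.
clone = pickle.loads(buf.getvalue())
assert clone(1) == 2
```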

1.5.0
=====

- Fix a bug causing cloudpickle to crash when pickling dynamically created,
  importable modules.
  ([issue #360](cloudpipe/cloudpickle#354))

- Add optional dependency on `pickle5` to get improved performance on
  Python 3.6 and 3.7.
  ([PR #370](cloudpipe/cloudpickle#370))

- Internal refactoring to ease the use of `pickle5` in cloudpickle
  for Python 3.6 and 3.7.
  ([PR #368](cloudpipe/cloudpickle#368))

1.4.1
=====

- Fix incompatibilities between cloudpickle 1.4.0 and Python 3.5.0/1/2
  introduced by the new support of cloudpickle for pickling typing constructs.
  ([issue #360](cloudpipe/cloudpickle#360))

- Restore compat with loading dynamic classes pickled with cloudpickle
  version 1.2.1 that would reference the `types.ClassType` attribute.
  ([PR #359](cloudpipe/cloudpickle#359))

1.4.0
=====

**This version requires Python 3.5 or later**

- cloudpickle can now pickle all constructs from the ``typing`` module
  and the ``typing_extensions`` library in Python 3.5+
  ([PR #318](cloudpipe/cloudpickle#318))

- Stop pickling the annotations of a dynamic class for Python < 3.6
  (follow up on #276)
  ([issue #347](cloudpipe/cloudpickle#347))

- Fix a bug affecting the pickling of dynamic `TypeVar` instances on Python 3.7+,
  and expand the support for pickling `TypeVar` instances (dynamic or non-dynamic)
  to Python 3.5-3.6 ([PR #350](cloudpipe/cloudpickle#350))

- Add support for pickling dynamic classes subclassing `typing.Generic`
  instances on Python 3.7+
  ([PR #351](cloudpipe/cloudpickle#351))

1.3.0
=====

- Fix a bug affecting dynamic modules occurring with modified builtins;
  a hedged sketch of such a fix follows this list.
  ([issue #316](cloudpipe/cloudpickle#316))

- Fix a bug affecting cloudpickle when non-module objects are added to
  sys.modules
  ([PR #326](cloudpipe/cloudpickle#326)).

- Fix a regression in cloudpickle under Python 3.8 causing an error when
  trying to pickle property objects.
  ([PR #329](cloudpipe/cloudpickle#329)).

- Fix a bug when a thread imports a module while cloudpickle iterates
  over the module list
  ([PR #322](cloudpipe/cloudpickle#322)).

- Add support for out-of-band pickling (Python 3.8 and later).
  https://docs.python.org/3/library/pickle.html#example
  ([issue #308](cloudpipe/cloudpickle#308))

- Fix a side effect that would redefine `types.ClassType` as `type`
  when importing cloudpickle.
  ([issue #337](cloudpipe/cloudpickle#337))

- Fix a bug affecting subclasses of slotted classes.
  ([issue #311](cloudpipe/cloudpickle#311))

- Don't pickle the abc cache of dynamically defined classes for Python 3.6-
  (this was already the case for python3.7+)
  ([issue #302](cloudpipe/cloudpickle#302))
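
For the modified-builtins entry at the top of this list (the fix for this issue), the general shape of such a fix could look like the following hedged sketch: drop `__builtins__` from a dynamic module's pickled state and restore the live builtins on reconstruction. This is an illustration, not necessarily the exact patch from PR #325:

```python
import builtins
import types

def _rebuild_dynamic_module(name, state):
    # Recreate the module and point __builtins__ at the live builtins dict
    # instead of round-tripping it through the pickle stream.
    mod = types.ModuleType(name)
    mod.__dict__['__builtins__'] = builtins.__dict__
    mod.__dict__.update(state)
    return mod

def reduce_dynamic_module(mod):
    # __builtins__ may contain unpicklable entries (e.g. pybind11's
    # PyCapsule), so exclude it from the pickled state.
    state = vars(mod).copy()
    state.pop('__builtins__', None)
    return _rebuild_dynamic_module, (mod.__name__, state)
```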

1.2.2
=====

- Revert the change introduced in
  ([issue #276](cloudpipe/cloudpickle#276))
  attempting to pickle function annotations for Python 3.4 to 3.6. It is not
  possible to pickle complex typing constructs for those versions (see
  [issue #193]( cloudpipe/cloudpickle#193))

- Fix a bug affecting bound classmethod saving on Python 2.
  ([issue #288](cloudpipe/cloudpickle#288))

- Add support for pickling "getset" descriptors
  ([issue #290](cloudpipe/cloudpickle#290))

1.2.1
=====

- Restore (partial) support for Python 3.4 for downstream projects that have
  LTS versions that would benefit from cloudpickle bug fixes.

1.2.0
=====

- Leverage the C-accelerated Pickler new subclassing API (available in Python
  3.8) in cloudpickle. This allows cloudpickle to pickle Python objects up to
  30 times faster.
  ([issue #253](cloudpipe/cloudpickle#253))

- Support pickling of classmethod and staticmethod objects in Python 2.
  ([issue #262](cloudpipe/cloudpickle#262))

- Add support to pickle type annotations for Python 3.5 and 3.6 (pickling type
  annotations was already supported for Python 3.7, Python 3.4 might also work
  but is no longer officially supported by cloudpickle)
  ([issue #276](cloudpipe/cloudpickle#276))

- Internal refactoring to proactively detect dynamic functions and classes when
  pickling them.  This refactoring also yields small performance improvements
  when pickling dynamic classes (~10%)
  ([issue #273](cloudpipe/cloudpickle#273))

1.1.1
=====

- Minor release to fix a packaging issue (Markdown formatting of the long
  description rendered on pypi.org). The code itself is the same as 1.1.0.

1.1.0
=====

- Support the pickling of interactively-defined functions with positional-only
  arguments. ([issue #266](cloudpipe/cloudpickle#266))

- Track the provenance of dynamic classes and enums so as to preserve the
  usual `isinstance` relationship between pickled objects and their
  original class definitions.
  ([issue #246](cloudpipe/cloudpickle#246))

1.0.0
=====

- Fix a bug making functions with keyword-only arguments forget the default
  values of these arguments after being pickled.
  ([issue #264](cloudpipe/cloudpickle#264))

0.8.1
=====

- Fix a bug (already present before 0.5.3 and re-introduced in 0.8.0)
  affecting relative import instructions inside depickled functions
  ([issue #254](cloudpipe/cloudpickle#254))

0.8.0
=====

- Add support for pickling interactively defined dataclasses.
  ([issue #245](cloudpipe/cloudpickle#245))

- Global variables referenced by functions pickled by cloudpickle are now
  unpickled in a new and isolated namespace scoped by the CloudPickler
  instance. This restores the (previously untested) behavior of cloudpickle
  prior to changes done in 0.5.4 for functions defined in the `__main__`
  module, and 0.6.0/1 for other dynamic functions.

0.7.0
=====

- Correctly serialize dynamically defined classes that have a `__slots__`
  attribute.
  ([issue #225](cloudpipe/cloudpickle#225))
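
A quick round-trip sketch of that fix (names are illustrative):

```python
import pickle

import cloudpickle

# A dynamically defined class with a __slots__ attribute now survives
# a pickle round trip, instance state included.
Slotted = type('Slotted', (), {'__slots__': ('x',)})
obj = Slotted()
obj.x = 42
clone = pickle.loads(cloudpickle.dumps(obj))
assert clone.x == 42
```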

0.6.1
=====

- Fix regression in 0.6.0 which breaks the pickling of local functions defined
  in a module, making it impossible to access builtins.
  ([issue #211](cloudpipe/cloudpickle#211))

0.6.0
=====

- Ensure that unpickling a function defined in a dynamic module several times
  sequentially does not reset the values of global variables.
  ([issue #187](cloudpipe/cloudpickle#205))

- Restrict the ability to pickle annotations to python3.7+ ([issue #193](
  cloudpipe/cloudpickle#193) and [issue #196](
  cloudpipe/cloudpickle#196))

- Stop using the deprecated `imp` module under Python 3.
  ([issue #207](cloudpipe/cloudpickle#207))

- Fixed pickling issue with singleton types `NoneType`, `type(...)` and
  `type(NotImplemented)` ([issue #209](cloudpipe/cloudpickle#209))
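
For instance, a minimal sketch of the singleton-type fix above:

```python
import pickle

import cloudpickle

# NoneType, the ellipsis type and the NotImplemented type all round-trip
# back to the very same singleton type objects.
for t in (type(None), type(...), type(NotImplemented)):
    assert pickle.loads(cloudpickle.dumps(t)) is t
```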

0.5.6
=====

- Ensure that unpickling a locally defined function that accesses the global
  variables of a module does not reset the values of the global variables if
  they are already initialized.
  ([issue #187](cloudpipe/cloudpickle#187))

0.5.5
=====

- Fixed inconsistent version in `cloudpickle.__version__`.

0.5.4
=====

- Fixed a pickling issue for ABC in python3.7+ ([issue #180](
  cloudpipe/cloudpickle#180)).

- Fixed a bug when pickling functions in `__main__` that access global
  variables ([issue #187](
  cloudpipe/cloudpickle#187)).

0.5.3
=====
- Fixed a crash in Python 2 when serializing non-hashable instancemethods of built-in
  types ([issue #144](cloudpipe/cloudpickle#144)).

- itertools objects can also be pickled
  ([PR #156](cloudpipe/cloudpickle#156)).

- `logging.RootLogger` can also be pickled
  ([PR #160](cloudpipe/cloudpickle#160)).

0.5.2
=====

- Fixed a regression: `AttributeError` when loading pickles that hold a
  reference to a dynamically defined class from the `__main__` module.
  ([issue #131]( cloudpipe/cloudpickle#131)).

- Make it possible to pickle classes and functions defined in faulty
  modules that raise an exception when trying to look up their attributes
  by name.

0.5.1
=====

- Fixed `cloudpickle.__version__`.

0.5.0
=====

- Use `pickle.HIGHEST_PROTOCOL` by default.

0.4.4
=====

- `logging.RootLogger` can also be pickled
  ([PR #160](cloudpipe/cloudpickle#160)).

0.4.3
=====

- Fixed a regression: `AttributeError` when loading pickles that hold a
  reference to a dynamically defined class from the `__main__` module.
  ([issue #131]( cloudpipe/cloudpickle#131)).

- Fixed a crash in Python 2 when serializing non-hashable instancemethods of built-in
  types. ([issue #144](cloudpipe/cloudpickle#144))

0.4.2
=====

- Restored compatibility with pickles from 0.4.0.
- Handle the `func.__qualname__` attribute.

0.4.1
=====

- Fixed a crash when pickling dynamic classes whose `__dict__` attribute was
  defined as a [`property`](https://docs.python.org/3/library/functions.html#property).
  Most notably, this affected dynamic [namedtuples](https://docs.python.org/2/library/collections.html#namedtuple-factory-function-for-tuples-with-named-fields)
  in Python 2. (cloudpipe/cloudpickle#113)
- Cloudpickle now preserves the `__module__` attribute of functions (cloudpipe/cloudpickle#118).
- Fixed a crash when pickling modules that don't have a `__package__` attribute (cloudpipe/cloudpickle#116).

0.4.0
=====

* Fix functions with empty cells
* Allow pickling Logger objects
* Fix crash when pickling dynamic class cycles
* Ignore "None" mdoules added to sys.modules
* Support WeakSets and ABCMeta instances
* Remove non-standard `__transient__` support
* Catch exception from `pickle.whichmodule()`

0.3.1
=====

* Fix version information and ship a changelog

0.3.0
=====

* Import submodules accessed by pickled functions
* Support recursive functions inside closures
* Fix `ResourceWarnings` and `DeprecationWarnings`
* Assume modules with `__file__` attribute are not dynamic

0.2.2
=====

* Support Python 3.6
* Support Tornado Coroutines
* Support builtin methods