
Support ipykernel >= 6 #737

Merged
merged 21 commits into from
May 2, 2022

Conversation

devstein
Collaborator

@devstein devstein commented Sep 16, 2021

Closes #727

Description

  • Support ipykernel>=6's async def do_execute
  • Update ipykernel 4 and 5 tests
  • Add tests for ipykernel 6
  • Remove ipykernel<6 constraints
  • Run the black formatter on the changed files

TODOs

  • Figure out what other SparkKernelBase methods need to be async

Checklist

  • Wrote a description of my changes above
  • Added a bullet point for my changes to the top of the CHANGELOG.md file
  • Added or modified unit tests to reflect my changes
  • Manually tested with a notebook

@juliusvonkohout
Contributor

Thanks for your efforts @devstein

What about the reviewers?
@Carreau
@aggFTW
@itamarst
@juhoautio
@PedroRossi

@sergiimk

Thank you for this PR @devstein. I had to put in a lot of hacks to run my environment with an older Jupyter base image just to work around this incompatibility, so I can't wait for this to be merged.

I tried installing sparkmagic from your branch to see if it works with the jupyter/base-notebook:2021-10-18 base Docker image, but I got the following error when running the kernel:

Error
[I 20:21:18.989 NotebookApp] Kernel started: 79352a82-730a-4f5a-b328-be8994935c26, name: pysparkkernel
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/conda/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/opt/conda/lib/python3.9/site-packages/sparkmagic/kernels/pysparkkernel/pysparkkernel.py", line 34, in <module>
    IPKernelApp.launch_instance(kernel_class=PySparkKernel)
  File "/opt/conda/lib/python3.9/site-packages/traitlets/config/application.py", line 845, in launch_instance
    app.initialize(argv)
  File "/opt/conda/lib/python3.9/site-packages/traitlets/config/application.py", line 88, in inner
    return method(app, *args, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/ipykernel/kernelapp.py", line 647, in initialize
    self.init_kernel()
  File "/opt/conda/lib/python3.9/site-packages/ipykernel/kernelapp.py", line 499, in init_kernel
    kernel = kernel_factory(parent=self, session=self.session,
  File "/opt/conda/lib/python3.9/site-packages/traitlets/config/configurable.py", line 540, in instance
    inst = cls(*args, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/sparkmagic/kernels/pysparkkernel/pysparkkernel.py", line 26, in __init__
    super(PySparkKernel,
  File "/opt/conda/lib/python3.9/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 53, in __init__
    self._load_magics_extension()
  File "/opt/conda/lib/python3.9/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 88, in _load_magics_extension
    self._execute_cell(
  File "/opt/conda/lib/python3.9/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 149, in _execute_cell
    if shutdown_if_error and reply_content[u"status"] == u"error":
TypeError: 'coroutine' object is not subscriptable
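The failure above can be reproduced in miniature. In ipykernel >= 6, `do_execute` became a coroutine function, so calling it without awaiting returns a coroutine object rather than a reply dict, and subscripting that object raises exactly this `TypeError`. A minimal sketch (the class name is a hypothetical stand-in, not sparkmagic's actual code):

```python
import asyncio

# Hypothetical stand-in for an ipykernel >= 6 kernel, where do_execute
# is a coroutine function instead of a plain method.
class Ipykernel6StyleKernel:
    async def do_execute(self, code, silent=False):
        return {"status": "ok"}

kernel = Ipykernel6StyleKernel()
reply = kernel.do_execute("%%spark")  # coroutine object, not a reply dict

try:
    reply["status"]  # roughly what _execute_cell did
except TypeError as err:
    print(err)  # 'coroutine' object is not subscriptable

reply.close()  # suppress the "never awaited" warning for the demo coroutine

# Driving the coroutine to completion yields the actual reply content.
result = asyncio.run(kernel.do_execute("%%spark"))
print(result["status"])  # ok
```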
pip freeze
alembic @ file:///home/conda/feedstock_root/build_artifacts/alembic_1633571415941/work
altair==4.1.0
anyio @ file:///home/conda/feedstock_root/build_artifacts/anyio_1633972074008/work/dist
argon2-cffi @ file:///home/conda/feedstock_root/build_artifacts/argon2-cffi_1633990448879/work
async-generator==1.10
attrs @ file:///home/conda/feedstock_root/build_artifacts/attrs_1620387926260/work
autovizwidget==0.19.1
Babel @ file:///home/conda/feedstock_root/build_artifacts/babel_1619719576210/work
backcall @ file:///home/conda/feedstock_root/build_artifacts/backcall_1592338393461/work
backports.functools-lru-cache @ file:///home/conda/feedstock_root/build_artifacts/backports.functools_lru_cache_1618230623929/work
bleach @ file:///home/conda/feedstock_root/build_artifacts/bleach_1629908509068/work
blinker==1.4
bokeh==2.4.1
branca==0.4.2
brotlipy==0.7.0
certifi==2021.10.8
certipy==0.1.3
cffi @ file:///home/conda/feedstock_root/build_artifacts/cffi_1631636250774/work
chardet @ file:///home/conda/feedstock_root/build_artifacts/chardet_1610093492116/work
charset-normalizer @ file:///home/conda/feedstock_root/build_artifacts/charset-normalizer_1626371162869/work
chroma-py==0.1.0.dev1
colorama @ file:///home/conda/feedstock_root/build_artifacts/colorama_1602866480661/work
colour==0.1.5
conda==4.10.3
conda-package-handling @ file:///home/conda/feedstock_root/build_artifacts/conda-package-handling_1618231390031/work
cryptography @ file:///home/conda/feedstock_root/build_artifacts/cryptography_1634230275410/work
cycler==0.10.0
debugpy @ file:///home/conda/feedstock_root/build_artifacts/debugpy_1627074853231/work
decorator @ file:///home/conda/feedstock_root/build_artifacts/decorator_1631346842025/work
defusedxml @ file:///home/conda/feedstock_root/build_artifacts/defusedxml_1615232257335/work
entrypoints @ file:///home/conda/feedstock_root/build_artifacts/entrypoints_1602701735325/work/dist/entrypoints-0.3-py2.py3-none-any.whl
folium==0.12.1
geojson==2.5.0
greenlet @ file:///home/conda/feedstock_root/build_artifacts/greenlet_1632928145576/work
hdijupyterutils==0.19.1
idna @ file:///home/conda/feedstock_root/build_artifacts/idna_1609836280497/work
importlib-metadata @ file:///home/conda/feedstock_root/build_artifacts/importlib-metadata_1630267473458/work
importlib-resources @ file:///home/conda/feedstock_root/build_artifacts/importlib_resources_1634509907544/work
ipykernel @ file:///home/conda/feedstock_root/build_artifacts/ipykernel_1631291098355/work/dist/ipykernel-6.4.1-py3-none-any.whl
ipython @ file:///home/conda/feedstock_root/build_artifacts/ipython_1632763773116/work
ipython-genutils==0.2.0
ipywidgets==7.6.5
jedi @ file:///home/conda/feedstock_root/build_artifacts/jedi_1610146787869/work
Jinja2 @ file:///home/conda/feedstock_root/build_artifacts/jinja2_1633656206378/work
json5 @ file:///home/conda/feedstock_root/build_artifacts/json5_1600692310011/work
jsonschema @ file:///home/conda/feedstock_root/build_artifacts/jsonschema_1633875207482/work
jupyter==1.0.0
jupyter-client @ file:///home/conda/feedstock_root/build_artifacts/jupyter_client_1633454794268/work
jupyter-console==6.4.0
jupyter-core @ file:///home/conda/feedstock_root/build_artifacts/jupyter_core_1631852705892/work
jupyter-server @ file:///home/conda/feedstock_root/build_artifacts/jupyter_server_1633398189934/work
jupyter-telemetry @ file:///home/conda/feedstock_root/build_artifacts/jupyter_telemetry_1605173804246/work
jupyterhub @ file:///home/conda/feedstock_root/build_artifacts/jupyterhub-feedstock_1626468614048/work
jupyterlab @ file:///home/conda/feedstock_root/build_artifacts/jupyterlab_1634218716496/work
jupyterlab-pygments @ file:///home/conda/feedstock_root/build_artifacts/jupyterlab_pygments_1601375948261/work
jupyterlab-server @ file:///home/conda/feedstock_root/build_artifacts/jupyterlab_server_1632590716858/work
jupyterlab-widgets==1.0.2
kiwisolver==1.3.2
Mako @ file:///home/conda/feedstock_root/build_artifacts/mako_1629523042001/work
mamba @ file:///home/conda/feedstock_root/build_artifacts/mamba_1634161550232/work
mapboxgl==0.10.2
MarkupSafe @ file:///home/conda/feedstock_root/build_artifacts/markupsafe_1621455668600/work
matplotlib==3.4.2
matplotlib-inline @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-inline_1631080358261/work
mistune @ file:///home/conda/feedstock_root/build_artifacts/mistune_1624941317779/work
mock==4.0.3
nbclassic @ file:///home/conda/feedstock_root/build_artifacts/nbclassic_1631880505492/work
nbclient @ file:///home/conda/feedstock_root/build_artifacts/nbclient_1629120697898/work
nbconvert @ file:///home/conda/feedstock_root/build_artifacts/nbconvert_1632535927841/work
nbformat @ file:///home/conda/feedstock_root/build_artifacts/nbformat_1617383142101/work
nest-asyncio @ file:///home/conda/feedstock_root/build_artifacts/nest-asyncio_1617163391303/work
nose==1.3.7
notebook @ file:///home/conda/feedstock_root/build_artifacts/notebook_1631733685426/work
numpy==1.21.2
oauthlib @ file:///home/conda/feedstock_root/build_artifacts/oauthlib_1622563202229/work
packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1625323647219/work
pamela==1.0.0
pandas==1.3.4
pandas-bokeh==0.5.5
pandocfilters @ file:///home/conda/feedstock_root/build_artifacts/pandocfilters_1631603243851/work
parso @ file:///home/conda/feedstock_root/build_artifacts/parso_1617148930513/work
pexpect @ file:///home/conda/feedstock_root/build_artifacts/pexpect_1602535608087/work
pickleshare @ file:///home/conda/feedstock_root/build_artifacts/pickleshare_1602535621604/work
Pillow==8.4.0
plotly==5.3.1
prometheus-client @ file:///home/conda/feedstock_root/build_artifacts/prometheus_client_1622586138406/work
prompt-toolkit @ file:///home/conda/feedstock_root/build_artifacts/prompt-toolkit_1629903925368/work
ptyprocess @ file:///home/conda/feedstock_root/build_artifacts/ptyprocess_1609419310487/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl
pycosat @ file:///home/conda/feedstock_root/build_artifacts/pycosat_1610094799048/work
pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1593275161868/work
pycurl==7.44.1
Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1629119114968/work
PyJWT @ file:///home/conda/feedstock_root/build_artifacts/pyjwt_1634405536383/work
pykerberos @ file:///home/conda/feedstock_root/build_artifacts/pykerberos_1621070881309/work
pyOpenSSL @ file:///home/conda/feedstock_root/build_artifacts/pyopenssl_1633192417276/work
pyparsing==2.4.7
pyrsistent @ file:///home/conda/feedstock_root/build_artifacts/pyrsistent_1610146801554/work
PySocks @ file:///home/conda/feedstock_root/build_artifacts/pysocks_1610291451001/work
python-dateutil @ file:///home/conda/feedstock_root/build_artifacts/python-dateutil_1626286286081/work
python-json-logger @ file:///home/conda/feedstock_root/build_artifacts/python-json-logger_1602545356084/work
pytz @ file:///home/conda/feedstock_root/build_artifacts/pytz_1633452062248/work
PyYAML==6.0
pyzmq @ file:///home/conda/feedstock_root/build_artifacts/pyzmq_1631793304627/work
qtconsole==5.1.1
QtPy==1.11.2
requests @ file:///home/conda/feedstock_root/build_artifacts/requests_1626393743643/work
requests-kerberos @ file:///home/conda/feedstock_root/build_artifacts/requests-kerberos_1612972370917/work
requests-unixsocket==0.2.0
ruamel-yaml-conda @ file:///home/conda/feedstock_root/build_artifacts/ruamel_yaml_1611943432947/work
ruamel.yaml @ file:///home/conda/feedstock_root/build_artifacts/ruamel.yaml_1630314087433/work
ruamel.yaml.clib @ file:///home/conda/feedstock_root/build_artifacts/ruamel.yaml.clib_1610146844134/work
Send2Trash @ file:///home/conda/feedstock_root/build_artifacts/send2trash_1628511208346/work
Shapely==1.7.1
six @ file:///home/conda/feedstock_root/build_artifacts/six_1620240208055/work
sniffio @ file:///home/conda/feedstock_root/build_artifacts/sniffio_1610318319523/work
sparkmagic @ git+https://github.com/jupyter-incubator/sparkmagic.git@bf89a4127b6572355d6c3df77e381e1e533d286f#subdirectory=sparkmagic
SQLAlchemy @ file:///home/conda/feedstock_root/build_artifacts/sqlalchemy_1632383405367/work
tenacity==8.0.1
terminado @ file:///home/conda/feedstock_root/build_artifacts/terminado_1631128166466/work
testpath @ file:///home/conda/feedstock_root/build_artifacts/testpath_1621261527237/work
toolz==0.11.1
tornado @ file:///home/conda/feedstock_root/build_artifacts/tornado_1610094708661/work
tqdm @ file:///home/conda/feedstock_root/build_artifacts/tqdm_1632160078689/work
traitlets @ file:///home/conda/feedstock_root/build_artifacts/traitlets_1630423529112/work
typing-extensions==3.10.0.2
urllib3 @ file:///home/conda/feedstock_root/build_artifacts/urllib3_1632350318291/work
wcwidth @ file:///home/conda/feedstock_root/build_artifacts/wcwidth_1600965781394/work
webencodings==0.5.1
websocket-client @ file:///home/conda/feedstock_root/build_artifacts/websocket-client_1610127648202/work
widgetsnbextension==3.5.1
zipp @ file:///home/conda/feedstock_root/build_artifacts/zipp_1633302054558/work

I might be doing something wrong. Here's what I put in my requirements.txt:

git+https://github.com/jupyter-incubator/sparkmagic.git@devstein/ipykernel-6#egg=sparkmagic&subdirectory=sparkmagic

The commit in pip freeze matches your branch.

@devstein devstein changed the title [WIP] Support ipykernel >= 6 Support ipykernel >= 6 Oct 25, 2021
@devstein
Collaborator Author

@sergiimk Thanks for sharing. The PR was still a WIP at the time, but it now has all the tests updated and passing. Would you mind trying again and letting me know if you run into the same issue?

@sergiimk

Thank you @devstein! I'll give it a try this weekend and report back.

@Qiuzhuang

Hopefully this PR gets released ASAP, since VS Code 1.6+ needs ipykernel >= 6 to support debugging.

@sergiimk

sergiimk commented Nov 1, 2021

Hi @devstein I finally had time to give this another go (same setup as I described before).

Initially it looked like the kernel had started successfully, but I saw the following error in the log:

Exception in callback <TaskWakeupMethWrapper object at 0x7f0212502910>(<Future finis...511"\r\n\r\n'>)
handle: <Handle <TaskWakeupMethWrapper object at 0x7f0212502910>(<Future finis...511"\r\n\r\n'>)>
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
RuntimeError: Cannot enter into task <Task pending name='Task-1' coro=<HTTP1ServerConnection._server_request_loop() running at /opt/conda/lib/python3.9/site-packages/tornado/http1connection.py:823> wait_for=<Future finished result=b'GET /api/co...7511"\r\n\r\n'> cb=[IOLoop.add_future.<locals>.<lambda>() at /opt/conda/lib/python3.9/site-packages/tornado/ioloop.py:688]> while another task <Task pending name='Task-2' coro=<KernelManager._async_start_kernel() running at /opt/conda/lib/python3.9/site-packages/jupyter_client/manager.py:337>> is being executed.

A pretty obscure stack trace...

Attempting to use the notebook then results in errors like:

UsageError: Cell magic `%%spark` not found.

This indicates that the above error prevented sparkmagic from fully initializing.

I'll take a look at the PR and share ideas if I have any.

@sergiimk

sergiimk commented Nov 1, 2021

You can build this docker image to reproduce the problem yourself:

FROM jupyter/base-notebook:2021-10-18

USER root
RUN conda install requests-kerberos -y
RUN apt-get update && \
    apt-get -y install git && \
    rm -rf /var/lib/apt/lists/*

USER $NB_USER
RUN pip install --upgrade pip && \
    pip install --upgrade --ignore-installed setuptools && \
    pip install "git+https://github.com/jupyter-incubator/sparkmagic.git@devstein/ipykernel-6#egg=sparkmagic&subdirectory=sparkmagic"

RUN jupyter nbextension enable --py --sys-prefix widgetsnbextension
RUN jupyter-kernelspec install --user $(pip show sparkmagic | grep Location | cut -d" " -f2)/sparkmagic/kernels/pysparkkernel
RUN jupyter serverextension enable --py sparkmagic

@sergiimk

sergiimk commented Nov 1, 2021

Sorry, I just realized that the above is an upstream Jupyter issue: jupyter/notebook#6164
The error shows up even with the vanilla jupyter/base-notebook:2021-10-28 image, in a kernel without sparkmagic.

This means the UsageError: Cell magic %%spark not found. error is unrelated.

@juliusvonkohout
Contributor

@devstein what is keeping this from being merged?

@devstein
Collaborator Author

devstein commented Jan 18, 2022

Hi @juliusvonkohout, the main blocker is that I have not been able to get this PR to work with https://github.com/nteract/papermill (cc @MSeal for help 🙏 ). I have not had time to debug the root cause, so I worry about the implications for other common notebook libraries that I'm unaware of. Any help would be appreciated!

@juliusvonkohout
Contributor

> Hi @juliusvonkohout the main blocker is I have not been able to get this PR to work with https://github.com/nteract/papermill (cc @MSeal for help 🙏 ). I have not had time to debug to understand the root cause, so I worry about the implications for other common notebook libraries that I'm unaware of. Any help would be appreciated!

Sadly, I have zero experience with writing kernels or with papermill, and I do not use the old, deprecated Notebook interface anyway, just JupyterLab 3.2. I have only fixed some small things in sparkmagic. If it works properly in a recent JupyterLab, I would merge it and not worry about old, deprecated stuff. I am completely occupied until May by my employer and private commitments, so until then I can only build it myself and test whether it works in JupyterLab.

@itamarst
Contributor

@Carreau might you have a few minutes to glance at this and see if it's using IPython APIs appropriately?

@juliusvonkohout
Contributor

juliusvonkohout commented Mar 15, 2022

> @Carreau might you have a few minutes to glance at this and see if it's using IPython APIs appropriately?

You will also encounter this bug #749. I am currently building from this branch and will report back.

@juliusvonkohout
Contributor

I have tested it, and it does not work with JupyterLab 3.3.2, IPython 7.32, and ipykernel 6.9.2 when I install this branch.

@Carreau
Member

Carreau commented Mar 16, 2022

I had a quick look but didn't see anything obviously wrong. I'll try to dig deeper, but these days I barely find the time to follow everything happening in ipykernel and have barely enough bandwidth for IPython itself.

@devstein
Collaborator Author

@sergiimk @Qiuzhuang @Carreau @itamarst @juliusvonkohout Thanks for the support and patience. I was finally able to take a day to tackle this.

The latest implementation takes a page from jupyter/notebook and uses nest_asyncio and a custom run_sync function wrapper for loop.run_until_complete to avoid changing every kernel method to async.

The tests are passing, and I have seen no issues during manual integration testing locally. Please let me know if any of you have concerns with this approach; otherwise, I'll merge this by the end of the week (May 1) and cut a release.
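The run_sync wrapper described above can be sketched roughly as follows. This is a hypothetical illustration of the pattern, not sparkmagic's actual code; the real change also calls nest_asyncio.apply() (a third-party patch) so run_until_complete can be invoked while Jupyter's event loop is already running, which is omitted here to keep the example stdlib-only:

```python
import asyncio

# Sketch of the pattern: wrap loop.run_until_complete so existing
# synchronous kernel methods can call ipykernel 6's async APIs without
# being rewritten as coroutines themselves.
def run_sync(coro):
    """Drive a coroutine to completion from synchronous code."""
    try:
        loop = asyncio.get_event_loop()
    except RuntimeError:
        # No current event loop in this thread; create one.
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
    return loop.run_until_complete(coro)

# Stand-in for ipykernel 6's async do_execute.
async def do_execute_async(code, silent=False):
    return {"status": "ok", "code": code}

# A synchronous caller (like _execute_cell) keeps its old shape:
reply = run_sync(do_execute_async("%%spark"))
print(reply["status"])  # ok
```

The trade-off is that every kernel method can stay synchronous, at the cost of relying on re-entrant event-loop execution inside Jupyter.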

Successfully merging this pull request may close these issues.

Support IPyKernel >= 6.0.0
6 participants