[Snowflake] dbt compile throws memory exception for ffi.callback() on M1 Macbook Air #19

Closed
1 of 5 tasks
jplynch77 opened this issue Aug 11, 2021 · 7 comments
Labels
bug Something isn't working dependencies Stale

Comments

@jplynch77

Describe the bug

This is basically dbt-labs/dbt-core#3162, but I'm still running into the problem with send_anonymous_usage_stats: False set, and also on the 0.21.0-b1 prerelease where that anonymous usage issue is patched. Both dbt compile and dbt run throw MemoryError: Cannot allocate write+execute memory for ffi.callback(). You might be running on a system that prevents this. For more information, see https://cffi.readthedocs.io/en/latest/using.html#callbacks.

Steps To Reproduce

On a 2020 MacBook Air running Big Sur 11.5, I installed dbt as follows:

brew install python
python3 -m venv dbt-env  
source dbt-env/bin/activate  
pip install dbt

dbt installs fine with either version, and dbt --version works fine as well. However, when I do dbt compile or dbt run, I run into the ffi.callback() error. I got this error with both 0.20.0 and 0.21.0-b1.

I know there was an attempted fix as described in this comment, but I think that only addresses the case where this affects anonymous tracking; the problem here seems to be with the primary snowflake-connector-python connection to Snowflake (see trace below).

There is also an open issue on the snowflake-connector-python repo, but no progress toward fixing it yet.

Expected behavior

dbt compile and dbt run run without error.

Screenshots and log output

Full trace.

    show terse objects in user_scratch.jlynch_dbt
2021-08-11 02:38:15.323454 (ThreadPoolExecutor-0_0): Rolling back transaction.
2021-08-11 02:38:15.323576 (ThreadPoolExecutor-0_1): Rolling back transaction.
2021-08-11 02:38:15.323678 (ThreadPoolExecutor-0_0): Opening a new connection, currently in state init
2021-08-11 02:38:15.323817 (ThreadPoolExecutor-0_1): Opening a new connection, currently in state init
2021-08-11 02:38:15.423470 (ThreadPoolExecutor-0_1): Error running SQL: macro list_relations_without_caching
2021-08-11 02:38:15.423695 (ThreadPoolExecutor-0_1): Rolling back transaction.
2021-08-11 02:38:15.424342 (ThreadPoolExecutor-0_1): Opening a new connection, currently in state init
2021-08-11 02:38:15.424882 (ThreadPoolExecutor-0_0): Error running SQL: macro list_relations_without_caching
2021-08-11 02:38:15.426007 (ThreadPoolExecutor-0_0): Rolling back transaction.
2021-08-11 02:38:15.426822 (ThreadPoolExecutor-0_0): Opening a new connection, currently in state init
2021-08-11 02:38:15.535091 (MainThread): Connection 'master' was properly closed.
2021-08-11 02:38:15.535311 (MainThread): Connection 'list_analytics_snapshot' was properly closed.
2021-08-11 02:38:15.535435 (MainThread): Connection 'list_user_scratch_jlynch_dbt' was properly closed.
2021-08-11 02:38:15.535643 (MainThread): Flushing usage events
2021-08-11 02:38:15.535784 (MainThread): Encountered an error:
2021-08-11 02:38:15.535869 (MainThread): Cannot allocate write+execute memory for ffi.callback(). You might be running on a system that prevents this. For more information, see https://cffi.readthedocs.io/en/latest/using.html#callbacks
2021-08-11 02:38:15.538470 (MainThread): Traceback (most recent call last):
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/dbt/adapters/snowflake/connections.py", line 179, in exception_handler
    yield
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/dbt/adapters/sql/connections.py", line 79, in add_query
    cursor = connection.handle.cursor()
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/dbt/contracts/connection.py", line 81, in handle
    self._handle.resolve(self)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/dbt/contracts/connection.py", line 107, in resolve
    return self.opener(connection)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/dbt/adapters/snowflake/connections.py", line 220, in open
    handle = snowflake.connector.connect(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/__init__.py", line 50, in Connect
    return SnowflakeConnection(**kwargs)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/connection.py", line 273, in __init__
    self.connect(**kwargs)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/connection.py", line 484, in connect
    self.__open_connection()
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/connection.py", line 723, in __open_connection
    self._authenticate(auth_instance)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/connection.py", line 984, in _authenticate
    self.__authenticate(self.__preprocess_auth_instance(auth_instance))
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/connection.py", line 1003, in __authenticate
    auth.authenticate(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/auth.py", line 241, in authenticate
    ret = self._rest._post_request(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/network.py", line 629, in _post_request
    ret = self.fetch(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/network.py", line 719, in fetch
    ret = self._request_exec_wrapper(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/network.py", line 841, in _request_exec_wrapper
    raise e
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/network.py", line 762, in _request_exec_wrapper
    return_object = self._request_exec(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/network.py", line 1049, in _request_exec
    raise err
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/network.py", line 926, in _request_exec
    raw_ret = session.request(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/urllib3/connectionpool.py", line 699, in urlopen
    httplib_response = self._make_request(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/urllib3/connectionpool.py", line 382, in _make_request
    self._validate_conn(conn)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/urllib3/connectionpool.py", line 1010, in _validate_conn
    conn.connect()
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/urllib3/connection.py", line 392, in connect
    self.ssl_context = create_urllib3_context(
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/urllib3/util/ssl_.py", line 339, in create_urllib3_context
    context.verify_mode = cert_reqs
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/snowflake/connector/vendored/urllib3/contrib/pyopenssl.py", line 444, in verify_mode
    self._ctx.set_verify(_stdlib_to_openssl_verify[value], _verify_callback)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/OpenSSL/SSL.py", line 1028, in set_verify
    self._verify_helper = _VerifyHelper(callback)
  File "/Users/john.lynch/dbt-env/lib/python3.9/site-packages/OpenSSL/SSL.py", line 331, in __init__
    self.callback = _ffi.callback(
MemoryError: Cannot allocate write+execute memory for ffi.callback(). You might be running on a system that prevents this. For more information, see https://cffi.readthedocs.io/en/latest/using.html#callbacks

System information

Which database are you using dbt with?

  • postgres
  • redshift
  • bigquery
  • snowflake ✔
  • other (specify: ____________)

The output of dbt --version:

installed version: 0.20.0
   latest version: 0.20.0

Up to date!

Plugins:
  - snowflake: 0.20.0

The operating system you're using:
MacOS Big Sur 11.5

The output of python --version:
Python 3.9.6

@jtcohen6
Contributor

@jplynch77 Thanks for re-opening. This is frustrating, and I'm not sure if there's anything we can do about it until snowflakedb/snowflake-connector-python#799 is patched. Am I thinking about that right? Over in dbt-labs/dbt-core#3162, we talked about potentially figuring out a way to undo the monkey-patch of pyOpenSSL into urllib3, but it was more than we could muster after a few hours of looking into it.

At the very least, we have relaxed our dependency on snowflake-connector-python, so as soon as they put out a fix for this in a patch release, we should be golden. In the meantime, installing dbt-snowflake on M1s via Rosetta feels like the way to go.

@jtcohen6 jtcohen6 changed the title dbt compile throws memory exception for ffi.callback() on M1 Macbook Air [Snowflake] dbt compile throws memory exception for ffi.callback() on M1 Macbook Air Aug 11, 2021
@poudrouxj

poudrouxj commented Aug 12, 2021

Subscribed - we've also had a user report this with dbt compile as it attempts to authenticate against our DWH (Snowflake). I've instructed the user to test the Rosetta installation method, and I'll reply here once we get it tested (I'm not on an M1 arch so I can't test it myself).

Edit: SUCCESS:
Running brew as such 👇 worked for our end-user

# Install Homebrew for x86_64 architecture
# https://soffes.blog/homebrew-on-apple-silicon
arch -x86_64 /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"


# Install dbt using that version of Homebrew
arch -x86_64 /usr/local/bin/brew tap dbt-labs/dbt
arch -x86_64 /usr/local/bin/brew install dbt@0.20.1

@jplynch77
Author

@jtcohen6 Yeah, as far as I can tell, patching the underlying problem in the connector is the only real path forward that's not a hack. I just don't know if anyone's even trying to do that, though. Based on this comment it seems like it might be quite difficult to fix the underlying issue, and it doesn't seem like there's been any progress.

So if there is some patch you all can figure out, or if the Snowflake people can figure something else out for the connector, I think that's probably going to be the way to get this working in the short/medium term.

I did get it installed via Rosetta as @poudrouxj points out; it just makes me sad that I have to resort to that. 😢

@jtcohen6
Contributor

@jplynch77 Just left a comment over there: snowflakedb/snowflake-connector-python#799 (comment). If you could give that a read, and let me know if it makes sense to you, I'd really appreciate it.

I'd love to figure out a way for us to just call extract_from_urllib3(), and undo the monkey-patch, but I don't think we can while the Snowflake connector is using a vendored version of urllib3... right?
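For illustration, a rough sketch of what I mean (not something dbt does today, and it wouldn't reach the connector's vendored copy):

# Rough sketch only: undo pyOpenSSL's monkey-patch of the *top-level* urllib3.
# extract_from_urllib3() restores urllib3's stdlib-ssl machinery, so no
# ffi.callback() gets created there. It does NOT touch the copy vendored
# inside the connector (snowflake.connector.vendored.urllib3), which is what
# the traceback above goes through -- so on its own this wouldn't fix the issue.
from urllib3.contrib import pyopenssl

pyopenssl.extract_from_urllib3()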

@thomasaarholt

I struggled to get the VSCode dbt Power User extension working alongside the brew-installed dbt from @poudrouxj's method above. That is, hyperlinked table names weren't working.

Instead, I installed the Rosetta (x86_64) version of one of miniconda/miniforge/[Mambaforge](https://github.com/conda-forge/miniforge/releases/latest/download/Mambaforge-MacOSX-x86_64.sh) (that's a direct link; if downloading on your own, just pick the x86_64/Intel version) and installed it as normal, placing it at e.g. ~/miniconda_rosetta to avoid conflicts with your normal Python installation.

I then created an alias in ~/.zprofile: alias rosetta='eval "$(/Users/thomas/miniconda_rosetta/bin/conda shell.zsh hook)"'. Now, when I call rosetta, I get a base conda environment that is running in rosetta.

In that base environment I create a dbt environment and install dbt as usual:

# install x86 conda at ~/miniconda_rosetta 
# add `alias rosetta='eval "$(/Users/thomas/miniconda_rosetta/bin/conda shell.zsh hook)"'` to profile, restart terminal
rosetta
conda create -n dbt python=3.9
conda activate dbt
# Check that `conda info` includes `platform : osx-64` and not `platform : osx-arm64`, or you installed normal conda, not x86
pip install dbt # executable will be installed in ~/miniconda_rosetta/envs/dbt/bin/dbt

Now, in VSCode, I can specify the dbt conda environment, and poweruser works as expected.

@jtcohen6 jtcohen6 transferred this issue from dbt-labs/dbt-core Oct 12, 2021
@jtcohen6 jtcohen6 added bug Something isn't working dependencies labels Oct 12, 2021
@bpmcd

bpmcd commented Dec 20, 2021

Instead of the workaround above (installing Homebrew via Rosetta), I customized a Docker image based on https://hub.docker.com/r/xemuliam/dbt. Docker Desktop now natively supports the M1 architecture, and this image can be run as an arm64 Linux container that does not exhibit the FFI issue.
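For example, something along these lines (a rough sketch; the mount paths and working directory are assumptions, adjust them to your project layout):

# Rough sketch, assuming the image exposes the dbt CLI with a shell entrypoint
# (if the entrypoint is dbt itself, drop the leading "dbt" from the command),
# and that your project and profiles.yml live at the paths mounted below.
docker pull xemuliam/dbt
docker run --rm -it \
  -v "$(pwd)":/usr/app \
  -v "$HOME/.dbt":/root/.dbt \
  -w /usr/app \
  xemuliam/dbt \
  dbt compile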

@github-actions
Contributor

This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please remove the stale label or comment on the issue, or it will be closed in 7 days.
