Resolve can be slow for specific packages #2134

Closed · 2 tasks done
fecet opened this issue Sep 25, 2024 · 5 comments
Labels
conda (Issue related to Conda dependencies) · ⏩ performance (An issue related to performance)

Comments

fecet commented Sep 25, 2024

Checks

  • I have checked that this issue has not already been reported.

  • I have confirmed this bug exists on the latest version of pixi, using pixi --version.

Reproducible example

name: default
channels:
- conda-forge
- nodefaults
dependencies:
- cmake *
- compilers *
- gcc ==11.4
- pyspark *

Save this as environment.yml, then solve it with:

pixi init --import environment.yml
pixi project export conda-explicit-spec . -vvvv

Issue description

Solve takes a long time (90s on my machine); as a comparison, conda-lock takes 55s.

Expected behavior

The solve should not be this slow, or should at least be as fast as conda-lock.

fecet commented Sep 25, 2024

@synapticarbors (from #1873): this may be related to mamba-org/resolvo#49.

@ruben-arts (Contributor)

Thanks for the information! That change was merged some time ago and produced better results in some cases, but we're now seeing edge cases where solving got much slower.

FYI @baszalmstra

Tip

While we're working on improving the speed, you can always add more constrained requirements to help the solver avoid unwanted paths in its traversal of the solution space. A good constraint to always add is a minimal version of Python, and in this specific case you can probably also limit cmake.
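
For example, a sketch of the environment.yml above with tighter bounds (the specific versions below are illustrative, not from the original report):

name: default
channels:
- conda-forge
- nodefaults
dependencies:
- python >=3.10   # illustrative lower bound: pin the oldest Python you actually need
- cmake >=3.27    # illustrative: limiting cmake narrows the candidates the solver has to consider
- compilers *
- gcc ==11.4
- pyspark *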

ruben-arts added the ⏩ performance and conda labels on Sep 25, 2024
@lambdaxdotx

I can confirm that upgrading to 0.30.0 makes the conda resolving stage of pixi update way, way slower than before. It really takes ages.

Here is my pyproject.toml, edited a little to remove private info:

$ cat pyproject.toml 
[project]
authors = [...]
requires-python = ">= 3.10"

[build-system]
build-backend = "hatchling.build"
requires = ["hatchling"]

[tool.pixi.project]
channels = ["conda-forge", "nvidia", "pytorch", "pyg"]
platforms = ["linux-64"]

[tool.pixi.pypi-dependencies]

[tool.pixi.tasks]

[tool.pixi.dependencies]
python = ">=3.10,<3.11"
black = ">=24.8.0,<25"
tqdm = ">=4.64.1,<5"
gitpython = ">=3.1.27,<4"
matplotlib = ">=3.9.2,<4"
seaborn = ">=0.13.2,<0.14"
scikit-learn = ">=1.5.1,<2"
imbalanced-learn = ">=0.12.3,<0.13"
termcolor = ">=2.4.0,<3"
numpy = ">=1.23.2,<2"
pandas = ">=1.4.4,<2"
xgboost = ">=1.6.2,<1.7.0"
pre-commit = ">=3.8.0,<4"
tensorflow = ">=2.9.1,<3"
keras = ">=2.9.0,<3"
clang-11 = ">=11.1.0,<12"
pytorch = { version = ">=2.2.1,<2.3", channel = "pytorch" }
pyg = { version = ">=2.5.1,<2.6", channel = "pyg" }
pytorch-sparse = { version = ">=0.6.18,<0.7", channel = "pyg" }
pytorch-scatter = { version = ">=2.1.2,<2.2", channel = "pyg" }
pytorch-cluster = { version = ">=1.6.3,<1.7", channel = "pyg" }
huggingface_hub = ">=0.21.4,<0.22"
transformers = ">=4.40.1,<4.41"
wandb = ">=0.17.3,<0.18"

HTH

0xbe7a commented Sep 25, 2024

This takes 40 minutes to solve with 0.30.0:

[project]
channels = ["conda-forge"]
name = "slow-pixi-solve"
platforms = ["osx-arm64"]

[dependencies]
ipython = "*"
awscli = ">=2"

@ruben-arts (Contributor)

This was fixed in #2162. Thanks for providing examples, that helped a lot!
