diff --git a/latest b/latest
index 88b72b96e..76b2d23e8 120000
--- a/latest
+++ b/latest
@@ -1 +1 @@
-v0.39.2-prerelease.2
\ No newline at end of file
+v0.39.2
\ No newline at end of file
diff --git a/v0.39.2/404.html b/v0.39.2/404.html
new file mode 100644
index 000000000..34fb6ab60
--- /dev/null
+++ b/v0.39.2/404.html
@@ -0,0 +1,1535 @@
[generated 404 page markup omitted; visible content: "404 - Not found"]
diff --git a/v0.39.2/CHANGELOG/index.html b/v0.39.2/CHANGELOG/index.html
new file mode 100644
index 000000000..f8c72aa4b
--- /dev/null
+++ b/v0.39.2/CHANGELOG/index.html
@@ -0,0 +1,4996 @@
[generated page chrome omitted; rendered changelog content follows]

Changelog#

+

All notable changes to this project will be documented in this file.

+

The format is based on Keep a Changelog, +and this project adheres to Semantic Versioning.

+

[0.39.2] - 2024-12-11#

+

Patch release to fix the binary generation in CI.

+

[0.39.1] - 2024-12-09#

+

Added#

+
  • Add proper unit testing for PyPI installation and fix re-installation issues by @tdejager in #2617
  • Add detailed json output for task list by @jjjermiah in #2608
  • Add pixi project name CLI by @LiamConnors in #2649

Changed#

+
  • Use fs-err in more places by @Hofer-Julian in #2636

Documentation#

+
  • Remove tclf from community.md by @KarelZe in #2619
  • Update contributing guide by @LiamConnors in #2650
  • Update clean cache CLI doc by @LiamConnors in #2657

Fixed#

+
  • Color formatting detection on stdout by @blmaier in #2613
  • Use correct dependency location for pixi upgrade by @Hofer-Julian in #2472
  • Regression detached-environments not used by @ruben-arts in #2627
  • Allow configuring pypi insecure host by @zen-xu in #2521 #2622

Refactor#

+
  • Rework CI and use cargo-dist for releases by @baszalmstra in #2566

pixi build Preview work#

+
  • Refactor to [build-system.build-backend] by @baszalmstra in #2601
  • Remove ipc override from options and give it manually to test by @wolfv in #2629
  • Pixi build trigger rebuild by @Hofer-Julian in #2641
  • Add variant config to [workspace.build-variants] by @wolfv in #2634
  • Add request coalescing for isolated tools by @nichmor in #2589
  • Add example using rich and pixi-build-python and remove flask by @Hofer-Julian in #2638
  • (simple) build tool override by @wolfv in #2620
  • Add caching of build tool installation by @nichmor in #2637

New Contributors#

+
  • @blmaier made their first contribution in #2613

[0.39.0] - 2024-12-02#

+

✨ Highlights#

+
  • We now have a new concurrency configuration in the pixi.toml file. This allows you to set the number of concurrent solves or downloads that can run at the same time.
  • We changed the way pixi searches for a pixi manifest. Where it previously considered the activated pixi shell first, it now searches first in the current directory and its parent directories. more info
  • The lockfile format has changed to make it slightly smaller and to support source dependencies.
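The concurrency highlight above could look roughly like this in pixi.toml. This is a hedged sketch: the table and key names (concurrency, solves, downloads) are assumptions based on the description in the release notes, not confirmed here.

```toml
# Hypothetical sketch of the new concurrency configuration in pixi.toml.
# Key names are assumed from the release notes; check the pixi reference
# docs for the exact schema.
[concurrency]
solves = 2       # maximum number of environment solves run in parallel
downloads = 50   # maximum number of parallel downloads
```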

Added#

+
  • Add concurrency configuration by @ruben-arts in #2569

Changed#

+
  • Add XDG_CONFIG_HOME/.config to search of pixi global manifest path by @hoxbro in #2547
  • Let pixi global sync collect errors rather than returning early by @Hofer-Julian in #2586
  • Allow configuring pypi insecure host by @zen-xu in #2521
  • Reorder manifest discovery logic by @Hofer-Julian in #2564

Documentation#

+
  • Improve pixi manifest by @Hofer-Julian in #2596

Fixed#

+
  • pixi global list failing for empty environments by @Hofer-Julian in #2571
  • Macos activation cargo vars by @ruben-arts in #2578
  • Trampoline without corresponding json breaking by @Hofer-Julian in #2576
  • Ensure pinning strategy is not affected by non-semver packages by @seowalex in #2580
  • Pypi installs happening every time by @tdejager in #2587
  • pixi global report formatting by @Hofer-Julian in #2595
  • Improve test speed and support win-arm64 by @baszalmstra in #2597
  • Update Task::Alias to return command description by @jjjermiah in #2607

Refactor#

+
  • Split install pypi into module and files by @tdejager in #2590
  • PyPI installation traits + deduplication by @tdejager in #2599

Pixi build#

+

We've merged in the main pixi build feature branch. This is a big change, but it shouldn't affect any current functionality. If you notice any issues, please let us know.

+

It can be turned on by setting preview = "pixi-build" in your pixi.toml file. It's under heavy development, so expect breaking changes in that feature for now.

+
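A minimal sketch of turning the preview flag on, as described above. The project name, channels, and platforms are placeholders, and the exact placement of the preview key within the manifest is an assumption.

```toml
# pixi.toml — hypothetical minimal project enabling the pixi-build preview
[project]
name = "my-project"          # placeholder name
channels = ["conda-forge"]
platforms = ["linux-64"]
preview = "pixi-build"       # opt in to the preview feature, per the note above
```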
  • Preview of pixi build and workspaces by @tdejager in #2250
  • Build recipe yaml directly by @wolfv in #2568

New Contributors#

+
  • @seowalex made their first contribution in #2580

[0.38.0] - 2024-11-26#

+

✨ Highlights#

+
  • Specify pypi-index per pypi-dependency:

    [pypi-dependencies]
    pytorch = { version = "*", index = "https://download.pytorch.org/whl/cu118" }

  • [dependency-groups] (PEP 735) support in pyproject.toml:

    [dependency-groups]
    test = ["pytest"]
    docs = ["sphinx"]
    dev = [{include-group = "test"}, {include-group = "docs"}]

    [tool.pixi.environments]
    dev = ["dev"]

  • Much improved pixi search output!

Added#

+
  • Add pypi index by @nichmor in #2416
  • Implement PEP735 support by @olivier-lacroix in #2448
  • Extends manifest to allow for preview features by @tdejager in #2489
  • Add versions/build list to pixi search output by @delsner in #2440
  • Expose nested executables in pixi global by @bahugo in #2362

Fixed#

+
  • Always print a warning when config is invalid by @Hofer-Julian in #2508
  • Incorrectly saving absolute base as path component by @tdejager in #2501
  • Keep the case when getting the executable in pixi global by @wolfv in #2528
  • Install script on win-arm64 by @baszalmstra in #2538
  • Trampoline installation on pixi global update by @nichmor in #2530
  • Update the PATH env var with dynamic elements on pixi global by @wolfv in #2541
  • Correct ppc64le arch by @wolfv in #2540

Performance#

+
  • Experimental environment activation cache by @ruben-arts in #2367

Documentation#

+
  • Update project structure in Python tutorial by @LiamConnors in #2506
  • Fix typo in pixi project export conda-environment by @nmarticorena in #2533
  • Fix wrong use of underscores in pixi project export by @traversaro in #2539
  • Adapt completion instructions by @Hofer-Julian in #2561

New Contributors#

+
  • @nmarticorena made their first contribution in #2533
  • @delsner made their first contribution in #2440

[0.37.0] - 2024-11-18#

+

✨ Highlights#

+

We now allow the use of prefix.dev channels with sharded repodata:

+

Running pixi search rubin-env using hyperfine on the default versus our channels gives these results:

Cache Status | Channel                                | Mean [ms] | Relative
With cache   | https://prefix.dev/conda-forge         | 69.3      | 1.00
Without      | https://prefix.dev/conda-forge         | 389.5     | 5.62
With cache   | https://conda.anaconda.org/conda-forge | 1043.3    | 15.06
Without      | https://conda.anaconda.org/conda-forge | 2420.3    | 34.94

Breaking#

+
  • Make sure that [activation.env] is not completely overridden by [target] tables, by @hameerabbasi in #2396

Changed#

+
    +
  • Allow using sharded repodata by @baszalmstra in #2467
  • +
+

Documentation#

+
    +
  • Update ros2.md turtlesim section by @nbbrooks in #2442
  • +
  • Update pycharm.md to show optional installation by @plainerman in #2487
  • +
  • Fix typo in documentation by @saraedum in #2496
  • +
  • Update pixi install output by @LiamConnors in #2495
  • +
+

Fixed#

+
    +
  • Incorrect python version was used in some parts of the solve by @tdejager in #2481
  • +
  • Wrong description on pixi upgrade by @notPlancha in #2483
  • +
  • Extra test for mismatch in python versions by @tdejager in #2485
  • +
  • Keep build in pixi upgrade by @ruben-arts in #2476
  • +
+

New Contributors#

+
    +
  • @saraedum made their first contribution in #2496
  • +
  • @plainerman made their first contribution in #2487
  • +
  • @hameerabbasi made their first contribution in #2396
  • +
  • @nbbrooks made their first contribution in #2442
  • +
+

[0.36.0] - 2024-11-07#

+

✨ Highlights#

+
  • You can now pixi upgrade your project dependencies.
  • We've improved the performance of the prefix validation check, resulting in faster pixi run startup times.

Added#

+
    +
  • Add powerpc64le target to trampoline by @ruben-arts in #2419
  • +
  • Add trampoline tests again by @Hofer-Julian in #2420
  • +
  • Add pixi upgrade by @Hofer-Julian in #2368
  • +
  • Add platform fallback win-64 for win-arm64 by @chawyehsu in #2427
  • +
  • Add --prepend option for pixi project channel add by @mrswastik-robot in #2447
  • +
+

Documentation#

+
    +
  • Fix cli basic usage example by @lucascolley in #2432
  • +
  • Update python tutorial by @LiamConnors in #2452
  • +
  • Improve pixi global docs by @Hofer-Julian in #2437
  • +
+

Fixed#

+
    +
  • Use --silent instead of --no-progress-meter for old curl by @jaimergp in #2428
  • +
  • Search should return latest package across all platforms by @nichmor in #2424
  • +
  • Trampoline unwraps by @ruben-arts in #2422
  • +
  • PyPI Index usage (regression in v0.35.0) by @tdejager in #2465
  • +
  • PyPI git dependencies (regression in v0.35.0) by @wolfv in #2438
  • +
  • Tolerate pixi file errors (regression in v0.35.0) by @jvenant in #2457
  • +
  • Make sure tasks are fetched for best platform by @jjjermiah in #2446
  • +
+

Performance#

+
    +
  • Quick prefix validation check by @ruben-arts in #2400
  • +
+

New Contributors#

+
    +
  • @jvenant made their first contribution in #2457
  • +
  • @mrswastik-robot made their first contribution in #2447
  • +
  • @LiamConnors made their first contribution in #2452
  • +
+

[0.35.0] - 2024-11-05#

+

✨ Highlights#

+

The binaries exposed by pixi global are no longer scripts but actual executables, resulting in a significant speedup and better compatibility with other tools.

+

Added#

+
    +
  • Add language packages with minor pinning by default by @ruben-arts in #2310
  • +
  • Add grouping for exposing and removing by @nichmor in #2387
  • +
  • Add trampoline for pixi global by @Hofer-Julian and @nichmor in #2381
  • +
  • Adding SCM option for init command by @alvgaona in #2342
  • +
  • Create .pixi/.gitignore containing * by @maresb in #2361
  • +
+

Changed#

+
    +
  • Use the same package cache folder by @nichmor in #2335
  • +
  • Disable progress in non tty by @ruben-arts in #2308
  • +
  • Improve global install reporting by @Hofer-Julian in #2395
  • +
  • Suggest fix in platform error message by @maurosilber in #2404
  • +
  • Upgrading uv to 0.4.30 by @tdejager in #2372
  • +
+

Documentation#

+
    +
  • Add pybind11 example by @alvgaona in #2324
  • +
  • Replace build with uv in pybind11 example by @alvgaona in #2341
  • +
  • Fix incorrect statement about env location by @opcode81 in #2370
  • +
+

Fixed#

+
    +
  • Global update reporting by @Hofer-Julian in #2352
  • +
  • Correctly display unrequested environments on task list by @jjjermiah in #2402
  • +
+

Refactor#

+
    +
  • Use built in string methods by @KGrewal1 in #2348
  • +
  • Reorganize integration tests by @Hofer-Julian in #2408
  • +
  • Reimplement barrier cell on OnceLock by @KGrewal1 in #2347
  • +
+

New Contributors#

+
    +
  • @maurosilber made their first contribution in #2404
  • +
  • @opcode81 made their first contribution in #2370
  • +
  • @alvgaona made their first contribution in #2342
  • +
+

[0.34.0] - 2024-10-21#

+

✨ Highlights#

+
  • pixi global install now takes a flag --with, inspired by uv tool install. If you only want to add dependencies without exposing them, you can now run pixi global install ipython --with numpy --with matplotlib
  • Improved the output of pixi global subcommands
  • Many bug fixes

Added#

+
    +
  • Add timeouts by @Hofer-Julian in #2311
  • +
+

Changed#

+
  • Global update should add new executables by @nichmor in #2298
  • Add pixi global install --with by @Hofer-Julian in #2332
+

Documentation#

+
  • Document where pixi-global.toml can be found by @Hofer-Julian in #2304
  • Add ros noetic example by @ruben-arts in #2271
  • Add nichita and julian to CITATION.cff by @Hofer-Julian in #2327
  • Improve keyring documentation to use pixi global by @olivier-lacroix in #2318
+

Fixed#

+
  • pixi global upgrade-all error message by @Hofer-Julian in #2296
  • Select correct run environment by @ruben-arts in #2301
  • Adapt channels to work with newest rattler-build version by @Hofer-Julian in #2306
  • Hide obsolete commands in help page of pixi global by @chawyehsu in #2320
  • Typecheck all tests by @Hofer-Julian in #2328
+

Refactor#

+
    +
  • Improve upload errors by @ruben-arts in #2303
  • +
+

New Contributors#

+
    +
  • @gerlero made their first contribution in #2300
  • +
+

[0.33.0] - 2024-10-16#

+

✨ Highlights#

+

This is the first release with the new pixi global implementation. It's a full reimplementation of pixi global where it now uses a manifest file just like pixi projects. This way you can declare your environments and save them to a VCS.

+

It also brings features like adding dependencies to a global environment, and exposing multiple binaries from the same environment that are not part of the main installed packages.

+

Test it out with:

# Normal feature
pixi global install ipython

# New features
pixi global install \
    --environment science \           # Define the environment name
    --expose scipython=ipython \      # Expose binaries under custom names
    ipython scipy                     # Define multiple dependencies for one environment

+

This should result in a manifest in $HOME/.pixi/manifests/pixi-global.toml:

version = 1

[envs.ipython]
channels = ["conda-forge"]
dependencies = { ipython = "*" }
exposed = { ipython = "ipython", ipython3 = "ipython3" }

[envs.science]
channels = ["conda-forge"]
dependencies = { ipython = "*", scipy = "*" }
exposed = { scipython = "ipython" }

+

📖 Documentation#

+

Check out the updated documentation on this new feature:
- Main documentation on this tag: https://pixi.sh/v0.33.0/
- Global CLI documentation: https://pixi.sh/v0.33.0/reference/cli/#global
- The implementation documentation: https://pixi.sh/v0.33.0/features/global_tools/
- The initial design proposal: https://pixi.sh/v0.33.0/design_proposals/pixi_global_manifest/

+

[0.32.2] - 2024-10-16#

+

✨ Highlights#

+
  • pixi self-update will only work on the binaries from the GitHub releases, avoiding accidentally breaking the installation.
  • We now support gcs:// conda registries.
  • No more broken PowerShell after using pixi shell.
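A sketch of what using a gcs:// registry might look like. Only the gcs:// URL scheme support is grounded in the note above; the bucket path, project name, and channel layout here are entirely hypothetical.

```toml
# pixi.toml — hypothetical channel list using a gcs:// conda registry
[project]
name = "my-project"                                       # placeholder
channels = ["gcs://my-bucket/my-channel", "conda-forge"]  # bucket path is made up
platforms = ["linux-64"]
```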

Changed#

+
    +
  • Add support for gcs:// conda registries by @clement-chaneching in #2263
  • +
+

Documentation#

+
    +
  • Small fixes in tutorials/python.md by @carschandler in #2252
  • +
  • Update pixi list docs by @Hofer-Julian in #2269
  • +
+

Fixed#

+
    +
  • Bind ctrl c listener so that it doesn't interfere on powershell by @wolfv in #2260
  • +
  • Explicitly run default environment by @ruben-arts in #2273
  • +
  • Parse env name on adding by @ruben-arts in #2279
  • +
+

Refactor#

+
    +
  • Make self-update a compile time feature by @freundTech in #2213
  • +
+

New Contributors#

+
    +
  • @clement-chaneching made their first contribution in #2263
  • +
  • @freundTech made their first contribution in #2213
  • +
+

[0.32.1] - 2024-10-08#

+

Fixes#

+
    +
  • Bump Rust version to 1.81 by @wolfv in #2227
  • +
+

Documentation#

+
    +
  • Pixi-pack, docker, devcontainer by @pavelzw in #2220
  • +
+

[0.32.0] - 2024-10-08#

+

✨ Highlights#

+

The biggest fix in this PR is the move to the latest rattler as it came with some major bug fixes for macOS and Rust 1.81 compatibility.

+

Changed#

+
    +
  • Correctly implement total ordering for dependency provider by @tdejager in rattler/#892
  • +
+

Fixed#

+
    +
  • Fixed self-clobber issue when up/down grading packages by @wolfv in rattler/#893
  • +
  • Check environment name before returning not found print by @ruben-arts in #2198
  • +
  • Turn off symlink follow for task cache by @ruben-arts in #2209
  • +
+

[0.31.0] - 2024-10-03#

+

✨ Highlights#

+

Thanks to our maintainer @baszalmstra! He sped up the resolver for all cases we could think of in #2162. Check out the solve times for the environments in our test set in the linked PR.

+

Added#

+
    +
  • Add nodefaults to imported conda envs by @ruben-arts in #2097
  • +
  • Add newline to .gitignore by @ruben-arts in #2095
  • +
  • Add --no-activation option to prevent env activation during global install/upgrade by @183amir in #1980
  • +
  • Add --priority arg to project channel add by @minrk in #2086
  • +
+

Changed#

+
    +
  • Use pixi spec for conda environment yml by @ruben-arts in #2096
  • +
  • Update rattler by @nichmor in #2120
  • +
  • Update README.md by @ruben-arts in #2129
  • +
  • Follow symlinks while walking files by @0xbe7a in #2141
  • +
+

Documentation#

+
    +
  • Adapt wording in pixi global proposal by @Hofer-Julian in #2098
  • +
  • Community: add array-api-extra by @lucascolley in #2107
  • +
  • pixi global mention no-activation by @Hofer-Julian in #2109
  • +
  • Add minimal constructor example by @bollwyvl in #2102
  • +
  • Update global manifest install by @Hofer-Julian in #2128
  • +
  • Add description for pixi update --json by @scottamain in #2160
  • +
  • Fixes backticks for doc strings by @rachfop in #2174
  • +
+

Fixed#

+
    +
  • Sort exported conda explicit spec topologically by @synapticarbors in #2101
  • +
  • --import env_file breaks channel priority by @fecet in #2113
  • +
  • Allow exact yanked pypi packages by @nichmor in #2116
  • +
  • Check if files are same in self-update by @apoorvkh in #2132
  • +
  • get_or_insert_nested_table by @Hofer-Julian in #2167
  • +
  • Improve install.sh PATH handling and general robustness by @Arcitec in #2189
  • +
  • Output tasks on pixi run without input by @ruben-arts in #2193
  • +
+

Performance#

+
    +
  • Significantly speed up conda resolution by @baszalmstra in #2162
  • +
+

New Contributors#

+
    +
  • @Arcitec made their first contribution in #2189
  • +
  • @rachfop made their first contribution in #2174
  • +
  • @scottamain made their first contribution in #2160
  • +
  • @apoorvkh made their first contribution in #2132
  • +
  • @0xbe7a made their first contribution in #2141
  • +
  • @fecet made their first contribution in #2113
  • +
  • @minrk made their first contribution in #2086
  • +
  • @183amir made their first contribution in #1980
  • +
  • @lucascolley made their first contribution in #2107
  • +
+

[0.30.0] - 2024-09-19#

+

✨ Highlights#

+

I want to thank @synapticarbors and @abkfenris for starting the work on pixi project export. Pixi now supports the export of a conda environment.yml file and a conda explicit specification file. This is a great addition to the project and will help users share their projects with non-pixi users.

+

Added#

+
    +
  • Export conda explicit specification file from project by @synapticarbors in #1873
  • +
  • Add flag to pixi search by @Hofer-Julian in #2018
  • +
  • Adds the ability to set the index strategy by @tdejager in #1986
  • +
  • Export conda environment.yml by @abkfenris in #2003
  • +
+

Changed#

+
    +
  • Improve examples/docker by @jennydaman in #1965
  • +
  • Minimal pre-commit tasks by @Hofer-Julian in #1984
  • +
  • Improve error and feedback when target does not exist by @tdejager in #1961
  • +
  • Move the rectangle using a mouse in SDL by @certik in #2069
  • +
+

Documentation#

+
    +
  • Update cli.md by @xela-95 in #2047
  • +
  • Update system-requirements information by @ruben-arts in #2079
  • +
  • Append to file syntax in task docs by @nicornk in #2013
  • +
  • Change documentation of pixi upload to refer to correct API endpoint by @traversaro in #2074
  • +
+

Testing#

+
    +
  • Add downstream nerfstudio test by @tdejager in #1996
  • +
  • Run pytests in parallel by @tdejager in #2027
  • +
  • Testing common wheels by @tdejager in #2031
  • +
+

Fixed#

+
    +
  • Lock file is always outdated for pypi path dependencies by @nichmor in #2039
  • +
  • Fix error message for export conda explicit spec by @synapticarbors in #2048
  • +
  • Use conda-pypi-map for feature channels by @nichmor in #2038
  • +
  • Constrain feature platforms in schema by @bollwyvl in #2055
  • +
  • Split tag creation functions by @tdejager in #2062
  • +
  • Tree print to pipe by @ruben-arts in #2064
  • +
  • subdirectory in pypi url by @ruben-arts in #2065
  • +
  • Create a GUI application on Windows, not Console by @certik in #2067
  • +
  • Make dashes underscores in python package names by @ruben-arts in #2073
  • +
  • Give better errors on broken pyproject.toml by @ruben-arts in #2075
  • +
+

Refactor#

+
    +
  • Stop duplicating strip_channel_alias from rattler by @Hofer-Julian in #2017
  • +
  • Follow-up wheels tests by @Hofer-Julian in #2063
  • +
  • Integration test suite by @Hofer-Julian in #2081
  • +
  • Remove psutils by @Hofer-Julian in #2083
  • +
  • Add back older caching method by @tdejager in #2046
  • +
  • Release script by @Hofer-Julian in #1978
  • +
  • Activation script by @Hofer-Julian in #2014
  • +
  • Pins python version in add_pypi_functionality by @tdejager in #2040
  • +
  • Improve the lock_file_usage flags and behavior. by @ruben-arts in #2078
  • +
  • Move matrix to workflow that it is used in by @tdejager in #1987
  • +
  • Refactor manifest into more generic approach by @nichmor in #2015
  • +
+

New Contributors#

+
    +
  • @certik made their first contribution in #2069
  • +
  • @xela-95 made their first contribution in #2047
  • +
  • @nicornk made their first contribution in #2013
  • +
  • @jennydaman made their first contribution in #1965
  • +
+

[0.29.0] - 2024-09-04#

+

✨ Highlights#

+
  • Add build-isolation options, for more details check out our docs
  • Allow using virtual package overrides from environment variables (PR)
  • Many bug fixes
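A hedged sketch of the build-isolation option mentioned above. The table and key name ([pypi-options], no-build-isolation) are assumptions about the manifest shape, and the package name is just an example; see the linked docs for the real schema.

```toml
# pixi.toml — assumed shape of the build-isolation option
[pypi-options]
# packages that should be built without build isolation (example name)
no-build-isolation = ["some-sdist-package"]
```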

Added#

+
  • Add build-isolation options by @tdejager in #1909
  • Add release script by @Hofer-Julian in #1971
+

Changed#

+
  • Use rustls-tls instead of native-tls per default by @Hofer-Julian in #1929
  • Upgrade to uv 0.3.4 by @tdejager in #1936
  • Upgrade to uv 0.4.0 by @tdejager in #1944
  • Better error for when the target or platform are missing by @tdejager in #1959
  • Improve integration tests by @Hofer-Julian in #1958
  • Improve release script by @Hofer-Julian in #1974
+

Fixed#

+
  • Update env variables in installation docs by @lev112 in #1937
  • Always overwrite when pixi adding the dependency by @ruben-arts in #1935
  • Typo in schema.json by @SobhanMP in #1948
  • Using file url as mapping by @nichmor in #1930
  • Offline mapping should not request by @nichmor in #1968
  • pixi init for pyproject.toml by @Hofer-Julian in #1947
  • Use two in memory indexes, for resolve and builds by @tdejager in #1969
  • Minor issues and todos by @KGrewal1 in #1963
+

Refactor#

+
    +
  • Improve integration tests by @Hofer-Julian in #1942
  • +
+

New Contributors#

+
    +
  • @SobhanMP made their first contribution in #1948
  • +
  • @lev112 made their first contribution in #1937
  • +
+

[0.28.2] - 2024-08-28#

+

Changed#

+
    +
  • Use mold on linux by @Hofer-Julian in #1914
  • +
+

Documentation#

+
    +
  • Fix global manifest by @Hofer-Julian in #1912
  • +
  • Document azure keyring usage by @tdejager in #1913
  • +
+

Fixed#

+
    +
  • Let init add dependencies independent of target and don't install by @ruben-arts in #1916
  • +
  • Enable use of manylinux wheel tags once again by @tdejager in #1925
  • +
  • The bigger runner by @ruben-arts in #1902
  • +
+

[0.28.1] - 2024-08-26#

+

Changed#

+
    +
  • Uv upgrade to 0.3.2 by @tdejager in #1900
  • +
+

Documentation#

+
    +
  • Add keyrings.artifacts to the list of project built with pixi by @jslorrma in #1908
  • +
+

Fixed#

+
    +
  • Use default indexes if none were given by the lockfile by @ruben-arts in #1910
  • +
+

New Contributors#

+
    +
  • @jslorrma made their first contribution in #1908
  • +
+

[0.28.0] - 2024-08-22#

+

✨ Highlights#

+
  • Bug Fixes: Major fixes in general, especially for PyPI installation issues, plus better error messaging.
  • Compatibility: Default Linux version downgraded to 4.18 for broader support.
  • New Features: Added INIT_CWD in pixi run, improved logging, and more cache options.
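The INIT_CWD highlight means a task can read the directory pixi run was invoked from. A hypothetical task using it (the task name is made up; only the INIT_CWD variable comes from the release notes):

```toml
# pixi.toml — hypothetical task reading INIT_CWD, which pixi run sets to the
# directory the command was invoked from
[tasks]
show-invoke-dir = "echo $INIT_CWD"
```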

Added#

+
    +
  • Add INIT_CWD to activated env pixi run by @ruben-arts in #1798
  • +
  • Add context to error when parsing conda-meta files by @baszalmstra in #1854
  • +
  • Add some logging for when packages are actually overridden by conda by @tdejager in #1874
  • +
  • Add package when extra is added by @ruben-arts in #1856
  • +
+

Changed#

+
    +
  • Use new gateway to get the repodata for global install by @nichmor in #1767
  • +
  • Pixi global proposal by @Hofer-Julian in #1757
  • +
  • Upgrade to new uv 0.2.37 by @tdejager in #1829
  • +
  • Use new gateway for pixi search by @nichmor in #1819
  • +
  • Extend pixi clean cache with more cache options by @ruben-arts in #1872
  • +
  • Downgrade __linux default to 4.18 by @ruben-arts in #1887
  • +
+

Documentation#

+
    +
  • Fix instructions for update github actions by @Hofer-Julian in #1774
  • +
  • Fix fish completion script by @dennis-wey in #1789
  • +
  • Expands the environment variable examples in the reference section by @travishathaway in #1779
  • +
  • Community feedback pixi global by @Hofer-Julian in #1800
  • +
  • Additions to the pixi global proposal by @Hofer-Julian in #1803
  • +
  • Stop using invalid environment name in pixi global proposal by @Hofer-Julian in #1826
  • +
  • Extend pixi global proposal by @Hofer-Julian in #1861
  • +
  • Make channels required in pixi global manifest by @Hofer-Julian in #1868
  • +
  • Fix linux minimum version in project_configuration docs by @traversaro in #1888
  • +
+

Fixed#

+
    +
  • Try to increase rlimit by @baszalmstra in #1766
  • +
  • Add test for invalid environment names by @Hofer-Julian in #1825
  • +
  • Show global config in info command by @ruben-arts in #1807
  • +
  • Correct documentation of PIXI_ENVIRONMENT_PLATFORMS by @traversaro in #1842
  • +
  • Format in docs/features/environment.md by @cdeil in #1846
  • +
  • Make proper use of NamedChannelOrUrl by @ruben-arts in #1820
  • +
  • Trait impl override by @baszalmstra in #1848
  • +
  • Tame pixi search by @baszalmstra in #1849
  • +
  • Fix pixi tree -i duplicate output by @baszalmstra in #1847
  • +
  • Improve spec parsing error messages by @baszalmstra in #1786
  • +
  • Parse matchspec from CLI Lenient by @baszalmstra in #1852
  • +
  • Improve parsing of pypi-dependencies by @baszalmstra in #1851
  • +
  • Don't enforce system requirements for task tests by @baszalmstra in #1855
  • +
  • Satisfy when there are no pypi packages in the lockfile by @ruben-arts in #1862
  • +
  • Ssh url should not contain colon by @baszalmstra in #1865
  • +
  • find-links with manifest-path by @baszalmstra in #1864
  • +
  • Increase stack size in debug mode on windows by @baszalmstra in #1867
  • +
  • Solve-group-envs should reside in .pixi folder by @baszalmstra in #1866
  • +
  • Move package-override logging by @tdejager in #1883
  • +
  • Pinning logic for minor and major by @baszalmstra in #1885
  • +
  • Docs manifest tests by @ruben-arts in #1879
  • +
+

Refactor#

+
    +
  • Encapsulate channel resolution logic for CLI by @olivier-lacroix in #1781
  • +
  • Move to pub(crate) fn in order to detect and remove unused functions by @Hofer-Julian in #1805
  • +
  • Only compile TaskNode::full_command for tests by @Hofer-Julian in #1809
  • +
  • Derive Default for more structs by @Hofer-Julian in #1824
  • +
  • Rename get_up_to_date_prefix to update_prefix by @Hofer-Julian in #1837
  • +
  • Make HasSpecs implementation more functional by @Hofer-Julian in #1863
  • +
+

New Contributors#

+
    +
  • @cdeil made their first contribution in #1846
  • +
+

[0.27.1] - 2024-08-09#

+

Documentation#

+
    +
  • Fix mlx feature in "multiple machines" example by @rgommers in #1762
  • +
  • Update some of the cli and add osx rosetta mention by @ruben-arts in #1760
  • +
  • Fix typo by @pavelzw in #1771
  • +
+

Fixed#

+
    +
  • User agent string was wrong by @wolfv in #1759
  • +
  • Dont accidentally wipe pyproject.toml on init by @ruben-arts in #1775
  • +
+

Refactor#

+
    +
  • Add pixi_spec crate by @baszalmstra in #1741
  • +
+

New Contributors#

+
    +
  • @rgommers made their first contribution in #1762
  • +
+

[0.27.0] - 2024-08-07#

+

✨ Highlights#

+

This release contains a lot of refactoring and improvements to the codebase, in preparation for future features and improvements. Along with that, we've fixed a ton of bugs. To make sure we're not breaking anything, we've added a lot of tests and CI checks. But let us know if you find any issues!

+

As a reminder, you can update pixi using pixi self-update and move to a specific version, including backwards, with pixi self-update --version 0.27.0.

+

Added#

+
    +
  • Add pixi run completion for fish shell by @dennis-wey in #1680
  • +
+

Changed#

+
    +
  • Move examples from setuptools to hatchling by @Hofer-Julian in #1692
  • +
  • Let pixi init create hatchling pyproject.toml by @Hofer-Julian in #1693
  • +
  • Make [project] table optional for pyproject.toml manifests by @olivier-lacroix in #1732
  • +
+

Documentation#

+
    +
  • Improve the fish completions location by @tdejager in #1647
  • +
  • Explain why we use hatchling by @Hofer-Julian
  • +
  • Update install CLI doc now that the update command exist by @olivier-lacroix in #1690
  • +
  • Mention pixi exec in GHA docs by @pavelzw in #1724
  • +
  • Update to correct spelling by @ahnsn in #1730
  • +
  • Ensure hatchling is used everywhere in documentation by @olivier-lacroix in #1733
  • +
  • Add readme to WASM example by @wolfv in #1703
  • +
  • Fix typo by @pavelzw in #1660
  • +
  • Fix typo by @DimitriPapadopoulos in #1743
  • +
  • Fix typo by @SeaOtocinclus in #1651
  • +
+

Testing#

+
    +
  • Added script and tasks for testing examples by @tdejager in #1671
  • +
  • Add simple integration tests by @ruben-arts in #1719
  • +
+

Fixed#

+
    +
  • Prepend pixi to path instead of appending by @vigneshmanick in #1644
  • +
  • Add manifest tests and run them in ci by @ruben-arts in #1667
  • +
  • Use hashed pypi mapping by @baszalmstra in #1663
  • +
  • Depend on pep440_rs from crates.io and use replace by @baszalmstra in #1698
  • +
  • pixi add with more than just package name and version by @ruben-arts in #1704
  • +
  • Ignore pypi logic on non pypi projects by @ruben-arts in #1705
  • +
  • Fix and refactor --no-lockfile-update by @ruben-arts in #1683
  • +
  • Changed example to use hatchling by @tdejager in #1729
  • +
  • Todo clean up by @KGrewal1 in #1735
  • +
  • Allow for init to pixi.toml when pyproject.toml is available. by @ruben-arts in #1640
  • +
  • Test on macos-13 by @ruben-arts in #1739
  • +
  • Make sure pixi vars are available before activation.env vars are by @ruben-arts in #1740
  • +
  • Authenticate exec package download by @olivier-lacroix in #1751
  • +
+

Refactor#

+
    +
  • Extract pixi_manifest by @baszalmstra in #1656
  • +
  • Delay channel config url evaluation by @baszalmstra in #1662
  • +
  • Split out pty functionality by @tdejager in #1678
  • +
  • Make project manifest loading DRY and consistent by @olivier-lacroix in #1688
  • +
  • Refactor channel add and remove CLI commands by @olivier-lacroix in #1689
  • +
  • Refactor pixi::consts and pixi::config into separate crates by @tdejager in #1684
  • +
  • Move dependencies to pixi_manifest by @tdejager in #1700
  • +
  • Moved pypi environment modifiers by @tdejager in #1699
  • +
  • Split HasFeatures by @tdejager in #1712
  • +
  • Move, splits and renames the HasFeatures trait by @tdejager in #1717
  • +
  • Merge utils by @tdejager in #1718
  • +
  • Move fancy to its own crate by @tdejager in #1722
  • +
  • Move config to repodata functions by @tdejager in #1723
  • +
  • Move pypi-mapping to its own crate by @tdejager in #1725
  • +
  • Split utils into 2 crates by @tdejager in #1736
  • +
  • Add progress bar as a crate by @nichmor in #1727
  • +
  • Split up pixi_manifest lib by @tdejager in #1661
  • +
+

New Contributors#

+
    +
  • @DimitriPapadopoulos made their first contribution in #1743
  • +
  • @KGrewal1 made their first contribution in #1735
  • +
  • @ahnsn made their first contribution in #1730
  • +
  • @dennis-wey made their first contribution in #1680
  • +
+

## [0.26.1] - 2024-07-22

### Fixed

- Make sure we also build the msi installer by @ruben-arts in #1645

## [0.26.0] - 2024-07-19

### ✨ Highlights

- Specify how pixi pins your dependencies with the `pinning-strategy` in the config (e.g. `semver` -> `>=1.2.3,<2` and `no-pin` -> `*`). #1516
- Specify how pixi solves multiple channels with `channel-priority` in the manifest. #1631
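A hedged sketch of how these two options might look on disk, with key names and placement assumed from the linked PRs (#1516, #1631); check the pixi reference docs for the exact spelling:

```toml
# Global config (e.g. ~/.pixi/config.toml), key name assumed from #1516.
# "semver" turns 1.2.3 into >=1.2.3,<2 on `pixi add`; "no-pin" uses *.
pinning-strategy = "semver"

# Project manifest (pixi.toml), placement under [project] assumed from #1631.
[project]
channel-priority = "strict"
```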

### Added

- Add short options to config location flags by @ruben-arts in #1586
- Add a file guard to indicate if an environment is being installed by @baszalmstra in #1593
- Add `pinning-strategy` to the configuration by @ruben-arts in #1516
- Add `channel-priority` to the manifest and solve by @ruben-arts in #1631
- Add nushell completion by @Hofer-Julian in #1599
- Add nushell completions for `pixi run` by @Hofer-Julian in #1627
- Add completion for `pixi run --environment` for nushell by @Hofer-Julian in #1636

### Changed

- Upgrade uv 0.2.18 by @tdejager in #1540
- Refactor pyproject.toml parser by @nichmor in #1592
- Interactive warning for packages in `pixi global install` by @ruben-arts in #1626

### Documentation

- Add WASM example with JupyterLite by @wolfv in #1623
- Added LLM example by @ytjhai in #1545
- Add note to mark directory as excluded in pixi-pycharm by @pavelzw in #1579
- Add changelog to docs by @vigneshmanick in #1574
- Updated the values of the system requirements by @tdejager in #1575
- Tell cargo install which bin to install by @ruben-arts in #1584
- Update conflict docs for cargo add by @Hofer-Julian in #1600
- Revert "Update conflict docs for cargo add" by @Hofer-Julian in #1605
- Add reference documentation for the exec command by @baszalmstra in #1587
- Add transitioning docs for poetry and conda by @ruben-arts in #1624
- Add pixi-pack by @pavelzw in #1629
- Use '-' instead of '_' for package name by @olivier-lacroix in #1628

### Fixed

- Flaky task test by @tdejager in #1581
- Pass command line arguments verbatim by @baszalmstra in #1582
- Run clippy on all targets by @Hofer-Julian in #1588
- Pre-commit install pixi task by @Hofer-Julian in #1590
- Add `clap_complete_nushell` to dependencies by @Hofer-Julian in #1625
- Write to stdout for machine readable output by @Hofer-Julian in #1639

### Refactor

- Migrate to workspace by @baszalmstra in #1597

### Removed

- Remove double manifest warning by @tdejager in #1580

### New Contributors

- @ytjhai made their first contribution in #1545

## [0.25.0] - 2024-07-05

### ✨ Highlights

- `pixi exec` command: execute commands in temporary environments, useful for testing in short-lived sessions.
- We've bumped the default system-requirements to higher defaults: glibc (2.17 -> 2.28), osx-64 (10.15 -> 13.0), osx-arm64 (11.0 -> 13.0). Let us know if this causes any issues. To keep the previous values, use a `system-requirements` table; this is explained here.
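A minimal sketch of such a table in `pixi.toml`, assuming the manifest syntax for `system-requirements` and using the old default values quoted above:

```toml
[system-requirements]
# Restore the pre-0.25.0 defaults instead of the new, higher ones.
libc = { family = "glibc", version = "2.17" }
macos = "10.15"
```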

### Changed

- Bump system requirements by @wolfv in #1553
- Better error when exec is missing a cmd by @tdejager in #1565
- Make exec use authenticated client by @tdejager in #1568

### Documentation

- Automatic updating using github actions by @pavelzw in #1456
- Describe the `--change-ps1` option for `pixi shell` by @Yura52 in #1536
- Add some other quantco repos by @pavelzw in #1542
- Add example using geos-rs by @Hofer-Julian in #1563

### Fixed

- Tiny error in basic_usage.md by @Sjouks in #1513
- Lazy initialize client by @baszalmstra in #1511
- URL typos in rtd examples by @kklein in #1538
- Fix satisfiability for short sha hashes by @tdejager in #1530
- Wrong path passed to dynamic check by @tdejager in #1552
- Don't error if no task is available on platform by @hoxbro in #1550

### Refactor

- Add to use update code by @baszalmstra in #1508

### New Contributors

- @kklein made their first contribution in #1538
- @Yura52 made their first contribution in #1536
- @Sjouks made their first contribution in #1513

## [0.24.2] - 2024-06-14

### Documentation

- Add readthedocs examples by @bollwyvl in #1423
- Fix typo in project_configuration.md by @RaulPL in #1502

### Fixed

- Too many shell variables in activation of `pixi shell` by @ruben-arts in #1507

## [0.24.1] - 2024-06-12

### Fixed

- Replace http code %2b with + by @ruben-arts in #1500

## [0.24.0] - 2024-06-12

### ✨ Highlights

- You can now run in a more isolated environment on unix machines, using `pixi run --clean-env TASK_NAME`.
- You can now easily clean your environment with `pixi clean` or the cache with `pixi clean cache`.

### Added

- Add `pixi clean` command by @ruben-arts in #1325
- Add `--clean-env` flag to tasks and run command by @ruben-arts in #1395
- Add `description` field to task by @jjjermiah in #1479
- Add pixi file to the environment to add pixi specific details by @ruben-arts in #1495

### Changed

- Project environment cli by @baszalmstra in #1433
- Update task list console output by @vigneshmanick in #1443
- Upgrade uv by @tdejager in #1436
- Sort packages in `list_global_packages` by @dhirschfeld in #1458
- Added test for special chars wheel filename by @tdejager in #1454

### Documentation

- Improve multi env tasks documentation by @ruben-arts in #1494

### Fixed

- Use the activated environment when running a task by @tdejager in #1461
- Fix authentication pypi-deps for download from lockfile by @tdejager in #1460
- Display channels correctly in `pixi info` by @ruben-arts in #1459
- Render help for `--frozen` by @ruben-arts in #1468
- Don't record purl for non conda-forge channels by @nichmor in #1451
- Use `best_platform` to verify the run platform by @ruben-arts in #1472
- Creation of parent dir of symlink by @ruben-arts in #1483
- `pixi install --all` output missing newline by @vigneshmanick in #1487
- Don't error on already existing dependency by @ruben-arts in #1449
- Remove debug true in release by @ruben-arts in #1477

### New Contributors

- @dhirschfeld made their first contribution in #1458

Full commit history

## [0.23.0] - 2024-05-27

### ✨ Highlights

- This release adds two new commands: `pixi config` and `pixi update`
  - `pixi config` allows you to edit, set, unset, append, prepend and list your local/global or system configuration.
  - `pixi update` re-solves the full lockfile, or use `pixi update PACKAGE` to only update `PACKAGE`, making sure your project is using the latest versions that the manifest allows for.

### Added

- Add `pixi config` command by @chawyehsu in #1339
- Add `pixi list --explicit` flag command by @jjjermiah in #1403
- Add `[activation.env]` table for environment variables by @ruben-arts in #1156
- Allow installing multiple envs, including `--all` at once by @tdejager in #1413
- Add `pixi update` command to re-solve the lockfile by @baszalmstra in #1431 (fixes 20 :thumbsup:)
- Add `detached-environments` to the config, move environments outside the project folder by @ruben-arts in #1381 (fixes 11 :thumbsup:)
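The `[activation.env]` table from #1156 sets environment variables whenever the environment is activated; a minimal sketch (the variable names are illustrative):

```toml
[activation.env]
MY_VAR = "value"
LOG_LEVEL = "debug"
```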

### Changed

- Use the gateway to fetch repodata by @baszalmstra in #1307
- Switch to compressed mapping by @nichmor in #1335
- Warn on pypi conda clobbering by @nichmor in #1353
- Align remove arguments with add by @olivier-lacroix in #1406
- Add backward compat logic for older lock files by @nichmor in #1425

### Documentation

- Fix small screen by removing getting started section by @ruben-arts in #1393
- Improve caching docs by @ruben-arts in #1422
- Add example, python library using gcp upload by @tdejager in #1380
- Correct typos with `--no-lockfile-update` by @tobiasraabe in #1396

### Fixed

- Trim channel url when filter `packages_for_prefix_mapping` by @zen-xu in #1391
- Use the right channels when upgrading global packages by @olivier-lacroix in #1326
- Fish prompt display looks wrong in tide by @tfriedel in #1424
- Use local mapping instead of remote by @nichmor in #1430

### Refactor

- Remove unused `fetch_sparse_repodata` by @olivier-lacroix in #1411
- Remove project level methods that are per environment by @olivier-lacroix in #1412
- Update lockfile functionality for reusability by @baszalmstra in #1426

### New Contributors

- @tfriedel made their first contribution in #1424
- @jjjermiah made their first contribution in #1403
- @tobiasraabe made their first contribution in #1396

Full commit history

## [0.22.0] - 2024-05-13

### ✨ Highlights

- Support for source PyPI dependencies through the CLI:
  - `pixi add --pypi 'package @ package.whl'`, perfect for adding just-built wheels to your environment in CI.
  - `pixi add --pypi 'package_from_git @ git+https://github.com/org/package.git'`, to add a package from a git repository.
  - `pixi add --pypi 'package_from_path @ file:///path/to/package' --editable`, to add a package from a local path.
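These CLI invocations end up as entries in the manifest's `[pypi-dependencies]` table; a hedged sketch of roughly what gets written (package names are placeholders and the exact field names may differ, so verify against the pixi reference docs):

```toml
[pypi-dependencies]
package = { path = "package.whl" }
package_from_git = { git = "https://github.com/org/package.git" }
package_from_path = { path = "/path/to/package", editable = true }
```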

### Added

- Implement more functions for `pixi add --pypi` by @wolfv in #1244

### Documentation

- Update install cli doc by @vigneshmanick in #1336
- Replace empty default example with no-default-feature by @beenje in #1352
- Document the add & remove cli behaviour with pyproject.toml manifest by @olivier-lacroix in #1338
- Add environment activation to GitHub actions docs by @pavelzw in #1371
- Clarify in CLI that run can also take commands by @twrightsman in #1368

### Fixed

- Automated update of install script in pixi.sh by @ruben-arts in #1351
- Wrong description on `pixi project` help by @notPlancha in #1358
- Don't need a python interpreter when not having pypi dependencies by @ruben-arts in #1366
- Don't error on not editable not path by @ruben-arts in #1365
- Align shell-hook cli with shell by @ruben-arts in #1364
- Only write prefix file if needed by @ruben-arts in #1363

### Refactor

- Lock-file resolve functionality in separated modules by @tdejager in #1337
- Use generic for `RepoDataRecordsByName` and `PypiRecordsByName` by @olivier-lacroix in #1341

### New Contributors

- @twrightsman made their first contribution in #1368
- @notPlancha made their first contribution in #1358
- @vigneshmanick made their first contribution in #1336

Full commit history

## [0.21.1] - 2024-05-07

### Fixed

- Use read timeout, not global timeout by @wolfv in #1329
- Channel priority logic by @ruben-arts in #1332

Full commit history

## [0.21.0] - 2024-05-06

### ✨ Highlights

- This release adds support for configuring PyPI settings globally, to use alternative PyPI indexes and load credentials with keyring.
- We now support cross-platform running, for osx-64 on osx-arm64 and wasm environments.
- There is now a `no-default-feature` option to simplify usage of environments.
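A hedged sketch of such a global PyPI configuration in pixi's config file; the key names here are assumptions, so consult the pixi configuration reference for the exact spelling:

```toml
[pypi-config]
index-url = "https://pypi.org/simple"
extra-index-urls = ["https://internal.example.com/simple"]
keyring-provider = "subprocess"
```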

### Added

- Add pypi config for global local config file + keyring support by @wolfv in #1279
- Allow for cross-platform running, for osx-64 on osx-arm64 and wasm environments by @wolfv in #1020

### Changed

- Add `no-default-feature` option to environments by @olivier-lacroix in #1092
- Add `/etc/pixi/config.toml` to global configuration search paths by @pavelzw in #1304
- Change global config fields to kebab-case by @tdejager in #1308
- Show all available tasks with task list by @Hoxbro in #1286
- Allow to emit activation environment variables as JSON by @borchero in #1317
- Use locked pypi packages as preferences in the pypi solve to get minimally updating lock files by @ruben-arts in #1320
- Allow to upgrade several global packages at once by @olivier-lacroix in #1324

### Documentation

- Typo in tutorials python by @carschandler in #1297
- Python Tutorial: Dependencies, PyPI, Order, Grammar by @JesperDramsch in #1313

### Fixed

- Schema version and add it to tbump by @ruben-arts in #1284
- Make integration test fail in ci and fix ssh issue by @ruben-arts in #1301
- Automate adding install scripts to the docs by @ruben-arts in #1302
- Do not always request for prefix mapping by @nichmor in #1300
- Align CLI aliases and add missing by @ruben-arts in #1316
- Alias `depends_on` to `depends-on` by @ruben-arts in #1310
- Add error if channel or platform doesn't exist on remove by @ruben-arts in #1315
- Allow spec in `pixi q` instead of only name by @ruben-arts in #1314
- Remove dependency on sysroot for linux by @ruben-arts in #1319
- Fix linking symlink issue, by updating to the latest rattler by @baszalmstra in #1327

### Refactor

- Use `IndexSet` instead of `Vec` for collections of unique elements by @olivier-lacroix in #1289
- Use generics over `PyPiDependencies` and `CondaDependencies` by @olivier-lacroix in #1303

### New Contributors

- @borchero made their first contribution in #1317
- @JesperDramsch made their first contribution in #1313
- @Hoxbro made their first contribution in #1286
- @carschandler made their first contribution in #1297

Full commit history

## [0.20.1] - 2024-04-26

### ✨ Highlights

- Big improvements on the pypi-editable installs.

### Fixed

- Editable non-satisfiable by @baszalmstra in #1251
- Satisfiability with pypi extras by @baszalmstra in #1253
- Change global install activation script permission from 0o744 -> 0o755 by @zen-xu in #1250
- Avoid creating empty TOML tables by @olivier-lacroix in #1270
- Uses the special-case uv path handling for both built and source by @tdejager in #1263
- Modify test before attempting to write to .bash_profile in install.sh by @bruchim-cisco in #1267
- Parse 'default' properly as environment CLI argument by @olivier-lacroix in #1247
- Apply schema.json normalization, add to docs by @bollwyvl in #1265
- Improve absolute path satisfiability by @tdejager in #1252
- Improve parse deno error and make task a required field in the cli by @ruben-arts in #1260

### New Contributors

- @bollwyvl made their first contribution in #1265
- @bruchim-cisco made their first contribution in #1267
- @zen-xu made their first contribution in #1250

Full commit history

## [0.20.0] - 2024-04-19

### ✨ Highlights

- We now support env variables in the task definition. These can also be used as default values for parameters in your task, which you can overwrite with your shell's env variables, e.g. `task = { cmd = "task to run", env = { VAR="value1", PATH="my/path:$PATH" } }`
- We made a big effort on fixing issues and improving documentation!
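Written out in a manifest, the inline example above looks roughly like this (the task name `serve` and the values are illustrative):

```toml
[tasks]
# PATH references the shell's value, so the task env extends rather than
# replaces it; running `VAR=other pixi run serve` overrides the default.
serve = { cmd = "python -m http.server", env = { VAR = "value1", PATH = "my/path:$PATH" } }
```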

### Added

- Add `env` to the tasks to specify task-specific environment variables by @wolfv in https://github.com/prefix-dev/pixi/pull/972

### Changed

- Add `--pyproject` option to `pixi init` with a pyproject.toml by @olivier-lacroix in #1188
- Upgrade to new uv version 0.1.32 by @tdejager in #1208

### Documentation

- Document `pixi.lock` by @ruben-arts in #1209
- Document channel priority definition by @ruben-arts in #1234
- Add rust tutorial including openssl example by @ruben-arts in #1155
- Add python tutorial to documentation by @tdejager in #1179
- Add JupyterLab integration docs by @renan-r-santos in #1147
- Add Windows support for PyCharm integration by @pavelzw in #1192
- Setup_pixi for local pixi installation by @ytausch in #1181
- Update pypi docs by @Hofer-Julian in #1215
- Fix order of `--no-deps` when pip installing in editable mode by @glemaitre in #1220
- Fix frozen documentation by @ruben-arts in #1167

### Fixed

- Small typo in list cli by @tdejager in #1169
- Issue with invalid solve group by @baszalmstra in #1190
- Improve error on parsing lockfile by @ruben-arts in #1180
- Replace `_` with `-` when creating environments from features by @wolfv in #1203
- Prevent duplicate direct dependencies in tree by @abkfenris in #1184
- Use project root directory instead of `task.working_directory` for base dir when hashing by @wolfv in #1202
- Do not leak env vars from bat scripts in cmd.exe by @wolfv in #1205
- Make file globbing behave more as expected by @wolfv in #1204
- Fix for using `file://` in pyproject.toml dependencies by @tdejager in #1196
- Improve pypi version conversion in pyproject.toml dependencies by @wolfv in #1201
- Update to the latest rattler by @wolfv in #1235

### BREAKING

- `task = { cmd = "task to run", cwd = "folder", inputs = "input.txt", output = "output.txt"}`: where `input.txt` and `output.txt` were previously relative to `folder`, they are now relative to the project root. This changed in: #1202
- `task = { cmd = "task to run", inputs = "input.txt"}` previously searched for all `input.txt` files; now it only matches the ones in the project root. This changed in: #1204

### New Contributors

- @glemaitre made their first contribution in #1220

Full commit history

## [0.19.1] - 2024-04-11

### ✨ Highlights

This fixes the issue where pixi would generate broken environments/lockfiles when a mapping for a brand-new version of a package is missing.

### Changed

- Add fallback mechanism for missing mapping by @nichmor in #1166

Full commit history

## [0.19.0] - 2024-04-10

### ✨ Highlights

- This release adds a new `pixi tree` command to show the dependency tree of the project.
- Pixi now persists the manifest and environment when activating a shell, so you can use pixi as if you are in that folder while in the shell.

### Added

- `pixi tree` command to show dependency tree by @abkfenris in #1069
- Persistent shell manifests by @abkfenris in #1080
- Add to pypi in feature (`pixi add --feature test --pypi package`) by @ruben-arts in #1135
- Use new mapping by @nichmor in #888
- `--no-progress` to disable all progress bars by @baszalmstra in #1105
- Create a table if channel is specified (`pixi add conda-forge::rattler-build`) by @baszalmstra in #1079
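For the channel-prefixed `pixi add conda-forge::rattler-build` above, the manifest entry presumably becomes a table pinning the channel rather than a bare version string; a hedged sketch of the assumed shape:

```toml
[dependencies]
rattler-build = { version = "*", channel = "conda-forge" }
```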

### Changed

- Add the project itself as an editable dependency by @olivier-lacroix in #1084
- Get `tool.pixi.project.name` from `project.name` by @olivier-lacroix in #1112
- Create features and environments from extras by @olivier-lacroix in #1077
- PyPI support comes out of beta by @olivier-lacroix in #1120
- Enable to force `PIXI_ARCH` for pixi installation by @beenje in #1129
- Improve `tool.pixi.project` detection logic by @olivier-lacroix in #1127
- Add purls for packages if adding pypi dependencies by @nichmor in #1148
- Add env name if not default to tree and list commands by @ruben-arts in #1145

### Documentation

- Add MODFLOW 6 to community docs by @Hofer-Julian in #1125
- Addition of ros2 tutorial by @ruben-arts in #1116
- Improve install script docs by @ruben-arts in #1136
- More structured table of content by @tdejager in #1142

### Fixed

- Amend syntax in conda-meta/history to prevent `conda.history.History.parse()` error by @jaimergp in #1117
- Fix docker example and include pyproject.toml by @tdejager in #1121

### New Contributors

- @abkfenris made their first contribution in #1069
- @beenje made their first contribution in #1129
- @jaimergp made their first contribution in #1117

Full commit history

## [0.18.0] - 2024-04-02

### ✨ Highlights

- This release adds support for pyproject.toml; pixi now reads from the `[tool.pixi]` table.
- We now support editable PyPI dependencies, and PyPI source dependencies, including git, path, and url dependencies.

> [!TIP]
> These new features are part of the ongoing effort to make pixi more flexible, powerful, and comfortable for Python users. They are still in progress, so expect more improvements on these features soon. Please report any issues you encounter and follow our next releases!
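With pyproject.toml support, the pixi configuration lives under the `[tool.pixi]` table alongside the standard `[project]` metadata; a minimal hedged sketch (names and values are illustrative):

```toml
[project]
name = "my-project"
version = "0.1.0"

[tool.pixi.project]
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64"]

[tool.pixi.dependencies]
python = ">=3.11"
```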

### Added

- Support for pyproject.toml by @olivier-lacroix in #999
- Support for PyPI source dependencies by @tdejager in #985
- Support for editable PyPI dependencies by @tdejager in #1044

### Changed

- `XDG_CONFIG_HOME` and `XDG_CACHE_HOME` compliance by @chawyehsu in #1050
- Build pixi for windows arm by @baszalmstra in #1053
- Platform literals by @baszalmstra in #1054
- CLI docs: `--user` is actually `--username`
- Fixed error in auth example (CLI docs) by @ytausch in #1076

### Documentation

- Add lockfile update description in preparation for `pixi update` by @ruben-arts in #1073
- zsh may be used for installation on macOS by @pya in #1091
- Fix typo in `pixi auth` documentation by @ytausch in #1076
- Add rstudio to the IDE integration docs by @wolfv in #1144

### Fixed

- Test failure on riscv64 by @hack3ric in #1045
- Validation test was testing on a wrong pixi.toml by @ruben-arts in #1056
- `pixi list` shows path and editable by @baszalmstra in #1100
- Docs ci by @ruben-arts in #1074
- Add error for unsupported pypi dependencies by @baszalmstra in #1052
- Interactively delete environment when it was relocated by @baszalmstra in #1102
- Allow solving for different platforms by @baszalmstra in #1101
- Don't allow extra keys in pypi requirements by @baszalmstra in #1104
- Solve when moving dependency from conda to pypi by @baszalmstra in #1099

### New Contributors

- @pya made their first contribution in #1091
- @ytausch made their first contribution in #1076
- @hack3ric made their first contribution in #1045
- @olivier-lacroix made their first contribution in #999
- @henryiii made their first contribution in #1063

Full commit history

## [0.17.1] - 2024-03-21

### ✨ Highlights

A quick bug-fix release for `pixi list`.

### Documentation

- Fix typo by @pavelzw in #1028

### Fixed

- Remove the need for a python interpreter in `pixi list` by @baszalmstra in #1033

## [0.17.0] - 2024-03-19

### ✨ Highlights

- This release greatly improves `pixi global` commands, thanks to @chawyehsu!
- We now support global (or local) configuration for pixi's own behavior, including mirrors and OCI registries.
- We support channel mirrors for corporate environments!
- Faster task execution thanks to caching 🚀 Tasks that already executed successfully can be skipped based on the hash of the inputs and outputs.
- PyCharm and GitHub Actions integration thanks to @pavelzw. Read more about it in the docs!
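A hedged sketch of a channel mirror configuration in pixi's config file, assuming the mapping shape introduced in #988 (the mirror URLs are placeholders):

```toml
[mirrors]
# Try the corporate mirror (or an OCI registry) before the upstream channel.
"https://conda.anaconda.org/conda-forge" = [
    "https://my.mirror.example.com/conda-forge",
    "oci://ghcr.io/channel-mirrors/conda-forge",
]
```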

### Added

- Add citation file by @ruben-arts in #908
- Add a pixi badge by @ruben-arts in #961
- Add deserialization of pypi source dependencies from toml by @ruben-arts and @wolfv in #895 #984
- Implement mirror and OCI settings by @wolfv in #988
- Implement inputs and outputs hash based task skipping by @wolfv in #933

### Changed

- Refined global upgrade commands by @chawyehsu in #948
- Global upgrade supports matchspec by @chawyehsu in #962
- Improve `pixi search` with platform selection and making limit optional by @wolfv in #979
- Implement global config options by @wolfv in #960 #1015 #1019
- Update auth to use rattler cli by @kassoulait and @ruben-arts in #986

### Documentation

- Remove `cache: true` from setup-pixi by @pavelzw in #950
- Add GitHub Actions documentation by @pavelzw in #955
- Add PyCharm documentation by @pavelzw in #974
- Mention `watch_file` in direnv usage by @pavelzw in #983
- Add tip to help users when no PROFILE file exists by @ruben-arts in #991
- Move yaml comments into mkdocs annotations by @pavelzw in #1003
- Fix `--env` and extend actions examples by @ruben-arts in #1005
- Add Wflow to projects built with pixi by @Hofer-Julian in #1006
- Removed linenums to avoid buggy visualization by @ruben-arts in #1002
- Fix typos by @pavelzw in #1016

### Fixed

- Pypi dependencies not being removed by @tdejager in #952
- Permissions for lint pr by @ruben-arts in #852
- Install Windows executable with install.sh in Git Bash by @jdblischak in #966
- Proper scanning of the conda-meta folder for json entries by @wolfv in #971
- Global shim scripts for Windows by @wolfv in #975
- Correct fish prompt by @wolfv in #981
- `prefix_file` rename by @ruben-arts in #959
- Conda transitive dependencies of pypi packages are properly extracted by @baszalmstra in #967
- Make tests more deterministic and use single `*` for glob expansion by @wolfv in #987
- Create conda-meta/history file by @pavelzw in #995
- Pypi dependency parsing was too lenient by @wolfv in #984
- Add reactivation of the environment in `pixi shell` by @wolfv in #982
- Add tool to strict json schema by @ruben-arts in #969

### New Contributors

- @jdblischak made their first contribution in #966
- @kassoulait made their first contribution in #986

Full commit history

## [0.16.1] - 2024-03-11

### Fixed

- Parse lockfile matchspecs leniently, fixing bug introduced in 0.16.0 by @ruben-arts in #951

Full commit history

## [0.16.0] - 2024-03-09

### ✨ Highlights

- This release removes rip and adds uv as the PyPI resolver and installer.

### Added

- Add tcsh install support by @obust in #898
- Add user agent to pixi http client by @baszalmstra in #892
- Add a schema for the pixi.toml by @ruben-arts in #936

### Changed

- Switch from rip to uv by @tdejager in #863
- Move uv options into context by @tdejager in #911
- Add Deltares projects to Community.md by @Hofer-Julian in #920
- Upgrade to uv 0.1.16, updated for changes in the API by @tdejager in #935

### Fixed

- Made the uv re-install logic a bit more clear by @tdejager in #894
- Avoid duplicate pip dependency while importing environment.yaml by @sumanth-manchala in #890
- Handle custom channels when importing from env yaml by @sumanth-manchala in #901
- Pip editable installs getting uninstalled by @renan-r-santos in #902
- Highlight pypi deps in `pixi list` by @sumanth-manchala in #907
- Default to the default environment if possible by @ruben-arts in #921
- Switching channels by @baszalmstra in #923
- Use correct name of the channel on adding by @ruben-arts in #928
- Turn back on jlap for faster repodata fetching by @ruben-arts in #937
- Remove dists site-packages when python interpreter changes by @tdejager in #896

### New Contributors

- @obust made their first contribution in #898
- @renan-r-santos made their first contribution in #902

Full commit history

## [0.15.2] - 2024-02-29

### Changed

- Add more info to a failure of activation by @ruben-arts in #873

### Fixed

- Improve global list UX when there is no global env dir created by @sumanth-manchala in #865
- Update rattler to v0.19.0 by @AliPiccioniQC in #885
- Error on `pixi run` if platform is not supported by @ruben-arts in #878

### New Contributors

- @sumanth-manchala made their first contribution in #865
- @AliPiccioniQC made their first contribution in #885

Full commit history

## [0.15.1] - 2024-02-26

### Added

- Add prefix to project info json output by @baszalmstra in #859

### Changed

- New `pixi global list` display format by @chawyehsu in #723
- Add direnv usage by @pavelzw in #845
- Add docker example by @pavelzw in #846
- Install/remove multiple packages globally by @chawyehsu in #854

### Fixed

- Prefix file in `init --import` by @ruben-arts in #855
- Environment and feature names in `pixi info --json` by @baszalmstra in #857

Full commit history

[0.15.0] - 2024-02-23#

✨ Highlights#

  • [pypi-dependencies] now get built in the created environment, so they use the conda-installed build tools.
  • pixi init --import env.yml to import an existing conda environment file.
  • [target.unix.dependencies] to specify dependencies for unix systems instead of per platform.
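The last highlight can be sketched in a manifest fragment; the project and package names below are illustrative placeholders, not taken from this release:

```toml
[project]
name = "example"
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64", "win-64"]

# Applies to every platform.
[dependencies]
python = "3.11.*"

# Applies only when solving for a unix platform (linux-64 and osx-arm64
# here), replacing separate [target.linux-64] and [target.osx-arm64] tables.
[target.unix.dependencies]
make = "*"
```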
[!WARNING]
This version's build failed, use v0.15.1.

Added#

  • pass environment variables during pypi resolution and install (#818)
  • skip micromamba style selector lines and warn about them (#830)
  • add import yml flag (#792)
  • check duplicate dependencies (#717)
  • (ci) check conventional PR title (#820)
  • add --feature to pixi add (#803)
  • add windows, macos, linux and unix to targets (#832)

Fixed#

  • cache and retry pypi name mapping (#839)
  • check duplicates while adding dependencies (#829)
  • logic of the PIXI_NO_PATH_UPDATE variable (#822)

Other#

  • add mike to the documentation and update looks (#809)
  • add instructions for installing on Alpine Linux (#828)
  • more error reporting in self-update (#823)
  • disabled jlap for now (#836)

Full commit history

[0.14.0] - 2024-02-15#

✨ Highlights#

Now, solve-groups can be used in [environments] to ensure dependency alignment across different environments without simultaneous installation.
This feature is particularly beneficial for managing identical dependencies in test and production environments.
Example configuration:

[environments]
test = { features = ["prod", "test"], solve-groups = ["group1"] }
prod = { features = ["prod"], solve-groups = ["group1"] }

This setup simplifies managing dependencies that must be consistent across test and production.

Added#

  • Add index field to pypi requirements by @vlad-ivanov-name in #784
  • Add -f/--feature to the pixi project platform command by @ruben-arts in #785
  • Warn user when unused features are defined by @ruben-arts in #762
  • Disambiguate tasks interactive by @baszalmstra in #766
  • Solve groups for conda by @baszalmstra in #783
  • Pypi solve groups by @baszalmstra in #802
  • Enable reflinks by @baszalmstra in #729

Changed#

  • Add environment name to the progress by @ruben-arts in #788
  • Set color scheme by @ruben-arts in #773
  • Update lock on pixi list by @ruben-arts in #775
  • Use default env if task is available in it by @ruben-arts in #772
  • Color environment name in install step by @ruben-arts in #795

Fixed#

  • Running cuda env and using those tasks by @ruben-arts in #764
  • Make svg a gif by @ruben-arts in #782
  • Fmt by @ruben-arts
  • Check for correct platform in task env creation by @ruben-arts in #759
  • Remove using source name by @ruben-arts in #765
  • Auto-guessing of the shell in the shell-hook by @ruben-arts in https://github.com/prefix-dev/pixi/pull/811
  • sdist with direct references by @nichmor in https://github.com/prefix-dev/pixi/pull/813

Miscellaneous#

  • Add slim-trees to community projects by @pavelzw in #760
  • Add test to default env in polarify example
  • Add multiple machine example by @ruben-arts in #757
  • Add more documentation on environments by @ruben-arts in #790
  • Update rip and rattler by @wolfv in #798
  • Rattler 0.18.0 by @baszalmstra in #805
  • Rip 0.8.0 by @nichmor in #806
  • Fix authentication path by @pavelzw in #796
  • Initial addition of integration test by @ruben-arts in https://github.com/prefix-dev/pixi/pull/804

New Contributors#

  • @vlad-ivanov-name made their first contribution in #784
  • @nichmor made their first contribution in #806

Full commit history

[0.13.0] - 2024-02-01#

✨ Highlights#

This release is packed with features! The major ones are:

  • We added support for multiple environments. 🎉 Check out the documentation
  • We added support for sdist installation, which greatly increases the number of packages that can be installed from PyPI. 🚀

[!IMPORTANT]
Renaming of the PIXI_PACKAGE_* variables:

PIXI_PACKAGE_ROOT      -> PIXI_PROJECT_ROOT
PIXI_PACKAGE_NAME      -> PIXI_PROJECT_NAME
PIXI_PACKAGE_MANIFEST  -> PIXI_PROJECT_MANIFEST
PIXI_PACKAGE_VERSION   -> PIXI_PROJECT_VERSION
PIXI_PACKAGE_PLATFORMS -> PIXI_ENVIRONMENT_PLATFORMS

Check the documentation here: https://pixi.sh/environment/

[!IMPORTANT]
The .pixi/env/ folder has been moved to accommodate multiple environments.
If you only have one environment, it is now named .pixi/envs/default.

Added#

  • Add support for multiple environments:
      • Update to rattler lock v4 by @baszalmstra in #698
      • Multi-env installation and usage by @baszalmstra in #721
      • Update all environments in the lock-file when requesting an environment by @baszalmstra in #711
      • Run tasks in the env they are defined by @baszalmstra in #731
      • polarify use-case as an example by @ruben-arts in #735
      • Make environment name parsing strict by @ruben-arts in #673
      • Use named environments (only "default" for now) by @ruben-arts in #674
      • Use task graph instead of traversal by @baszalmstra in #725
      • Multi env documentation by @ruben-arts in #703
      • pixi info -e/--environment option by @ruben-arts in #676
      • pixi channel add -f/--feature option by @ruben-arts in #700
      • pixi channel remove -f/--feature option by @ruben-arts in #706
      • pixi remove -f/--feature option by @ruben-arts in #680
      • pixi task list -e/--environment option by @ruben-arts in #694
      • pixi task remove -f/--feature option by @ruben-arts in #694
      • pixi install -e/--environment option by @ruben-arts in #722
  • Support for sdists in pypi-dependencies by @tdejager in #664
  • Add pre-release support to pypi-dependencies by @tdejager in #716
  • Support adding dependencies for project's unsupported platforms by @orhun in #668
  • Add pixi list command by @hadim in #665
  • Add pixi shell-hook command by @orhun in #672, #679 and #684
  • Use env variable to configure locked, frozen and color by @hadim in #726
  • pixi self-update by @hadim in #675
  • Add PIXI_NO_PATH_UPDATE for PATH update suppression by @chawyehsu in #692
  • Set the cache directory by @ruben-arts in #683

Changed#

  • Use consistent naming for tests module by @orhun in #678
  • Install pixi and add to the path in docker example by @ruben-arts in #743
  • Simplify the deserializer of PyPiRequirement by @orhun in #744
  • Use tabwriter instead of comfy_table by @baszalmstra in #745
  • Document environment variables by @ruben-arts in #746

Fixed#

  • Quote part of the task that has brackets ([ or ]) by @JafarAbdi in #677
  • Package clobber and __pycache__ removal issues by @wolfv in #573
  • Non-global reqwest client by @tdejager in #693
  • Fix broken pipe error during search by @orhun in #699
  • Make pixi search result correct by @chawyehsu in #713
  • Allow the tasks for all platforms to be shown in pixi info by @ruben-arts in #728
  • Flaky tests while installing pypi dependencies by @baszalmstra in #732
  • Linux install script by @mariusvniekerk in #737
  • Download wheels in parallel to avoid deadlock by @baszalmstra in #752

New Contributors#

  • @JafarAbdi made their first contribution in #677
  • @mariusvniekerk made their first contribution in #737

Full commit history

[0.12.0] - 2024-01-15#

✨ Highlights#

  • Some great community contributions: pixi global upgrade, pixi project version commands, and a PIXI_HOME variable.
  • A ton of refactor work to prepare for the multi-environment feature.
      • Note that no extra environments are created yet, but you can already specify them in the pixi.toml file.
      • Next we'll build the actual environments.

Added#

  • Add global upgrade command to pixi by @trueleo in #614
  • Add configurable PIXI_HOME by @chawyehsu in #627
  • Add --pypi option to pixi remove by @marcelotrevisani in https://github.com/prefix-dev/pixi/pull/602
  • PrioritizedChannels to specify channel priority by @ruben-arts in https://github.com/prefix-dev/pixi/pull/658
  • Add project version {major,minor,patch} CLIs by @hadim in https://github.com/prefix-dev/pixi/pull/633

Changed#

  • Refactored project model using targets, features and environments by @baszalmstra in https://github.com/prefix-dev/pixi/pull/616
  • Move code from Project to Environment by @baszalmstra in #630
  • Refactored system-requirements from Environment by @baszalmstra in #632
  • Extract activation.scripts into Environment by @baszalmstra in #659
  • Extract pypi-dependencies from Environment by @baszalmstra in https://github.com/prefix-dev/pixi/pull/656
  • De-serialization of features and environments by @ruben-arts in https://github.com/prefix-dev/pixi/pull/636

Fixed#

  • Make install.sh also work with wget if curl is not available by @wolfv in #644
  • Use source build for rattler by @ruben-arts
  • Check for pypi-dependencies before amending the pypi purls by @ruben-arts in #661
  • Don't allow the use of reflinks by @ruben-arts in #662

Removed#

  • Remove windows and unix system requirements by @baszalmstra in #635

Documentation#

  • Document the channel logic by @ruben-arts in https://github.com/prefix-dev/pixi/pull/610
  • Update the instructions for installing on Arch Linux by @orhun in https://github.com/prefix-dev/pixi/pull/653
  • Update Community.md by @KarelZe in https://github.com/prefix-dev/pixi/pull/654
  • Replace contributions.md with contributing.md and make it more standardized by @ruben-arts in https://github.com/prefix-dev/pixi/pull/649
  • Remove windows and unix system requirements by @baszalmstra in https://github.com/prefix-dev/pixi/pull/635
  • Add CODE_OF_CONDUCT.md by @ruben-arts in https://github.com/prefix-dev/pixi/pull/648
  • Removed remaining .ps1 references by @bahugo in https://github.com/prefix-dev/pixi/pull/643

New Contributors#

  • @marcelotrevisani made their first contribution in https://github.com/prefix-dev/pixi/pull/602
  • @trueleo made their first contribution in https://github.com/prefix-dev/pixi/pull/614
  • @bahugo made their first contribution in https://github.com/prefix-dev/pixi/pull/643
  • @KarelZe made their first contribution in https://github.com/prefix-dev/pixi/pull/654

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.11.0...v0.12.0

[0.11.1] - 2024-01-06#

Fixed#

  • Upgrading rattler to fix pixi auth in #642

[0.11.0] - 2024-01-05#

✨ Highlights#

  • Lots of important fixes and preparations for the PyPI sdist and multi-environment features
  • Lots of new contributors who help pixi improve!

Added#

  • Add new commands for pixi project {version|channel|platform|description} by @hadim in #579
  • Add dependabot.yml by @pavelzw in #606

Changed#

  • winget-releaser gets correct identifier by @ruben-arts in #561
  • Task run code by @baszalmstra in #556
  • No ps1 in activation scripts by @ruben-arts in #563
  • Changed some names for clarity by @tdejager in #568
  • Change font and make it dark mode by @ruben-arts in #576
  • Moved pypi installation into its own module by @tdejager in #589
  • Move alpha to beta feature and toggle it off with env var by @ruben-arts in #604
  • Improve UX activation scripts by @ruben-arts in #560
  • Add sanity check by @tdejager in #569
  • Refactor manifest by @ruben-arts in #572
  • Improve search by @Johnwillliam in #578
  • Split pypi and conda solve steps by @tdejager in #601

Fixed#

  • Save file after lockfile is correctly updated by @ruben-arts in #555
  • Limit the number of concurrent solves by @baszalmstra in #571
  • Use project virtual packages in add command by @msegado in #609
  • Improved mapped dependency by @ruben-arts in #574

Documentation#

  • Change font and make it dark mode by @ruben-arts in #576
  • typo: no ps1 in activation scripts by @ruben-arts in #563
  • Document adding CUDA to system-requirements by @ruben-arts in #595
  • Multi env proposal documentation by @ruben-arts in #584
  • Fix multiple typos in configuration.md by @SeaOtocinclus in #608
  • Add multiple machines from one project example by @pavelzw in #605

New Contributors#

  • @hadim made their first contribution in #579
  • @msegado made their first contribution in #609
  • @Johnwillliam made their first contribution in #578
  • @SeaOtocinclus made their first contribution in #608

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.10.0...v0.11.0

[0.10.0] - 2023-12-08#

Highlights#

  • Better pypi-dependencies support; even more of the pypi packages now install.
  • pixi add --pypi command to add a pypi package to your project.

Added#

  • Use range (>=1.2.3, <1.3) when adding requirement, instead of 1.2.3.* by @baszalmstra in https://github.com/prefix-dev/pixi/pull/536
  • Update rip to fix by @tdejager in https://github.com/prefix-dev/pixi/pull/543:
      • Better Bytecode compilation (.pyc) support by @baszalmstra
      • Recognize .data directory headers by @baszalmstra
  • Also print arguments given to a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/545
  • Add pixi add --pypi command by @ruben-arts in https://github.com/prefix-dev/pixi/pull/539

Fixed#

  • space in global install path by @ruben-arts in https://github.com/prefix-dev/pixi/pull/513
  • Glibc version/family parsing by @baszalmstra in https://github.com/prefix-dev/pixi/pull/535
  • Use build and host specs while getting the best version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/538

Miscellaneous#

  • docs: add update manual by @ruben-arts in https://github.com/prefix-dev/pixi/pull/521
  • add lightgbm demo by @partrita in https://github.com/prefix-dev/pixi/pull/492
  • Update documentation link by @williamjamir in https://github.com/prefix-dev/pixi/pull/525
  • Update Community.md by @jiaxiyang in https://github.com/prefix-dev/pixi/pull/527
  • Add winget releaser by @ruben-arts in https://github.com/prefix-dev/pixi/pull/547
  • Custom rerun-sdk example, force driven graph of pixi.lock by @ruben-arts in https://github.com/prefix-dev/pixi/pull/548
  • Better document pypi part by @ruben-arts in https://github.com/prefix-dev/pixi/pull/546

New Contributors#

  • @partrita made their first contribution in https://github.com/prefix-dev/pixi/pull/492
  • @williamjamir made their first contribution in https://github.com/prefix-dev/pixi/pull/525
  • @jiaxiyang made their first contribution in https://github.com/prefix-dev/pixi/pull/527

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.1...v0.10.0

[0.9.1] - 2023-11-29#

Highlights#

  • PyPI's scripts are now fixed. For example: https://github.com/prefix-dev/pixi/issues/516

Fixed#

  • Remove attr (unused) and update all dependencies by @wolfv in https://github.com/prefix-dev/pixi/pull/510
  • Remove empty folders on python uninstall by @baszalmstra in https://github.com/prefix-dev/pixi/pull/512
  • Bump rip to add scripts by @baszalmstra in https://github.com/prefix-dev/pixi/pull/517

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.0...v0.9.1

[0.9.0] - 2023-11-28#

Highlights#

  • You can now run pixi remove (or pixi rm) to remove a package from the environment
  • Fix the pip install -e issue introduced by release v0.8.0: https://github.com/prefix-dev/pixi/issues/507

Added#

  • pixi remove command by @Wackyator in https://github.com/prefix-dev/pixi/pull/483

Fixed#

  • Install entrypoints for [pypi-dependencies] by @baszalmstra in https://github.com/prefix-dev/pixi/pull/508
  • Only uninstall pixi installed packages by @baszalmstra in https://github.com/prefix-dev/pixi/pull/509

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.8.0...v0.9.0

[0.8.0] - 2023-11-27#

Highlights#

  • 🎉🐍 [pypi-dependencies] ALPHA RELEASE 🐍🎉, you can now add PyPI dependencies to your pixi project.
  • UX of pixi run has been improved with better errors and showing which task is run.

[!NOTE]
[pypi-dependencies] support is still incomplete; missing functionality is listed here: https://github.com/orgs/prefix-dev/projects/6.
Our intent is not to have 100% feature parity with pip; our goal is that you only need pixi for both conda and pypi packages alike.

Added#

  • Bump rattler by @ruben-arts in https://github.com/prefix-dev/pixi/pull/496
  • Implement lock-file satisfiability with pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/494
  • List pixi tasks when command not found is returned by @ruben-arts in https://github.com/prefix-dev/pixi/pull/488
  • Show which command is run as a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/491 and https://github.com/prefix-dev/pixi/pull/493
  • Add progress info to conda install by @baszalmstra in https://github.com/prefix-dev/pixi/pull/470
  • Install pypi dependencies (alpha) by @baszalmstra in https://github.com/prefix-dev/pixi/pull/452

Fixed#

  • Add install scripts to pixi.sh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/458, https://github.com/prefix-dev/pixi/pull/459 and https://github.com/prefix-dev/pixi/pull/460
  • Fix RECORD not found issue by @baszalmstra in https://github.com/prefix-dev/pixi/pull/495
  • Actually add to the .gitignore and give better errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/490
  • Support macOS for pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/478
  • Custom pypi-dependencies type by @ruben-arts in https://github.com/prefix-dev/pixi/pull/471
  • pypi-dependencies parsing errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/479
  • Progress issues by @baszalmstra in https://github.com/prefix-dev/pixi/pull/4

Miscellaneous#

  • Example: ctypes by @liquidcarbon in https://github.com/prefix-dev/pixi/pull/441
  • Mention the AUR package by @orhun in https://github.com/prefix-dev/pixi/pull/464
  • Update rerun example by @ruben-arts in https://github.com/prefix-dev/pixi/pull/489
  • Document pypi-dependencies by @ruben-arts in https://github.com/prefix-dev/pixi/pull/481
  • Ignore docs paths on rust workflow by @ruben-arts in https://github.com/prefix-dev/pixi/pull/482
  • Fix flaky tests, run serially by @baszalmstra in https://github.com/prefix-dev/pixi/pull/477

New Contributors#

  • @liquidcarbon made their first contribution in https://github.com/prefix-dev/pixi/pull/441
  • @orhun made their first contribution in https://github.com/prefix-dev/pixi/pull/464

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.7.0...v0.8.0

[0.7.0] - 2023-11-14#

Highlights#

  • Channel priority: with channels = ["conda-forge", "pytorch"], packages found in conda-forge will not be taken from pytorch.
  • Channel specific dependencies: pytorch = { version = "*", channel = "pytorch" }
  • Autocompletion on pixi run <TABTAB>
  • Moved all pixi documentation into this repo, try it with pixi run docs!
  • Lots of new contributors!
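The two channel features above can be combined in a single manifest. This is a sketch with placeholder pins, not an excerpt from the release:

```toml
[project]
name = "example"
channels = ["conda-forge", "pytorch"]   # conda-forge has priority
platforms = ["linux-64"]

[dependencies]
# Found in conda-forge first, so never taken from pytorch.
numpy = "*"
# Explicitly pinned to the pytorch channel, bypassing the priority order.
pytorch = { version = "*", channel = "pytorch" }
```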

Added#

  • Bump rattler to its newest version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/395. Some notable changes:
      • Add channel priority (if a package is found in the first listed channel it will not be looked for in the other channels).
      • Fix JLAP using wrong hash.
      • Lockfile forward compatibility error.
  • Add nushell support by @wolfv in https://github.com/prefix-dev/pixi/pull/360
  • Autocomplete tasks on pixi run for bash and zsh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/390
  • Add prefix location file to avoid copy error by @ruben-arts in https://github.com/prefix-dev/pixi/pull/422
  • Channel specific dependencies python = { version = "*", channel = "conda-forge" } by @ruben-arts in https://github.com/prefix-dev/pixi/pull/439

Changed#

  • project.version as optional field in the pixi.toml by @ruben-arts in https://github.com/prefix-dev/pixi/pull/400

Fixed#

  • Deny unknown fields in pixi.toml to help users find errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/396
  • install.sh to create dot file if not present by @humphd in https://github.com/prefix-dev/pixi/pull/408
  • Ensure order of repodata fetches by @baszalmstra in https://github.com/prefix-dev/pixi/pull/405
  • Strip Linux binaries by @baszalmstra in https://github.com/prefix-dev/pixi/pull/414
  • Sort task list by @ruben-arts in https://github.com/prefix-dev/pixi/pull/431
  • Fix global install path on windows by @ruben-arts in https://github.com/prefix-dev/pixi/pull/449
  • Let PIXI_BIN_PATH use backslashes by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/442
  • Print more informative error if created file is empty by @traversaro in https://github.com/prefix-dev/pixi/pull/447

Docs#

  • Move to mkdocs with all documentation by @ruben-arts in https://github.com/prefix-dev/pixi/pull/435
  • Fix typing errors by @FarukhS52 in https://github.com/prefix-dev/pixi/pull/426
  • Add social cards to the pages by @ruben-arts in https://github.com/prefix-dev/pixi/pull/445
  • Enhance README.md: Added Table of Contents, Grammar Improvements by @adarsh-jha-dev in https://github.com/prefix-dev/pixi/pull/421
  • Adding conda-auth to community examples by @travishathaway in https://github.com/prefix-dev/pixi/pull/433
  • Minor grammar correction by @tylere in https://github.com/prefix-dev/pixi/pull/406
  • Make capitalization of tab titles consistent by @tylere in https://github.com/prefix-dev/pixi/pull/407

New Contributors#

  • @tylere made their first contribution in https://github.com/prefix-dev/pixi/pull/406
  • @humphd made their first contribution in https://github.com/prefix-dev/pixi/pull/408
  • @adarsh-jha-dev made their first contribution in https://github.com/prefix-dev/pixi/pull/421
  • @FarukhS52 made their first contribution in https://github.com/prefix-dev/pixi/pull/426
  • @travishathaway made their first contribution in https://github.com/prefix-dev/pixi/pull/433
  • @traversaro made their first contribution in https://github.com/prefix-dev/pixi/pull/447

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.6.0...v0.7.0

[0.6.0] - 2023-10-17#

Highlights#

This release fixes some bugs and adds the --cwd option to the tasks.

Fixed#

  • Improve shell prompts by @ruben-arts in https://github.com/prefix-dev/pixi/pull/385 and https://github.com/prefix-dev/pixi/pull/388
  • Change --frozen logic to error when there is no lockfile by @ruben-arts in https://github.com/prefix-dev/pixi/pull/373
  • Don't remove the '.11' from 'python3.11' binary file name by @ruben-arts in https://github.com/prefix-dev/pixi/pull/366

Changed#

  • Update rerun example to v0.9.1 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/389

Added#

  • Add the current working directory (--cwd) in pixi tasks by @ruben-arts in https://github.com/prefix-dev/pixi/pull/380

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.5.0...v0.6.0

[0.5.0] - 2023-10-03#

Highlights#

We rebuilt pixi shell, fixing the issue where your rc file would overrule the environment activation.

Fixed#

  • Change how shell works and make activation more robust by @wolfv in https://github.com/prefix-dev/pixi/pull/316
  • Documentation: use quotes in cli by @pavelzw in https://github.com/prefix-dev/pixi/pull/367

Added#

  • Create or append to the .gitignore and .gitattributes files by @ruben-arts in https://github.com/prefix-dev/pixi/pull/359
  • Add --locked and --frozen to getting an up-to-date prefix by @ruben-arts in https://github.com/prefix-dev/pixi/pull/363
  • Documentation: improvement/update by @ruben-arts in https://github.com/prefix-dev/pixi/pull/355
  • Example: how to build a docker image using pixi by @ruben-arts in https://github.com/prefix-dev/pixi/pull/353 & https://github.com/prefix-dev/pixi/pull/365
  • Update to the newest rattler by @baszalmstra in https://github.com/prefix-dev/pixi/pull/361
  • Periodic cargo upgrade --all --incompatible by @wolfv in https://github.com/prefix-dev/pixi/pull/358

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.4.0...v0.5.0

[0.4.0] - 2023-09-22#

Highlights#

This release adds the start of a new cli command pixi project which will allow users to interact with the project configuration from the command line.

Fixed#

  • Align with latest rattler version 0.9.0 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/350

Added#

  • Add codespell (config, workflow) to catch typos + catch and fix some of those by @yarikoptic in https://github.com/prefix-dev/pixi/pull/329
  • remove atty and use stdlib by @wolfv in https://github.com/prefix-dev/pixi/pull/337
  • xtsci-dist to Community.md by @HaoZeke in https://github.com/prefix-dev/pixi/pull/339
  • ribasim to Community.md by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/340
  • LFortran to Community.md by @wolfv in https://github.com/prefix-dev/pixi/pull/341
  • Give tip to resolve virtual package issue by @ruben-arts in https://github.com/prefix-dev/pixi/pull/348
  • pixi project channel add subcommand by @baszalmstra and @ruben-arts in https://github.com/prefix-dev/pixi/pull/347

New Contributors#

  • @yarikoptic made their first contribution in https://github.com/prefix-dev/pixi/pull/329
  • @HaoZeke made their first contribution in https://github.com/prefix-dev/pixi/pull/339

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.3.0...v0.4.0

[0.3.0] - 2023-09-11#

Highlights#

This release fixes a lot of issues encountered by the community as well as some awesome community contributions like the addition of pixi global list and pixi global remove.

Fixed#

  • Properly detect Cuda on linux using our build binaries, by @baszalmstra (#290)
  • Package names are now case-insensitive, by @baszalmstra (#285)
  • Issue with starts-with and compatibility operator, by @tdejager (#296)
  • Lock files are now consistently sorted, by @baszalmstra (#295 & #307)
  • Improved xonsh detection and powershell env-var escaping, by @wolfv (#307)
  • system-requirements are properly filtered by platform, by @ruben-arts (#299)
  • Powershell completion install script, by @chawyehsu (#325)
  • Simplified and improved shell quoting, by @baszalmstra (#313)
  • Issue where platform specific subdirs were required, by @baszalmstra (#333)
  • thread 'tokio-runtime-worker' has overflowed its stack issue, by @baszalmstra (#28)

Added#

  • Certificates from the OS certificate store are now used, by @baszalmstra (#310)
  • pixi global list and pixi global remove commands, by @cjfuller (#318)

Changed#

  • --manifest-path must point to a pixi.toml file, by @baszalmstra (#324)

[0.2.0] - 2023-08-22#

Highlights#

  • Added pixi search command to search for packages, by @Wackyator. (#244)
  • Added target specific tasks, eg. [target.win-64.tasks], by @ruben-arts. (#269)
  • Flaky install caused by the download of packages, by @baszalmstra. (#281)

Fixed#

  • Install instructions, by @baszalmstra. (#258)
  • Typo in getting started, by @RaulPL. (#266)
  • Don't execute alias tasks, by @baszalmstra. (#274)

Added#

  • Rerun example, by @ruben-arts. (#236)
  • Reduction of pixi's binary size, by @baszalmstra (#256)
  • Updated pixi banner, including webp file for faster loading, by @baszalmstra. (#257)
  • Set linguist attributes for pixi.lock automatically, by @spenserblack. (#265)
  • Contribution manual for pixi, by @ruben-arts. (#268)
  • GitHub issue templates, by @ruben-arts. (#271)
  • Links to prefix.dev in readme, by @tdejager. (#279)

[0.1.0] - 2023-08-11#

As this is our first Semantic Versioning release, we'll change from the prototype to the developing phase, as semver describes.
A 0.x release could be anything from a new major feature to a breaking change, while the 0.0.x releases will be bugfixes or small improvements.

Highlights#

  • Update to the latest rattler version, by @baszalmstra. (#249)

Fixed#

  • Only add shebang to activation scripts on unix platforms, by @baszalmstra. (#250)
  • Use official crates.io releases for all dependencies, by @baszalmstra. (#252)

[0.0.8] - 2023-08-01#

Highlights#

  • Much better error printing using miette, by @baszalmstra. (#211)
  • You can now use pixi on aarch64-linux, by @pavelzw. (#233)
  • Use the Rust port of libsolv as the default solver, by @ruben-arts. (#209)

Added#

  • Add mention to condax in the docs, by @maresb. (#207)
  • Add brew installation instructions, by @wolfv. (#208)
  • Add activation.scripts to the pixi.toml to configure environment activation, by @ruben-arts. (#217)
  • Add pixi upload command to upload packages to prefix.dev, by @wolfv. (#127)
  • Add more metadata fields to the pixi.toml, by @wolfv. (#218)
  • Add pixi task list to show all tasks in the project, by @tdejager. (#228)
  • Add --color to configure the colors in the output, by @baszalmstra. (#243)
  • Examples, ROS2 Nav2, JupyterLab and QGIS, by @ruben-arts.

Fixed#

  • Add trailing newline to pixi.toml and .gitignore, by @pavelzw. (#216)
  • Deny unknown fields and rename license-file in pixi.toml, by @wolfv. (#220)
  • Overwrite PS1 variable when going into a pixi shell, by @ruben-arts. (#201)

Changed#

  • Install environment when adding a dependency using pixi add, by @baszalmstra. (#213)
  • Improve and speedup CI, by @baszalmstra. (#241)

[0.0.7] - 2023-07-11#

+

Highlights#

+
    +
  • Transitioned the run subcommand to use the deno_task_shell for improved cross-platform functionality. More details in the Deno Task Runner documentation.
  • +
  • Added an info subcommand to retrieve system-specific information understood by pixi.
  • +
+

BREAKING CHANGES#

+
    +
  • [commands] in the pixi.toml is now called [tasks]. (#177)
  • +
+

Added#

+
    +
  • The pixi info command to get more system information by @wolfv in (#158)
  • +
  • Documentation on how to use the cli by @ruben-arts in (#160)
  • +
  • Use the deno_task_shell to execute commands in pixi run by @baszalmstra in (#173)
  • +
  • Use new solver backend from rattler by @baszalmstra in (#178)
  • +
  • The pixi command command to the cli by @tdejager in (#177)
  • +
  • Documentation on how to use the pixi auth command by @wolfv in (#183)
  • +
  • Use the newest rattler 0.6.0 by @baszalmstra in (#185)
  • +
  • Build with pixi section to the documentation by @tdejager in (#196)
  • +
+

Fixed#

+
    +
  • Running tasks sequentially when using depends_on by @tdejager in (#161)
  • +
  • Don't add PATH variable where it is already set by @baszalmstra in (#169)
  • +
  • Fix README by @Hofer-Julian in (#182)
  • +
  • Fix Ctrl+C signal in pixi run by @tdejager in (#190)
  • +
  • Add the correct license information to the lockfiles by @wolfv in (#191)
  • +
+

[0.0.6] - 2023-06-30#

+

Highlights#

+

Improving the reliability is important to us, so we added an integration testing framework, we can now test as close as possible to the CLI level using cargo.

+

Added#

+
    +
  • An integration test harness, to test as close as possible to the user experience but in rust. (#138, #140, #156)
  • +
  • Add different levels of dependencies in preparation for pixi build, allowing host- and build- dependencies (#149)
  • +
+

Fixed#

+
    +
  • Use correct folder name on pixi init (#144)
  • +
  • Fix windows cli installer (#152)
  • +
  • Fix global install path variable (#147)
  • +
  • Fix macOS binary notarization (#153)
  • +
+

[0.0.5] - 2023-06-26#

+

Fixing Windows installer build in CI. (#145)

+

[0.0.4] - 2023-06-26#

+

Highlights#

+

  • A new command, auth, which can be used to authenticate with the host of the package channels.
  • A new command, shell, which can be used to start a shell in the pixi environment of a project.
  • A refactor of the install command, which changed to global install; the plain install command now installs a pixi project if you run it in the project directory.
  • Platform-specific dependencies using [target.linux-64.dependencies] instead of [dependencies] in the pixi.toml.

+

Lots and lots of fixes and improvements to make things easier for the user, where bumping to the new version of rattler helped a lot.

+

Added#

+
    +
  • Platform specific dependencies and helpful error reporting on pixi.toml issues(#111)
  • +
  • Windows installer, which is very useful for users that want to start using pixi on windows. (#114)
  • +
  • shell command to use the pixi environment without pixi run. (#116)
  • +
  • Verbosity options using -v, -vv, -vvv (#118)
  • +
  • auth command to be able to login or logout of a host like repo.prefix.dev if you're using private channels. (#120)
  • +
  • New examples: CPP sdl: #121, Opencv camera calibration #125
  • +
  • Apple binary signing and notarization. (#137)
  • +
+

Changed#

+
    +
  • pixi install moved to pixi global install and pixi install became the installation of a project using the pixi.toml (#124)
  • +
+

Fixed#

+
    +
  • pixi run uses default shell (#119)
  • +
  • pixi add command is fixed. (#132)
  • +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/Community/index.html b/v0.39.2/Community/index.html new file mode 100644 index 000000000..613dded84 --- /dev/null +++ b/v0.39.2/Community/index.html @@ -0,0 +1,1789 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Community - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

Community#

+

When you want to show your users and contributors that they can use pixi in your repo, you can use the following badge:

+

Pixi Badge

+
[![Pixi Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/prefix-dev/pixi/main/assets/badge/v0.json)](https://pixi.sh)
+
+
+

Customize your badge

+

To further customize the look and feel of your badge, +you can add &style=<custom-style> at the end of the URL. +See the documentation on shields.io for more info.

+
+
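As a concrete illustration, the flat-square style (one of the styles listed by shields.io) would be added like this:

```markdown
[![Pixi Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/prefix-dev/pixi/main/assets/badge/v0.json&style=flat-square)](https://pixi.sh)
```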

Built using Pixi#

+
    +
  • Deltares:
      +
    • Ribasim: Water resources model
    • +
    • Ribasim-NL: Ribasim water resources modeling in the Netherlands
    • +
    • iMOD Python: Make massive MODFLOW models
    • +
    • iMOD Coupler: Application for coupling hydrological kernels
    • +
    • iMOD Documentation: Documentation of the iMOD suite.
    • +
    • Xugrid: Xarray and unstructured grids
    • +
    • Numba celltree: Celltree data structure for searching for points, lines, boxes, and cells (convex polygons) in a two dimensional unstructured mesh.
    • +
    • QGIS-Tim: QGIS plugin and utilities for TimML multi-layer analytic element model
    • +
    • Pandamesh: From geodataframe to mesh
    • +
    • Wflow: Hydrological modeling framework
    • +
    • HydroMT: Automated and reproducible model building and analysis
    • +
    • HydroMT SFINCS: SFINCS plugin for HydroMT
    • +
    • PyFlwDir: Fast methods to work with hydro- and topography data in pure Python.
    • +
    +
  • +
  • USGS:
      +
    • MODFLOW 6: USGS modular hydrological model
    • +
    +
  • +
  • QuantCo:
      +
    • glum: High performance Python GLMs with all the features!
    • +
    • tabmat: Efficient matrix representations for working with tabular data
    • +
    • pixi-pack: A tool to pack and unpack conda environments created with pixi
    • +
    • polarify: Simplifying conditional Polars Expressions with Python 🐍 🐻‍❄️
    • +
    • copier-template-python-open-source: Copier template for python projects using pixi
    • +
    • datajudge: Assessing whether data from database complies with reference information
    • +
    • ndonnx: ONNX-backed array library that is compliant with the Array API standard
    • +
    • multiregex: Quickly match many regexes against a string
    • +
    • slim-trees: Pickle your ML models more efficiently for deployment 🚀
    • +
    • sqlcompyre: Compare SQL tables and databases
    • +
    • metalearners: MetaLearners for CATE estimation
    • +
    • ndonnx: ONNX-backed array library that is compliant with the Array API standard
    • +
    • tabulardelta: Simplify table comparisons
    • +
    • pydiverse.pipedag: A library for data pipeline orchestration optimizing high development iteration speed
    • +
    • pydiverse.transform: Pipe based dataframe manipulation library that can also transform data on SQL databases
    • +
    +
  • +
+
    +
  • pixi-pycharm: Conda shim for PyCharm that proxies pixi
  • +
  • pixi-diff-to-markdown: Generate markdown summaries from pixi update
  • +
  • jiaxiyang/cpp_project_guideline: Guide the way beginners make their c++ projects.
  • +
  • hex-inc/vegafusion: Serverside scaling of Vega and Altair visualizations in Rust, Python, WASM, and Java
  • +
  • pablovela5620/arxiv-researcher: Summarize PDF's and Arixv papers with Langchain and Nougat 🦉
  • +
  • HaoZeke/xtsci-dist: Incremental scipy port using xtensor
  • +
  • jslorrma/keyrings.artifacts: Keyring backend that provides authentication for publishing or consuming Python packages to or from Azure Artifacts feeds within Azure DevOps
  • +
  • LFortran: A modern cross-platform Fortran compiler
  • +
  • Rerun: Rerun is an SDK for building time aware visualizations of multimodal data.
  • +
  • conda-auth: a conda plugin providing more secure authentication support to conda.
  • +
  • py-rattler: Build your own conda environment manager using the python wrapper of our Rattler backend.
  • +
  • array-api-extra: Extra array functions built on top of the Python array API standard.
  • +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/FAQ/index.html b/v0.39.2/FAQ/index.html new file mode 100644 index 000000000..a6ada1bce --- /dev/null +++ b/v0.39.2/FAQ/index.html @@ -0,0 +1,1816 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Frequently asked questions - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

FAQ

+ +

What is the difference with conda, mamba, poetry, pip#

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
ToolInstalls pythonBuilds packagesRuns predefined tasksHas lock files builtinFastUse without python
Conda
Mamba
Pip
Pixi🚧
Poetry
+

Why the name pixi#

+

Starting with the name prefix we iterated until we had a name that was easy to pronounce, spell and remember. There also wasn't a CLI tool using that name yet, unlike px, pex, pax, etc. We think it sparks curiosity and fun; if you don't agree, I'm sorry, but you can always alias it to whatever you like.

+
+
+
+
alias not_pixi="pixi"
+
+
+
+

PowerShell: +

New-Alias -Name not_pixi -Value pixi
+

+
+
+
+

Where is pixi build#

+

TL;DR: It's coming we promise!

+

pixi build is going to be the subcommand that can generate a conda package out of a pixi project. +This requires a solid build tool which we're creating with rattler-build which will be used as a library in pixi.

+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/__pycache__/docs_hooks.cpython-312.pyc b/v0.39.2/__pycache__/docs_hooks.cpython-312.pyc new file mode 100644 index 000000000..4d05ffd8b Binary files /dev/null and b/v0.39.2/__pycache__/docs_hooks.cpython-312.pyc differ diff --git a/v0.39.2/advanced/advanced_tasks/index.html b/v0.39.2/advanced/advanced_tasks/index.html new file mode 100644 index 000000000..25ad5c867 --- /dev/null +++ b/v0.39.2/advanced/advanced_tasks/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/v0.39.2/advanced/authentication/index.html b/v0.39.2/advanced/authentication/index.html new file mode 100644 index 000000000..fffa3e97e --- /dev/null +++ b/v0.39.2/advanced/authentication/index.html @@ -0,0 +1,1985 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Authenticate pixi with a server - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

Authentication

+ +

You can authenticate pixi with a server like prefix.dev, a private quetz instance or anaconda.org. +Different servers use different authentication methods. +In this documentation page, we detail how you can authenticate against the different servers and where the authentication information is stored.

+
Usage: pixi auth login [OPTIONS] <HOST>
+
+Arguments:
+  <HOST>  The host to authenticate with (e.g. repo.prefix.dev)
+
+Options:
+      --token <TOKEN>              The token to use (for authentication with prefix.dev)
+      --username <USERNAME>        The username to use (for basic HTTP authentication)
+      --password <PASSWORD>        The password to use (for basic HTTP authentication)
+      --conda-token <CONDA_TOKEN>  The token to use on anaconda.org / quetz authentication
+  -v, --verbose...                 More output per occurrence
+  -q, --quiet...                   Less output per occurrence
+  -h, --help                       Print help
+
+

The different options are "token", "conda-token" and "username + password".

+

The token variant implements a standard "Bearer Token" authentication as is used on the prefix.dev platform. A Bearer Token is sent with every request as an additional header of the form Authorization: Bearer <TOKEN>.
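As a sketch of what this looks like on the wire (the token below is a made-up placeholder, not a real credential), the header can be assembled like so:

```shell
# Hypothetical placeholder token -- not a real credential
TOKEN="pfx_exampletoken123"
# Bearer tokens are conventionally sent in the HTTP Authorization header
AUTH_HEADER="Authorization: Bearer ${TOKEN}"
echo "$AUTH_HEADER"
```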

+

The conda-token option is used on anaconda.org and can be used with a quetz server. With this option, the token is sent as part of the URL following this scheme: conda.anaconda.org/t/<TOKEN>/conda-forge/linux-64/....
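A quick sketch of how such a token-in-URL is formed (placeholder token, and the repodata path is just an illustrative example):

```shell
# Hypothetical placeholder token -- not a real credential
CONDA_TOKEN="xy-00000000-0000-0000-0000-000000000000"
# The token is embedded in the URL path after /t/
URL="https://conda.anaconda.org/t/${CONDA_TOKEN}/conda-forge/linux-64/repodata.json"
echo "$URL"
```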

+

The last option, username & password, is used for "Basic HTTP Authentication". This is the equivalent of adding http://user:password@myserver.com/.... This authentication method can be configured quite easily with a reverse proxy such as NGINX or Apache and is thus commonly used in self-hosted systems.
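Under the hood, basic auth boils down to a single request header of the form Authorization: Basic base64(user:password). A minimal sketch with obviously fake credentials:

```shell
# Fake credentials for illustration only
CRED=$(printf 'user:password' | base64)
echo "Authorization: Basic $CRED"
```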

+

Examples#

+

Login to prefix.dev:

+
pixi auth login prefix.dev --token pfx_jj8WDzvnuTHEGdAhwRZMC1Ag8gSto8
+
+

Login to anaconda.org:

+
pixi auth login anaconda.org --conda-token xy-72b914cc-c105-4ec7-a969-ab21d23480ed
+
+

Login to a basic HTTP secured server:

+
pixi auth login myserver.com --username user --password password
+
+

Where does pixi store the authentication information?#

+

The storage location for the authentication information is system-dependent. By default, pixi tries to use the keychain to store this sensitive information securely on your machine.

+

On Windows, the credentials are stored in the "credentials manager". Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

+

On macOS, the passwords are stored in the keychain. To access the password, you can use the Keychain Access program that comes pre-installed on macOS. Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

+

On Linux, one can use GNOME Keyring (or just Keyring) to access credentials that are securely stored by libsecret. Searching for rattler should list all the credentials stored by pixi and other rattler-based programs.

+

Fallback storage#

+

If you run on a server with none of the aforementioned keychains available, then pixi falls back to store the credentials in an insecure JSON file. +This JSON file is located at ~/.rattler/credentials.json and contains the credentials.

+

Override the authentication storage#

+

You can use the RATTLER_AUTH_FILE environment variable to override the default location of the credentials file. +When this environment variable is set, it provides the only source of authentication data that is used by pixi.

+

E.g.

+
export RATTLER_AUTH_FILE=$HOME/credentials.json
+# You can also specify the file in the command line
+pixi global install --auth-file $HOME/credentials.json ...
+
+

The JSON should follow the following format:

+
{
+    "*.prefix.dev": {
+        "BearerToken": "your_token"
+    },
+    "otherhost.com": {
+        "BasicHTTP": {
+            "username": "your_username",
+            "password": "your_password"
+        }
+    },
+    "conda.anaconda.org": {
+        "CondaToken": "your_token"
+    }
+}
+
+

Note: if you use a wildcard in the host, any subdomain will match (e.g. *.prefix.dev also matches repo.prefix.dev).
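The wildcard can be thought of as a glob on the host name. A minimal sketch of the subdomain case (the exact matching rules are rattler's; this only illustrates the idea):

```shell
# Sketch: does a host fall under the "*.prefix.dev" wildcard entry?
matches_prefix_dev() {
  case "$1" in
    *.prefix.dev) echo yes ;;
    *) echo no ;;
  esac
}
matches_prefix_dev repo.prefix.dev   # subdomain matches
matches_prefix_dev example.com       # unrelated host does not
```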

+

Lastly you can set the authentication override file in the global configuration file.

+

PyPI authentication#

+

Currently, we support the following methods for authenticating against PyPI:

+
    +
  1. keyring authentication.
  2. +
  3. .netrc file authentication.
  4. +
+

We want to add more methods in the future, so if you have a specific method you would like to see, please let us know.

+

Keyring authentication#

+

Currently, pixi supports the uv method of authentication through the python keyring library.

+

Installing keyring#

+

To install keyring you can use pixi global install:

+
+
+
+
pixi global install keyring
+
+
+
+
pixi global install keyring --with keyrings.google-artifactregistry-auth
+
+
+
+
pixi global install keyring --with keyring.artifacts
+
+
+
+
+

For other registries, you will need to adapt these instructions to add the right keyring backend.

+

Configuring your project to use keyring#

+
+
+
+

Use keyring to store your credentials e.g:

+
keyring set https://my-index/simple your_username
+# prompt will appear for your password
+
+

Add the following configuration to your pixi manifest, making sure to include your_username@ in the URL of the registry:

+
[pypi-options]
+index-url = "https://your_username@custom-registry.com/simple"
+
+
+
+

After making sure you are logged in, for instance by running gcloud auth login, add the following configuration to your pixi manifest:

+
[pypi-options]
+extra-index-urls = ["https://oauth2accesstoken@<location>-python.pkg.dev/<project>/<repository>/simple"]
+
+
+

Note

+

To find this URL more easily, you can use the gcloud command:

+
gcloud artifacts print-settings python --project=<project> --repository=<repository> --location=<location>
+
+
+
+
+

After following the keyring.artifacts instructions and making sure that keyring works correctly, add the following configuration to your pixi manifest:

+
[pypi-options]
+extra-index-urls = ["https://VssSessionToken@pkgs.dev.azure.com/{organization}/{project}/_packaging/{feed}/pypi/simple/"]
+
+
+
+
+

Installing your environment#

+

Either configure your Global Config, or use the flag --pypi-keyring-provider which can either be set to subprocess (activated) or disabled:

+
# From an existing pixi project
+pixi install --pypi-keyring-provider subprocess
+
+

.netrc file#

+

pixi allows you to access private registries securely by authenticating with credentials stored in a .netrc file.

+
    +
  • The .netrc file can be stored in your home directory ($HOME/.netrc for Unix-like systems)
  • +
  • or in the user profile directory on Windows (%HOME%\_netrc).
  • +
  • You can also set up a different location for it using the NETRC variable (export NETRC=/my/custom/location/.netrc), e.g. NETRC=/my/custom/location/.netrc pixi install
  • +
+

In the .netrc file, you store authentication details like this:

+

machine registry-name
+login admin
+password admin
+
+For more details, you can access the .netrc docs.

+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/advanced/channel_priority/index.html b/v0.39.2/advanced/channel_priority/index.html new file mode 100644 index 000000000..a9f409a62 --- /dev/null +++ b/v0.39.2/advanced/channel_priority/index.html @@ -0,0 +1,1817 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Channel Logic - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

All logic deciding which dependencies can be installed from which channel is driven by the instructions we give the solver.

+

The actual code regarding this is in the rattler_solve crate. +This might however be hard to read. +Therefore, this document will continue with simplified flow charts.

+

Channel specific dependencies#

+

When a user defines a channel per dependency, the solver needs to know the other channels are unusable for this dependency. +

[project]
+channels = ["conda-forge", "my-channel"]
+
+[dependencies]
+packagex = { version = "*", channel = "my-channel" }
+
+In the packagex example, the solver will understand that the package is only available in my-channel and will not look for it in conda-forge.

+

The flowchart of the logic that excludes all other channels:

+
flowchart TD
+    A[Start] --> B[Given a Dependency]
+    B --> C{Channel Specific Dependency?}
+    C -->|Yes| D[Exclude All Other Channels for This Package]
+    C -->|No| E{Any Other Dependencies?}
+    E -->|Yes| B
+    E -->|No| F[End]
+    D --> E
+

Channel priority#

+

Channel priority is dictated by the order in the project.channels array, where the first channel is the highest priority. +For instance: +

[project]
+channels = ["conda-forge", "my-channel", "your-channel"]
+
+If the package is found in conda-forge the solver will not look for it in my-channel and your-channel, because it tells the solver they are excluded. +If the package is not found in conda-forge the solver will look for it in my-channel and if it is found there it will tell the solver to exclude your-channel for this package. +This diagram explains the logic: +
flowchart TD
+    A[Start] --> B[Given a Dependency]
+    B --> C{Loop Over Channels}
+    C --> D{Package in This Channel?}
+    D -->|No| C
+    D -->|Yes| E{"This the first channel
+     for this package?"}
+    E -->|Yes| F[Include Package in Candidates]
+    E -->|No| G[Exclude Package from Candidates]
+    F --> H{Any Other Channels?}
+    G --> H
+    H -->|Yes| C
+    H -->|No| I{Any Other Dependencies?}
+    I -->|No| J[End]
+    I -->|Yes| B

+

This method ensures the solver only adds a package to the candidates if it's found in the highest priority channel available. +If you have 10 channels and the package is found in the 5th channel it will exclude the next 5 channels from the candidates if they also contain the package.

+

Use case: pytorch and nvidia with conda-forge#

+

A common use case is to use pytorch with nvidia drivers, while also needing the conda-forge channel for the main dependencies. +

[project]
+channels = ["nvidia/label/cuda-11.8.0", "nvidia", "conda-forge", "pytorch"]
+platforms = ["linux-64"]
+
+[dependencies]
+cuda = {version = "*", channel="nvidia/label/cuda-11.8.0"}
+pytorch = {version = "2.0.1.*", channel="pytorch"}
+torchvision = {version = "0.15.2.*", channel="pytorch"}
+pytorch-cuda = {version = "11.8.*", channel="pytorch"}
+python = "3.10.*"
+
+What this will do is get as much as possible from the nvidia/label/cuda-11.8.0 channel, which is actually only the cuda package.

+

Then it will get all packages from the nvidia channel, which is a little more and some packages overlap the nvidia and conda-forge channel. +Like the cuda-cudart package, which will now only be retrieved from the nvidia channel because of the priority logic.

+

Then it will get the packages from the conda-forge channel, which is the main channel for the dependencies.

+

But the user only wants the pytorch packages from the pytorch channel, which is why pytorch is added last and the dependencies are added as channel specific dependencies.

+

We don't define the pytorch channel before conda-forge because we want to get as much as possible from conda-forge, as the pytorch channel does not always ship the best versions of all packages.

+

For example, it also ships the ffmpeg package, but only an old version that doesn't work with the newer pytorch versions, which would break the installation if the priority logic skipped the conda-forge channel for ffmpeg.

+

Force a specific channel priority#

+

If you want to force a specific priority for a channel, you can use the priority (int) key in the channel definition. The higher the number, the higher the priority. Unspecified priorities default to 0, but the index in the array still counts as a priority: the first entry in the list has the highest priority.

+

This priority definition is mostly important for multiple environments with different channel priorities, as by default feature channels are prepended to the project channels.

+

[project]
+name = "test_channel_priority"
+platforms = ["linux-64", "osx-64", "win-64", "osx-arm64"]
+channels = ["conda-forge"]
+
+[feature.a]
+channels = ["nvidia"]
+
+[feature.b]
+channels = [ "pytorch", {channel = "nvidia", priority = 1}]
+
+[feature.c]
+channels = [ "pytorch", {channel = "nvidia", priority = -1}]
+
+[environments]
+a = ["a"]
+b = ["b"]
+c = ["c"]
+
+This example creates 4 environments: a, b, c, and the default environment, which will have the following channel order:

+ + + + + + + + + + + + + + + + + + + + + + + + + +
EnvironmentResulting Channels order
defaultconda-forge
anvidia, conda-forge
bnvidia, pytorch, conda-forge
cpytorch, conda-forge, nvidia
+
+Check priority result with pixi info +

Using pixi info you can check the priority of the channels in the environment. +

pixi info
+Environments
+------------
+       Environment: default
+          Features: default
+          Channels: conda-forge
+Dependency count: 0
+Target platforms: linux-64
+
+       Environment: a
+          Features: a, default
+          Channels: nvidia, conda-forge
+Dependency count: 0
+Target platforms: linux-64
+
+       Environment: b
+          Features: b, default
+          Channels: nvidia, pytorch, conda-forge
+Dependency count: 0
+Target platforms: linux-64
+
+       Environment: c
+          Features: c, default
+          Channels: pytorch, conda-forge, nvidia
+Dependency count: 0
+Target platforms: linux-64
+

+
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/advanced/explain_info_command/index.html b/v0.39.2/advanced/explain_info_command/index.html new file mode 100644 index 000000000..c01f28adb --- /dev/null +++ b/v0.39.2/advanced/explain_info_command/index.html @@ -0,0 +1,2075 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Info command - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

Info Command

+ +

pixi info prints out useful information to debug a situation or to get an overview of your machine/project. +This information can also be retrieved in json format using the --json flag, which can be useful for programmatically reading it.

+
Running pixi info in the pixi repo
➜ pixi info
+      Pixi version: 0.13.0
+          Platform: linux-64
+  Virtual packages: __unix=0=0
+                  : __linux=6.5.12=0
+                  : __glibc=2.36=0
+                  : __cuda=12.3=0
+                  : __archspec=1=x86_64
+         Cache dir: /home/user/.cache/rattler/cache
+      Auth storage: /home/user/.rattler/credentials.json
+
+Project
+------------
+           Version: 0.13.0
+     Manifest file: /home/user/development/pixi/pixi.toml
+      Last updated: 25-01-2024 10:29:08
+
+Environments
+------------
+default
+          Features: default
+          Channels: conda-forge
+  Dependency count: 10
+      Dependencies: pre-commit, rust, openssl, pkg-config, git, mkdocs, mkdocs-material, pillow, cairosvg, compilers
+  Target platforms: linux-64, osx-arm64, win-64, osx-64
+             Tasks: docs, test-all, test, build, lint, install, build-docs
+
+
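Because the --json output is machine-readable, it is easy to feed into other tools. A sketch of extracting a single field (the JSON below is a hand-written stand-in for real pixi info --json output, the key layout is an assumption, and python3 is assumed to be on PATH):

```shell
# Simulated `pixi info --json` output; in practice you would pipe the
# real command instead: pixi info --json | ...
INFO_JSON='{"platform": "linux-64", "version": "0.39.2"}'
# Pull one field out of the JSON with a tiny python one-liner
PLATFORM=$(printf '%s' "$INFO_JSON" | python3 -c 'import json,sys; print(json.load(sys.stdin)["platform"])')
echo "$PLATFORM"
```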

Global info#

+

The first part of the info output is information that is always available and tells you what pixi can read on your machine.

+

Platform#

+

This defines the platform you're currently on according to pixi. +If this is incorrect, please file an issue on the pixi repo.

+

Virtual packages#

+

The virtual packages that pixi can find on your machine.

+

In the Conda ecosystem, you can depend on virtual packages. These packages aren't real dependencies that are going to be installed, but rather are used in the solve step to check whether a package can be installed on the machine. A simple example: when a package depends on CUDA drivers being present on the host machine, it can do so by depending on the __cuda virtual package. In that case, if pixi cannot find the __cuda virtual package on your machine, the installation will fail.
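As a hedged illustration, pixi exposes some of these virtual packages through the [system-requirements] table in the manifest; the version below is an arbitrary example:

```toml
[system-requirements]
# Tell the solver to assume a __cuda virtual package of at least this version
cuda = "12"
```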

+

Cache dir#

+

The directory where pixi stores its cache. +Checkout the cache documentation for more information.

+

Auth storage#

+

Check the authentication documentation

+

Cache size#

+

[requires --extended]

+

The size of the previously mentioned "Cache dir" in Mebibytes.

+

Project info#

+

Everything below Project is info about the project you're currently in. +This info is only available if your path has a manifest file.

+

Manifest file#

+

The path to the manifest file that describes the project.

+

Last updated#

+

The last time the lock file was updated, either manually or by pixi itself.

+

Environment info#

+

The environment info defined per environment. If you don't have any environments defined, this will only show the default environment.

+

Features#

+

This lists which features are enabled in the environment. For the default environment this is only default.

+

Channels#

+

The list of channels used in this environment.

+

Dependency count#

+

The number of dependencies defined for this environment (not the number of installed dependencies).

+

Dependencies#

+

The list of dependencies defined for this environment.

+

Target platforms#

+

The platforms the project has defined.

+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/advanced/github_actions/index.html b/v0.39.2/advanced/github_actions/index.html new file mode 100644 index 000000000..7536111fd --- /dev/null +++ b/v0.39.2/advanced/github_actions/index.html @@ -0,0 +1,2224 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + GitHub Action - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

GitHub Actions

+ + + +

We created prefix-dev/setup-pixi to facilitate using pixi in CI.

+

Usage#

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    pixi-version: v0.39.2
+    cache: true
+    auth-host: prefix.dev
+    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}
+- run: pixi run test
+
+
+

Pin your action versions

+

Since pixi is not yet stable, the API of this action may change between minor versions. +Please pin the versions of this action to a specific version (i.e., prefix-dev/setup-pixi@v0.8.0) to avoid breaking changes. +You can automatically update the version of this action by using Dependabot.

+

Put the following in your .github/dependabot.yml file to enable Dependabot for your GitHub Actions:

+
.github/dependabot.yml
version: 2
+updates:
+  - package-ecosystem: github-actions
+    directory: /
+    schedule:
+      interval: monthly # (1)!
+    groups:
+      dependencies:
+        patterns:
+          - "*"
+
+
    +
  1. or daily, weekly
  2. +
+
+

Features#

+

To see all available input arguments, see the action.yml file in setup-pixi. +The most important features are described below.

+

Caching#

+

The action supports caching of the pixi environment. +By default, caching is enabled if a pixi.lock file is present. +It will then use the pixi.lock file to generate a hash of the environment and cache it. +If the cache is hit, the action will skip the installation and use the cached environment. +You can specify the behavior by setting the cache input argument.

+
+

Customize your cache key

+

If you need to customize your cache-key, you can use the cache-key input argument. +This will be the prefix of the cache key. The full cache key will be <cache-key><conda-arch>-<hash>.
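A minimal sketch of a customized cache key; the my-ci- prefix is an illustrative value, not a required name:

```yaml
- uses: prefix-dev/setup-pixi@v0.8.0
  with:
    cache: true
    cache-key: my-ci- # full cache key becomes my-ci-<conda-arch>-<hash>
```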

+
+
+

Only save caches on main

+

To avoid hitting the 10 GB cache size limit too quickly, you might want to restrict when the cache is saved. +This can be done by setting the cache-write argument.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    cache: true
+    cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}
+
+
+

Multiple environments#

+

With pixi, you can create multiple environments for different requirements. +You can also specify which environment(s) you want to install by setting the environments input argument. +This will install all environments that are specified and cache them.

+
[project]
+name = "my-package"
+channels = ["conda-forge"]
+platforms = ["linux-64"]
+
+[dependencies]
+python = ">=3.11"
+pip = "*"
+polars = ">=0.14.24,<0.21"
+
+[feature.py311.dependencies]
+python = "3.11.*"
+[feature.py312.dependencies]
+python = "3.12.*"
+
+[environments]
+py311 = ["py311"]
+py312 = ["py312"]
+
+

Multiple environments using a matrix#

+

The following example will install the py311 and py312 environments in different jobs.

+
test:
+  runs-on: ubuntu-latest
+  strategy:
+    matrix:
+      environment: [py311, py312]
+  steps:
+  - uses: actions/checkout@v4
+  - uses: prefix-dev/setup-pixi@v0.8.0
+    with:
+      environments: ${{ matrix.environment }}
+
+

Install multiple environments in one job#

+

The following example will install both the py311 and the py312 environment on the runner.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    environments: >- # (1)!
+      py311
+      py312
+- run: |
+  pixi run -e py311 test
+  pixi run -e py312 test
+
+
    +
  1. +

    separated by spaces, equivalent to

    +
    environments: py311 py312
    +
    +
  2. +
+
+

Caching behavior if you don't specify environments

+

If you don't specify any environment, the default environment will be installed and cached, even if you use other environments.

+
+

Authentication#

+

There are currently three ways to authenticate with pixi:

+
    +
  • using a token
  • +
  • using a username and password
  • +
  • using a conda-token
  • +
+

For more information, see Authentication.

+
+

Handle secrets with care

+

Please only store sensitive information using GitHub secrets. Do not store them in your repository. +When your sensitive information is stored in a GitHub secret, you can access it using the ${{ secrets.SECRET_NAME }} syntax. +These secrets will always be masked in the logs.

+
+

Token#

+

Specify the token using the auth-token input argument. +This form of authentication (bearer token in the request headers) is mainly used at prefix.dev.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    auth-host: prefix.dev
+    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}
+
+

Username and password#

+

Specify the username and password using the auth-username and auth-password input arguments. +This form of authentication (HTTP Basic Auth) is used in some enterprise environments with artifactory for example.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    auth-host: custom-artifactory.com
+    auth-username: ${{ secrets.PIXI_USERNAME }}
+    auth-password: ${{ secrets.PIXI_PASSWORD }}
+
+

Conda-token#

+

Specify the conda-token using the conda-token input argument. +This form of authentication (token is encoded in URL: https://my-quetz-instance.com/t/<token>/get/custom-channel) is used at anaconda.org or with quetz instances.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    auth-host: anaconda.org # (1)!
+    conda-token: ${{ secrets.CONDA_TOKEN }}
+
+
    +
  1. or my-quetz-instance.com
  2. +
+

Custom shell wrapper#

+

setup-pixi allows you to run commands inside the pixi environment by specifying a custom shell wrapper with shell: pixi run bash -e {0}. +This can be useful if you want to run commands inside the pixi environment, but don't want to prefix every command with pixi run.

+
- run: | # (1)!
+    python --version
+    pip install --no-deps -e .
+  shell: pixi run bash -e {0}
+
+
    +
  1. everything here will be run inside of the pixi environment
  2. +
+

You can even run Python scripts like this:

+
- run: | # (1)!
+    import my_package
+    print("Hello world!")
+  shell: pixi run python {0}
+
+
    +
  1. everything here will be run inside of the pixi environment
  2. +
+

If you want to use PowerShell, you need to specify -Command as well.

+
- run: | # (1)!
+    python --version | Select-String "3.11"
+  shell: pixi run pwsh -Command {0} # pwsh works on all platforms
+
+
    +
  1. everything here will be run inside of the pixi environment
  2. +
+
+

How does it work under the hood?

+

Under the hood, the shell: xyz {0} option is implemented by creating a temporary script file and calling xyz with that script file as an argument. +This file does not have the executable bit set, so you cannot use shell: pixi run {0} directly but instead have to use shell: pixi run bash {0}. +There are some custom shells provided by GitHub that have slightly different behavior, see jobs.<job_id>.steps[*].shell in the documentation. +See the official documentation and ADR 0277 for more information about how the shell: input works in GitHub Actions.

+
+

One-off shell wrapper using pixi exec#

+

With pixi exec, you can also run a one-off command inside a temporary pixi environment.

+
- run: | # (1)!
+    zstd --version
+  shell: pixi exec --spec zstd -- bash -e {0}
+
+
    +
  1. everything here will be run inside of the temporary pixi environment
  2. +
+
- run: | # (1)!
+    import ruamel.yaml
+    # ...
+  shell: pixi exec --spec python=3.11.* --spec ruamel.yaml -- python {0}
+
+
    +
  1. everything here will be run inside of the temporary pixi environment
  2. +
+

See here for more information about pixi exec.

+

Environment activation#

+

Instead of using a custom shell wrapper, you can also make all pixi-installed binaries available to subsequent steps by "activating" the installed environment in the currently running job. +To this end, setup-pixi adds all environment variables set when executing pixi run to $GITHUB_ENV and, similarly, adds all path modifications to $GITHUB_PATH. +As a result, all installed binaries can be accessed without having to call pixi run.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    activate-environment: true
+
+

If you are installing multiple environments, you will need to specify the name of the environment that you want to be activated.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    environments: >-
+      py311
+      py312
+    activate-environment: py311
+
+

Activating an environment may be more useful than using a custom shell wrapper as it allows non-shell based steps to access binaries on the path. +However, be aware that this option augments the environment of your job.

+

--frozen and --locked#

+

You can specify whether setup-pixi should run pixi install --frozen or pixi install --locked depending on the frozen or the locked input argument. +See the official documentation for more information about the --frozen and --locked flags.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    locked: true
+    # or
+    frozen: true
+
+

If you don't specify anything, the default behavior is to run pixi install --locked if a pixi.lock file is present and pixi install otherwise.

+

Debugging#

+

There are two types of debug logging that you can enable.

+

Debug logging of the action#

+

The first one is the debug logging of the action itself. +This can be enabled by re-running the action in debug mode:

+

Re-run in debug mode +Re-run in debug mode

+
+

Debug logging documentation

+

For more information about debug logging in GitHub Actions, see the official documentation.

+
+

Debug logging of pixi#

+

The second type is the debug logging of the pixi executable. +This can be specified by setting the log-level input.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    log-level: vvv # (1)!
+
+
    +
  1. One of q, default, v, vv, or vvv.
  2. +
+

If nothing is specified, log-level will default to default or vv, depending on whether debug logging is enabled for the action.

+

Self-hosted runners#

+

On self-hosted runners, it may happen that some files are persisted between jobs. +This can lead to problems or secrets getting leaked between job runs. +To avoid this, you can use the post-cleanup input to specify the post cleanup behavior of the action (i.e., what happens after all your commands have been executed).

+

If you set post-cleanup to true, the action will delete the following files:

+
    +
  • .pixi environment
  • +
  • the pixi binary
  • +
  • the rattler cache
  • +
  • other rattler files in ~/.rattler
  • +
+

If nothing is specified, post-cleanup will default to true.

+

On self-hosted runners, you also might want to alter the default pixi install location to a temporary location. You can use pixi-bin-path: ${{ runner.temp }}/bin/pixi to do this.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    post-cleanup: true
+    pixi-bin-path: ${{ runner.temp }}/bin/pixi # (1)!
+
+
    +
  1. ${{ runner.temp }}\Scripts\pixi.exe on Windows
  2. +
+

You can also use a preinstalled local version of pixi on the runner by not setting any of the pixi-version, +pixi-url or pixi-bin-path inputs. This action will then try to find a local version of pixi in the runner's PATH.
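A minimal sketch of relying on a preinstalled pixi: none of the pixi-version, pixi-url, or pixi-bin-path inputs are set, so the action falls back to the pixi found on the runner's PATH (the cache input is optional here):

```yaml
- uses: prefix-dev/setup-pixi@v0.8.0
  with:
    cache: true # pixi itself comes from the runner's PATH
```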

+

Using the pyproject.toml as a manifest file for pixi#

+

setup-pixi will automatically pick up the pyproject.toml if it contains a [tool.pixi.project] section and no pixi.toml. +This can be overwritten by setting the manifest-path input argument.

+
- uses: prefix-dev/setup-pixi@v0.8.0
+  with:
+    manifest-path: pyproject.toml
+
+

More examples#

+

If you want to see more examples, you can take a look at the GitHub Workflows of the setup-pixi repository.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/advanced/global_configuration/index.html b/v0.39.2/advanced/global_configuration/index.html new file mode 100644 index 000000000..d6f7b6013 --- /dev/null +++ b/v0.39.2/advanced/global_configuration/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/v0.39.2/advanced/multi_platform_configuration/index.html b/v0.39.2/advanced/multi_platform_configuration/index.html new file mode 100644 index 000000000..00caef49d --- /dev/null +++ b/v0.39.2/advanced/multi_platform_configuration/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/v0.39.2/advanced/production_deployment/index.html b/v0.39.2/advanced/production_deployment/index.html new file mode 100644 index 000000000..bebf01cf7 --- /dev/null +++ b/v0.39.2/advanced/production_deployment/index.html @@ -0,0 +1,1969 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Production Deployment - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + Skip to content + + +
+
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

Bringing pixi to production#

+

You can bring pixi projects into production either by containerizing them using tools like Docker or by using quantco/pixi-pack.

+
+

@pavelzw from QuantCo wrote a blog post about bringing pixi to production. You can read it here.

+
+

Docker#

+ + +

We provide a simple docker image at pixi-docker that contains the pixi executable on top of different base images.

+

The images are available on ghcr.io/prefix-dev/pixi.

+

There are different tags for different base images available:

+
    +
  • latest - based on ubuntu:jammy
  • +
  • focal - based on ubuntu:focal
  • +
  • bullseye - based on debian:bullseye
  • +
  • jammy-cuda-12.2.2 - based on nvidia/cuda:12.2.2-jammy
  • +
  • ... and more
  • +
+
+

All tags

+

For all tags, take a look at the build script.

+
+

Example usage#

+

The following example uses the pixi docker image as a base image for a multi-stage build. +It also makes use of pixi shell-hook so that the production container does not rely on pixi being installed.

+
+

More examples

+

For more examples, take a look at pavelzw/pixi-docker-example.

+
+
FROM ghcr.io/prefix-dev/pixi:0.39.2 AS build
+
+# copy source code, pixi.toml and pixi.lock to the container
+WORKDIR /app
+COPY . .
+# install dependencies to `/app/.pixi/envs/prod`
+# use `--locked` to ensure the lockfile is up to date with pixi.toml
+RUN pixi install --locked -e prod
+# create the shell-hook bash script to activate the environment
+RUN pixi shell-hook -e prod -s bash > /shell-hook
+RUN echo "#!/bin/bash" > /app/entrypoint.sh
+RUN cat /shell-hook >> /app/entrypoint.sh
+# extend the shell-hook script to run the command passed to the container
+RUN echo 'exec "$@"' >> /app/entrypoint.sh
+
+FROM ubuntu:24.04 AS production
+WORKDIR /app
+# only copy the production environment into prod container
+# please note that the "prefix" (path) needs to stay the same as in the build container
+COPY --from=build /app/.pixi/envs/prod /app/.pixi/envs/prod
+COPY --from=build --chmod=0755 /app/entrypoint.sh /app/entrypoint.sh
+# copy your project code into the container as well
+COPY ./my_project /app/my_project
+
+EXPOSE 8000
+ENTRYPOINT [ "/app/entrypoint.sh" ]
+# run your app inside the pixi environment
+CMD [ "uvicorn", "my_project:app", "--host", "0.0.0.0" ]
+
+

pixi-pack#

+ + +

pixi-pack is a simple tool that takes a pixi environment and packs it into a compressed archive that can be shipped to the target machine.

+

It can be installed via

+
pixi global install pixi-pack
+
+

Or by downloading our pre-built binaries from the releases page.

+

Instead of installing pixi-pack globally, you can also use pixi exec to run pixi-pack in a temporary environment:

+
pixi exec pixi-pack pack
+pixi exec pixi-pack unpack environment.tar
+
+

pixi-pack demo +pixi-pack demo

+

You can pack an environment with

+
pixi-pack pack --manifest-file pixi.toml --environment prod --platform linux-64
+
+

This will create a environment.tar file that contains all conda packages required to create the environment.

+
# environment.tar
+| pixi-pack.json
+| environment.yml
+| channel
+|    ├── noarch
+|    |    ├── tzdata-2024a-h0c530f3_0.conda
+|    |    ├── ...
+|    |    └── repodata.json
+|    └── linux-64
+|         ├── ca-certificates-2024.2.2-hbcca054_0.conda
+|         ├── ...
+|         └── repodata.json
+
+

Unpacking an environment#

+

With pixi-pack unpack environment.tar, you can unpack the environment on your target system. This will create a new conda environment in ./env that contains all packages specified in your pixi.toml. It also creates an activate.sh (or activate.bat on Windows) file that lets you activate the environment without needing to have conda or micromamba installed.

+

Cross-platform packs#

+

Since pixi-pack just downloads the .conda and .tar.bz2 files from the conda repositories, you can trivially create packs for different platforms.

+
pixi-pack pack --platform win-64
+
+
+

You can only unpack a pack on a system that has the same platform as the pack was created for.

+
+

Inject additional packages#

+

You can inject additional packages into the environment that are not specified in pixi.lock by using the --inject flag:

+
pixi-pack pack --inject local-package-1.0.0-hbefa133_0.conda --manifest-file pixi.toml
+
+

This can be particularly useful if you build the project itself and want to include the built package in the environment but still want to use pixi.lock from the project.

+

Unpacking without pixi-pack#

+

If you don't have pixi-pack available on your target system, you can still install the environment if you have conda or micromamba available. +Just unarchive the environment.tar, then you have a local channel on your system where all necessary packages are available. +Next to this local channel, you will find an environment.yml file that contains the environment specification. +You can then install the environment using conda or micromamba:

+
tar -xvf environment.tar
+micromamba create -p ./env --file environment.yml
+# or
+conda env create -p ./env --file environment.yml
+
+
+

The environment.yml and repodata.json files are only for this use case; pixi-pack unpack does not use them.

+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/advanced/pyproject_toml/index.html b/v0.39.2/advanced/pyproject_toml/index.html new file mode 100644 index 000000000..6558159fa --- /dev/null +++ b/v0.39.2/advanced/pyproject_toml/index.html @@ -0,0 +1,2043 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Pyproject.toml - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + Skip to content + + +
+
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

pyproject.toml in pixi#

+

We support the use of the pyproject.toml as our manifest file in pixi. +This allows the user to keep one file with all configuration. +The pyproject.toml file is a standard for Python projects. +We don't advise using the pyproject.toml file for anything other than Python projects; the pixi.toml is better suited for other types of projects.

+

Initial setup of the pyproject.toml file#

+

When you already have a pyproject.toml file in your project, you can run pixi init in that folder. Pixi will automatically

+
    +
  • Add a [tool.pixi.project] section to the file, with the platform and channel information required by pixi;
  • +
  • Add the current project as an editable pypi dependency;
  • +
  • Add some defaults to the .gitignore and .gitattributes files.
  • +
+

If you do not have an existing pyproject.toml file, you can run pixi init --format pyproject in your project folder. In that case, pixi will create a pyproject.toml manifest from scratch with some sane defaults.

+

Python dependency#

+

The pyproject.toml file supports the requires-python field. +Pixi understands that field and automatically adds the version to the dependencies.

+

This is an example of a pyproject.toml file with the requires-python field, which will be used as the python dependency:

+
pyproject.toml
[project]
+name = "my_project"
+requires-python = ">=3.9"
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+

Which is equivalent to:

+
equivalent pixi.toml
[project]
+name = "my_project"
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+[dependencies]
+python = ">=3.9"
+
+

Dependency section#

+

The pyproject.toml file supports the dependencies field. +Pixi understands that field and automatically adds the dependencies to the project as [pypi-dependencies].

+

This is an example of a pyproject.toml file with the dependencies field:

+
pyproject.toml
[project]
+name = "my_project"
+requires-python = ">=3.9"
+dependencies = [
+    "numpy",
+    "pandas",
+    "matplotlib",
+]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+

Which is equivalent to:

+
equivalent pixi.toml
[project]
+name = "my_project"
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+[pypi-dependencies]
+numpy = "*"
+pandas = "*"
+matplotlib = "*"
+
+[dependencies]
+python = ">=3.9"
+
+

You can overwrite these with conda dependencies by adding them to the dependencies field:

+
pyproject.toml
[project]
+name = "my_project"
+requires-python = ">=3.9"
+dependencies = [
+    "numpy",
+    "pandas",
+    "matplotlib",
+]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+[tool.pixi.dependencies]
+numpy = "*"
+pandas = "*"
+matplotlib = "*"
+
+

This would result in the conda dependencies being installed and the pypi dependencies being ignored, +as pixi prefers conda dependencies over pypi dependencies.

+

Optional dependencies#

+

If your python project includes groups of optional dependencies, pixi will automatically interpret them as pixi features of the same name with the associated pypi-dependencies.

+

You can add them to pixi environments manually, or use pixi init to set up the project, which will create one environment per feature. Self-references to other groups of optional dependencies are also handled.

+

For instance, imagine you have a project folder with a pyproject.toml file similar to:

+
[project]
+name = "my_project"
+dependencies = ["package1"]
+
+[project.optional-dependencies]
+test = ["pytest"]
+all = ["package2","my_project[test]"]
+
+

Running pixi init in that project folder will transform the pyproject.toml file into:

+
[project]
+name = "my_project"
+dependencies = ["package1"]
+
+[project.optional-dependencies]
+test = ["pytest"]
+all = ["package2","my_project[test]"]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64"] # if executed on linux
+
+[tool.pixi.environments]
+default = {features = [], solve-group = "default"}
+test = {features = ["test"], solve-group = "default"}
+all = {features = ["all", "test"], solve-group = "default"}
+
+

In this example, three environments will be created by pixi:

+
    +
  • default with 'package1' as pypi dependency
  • +
  • test with 'package1' and 'pytest' as pypi dependencies
  • +
  • all with 'package1', 'package2' and 'pytest' as pypi dependencies
  • +
+

All environments will be solved together, as indicated by the common solve-group, and added to the lock file. You can edit the [tool.pixi.environments] section manually to adapt it to your use case (e.g. if you do not need a particular environment).

+

Dependency groups#

+

If your python project includes dependency groups, pixi will automatically interpret them as pixi features of the same name with the associated pypi-dependencies.

+

You can add them to pixi environments manually, or use pixi init to set up the project, which will create one environment per dependency group.

+

For instance, imagine you have a project folder with a pyproject.toml file similar to:

+
[project]
+name = "my_project"
+dependencies = ["package1"]
+
+[dependency-groups]
+test = ["pytest"]
+docs = ["sphinx"]
+dev = [{include-group = "test"}, {include-group = "docs"}]
+
+

Running pixi init in that project folder will transform the pyproject.toml file into:

+
[project]
+name = "my_project"
+dependencies = ["package1"]
+
+[dependency-groups]
+test = ["pytest"]
+docs = ["sphinx"]
+dev = [{include-group = "test"}, {include-group = "docs"}]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64"] # if executed on linux
+
+[tool.pixi.environments]
+default = {features = [], solve-group = "default"}
+test = {features = ["test"], solve-group = "default"}
+docs = {features = ["docs"], solve-group = "default"}
+dev = {features = ["dev"], solve-group = "default"}
+
+

In this example, four environments will be created by pixi:

+
    +
  • default with 'package1' as pypi dependency
  • +
  • test with 'package1' and 'pytest' as pypi dependencies
  • +
  • docs with 'package1', 'sphinx' as pypi dependencies
  • +
  • dev with 'package1', 'sphinx' and 'pytest' as pypi dependencies
  • +
+

All environments will be solved together, as indicated by the common solve-group, and added to the lock file. You can edit the [tool.pixi.environments] section manually to adapt it to your use case (e.g. if you do not need a particular environment).

+

Example#

+

As the pyproject.toml file supports the full pixi spec with [tool.pixi] prepended, an example would look like this:

+
pyproject.toml
[project]
+name = "my_project"
+requires-python = ">=3.9"
+dependencies = [
+    "numpy",
+    "pandas",
+    "matplotlib",
+    "ruff",
+]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+[tool.pixi.dependencies]
+compilers = "*"
+cmake = "*"
+
+[tool.pixi.tasks]
+start = "python my_project/main.py"
+lint = "ruff lint"
+
+[tool.pixi.system-requirements]
+cuda = "11.0"
+
+[tool.pixi.feature.test.dependencies]
+pytest = "*"
+
+[tool.pixi.feature.test.tasks]
+test = "pytest"
+
+[tool.pixi.environments]
+test = ["test"]
+
+

Build-system section#

+

The pyproject.toml file normally contains a [build-system] section. Pixi will use this section to build and install the project if it is added as a pypi path dependency.

+

If the pyproject.toml file does not contain any [build-system] section, pixi will fall back to uv's default, which is equivalent to the below:

+
pyproject.toml
[build-system]
+requires = ["setuptools >= 40.8.0"]
+build-backend = "setuptools.build_meta:__legacy__"
+
+

Including a [build-system] section is highly recommended. If you are not sure which build-backend to use, the [build-system] section below is a good starting point for your pyproject.toml. +pixi init --format pyproject defaults to hatchling. +The advantages of hatchling over setuptools are outlined on its website.

+
pyproject.toml
[build-system]
+build-backend = "hatchling.build"
+requires = ["hatchling"]
+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/advanced/updates_github_actions/index.html b/v0.39.2/advanced/updates_github_actions/index.html new file mode 100644 index 000000000..660eb51fa --- /dev/null +++ b/v0.39.2/advanced/updates_github_actions/index.html @@ -0,0 +1,1840 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Update lockfiles with GitHub Actions - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + Skip to content + + +
+
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

Updates using GitHub Actions

+ +

You can leverage GitHub Actions in combination with pavelzw/pixi-diff-to-markdown +to automatically update your lockfiles, similar to dependabot or renovate in other ecosystems.

+

Update lockfiles +Update lockfiles

+
+

Dependabot/Renovate support for pixi

+

You can track native Dependabot support for pixi in dependabot/dependabot-core #2227 +and for Renovate in renovatebot/renovate #2213.

+
+

How to use#

+

To get started, create a new GitHub Actions workflow file in your repository.

+
.github/workflows/update-lockfiles.yml
name: Update lockfiles
+
+permissions: # (1)!
+  contents: write
+  pull-requests: write
+
+on:
+  workflow_dispatch:
+  schedule:
+    - cron: 0 5 1 * * # (2)!
+
+jobs:
+  pixi-update:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - name: Set up pixi
+        uses: prefix-dev/setup-pixi@v0.8.1
+        with:
+          run-install: false
+      - name: Update lockfiles
+        run: |
+          set -o pipefail
+          pixi update --json | pixi exec pixi-diff-to-markdown >> diff.md
+      - name: Create pull request
+        uses: peter-evans/create-pull-request@v7
+        with:
+          token: ${{ secrets.GITHUB_TOKEN }}
+          commit-message: Update pixi lockfile
+          title: Update pixi lockfile
+          body-path: diff.md
+          branch: update-pixi
+          base: main
+          labels: pixi
+          delete-branch: true
+          add-paths: pixi.lock
+
+
    +
  1. Needed for peter-evans/create-pull-request
  2. +
  3. Runs at 05:00, on day 1 of the month
  4. +
+

In order for this workflow to work, you need to set "Allow GitHub Actions to create and approve pull requests" to true in your repository settings (in "Actions" -> "General").

+
+

Tip

+

If you don't have any pypi-dependencies, you can use pixi update --json --no-install to speed up diff generation.

+
+

Allow GitHub Actions PRs +Allow GitHub Actions PRs

+

Triggering CI in automated PRs#

+

In order to prevent accidental recursive GitHub Workflow runs, GitHub decided to not trigger any workflows on automated PRs when using the default GITHUB_TOKEN. +There are a couple of ways to work around this limitation. You can find excellent documentation for this in peter-evans/create-pull-request; see here.

+

Customizing the summary#

+

You can customize the summary by either using command-line-arguments of pixi-diff-to-markdown or by specifying the configuration in pixi.toml under [tool.pixi-diff-to-markdown]. See the pixi-diff-to-markdown documentation or run pixi-diff-to-markdown --help for more information.

+

Using reusable workflows#

+

If you want to use the same workflow in multiple repositories in your GitHub organization, you can create a reusable workflow. +You can find more information in the GitHub documentation.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/assets/allow-github-actions-prs-dark.png b/v0.39.2/assets/allow-github-actions-prs-dark.png new file mode 100644 index 000000000..41a3a5143 Binary files /dev/null and b/v0.39.2/assets/allow-github-actions-prs-dark.png differ diff --git a/v0.39.2/assets/allow-github-actions-prs-light.png b/v0.39.2/assets/allow-github-actions-prs-light.png new file mode 100644 index 000000000..4b9a3664e Binary files /dev/null and b/v0.39.2/assets/allow-github-actions-prs-light.png differ diff --git a/v0.39.2/assets/images/favicon.png b/v0.39.2/assets/images/favicon.png new file mode 100644 index 000000000..1cf13b9f9 Binary files /dev/null and b/v0.39.2/assets/images/favicon.png differ diff --git a/v0.39.2/assets/images/social/CHANGELOG.png b/v0.39.2/assets/images/social/CHANGELOG.png new file mode 100644 index 000000000..bd4a2cd5f Binary files /dev/null and b/v0.39.2/assets/images/social/CHANGELOG.png differ diff --git a/v0.39.2/assets/images/social/Community.png b/v0.39.2/assets/images/social/Community.png new file mode 100644 index 000000000..bcb769a65 Binary files /dev/null and b/v0.39.2/assets/images/social/Community.png differ diff --git a/v0.39.2/assets/images/social/FAQ.png b/v0.39.2/assets/images/social/FAQ.png new file mode 100644 index 000000000..1b7872dd3 Binary files /dev/null and b/v0.39.2/assets/images/social/FAQ.png differ diff --git a/v0.39.2/assets/images/social/advanced/authentication.png b/v0.39.2/assets/images/social/advanced/authentication.png new file mode 100644 index 000000000..f17319c80 Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/authentication.png differ diff --git a/v0.39.2/assets/images/social/advanced/channel_priority.png b/v0.39.2/assets/images/social/advanced/channel_priority.png new file mode 100644 index 000000000..fc157166d Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/channel_priority.png differ diff --git 
a/v0.39.2/assets/images/social/advanced/explain_info_command.png b/v0.39.2/assets/images/social/advanced/explain_info_command.png new file mode 100644 index 000000000..2609f960d Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/explain_info_command.png differ diff --git a/v0.39.2/assets/images/social/advanced/github_actions.png b/v0.39.2/assets/images/social/advanced/github_actions.png new file mode 100644 index 000000000..52b1c4237 Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/github_actions.png differ diff --git a/v0.39.2/assets/images/social/advanced/production_deployment.png b/v0.39.2/assets/images/social/advanced/production_deployment.png new file mode 100644 index 000000000..35303a4a7 Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/production_deployment.png differ diff --git a/v0.39.2/assets/images/social/advanced/pyproject_toml.png b/v0.39.2/assets/images/social/advanced/pyproject_toml.png new file mode 100644 index 000000000..1f38d2b38 Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/pyproject_toml.png differ diff --git a/v0.39.2/assets/images/social/advanced/updates_github_actions.png b/v0.39.2/assets/images/social/advanced/updates_github_actions.png new file mode 100644 index 000000000..27615bb1d Binary files /dev/null and b/v0.39.2/assets/images/social/advanced/updates_github_actions.png differ diff --git a/v0.39.2/assets/images/social/basic_usage.png b/v0.39.2/assets/images/social/basic_usage.png new file mode 100644 index 000000000..82f808649 Binary files /dev/null and b/v0.39.2/assets/images/social/basic_usage.png differ diff --git a/v0.39.2/assets/images/social/examples/cpp-sdl.png b/v0.39.2/assets/images/social/examples/cpp-sdl.png new file mode 100644 index 000000000..b28db29c3 Binary files /dev/null and b/v0.39.2/assets/images/social/examples/cpp-sdl.png differ diff --git a/v0.39.2/assets/images/social/examples/opencv.png b/v0.39.2/assets/images/social/examples/opencv.png 
new file mode 100644 index 000000000..a5906c3e0 Binary files /dev/null and b/v0.39.2/assets/images/social/examples/opencv.png differ diff --git a/v0.39.2/assets/images/social/examples/ros2-nav2.png b/v0.39.2/assets/images/social/examples/ros2-nav2.png new file mode 100644 index 000000000..bfd17c0b6 Binary files /dev/null and b/v0.39.2/assets/images/social/examples/ros2-nav2.png differ diff --git a/v0.39.2/assets/images/social/features/advanced_tasks.png b/v0.39.2/assets/images/social/features/advanced_tasks.png new file mode 100644 index 000000000..3c1d8488a Binary files /dev/null and b/v0.39.2/assets/images/social/features/advanced_tasks.png differ diff --git a/v0.39.2/assets/images/social/features/environment.png b/v0.39.2/assets/images/social/features/environment.png new file mode 100644 index 000000000..474a1de61 Binary files /dev/null and b/v0.39.2/assets/images/social/features/environment.png differ diff --git a/v0.39.2/assets/images/social/features/global_tools.png b/v0.39.2/assets/images/social/features/global_tools.png new file mode 100644 index 000000000..f3eaad172 Binary files /dev/null and b/v0.39.2/assets/images/social/features/global_tools.png differ diff --git a/v0.39.2/assets/images/social/features/lockfile.png b/v0.39.2/assets/images/social/features/lockfile.png new file mode 100644 index 000000000..b193c900f Binary files /dev/null and b/v0.39.2/assets/images/social/features/lockfile.png differ diff --git a/v0.39.2/assets/images/social/features/multi_environment.png b/v0.39.2/assets/images/social/features/multi_environment.png new file mode 100644 index 000000000..94985636f Binary files /dev/null and b/v0.39.2/assets/images/social/features/multi_environment.png differ diff --git a/v0.39.2/assets/images/social/features/multi_platform_configuration.png b/v0.39.2/assets/images/social/features/multi_platform_configuration.png new file mode 100644 index 000000000..8b1945b8a Binary files /dev/null and 
b/v0.39.2/assets/images/social/features/multi_platform_configuration.png differ diff --git a/v0.39.2/assets/images/social/features/system_requirements.png b/v0.39.2/assets/images/social/features/system_requirements.png new file mode 100644 index 000000000..22ce04eeb Binary files /dev/null and b/v0.39.2/assets/images/social/features/system_requirements.png differ diff --git a/v0.39.2/assets/images/social/ide_integration/devcontainer.png b/v0.39.2/assets/images/social/ide_integration/devcontainer.png new file mode 100644 index 000000000..8571bb41d Binary files /dev/null and b/v0.39.2/assets/images/social/ide_integration/devcontainer.png differ diff --git a/v0.39.2/assets/images/social/ide_integration/jupyterlab.png b/v0.39.2/assets/images/social/ide_integration/jupyterlab.png new file mode 100644 index 000000000..6a388a76a Binary files /dev/null and b/v0.39.2/assets/images/social/ide_integration/jupyterlab.png differ diff --git a/v0.39.2/assets/images/social/ide_integration/pycharm.png b/v0.39.2/assets/images/social/ide_integration/pycharm.png new file mode 100644 index 000000000..3b0611c61 Binary files /dev/null and b/v0.39.2/assets/images/social/ide_integration/pycharm.png differ diff --git a/v0.39.2/assets/images/social/ide_integration/r_studio.png b/v0.39.2/assets/images/social/ide_integration/r_studio.png new file mode 100644 index 000000000..5c9cdd3a4 Binary files /dev/null and b/v0.39.2/assets/images/social/ide_integration/r_studio.png differ diff --git a/v0.39.2/assets/images/social/index.png b/v0.39.2/assets/images/social/index.png new file mode 100644 index 000000000..8403dfd92 Binary files /dev/null and b/v0.39.2/assets/images/social/index.png differ diff --git a/v0.39.2/assets/images/social/packaging.png b/v0.39.2/assets/images/social/packaging.png new file mode 100644 index 000000000..3443651d8 Binary files /dev/null and b/v0.39.2/assets/images/social/packaging.png differ diff --git a/v0.39.2/assets/images/social/reference/cli.png 
b/v0.39.2/assets/images/social/reference/cli.png new file mode 100644 index 000000000..6bfb43d3e Binary files /dev/null and b/v0.39.2/assets/images/social/reference/cli.png differ diff --git a/v0.39.2/assets/images/social/reference/pixi_configuration.png b/v0.39.2/assets/images/social/reference/pixi_configuration.png new file mode 100644 index 000000000..eba4b4801 Binary files /dev/null and b/v0.39.2/assets/images/social/reference/pixi_configuration.png differ diff --git a/v0.39.2/assets/images/social/reference/pixi_manifest.png b/v0.39.2/assets/images/social/reference/pixi_manifest.png new file mode 100644 index 000000000..3706f3cdd Binary files /dev/null and b/v0.39.2/assets/images/social/reference/pixi_manifest.png differ diff --git a/v0.39.2/assets/images/social/switching_from/conda.png b/v0.39.2/assets/images/social/switching_from/conda.png new file mode 100644 index 000000000..ef37902ad Binary files /dev/null and b/v0.39.2/assets/images/social/switching_from/conda.png differ diff --git a/v0.39.2/assets/images/social/switching_from/poetry.png b/v0.39.2/assets/images/social/switching_from/poetry.png new file mode 100644 index 000000000..f22c0b7d2 Binary files /dev/null and b/v0.39.2/assets/images/social/switching_from/poetry.png differ diff --git a/v0.39.2/assets/images/social/tutorials/python.png b/v0.39.2/assets/images/social/tutorials/python.png new file mode 100644 index 000000000..38adf5bce Binary files /dev/null and b/v0.39.2/assets/images/social/tutorials/python.png differ diff --git a/v0.39.2/assets/images/social/tutorials/ros2.png b/v0.39.2/assets/images/social/tutorials/ros2.png new file mode 100644 index 000000000..b73768930 Binary files /dev/null and b/v0.39.2/assets/images/social/tutorials/ros2.png differ diff --git a/v0.39.2/assets/images/social/tutorials/rust.png b/v0.39.2/assets/images/social/tutorials/rust.png new file mode 100644 index 000000000..64ecb4dc5 Binary files /dev/null and b/v0.39.2/assets/images/social/tutorials/rust.png differ diff 
--git a/v0.39.2/assets/images/social/vision.png b/v0.39.2/assets/images/social/vision.png new file mode 100644 index 000000000..832ca054c Binary files /dev/null and b/v0.39.2/assets/images/social/vision.png differ diff --git a/v0.39.2/assets/javascripts/bundle.dd8806f2.min.js b/v0.39.2/assets/javascripts/bundle.dd8806f2.min.js new file mode 100644 index 000000000..e22d189fd --- /dev/null +++ b/v0.39.2/assets/javascripts/bundle.dd8806f2.min.js @@ -0,0 +1,29 @@ +"use strict";(()=>{var Fi=Object.create;var gr=Object.defineProperty;var ji=Object.getOwnPropertyDescriptor;var Wi=Object.getOwnPropertyNames,Dt=Object.getOwnPropertySymbols,Ui=Object.getPrototypeOf,xr=Object.prototype.hasOwnProperty,no=Object.prototype.propertyIsEnumerable;var oo=(e,t,r)=>t in e?gr(e,t,{enumerable:!0,configurable:!0,writable:!0,value:r}):e[t]=r,R=(e,t)=>{for(var r in t||(t={}))xr.call(t,r)&&oo(e,r,t[r]);if(Dt)for(var r of Dt(t))no.call(t,r)&&oo(e,r,t[r]);return e};var io=(e,t)=>{var r={};for(var o in e)xr.call(e,o)&&t.indexOf(o)<0&&(r[o]=e[o]);if(e!=null&&Dt)for(var o of Dt(e))t.indexOf(o)<0&&no.call(e,o)&&(r[o]=e[o]);return r};var yr=(e,t)=>()=>(t||e((t={exports:{}}).exports,t),t.exports);var Di=(e,t,r,o)=>{if(t&&typeof t=="object"||typeof t=="function")for(let n of Wi(t))!xr.call(e,n)&&n!==r&&gr(e,n,{get:()=>t[n],enumerable:!(o=ji(t,n))||o.enumerable});return e};var Vt=(e,t,r)=>(r=e!=null?Fi(Ui(e)):{},Di(t||!e||!e.__esModule?gr(r,"default",{value:e,enumerable:!0}):r,e));var ao=(e,t,r)=>new Promise((o,n)=>{var i=p=>{try{s(r.next(p))}catch(c){n(c)}},a=p=>{try{s(r.throw(p))}catch(c){n(c)}},s=p=>p.done?o(p.value):Promise.resolve(p.value).then(i,a);s((r=r.apply(e,t)).next())});var co=yr((Er,so)=>{(function(e,t){typeof Er=="object"&&typeof so!="undefined"?t():typeof define=="function"&&define.amd?define(t):t()})(Er,function(){"use strict";function e(r){var 
o=!0,n=!1,i=null,a={text:!0,search:!0,url:!0,tel:!0,email:!0,password:!0,number:!0,date:!0,month:!0,week:!0,time:!0,datetime:!0,"datetime-local":!0};function s(H){return!!(H&&H!==document&&H.nodeName!=="HTML"&&H.nodeName!=="BODY"&&"classList"in H&&"contains"in H.classList)}function p(H){var mt=H.type,ze=H.tagName;return!!(ze==="INPUT"&&a[mt]&&!H.readOnly||ze==="TEXTAREA"&&!H.readOnly||H.isContentEditable)}function c(H){H.classList.contains("focus-visible")||(H.classList.add("focus-visible"),H.setAttribute("data-focus-visible-added",""))}function l(H){H.hasAttribute("data-focus-visible-added")&&(H.classList.remove("focus-visible"),H.removeAttribute("data-focus-visible-added"))}function f(H){H.metaKey||H.altKey||H.ctrlKey||(s(r.activeElement)&&c(r.activeElement),o=!0)}function u(H){o=!1}function h(H){s(H.target)&&(o||p(H.target))&&c(H.target)}function w(H){s(H.target)&&(H.target.classList.contains("focus-visible")||H.target.hasAttribute("data-focus-visible-added"))&&(n=!0,window.clearTimeout(i),i=window.setTimeout(function(){n=!1},100),l(H.target))}function A(H){document.visibilityState==="hidden"&&(n&&(o=!0),te())}function te(){document.addEventListener("mousemove",J),document.addEventListener("mousedown",J),document.addEventListener("mouseup",J),document.addEventListener("pointermove",J),document.addEventListener("pointerdown",J),document.addEventListener("pointerup",J),document.addEventListener("touchmove",J),document.addEventListener("touchstart",J),document.addEventListener("touchend",J)}function ie(){document.removeEventListener("mousemove",J),document.removeEventListener("mousedown",J),document.removeEventListener("mouseup",J),document.removeEventListener("pointermove",J),document.removeEventListener("pointerdown",J),document.removeEventListener("pointerup",J),document.removeEventListener("touchmove",J),document.removeEventListener("touchstart",J),document.removeEventListener("touchend",J)}function 
J(H){H.target.nodeName&&H.target.nodeName.toLowerCase()==="html"||(o=!1,ie())}document.addEventListener("keydown",f,!0),document.addEventListener("mousedown",u,!0),document.addEventListener("pointerdown",u,!0),document.addEventListener("touchstart",u,!0),document.addEventListener("visibilitychange",A,!0),te(),r.addEventListener("focus",h,!0),r.addEventListener("blur",w,!0),r.nodeType===Node.DOCUMENT_FRAGMENT_NODE&&r.host?r.host.setAttribute("data-js-focus-visible",""):r.nodeType===Node.DOCUMENT_NODE&&(document.documentElement.classList.add("js-focus-visible"),document.documentElement.setAttribute("data-js-focus-visible",""))}if(typeof window!="undefined"&&typeof document!="undefined"){window.applyFocusVisiblePolyfill=e;var t;try{t=new CustomEvent("focus-visible-polyfill-ready")}catch(r){t=document.createEvent("CustomEvent"),t.initCustomEvent("focus-visible-polyfill-ready",!1,!1,{})}window.dispatchEvent(t)}typeof document!="undefined"&&e(document)})});var Yr=yr((Rt,Kr)=>{/*! + * clipboard.js v2.0.11 + * https://clipboardjs.com/ + * + * Licensed MIT © Zeno Rocha + */(function(t,r){typeof Rt=="object"&&typeof Kr=="object"?Kr.exports=r():typeof define=="function"&&define.amd?define([],r):typeof Rt=="object"?Rt.ClipboardJS=r():t.ClipboardJS=r()})(Rt,function(){return function(){var e={686:function(o,n,i){"use strict";i.d(n,{default:function(){return Ii}});var a=i(279),s=i.n(a),p=i(370),c=i.n(p),l=i(817),f=i.n(l);function u(V){try{return document.execCommand(V)}catch(_){return!1}}var h=function(_){var O=f()(_);return u("cut"),O},w=h;function A(V){var _=document.documentElement.getAttribute("dir")==="rtl",O=document.createElement("textarea");O.style.fontSize="12pt",O.style.border="0",O.style.padding="0",O.style.margin="0",O.style.position="absolute",O.style[_?"right":"left"]="-9999px";var j=window.pageYOffset||document.documentElement.scrollTop;return O.style.top="".concat(j,"px"),O.setAttribute("readonly",""),O.value=V,O}var te=function(_,O){var 
j=A(_);O.container.appendChild(j);var D=f()(j);return u("copy"),j.remove(),D},ie=function(_){var O=arguments.length>1&&arguments[1]!==void 0?arguments[1]:{container:document.body},j="";return typeof _=="string"?j=te(_,O):_ instanceof HTMLInputElement&&!["text","search","url","tel","password"].includes(_==null?void 0:_.type)?j=te(_.value,O):(j=f()(_),u("copy")),j},J=ie;function H(V){"@babel/helpers - typeof";return typeof Symbol=="function"&&typeof Symbol.iterator=="symbol"?H=function(O){return typeof O}:H=function(O){return O&&typeof Symbol=="function"&&O.constructor===Symbol&&O!==Symbol.prototype?"symbol":typeof O},H(V)}var mt=function(){var _=arguments.length>0&&arguments[0]!==void 0?arguments[0]:{},O=_.action,j=O===void 0?"copy":O,D=_.container,Y=_.target,ke=_.text;if(j!=="copy"&&j!=="cut")throw new Error('Invalid "action" value, use either "copy" or "cut"');if(Y!==void 0)if(Y&&H(Y)==="object"&&Y.nodeType===1){if(j==="copy"&&Y.hasAttribute("disabled"))throw new Error('Invalid "target" attribute. Please use "readonly" instead of "disabled" attribute');if(j==="cut"&&(Y.hasAttribute("readonly")||Y.hasAttribute("disabled")))throw new Error(`Invalid "target" attribute. 
You can't cut text from elements with "readonly" or "disabled" attributes`)}else throw new Error('Invalid "target" value, use a valid Element');if(ke)return J(ke,{container:D});if(Y)return j==="cut"?w(Y):J(Y,{container:D})},ze=mt;function Ie(V){"@babel/helpers - typeof";return typeof Symbol=="function"&&typeof Symbol.iterator=="symbol"?Ie=function(O){return typeof O}:Ie=function(O){return O&&typeof Symbol=="function"&&O.constructor===Symbol&&O!==Symbol.prototype?"symbol":typeof O},Ie(V)}function _i(V,_){if(!(V instanceof _))throw new TypeError("Cannot call a class as a function")}function ro(V,_){for(var O=0;O<_.length;O++){var j=_[O];j.enumerable=j.enumerable||!1,j.configurable=!0,"value"in j&&(j.writable=!0),Object.defineProperty(V,j.key,j)}}function Ai(V,_,O){return _&&ro(V.prototype,_),O&&ro(V,O),V}function Ci(V,_){if(typeof _!="function"&&_!==null)throw new TypeError("Super expression must either be null or a function");V.prototype=Object.create(_&&_.prototype,{constructor:{value:V,writable:!0,configurable:!0}}),_&&br(V,_)}function br(V,_){return br=Object.setPrototypeOf||function(j,D){return j.__proto__=D,j},br(V,_)}function Hi(V){var _=Pi();return function(){var j=Wt(V),D;if(_){var Y=Wt(this).constructor;D=Reflect.construct(j,arguments,Y)}else D=j.apply(this,arguments);return ki(this,D)}}function ki(V,_){return _&&(Ie(_)==="object"||typeof _=="function")?_:$i(V)}function $i(V){if(V===void 0)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return V}function Pi(){if(typeof Reflect=="undefined"||!Reflect.construct||Reflect.construct.sham)return!1;if(typeof Proxy=="function")return!0;try{return Date.prototype.toString.call(Reflect.construct(Date,[],function(){})),!0}catch(V){return!1}}function Wt(V){return Wt=Object.setPrototypeOf?Object.getPrototypeOf:function(O){return O.__proto__||Object.getPrototypeOf(O)},Wt(V)}function vr(V,_){var O="data-clipboard-".concat(V);if(_.hasAttribute(O))return _.getAttribute(O)}var 
Ri=function(V){Ci(O,V);var _=Hi(O);function O(j,D){var Y;return _i(this,O),Y=_.call(this),Y.resolveOptions(D),Y.listenClick(j),Y}return Ai(O,[{key:"resolveOptions",value:function(){var D=arguments.length>0&&arguments[0]!==void 0?arguments[0]:{};this.action=typeof D.action=="function"?D.action:this.defaultAction,this.target=typeof D.target=="function"?D.target:this.defaultTarget,this.text=typeof D.text=="function"?D.text:this.defaultText,this.container=Ie(D.container)==="object"?D.container:document.body}},{key:"listenClick",value:function(D){var Y=this;this.listener=c()(D,"click",function(ke){return Y.onClick(ke)})}},{key:"onClick",value:function(D){var Y=D.delegateTarget||D.currentTarget,ke=this.action(Y)||"copy",Ut=ze({action:ke,container:this.container,target:this.target(Y),text:this.text(Y)});this.emit(Ut?"success":"error",{action:ke,text:Ut,trigger:Y,clearSelection:function(){Y&&Y.focus(),window.getSelection().removeAllRanges()}})}},{key:"defaultAction",value:function(D){return vr("action",D)}},{key:"defaultTarget",value:function(D){var Y=vr("target",D);if(Y)return document.querySelector(Y)}},{key:"defaultText",value:function(D){return vr("text",D)}},{key:"destroy",value:function(){this.listener.destroy()}}],[{key:"copy",value:function(D){var Y=arguments.length>1&&arguments[1]!==void 0?arguments[1]:{container:document.body};return J(D,Y)}},{key:"cut",value:function(D){return w(D)}},{key:"isSupported",value:function(){var D=arguments.length>0&&arguments[0]!==void 0?arguments[0]:["copy","cut"],Y=typeof D=="string"?[D]:D,ke=!!document.queryCommandSupported;return Y.forEach(function(Ut){ke=ke&&!!document.queryCommandSupported(Ut)}),ke}}]),O}(s()),Ii=Ri},828:function(o){var n=9;if(typeof Element!="undefined"&&!Element.prototype.matches){var i=Element.prototype;i.matches=i.matchesSelector||i.mozMatchesSelector||i.msMatchesSelector||i.oMatchesSelector||i.webkitMatchesSelector}function a(s,p){for(;s&&s.nodeType!==n;){if(typeof 
s.matches=="function"&&s.matches(p))return s;s=s.parentNode}}o.exports=a},438:function(o,n,i){var a=i(828);function s(l,f,u,h,w){var A=c.apply(this,arguments);return l.addEventListener(u,A,w),{destroy:function(){l.removeEventListener(u,A,w)}}}function p(l,f,u,h,w){return typeof l.addEventListener=="function"?s.apply(null,arguments):typeof u=="function"?s.bind(null,document).apply(null,arguments):(typeof l=="string"&&(l=document.querySelectorAll(l)),Array.prototype.map.call(l,function(A){return s(A,f,u,h,w)}))}function c(l,f,u,h){return function(w){w.delegateTarget=a(w.target,f),w.delegateTarget&&h.call(l,w)}}o.exports=p},879:function(o,n){n.node=function(i){return i!==void 0&&i instanceof HTMLElement&&i.nodeType===1},n.nodeList=function(i){var a=Object.prototype.toString.call(i);return i!==void 0&&(a==="[object NodeList]"||a==="[object HTMLCollection]")&&"length"in i&&(i.length===0||n.node(i[0]))},n.string=function(i){return typeof i=="string"||i instanceof String},n.fn=function(i){var a=Object.prototype.toString.call(i);return a==="[object Function]"}},370:function(o,n,i){var a=i(879),s=i(438);function p(u,h,w){if(!u&&!h&&!w)throw new Error("Missing required arguments");if(!a.string(h))throw new TypeError("Second argument must be a String");if(!a.fn(w))throw new TypeError("Third argument must be a Function");if(a.node(u))return c(u,h,w);if(a.nodeList(u))return l(u,h,w);if(a.string(u))return f(u,h,w);throw new TypeError("First argument must be a String, HTMLElement, HTMLCollection, or NodeList")}function c(u,h,w){return u.addEventListener(h,w),{destroy:function(){u.removeEventListener(h,w)}}}function l(u,h,w){return Array.prototype.forEach.call(u,function(A){A.addEventListener(h,w)}),{destroy:function(){Array.prototype.forEach.call(u,function(A){A.removeEventListener(h,w)})}}}function f(u,h,w){return s(document.body,u,h,w)}o.exports=p},817:function(o){function n(i){var a;if(i.nodeName==="SELECT")i.focus(),a=i.value;else 
if(i.nodeName==="INPUT"||i.nodeName==="TEXTAREA"){var s=i.hasAttribute("readonly");s||i.setAttribute("readonly",""),i.select(),i.setSelectionRange(0,i.value.length),s||i.removeAttribute("readonly"),a=i.value}else{i.hasAttribute("contenteditable")&&i.focus();var p=window.getSelection(),c=document.createRange();c.selectNodeContents(i),p.removeAllRanges(),p.addRange(c),a=p.toString()}return a}o.exports=n},279:function(o){function n(){}n.prototype={on:function(i,a,s){var p=this.e||(this.e={});return(p[i]||(p[i]=[])).push({fn:a,ctx:s}),this},once:function(i,a,s){var p=this;function c(){p.off(i,c),a.apply(s,arguments)}return c._=a,this.on(i,c,s)},emit:function(i){var a=[].slice.call(arguments,1),s=((this.e||(this.e={}))[i]||[]).slice(),p=0,c=s.length;for(p;p{"use strict";/*! + * escape-html + * Copyright(c) 2012-2013 TJ Holowaychuk + * Copyright(c) 2015 Andreas Lubbe + * Copyright(c) 2015 Tiancheng "Timothy" Gu + * MIT Licensed + */var ts=/["'&<>]/;ei.exports=rs;function rs(e){var t=""+e,r=ts.exec(t);if(!r)return t;var o,n="",i=0,a=0;for(i=r.index;i0&&i[i.length-1])&&(c[0]===6||c[0]===2)){r=0;continue}if(c[0]===3&&(!i||c[1]>i[0]&&c[1]=e.length&&(e=void 0),{value:e&&e[o++],done:!e}}};throw new TypeError(t?"Object is not iterable.":"Symbol.iterator is not defined.")}function N(e,t){var r=typeof Symbol=="function"&&e[Symbol.iterator];if(!r)return e;var o=r.call(e),n,i=[],a;try{for(;(t===void 0||t-- >0)&&!(n=o.next()).done;)i.push(n.value)}catch(s){a={error:s}}finally{try{n&&!n.done&&(r=o.return)&&r.call(o)}finally{if(a)throw a.error}}return i}function q(e,t,r){if(r||arguments.length===2)for(var o=0,n=t.length,i;o1||s(u,h)})})}function s(u,h){try{p(o[u](h))}catch(w){f(i[0][3],w)}}function p(u){u.value instanceof nt?Promise.resolve(u.value.v).then(c,l):f(i[0][2],u)}function c(u){s("next",u)}function l(u){s("throw",u)}function f(u,h){u(h),i.shift(),i.length&&s(i[0][0],i[0][1])}}function mo(e){if(!Symbol.asyncIterator)throw new TypeError("Symbol.asyncIterator is not 
defined.");var t=e[Symbol.asyncIterator],r;return t?t.call(e):(e=typeof de=="function"?de(e):e[Symbol.iterator](),r={},o("next"),o("throw"),o("return"),r[Symbol.asyncIterator]=function(){return this},r);function o(i){r[i]=e[i]&&function(a){return new Promise(function(s,p){a=e[i](a),n(s,p,a.done,a.value)})}}function n(i,a,s,p){Promise.resolve(p).then(function(c){i({value:c,done:s})},a)}}function k(e){return typeof e=="function"}function ft(e){var t=function(o){Error.call(o),o.stack=new Error().stack},r=e(t);return r.prototype=Object.create(Error.prototype),r.prototype.constructor=r,r}var zt=ft(function(e){return function(r){e(this),this.message=r?r.length+` errors occurred during unsubscription: +`+r.map(function(o,n){return n+1+") "+o.toString()}).join(` + `):"",this.name="UnsubscriptionError",this.errors=r}});function qe(e,t){if(e){var r=e.indexOf(t);0<=r&&e.splice(r,1)}}var Fe=function(){function e(t){this.initialTeardown=t,this.closed=!1,this._parentage=null,this._finalizers=null}return e.prototype.unsubscribe=function(){var t,r,o,n,i;if(!this.closed){this.closed=!0;var a=this._parentage;if(a)if(this._parentage=null,Array.isArray(a))try{for(var s=de(a),p=s.next();!p.done;p=s.next()){var c=p.value;c.remove(this)}}catch(A){t={error:A}}finally{try{p&&!p.done&&(r=s.return)&&r.call(s)}finally{if(t)throw t.error}}else a.remove(this);var l=this.initialTeardown;if(k(l))try{l()}catch(A){i=A instanceof zt?A.errors:[A]}var f=this._finalizers;if(f){this._finalizers=null;try{for(var u=de(f),h=u.next();!h.done;h=u.next()){var w=h.value;try{fo(w)}catch(A){i=i!=null?i:[],A instanceof zt?i=q(q([],N(i)),N(A.errors)):i.push(A)}}}catch(A){o={error:A}}finally{try{h&&!h.done&&(n=u.return)&&n.call(u)}finally{if(o)throw o.error}}}if(i)throw new zt(i)}},e.prototype.add=function(t){var r;if(t&&t!==this)if(this.closed)fo(t);else{if(t instanceof e){if(t.closed||t._hasParent(this))return;t._addParent(this)}(this._finalizers=(r=this._finalizers)!==null&&r!==void 
0?r:[]).push(t)}},e.prototype._hasParent=function(t){var r=this._parentage;return r===t||Array.isArray(r)&&r.includes(t)},e.prototype._addParent=function(t){var r=this._parentage;this._parentage=Array.isArray(r)?(r.push(t),r):r?[r,t]:t},e.prototype._removeParent=function(t){var r=this._parentage;r===t?this._parentage=null:Array.isArray(r)&&qe(r,t)},e.prototype.remove=function(t){var r=this._finalizers;r&&qe(r,t),t instanceof e&&t._removeParent(this)},e.EMPTY=function(){var t=new e;return t.closed=!0,t}(),e}();var Tr=Fe.EMPTY;function qt(e){return e instanceof Fe||e&&"closed"in e&&k(e.remove)&&k(e.add)&&k(e.unsubscribe)}function fo(e){k(e)?e():e.unsubscribe()}var $e={onUnhandledError:null,onStoppedNotification:null,Promise:void 0,useDeprecatedSynchronousErrorHandling:!1,useDeprecatedNextContext:!1};var ut={setTimeout:function(e,t){for(var r=[],o=2;o0},enumerable:!1,configurable:!0}),t.prototype._trySubscribe=function(r){return this._throwIfClosed(),e.prototype._trySubscribe.call(this,r)},t.prototype._subscribe=function(r){return this._throwIfClosed(),this._checkFinalizedStatuses(r),this._innerSubscribe(r)},t.prototype._innerSubscribe=function(r){var o=this,n=this,i=n.hasError,a=n.isStopped,s=n.observers;return i||a?Tr:(this.currentObservers=null,s.push(r),new Fe(function(){o.currentObservers=null,qe(s,r)}))},t.prototype._checkFinalizedStatuses=function(r){var o=this,n=o.hasError,i=o.thrownError,a=o.isStopped;n?r.error(i):a&&r.complete()},t.prototype.asObservable=function(){var r=new F;return r.source=this,r},t.create=function(r,o){return new Eo(r,o)},t}(F);var Eo=function(e){re(t,e);function t(r,o){var n=e.call(this)||this;return n.destination=r,n.source=o,n}return t.prototype.next=function(r){var o,n;(n=(o=this.destination)===null||o===void 0?void 0:o.next)===null||n===void 0||n.call(o,r)},t.prototype.error=function(r){var o,n;(n=(o=this.destination)===null||o===void 0?void 0:o.error)===null||n===void 0||n.call(o,r)},t.prototype.complete=function(){var 
r,o;(o=(r=this.destination)===null||r===void 0?void 0:r.complete)===null||o===void 0||o.call(r)},t.prototype._subscribe=function(r){var o,n;return(n=(o=this.source)===null||o===void 0?void 0:o.subscribe(r))!==null&&n!==void 0?n:Tr},t}(g);var _r=function(e){re(t,e);function t(r){var o=e.call(this)||this;return o._value=r,o}return Object.defineProperty(t.prototype,"value",{get:function(){return this.getValue()},enumerable:!1,configurable:!0}),t.prototype._subscribe=function(r){var o=e.prototype._subscribe.call(this,r);return!o.closed&&r.next(this._value),o},t.prototype.getValue=function(){var r=this,o=r.hasError,n=r.thrownError,i=r._value;if(o)throw n;return this._throwIfClosed(),i},t.prototype.next=function(r){e.prototype.next.call(this,this._value=r)},t}(g);var Lt={now:function(){return(Lt.delegate||Date).now()},delegate:void 0};var _t=function(e){re(t,e);function t(r,o,n){r===void 0&&(r=1/0),o===void 0&&(o=1/0),n===void 0&&(n=Lt);var i=e.call(this)||this;return i._bufferSize=r,i._windowTime=o,i._timestampProvider=n,i._buffer=[],i._infiniteTimeWindow=!0,i._infiniteTimeWindow=o===1/0,i._bufferSize=Math.max(1,r),i._windowTime=Math.max(1,o),i}return t.prototype.next=function(r){var o=this,n=o.isStopped,i=o._buffer,a=o._infiniteTimeWindow,s=o._timestampProvider,p=o._windowTime;n||(i.push(r),!a&&i.push(s.now()+p)),this._trimBuffer(),e.prototype.next.call(this,r)},t.prototype._subscribe=function(r){this._throwIfClosed(),this._trimBuffer();for(var o=this._innerSubscribe(r),n=this,i=n._infiniteTimeWindow,a=n._buffer,s=a.slice(),p=0;p0?e.prototype.schedule.call(this,r,o):(this.delay=o,this.state=r,this.scheduler.flush(this),this)},t.prototype.execute=function(r,o){return o>0||this.closed?e.prototype.execute.call(this,r,o):this._execute(r,o)},t.prototype.requestAsyncId=function(r,o,n){return n===void 0&&(n=0),n!=null&&n>0||n==null&&this.delay>0?e.prototype.requestAsyncId.call(this,r,o,n):(r.flush(this),0)},t}(vt);var So=function(e){re(t,e);function t(){return 
e!==null&&e.apply(this,arguments)||this}return t}(gt);var Hr=new So(To);var Oo=function(e){re(t,e);function t(r,o){var n=e.call(this,r,o)||this;return n.scheduler=r,n.work=o,n}return t.prototype.requestAsyncId=function(r,o,n){return n===void 0&&(n=0),n!==null&&n>0?e.prototype.requestAsyncId.call(this,r,o,n):(r.actions.push(this),r._scheduled||(r._scheduled=bt.requestAnimationFrame(function(){return r.flush(void 0)})))},t.prototype.recycleAsyncId=function(r,o,n){var i;if(n===void 0&&(n=0),n!=null?n>0:this.delay>0)return e.prototype.recycleAsyncId.call(this,r,o,n);var a=r.actions;o!=null&&((i=a[a.length-1])===null||i===void 0?void 0:i.id)!==o&&(bt.cancelAnimationFrame(o),r._scheduled=void 0)},t}(vt);var Mo=function(e){re(t,e);function t(){return e!==null&&e.apply(this,arguments)||this}return t.prototype.flush=function(r){this._active=!0;var o=this._scheduled;this._scheduled=void 0;var n=this.actions,i;r=r||n.shift();do if(i=r.execute(r.state,r.delay))break;while((r=n[0])&&r.id===o&&n.shift());if(this._active=!1,i){for(;(r=n[0])&&r.id===o&&n.shift();)r.unsubscribe();throw i}},t}(gt);var me=new Mo(Oo);var M=new F(function(e){return e.complete()});function Yt(e){return e&&k(e.schedule)}function kr(e){return e[e.length-1]}function Xe(e){return k(kr(e))?e.pop():void 0}function He(e){return Yt(kr(e))?e.pop():void 0}function Bt(e,t){return typeof kr(e)=="number"?e.pop():t}var xt=function(e){return e&&typeof e.length=="number"&&typeof e!="function"};function Gt(e){return k(e==null?void 0:e.then)}function Jt(e){return k(e[ht])}function Xt(e){return Symbol.asyncIterator&&k(e==null?void 0:e[Symbol.asyncIterator])}function Zt(e){return new TypeError("You provided "+(e!==null&&typeof e=="object"?"an invalid object":"'"+e+"'")+" where a stream was expected. 
You can provide an Observable, Promise, ReadableStream, Array, AsyncIterable, or Iterable.")}function Gi(){return typeof Symbol!="function"||!Symbol.iterator?"@@iterator":Symbol.iterator}var er=Gi();function tr(e){return k(e==null?void 0:e[er])}function rr(e){return lo(this,arguments,function(){var r,o,n,i;return Nt(this,function(a){switch(a.label){case 0:r=e.getReader(),a.label=1;case 1:a.trys.push([1,,9,10]),a.label=2;case 2:return[4,nt(r.read())];case 3:return o=a.sent(),n=o.value,i=o.done,i?[4,nt(void 0)]:[3,5];case 4:return[2,a.sent()];case 5:return[4,nt(n)];case 6:return[4,a.sent()];case 7:return a.sent(),[3,2];case 8:return[3,10];case 9:return r.releaseLock(),[7];case 10:return[2]}})})}function or(e){return k(e==null?void 0:e.getReader)}function W(e){if(e instanceof F)return e;if(e!=null){if(Jt(e))return Ji(e);if(xt(e))return Xi(e);if(Gt(e))return Zi(e);if(Xt(e))return Lo(e);if(tr(e))return ea(e);if(or(e))return ta(e)}throw Zt(e)}function Ji(e){return new F(function(t){var r=e[ht]();if(k(r.subscribe))return r.subscribe(t);throw new TypeError("Provided object does not correctly implement Symbol.observable")})}function Xi(e){return new F(function(t){for(var r=0;r=2;return function(o){return o.pipe(e?b(function(n,i){return e(n,i,o)}):le,we(1),r?Be(t):zo(function(){return new ir}))}}function Fr(e){return e<=0?function(){return M}:x(function(t,r){var o=[];t.subscribe(T(r,function(n){o.push(n),e=2,!0))}function pe(e){e===void 0&&(e={});var t=e.connector,r=t===void 0?function(){return new g}:t,o=e.resetOnError,n=o===void 0?!0:o,i=e.resetOnComplete,a=i===void 0?!0:i,s=e.resetOnRefCountZero,p=s===void 0?!0:s;return function(c){var l,f,u,h=0,w=!1,A=!1,te=function(){f==null||f.unsubscribe(),f=void 0},ie=function(){te(),l=u=void 0,w=A=!1},J=function(){var H=l;ie(),H==null||H.unsubscribe()};return x(function(H,mt){h++,!A&&!w&&te();var ze=u=u!=null?u:r();mt.add(function(){h--,h===0&&!A&&!w&&(f=Wr(J,p))}),ze.subscribe(mt),!l&&h>0&&(l=new at({next:function(Ie){return 
ze.next(Ie)},error:function(Ie){A=!0,te(),f=Wr(ie,n,Ie),ze.error(Ie)},complete:function(){w=!0,te(),f=Wr(ie,a),ze.complete()}}),W(H).subscribe(l))})(c)}}function Wr(e,t){for(var r=[],o=2;oe.next(document)),e}function $(e,t=document){return Array.from(t.querySelectorAll(e))}function P(e,t=document){let r=fe(e,t);if(typeof r=="undefined")throw new ReferenceError(`Missing element: expected "${e}" to be present`);return r}function fe(e,t=document){return t.querySelector(e)||void 0}function Re(){var e,t,r,o;return(o=(r=(t=(e=document.activeElement)==null?void 0:e.shadowRoot)==null?void 0:t.activeElement)!=null?r:document.activeElement)!=null?o:void 0}var xa=S(d(document.body,"focusin"),d(document.body,"focusout")).pipe(_e(1),Q(void 0),m(()=>Re()||document.body),B(1));function et(e){return xa.pipe(m(t=>e.contains(t)),K())}function kt(e,t){return C(()=>S(d(e,"mouseenter").pipe(m(()=>!0)),d(e,"mouseleave").pipe(m(()=>!1))).pipe(t?Ht(r=>Me(+!r*t)):le,Q(e.matches(":hover"))))}function Bo(e,t){if(typeof t=="string"||typeof t=="number")e.innerHTML+=t.toString();else if(t instanceof Node)e.appendChild(t);else if(Array.isArray(t))for(let r of t)Bo(e,r)}function E(e,t,...r){let o=document.createElement(e);if(t)for(let n of Object.keys(t))typeof t[n]!="undefined"&&(typeof t[n]!="boolean"?o.setAttribute(n,t[n]):o.setAttribute(n,""));for(let n of r)Bo(o,n);return o}function sr(e){if(e>999){let t=+((e-950)%1e3>99);return`${((e+1e-6)/1e3).toFixed(t)}k`}else return e.toString()}function wt(e){let t=E("script",{src:e});return C(()=>(document.head.appendChild(t),S(d(t,"load"),d(t,"error").pipe(v(()=>$r(()=>new ReferenceError(`Invalid script: ${e}`))))).pipe(m(()=>{}),L(()=>document.head.removeChild(t)),we(1))))}var Go=new g,ya=C(()=>typeof ResizeObserver=="undefined"?wt("https://unpkg.com/resize-observer-polyfill"):I(void 0)).pipe(m(()=>new ResizeObserver(e=>e.forEach(t=>Go.next(t)))),v(e=>S(Ke,I(e)).pipe(L(()=>e.disconnect()))),B(1));function 
ce(e){return{width:e.offsetWidth,height:e.offsetHeight}}function ge(e){let t=e;for(;t.clientWidth===0&&t.parentElement;)t=t.parentElement;return ya.pipe(y(r=>r.observe(t)),v(r=>Go.pipe(b(o=>o.target===t),L(()=>r.unobserve(t)))),m(()=>ce(e)),Q(ce(e)))}function Tt(e){return{width:e.scrollWidth,height:e.scrollHeight}}function cr(e){let t=e.parentElement;for(;t&&(e.scrollWidth<=t.scrollWidth&&e.scrollHeight<=t.scrollHeight);)t=(e=t).parentElement;return t?e:void 0}function Jo(e){let t=[],r=e.parentElement;for(;r;)(e.clientWidth>r.clientWidth||e.clientHeight>r.clientHeight)&&t.push(r),r=(e=r).parentElement;return t.length===0&&t.push(document.documentElement),t}function Ue(e){return{x:e.offsetLeft,y:e.offsetTop}}function Xo(e){let t=e.getBoundingClientRect();return{x:t.x+window.scrollX,y:t.y+window.scrollY}}function Zo(e){return S(d(window,"load"),d(window,"resize")).pipe(Le(0,me),m(()=>Ue(e)),Q(Ue(e)))}function pr(e){return{x:e.scrollLeft,y:e.scrollTop}}function De(e){return S(d(e,"scroll"),d(window,"scroll"),d(window,"resize")).pipe(Le(0,me),m(()=>pr(e)),Q(pr(e)))}var en=new g,Ea=C(()=>I(new IntersectionObserver(e=>{for(let t of e)en.next(t)},{threshold:0}))).pipe(v(e=>S(Ke,I(e)).pipe(L(()=>e.disconnect()))),B(1));function tt(e){return Ea.pipe(y(t=>t.observe(e)),v(t=>en.pipe(b(({target:r})=>r===e),L(()=>t.unobserve(e)),m(({isIntersecting:r})=>r))))}function tn(e,t=16){return De(e).pipe(m(({y:r})=>{let o=ce(e),n=Tt(e);return r>=n.height-o.height-t}),K())}var lr={drawer:P("[data-md-toggle=drawer]"),search:P("[data-md-toggle=search]")};function rn(e){return lr[e].checked}function Je(e,t){lr[e].checked!==t&&lr[e].click()}function Ve(e){let t=lr[e];return d(t,"change").pipe(m(()=>t.checked),Q(t.checked))}function wa(e,t){switch(e.constructor){case HTMLInputElement:return e.type==="radio"?/^Arrow/.test(t):!0;case HTMLSelectElement:case HTMLTextAreaElement:return!0;default:return e.isContentEditable}}function Ta(){return 
S(d(window,"compositionstart").pipe(m(()=>!0)),d(window,"compositionend").pipe(m(()=>!1))).pipe(Q(!1))}function on(){let e=d(window,"keydown").pipe(b(t=>!(t.metaKey||t.ctrlKey)),m(t=>({mode:rn("search")?"search":"global",type:t.key,claim(){t.preventDefault(),t.stopPropagation()}})),b(({mode:t,type:r})=>{if(t==="global"){let o=Re();if(typeof o!="undefined")return!wa(o,r)}return!0}),pe());return Ta().pipe(v(t=>t?M:e))}function xe(){return new URL(location.href)}function pt(e,t=!1){if(G("navigation.instant")&&!t){let r=E("a",{href:e.href});document.body.appendChild(r),r.click(),r.remove()}else location.href=e.href}function nn(){return new g}function an(){return location.hash.slice(1)}function sn(e){let t=E("a",{href:e});t.addEventListener("click",r=>r.stopPropagation()),t.click()}function Sa(e){return S(d(window,"hashchange"),e).pipe(m(an),Q(an()),b(t=>t.length>0),B(1))}function cn(e){return Sa(e).pipe(m(t=>fe(`[id="${t}"]`)),b(t=>typeof t!="undefined"))}function $t(e){let t=matchMedia(e);return ar(r=>t.addListener(()=>r(t.matches))).pipe(Q(t.matches))}function pn(){let e=matchMedia("print");return S(d(window,"beforeprint").pipe(m(()=>!0)),d(window,"afterprint").pipe(m(()=>!1))).pipe(Q(e.matches))}function Nr(e,t){return e.pipe(v(r=>r?t():M))}function zr(e,t){return new F(r=>{let o=new XMLHttpRequest;return o.open("GET",`${e}`),o.responseType="blob",o.addEventListener("load",()=>{o.status>=200&&o.status<300?(r.next(o.response),r.complete()):r.error(new Error(o.statusText))}),o.addEventListener("error",()=>{r.error(new Error("Network error"))}),o.addEventListener("abort",()=>{r.complete()}),typeof(t==null?void 0:t.progress$)!="undefined"&&(o.addEventListener("progress",n=>{var i;if(n.lengthComputable)t.progress$.next(n.loaded/n.total*100);else{let a=(i=o.getResponseHeader("Content-Length"))!=null?i:0;t.progress$.next(n.loaded/+a*100)}}),t.progress$.next(5)),o.send(),()=>o.abort()})}function Ne(e,t){return zr(e,t).pipe(v(r=>r.text()),m(r=>JSON.parse(r)),B(1))}function 
ln(e,t){let r=new DOMParser;return zr(e,t).pipe(v(o=>o.text()),m(o=>r.parseFromString(o,"text/html")),B(1))}function mn(e,t){let r=new DOMParser;return zr(e,t).pipe(v(o=>o.text()),m(o=>r.parseFromString(o,"text/xml")),B(1))}function fn(){return{x:Math.max(0,scrollX),y:Math.max(0,scrollY)}}function un(){return S(d(window,"scroll",{passive:!0}),d(window,"resize",{passive:!0})).pipe(m(fn),Q(fn()))}function dn(){return{width:innerWidth,height:innerHeight}}function hn(){return d(window,"resize",{passive:!0}).pipe(m(dn),Q(dn()))}function bn(){return z([un(),hn()]).pipe(m(([e,t])=>({offset:e,size:t})),B(1))}function mr(e,{viewport$:t,header$:r}){let o=t.pipe(Z("size")),n=z([o,r]).pipe(m(()=>Ue(e)));return z([r,t,n]).pipe(m(([{height:i},{offset:a,size:s},{x:p,y:c}])=>({offset:{x:a.x-p,y:a.y-c+i},size:s})))}function Oa(e){return d(e,"message",t=>t.data)}function Ma(e){let t=new g;return t.subscribe(r=>e.postMessage(r)),t}function vn(e,t=new Worker(e)){let r=Oa(t),o=Ma(t),n=new g;n.subscribe(o);let i=o.pipe(X(),ne(!0));return n.pipe(X(),Pe(r.pipe(U(i))),pe())}var La=P("#__config"),St=JSON.parse(La.textContent);St.base=`${new URL(St.base,xe())}`;function Te(){return St}function G(e){return St.features.includes(e)}function ye(e,t){return typeof t!="undefined"?St.translations[e].replace("#",t.toString()):St.translations[e]}function Se(e,t=document){return P(`[data-md-component=${e}]`,t)}function ae(e,t=document){return $(`[data-md-component=${e}]`,t)}function _a(e){let t=P(".md-typeset > :first-child",e);return d(t,"click",{once:!0}).pipe(m(()=>P(".md-typeset",e)),m(r=>({hash:__md_hash(r.innerHTML)})))}function gn(e){if(!G("announce.dismiss")||!e.childElementCount)return M;if(!e.hidden){let t=P(".md-typeset",e);__md_hash(t.innerHTML)===__md_get("__announce")&&(e.hidden=!0)}return C(()=>{let t=new g;return t.subscribe(({hash:r})=>{e.hidden=!0,__md_set("__announce",r)}),_a(e).pipe(y(r=>t.next(r)),L(()=>t.complete()),m(r=>R({ref:e},r)))})}function Aa(e,{target$:t}){return 
t.pipe(m(r=>({hidden:r!==e})))}function xn(e,t){let r=new g;return r.subscribe(({hidden:o})=>{e.hidden=o}),Aa(e,t).pipe(y(o=>r.next(o)),L(()=>r.complete()),m(o=>R({ref:e},o)))}function Pt(e,t){return t==="inline"?E("div",{class:"md-tooltip md-tooltip--inline",id:e,role:"tooltip"},E("div",{class:"md-tooltip__inner md-typeset"})):E("div",{class:"md-tooltip",id:e,role:"tooltip"},E("div",{class:"md-tooltip__inner md-typeset"}))}function yn(...e){return E("div",{class:"md-tooltip2",role:"tooltip"},E("div",{class:"md-tooltip2__inner md-typeset"},e))}function En(e,t){if(t=t?`${t}_annotation_${e}`:void 0,t){let r=t?`#${t}`:void 0;return E("aside",{class:"md-annotation",tabIndex:0},Pt(t),E("a",{href:r,class:"md-annotation__index",tabIndex:-1},E("span",{"data-md-annotation-id":e})))}else return E("aside",{class:"md-annotation",tabIndex:0},Pt(t),E("span",{class:"md-annotation__index",tabIndex:-1},E("span",{"data-md-annotation-id":e})))}function wn(e){return E("button",{class:"md-clipboard md-icon",title:ye("clipboard.copy"),"data-clipboard-target":`#${e} > code`})}function qr(e,t){let r=t&2,o=t&1,n=Object.keys(e.terms).filter(p=>!e.terms[p]).reduce((p,c)=>[...p,E("del",null,c)," "],[]).slice(0,-1),i=Te(),a=new URL(e.location,i.base);G("search.highlight")&&a.searchParams.set("h",Object.entries(e.terms).filter(([,p])=>p).reduce((p,[c])=>`${p} ${c}`.trim(),""));let{tags:s}=Te();return E("a",{href:`${a}`,class:"md-search-result__link",tabIndex:-1},E("article",{class:"md-search-result__article md-typeset","data-md-score":e.score.toFixed(2)},r>0&&E("div",{class:"md-search-result__icon md-icon"}),r>0&&E("h1",null,e.title),r<=0&&E("h2",null,e.title),o>0&&e.text.length>0&&e.text,e.tags&&e.tags.map(p=>{let c=s?p in s?`md-tag-icon md-tag--${s[p]}`:"md-tag-icon":"";return E("span",{class:`md-tag ${c}`},p)}),o>0&&n.length>0&&E("p",{class:"md-search-result__terms"},ye("search.result.term.missing"),": ",...n)))}function Tn(e){let t=e[0].score,r=[...e],o=Te(),n=r.findIndex(l=>!`${new 
URL(l.location,o.base)}`.includes("#")),[i]=r.splice(n,1),a=r.findIndex(l=>l.scoreqr(l,1)),...p.length?[E("details",{class:"md-search-result__more"},E("summary",{tabIndex:-1},E("div",null,p.length>0&&p.length===1?ye("search.result.more.one"):ye("search.result.more.other",p.length))),...p.map(l=>qr(l,1)))]:[]];return E("li",{class:"md-search-result__item"},c)}function Sn(e){return E("ul",{class:"md-source__facts"},Object.entries(e).map(([t,r])=>E("li",{class:`md-source__fact md-source__fact--${t}`},typeof r=="number"?sr(r):r)))}function Qr(e){let t=`tabbed-control tabbed-control--${e}`;return E("div",{class:t,hidden:!0},E("button",{class:"tabbed-button",tabIndex:-1,"aria-hidden":"true"}))}function On(e){return E("div",{class:"md-typeset__scrollwrap"},E("div",{class:"md-typeset__table"},e))}function Ca(e){let t=Te(),r=new URL(`../${e.version}/`,t.base);return E("li",{class:"md-version__item"},E("a",{href:`${r}`,class:"md-version__link"},e.title))}function Mn(e,t){return e=e.filter(r=>{var o;return!((o=r.properties)!=null&&o.hidden)}),E("div",{class:"md-version"},E("button",{class:"md-version__current","aria-label":ye("select.version")},t.title),E("ul",{class:"md-version__list"},e.map(Ca)))}var Ha=0;function ka(e){let t=z([et(e),kt(e)]).pipe(m(([o,n])=>o||n),K()),r=C(()=>Jo(e)).pipe(oe(De),ct(1),m(()=>Xo(e)));return t.pipe(Ae(o=>o),v(()=>z([t,r])),m(([o,n])=>({active:o,offset:n})),pe())}function $a(e,t){let{content$:r,viewport$:o}=t,n=`__tooltip2_${Ha++}`;return C(()=>{let i=new g,a=new _r(!1);i.pipe(X(),ne(!1)).subscribe(a);let s=a.pipe(Ht(c=>Me(+!c*250,Hr)),K(),v(c=>c?r:M),y(c=>c.id=n),pe());z([i.pipe(m(({active:c})=>c)),s.pipe(v(c=>kt(c,250)),Q(!1))]).pipe(m(c=>c.some(l=>l))).subscribe(a);let p=a.pipe(b(c=>c),ee(s,o),m(([c,l,{size:f}])=>{let u=e.getBoundingClientRect(),h=u.width/2;if(l.role==="tooltip")return{x:h,y:8+u.height};if(u.y>=f.height/2){let{height:w}=ce(l);return{x:h,y:-16-w}}else return{x:h,y:16+u.height}}));return 
z([s,i,p]).subscribe(([c,{offset:l},f])=>{c.style.setProperty("--md-tooltip-host-x",`${l.x}px`),c.style.setProperty("--md-tooltip-host-y",`${l.y}px`),c.style.setProperty("--md-tooltip-x",`${f.x}px`),c.style.setProperty("--md-tooltip-y",`${f.y}px`),c.classList.toggle("md-tooltip2--top",f.y<0),c.classList.toggle("md-tooltip2--bottom",f.y>=0)}),a.pipe(b(c=>c),ee(s,(c,l)=>l),b(c=>c.role==="tooltip")).subscribe(c=>{let l=ce(P(":scope > *",c));c.style.setProperty("--md-tooltip-width",`${l.width}px`),c.style.setProperty("--md-tooltip-tail","0px")}),a.pipe(K(),be(me),ee(s)).subscribe(([c,l])=>{l.classList.toggle("md-tooltip2--active",c)}),z([a.pipe(b(c=>c)),s]).subscribe(([c,l])=>{l.role==="dialog"?(e.setAttribute("aria-controls",n),e.setAttribute("aria-haspopup","dialog")):e.setAttribute("aria-describedby",n)}),a.pipe(b(c=>!c)).subscribe(()=>{e.removeAttribute("aria-controls"),e.removeAttribute("aria-describedby"),e.removeAttribute("aria-haspopup")}),ka(e).pipe(y(c=>i.next(c)),L(()=>i.complete()),m(c=>R({ref:e},c)))})}function lt(e,{viewport$:t},r=document.body){return $a(e,{content$:new F(o=>{let n=e.title,i=yn(n);return o.next(i),e.removeAttribute("title"),r.append(i),()=>{i.remove(),e.setAttribute("title",n)}}),viewport$:t})}function Pa(e,t){let r=C(()=>z([Zo(e),De(t)])).pipe(m(([{x:o,y:n},i])=>{let{width:a,height:s}=ce(e);return{x:o-i.x+a/2,y:n-i.y+s/2}}));return et(e).pipe(v(o=>r.pipe(m(n=>({active:o,offset:n})),we(+!o||1/0))))}function Ln(e,t,{target$:r}){let[o,n]=Array.from(e.children);return C(()=>{let i=new g,a=i.pipe(X(),ne(!0));return 
i.subscribe({next({offset:s}){e.style.setProperty("--md-tooltip-x",`${s.x}px`),e.style.setProperty("--md-tooltip-y",`${s.y}px`)},complete(){e.style.removeProperty("--md-tooltip-x"),e.style.removeProperty("--md-tooltip-y")}}),tt(e).pipe(U(a)).subscribe(s=>{e.toggleAttribute("data-md-visible",s)}),S(i.pipe(b(({active:s})=>s)),i.pipe(_e(250),b(({active:s})=>!s))).subscribe({next({active:s}){s?e.prepend(o):o.remove()},complete(){e.prepend(o)}}),i.pipe(Le(16,me)).subscribe(({active:s})=>{o.classList.toggle("md-tooltip--active",s)}),i.pipe(ct(125,me),b(()=>!!e.offsetParent),m(()=>e.offsetParent.getBoundingClientRect()),m(({x:s})=>s)).subscribe({next(s){s?e.style.setProperty("--md-tooltip-0",`${-s}px`):e.style.removeProperty("--md-tooltip-0")},complete(){e.style.removeProperty("--md-tooltip-0")}}),d(n,"click").pipe(U(a),b(s=>!(s.metaKey||s.ctrlKey))).subscribe(s=>{s.stopPropagation(),s.preventDefault()}),d(n,"mousedown").pipe(U(a),ee(i)).subscribe(([s,{active:p}])=>{var c;if(s.button!==0||s.metaKey||s.ctrlKey)s.preventDefault();else if(p){s.preventDefault();let l=e.parentElement.closest(".md-annotation");l instanceof HTMLElement?l.focus():(c=Re())==null||c.blur()}}),r.pipe(U(a),b(s=>s===o),Ge(125)).subscribe(()=>e.focus()),Pa(e,t).pipe(y(s=>i.next(s)),L(()=>i.complete()),m(s=>R({ref:e},s)))})}function Ra(e){return e.tagName==="CODE"?$(".c, .c1, .cm",e):[e]}function Ia(e){let t=[];for(let r of Ra(e)){let o=[],n=document.createNodeIterator(r,NodeFilter.SHOW_TEXT);for(let i=n.nextNode();i;i=n.nextNode())o.push(i);for(let i of o){let a;for(;a=/(\(\d+\))(!)?/.exec(i.textContent);){let[,s,p]=a;if(typeof p=="undefined"){let c=i.splitText(a.index);i=c.splitText(s.length),t.push(c)}else{i.textContent=s,t.push(i);break}}}}return t}function _n(e,t){t.append(...Array.from(e.childNodes))}function fr(e,t,{target$:r,print$:o}){let n=t.closest("[id]"),i=n==null?void 0:n.id,a=new Map;for(let s of Ia(t)){let[,p]=s.textContent.match(/\((\d+)\)/);fe(`:scope > 
li:nth-child(${p})`,e)&&(a.set(p,En(p,i)),s.replaceWith(a.get(p)))}return a.size===0?M:C(()=>{let s=new g,p=s.pipe(X(),ne(!0)),c=[];for(let[l,f]of a)c.push([P(".md-typeset",f),P(`:scope > li:nth-child(${l})`,e)]);return o.pipe(U(p)).subscribe(l=>{e.hidden=!l,e.classList.toggle("md-annotation-list",l);for(let[f,u]of c)l?_n(f,u):_n(u,f)}),S(...[...a].map(([,l])=>Ln(l,t,{target$:r}))).pipe(L(()=>s.complete()),pe())})}function An(e){if(e.nextElementSibling){let t=e.nextElementSibling;if(t.tagName==="OL")return t;if(t.tagName==="P"&&!t.children.length)return An(t)}}function Cn(e,t){return C(()=>{let r=An(e);return typeof r!="undefined"?fr(r,e,t):M})}var Hn=Vt(Yr());var Fa=0;function kn(e){if(e.nextElementSibling){let t=e.nextElementSibling;if(t.tagName==="OL")return t;if(t.tagName==="P"&&!t.children.length)return kn(t)}}function ja(e){return ge(e).pipe(m(({width:t})=>({scrollable:Tt(e).width>t})),Z("scrollable"))}function $n(e,t){let{matches:r}=matchMedia("(hover)"),o=C(()=>{let n=new g,i=n.pipe(Fr(1));n.subscribe(({scrollable:c})=>{c&&r?e.setAttribute("tabindex","0"):e.removeAttribute("tabindex")});let a=[];if(Hn.default.isSupported()&&(e.closest(".copy")||G("content.code.copy")&&!e.closest(".no-copy"))){let c=e.closest("pre");c.id=`__code_${Fa++}`;let l=wn(c.id);c.insertBefore(l,e),G("content.tooltips")&&a.push(lt(l,{viewport$}))}let s=e.closest(".highlight");if(s instanceof HTMLElement){let c=kn(s);if(typeof c!="undefined"&&(s.classList.contains("annotate")||G("content.code.annotate"))){let l=fr(c,e,t);a.push(ge(s).pipe(U(i),m(({width:f,height:u})=>f&&u),K(),v(f=>f?l:M)))}}return $(":scope > span[id]",e).length&&e.classList.add("md-code__content"),ja(e).pipe(y(c=>n.next(c)),L(()=>n.complete()),m(c=>R({ref:e},c)),Pe(...a))});return G("content.lazy")?tt(e).pipe(b(n=>n),we(1),v(()=>o)):o}function Wa(e,{target$:t,print$:r}){let o=!0;return 
S(t.pipe(m(n=>n.closest("details:not([open])")),b(n=>e===n),m(()=>({action:"open",reveal:!0}))),r.pipe(b(n=>n||!o),y(()=>o=e.open),m(n=>({action:n?"open":"close"}))))}function Pn(e,t){return C(()=>{let r=new g;return r.subscribe(({action:o,reveal:n})=>{e.toggleAttribute("open",o==="open"),n&&e.scrollIntoView()}),Wa(e,t).pipe(y(o=>r.next(o)),L(()=>r.complete()),m(o=>R({ref:e},o)))})}var Rn=".node circle,.node ellipse,.node path,.node polygon,.node rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}marker{fill:var(--md-mermaid-edge-color)!important}.edgeLabel .label rect{fill:#0000}.label{color:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.label foreignObject{line-height:normal;overflow:visible}.label div .edgeLabel{color:var(--md-mermaid-label-fg-color)}.edgeLabel,.edgeLabel rect,.label div .edgeLabel{background-color:var(--md-mermaid-label-bg-color)}.edgeLabel,.edgeLabel rect{fill:var(--md-mermaid-label-bg-color);color:var(--md-mermaid-edge-color)}.edgePath .path,.flowchart-link{stroke:var(--md-mermaid-edge-color);stroke-width:.05rem}.edgePath .arrowheadPath{fill:var(--md-mermaid-edge-color);stroke:none}.cluster rect{fill:var(--md-default-fg-color--lightest);stroke:var(--md-default-fg-color--lighter)}.cluster span{color:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}g #flowchart-circleEnd,g #flowchart-circleStart,g #flowchart-crossEnd,g #flowchart-crossStart,g #flowchart-pointEnd,g #flowchart-pointStart{stroke:none}g.classGroup line,g.classGroup rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}g.classGroup text{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.classLabel .box{fill:var(--md-mermaid-label-bg-color);background-color:var(--md-mermaid-label-bg-color);opacity:1}.classLabel .label{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.node 
.divider{stroke:var(--md-mermaid-node-fg-color)}.relation{stroke:var(--md-mermaid-edge-color)}.cardinality{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.cardinality text{fill:inherit!important}defs #classDiagram-compositionEnd,defs #classDiagram-compositionStart,defs #classDiagram-dependencyEnd,defs #classDiagram-dependencyStart,defs #classDiagram-extensionEnd,defs #classDiagram-extensionStart{fill:var(--md-mermaid-edge-color)!important;stroke:var(--md-mermaid-edge-color)!important}defs #classDiagram-aggregationEnd,defs #classDiagram-aggregationStart{fill:var(--md-mermaid-label-bg-color)!important;stroke:var(--md-mermaid-edge-color)!important}g.stateGroup rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}g.stateGroup .state-title{fill:var(--md-mermaid-label-fg-color)!important;font-family:var(--md-mermaid-font-family)}g.stateGroup .composit{fill:var(--md-mermaid-label-bg-color)}.nodeLabel,.nodeLabel p{color:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.node circle.state-end,.node circle.state-start,.start-state{fill:var(--md-mermaid-edge-color);stroke:none}.end-state-inner,.end-state-outer{fill:var(--md-mermaid-edge-color)}.end-state-inner,.node circle.state-end{stroke:var(--md-mermaid-label-bg-color)}.transition{stroke:var(--md-mermaid-edge-color)}[id^=state-fork] rect,[id^=state-join] rect{fill:var(--md-mermaid-edge-color)!important;stroke:none!important}.statediagram-cluster.statediagram-cluster .inner{fill:var(--md-default-bg-color)}.statediagram-cluster rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}.statediagram-state rect.divider{fill:var(--md-default-fg-color--lightest);stroke:var(--md-default-fg-color--lighter)}defs 
#statediagram-barbEnd{stroke:var(--md-mermaid-edge-color)}.attributeBoxEven,.attributeBoxOdd{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}.entityBox{fill:var(--md-mermaid-label-bg-color);stroke:var(--md-mermaid-node-fg-color)}.entityLabel{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.relationshipLabelBox{fill:var(--md-mermaid-label-bg-color);fill-opacity:1;background-color:var(--md-mermaid-label-bg-color);opacity:1}.relationshipLabel{fill:var(--md-mermaid-label-fg-color)}.relationshipLine{stroke:var(--md-mermaid-edge-color)}defs #ONE_OR_MORE_END *,defs #ONE_OR_MORE_START *,defs #ONLY_ONE_END *,defs #ONLY_ONE_START *,defs #ZERO_OR_MORE_END *,defs #ZERO_OR_MORE_START *,defs #ZERO_OR_ONE_END *,defs #ZERO_OR_ONE_START *{stroke:var(--md-mermaid-edge-color)!important}defs #ZERO_OR_MORE_END circle,defs #ZERO_OR_MORE_START circle{fill:var(--md-mermaid-label-bg-color)}.actor{fill:var(--md-mermaid-sequence-actor-bg-color);stroke:var(--md-mermaid-sequence-actor-border-color)}text.actor>tspan{fill:var(--md-mermaid-sequence-actor-fg-color);font-family:var(--md-mermaid-font-family)}line{stroke:var(--md-mermaid-sequence-actor-line-color)}.actor-man circle,.actor-man line{fill:var(--md-mermaid-sequence-actorman-bg-color);stroke:var(--md-mermaid-sequence-actorman-line-color)}.messageLine0,.messageLine1{stroke:var(--md-mermaid-sequence-message-line-color)}.note{fill:var(--md-mermaid-sequence-note-bg-color);stroke:var(--md-mermaid-sequence-note-border-color)}.loopText,.loopText>tspan,.messageText,.noteText>tspan{stroke:none;font-family:var(--md-mermaid-font-family)!important}.messageText{fill:var(--md-mermaid-sequence-message-fg-color)}.loopText,.loopText>tspan{fill:var(--md-mermaid-sequence-loop-fg-color)}.noteText>tspan{fill:var(--md-mermaid-sequence-note-fg-color)}#arrowhead 
path{fill:var(--md-mermaid-sequence-message-line-color);stroke:none}.loopLine{fill:var(--md-mermaid-sequence-loop-bg-color);stroke:var(--md-mermaid-sequence-loop-border-color)}.labelBox{fill:var(--md-mermaid-sequence-label-bg-color);stroke:none}.labelText,.labelText>span{fill:var(--md-mermaid-sequence-label-fg-color);font-family:var(--md-mermaid-font-family)}.sequenceNumber{fill:var(--md-mermaid-sequence-number-fg-color)}rect.rect{fill:var(--md-mermaid-sequence-box-bg-color);stroke:none}rect.rect+text.text{fill:var(--md-mermaid-sequence-box-fg-color)}defs #sequencenumber{fill:var(--md-mermaid-sequence-number-bg-color)!important}";var Br,Da=0;function Va(){return typeof mermaid=="undefined"||mermaid instanceof Element?wt("https://unpkg.com/mermaid@10.7.0/dist/mermaid.min.js"):I(void 0)}function In(e){return e.classList.remove("mermaid"),Br||(Br=Va().pipe(y(()=>mermaid.initialize({startOnLoad:!1,themeCSS:Rn,sequence:{actorFontSize:"16px",messageFontSize:"16px",noteFontSize:"16px"}})),m(()=>{}),B(1))),Br.subscribe(()=>ao(this,null,function*(){e.classList.add("mermaid");let t=`__mermaid_${Da++}`,r=E("div",{class:"mermaid"}),o=e.textContent,{svg:n,fn:i}=yield mermaid.render(t,o),a=r.attachShadow({mode:"closed"});a.innerHTML=n,e.replaceWith(r),i==null||i(a)})),Br.pipe(m(()=>({ref:e})))}var Fn=E("table");function jn(e){return e.replaceWith(Fn),Fn.replaceWith(On(e)),I({ref:e})}function Na(e){let t=e.find(r=>r.checked)||e[0];return S(...e.map(r=>d(r,"change").pipe(m(()=>P(`label[for="${r.id}"]`))))).pipe(Q(P(`label[for="${t.id}"]`)),m(r=>({active:r})))}function Wn(e,{viewport$:t,target$:r}){let o=P(".tabbed-labels",e),n=$(":scope > input",e),i=Qr("prev");e.append(i);let a=Qr("next");return e.append(a),C(()=>{let s=new g,p=s.pipe(X(),ne(!0));z([s,ge(e)]).pipe(U(p),Le(1,me)).subscribe({next([{active:c},l]){let f=Ue(c),{width:u}=ce(c);e.style.setProperty("--md-indicator-x",`${f.x}px`),e.style.setProperty("--md-indicator-width",`${u}px`);let 
h=pr(o);(f.xh.x+l.width)&&o.scrollTo({left:Math.max(0,f.x-16),behavior:"smooth"})},complete(){e.style.removeProperty("--md-indicator-x"),e.style.removeProperty("--md-indicator-width")}}),z([De(o),ge(o)]).pipe(U(p)).subscribe(([c,l])=>{let f=Tt(o);i.hidden=c.x<16,a.hidden=c.x>f.width-l.width-16}),S(d(i,"click").pipe(m(()=>-1)),d(a,"click").pipe(m(()=>1))).pipe(U(p)).subscribe(c=>{let{width:l}=ce(o);o.scrollBy({left:l*c,behavior:"smooth"})}),r.pipe(U(p),b(c=>n.includes(c))).subscribe(c=>c.click()),o.classList.add("tabbed-labels--linked");for(let c of n){let l=P(`label[for="${c.id}"]`);l.replaceChildren(E("a",{href:`#${l.htmlFor}`,tabIndex:-1},...Array.from(l.childNodes))),d(l.firstElementChild,"click").pipe(U(p),b(f=>!(f.metaKey||f.ctrlKey)),y(f=>{f.preventDefault(),f.stopPropagation()})).subscribe(()=>{history.replaceState({},"",`#${l.htmlFor}`),l.click()})}return G("content.tabs.link")&&s.pipe(Ce(1),ee(t)).subscribe(([{active:c},{offset:l}])=>{let f=c.innerText.trim();if(c.hasAttribute("data-md-switching"))c.removeAttribute("data-md-switching");else{let u=e.offsetTop-l.y;for(let w of $("[data-tabs]"))for(let A of $(":scope > input",w)){let te=P(`label[for="${A.id}"]`);if(te!==c&&te.innerText.trim()===f){te.setAttribute("data-md-switching",""),A.click();break}}window.scrollTo({top:e.offsetTop-u});let h=__md_get("__tabs")||[];__md_set("__tabs",[...new Set([f,...h])])}}),s.pipe(U(p)).subscribe(()=>{for(let c of $("audio, video",e))c.pause()}),tt(e).pipe(v(()=>Na(n)),y(c=>s.next(c)),L(()=>s.complete()),m(c=>R({ref:e},c)))}).pipe(Qe(se))}function Un(e,{viewport$:t,target$:r,print$:o}){return S(...$(".annotate:not(.highlight)",e).map(n=>Cn(n,{target$:r,print$:o})),...$("pre:not(.mermaid) > 
code",e).map(n=>$n(n,{target$:r,print$:o})),...$("pre.mermaid",e).map(n=>In(n)),...$("table:not([class])",e).map(n=>jn(n)),...$("details",e).map(n=>Pn(n,{target$:r,print$:o})),...$("[data-tabs]",e).map(n=>Wn(n,{viewport$:t,target$:r})),...$("[title]",e).filter(()=>G("content.tooltips")).map(n=>lt(n,{viewport$:t})))}function za(e,{alert$:t}){return t.pipe(v(r=>S(I(!0),I(!1).pipe(Ge(2e3))).pipe(m(o=>({message:r,active:o})))))}function Dn(e,t){let r=P(".md-typeset",e);return C(()=>{let o=new g;return o.subscribe(({message:n,active:i})=>{e.classList.toggle("md-dialog--active",i),r.textContent=n}),za(e,t).pipe(y(n=>o.next(n)),L(()=>o.complete()),m(n=>R({ref:e},n)))})}var qa=0;function Qa(e,t){document.body.append(e);let{width:r}=ce(e);e.style.setProperty("--md-tooltip-width",`${r}px`),e.remove();let o=cr(t),n=typeof o!="undefined"?De(o):I({x:0,y:0}),i=S(et(t),kt(t)).pipe(K());return z([i,n]).pipe(m(([a,s])=>{let{x:p,y:c}=Ue(t),l=ce(t),f=t.closest("table");return f&&t.parentElement&&(p+=f.offsetLeft+t.parentElement.offsetLeft,c+=f.offsetTop+t.parentElement.offsetTop),{active:a,offset:{x:p-s.x+l.width/2-r/2,y:c-s.y+l.height+8}}}))}function Vn(e){let t=e.title;if(!t.length)return M;let r=`__tooltip_${qa++}`,o=Pt(r,"inline"),n=P(".md-typeset",o);return n.innerHTML=t,C(()=>{let i=new g;return 
i.subscribe({next({offset:a}){o.style.setProperty("--md-tooltip-x",`${a.x}px`),o.style.setProperty("--md-tooltip-y",`${a.y}px`)},complete(){o.style.removeProperty("--md-tooltip-x"),o.style.removeProperty("--md-tooltip-y")}}),S(i.pipe(b(({active:a})=>a)),i.pipe(_e(250),b(({active:a})=>!a))).subscribe({next({active:a}){a?(e.insertAdjacentElement("afterend",o),e.setAttribute("aria-describedby",r),e.removeAttribute("title")):(o.remove(),e.removeAttribute("aria-describedby"),e.setAttribute("title",t))},complete(){o.remove(),e.removeAttribute("aria-describedby"),e.setAttribute("title",t)}}),i.pipe(Le(16,me)).subscribe(({active:a})=>{o.classList.toggle("md-tooltip--active",a)}),i.pipe(ct(125,me),b(()=>!!e.offsetParent),m(()=>e.offsetParent.getBoundingClientRect()),m(({x:a})=>a)).subscribe({next(a){a?o.style.setProperty("--md-tooltip-0",`${-a}px`):o.style.removeProperty("--md-tooltip-0")},complete(){o.style.removeProperty("--md-tooltip-0")}}),Qa(o,e).pipe(y(a=>i.next(a)),L(()=>i.complete()),m(a=>R({ref:e},a)))}).pipe(Qe(se))}function Ka({viewport$:e}){if(!G("header.autohide"))return I(!1);let t=e.pipe(m(({offset:{y:n}})=>n),Ye(2,1),m(([n,i])=>[nMath.abs(i-n.y)>100),m(([,[n]])=>n),K()),o=Ve("search");return z([e,o]).pipe(m(([{offset:n},i])=>n.y>400&&!i),K(),v(n=>n?r:I(!1)),Q(!1))}function Nn(e,t){return C(()=>z([ge(e),Ka(t)])).pipe(m(([{height:r},o])=>({height:r,hidden:o})),K((r,o)=>r.height===o.height&&r.hidden===o.hidden),B(1))}function zn(e,{header$:t,main$:r}){return C(()=>{let o=new g,n=o.pipe(X(),ne(!0));o.pipe(Z("active"),We(t)).subscribe(([{active:a},{hidden:s}])=>{e.classList.toggle("md-header--shadow",a&&!s),e.hidden=s});let i=ue($("[title]",e)).pipe(b(()=>G("content.tooltips")),oe(a=>Vn(a)));return r.subscribe(o),t.pipe(U(n),m(a=>R({ref:e},a)),Pe(i.pipe(U(n))))})}function Ya(e,{viewport$:t,header$:r}){return mr(e,{viewport$:t,header$:r}).pipe(m(({offset:{y:o}})=>{let{height:n}=ce(e);return{active:o>=n}}),Z("active"))}function qn(e,t){return C(()=>{let r=new 
g;r.subscribe({next({active:n}){e.classList.toggle("md-header__title--active",n)},complete(){e.classList.remove("md-header__title--active")}});let o=fe(".md-content h1");return typeof o=="undefined"?M:Ya(o,t).pipe(y(n=>r.next(n)),L(()=>r.complete()),m(n=>R({ref:e},n)))})}function Qn(e,{viewport$:t,header$:r}){let o=r.pipe(m(({height:i})=>i),K()),n=o.pipe(v(()=>ge(e).pipe(m(({height:i})=>({top:e.offsetTop,bottom:e.offsetTop+i})),Z("bottom"))));return z([o,n,t]).pipe(m(([i,{top:a,bottom:s},{offset:{y:p},size:{height:c}}])=>(c=Math.max(0,c-Math.max(0,a-p,i)-Math.max(0,c+p-s)),{offset:a-i,height:c,active:a-i<=p})),K((i,a)=>i.offset===a.offset&&i.height===a.height&&i.active===a.active))}function Ba(e){let t=__md_get("__palette")||{index:e.findIndex(o=>matchMedia(o.getAttribute("data-md-color-media")).matches)},r=Math.max(0,Math.min(t.index,e.length-1));return I(...e).pipe(oe(o=>d(o,"change").pipe(m(()=>o))),Q(e[r]),m(o=>({index:e.indexOf(o),color:{media:o.getAttribute("data-md-color-media"),scheme:o.getAttribute("data-md-color-scheme"),primary:o.getAttribute("data-md-color-primary"),accent:o.getAttribute("data-md-color-accent")}})),B(1))}function Kn(e){let t=$("input",e),r=E("meta",{name:"theme-color"});document.head.appendChild(r);let o=E("meta",{name:"color-scheme"});document.head.appendChild(o);let n=$t("(prefers-color-scheme: light)");return C(()=>{let i=new g;return i.subscribe(a=>{if(document.body.setAttribute("data-md-color-switching",""),a.color.media==="(prefers-color-scheme)"){let s=matchMedia("(prefers-color-scheme: light)"),p=document.querySelector(s.matches?"[data-md-color-media='(prefers-color-scheme: light)']":"[data-md-color-media='(prefers-color-scheme: dark)']");a.color.scheme=p.getAttribute("data-md-color-scheme"),a.color.primary=p.getAttribute("data-md-color-primary"),a.color.accent=p.getAttribute("data-md-color-accent")}for(let[s,p]of Object.entries(a.color))document.body.setAttribute(`data-md-color-${s}`,p);for(let 
s=0;sa.key==="Enter"),ee(i,(a,s)=>s)).subscribe(({index:a})=>{a=(a+1)%t.length,t[a].click(),t[a].focus()}),i.pipe(m(()=>{let a=Se("header"),s=window.getComputedStyle(a);return o.content=s.colorScheme,s.backgroundColor.match(/\d+/g).map(p=>(+p).toString(16).padStart(2,"0")).join("")})).subscribe(a=>r.content=`#${a}`),i.pipe(be(se)).subscribe(()=>{document.body.removeAttribute("data-md-color-switching")}),Ba(t).pipe(U(n.pipe(Ce(1))),st(),y(a=>i.next(a)),L(()=>i.complete()),m(a=>R({ref:e},a)))})}function Yn(e,{progress$:t}){return C(()=>{let r=new g;return r.subscribe(({value:o})=>{e.style.setProperty("--md-progress-value",`${o}`)}),t.pipe(y(o=>r.next({value:o})),L(()=>r.complete()),m(o=>({ref:e,value:o})))})}var Gr=Vt(Yr());function Ga(e){e.setAttribute("data-md-copying","");let t=e.closest("[data-copy]"),r=t?t.getAttribute("data-copy"):e.innerText;return e.removeAttribute("data-md-copying"),r.trimEnd()}function Bn({alert$:e}){Gr.default.isSupported()&&new F(t=>{new Gr.default("[data-clipboard-target], [data-clipboard-text]",{text:r=>r.getAttribute("data-clipboard-text")||Ga(P(r.getAttribute("data-clipboard-target")))}).on("success",r=>t.next(r))}).pipe(y(t=>{t.trigger.focus()}),m(()=>ye("clipboard.copied"))).subscribe(e)}function Gn(e,t){return e.protocol=t.protocol,e.hostname=t.hostname,e}function Ja(e,t){let r=new Map;for(let o of $("url",e)){let n=P("loc",o),i=[Gn(new URL(n.textContent),t)];r.set(`${i[0]}`,i);for(let a of $("[rel=alternate]",o)){let s=a.getAttribute("href");s!=null&&i.push(Gn(new URL(s),t))}}return r}function ur(e){return mn(new URL("sitemap.xml",e)).pipe(m(t=>Ja(t,new URL(e))),ve(()=>I(new Map)))}function Xa(e,t){if(!(e.target instanceof Element))return M;let r=e.target.closest("a");if(r===null)return M;if(r.target||e.metaKey||e.ctrlKey)return M;let o=new URL(r.href);return o.search=o.hash="",t.has(`${o}`)?(e.preventDefault(),I(new URL(r.href))):M}function Jn(e){let t=new Map;for(let r of $(":scope > *",e.head))t.set(r.outerHTML,r);return 
t}function Xn(e){for(let t of $("[href], [src]",e))for(let r of["href","src"]){let o=t.getAttribute(r);if(o&&!/^(?:[a-z]+:)?\/\//i.test(o)){t[r]=t[r];break}}return I(e)}function Za(e){for(let o of["[data-md-component=announce]","[data-md-component=container]","[data-md-component=header-topic]","[data-md-component=outdated]","[data-md-component=logo]","[data-md-component=skip]",...G("navigation.tabs.sticky")?["[data-md-component=tabs]"]:[]]){let n=fe(o),i=fe(o,e);typeof n!="undefined"&&typeof i!="undefined"&&n.replaceWith(i)}let t=Jn(document);for(let[o,n]of Jn(e))t.has(o)?t.delete(o):document.head.appendChild(n);for(let o of t.values()){let n=o.getAttribute("name");n!=="theme-color"&&n!=="color-scheme"&&o.remove()}let r=Se("container");return je($("script",r)).pipe(v(o=>{let n=e.createElement("script");if(o.src){for(let i of o.getAttributeNames())n.setAttribute(i,o.getAttribute(i));return o.replaceWith(n),new F(i=>{n.onload=()=>i.complete()})}else return n.textContent=o.textContent,o.replaceWith(n),M}),X(),ne(document))}function Zn({location$:e,viewport$:t,progress$:r}){let o=Te();if(location.protocol==="file:")return M;let n=ur(o.base);I(document).subscribe(Xn);let i=d(document.body,"click").pipe(We(n),v(([p,c])=>Xa(p,c)),pe()),a=d(window,"popstate").pipe(m(xe),pe());i.pipe(ee(t)).subscribe(([p,{offset:c}])=>{history.replaceState(c,""),history.pushState(null,"",p)}),S(i,a).subscribe(e);let s=e.pipe(Z("pathname"),v(p=>ln(p,{progress$:r}).pipe(ve(()=>(pt(p,!0),M)))),v(Xn),v(Za),pe());return S(s.pipe(ee(e,(p,c)=>c)),e.pipe(Z("pathname"),v(()=>e),Z("hash")),e.pipe(K((p,c)=>p.pathname===c.pathname&&p.hash===c.hash),v(()=>i),y(()=>history.back()))).subscribe(p=>{var c,l;history.state!==null||!p.hash?window.scrollTo(0,(l=(c=history.state)==null?void 
0:c.y)!=null?l:0):(history.scrollRestoration="auto",sn(p.hash),history.scrollRestoration="manual")}),e.subscribe(()=>{history.scrollRestoration="manual"}),d(window,"beforeunload").subscribe(()=>{history.scrollRestoration="auto"}),t.pipe(Z("offset"),_e(100)).subscribe(({offset:p})=>{history.replaceState(p,"")}),s}var ri=Vt(ti());function oi(e){let t=e.separator.split("|").map(n=>n.replace(/(\(\?[!=<][^)]+\))/g,"").length===0?"\uFFFD":n).join("|"),r=new RegExp(t,"img"),o=(n,i,a)=>`${i}${a}`;return n=>{n=n.replace(/[\s*+\-:~^]+/g," ").trim();let i=new RegExp(`(^|${e.separator}|)(${n.replace(/[|\\{}()[\]^$+*?.-]/g,"\\$&").replace(r,"|")})`,"img");return a=>(0,ri.default)(a).replace(i,o).replace(/<\/mark>(\s+)]*>/img,"$1")}}function It(e){return e.type===1}function dr(e){return e.type===3}function ni(e,t){let r=vn(e);return S(I(location.protocol!=="file:"),Ve("search")).pipe(Ae(o=>o),v(()=>t)).subscribe(({config:o,docs:n})=>r.next({type:0,data:{config:o,docs:n,options:{suggest:G("search.suggest")}}})),r}function ii({document$:e}){let t=Te(),r=Ne(new URL("../versions.json",t.base)).pipe(ve(()=>M)),o=r.pipe(m(n=>{let[,i]=t.base.match(/([^/]+)\/?$/);return n.find(({version:a,aliases:s})=>a===i||s.includes(i))||n[0]}));r.pipe(m(n=>new Map(n.map(i=>[`${new URL(`../${i.version}/`,t.base)}`,i]))),v(n=>d(document.body,"click").pipe(b(i=>!i.metaKey&&!i.ctrlKey),ee(o),v(([i,a])=>{if(i.target instanceof Element){let s=i.target.closest("a");if(s&&!s.target&&n.has(s.href)){let p=s.href;return!i.target.closest(".md-version")&&n.get(p)===a?M:(i.preventDefault(),I(p))}}return M}),v(i=>{let{version:a}=n.get(i);return ur(new URL(i)).pipe(m(s=>{let c=xe().href.replace(t.base,"");return s.has(c.split("#")[0])?new URL(`../${a}/${c}`,t.base):new URL(i)}))})))).subscribe(n=>pt(n,!0)),z([r,o]).subscribe(([n,i])=>{P(".md-header__topic").appendChild(Mn(n,i))}),e.pipe(v(()=>o)).subscribe(n=>{var a;let i=__md_get("__outdated",sessionStorage);if(i===null){i=!0;let s=((a=t.version)==null?void 
0:a.default)||"latest";Array.isArray(s)||(s=[s]);e:for(let p of s)for(let c of n.aliases.concat(n.version))if(new RegExp(p,"i").test(c)){i=!1;break e}__md_set("__outdated",i,sessionStorage)}if(i)for(let s of ae("outdated"))s.hidden=!1})}function ns(e,{worker$:t}){let{searchParams:r}=xe();r.has("q")&&(Je("search",!0),e.value=r.get("q"),e.focus(),Ve("search").pipe(Ae(i=>!i)).subscribe(()=>{let i=xe();i.searchParams.delete("q"),history.replaceState({},"",`${i}`)}));let o=et(e),n=S(t.pipe(Ae(It)),d(e,"keyup"),o).pipe(m(()=>e.value),K());return z([n,o]).pipe(m(([i,a])=>({value:i,focus:a})),B(1))}function ai(e,{worker$:t}){let r=new g,o=r.pipe(X(),ne(!0));z([t.pipe(Ae(It)),r],(i,a)=>a).pipe(Z("value")).subscribe(({value:i})=>t.next({type:2,data:i})),r.pipe(Z("focus")).subscribe(({focus:i})=>{i&&Je("search",i)}),d(e.form,"reset").pipe(U(o)).subscribe(()=>e.focus());let n=P("header [for=__search]");return d(n,"click").subscribe(()=>e.focus()),ns(e,{worker$:t}).pipe(y(i=>r.next(i)),L(()=>r.complete()),m(i=>R({ref:e},i)),B(1))}function si(e,{worker$:t,query$:r}){let o=new g,n=tn(e.parentElement).pipe(b(Boolean)),i=e.parentElement,a=P(":scope > :first-child",e),s=P(":scope > :last-child",e);Ve("search").subscribe(l=>s.setAttribute("role",l?"list":"presentation")),o.pipe(ee(r),Ur(t.pipe(Ae(It)))).subscribe(([{items:l},{value:f}])=>{switch(l.length){case 0:a.textContent=f.length?ye("search.result.none"):ye("search.result.placeholder");break;case 1:a.textContent=ye("search.result.one");break;default:let u=sr(l.length);a.textContent=ye("search.result.other",u)}});let p=o.pipe(y(()=>s.innerHTML=""),v(({items:l})=>S(I(...l.slice(0,10)),I(...l.slice(10)).pipe(Ye(4),Vr(n),v(([f])=>f)))),m(Tn),pe());return p.subscribe(l=>s.appendChild(l)),p.pipe(oe(l=>{let f=fe("details",l);return typeof 
f=="undefined"?M:d(f,"toggle").pipe(U(o),m(()=>f))})).subscribe(l=>{l.open===!1&&l.offsetTop<=i.scrollTop&&i.scrollTo({top:l.offsetTop})}),t.pipe(b(dr),m(({data:l})=>l)).pipe(y(l=>o.next(l)),L(()=>o.complete()),m(l=>R({ref:e},l)))}function is(e,{query$:t}){return t.pipe(m(({value:r})=>{let o=xe();return o.hash="",r=r.replace(/\s+/g,"+").replace(/&/g,"%26").replace(/=/g,"%3D"),o.search=`q=${r}`,{url:o}}))}function ci(e,t){let r=new g,o=r.pipe(X(),ne(!0));return r.subscribe(({url:n})=>{e.setAttribute("data-clipboard-text",e.href),e.href=`${n}`}),d(e,"click").pipe(U(o)).subscribe(n=>n.preventDefault()),is(e,t).pipe(y(n=>r.next(n)),L(()=>r.complete()),m(n=>R({ref:e},n)))}function pi(e,{worker$:t,keyboard$:r}){let o=new g,n=Se("search-query"),i=S(d(n,"keydown"),d(n,"focus")).pipe(be(se),m(()=>n.value),K());return o.pipe(We(i),m(([{suggest:s},p])=>{let c=p.split(/([\s-]+)/);if(s!=null&&s.length&&c[c.length-1]){let l=s[s.length-1];l.startsWith(c[c.length-1])&&(c[c.length-1]=l)}else c.length=0;return c})).subscribe(s=>e.innerHTML=s.join("").replace(/\s/g," ")),r.pipe(b(({mode:s})=>s==="search")).subscribe(s=>{switch(s.type){case"ArrowRight":e.innerText.length&&n.selectionStart===n.value.length&&(n.value=e.innerText);break}}),t.pipe(b(dr),m(({data:s})=>s)).pipe(y(s=>o.next(s)),L(()=>o.complete()),m(()=>({ref:e})))}function li(e,{index$:t,keyboard$:r}){let o=Te();try{let n=ni(o.search,t),i=Se("search-query",e),a=Se("search-result",e);d(e,"click").pipe(b(({target:p})=>p instanceof Element&&!!p.closest("a"))).subscribe(()=>Je("search",!1)),r.pipe(b(({mode:p})=>p==="search")).subscribe(p=>{let c=Re();switch(p.type){case"Enter":if(c===i){let l=new Map;for(let f of $(":first-child [href]",a)){let u=f.firstElementChild;l.set(f,parseFloat(u.getAttribute("data-md-score")))}if(l.size){let[[f]]=[...l].sort(([,u],[,h])=>h-u);f.click()}p.claim()}break;case"Escape":case"Tab":Je("search",!1),i.blur();break;case"ArrowUp":case"ArrowDown":if(typeof c=="undefined")i.focus();else{let 
l=[i,...$(":not(details) > [href], summary, details[open] [href]",a)],f=Math.max(0,(Math.max(0,l.indexOf(c))+l.length+(p.type==="ArrowUp"?-1:1))%l.length);l[f].focus()}p.claim();break;default:i!==Re()&&i.focus()}}),r.pipe(b(({mode:p})=>p==="global")).subscribe(p=>{switch(p.type){case"f":case"s":case"/":i.focus(),i.select(),p.claim();break}});let s=ai(i,{worker$:n});return S(s,si(a,{worker$:n,query$:s})).pipe(Pe(...ae("search-share",e).map(p=>ci(p,{query$:s})),...ae("search-suggest",e).map(p=>pi(p,{worker$:n,keyboard$:r}))))}catch(n){return e.hidden=!0,Ke}}function mi(e,{index$:t,location$:r}){return z([t,r.pipe(Q(xe()),b(o=>!!o.searchParams.get("h")))]).pipe(m(([o,n])=>oi(o.config)(n.searchParams.get("h"))),m(o=>{var a;let n=new Map,i=document.createNodeIterator(e,NodeFilter.SHOW_TEXT);for(let s=i.nextNode();s;s=i.nextNode())if((a=s.parentElement)!=null&&a.offsetHeight){let p=s.textContent,c=o(p);c.length>p.length&&n.set(s,c)}for(let[s,p]of n){let{childNodes:c}=E("span",null,p);s.replaceWith(...Array.from(c))}return{ref:e,nodes:n}}))}function as(e,{viewport$:t,main$:r}){let o=e.closest(".md-grid"),n=o.offsetTop-o.parentElement.offsetTop;return z([r,t]).pipe(m(([{offset:i,height:a},{offset:{y:s}}])=>(a=a+Math.min(n,Math.max(0,s-i))-n,{height:a,locked:s>=i+n})),K((i,a)=>i.height===a.height&&i.locked===a.locked))}function Jr(e,o){var n=o,{header$:t}=n,r=io(n,["header$"]);let i=P(".md-sidebar__scrollwrap",e),{y:a}=Ue(i);return C(()=>{let s=new g,p=s.pipe(X(),ne(!0)),c=s.pipe(Le(0,me));return c.pipe(ee(t)).subscribe({next([{height:l},{height:f}]){i.style.height=`${l-2*a}px`,e.style.top=`${f}px`},complete(){i.style.height="",e.style.top=""}}),c.pipe(Ae()).subscribe(()=>{for(let l of $(".md-nav__link--active[href]",e)){if(!l.clientHeight)continue;let f=l.closest(".md-sidebar__scrollwrap");if(typeof f!="undefined"){let 
u=l.offsetTop-f.offsetTop,{height:h}=ce(f);f.scrollTo({top:u-h/2})}}}),ue($("label[tabindex]",e)).pipe(oe(l=>d(l,"click").pipe(be(se),m(()=>l),U(p)))).subscribe(l=>{let f=P(`[id="${l.htmlFor}"]`);P(`[aria-labelledby="${l.id}"]`).setAttribute("aria-expanded",`${f.checked}`)}),as(e,r).pipe(y(l=>s.next(l)),L(()=>s.complete()),m(l=>R({ref:e},l)))})}function fi(e,t){if(typeof t!="undefined"){let r=`https://api.github.com/repos/${e}/${t}`;return Ct(Ne(`${r}/releases/latest`).pipe(ve(()=>M),m(o=>({version:o.tag_name})),Be({})),Ne(r).pipe(ve(()=>M),m(o=>({stars:o.stargazers_count,forks:o.forks_count})),Be({}))).pipe(m(([o,n])=>R(R({},o),n)))}else{let r=`https://api.github.com/users/${e}`;return Ne(r).pipe(m(o=>({repositories:o.public_repos})),Be({}))}}function ui(e,t){let r=`https://${e}/api/v4/projects/${encodeURIComponent(t)}`;return Ne(r).pipe(ve(()=>M),m(({star_count:o,forks_count:n})=>({stars:o,forks:n})),Be({}))}function di(e){let t=e.match(/^.+github\.com\/([^/]+)\/?([^/]+)?/i);if(t){let[,r,o]=t;return fi(r,o)}if(t=e.match(/^.+?([^/]*gitlab[^/]+)\/(.+?)\/?$/i),t){let[,r,o]=t;return ui(r,o)}return M}var ss;function cs(e){return ss||(ss=C(()=>{let t=__md_get("__source",sessionStorage);if(t)return I(t);if(ae("consent").length){let o=__md_get("__consent");if(!(o&&o.github))return M}return di(e.href).pipe(y(o=>__md_set("__source",o,sessionStorage)))}).pipe(ve(()=>M),b(t=>Object.keys(t).length>0),m(t=>({facts:t})),B(1)))}function hi(e){let t=P(":scope > :last-child",e);return C(()=>{let r=new g;return r.subscribe(({facts:o})=>{t.appendChild(Sn(o)),t.classList.add("md-source__repository--active")}),cs(e).pipe(y(o=>r.next(o)),L(()=>r.complete()),m(o=>R({ref:e},o)))})}function ps(e,{viewport$:t,header$:r}){return ge(document.body).pipe(v(()=>mr(e,{header$:r,viewport$:t})),m(({offset:{y:o}})=>({hidden:o>=10})),Z("hidden"))}function bi(e,t){return C(()=>{let r=new g;return 
r.subscribe({next({hidden:o}){e.hidden=o},complete(){e.hidden=!1}}),(G("navigation.tabs.sticky")?I({hidden:!1}):ps(e,t)).pipe(y(o=>r.next(o)),L(()=>r.complete()),m(o=>R({ref:e},o)))})}function ls(e,{viewport$:t,header$:r}){let o=new Map,n=$(".md-nav__link",e);for(let s of n){let p=decodeURIComponent(s.hash.substring(1)),c=fe(`[id="${p}"]`);typeof c!="undefined"&&o.set(s,c)}let i=r.pipe(Z("height"),m(({height:s})=>{let p=Se("main"),c=P(":scope > :first-child",p);return s+.8*(c.offsetTop-p.offsetTop)}),pe());return ge(document.body).pipe(Z("height"),v(s=>C(()=>{let p=[];return I([...o].reduce((c,[l,f])=>{for(;p.length&&o.get(p[p.length-1]).tagName>=f.tagName;)p.pop();let u=f.offsetTop;for(;!u&&f.parentElement;)f=f.parentElement,u=f.offsetTop;let h=f.offsetParent;for(;h;h=h.offsetParent)u+=h.offsetTop;return c.set([...p=[...p,l]].reverse(),u)},new Map))}).pipe(m(p=>new Map([...p].sort(([,c],[,l])=>c-l))),We(i),v(([p,c])=>t.pipe(jr(([l,f],{offset:{y:u},size:h})=>{let w=u+h.height>=Math.floor(s.height);for(;f.length;){let[,A]=f[0];if(A-c=u&&!w)f=[l.pop(),...f];else break}return[l,f]},[[],[...p]]),K((l,f)=>l[0]===f[0]&&l[1]===f[1])))))).pipe(m(([s,p])=>({prev:s.map(([c])=>c),next:p.map(([c])=>c)})),Q({prev:[],next:[]}),Ye(2,1),m(([s,p])=>s.prev.length{let i=new g,a=i.pipe(X(),ne(!0));if(i.subscribe(({prev:s,next:p})=>{for(let[c]of p)c.classList.remove("md-nav__link--passed"),c.classList.remove("md-nav__link--active");for(let[c,[l]]of s.entries())l.classList.add("md-nav__link--passed"),l.classList.toggle("md-nav__link--active",c===s.length-1)}),G("toc.follow")){let s=S(t.pipe(_e(1),m(()=>{})),t.pipe(_e(250),m(()=>"smooth")));i.pipe(b(({prev:p})=>p.length>0),We(o.pipe(be(se))),ee(s)).subscribe(([[{prev:p}],c])=>{let[l]=p[p.length-1];if(l.offsetHeight){let f=cr(l);if(typeof f!="undefined"){let u=l.offsetTop-f.offsetTop,{height:h}=ce(f);f.scrollTo({top:u-h/2,behavior:c})}}})}return 
G("navigation.tracking")&&t.pipe(U(a),Z("offset"),_e(250),Ce(1),U(n.pipe(Ce(1))),st({delay:250}),ee(i)).subscribe(([,{prev:s}])=>{let p=xe(),c=s[s.length-1];if(c&&c.length){let[l]=c,{hash:f}=new URL(l.href);p.hash!==f&&(p.hash=f,history.replaceState({},"",`${p}`))}else p.hash="",history.replaceState({},"",`${p}`)}),ls(e,{viewport$:t,header$:r}).pipe(y(s=>i.next(s)),L(()=>i.complete()),m(s=>R({ref:e},s)))})}function ms(e,{viewport$:t,main$:r,target$:o}){let n=t.pipe(m(({offset:{y:a}})=>a),Ye(2,1),m(([a,s])=>a>s&&s>0),K()),i=r.pipe(m(({active:a})=>a));return z([i,n]).pipe(m(([a,s])=>!(a&&s)),K(),U(o.pipe(Ce(1))),ne(!0),st({delay:250}),m(a=>({hidden:a})))}function gi(e,{viewport$:t,header$:r,main$:o,target$:n}){let i=new g,a=i.pipe(X(),ne(!0));return i.subscribe({next({hidden:s}){e.hidden=s,s?(e.setAttribute("tabindex","-1"),e.blur()):e.removeAttribute("tabindex")},complete(){e.style.top="",e.hidden=!0,e.removeAttribute("tabindex")}}),r.pipe(U(a),Z("height")).subscribe(({height:s})=>{e.style.top=`${s+16}px`}),d(e,"click").subscribe(s=>{s.preventDefault(),window.scrollTo({top:0})}),ms(e,{viewport$:t,main$:o,target$:n}).pipe(y(s=>i.next(s)),L(()=>i.complete()),m(s=>R({ref:e},s)))}function xi({document$:e,viewport$:t}){e.pipe(v(()=>$(".md-ellipsis")),oe(r=>tt(r).pipe(U(e.pipe(Ce(1))),b(o=>o),m(()=>r),we(1))),b(r=>r.offsetWidth{let o=r.innerText,n=r.closest("a")||r;return n.title=o,lt(n,{viewport$:t}).pipe(U(e.pipe(Ce(1))),L(()=>n.removeAttribute("title")))})).subscribe(),e.pipe(v(()=>$(".md-status")),oe(r=>lt(r,{viewport$:t}))).subscribe()}function yi({document$:e,tablet$:t}){e.pipe(v(()=>$(".md-toggle--indeterminate")),y(r=>{r.indeterminate=!0,r.checked=!1}),oe(r=>d(r,"change").pipe(Dr(()=>r.classList.contains("md-toggle--indeterminate")),m(()=>r))),ee(t)).subscribe(([r,o])=>{r.classList.remove("md-toggle--indeterminate"),o&&(r.checked=!1)})}function fs(){return/(iPad|iPhone|iPod)/.test(navigator.userAgent)}function 
Ei({document$:e}){e.pipe(v(()=>$("[data-md-scrollfix]")),y(t=>t.removeAttribute("data-md-scrollfix")),b(fs),oe(t=>d(t,"touchstart").pipe(m(()=>t)))).subscribe(t=>{let r=t.scrollTop;r===0?t.scrollTop=1:r+t.offsetHeight===t.scrollHeight&&(t.scrollTop=r-1)})}function wi({viewport$:e,tablet$:t}){z([Ve("search"),t]).pipe(m(([r,o])=>r&&!o),v(r=>I(r).pipe(Ge(r?400:100))),ee(e)).subscribe(([r,{offset:{y:o}}])=>{if(r)document.body.setAttribute("data-md-scrolllock",""),document.body.style.top=`-${o}px`;else{let n=-1*parseInt(document.body.style.top,10);document.body.removeAttribute("data-md-scrolllock"),document.body.style.top="",n&&window.scrollTo(0,n)}})}Object.entries||(Object.entries=function(e){let t=[];for(let r of Object.keys(e))t.push([r,e[r]]);return t});Object.values||(Object.values=function(e){let t=[];for(let r of Object.keys(e))t.push(e[r]);return t});typeof Element!="undefined"&&(Element.prototype.scrollTo||(Element.prototype.scrollTo=function(e,t){typeof e=="object"?(this.scrollLeft=e.left,this.scrollTop=e.top):(this.scrollLeft=e,this.scrollTop=t)}),Element.prototype.replaceWith||(Element.prototype.replaceWith=function(...e){let t=this.parentNode;if(t){e.length===0&&t.removeChild(this);for(let r=e.length-1;r>=0;r--){let o=e[r];typeof o=="string"?o=document.createTextNode(o):o.parentNode&&o.parentNode.removeChild(o),r?t.insertBefore(this.previousSibling,o):t.replaceChild(o,this)}}}));function us(){return location.protocol==="file:"?wt(`${new URL("search/search_index.js",Xr.base)}`).pipe(m(()=>__index),B(1)):Ne(new URL("search/search_index.json",Xr.base))}document.documentElement.classList.remove("no-js");document.documentElement.classList.add("js");var ot=Yo(),jt=nn(),Ot=cn(jt),Zr=on(),Oe=bn(),hr=$t("(min-width: 960px)"),Si=$t("(min-width: 1220px)"),Oi=pn(),Xr=Te(),Mi=document.forms.namedItem("search")?us():Ke,eo=new g;Bn({alert$:eo});var to=new g;G("navigation.instant")&&Zn({location$:jt,viewport$:Oe,progress$:to}).subscribe(ot);var 
Ti;((Ti=Xr.version)==null?void 0:Ti.provider)==="mike"&&ii({document$:ot});S(jt,Ot).pipe(Ge(125)).subscribe(()=>{Je("drawer",!1),Je("search",!1)});Zr.pipe(b(({mode:e})=>e==="global")).subscribe(e=>{switch(e.type){case"p":case",":let t=fe("link[rel=prev]");typeof t!="undefined"&&pt(t);break;case"n":case".":let r=fe("link[rel=next]");typeof r!="undefined"&&pt(r);break;case"Enter":let o=Re();o instanceof HTMLLabelElement&&o.click()}});xi({viewport$:Oe,document$:ot});yi({document$:ot,tablet$:hr});Ei({document$:ot});wi({viewport$:Oe,tablet$:hr});var rt=Nn(Se("header"),{viewport$:Oe}),Ft=ot.pipe(m(()=>Se("main")),v(e=>Qn(e,{viewport$:Oe,header$:rt})),B(1)),ds=S(...ae("consent").map(e=>xn(e,{target$:Ot})),...ae("dialog").map(e=>Dn(e,{alert$:eo})),...ae("header").map(e=>zn(e,{viewport$:Oe,header$:rt,main$:Ft})),...ae("palette").map(e=>Kn(e)),...ae("progress").map(e=>Yn(e,{progress$:to})),...ae("search").map(e=>li(e,{index$:Mi,keyboard$:Zr})),...ae("source").map(e=>hi(e))),hs=C(()=>S(...ae("announce").map(e=>gn(e)),...ae("content").map(e=>Un(e,{viewport$:Oe,target$:Ot,print$:Oi})),...ae("content").map(e=>G("search.highlight")?mi(e,{index$:Mi,location$:jt}):M),...ae("header-title").map(e=>qn(e,{viewport$:Oe,header$:rt})),...ae("sidebar").map(e=>e.getAttribute("data-md-type")==="navigation"?Nr(Si,()=>Jr(e,{viewport$:Oe,header$:rt,main$:Ft})):Nr(hr,()=>Jr(e,{viewport$:Oe,header$:rt,main$:Ft}))),...ae("tabs").map(e=>bi(e,{viewport$:Oe,header$:rt})),...ae("toc").map(e=>vi(e,{viewport$:Oe,header$:rt,main$:Ft,target$:Ot})),...ae("top").map(e=>gi(e,{viewport$:Oe,header$:rt,main$:Ft,target$:Ot})))),Li=ot.pipe(v(()=>hs),Pe(ds),B(1));Li.subscribe();window.document$=ot;window.location$=jt;window.target$=Ot;window.keyboard$=Zr;window.viewport$=Oe;window.tablet$=hr;window.screen$=Si;window.print$=Oi;window.alert$=eo;window.progress$=to;window.component$=Li;})(); +//# sourceMappingURL=bundle.dd8806f2.min.js.map + diff --git a/v0.39.2/assets/javascripts/bundle.dd8806f2.min.js.map 
b/v0.39.2/assets/javascripts/bundle.dd8806f2.min.js.map new file mode 100644 index 000000000..17bf02572 --- /dev/null +++ b/v0.39.2/assets/javascripts/bundle.dd8806f2.min.js.map @@ -0,0 +1,7 @@ +{ + "version": 3, + "sources": ["node_modules/focus-visible/dist/focus-visible.js", "node_modules/clipboard/dist/clipboard.js", "node_modules/escape-html/index.js", "src/templates/assets/javascripts/bundle.ts", "node_modules/rxjs/node_modules/tslib/tslib.es6.js", "node_modules/rxjs/src/internal/util/isFunction.ts", "node_modules/rxjs/src/internal/util/createErrorClass.ts", "node_modules/rxjs/src/internal/util/UnsubscriptionError.ts", "node_modules/rxjs/src/internal/util/arrRemove.ts", "node_modules/rxjs/src/internal/Subscription.ts", "node_modules/rxjs/src/internal/config.ts", "node_modules/rxjs/src/internal/scheduler/timeoutProvider.ts", "node_modules/rxjs/src/internal/util/reportUnhandledError.ts", "node_modules/rxjs/src/internal/util/noop.ts", "node_modules/rxjs/src/internal/NotificationFactories.ts", "node_modules/rxjs/src/internal/util/errorContext.ts", "node_modules/rxjs/src/internal/Subscriber.ts", "node_modules/rxjs/src/internal/symbol/observable.ts", "node_modules/rxjs/src/internal/util/identity.ts", "node_modules/rxjs/src/internal/util/pipe.ts", "node_modules/rxjs/src/internal/Observable.ts", "node_modules/rxjs/src/internal/util/lift.ts", "node_modules/rxjs/src/internal/operators/OperatorSubscriber.ts", "node_modules/rxjs/src/internal/scheduler/animationFrameProvider.ts", "node_modules/rxjs/src/internal/util/ObjectUnsubscribedError.ts", "node_modules/rxjs/src/internal/Subject.ts", "node_modules/rxjs/src/internal/BehaviorSubject.ts", "node_modules/rxjs/src/internal/scheduler/dateTimestampProvider.ts", "node_modules/rxjs/src/internal/ReplaySubject.ts", "node_modules/rxjs/src/internal/scheduler/Action.ts", "node_modules/rxjs/src/internal/scheduler/intervalProvider.ts", "node_modules/rxjs/src/internal/scheduler/AsyncAction.ts", 
"node_modules/rxjs/src/internal/Scheduler.ts", "node_modules/rxjs/src/internal/scheduler/AsyncScheduler.ts", "node_modules/rxjs/src/internal/scheduler/async.ts", "node_modules/rxjs/src/internal/scheduler/QueueAction.ts", "node_modules/rxjs/src/internal/scheduler/QueueScheduler.ts", "node_modules/rxjs/src/internal/scheduler/queue.ts", "node_modules/rxjs/src/internal/scheduler/AnimationFrameAction.ts", "node_modules/rxjs/src/internal/scheduler/AnimationFrameScheduler.ts", "node_modules/rxjs/src/internal/scheduler/animationFrame.ts", "node_modules/rxjs/src/internal/observable/empty.ts", "node_modules/rxjs/src/internal/util/isScheduler.ts", "node_modules/rxjs/src/internal/util/args.ts", "node_modules/rxjs/src/internal/util/isArrayLike.ts", "node_modules/rxjs/src/internal/util/isPromise.ts", "node_modules/rxjs/src/internal/util/isInteropObservable.ts", "node_modules/rxjs/src/internal/util/isAsyncIterable.ts", "node_modules/rxjs/src/internal/util/throwUnobservableError.ts", "node_modules/rxjs/src/internal/symbol/iterator.ts", "node_modules/rxjs/src/internal/util/isIterable.ts", "node_modules/rxjs/src/internal/util/isReadableStreamLike.ts", "node_modules/rxjs/src/internal/observable/innerFrom.ts", "node_modules/rxjs/src/internal/util/executeSchedule.ts", "node_modules/rxjs/src/internal/operators/observeOn.ts", "node_modules/rxjs/src/internal/operators/subscribeOn.ts", "node_modules/rxjs/src/internal/scheduled/scheduleObservable.ts", "node_modules/rxjs/src/internal/scheduled/schedulePromise.ts", "node_modules/rxjs/src/internal/scheduled/scheduleArray.ts", "node_modules/rxjs/src/internal/scheduled/scheduleIterable.ts", "node_modules/rxjs/src/internal/scheduled/scheduleAsyncIterable.ts", "node_modules/rxjs/src/internal/scheduled/scheduleReadableStreamLike.ts", "node_modules/rxjs/src/internal/scheduled/scheduled.ts", "node_modules/rxjs/src/internal/observable/from.ts", "node_modules/rxjs/src/internal/observable/of.ts", 
"node_modules/rxjs/src/internal/observable/throwError.ts", "node_modules/rxjs/src/internal/util/EmptyError.ts", "node_modules/rxjs/src/internal/util/isDate.ts", "node_modules/rxjs/src/internal/operators/map.ts", "node_modules/rxjs/src/internal/util/mapOneOrManyArgs.ts", "node_modules/rxjs/src/internal/util/argsArgArrayOrObject.ts", "node_modules/rxjs/src/internal/util/createObject.ts", "node_modules/rxjs/src/internal/observable/combineLatest.ts", "node_modules/rxjs/src/internal/operators/mergeInternals.ts", "node_modules/rxjs/src/internal/operators/mergeMap.ts", "node_modules/rxjs/src/internal/operators/mergeAll.ts", "node_modules/rxjs/src/internal/operators/concatAll.ts", "node_modules/rxjs/src/internal/observable/concat.ts", "node_modules/rxjs/src/internal/observable/defer.ts", "node_modules/rxjs/src/internal/observable/fromEvent.ts", "node_modules/rxjs/src/internal/observable/fromEventPattern.ts", "node_modules/rxjs/src/internal/observable/timer.ts", "node_modules/rxjs/src/internal/observable/merge.ts", "node_modules/rxjs/src/internal/observable/never.ts", "node_modules/rxjs/src/internal/util/argsOrArgArray.ts", "node_modules/rxjs/src/internal/operators/filter.ts", "node_modules/rxjs/src/internal/observable/zip.ts", "node_modules/rxjs/src/internal/operators/audit.ts", "node_modules/rxjs/src/internal/operators/auditTime.ts", "node_modules/rxjs/src/internal/operators/bufferCount.ts", "node_modules/rxjs/src/internal/operators/catchError.ts", "node_modules/rxjs/src/internal/operators/scanInternals.ts", "node_modules/rxjs/src/internal/operators/combineLatest.ts", "node_modules/rxjs/src/internal/operators/combineLatestWith.ts", "node_modules/rxjs/src/internal/operators/debounce.ts", "node_modules/rxjs/src/internal/operators/debounceTime.ts", "node_modules/rxjs/src/internal/operators/defaultIfEmpty.ts", "node_modules/rxjs/src/internal/operators/take.ts", "node_modules/rxjs/src/internal/operators/ignoreElements.ts", "node_modules/rxjs/src/internal/operators/mapTo.ts", 
"node_modules/rxjs/src/internal/operators/delayWhen.ts", "node_modules/rxjs/src/internal/operators/delay.ts", "node_modules/rxjs/src/internal/operators/distinctUntilChanged.ts", "node_modules/rxjs/src/internal/operators/distinctUntilKeyChanged.ts", "node_modules/rxjs/src/internal/operators/throwIfEmpty.ts", "node_modules/rxjs/src/internal/operators/endWith.ts", "node_modules/rxjs/src/internal/operators/finalize.ts", "node_modules/rxjs/src/internal/operators/first.ts", "node_modules/rxjs/src/internal/operators/takeLast.ts", "node_modules/rxjs/src/internal/operators/merge.ts", "node_modules/rxjs/src/internal/operators/mergeWith.ts", "node_modules/rxjs/src/internal/operators/repeat.ts", "node_modules/rxjs/src/internal/operators/scan.ts", "node_modules/rxjs/src/internal/operators/share.ts", "node_modules/rxjs/src/internal/operators/shareReplay.ts", "node_modules/rxjs/src/internal/operators/skip.ts", "node_modules/rxjs/src/internal/operators/skipUntil.ts", "node_modules/rxjs/src/internal/operators/startWith.ts", "node_modules/rxjs/src/internal/operators/switchMap.ts", "node_modules/rxjs/src/internal/operators/takeUntil.ts", "node_modules/rxjs/src/internal/operators/takeWhile.ts", "node_modules/rxjs/src/internal/operators/tap.ts", "node_modules/rxjs/src/internal/operators/throttle.ts", "node_modules/rxjs/src/internal/operators/throttleTime.ts", "node_modules/rxjs/src/internal/operators/withLatestFrom.ts", "node_modules/rxjs/src/internal/operators/zip.ts", "node_modules/rxjs/src/internal/operators/zipWith.ts", "src/templates/assets/javascripts/browser/document/index.ts", "src/templates/assets/javascripts/browser/element/_/index.ts", "src/templates/assets/javascripts/browser/element/focus/index.ts", "src/templates/assets/javascripts/browser/element/hover/index.ts", "src/templates/assets/javascripts/utilities/h/index.ts", "src/templates/assets/javascripts/utilities/round/index.ts", "src/templates/assets/javascripts/browser/script/index.ts", 
"src/templates/assets/javascripts/browser/element/size/_/index.ts", "src/templates/assets/javascripts/browser/element/size/content/index.ts", "src/templates/assets/javascripts/browser/element/offset/_/index.ts", "src/templates/assets/javascripts/browser/element/offset/content/index.ts", "src/templates/assets/javascripts/browser/element/visibility/index.ts", "src/templates/assets/javascripts/browser/toggle/index.ts", "src/templates/assets/javascripts/browser/keyboard/index.ts", "src/templates/assets/javascripts/browser/location/_/index.ts", "src/templates/assets/javascripts/browser/location/hash/index.ts", "src/templates/assets/javascripts/browser/media/index.ts", "src/templates/assets/javascripts/browser/request/index.ts", "src/templates/assets/javascripts/browser/viewport/offset/index.ts", "src/templates/assets/javascripts/browser/viewport/size/index.ts", "src/templates/assets/javascripts/browser/viewport/_/index.ts", "src/templates/assets/javascripts/browser/viewport/at/index.ts", "src/templates/assets/javascripts/browser/worker/index.ts", "src/templates/assets/javascripts/_/index.ts", "src/templates/assets/javascripts/components/_/index.ts", "src/templates/assets/javascripts/components/announce/index.ts", "src/templates/assets/javascripts/components/consent/index.ts", "src/templates/assets/javascripts/templates/tooltip/index.tsx", "src/templates/assets/javascripts/templates/annotation/index.tsx", "src/templates/assets/javascripts/templates/clipboard/index.tsx", "src/templates/assets/javascripts/templates/search/index.tsx", "src/templates/assets/javascripts/templates/source/index.tsx", "src/templates/assets/javascripts/templates/tabbed/index.tsx", "src/templates/assets/javascripts/templates/table/index.tsx", "src/templates/assets/javascripts/templates/version/index.tsx", "src/templates/assets/javascripts/components/tooltip2/index.ts", "src/templates/assets/javascripts/components/content/annotation/_/index.ts", 
"src/templates/assets/javascripts/components/content/annotation/list/index.ts", "src/templates/assets/javascripts/components/content/annotation/block/index.ts", "src/templates/assets/javascripts/components/content/code/_/index.ts", "src/templates/assets/javascripts/components/content/details/index.ts", "src/templates/assets/javascripts/components/content/mermaid/index.css", "src/templates/assets/javascripts/components/content/mermaid/index.ts", "src/templates/assets/javascripts/components/content/table/index.ts", "src/templates/assets/javascripts/components/content/tabs/index.ts", "src/templates/assets/javascripts/components/content/_/index.ts", "src/templates/assets/javascripts/components/dialog/index.ts", "src/templates/assets/javascripts/components/tooltip/index.ts", "src/templates/assets/javascripts/components/header/_/index.ts", "src/templates/assets/javascripts/components/header/title/index.ts", "src/templates/assets/javascripts/components/main/index.ts", "src/templates/assets/javascripts/components/palette/index.ts", "src/templates/assets/javascripts/components/progress/index.ts", "src/templates/assets/javascripts/integrations/clipboard/index.ts", "src/templates/assets/javascripts/integrations/sitemap/index.ts", "src/templates/assets/javascripts/integrations/instant/index.ts", "src/templates/assets/javascripts/integrations/search/highlighter/index.ts", "src/templates/assets/javascripts/integrations/search/worker/message/index.ts", "src/templates/assets/javascripts/integrations/search/worker/_/index.ts", "src/templates/assets/javascripts/integrations/version/index.ts", "src/templates/assets/javascripts/components/search/query/index.ts", "src/templates/assets/javascripts/components/search/result/index.ts", "src/templates/assets/javascripts/components/search/share/index.ts", "src/templates/assets/javascripts/components/search/suggest/index.ts", "src/templates/assets/javascripts/components/search/_/index.ts", 
"src/templates/assets/javascripts/components/search/highlight/index.ts", "src/templates/assets/javascripts/components/sidebar/index.ts", "src/templates/assets/javascripts/components/source/facts/github/index.ts", "src/templates/assets/javascripts/components/source/facts/gitlab/index.ts", "src/templates/assets/javascripts/components/source/facts/_/index.ts", "src/templates/assets/javascripts/components/source/_/index.ts", "src/templates/assets/javascripts/components/tabs/index.ts", "src/templates/assets/javascripts/components/toc/index.ts", "src/templates/assets/javascripts/components/top/index.ts", "src/templates/assets/javascripts/patches/ellipsis/index.ts", "src/templates/assets/javascripts/patches/indeterminate/index.ts", "src/templates/assets/javascripts/patches/scrollfix/index.ts", "src/templates/assets/javascripts/patches/scrolllock/index.ts", "src/templates/assets/javascripts/polyfills/index.ts"], + "sourcesContent": ["(function (global, factory) {\n typeof exports === 'object' && typeof module !== 'undefined' ? factory() :\n typeof define === 'function' && define.amd ? 
define(factory) :\n (factory());\n}(this, (function () { 'use strict';\n\n /**\n * Applies the :focus-visible polyfill at the given scope.\n * A scope in this case is either the top-level Document or a Shadow Root.\n *\n * @param {(Document|ShadowRoot)} scope\n * @see https://github.com/WICG/focus-visible\n */\n function applyFocusVisiblePolyfill(scope) {\n var hadKeyboardEvent = true;\n var hadFocusVisibleRecently = false;\n var hadFocusVisibleRecentlyTimeout = null;\n\n var inputTypesAllowlist = {\n text: true,\n search: true,\n url: true,\n tel: true,\n email: true,\n password: true,\n number: true,\n date: true,\n month: true,\n week: true,\n time: true,\n datetime: true,\n 'datetime-local': true\n };\n\n /**\n * Helper function for legacy browsers and iframes which sometimes focus\n * elements like document, body, and non-interactive SVG.\n * @param {Element} el\n */\n function isValidFocusTarget(el) {\n if (\n el &&\n el !== document &&\n el.nodeName !== 'HTML' &&\n el.nodeName !== 'BODY' &&\n 'classList' in el &&\n 'contains' in el.classList\n ) {\n return true;\n }\n return false;\n }\n\n /**\n * Computes whether the given element should automatically trigger the\n * `focus-visible` class being added, i.e. 
whether it should always match\n * `:focus-visible` when focused.\n * @param {Element} el\n * @return {boolean}\n */\n function focusTriggersKeyboardModality(el) {\n var type = el.type;\n var tagName = el.tagName;\n\n if (tagName === 'INPUT' && inputTypesAllowlist[type] && !el.readOnly) {\n return true;\n }\n\n if (tagName === 'TEXTAREA' && !el.readOnly) {\n return true;\n }\n\n if (el.isContentEditable) {\n return true;\n }\n\n return false;\n }\n\n /**\n * Add the `focus-visible` class to the given element if it was not added by\n * the author.\n * @param {Element} el\n */\n function addFocusVisibleClass(el) {\n if (el.classList.contains('focus-visible')) {\n return;\n }\n el.classList.add('focus-visible');\n el.setAttribute('data-focus-visible-added', '');\n }\n\n /**\n * Remove the `focus-visible` class from the given element if it was not\n * originally added by the author.\n * @param {Element} el\n */\n function removeFocusVisibleClass(el) {\n if (!el.hasAttribute('data-focus-visible-added')) {\n return;\n }\n el.classList.remove('focus-visible');\n el.removeAttribute('data-focus-visible-added');\n }\n\n /**\n * If the most recent user interaction was via the keyboard;\n * and the key press did not include a meta, alt/option, or control key;\n * then the modality is keyboard. 
Otherwise, the modality is not keyboard.\n * Apply `focus-visible` to any current active element and keep track\n * of our keyboard modality state with `hadKeyboardEvent`.\n * @param {KeyboardEvent} e\n */\n function onKeyDown(e) {\n if (e.metaKey || e.altKey || e.ctrlKey) {\n return;\n }\n\n if (isValidFocusTarget(scope.activeElement)) {\n addFocusVisibleClass(scope.activeElement);\n }\n\n hadKeyboardEvent = true;\n }\n\n /**\n * If at any point a user clicks with a pointing device, ensure that we change\n * the modality away from keyboard.\n * This avoids the situation where a user presses a key on an already focused\n * element, and then clicks on a different element, focusing it with a\n * pointing device, while we still think we're in keyboard modality.\n * @param {Event} e\n */\n function onPointerDown(e) {\n hadKeyboardEvent = false;\n }\n\n /**\n * On `focus`, add the `focus-visible` class to the target if:\n * - the target received focus as a result of keyboard navigation, or\n * - the event target is an element that will likely require interaction\n * via the keyboard (e.g. 
a text box)\n * @param {Event} e\n */\n function onFocus(e) {\n // Prevent IE from focusing the document or HTML element.\n if (!isValidFocusTarget(e.target)) {\n return;\n }\n\n if (hadKeyboardEvent || focusTriggersKeyboardModality(e.target)) {\n addFocusVisibleClass(e.target);\n }\n }\n\n /**\n * On `blur`, remove the `focus-visible` class from the target.\n * @param {Event} e\n */\n function onBlur(e) {\n if (!isValidFocusTarget(e.target)) {\n return;\n }\n\n if (\n e.target.classList.contains('focus-visible') ||\n e.target.hasAttribute('data-focus-visible-added')\n ) {\n // To detect a tab/window switch, we look for a blur event followed\n // rapidly by a visibility change.\n // If we don't see a visibility change within 100ms, it's probably a\n // regular focus change.\n hadFocusVisibleRecently = true;\n window.clearTimeout(hadFocusVisibleRecentlyTimeout);\n hadFocusVisibleRecentlyTimeout = window.setTimeout(function() {\n hadFocusVisibleRecently = false;\n }, 100);\n removeFocusVisibleClass(e.target);\n }\n }\n\n /**\n * If the user changes tabs, keep track of whether or not the previously\n * focused element had .focus-visible.\n * @param {Event} e\n */\n function onVisibilityChange(e) {\n if (document.visibilityState === 'hidden') {\n // If the tab becomes active again, the browser will handle calling focus\n // on the element (Safari actually calls it twice).\n // If this tab change caused a blur on an element with focus-visible,\n // re-apply the class when the user switches back to the tab.\n if (hadFocusVisibleRecently) {\n hadKeyboardEvent = true;\n }\n addInitialPointerMoveListeners();\n }\n }\n\n /**\n * Add a group of listeners to detect usage of any pointing devices.\n * These listeners will be added when the polyfill first loads, and anytime\n * the window is blurred, so that they are active when the window regains\n * focus.\n */\n function addInitialPointerMoveListeners() {\n document.addEventListener('mousemove', onInitialPointerMove);\n 
document.addEventListener('mousedown', onInitialPointerMove);\n document.addEventListener('mouseup', onInitialPointerMove);\n document.addEventListener('pointermove', onInitialPointerMove);\n document.addEventListener('pointerdown', onInitialPointerMove);\n document.addEventListener('pointerup', onInitialPointerMove);\n document.addEventListener('touchmove', onInitialPointerMove);\n document.addEventListener('touchstart', onInitialPointerMove);\n document.addEventListener('touchend', onInitialPointerMove);\n }\n\n function removeInitialPointerMoveListeners() {\n document.removeEventListener('mousemove', onInitialPointerMove);\n document.removeEventListener('mousedown', onInitialPointerMove);\n document.removeEventListener('mouseup', onInitialPointerMove);\n document.removeEventListener('pointermove', onInitialPointerMove);\n document.removeEventListener('pointerdown', onInitialPointerMove);\n document.removeEventListener('pointerup', onInitialPointerMove);\n document.removeEventListener('touchmove', onInitialPointerMove);\n document.removeEventListener('touchstart', onInitialPointerMove);\n document.removeEventListener('touchend', onInitialPointerMove);\n }\n\n /**\n * When the polfyill first loads, assume the user is in keyboard modality.\n * If any event is received from a pointing device (e.g. mouse, pointer,\n * touch), turn off keyboard modality.\n * This accounts for situations where focus enters the page from the URL bar.\n * @param {Event} e\n */\n function onInitialPointerMove(e) {\n // Work around a Safari quirk that fires a mousemove on whenever the\n // window blurs, even if you're tabbing out of the page. \u00AF\\_(\u30C4)_/\u00AF\n if (e.target.nodeName && e.target.nodeName.toLowerCase() === 'html') {\n return;\n }\n\n hadKeyboardEvent = false;\n removeInitialPointerMoveListeners();\n }\n\n // For some kinds of state, we are interested in changes at the global scope\n // only. 
For example, global pointer input, global key presses and global\n // visibility change should affect the state at every scope:\n document.addEventListener('keydown', onKeyDown, true);\n document.addEventListener('mousedown', onPointerDown, true);\n document.addEventListener('pointerdown', onPointerDown, true);\n document.addEventListener('touchstart', onPointerDown, true);\n document.addEventListener('visibilitychange', onVisibilityChange, true);\n\n addInitialPointerMoveListeners();\n\n // For focus and blur, we specifically care about state changes in the local\n // scope. This is because focus / blur events that originate from within a\n // shadow root are not re-dispatched from the host element if it was already\n // the active element in its own scope:\n scope.addEventListener('focus', onFocus, true);\n scope.addEventListener('blur', onBlur, true);\n\n // We detect that a node is a ShadowRoot by ensuring that it is a\n // DocumentFragment and also has a host property. This check covers native\n // implementation and polyfill implementation transparently. If we only cared\n // about the native implementation, we could just check if the scope was\n // an instance of a ShadowRoot.\n if (scope.nodeType === Node.DOCUMENT_FRAGMENT_NODE && scope.host) {\n // Since a ShadowRoot is a special kind of DocumentFragment, it does not\n // have a root element to add a class to. 
So, we add this attribute to the\n // host element instead:\n scope.host.setAttribute('data-js-focus-visible', '');\n } else if (scope.nodeType === Node.DOCUMENT_NODE) {\n document.documentElement.classList.add('js-focus-visible');\n document.documentElement.setAttribute('data-js-focus-visible', '');\n }\n }\n\n // It is important to wrap all references to global window and document in\n // these checks to support server-side rendering use cases\n // @see https://github.com/WICG/focus-visible/issues/199\n if (typeof window !== 'undefined' && typeof document !== 'undefined') {\n // Make the polyfill helper globally available. This can be used as a signal\n // to interested libraries that wish to coordinate with the polyfill for e.g.,\n // applying the polyfill to a shadow root:\n window.applyFocusVisiblePolyfill = applyFocusVisiblePolyfill;\n\n // Notify interested libraries of the polyfill's presence, in case the\n // polyfill was loaded lazily:\n var event;\n\n try {\n event = new CustomEvent('focus-visible-polyfill-ready');\n } catch (error) {\n // IE11 does not support using CustomEvent as a constructor directly:\n event = document.createEvent('CustomEvent');\n event.initCustomEvent('focus-visible-polyfill-ready', false, false, {});\n }\n\n window.dispatchEvent(event);\n }\n\n if (typeof document !== 'undefined') {\n // Apply the polyfill to the global document, so that no JavaScript\n // coordination is required to use the polyfill in the top-level document:\n applyFocusVisiblePolyfill(document);\n }\n\n})));\n", "/*!\n * clipboard.js v2.0.11\n * https://clipboardjs.com/\n *\n * Licensed MIT \u00A9 Zeno Rocha\n */\n(function webpackUniversalModuleDefinition(root, factory) {\n\tif(typeof exports === 'object' && typeof module === 'object')\n\t\tmodule.exports = factory();\n\telse if(typeof define === 'function' && define.amd)\n\t\tdefine([], factory);\n\telse if(typeof exports === 'object')\n\t\texports[\"ClipboardJS\"] = 
factory();\n\telse\n\t\troot[\"ClipboardJS\"] = factory();\n})(this, function() {\nreturn /******/ (function() { // webpackBootstrap\n/******/ \tvar __webpack_modules__ = ({\n\n/***/ 686:\n/***/ (function(__unused_webpack_module, __webpack_exports__, __webpack_require__) {\n\n\"use strict\";\n\n// EXPORTS\n__webpack_require__.d(__webpack_exports__, {\n \"default\": function() { return /* binding */ clipboard; }\n});\n\n// EXTERNAL MODULE: ./node_modules/tiny-emitter/index.js\nvar tiny_emitter = __webpack_require__(279);\nvar tiny_emitter_default = /*#__PURE__*/__webpack_require__.n(tiny_emitter);\n// EXTERNAL MODULE: ./node_modules/good-listener/src/listen.js\nvar listen = __webpack_require__(370);\nvar listen_default = /*#__PURE__*/__webpack_require__.n(listen);\n// EXTERNAL MODULE: ./node_modules/select/src/select.js\nvar src_select = __webpack_require__(817);\nvar select_default = /*#__PURE__*/__webpack_require__.n(src_select);\n;// CONCATENATED MODULE: ./src/common/command.js\n/**\n * Executes a given operation type.\n * @param {String} type\n * @return {Boolean}\n */\nfunction command(type) {\n try {\n return document.execCommand(type);\n } catch (err) {\n return false;\n }\n}\n;// CONCATENATED MODULE: ./src/actions/cut.js\n\n\n/**\n * Cut action wrapper.\n * @param {String|HTMLElement} target\n * @return {String}\n */\n\nvar ClipboardActionCut = function ClipboardActionCut(target) {\n var selectedText = select_default()(target);\n command('cut');\n return selectedText;\n};\n\n/* harmony default export */ var actions_cut = (ClipboardActionCut);\n;// CONCATENATED MODULE: ./src/common/create-fake-element.js\n/**\n * Creates a fake textarea element with a value.\n * @param {String} value\n * @return {HTMLElement}\n */\nfunction createFakeElement(value) {\n var isRTL = document.documentElement.getAttribute('dir') === 'rtl';\n var fakeElement = document.createElement('textarea'); // Prevent zooming on iOS\n\n fakeElement.style.fontSize = '12pt'; // Reset box 
model\n\n fakeElement.style.border = '0';\n fakeElement.style.padding = '0';\n fakeElement.style.margin = '0'; // Move element out of screen horizontally\n\n fakeElement.style.position = 'absolute';\n fakeElement.style[isRTL ? 'right' : 'left'] = '-9999px'; // Move element to the same position vertically\n\n var yPosition = window.pageYOffset || document.documentElement.scrollTop;\n fakeElement.style.top = \"\".concat(yPosition, \"px\");\n fakeElement.setAttribute('readonly', '');\n fakeElement.value = value;\n return fakeElement;\n}\n;// CONCATENATED MODULE: ./src/actions/copy.js\n\n\n\n/**\n * Create fake copy action wrapper using a fake element.\n * @param {String} target\n * @param {Object} options\n * @return {String}\n */\n\nvar fakeCopyAction = function fakeCopyAction(value, options) {\n var fakeElement = createFakeElement(value);\n options.container.appendChild(fakeElement);\n var selectedText = select_default()(fakeElement);\n command('copy');\n fakeElement.remove();\n return selectedText;\n};\n/**\n * Copy action wrapper.\n * @param {String|HTMLElement} target\n * @param {Object} options\n * @return {String}\n */\n\n\nvar ClipboardActionCopy = function ClipboardActionCopy(target) {\n var options = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : {\n container: document.body\n };\n var selectedText = '';\n\n if (typeof target === 'string') {\n selectedText = fakeCopyAction(target, options);\n } else if (target instanceof HTMLInputElement && !['text', 'search', 'url', 'tel', 'password'].includes(target === null || target === void 0 ? void 0 : target.type)) {\n // If input type doesn't support `setSelectionRange`. Simulate it. 
https://developer.mozilla.org/en-US/docs/Web/API/HTMLInputElement/setSelectionRange\n selectedText = fakeCopyAction(target.value, options);\n } else {\n selectedText = select_default()(target);\n command('copy');\n }\n\n return selectedText;\n};\n\n/* harmony default export */ var actions_copy = (ClipboardActionCopy);\n;// CONCATENATED MODULE: ./src/actions/default.js\nfunction _typeof(obj) { \"@babel/helpers - typeof\"; if (typeof Symbol === \"function\" && typeof Symbol.iterator === \"symbol\") { _typeof = function _typeof(obj) { return typeof obj; }; } else { _typeof = function _typeof(obj) { return obj && typeof Symbol === \"function\" && obj.constructor === Symbol && obj !== Symbol.prototype ? \"symbol\" : typeof obj; }; } return _typeof(obj); }\n\n\n\n/**\n * Inner function which performs selection from either `text` or `target`\n * properties and then executes copy or cut operations.\n * @param {Object} options\n */\n\nvar ClipboardActionDefault = function ClipboardActionDefault() {\n var options = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : {};\n // Defines base properties passed from constructor.\n var _options$action = options.action,\n action = _options$action === void 0 ? 'copy' : _options$action,\n container = options.container,\n target = options.target,\n text = options.text; // Sets the `action` to be performed which can be either 'copy' or 'cut'.\n\n if (action !== 'copy' && action !== 'cut') {\n throw new Error('Invalid \"action\" value, use either \"copy\" or \"cut\"');\n } // Sets the `target` property using an element that will be have its content copied.\n\n\n if (target !== undefined) {\n if (target && _typeof(target) === 'object' && target.nodeType === 1) {\n if (action === 'copy' && target.hasAttribute('disabled')) {\n throw new Error('Invalid \"target\" attribute. 
Please use \"readonly\" instead of \"disabled\" attribute');\n }\n\n if (action === 'cut' && (target.hasAttribute('readonly') || target.hasAttribute('disabled'))) {\n throw new Error('Invalid \"target\" attribute. You can\\'t cut text from elements with \"readonly\" or \"disabled\" attributes');\n }\n } else {\n throw new Error('Invalid \"target\" value, use a valid Element');\n }\n } // Define selection strategy based on `text` property.\n\n\n if (text) {\n return actions_copy(text, {\n container: container\n });\n } // Defines which selection strategy based on `target` property.\n\n\n if (target) {\n return action === 'cut' ? actions_cut(target) : actions_copy(target, {\n container: container\n });\n }\n};\n\n/* harmony default export */ var actions_default = (ClipboardActionDefault);\n;// CONCATENATED MODULE: ./src/clipboard.js\nfunction clipboard_typeof(obj) { \"@babel/helpers - typeof\"; if (typeof Symbol === \"function\" && typeof Symbol.iterator === \"symbol\") { clipboard_typeof = function _typeof(obj) { return typeof obj; }; } else { clipboard_typeof = function _typeof(obj) { return obj && typeof Symbol === \"function\" && obj.constructor === Symbol && obj !== Symbol.prototype ? 
\"symbol\" : typeof obj; }; } return clipboard_typeof(obj); }\n\nfunction _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError(\"Cannot call a class as a function\"); } }\n\nfunction _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if (\"value\" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }\n\nfunction _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }\n\nfunction _inherits(subClass, superClass) { if (typeof superClass !== \"function\" && superClass !== null) { throw new TypeError(\"Super expression must either be null or a function\"); } subClass.prototype = Object.create(superClass && superClass.prototype, { constructor: { value: subClass, writable: true, configurable: true } }); if (superClass) _setPrototypeOf(subClass, superClass); }\n\nfunction _setPrototypeOf(o, p) { _setPrototypeOf = Object.setPrototypeOf || function _setPrototypeOf(o, p) { o.__proto__ = p; return o; }; return _setPrototypeOf(o, p); }\n\nfunction _createSuper(Derived) { var hasNativeReflectConstruct = _isNativeReflectConstruct(); return function _createSuperInternal() { var Super = _getPrototypeOf(Derived), result; if (hasNativeReflectConstruct) { var NewTarget = _getPrototypeOf(this).constructor; result = Reflect.construct(Super, arguments, NewTarget); } else { result = Super.apply(this, arguments); } return _possibleConstructorReturn(this, result); }; }\n\nfunction _possibleConstructorReturn(self, call) { if (call && (clipboard_typeof(call) === \"object\" || typeof call === \"function\")) { return call; } return _assertThisInitialized(self); }\n\nfunction _assertThisInitialized(self) { if 
(self === void 0) { throw new ReferenceError(\"this hasn't been initialised - super() hasn't been called\"); } return self; }\n\nfunction _isNativeReflectConstruct() { if (typeof Reflect === \"undefined\" || !Reflect.construct) return false; if (Reflect.construct.sham) return false; if (typeof Proxy === \"function\") return true; try { Date.prototype.toString.call(Reflect.construct(Date, [], function () {})); return true; } catch (e) { return false; } }\n\nfunction _getPrototypeOf(o) { _getPrototypeOf = Object.setPrototypeOf ? Object.getPrototypeOf : function _getPrototypeOf(o) { return o.__proto__ || Object.getPrototypeOf(o); }; return _getPrototypeOf(o); }\n\n\n\n\n\n\n/**\n * Helper function to retrieve attribute value.\n * @param {String} suffix\n * @param {Element} element\n */\n\nfunction getAttributeValue(suffix, element) {\n var attribute = \"data-clipboard-\".concat(suffix);\n\n if (!element.hasAttribute(attribute)) {\n return;\n }\n\n return element.getAttribute(attribute);\n}\n/**\n * Base class which takes one or more elements, adds event listeners to them,\n * and instantiates a new `ClipboardAction` on each click.\n */\n\n\nvar Clipboard = /*#__PURE__*/function (_Emitter) {\n _inherits(Clipboard, _Emitter);\n\n var _super = _createSuper(Clipboard);\n\n /**\n * @param {String|HTMLElement|HTMLCollection|NodeList} trigger\n * @param {Object} options\n */\n function Clipboard(trigger, options) {\n var _this;\n\n _classCallCheck(this, Clipboard);\n\n _this = _super.call(this);\n\n _this.resolveOptions(options);\n\n _this.listenClick(trigger);\n\n return _this;\n }\n /**\n * Defines if attributes would be resolved using internal setter functions\n * or custom functions that were passed in the constructor.\n * @param {Object} options\n */\n\n\n _createClass(Clipboard, [{\n key: \"resolveOptions\",\n value: function resolveOptions() {\n var options = arguments.length > 0 && arguments[0] !== undefined ? 
arguments[0] : {};\n this.action = typeof options.action === 'function' ? options.action : this.defaultAction;\n this.target = typeof options.target === 'function' ? options.target : this.defaultTarget;\n this.text = typeof options.text === 'function' ? options.text : this.defaultText;\n this.container = clipboard_typeof(options.container) === 'object' ? options.container : document.body;\n }\n /**\n * Adds a click event listener to the passed trigger.\n * @param {String|HTMLElement|HTMLCollection|NodeList} trigger\n */\n\n }, {\n key: \"listenClick\",\n value: function listenClick(trigger) {\n var _this2 = this;\n\n this.listener = listen_default()(trigger, 'click', function (e) {\n return _this2.onClick(e);\n });\n }\n /**\n * Defines a new `ClipboardAction` on each click event.\n * @param {Event} e\n */\n\n }, {\n key: \"onClick\",\n value: function onClick(e) {\n var trigger = e.delegateTarget || e.currentTarget;\n var action = this.action(trigger) || 'copy';\n var text = actions_default({\n action: action,\n container: this.container,\n target: this.target(trigger),\n text: this.text(trigger)\n }); // Fires an event based on the copy operation result.\n\n this.emit(text ? 
'success' : 'error', {\n action: action,\n text: text,\n trigger: trigger,\n clearSelection: function clearSelection() {\n if (trigger) {\n trigger.focus();\n }\n\n window.getSelection().removeAllRanges();\n }\n });\n }\n /**\n * Default `action` lookup function.\n * @param {Element} trigger\n */\n\n }, {\n key: \"defaultAction\",\n value: function defaultAction(trigger) {\n return getAttributeValue('action', trigger);\n }\n /**\n * Default `target` lookup function.\n * @param {Element} trigger\n */\n\n }, {\n key: \"defaultTarget\",\n value: function defaultTarget(trigger) {\n var selector = getAttributeValue('target', trigger);\n\n if (selector) {\n return document.querySelector(selector);\n }\n }\n /**\n * Allow fire programmatically a copy action\n * @param {String|HTMLElement} target\n * @param {Object} options\n * @returns Text copied.\n */\n\n }, {\n key: \"defaultText\",\n\n /**\n * Default `text` lookup function.\n * @param {Element} trigger\n */\n value: function defaultText(trigger) {\n return getAttributeValue('text', trigger);\n }\n /**\n * Destroy lifecycle.\n */\n\n }, {\n key: \"destroy\",\n value: function destroy() {\n this.listener.destroy();\n }\n }], [{\n key: \"copy\",\n value: function copy(target) {\n var options = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : {\n container: document.body\n };\n return actions_copy(target, options);\n }\n /**\n * Allow fire programmatically a cut action\n * @param {String|HTMLElement} target\n * @returns Text cutted.\n */\n\n }, {\n key: \"cut\",\n value: function cut(target) {\n return actions_cut(target);\n }\n /**\n * Returns the support of the given action, or all actions if no action is\n * given.\n * @param {String} [action]\n */\n\n }, {\n key: \"isSupported\",\n value: function isSupported() {\n var action = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : ['copy', 'cut'];\n var actions = typeof action === 'string' ? 
[action] : action;\n var support = !!document.queryCommandSupported;\n actions.forEach(function (action) {\n support = support && !!document.queryCommandSupported(action);\n });\n return support;\n }\n }]);\n\n return Clipboard;\n}((tiny_emitter_default()));\n\n/* harmony default export */ var clipboard = (Clipboard);\n\n/***/ }),\n\n/***/ 828:\n/***/ (function(module) {\n\nvar DOCUMENT_NODE_TYPE = 9;\n\n/**\n * A polyfill for Element.matches()\n */\nif (typeof Element !== 'undefined' && !Element.prototype.matches) {\n var proto = Element.prototype;\n\n proto.matches = proto.matchesSelector ||\n proto.mozMatchesSelector ||\n proto.msMatchesSelector ||\n proto.oMatchesSelector ||\n proto.webkitMatchesSelector;\n}\n\n/**\n * Finds the closest parent that matches a selector.\n *\n * @param {Element} element\n * @param {String} selector\n * @return {Function}\n */\nfunction closest (element, selector) {\n while (element && element.nodeType !== DOCUMENT_NODE_TYPE) {\n if (typeof element.matches === 'function' &&\n element.matches(selector)) {\n return element;\n }\n element = element.parentNode;\n }\n}\n\nmodule.exports = closest;\n\n\n/***/ }),\n\n/***/ 438:\n/***/ (function(module, __unused_webpack_exports, __webpack_require__) {\n\nvar closest = __webpack_require__(828);\n\n/**\n * Delegates event to a selector.\n *\n * @param {Element} element\n * @param {String} selector\n * @param {String} type\n * @param {Function} callback\n * @param {Boolean} useCapture\n * @return {Object}\n */\nfunction _delegate(element, selector, type, callback, useCapture) {\n var listenerFn = listener.apply(this, arguments);\n\n element.addEventListener(type, listenerFn, useCapture);\n\n return {\n destroy: function() {\n element.removeEventListener(type, listenerFn, useCapture);\n }\n }\n}\n\n/**\n * Delegates event to a selector.\n *\n * @param {Element|String|Array} [elements]\n * @param {String} selector\n * @param {String} type\n * @param {Function} callback\n * @param {Boolean} 
useCapture\n * @return {Object}\n */\nfunction delegate(elements, selector, type, callback, useCapture) {\n // Handle the regular Element usage\n if (typeof elements.addEventListener === 'function') {\n return _delegate.apply(null, arguments);\n }\n\n // Handle Element-less usage, it defaults to global delegation\n if (typeof type === 'function') {\n // Use `document` as the first parameter, then apply arguments\n // This is a short way to .unshift `arguments` without running into deoptimizations\n return _delegate.bind(null, document).apply(null, arguments);\n }\n\n // Handle Selector-based usage\n if (typeof elements === 'string') {\n elements = document.querySelectorAll(elements);\n }\n\n // Handle Array-like based usage\n return Array.prototype.map.call(elements, function (element) {\n return _delegate(element, selector, type, callback, useCapture);\n });\n}\n\n/**\n * Finds closest match and invokes callback.\n *\n * @param {Element} element\n * @param {String} selector\n * @param {String} type\n * @param {Function} callback\n * @return {Function}\n */\nfunction listener(element, selector, type, callback) {\n return function(e) {\n e.delegateTarget = closest(e.target, selector);\n\n if (e.delegateTarget) {\n callback.call(element, e);\n }\n }\n}\n\nmodule.exports = delegate;\n\n\n/***/ }),\n\n/***/ 879:\n/***/ (function(__unused_webpack_module, exports) {\n\n/**\n * Check if argument is a HTML element.\n *\n * @param {Object} value\n * @return {Boolean}\n */\nexports.node = function(value) {\n return value !== undefined\n && value instanceof HTMLElement\n && value.nodeType === 1;\n};\n\n/**\n * Check if argument is a list of HTML elements.\n *\n * @param {Object} value\n * @return {Boolean}\n */\nexports.nodeList = function(value) {\n var type = Object.prototype.toString.call(value);\n\n return value !== undefined\n && (type === '[object NodeList]' || type === '[object HTMLCollection]')\n && ('length' in value)\n && (value.length === 0 || 
exports.node(value[0]));\n};\n\n/**\n * Check if argument is a string.\n *\n * @param {Object} value\n * @return {Boolean}\n */\nexports.string = function(value) {\n return typeof value === 'string'\n || value instanceof String;\n};\n\n/**\n * Check if argument is a function.\n *\n * @param {Object} value\n * @return {Boolean}\n */\nexports.fn = function(value) {\n var type = Object.prototype.toString.call(value);\n\n return type === '[object Function]';\n};\n\n\n/***/ }),\n\n/***/ 370:\n/***/ (function(module, __unused_webpack_exports, __webpack_require__) {\n\nvar is = __webpack_require__(879);\nvar delegate = __webpack_require__(438);\n\n/**\n * Validates all params and calls the right\n * listener function based on its target type.\n *\n * @param {String|HTMLElement|HTMLCollection|NodeList} target\n * @param {String} type\n * @param {Function} callback\n * @return {Object}\n */\nfunction listen(target, type, callback) {\n if (!target && !type && !callback) {\n throw new Error('Missing required arguments');\n }\n\n if (!is.string(type)) {\n throw new TypeError('Second argument must be a String');\n }\n\n if (!is.fn(callback)) {\n throw new TypeError('Third argument must be a Function');\n }\n\n if (is.node(target)) {\n return listenNode(target, type, callback);\n }\n else if (is.nodeList(target)) {\n return listenNodeList(target, type, callback);\n }\n else if (is.string(target)) {\n return listenSelector(target, type, callback);\n }\n else {\n throw new TypeError('First argument must be a String, HTMLElement, HTMLCollection, or NodeList');\n }\n}\n\n/**\n * Adds an event listener to a HTML element\n * and returns a remove listener function.\n *\n * @param {HTMLElement} node\n * @param {String} type\n * @param {Function} callback\n * @return {Object}\n */\nfunction listenNode(node, type, callback) {\n node.addEventListener(type, callback);\n\n return {\n destroy: function() {\n node.removeEventListener(type, callback);\n }\n }\n}\n\n/**\n * Add an event listener 
to a list of HTML elements\n * and returns a remove listener function.\n *\n * @param {NodeList|HTMLCollection} nodeList\n * @param {String} type\n * @param {Function} callback\n * @return {Object}\n */\nfunction listenNodeList(nodeList, type, callback) {\n Array.prototype.forEach.call(nodeList, function(node) {\n node.addEventListener(type, callback);\n });\n\n return {\n destroy: function() {\n Array.prototype.forEach.call(nodeList, function(node) {\n node.removeEventListener(type, callback);\n });\n }\n }\n}\n\n/**\n * Add an event listener to a selector\n * and returns a remove listener function.\n *\n * @param {String} selector\n * @param {String} type\n * @param {Function} callback\n * @return {Object}\n */\nfunction listenSelector(selector, type, callback) {\n return delegate(document.body, selector, type, callback);\n}\n\nmodule.exports = listen;\n\n\n/***/ }),\n\n/***/ 817:\n/***/ (function(module) {\n\nfunction select(element) {\n var selectedText;\n\n if (element.nodeName === 'SELECT') {\n element.focus();\n\n selectedText = element.value;\n }\n else if (element.nodeName === 'INPUT' || element.nodeName === 'TEXTAREA') {\n var isReadOnly = element.hasAttribute('readonly');\n\n if (!isReadOnly) {\n element.setAttribute('readonly', '');\n }\n\n element.select();\n element.setSelectionRange(0, element.value.length);\n\n if (!isReadOnly) {\n element.removeAttribute('readonly');\n }\n\n selectedText = element.value;\n }\n else {\n if (element.hasAttribute('contenteditable')) {\n element.focus();\n }\n\n var selection = window.getSelection();\n var range = document.createRange();\n\n range.selectNodeContents(element);\n selection.removeAllRanges();\n selection.addRange(range);\n\n selectedText = selection.toString();\n }\n\n return selectedText;\n}\n\nmodule.exports = select;\n\n\n/***/ }),\n\n/***/ 279:\n/***/ (function(module) {\n\nfunction E () {\n // Keep this empty so it's easier to inherit from\n // (via https://github.com/lipsmack from 
https://github.com/scottcorgan/tiny-emitter/issues/3)\n}\n\nE.prototype = {\n on: function (name, callback, ctx) {\n var e = this.e || (this.e = {});\n\n (e[name] || (e[name] = [])).push({\n fn: callback,\n ctx: ctx\n });\n\n return this;\n },\n\n once: function (name, callback, ctx) {\n var self = this;\n function listener () {\n self.off(name, listener);\n callback.apply(ctx, arguments);\n };\n\n listener._ = callback\n return this.on(name, listener, ctx);\n },\n\n emit: function (name) {\n var data = [].slice.call(arguments, 1);\n var evtArr = ((this.e || (this.e = {}))[name] || []).slice();\n var i = 0;\n var len = evtArr.length;\n\n for (i; i < len; i++) {\n evtArr[i].fn.apply(evtArr[i].ctx, data);\n }\n\n return this;\n },\n\n off: function (name, callback) {\n var e = this.e || (this.e = {});\n var evts = e[name];\n var liveEvents = [];\n\n if (evts && callback) {\n for (var i = 0, len = evts.length; i < len; i++) {\n if (evts[i].fn !== callback && evts[i].fn._ !== callback)\n liveEvents.push(evts[i]);\n }\n }\n\n // Remove event from queue to prevent memory leak\n // Suggested by https://github.com/lazd\n // Ref: https://github.com/scottcorgan/tiny-emitter/commit/c6ebfaa9bc973b33d110a84a307742b7cf94c953#commitcomment-5024910\n\n (liveEvents.length)\n ? 
e[name] = liveEvents\n : delete e[name];\n\n return this;\n }\n};\n\nmodule.exports = E;\nmodule.exports.TinyEmitter = E;\n\n\n/***/ })\n\n/******/ \t});\n/************************************************************************/\n/******/ \t// The module cache\n/******/ \tvar __webpack_module_cache__ = {};\n/******/ \t\n/******/ \t// The require function\n/******/ \tfunction __webpack_require__(moduleId) {\n/******/ \t\t// Check if module is in cache\n/******/ \t\tif(__webpack_module_cache__[moduleId]) {\n/******/ \t\t\treturn __webpack_module_cache__[moduleId].exports;\n/******/ \t\t}\n/******/ \t\t// Create a new module (and put it into the cache)\n/******/ \t\tvar module = __webpack_module_cache__[moduleId] = {\n/******/ \t\t\t// no module.id needed\n/******/ \t\t\t// no module.loaded needed\n/******/ \t\t\texports: {}\n/******/ \t\t};\n/******/ \t\n/******/ \t\t// Execute the module function\n/******/ \t\t__webpack_modules__[moduleId](module, module.exports, __webpack_require__);\n/******/ \t\n/******/ \t\t// Return the exports of the module\n/******/ \t\treturn module.exports;\n/******/ \t}\n/******/ \t\n/************************************************************************/\n/******/ \t/* webpack/runtime/compat get default export */\n/******/ \t!function() {\n/******/ \t\t// getDefaultExport function for compatibility with non-harmony modules\n/******/ \t\t__webpack_require__.n = function(module) {\n/******/ \t\t\tvar getter = module && module.__esModule ?\n/******/ \t\t\t\tfunction() { return module['default']; } :\n/******/ \t\t\t\tfunction() { return module; };\n/******/ \t\t\t__webpack_require__.d(getter, { a: getter });\n/******/ \t\t\treturn getter;\n/******/ \t\t};\n/******/ \t}();\n/******/ \t\n/******/ \t/* webpack/runtime/define property getters */\n/******/ \t!function() {\n/******/ \t\t// define getter functions for harmony exports\n/******/ \t\t__webpack_require__.d = function(exports, definition) {\n/******/ \t\t\tfor(var key in definition) 
{\n/******/ \t\t\t\tif(__webpack_require__.o(definition, key) && !__webpack_require__.o(exports, key)) {\n/******/ \t\t\t\t\tObject.defineProperty(exports, key, { enumerable: true, get: definition[key] });\n/******/ \t\t\t\t}\n/******/ \t\t\t}\n/******/ \t\t};\n/******/ \t}();\n/******/ \t\n/******/ \t/* webpack/runtime/hasOwnProperty shorthand */\n/******/ \t!function() {\n/******/ \t\t__webpack_require__.o = function(obj, prop) { return Object.prototype.hasOwnProperty.call(obj, prop); }\n/******/ \t}();\n/******/ \t\n/************************************************************************/\n/******/ \t// module exports must be returned from runtime so entry inlining is disabled\n/******/ \t// startup\n/******/ \t// Load entry module and return exports\n/******/ \treturn __webpack_require__(686);\n/******/ })()\n.default;\n});", "/*!\n * escape-html\n * Copyright(c) 2012-2013 TJ Holowaychuk\n * Copyright(c) 2015 Andreas Lubbe\n * Copyright(c) 2015 Tiancheng \"Timothy\" Gu\n * MIT Licensed\n */\n\n'use strict';\n\n/**\n * Module variables.\n * @private\n */\n\nvar matchHtmlRegExp = /[\"'&<>]/;\n\n/**\n * Module exports.\n * @public\n */\n\nmodule.exports = escapeHtml;\n\n/**\n * Escape special characters in the given string of html.\n *\n * @param {string} string The string to escape for inserting into HTML\n * @return {string}\n * @public\n */\n\nfunction escapeHtml(string) {\n var str = '' + string;\n var match = matchHtmlRegExp.exec(str);\n\n if (!match) {\n return str;\n }\n\n var escape;\n var html = '';\n var index = 0;\n var lastIndex = 0;\n\n for (index = match.index; index < str.length; index++) {\n switch (str.charCodeAt(index)) {\n case 34: // \"\n escape = '"';\n break;\n case 38: // &\n escape = '&';\n break;\n case 39: // '\n escape = ''';\n break;\n case 60: // <\n escape = '<';\n break;\n case 62: // >\n escape = '>';\n break;\n default:\n continue;\n }\n\n if (lastIndex !== index) {\n html += str.substring(lastIndex, index);\n }\n\n lastIndex = 
index + 1;\n html += escape;\n }\n\n return lastIndex !== index\n ? html + str.substring(lastIndex, index)\n : html;\n}\n", "/*\n * Copyright (c) 2016-2024 Martin Donath \n *\n * Permission is hereby granted, free of charge, to any person obtaining a copy\n * of this software and associated documentation files (the \"Software\"), to\n * deal in the Software without restriction, including without limitation the\n * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or\n * sell copies of the Software, and to permit persons to whom the Software is\n * furnished to do so, subject to the following conditions:\n *\n * The above copyright notice and this permission notice shall be included in\n * all copies or substantial portions of the Software.\n *\n * THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n * FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. IN NO EVENT SHALL THE\n * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\n * IN THE SOFTWARE.\n */\n\nimport \"focus-visible\"\n\nimport {\n EMPTY,\n NEVER,\n Observable,\n Subject,\n defer,\n delay,\n filter,\n map,\n merge,\n mergeWith,\n shareReplay,\n switchMap\n} from \"rxjs\"\n\nimport { configuration, feature } from \"./_\"\nimport {\n at,\n getActiveElement,\n getOptionalElement,\n requestJSON,\n setLocation,\n setToggle,\n watchDocument,\n watchKeyboard,\n watchLocation,\n watchLocationTarget,\n watchMedia,\n watchPrint,\n watchScript,\n watchViewport\n} from \"./browser\"\nimport {\n getComponentElement,\n getComponentElements,\n mountAnnounce,\n mountBackToTop,\n mountConsent,\n mountContent,\n mountDialog,\n mountHeader,\n mountHeaderTitle,\n mountPalette,\n mountProgress,\n mountSearch,\n 
mountSearchHiglight,\n mountSidebar,\n mountSource,\n mountTableOfContents,\n mountTabs,\n watchHeader,\n watchMain\n} from \"./components\"\nimport {\n SearchIndex,\n setupClipboardJS,\n setupInstantNavigation,\n setupVersionSelector\n} from \"./integrations\"\nimport {\n patchEllipsis,\n patchIndeterminate,\n patchScrollfix,\n patchScrolllock\n} from \"./patches\"\nimport \"./polyfills\"\n\n/* ----------------------------------------------------------------------------\n * Functions - @todo refactor\n * ------------------------------------------------------------------------- */\n\n/**\n * Fetch search index\n *\n * @returns Search index observable\n */\nfunction fetchSearchIndex(): Observable {\n if (location.protocol === \"file:\") {\n return watchScript(\n `${new URL(\"search/search_index.js\", config.base)}`\n )\n .pipe(\n // @ts-ignore - @todo fix typings\n map(() => __index),\n shareReplay(1)\n )\n } else {\n return requestJSON(\n new URL(\"search/search_index.json\", config.base)\n )\n }\n}\n\n/* ----------------------------------------------------------------------------\n * Application\n * ------------------------------------------------------------------------- */\n\n/* Yay, JavaScript is available */\ndocument.documentElement.classList.remove(\"no-js\")\ndocument.documentElement.classList.add(\"js\")\n\n/* Set up navigation observables and subjects */\nconst document$ = watchDocument()\nconst location$ = watchLocation()\nconst target$ = watchLocationTarget(location$)\nconst keyboard$ = watchKeyboard()\n\n/* Set up media observables */\nconst viewport$ = watchViewport()\nconst tablet$ = watchMedia(\"(min-width: 960px)\")\nconst screen$ = watchMedia(\"(min-width: 1220px)\")\nconst print$ = watchPrint()\n\n/* Retrieve search index, if search is enabled */\nconst config = configuration()\nconst index$ = document.forms.namedItem(\"search\")\n ? 
fetchSearchIndex()\n : NEVER\n\n/* Set up Clipboard.js integration */\nconst alert$ = new Subject()\nsetupClipboardJS({ alert$ })\n\n/* Set up progress indicator */\nconst progress$ = new Subject()\n\n/* Set up instant navigation, if enabled */\nif (feature(\"navigation.instant\"))\n setupInstantNavigation({ location$, viewport$, progress$ })\n .subscribe(document$)\n\n/* Set up version selector */\nif (config.version?.provider === \"mike\")\n setupVersionSelector({ document$ })\n\n/* Always close drawer and search on navigation */\nmerge(location$, target$)\n .pipe(\n delay(125)\n )\n .subscribe(() => {\n setToggle(\"drawer\", false)\n setToggle(\"search\", false)\n })\n\n/* Set up global keyboard handlers */\nkeyboard$\n .pipe(\n filter(({ mode }) => mode === \"global\")\n )\n .subscribe(key => {\n switch (key.type) {\n\n /* Go to previous page */\n case \"p\":\n case \",\":\n const prev = getOptionalElement(\"link[rel=prev]\")\n if (typeof prev !== \"undefined\")\n setLocation(prev)\n break\n\n /* Go to next page */\n case \"n\":\n case \".\":\n const next = getOptionalElement(\"link[rel=next]\")\n if (typeof next !== \"undefined\")\n setLocation(next)\n break\n\n /* Expand navigation, see https://bit.ly/3ZjG5io */\n case \"Enter\":\n const active = getActiveElement()\n if (active instanceof HTMLLabelElement)\n active.click()\n }\n })\n\n/* Set up patches */\npatchEllipsis({ viewport$, document$ })\npatchIndeterminate({ document$, tablet$ })\npatchScrollfix({ document$ })\npatchScrolllock({ viewport$, tablet$ })\n\n/* Set up header and main area observable */\nconst header$ = watchHeader(getComponentElement(\"header\"), { viewport$ })\nconst main$ = document$\n .pipe(\n map(() => getComponentElement(\"main\")),\n switchMap(el => watchMain(el, { viewport$, header$ })),\n shareReplay(1)\n )\n\n/* Set up control component observables */\nconst control$ = merge(\n\n /* Consent */\n ...getComponentElements(\"consent\")\n .map(el => mountConsent(el, { target$ })),\n\n 
/* Dialog */\n ...getComponentElements(\"dialog\")\n .map(el => mountDialog(el, { alert$ })),\n\n /* Header */\n ...getComponentElements(\"header\")\n .map(el => mountHeader(el, { viewport$, header$, main$ })),\n\n /* Color palette */\n ...getComponentElements(\"palette\")\n .map(el => mountPalette(el)),\n\n /* Progress bar */\n ...getComponentElements(\"progress\")\n .map(el => mountProgress(el, { progress$ })),\n\n /* Search */\n ...getComponentElements(\"search\")\n .map(el => mountSearch(el, { index$, keyboard$ })),\n\n /* Repository information */\n ...getComponentElements(\"source\")\n .map(el => mountSource(el))\n)\n\n/* Set up content component observables */\nconst content$ = defer(() => merge(\n\n /* Announcement bar */\n ...getComponentElements(\"announce\")\n .map(el => mountAnnounce(el)),\n\n /* Content */\n ...getComponentElements(\"content\")\n .map(el => mountContent(el, { viewport$, target$, print$ })),\n\n /* Search highlighting */\n ...getComponentElements(\"content\")\n .map(el => feature(\"search.highlight\")\n ? mountSearchHiglight(el, { index$, location$ })\n : EMPTY\n ),\n\n /* Header title */\n ...getComponentElements(\"header-title\")\n .map(el => mountHeaderTitle(el, { viewport$, header$ })),\n\n /* Sidebar */\n ...getComponentElements(\"sidebar\")\n .map(el => el.getAttribute(\"data-md-type\") === \"navigation\"\n ? 
at(screen$, () => mountSidebar(el, { viewport$, header$, main$ }))\n : at(tablet$, () => mountSidebar(el, { viewport$, header$, main$ }))\n ),\n\n /* Navigation tabs */\n ...getComponentElements(\"tabs\")\n .map(el => mountTabs(el, { viewport$, header$ })),\n\n /* Table of contents */\n ...getComponentElements(\"toc\")\n .map(el => mountTableOfContents(el, {\n viewport$, header$, main$, target$\n })),\n\n /* Back-to-top button */\n ...getComponentElements(\"top\")\n .map(el => mountBackToTop(el, { viewport$, header$, main$, target$ }))\n))\n\n/* Set up component observables */\nconst component$ = document$\n .pipe(\n switchMap(() => content$),\n mergeWith(control$),\n shareReplay(1)\n )\n\n/* Subscribe to all components */\ncomponent$.subscribe()\n\n/* ----------------------------------------------------------------------------\n * Exports\n * ------------------------------------------------------------------------- */\n\nwindow.document$ = document$ /* Document observable */\nwindow.location$ = location$ /* Location subject */\nwindow.target$ = target$ /* Location target observable */\nwindow.keyboard$ = keyboard$ /* Keyboard observable */\nwindow.viewport$ = viewport$ /* Viewport observable */\nwindow.tablet$ = tablet$ /* Media tablet observable */\nwindow.screen$ = screen$ /* Media screen observable */\nwindow.print$ = print$ /* Media print observable */\nwindow.alert$ = alert$ /* Alert subject */\nwindow.progress$ = progress$ /* Progress indicator subject */\nwindow.component$ = component$ /* Component observable */\n", "/*! *****************************************************************************\r\nCopyright (c) Microsoft Corporation.\r\n\r\nPermission to use, copy, modify, and/or distribute this software for any\r\npurpose with or without fee is hereby granted.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\r\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY\r\nAND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\r\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\r\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\r\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\r\nPERFORMANCE OF THIS SOFTWARE.\r\n***************************************************************************** */\r\n/* global Reflect, Promise */\r\n\r\nvar extendStatics = function(d, b) {\r\n extendStatics = Object.setPrototypeOf ||\r\n ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||\r\n function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };\r\n return extendStatics(d, b);\r\n};\r\n\r\nexport function __extends(d, b) {\r\n if (typeof b !== \"function\" && b !== null)\r\n throw new TypeError(\"Class extends value \" + String(b) + \" is not a constructor or null\");\r\n extendStatics(d, b);\r\n function __() { this.constructor = d; }\r\n d.prototype = b === null ? 
Object.create(b) : (__.prototype = b.prototype, new __());\r\n}\r\n\r\nexport var __assign = function() {\r\n __assign = Object.assign || function __assign(t) {\r\n for (var s, i = 1, n = arguments.length; i < n; i++) {\r\n s = arguments[i];\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p)) t[p] = s[p];\r\n }\r\n return t;\r\n }\r\n return __assign.apply(this, arguments);\r\n}\r\n\r\nexport function __rest(s, e) {\r\n var t = {};\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)\r\n t[p] = s[p];\r\n if (s != null && typeof Object.getOwnPropertySymbols === \"function\")\r\n for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {\r\n if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))\r\n t[p[i]] = s[p[i]];\r\n }\r\n return t;\r\n}\r\n\r\nexport function __decorate(decorators, target, key, desc) {\r\n var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;\r\n if (typeof Reflect === \"object\" && typeof Reflect.decorate === \"function\") r = Reflect.decorate(decorators, target, key, desc);\r\n else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? d(target, key, r) : d(target, key)) || r;\r\n return c > 3 && r && Object.defineProperty(target, key, r), r;\r\n}\r\n\r\nexport function __param(paramIndex, decorator) {\r\n return function (target, key) { decorator(target, key, paramIndex); }\r\n}\r\n\r\nexport function __metadata(metadataKey, metadataValue) {\r\n if (typeof Reflect === \"object\" && typeof Reflect.metadata === \"function\") return Reflect.metadata(metadataKey, metadataValue);\r\n}\r\n\r\nexport function __awaiter(thisArg, _arguments, P, generator) {\r\n function adopt(value) { return value instanceof P ? 
value : new P(function (resolve) { resolve(value); }); }\r\n return new (P || (P = Promise))(function (resolve, reject) {\r\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\r\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\r\n function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\r\n step((generator = generator.apply(thisArg, _arguments || [])).next());\r\n });\r\n}\r\n\r\nexport function __generator(thisArg, body) {\r\n var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;\r\n return g = { next: verb(0), \"throw\": verb(1), \"return\": verb(2) }, typeof Symbol === \"function\" && (g[Symbol.iterator] = function() { return this; }), g;\r\n function verb(n) { return function (v) { return step([n, v]); }; }\r\n function step(op) {\r\n if (f) throw new TypeError(\"Generator is already executing.\");\r\n while (_) try {\r\n if (f = 1, y && (t = op[0] & 2 ? y[\"return\"] : op[0] ? 
y[\"throw\"] || ((t = y[\"return\"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;\r\n if (y = 0, t) op = [op[0] & 2, t.value];\r\n switch (op[0]) {\r\n case 0: case 1: t = op; break;\r\n case 4: _.label++; return { value: op[1], done: false };\r\n case 5: _.label++; y = op[1]; op = [0]; continue;\r\n case 7: op = _.ops.pop(); _.trys.pop(); continue;\r\n default:\r\n if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }\r\n if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }\r\n if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }\r\n if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }\r\n if (t[2]) _.ops.pop();\r\n _.trys.pop(); continue;\r\n }\r\n op = body.call(thisArg, _);\r\n } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }\r\n if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };\r\n }\r\n}\r\n\r\nexport var __createBinding = Object.create ? (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\r\n}) : (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n o[k2] = m[k];\r\n});\r\n\r\nexport function __exportStar(m, o) {\r\n for (var p in m) if (p !== \"default\" && !Object.prototype.hasOwnProperty.call(o, p)) __createBinding(o, m, p);\r\n}\r\n\r\nexport function __values(o) {\r\n var s = typeof Symbol === \"function\" && Symbol.iterator, m = s && o[s], i = 0;\r\n if (m) return m.call(o);\r\n if (o && typeof o.length === \"number\") return {\r\n next: function () {\r\n if (o && i >= o.length) o = void 0;\r\n return { value: o && o[i++], done: !o };\r\n }\r\n };\r\n throw new TypeError(s ? 
\"Object is not iterable.\" : \"Symbol.iterator is not defined.\");\r\n}\r\n\r\nexport function __read(o, n) {\r\n var m = typeof Symbol === \"function\" && o[Symbol.iterator];\r\n if (!m) return o;\r\n var i = m.call(o), r, ar = [], e;\r\n try {\r\n while ((n === void 0 || n-- > 0) && !(r = i.next()).done) ar.push(r.value);\r\n }\r\n catch (error) { e = { error: error }; }\r\n finally {\r\n try {\r\n if (r && !r.done && (m = i[\"return\"])) m.call(i);\r\n }\r\n finally { if (e) throw e.error; }\r\n }\r\n return ar;\r\n}\r\n\r\n/** @deprecated */\r\nexport function __spread() {\r\n for (var ar = [], i = 0; i < arguments.length; i++)\r\n ar = ar.concat(__read(arguments[i]));\r\n return ar;\r\n}\r\n\r\n/** @deprecated */\r\nexport function __spreadArrays() {\r\n for (var s = 0, i = 0, il = arguments.length; i < il; i++) s += arguments[i].length;\r\n for (var r = Array(s), k = 0, i = 0; i < il; i++)\r\n for (var a = arguments[i], j = 0, jl = a.length; j < jl; j++, k++)\r\n r[k] = a[j];\r\n return r;\r\n}\r\n\r\nexport function __spreadArray(to, from, pack) {\r\n if (pack || arguments.length === 2) for (var i = 0, l = from.length, ar; i < l; i++) {\r\n if (ar || !(i in from)) {\r\n if (!ar) ar = Array.prototype.slice.call(from, 0, i);\r\n ar[i] = from[i];\r\n }\r\n }\r\n return to.concat(ar || Array.prototype.slice.call(from));\r\n}\r\n\r\nexport function __await(v) {\r\n return this instanceof __await ? 
(this.v = v, this) : new __await(v);\r\n}\r\n\r\nexport function __asyncGenerator(thisArg, _arguments, generator) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var g = generator.apply(thisArg, _arguments || []), i, q = [];\r\n return i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i;\r\n function verb(n) { if (g[n]) i[n] = function (v) { return new Promise(function (a, b) { q.push([n, v, a, b]) > 1 || resume(n, v); }); }; }\r\n function resume(n, v) { try { step(g[n](v)); } catch (e) { settle(q[0][3], e); } }\r\n function step(r) { r.value instanceof __await ? Promise.resolve(r.value.v).then(fulfill, reject) : settle(q[0][2], r); }\r\n function fulfill(value) { resume(\"next\", value); }\r\n function reject(value) { resume(\"throw\", value); }\r\n function settle(f, v) { if (f(v), q.shift(), q.length) resume(q[0][0], q[0][1]); }\r\n}\r\n\r\nexport function __asyncDelegator(o) {\r\n var i, p;\r\n return i = {}, verb(\"next\"), verb(\"throw\", function (e) { throw e; }), verb(\"return\"), i[Symbol.iterator] = function () { return this; }, i;\r\n function verb(n, f) { i[n] = o[n] ? function (v) { return (p = !p) ? { value: __await(o[n](v)), done: n === \"return\" } : f ? f(v) : v; } : f; }\r\n}\r\n\r\nexport function __asyncValues(o) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var m = o[Symbol.asyncIterator], i;\r\n return m ? m.call(o) : (o = typeof __values === \"function\" ? 
__values(o) : o[Symbol.iterator](), i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i);\r\n function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }\r\n function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }\r\n}\r\n\r\nexport function __makeTemplateObject(cooked, raw) {\r\n if (Object.defineProperty) { Object.defineProperty(cooked, \"raw\", { value: raw }); } else { cooked.raw = raw; }\r\n return cooked;\r\n};\r\n\r\nvar __setModuleDefault = Object.create ? (function(o, v) {\r\n Object.defineProperty(o, \"default\", { enumerable: true, value: v });\r\n}) : function(o, v) {\r\n o[\"default\"] = v;\r\n};\r\n\r\nexport function __importStar(mod) {\r\n if (mod && mod.__esModule) return mod;\r\n var result = {};\r\n if (mod != null) for (var k in mod) if (k !== \"default\" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\r\n __setModuleDefault(result, mod);\r\n return result;\r\n}\r\n\r\nexport function __importDefault(mod) {\r\n return (mod && mod.__esModule) ? mod : { default: mod };\r\n}\r\n\r\nexport function __classPrivateFieldGet(receiver, state, kind, f) {\r\n if (kind === \"a\" && !f) throw new TypeError(\"Private accessor was defined without a getter\");\r\n if (typeof state === \"function\" ? receiver !== state || !f : !state.has(receiver)) throw new TypeError(\"Cannot read private member from an object whose class did not declare it\");\r\n return kind === \"m\" ? f : kind === \"a\" ? f.call(receiver) : f ? 
f.value : state.get(receiver);\r\n}\r\n\r\nexport function __classPrivateFieldSet(receiver, state, value, kind, f) {\r\n if (kind === \"m\") throw new TypeError(\"Private method is not writable\");\r\n if (kind === \"a\" && !f) throw new TypeError(\"Private accessor was defined without a setter\");\r\n if (typeof state === \"function\" ? receiver !== state || !f : !state.has(receiver)) throw new TypeError(\"Cannot write private member to an object whose class did not declare it\");\r\n return (kind === \"a\" ? f.call(receiver, value) : f ? f.value = value : state.set(receiver, value)), value;\r\n}\r\n", "/**\n * Returns true if the object is a function.\n * @param value The value to check\n */\nexport function isFunction(value: any): value is (...args: any[]) => any {\n return typeof value === 'function';\n}\n", "/**\n * Used to create Error subclasses until the community moves away from ES5.\n *\n * This is because compiling from TypeScript down to ES5 has issues with subclassing Errors\n * as well as other built-in types: https://github.com/Microsoft/TypeScript/issues/12123\n *\n * @param createImpl A factory function to create the actual constructor implementation. The returned\n * function should be a named function that calls `_super` internally.\n */\nexport function createErrorClass(createImpl: (_super: any) => any): T {\n const _super = (instance: any) => {\n Error.call(instance);\n instance.stack = new Error().stack;\n };\n\n const ctorFunc = createImpl(_super);\n ctorFunc.prototype = Object.create(Error.prototype);\n ctorFunc.prototype.constructor = ctorFunc;\n return ctorFunc;\n}\n", "import { createErrorClass } from './createErrorClass';\n\nexport interface UnsubscriptionError extends Error {\n readonly errors: any[];\n}\n\nexport interface UnsubscriptionErrorCtor {\n /**\n * @deprecated Internal implementation detail. 
Do not construct error instances.\n * Cannot be tagged as internal: https://github.com/ReactiveX/rxjs/issues/6269\n */\n new (errors: any[]): UnsubscriptionError;\n}\n\n/**\n * An error thrown when one or more errors have occurred during the\n * `unsubscribe` of a {@link Subscription}.\n */\nexport const UnsubscriptionError: UnsubscriptionErrorCtor = createErrorClass(\n (_super) =>\n function UnsubscriptionErrorImpl(this: any, errors: (Error | string)[]) {\n _super(this);\n this.message = errors\n ? `${errors.length} errors occurred during unsubscription:\n${errors.map((err, i) => `${i + 1}) ${err.toString()}`).join('\\n ')}`\n : '';\n this.name = 'UnsubscriptionError';\n this.errors = errors;\n }\n);\n", "/**\n * Removes an item from an array, mutating it.\n * @param arr The array to remove the item from\n * @param item The item to remove\n */\nexport function arrRemove(arr: T[] | undefined | null, item: T) {\n if (arr) {\n const index = arr.indexOf(item);\n 0 <= index && arr.splice(index, 1);\n }\n}\n", "import { isFunction } from './util/isFunction';\nimport { UnsubscriptionError } from './util/UnsubscriptionError';\nimport { SubscriptionLike, TeardownLogic, Unsubscribable } from './types';\nimport { arrRemove } from './util/arrRemove';\n\n/**\n * Represents a disposable resource, such as the execution of an Observable. 
A\n * Subscription has one important method, `unsubscribe`, that takes no argument\n * and just disposes the resource held by the subscription.\n *\n * Additionally, subscriptions may be grouped together through the `add()`\n * method, which will attach a child Subscription to the current Subscription.\n * When a Subscription is unsubscribed, all its children (and its grandchildren)\n * will be unsubscribed as well.\n *\n * @class Subscription\n */\nexport class Subscription implements SubscriptionLike {\n /** @nocollapse */\n public static EMPTY = (() => {\n const empty = new Subscription();\n empty.closed = true;\n return empty;\n })();\n\n /**\n * A flag to indicate whether this Subscription has already been unsubscribed.\n */\n public closed = false;\n\n private _parentage: Subscription[] | Subscription | null = null;\n\n /**\n * The list of registered finalizers to execute upon unsubscription. Adding and removing from this\n * list occurs in the {@link #add} and {@link #remove} methods.\n */\n private _finalizers: Exclude[] | null = null;\n\n /**\n * @param initialTeardown A function executed first as part of the finalization\n * process that is kicked off when {@link #unsubscribe} is called.\n */\n constructor(private initialTeardown?: () => void) {}\n\n /**\n * Disposes the resources held by the subscription. 
May, for instance, cancel\n * an ongoing Observable execution or cancel any other type of work that\n * started when the Subscription was created.\n * @return {void}\n */\n unsubscribe(): void {\n let errors: any[] | undefined;\n\n if (!this.closed) {\n this.closed = true;\n\n // Remove this from it's parents.\n const { _parentage } = this;\n if (_parentage) {\n this._parentage = null;\n if (Array.isArray(_parentage)) {\n for (const parent of _parentage) {\n parent.remove(this);\n }\n } else {\n _parentage.remove(this);\n }\n }\n\n const { initialTeardown: initialFinalizer } = this;\n if (isFunction(initialFinalizer)) {\n try {\n initialFinalizer();\n } catch (e) {\n errors = e instanceof UnsubscriptionError ? e.errors : [e];\n }\n }\n\n const { _finalizers } = this;\n if (_finalizers) {\n this._finalizers = null;\n for (const finalizer of _finalizers) {\n try {\n execFinalizer(finalizer);\n } catch (err) {\n errors = errors ?? [];\n if (err instanceof UnsubscriptionError) {\n errors = [...errors, ...err.errors];\n } else {\n errors.push(err);\n }\n }\n }\n }\n\n if (errors) {\n throw new UnsubscriptionError(errors);\n }\n }\n }\n\n /**\n * Adds a finalizer to this subscription, so that finalization will be unsubscribed/called\n * when this subscription is unsubscribed. If this subscription is already {@link #closed},\n * because it has already been unsubscribed, then whatever finalizer is passed to it\n * will automatically be executed (unless the finalizer itself is also a closed subscription).\n *\n * Closed Subscriptions cannot be added as finalizers to any subscription. Adding a closed\n * subscription to a any subscription will result in no operation. (A noop).\n *\n * Adding a subscription to itself, or adding `null` or `undefined` will not perform any\n * operation at all. (A noop).\n *\n * `Subscription` instances that are added to this instance will automatically remove themselves\n * if they are unsubscribed. 
Functions and {@link Unsubscribable} objects that you wish to remove\n * will need to be removed manually with {@link #remove}\n *\n * @param teardown The finalization logic to add to this subscription.\n */\n add(teardown: TeardownLogic): void {\n // Only add the finalizer if it's not undefined\n // and don't add a subscription to itself.\n if (teardown && teardown !== this) {\n if (this.closed) {\n // If this subscription is already closed,\n // execute whatever finalizer is handed to it automatically.\n execFinalizer(teardown);\n } else {\n if (teardown instanceof Subscription) {\n // We don't add closed subscriptions, and we don't add the same subscription\n // twice. Subscription unsubscribe is idempotent.\n if (teardown.closed || teardown._hasParent(this)) {\n return;\n }\n teardown._addParent(this);\n }\n (this._finalizers = this._finalizers ?? []).push(teardown);\n }\n }\n }\n\n /**\n * Checks to see if a this subscription already has a particular parent.\n * This will signal that this subscription has already been added to the parent in question.\n * @param parent the parent to check for\n */\n private _hasParent(parent: Subscription) {\n const { _parentage } = this;\n return _parentage === parent || (Array.isArray(_parentage) && _parentage.includes(parent));\n }\n\n /**\n * Adds a parent to this subscription so it can be removed from the parent if it\n * unsubscribes on it's own.\n *\n * NOTE: THIS ASSUMES THAT {@link _hasParent} HAS ALREADY BEEN CHECKED.\n * @param parent The parent subscription to add\n */\n private _addParent(parent: Subscription) {\n const { _parentage } = this;\n this._parentage = Array.isArray(_parentage) ? (_parentage.push(parent), _parentage) : _parentage ? 
[_parentage, parent] : parent;\n }\n\n /**\n * Called on a child when it is removed via {@link #remove}.\n * @param parent The parent to remove\n */\n private _removeParent(parent: Subscription) {\n const { _parentage } = this;\n if (_parentage === parent) {\n this._parentage = null;\n } else if (Array.isArray(_parentage)) {\n arrRemove(_parentage, parent);\n }\n }\n\n /**\n * Removes a finalizer from this subscription that was previously added with the {@link #add} method.\n *\n * Note that `Subscription` instances, when unsubscribed, will automatically remove themselves\n * from every other `Subscription` they have been added to. This means that using the `remove` method\n * is not a common thing and should be used thoughtfully.\n *\n * If you add the same finalizer instance of a function or an unsubscribable object to a `Subscription` instance\n * more than once, you will need to call `remove` the same number of times to remove all instances.\n *\n * All finalizer instances are removed to free up memory upon unsubscription.\n *\n * @param teardown The finalizer to remove from this subscription\n */\n remove(teardown: Exclude): void {\n const { _finalizers } = this;\n _finalizers && arrRemove(_finalizers, teardown);\n\n if (teardown instanceof Subscription) {\n teardown._removeParent(this);\n }\n }\n}\n\nexport const EMPTY_SUBSCRIPTION = Subscription.EMPTY;\n\nexport function isSubscription(value: any): value is Subscription {\n return (\n value instanceof Subscription ||\n (value && 'closed' in value && isFunction(value.remove) && isFunction(value.add) && isFunction(value.unsubscribe))\n );\n}\n\nfunction execFinalizer(finalizer: Unsubscribable | (() => void)) {\n if (isFunction(finalizer)) {\n finalizer();\n } else {\n finalizer.unsubscribe();\n }\n}\n", "import { Subscriber } from './Subscriber';\nimport { ObservableNotification } from './types';\n\n/**\n * The {@link GlobalConfig} object for RxJS. 
It is used to configure things\n * like how to react on unhandled errors.\n */\nexport const config: GlobalConfig = {\n onUnhandledError: null,\n onStoppedNotification: null,\n Promise: undefined,\n useDeprecatedSynchronousErrorHandling: false,\n useDeprecatedNextContext: false,\n};\n\n/**\n * The global configuration object for RxJS, used to configure things\n * like how to react on unhandled errors. Accessible via {@link config}\n * object.\n */\nexport interface GlobalConfig {\n /**\n * A registration point for unhandled errors from RxJS. These are errors that\n * cannot were not handled by consuming code in the usual subscription path. For\n * example, if you have this configured, and you subscribe to an observable without\n * providing an error handler, errors from that subscription will end up here. This\n * will _always_ be called asynchronously on another job in the runtime. This is because\n * we do not want errors thrown in this user-configured handler to interfere with the\n * behavior of the library.\n */\n onUnhandledError: ((err: any) => void) | null;\n\n /**\n * A registration point for notifications that cannot be sent to subscribers because they\n * have completed, errored or have been explicitly unsubscribed. By default, next, complete\n * and error notifications sent to stopped subscribers are noops. However, sometimes callers\n * might want a different behavior. For example, with sources that attempt to report errors\n * to stopped subscribers, a caller can configure RxJS to throw an unhandled error instead.\n * This will _always_ be called asynchronously on another job in the runtime. 
This is because\n * we do not want errors thrown in this user-configured handler to interfere with the\n * behavior of the library.\n */\n onStoppedNotification: ((notification: ObservableNotification, subscriber: Subscriber) => void) | null;\n\n /**\n * The promise constructor used by default for {@link Observable#toPromise toPromise} and {@link Observable#forEach forEach}\n * methods.\n *\n * @deprecated As of version 8, RxJS will no longer support this sort of injection of a\n * Promise constructor. If you need a Promise implementation other than native promises,\n * please polyfill/patch Promise as you see appropriate. Will be removed in v8.\n */\n Promise?: PromiseConstructorLike;\n\n /**\n * If true, turns on synchronous error rethrowing, which is a deprecated behavior\n * in v6 and higher. This behavior enables bad patterns like wrapping a subscribe\n * call in a try/catch block. It also enables producer interference, a nasty bug\n * where a multicast can be broken for all observers by a downstream consumer with\n * an unhandled error. DO NOT USE THIS FLAG UNLESS IT'S NEEDED TO BUY TIME\n * FOR MIGRATION REASONS.\n *\n * @deprecated As of version 8, RxJS will no longer support synchronous throwing\n * of unhandled errors. All errors will be thrown on a separate call stack to prevent bad\n * behaviors described above. 
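The `GlobalConfig` hook described above can be sketched in isolation. This is an illustrative model, not RxJS's actual code: the names `libConfig` and `reportUnhandled` are hypothetical, and where the real library always defers the call to another job (via `setTimeout`), this sketch runs synchronously so the control flow is easy to follow.

```typescript
// Illustrative sketch of a nullable, library-level unhandled-error hook.
type UnhandledErrorHandler = ((err: unknown) => void) | null;

const libConfig: { onUnhandledError: UnhandledErrorHandler } = {
  onUnhandledError: null,
};

// Called when an error reaches a terminal boundary and no error handler
// was provided by the subscriber.
function reportUnhandled(err: unknown): void {
  const handler = libConfig.onUnhandledError;
  if (handler) {
    handler(err); // user-configured registration point
  } else {
    throw err; // fall through to the host's uncaught-error machinery
  }
}
```

With a handler registered, errors are routed to it; with the hook left `null`, they propagate to the runtime's uncaught-error mechanism instead.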
Will be removed in v8.\n */\n useDeprecatedSynchronousErrorHandling: boolean;\n\n /**\n * If true, enables an as-of-yet undocumented feature from v5: The ability to access\n * `unsubscribe()` via `this` context in `next` functions created in observers passed\n * to `subscribe`.\n *\n * This is being removed because the performance was severely problematic, and it could also cause\n * issues when types other than POJOs are passed to subscribe as subscribers, as they will likely have\n * their `this` context overwritten.\n *\n * @deprecated As of version 8, RxJS will no longer support altering the\n * context of next functions provided as part of an observer to Subscribe. Instead,\n * you will have access to a subscription or a signal or token that will allow you to do things like\n * unsubscribe and test closed status. Will be removed in v8.\n */\n useDeprecatedNextContext: boolean;\n}\n", "import type { TimerHandle } from './timerHandle';\ntype SetTimeoutFunction = (handler: () => void, timeout?: number, ...args: any[]) => TimerHandle;\ntype ClearTimeoutFunction = (handle: TimerHandle) => void;\n\ninterface TimeoutProvider {\n setTimeout: SetTimeoutFunction;\n clearTimeout: ClearTimeoutFunction;\n delegate:\n | {\n setTimeout: SetTimeoutFunction;\n clearTimeout: ClearTimeoutFunction;\n }\n | undefined;\n}\n\nexport const timeoutProvider: TimeoutProvider = {\n // When accessing the delegate, use the variable rather than `this` so that\n // the functions can be called without being bound to the provider.\n setTimeout(handler: () => void, timeout?: number, ...args) {\n const { delegate } = timeoutProvider;\n if (delegate?.setTimeout) {\n return delegate.setTimeout(handler, timeout, ...args);\n }\n return setTimeout(handler, timeout, ...args);\n },\n clearTimeout(handle) {\n const { delegate } = timeoutProvider;\n return (delegate?.clearTimeout || clearTimeout)(handle as any);\n },\n delegate: undefined,\n};\n", "import { config } from '../config';\nimport { 
timeoutProvider } from '../scheduler/timeoutProvider';\n\n/**\n * Handles an error on another job either with the user-configured {@link onUnhandledError},\n * or by throwing it on that new job so it can be picked up by `window.onerror`, `process.on('error')`, etc.\n *\n * This should be called whenever there is an error that is out-of-band with the subscription\n * or when an error hits a terminal boundary of the subscription and no error handler was provided.\n *\n * @param err the error to report\n */\nexport function reportUnhandledError(err: any) {\n timeoutProvider.setTimeout(() => {\n const { onUnhandledError } = config;\n if (onUnhandledError) {\n // Execute the user-configured error handler.\n onUnhandledError(err);\n } else {\n // Throw so it is picked up by the runtime's uncaught error mechanism.\n throw err;\n }\n });\n}\n", "/* tslint:disable:no-empty */\nexport function noop() { }\n", "import { CompleteNotification, NextNotification, ErrorNotification } from './types';\n\n/**\n * A completion object optimized for memory use and created to be the\n * same \"shape\" as other notifications in v8.\n * @internal\n */\nexport const COMPLETE_NOTIFICATION = (() => createNotification('C', undefined, undefined) as CompleteNotification)();\n\n/**\n * Internal use only. Creates an optimized error notification that is the same \"shape\"\n * as other notifications.\n * @internal\n */\nexport function errorNotification(error: any): ErrorNotification {\n return createNotification('E', undefined, error) as any;\n}\n\n/**\n * Internal use only. 
Creates an optimized next notification that is the same \"shape\"\n * as other notifications.\n * @internal\n */\nexport function nextNotification(value: T) {\n return createNotification('N', value, undefined) as NextNotification;\n}\n\n/**\n * Ensures that all notifications created internally have the same \"shape\" in v8.\n *\n * TODO: This is only exported to support a crazy legacy test in `groupBy`.\n * @internal\n */\nexport function createNotification(kind: 'N' | 'E' | 'C', value: any, error: any) {\n return {\n kind,\n value,\n error,\n };\n}\n", "import { config } from '../config';\n\nlet context: { errorThrown: boolean; error: any } | null = null;\n\n/**\n * Handles dealing with errors for super-gross mode. Creates a context, in which\n * any synchronously thrown errors will be passed to {@link captureError}. Which\n * will record the error such that it will be rethrown after the call back is complete.\n * TODO: Remove in v8\n * @param cb An immediately executed function.\n */\nexport function errorContext(cb: () => void) {\n if (config.useDeprecatedSynchronousErrorHandling) {\n const isRoot = !context;\n if (isRoot) {\n context = { errorThrown: false, error: null };\n }\n cb();\n if (isRoot) {\n const { errorThrown, error } = context!;\n context = null;\n if (errorThrown) {\n throw error;\n }\n }\n } else {\n // This is the general non-deprecated path for everyone that\n // isn't crazy enough to use super-gross mode (useDeprecatedSynchronousErrorHandling)\n cb();\n }\n}\n\n/**\n * Captures errors only in super-gross mode.\n * @param err the error to capture\n */\nexport function captureError(err: any) {\n if (config.useDeprecatedSynchronousErrorHandling && context) {\n context.errorThrown = true;\n context.error = err;\n }\n}\n", "import { isFunction } from './util/isFunction';\nimport { Observer, ObservableNotification } from './types';\nimport { isSubscription, Subscription } from './Subscription';\nimport { config } from './config';\nimport { 
reportUnhandledError } from './util/reportUnhandledError';\nimport { noop } from './util/noop';\nimport { nextNotification, errorNotification, COMPLETE_NOTIFICATION } from './NotificationFactories';\nimport { timeoutProvider } from './scheduler/timeoutProvider';\nimport { captureError } from './util/errorContext';\n\n/**\n * Implements the {@link Observer} interface and extends the\n * {@link Subscription} class. While the {@link Observer} is the public API for\n * consuming the values of an {@link Observable}, all Observers get converted to\n * a Subscriber, in order to provide Subscription-like capabilities such as\n * `unsubscribe`. Subscriber is a common type in RxJS, and crucial for\n * implementing operators, but it is rarely used as a public API.\n *\n * @class Subscriber\n */\nexport class Subscriber extends Subscription implements Observer {\n /**\n * A static factory for a Subscriber, given a (potentially partial) definition\n * of an Observer.\n * @param next The `next` callback of an Observer.\n * @param error The `error` callback of an\n * Observer.\n * @param complete The `complete` callback of an\n * Observer.\n * @return A Subscriber wrapping the (partially defined)\n * Observer represented by the given arguments.\n * @nocollapse\n * @deprecated Do not use. Will be removed in v8. There is no replacement for this\n * method, and there is no reason to be creating instances of `Subscriber` directly.\n * If you have a specific use case, please file an issue.\n */\n static create(next?: (x?: T) => void, error?: (e?: any) => void, complete?: () => void): Subscriber {\n return new SafeSubscriber(next, error, complete);\n }\n\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n protected isStopped: boolean = false;\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. 
*/\n protected destination: Subscriber | Observer; // this `any` is the escape hatch to erase extra type param (e.g. R)\n\n /**\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n * There is no reason to directly create an instance of Subscriber. This type is exported for typings reasons.\n */\n constructor(destination?: Subscriber | Observer) {\n super();\n if (destination) {\n this.destination = destination;\n // Automatically chain subscriptions together here.\n // if destination is a Subscription, then it is a Subscriber.\n if (isSubscription(destination)) {\n destination.add(this);\n }\n } else {\n this.destination = EMPTY_OBSERVER;\n }\n }\n\n /**\n * The {@link Observer} callback to receive notifications of type `next` from\n * the Observable, with a value. The Observable may call this method 0 or more\n * times.\n * @param {T} [value] The `next` value.\n * @return {void}\n */\n next(value?: T): void {\n if (this.isStopped) {\n handleStoppedNotification(nextNotification(value), this);\n } else {\n this._next(value!);\n }\n }\n\n /**\n * The {@link Observer} callback to receive notifications of type `error` from\n * the Observable, with an attached `Error`. Notifies the Observer that\n * the Observable has experienced an error condition.\n * @param {any} [err] The `error` exception.\n * @return {void}\n */\n error(err?: any): void {\n if (this.isStopped) {\n handleStoppedNotification(errorNotification(err), this);\n } else {\n this.isStopped = true;\n this._error(err);\n }\n }\n\n /**\n * The {@link Observer} callback to receive a valueless notification of type\n * `complete` from the Observable. 
Notifies the Observer that the Observable\n * has finished sending push-based notifications.\n * @return {void}\n */\n complete(): void {\n if (this.isStopped) {\n handleStoppedNotification(COMPLETE_NOTIFICATION, this);\n } else {\n this.isStopped = true;\n this._complete();\n }\n }\n\n unsubscribe(): void {\n if (!this.closed) {\n this.isStopped = true;\n super.unsubscribe();\n this.destination = null!;\n }\n }\n\n protected _next(value: T): void {\n this.destination.next(value);\n }\n\n protected _error(err: any): void {\n try {\n this.destination.error(err);\n } finally {\n this.unsubscribe();\n }\n }\n\n protected _complete(): void {\n try {\n this.destination.complete();\n } finally {\n this.unsubscribe();\n }\n }\n}\n\n/**\n * This bind is captured here because we want to be able to have\n * compatibility with monoid libraries that tend to use a method named\n * `bind`. In particular, a library called Monio requires this.\n */\nconst _bind = Function.prototype.bind;\n\nfunction bind<Fn extends (...args: any[]) => any>(fn: Fn, thisArg: any): Fn {\n return _bind.call(fn, thisArg);\n}\n\n/**\n * Internal optimization only, DO NOT EXPOSE.\n * @internal\n */\nclass ConsumerObserver<T> implements Observer<T> {\n constructor(private partialObserver: Partial<Observer<T>>) {}\n\n next(value: T): void {\n const { partialObserver } = this;\n if (partialObserver.next) {\n try {\n partialObserver.next(value);\n } catch (error) {\n handleUnhandledError(error);\n }\n }\n }\n\n error(err: any): void {\n const { partialObserver } = this;\n if (partialObserver.error) {\n try {\n partialObserver.error(err);\n } catch (error) {\n handleUnhandledError(error);\n }\n } else {\n handleUnhandledError(err);\n }\n }\n\n complete(): void {\n const { partialObserver } = this;\n if (partialObserver.complete) {\n try {\n partialObserver.complete();\n } catch (error) {\n handleUnhandledError(error);\n }\n }\n }\n}\n\nexport class SafeSubscriber<T> extends Subscriber<T> {\n constructor(\n observerOrNext?: Partial<Observer<T>> | ((value: T) => void) | null,\n error?: ((e?: any) => void) | null,\n complete?: (() => void) | null\n ) {\n super();\n\n let partialObserver: Partial<Observer<T>>;\n if (isFunction(observerOrNext) || !observerOrNext) {\n // The first argument is a function, not an observer. The next\n // two arguments *could* be observers, or they could be empty.\n partialObserver = {\n next: (observerOrNext ?? undefined) as (((value: T) => void) | undefined),\n error: error ?? undefined,\n complete: complete ?? undefined,\n };\n } else {\n // The first argument is a partial observer.\n let context: any;\n if (this && config.useDeprecatedNextContext) {\n // This is a deprecated path that made `this.unsubscribe()` available in\n // next handler functions passed to subscribe. This only exists behind a flag\n // now, as it is *very* slow.\n context = Object.create(observerOrNext);\n context.unsubscribe = () => this.unsubscribe();\n partialObserver = {\n next: observerOrNext.next && bind(observerOrNext.next, context),\n error: observerOrNext.error && bind(observerOrNext.error, context),\n complete: observerOrNext.complete && bind(observerOrNext.complete, context),\n };\n } else {\n // The \"normal\" path. 
Just use the partial observer directly.\n partialObserver = observerOrNext;\n }\n }\n\n // Wrap the partial observer to ensure it's a full observer, and\n // make sure proper error handling is accounted for.\n this.destination = new ConsumerObserver(partialObserver);\n }\n}\n\nfunction handleUnhandledError(error: any) {\n if (config.useDeprecatedSynchronousErrorHandling) {\n captureError(error);\n } else {\n // Ideal path, we report this as an unhandled error,\n // which is thrown on a new call stack.\n reportUnhandledError(error);\n }\n}\n\n/**\n * An error handler used when no error handler was supplied\n * to the SafeSubscriber -- meaning no error handler was supplied\n * to the `subscribe` call on our observable.\n * @param err The error to handle\n */\nfunction defaultErrorHandler(err: any) {\n throw err;\n}\n\n/**\n * A handler for notifications that cannot be sent to a stopped subscriber.\n * @param notification The notification being sent\n * @param subscriber The stopped subscriber\n */\nfunction handleStoppedNotification(notification: ObservableNotification<any>, subscriber: Subscriber<any>) {\n const { onStoppedNotification } = config;\n onStoppedNotification && timeoutProvider.setTimeout(() => onStoppedNotification(notification, subscriber));\n}\n\n/**\n * The observer used as a stub for subscriptions where the user did not\n * pass any arguments to `subscribe`. Comes with the default error handling\n * behavior.\n */\nexport const EMPTY_OBSERVER: Readonly<Observer<any>> & { closed: true } = {\n closed: true,\n next: noop,\n error: defaultErrorHandler,\n complete: noop,\n};\n", "/**\n * Symbol.observable or a string \"@@observable\". 
Used for interop\n *\n * @deprecated We will no longer be exporting this symbol in upcoming versions of RxJS.\n * Instead polyfill and use Symbol.observable directly *or* use https://www.npmjs.com/package/symbol-observable\n */\nexport const observable: string | symbol = (() => (typeof Symbol === 'function' && Symbol.observable) || '@@observable')();\n", "/**\n * This function takes one parameter and just returns it. Simply put,\n * this is like `(x: T): T => x`.\n *\n * ## Examples\n *\n * This is useful in some cases when using things like `mergeMap`\n *\n * ```ts\n * import { interval, take, map, range, mergeMap, identity } from 'rxjs';\n *\n * const source$ = interval(1000).pipe(take(5));\n *\n * const result$ = source$.pipe(\n * map(i => range(i)),\n * mergeMap(identity) // same as mergeMap(x => x)\n * );\n *\n * result$.subscribe({\n * next: console.log\n * });\n * ```\n *\n * Or when you want to selectively apply an operator\n *\n * ```ts\n * import { interval, take, identity } from 'rxjs';\n *\n * const shouldLimit = () => Math.random() < 0.5;\n *\n * const source$ = interval(1000);\n *\n * const result$ = source$.pipe(shouldLimit() ? 
take(5) : identity);\n *\n * result$.subscribe({\n * next: console.log\n * });\n * ```\n *\n * @param x Any value that is returned by this function\n * @returns The value passed as the first parameter to this function\n */\nexport function identity(x: T): T {\n return x;\n}\n", "import { identity } from './identity';\nimport { UnaryFunction } from '../types';\n\nexport function pipe(): typeof identity;\nexport function pipe(fn1: UnaryFunction): UnaryFunction;\nexport function pipe(fn1: UnaryFunction, fn2: UnaryFunction): UnaryFunction;\nexport function pipe(fn1: UnaryFunction, fn2: UnaryFunction, fn3: UnaryFunction): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction,\n fn8: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction,\n fn8: UnaryFunction,\n fn9: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction,\n fn8: UnaryFunction,\n fn9: UnaryFunction,\n ...fns: UnaryFunction[]\n): 
UnaryFunction;\n\n/**\n * pipe() can be called on one or more functions, each of which can take one argument (\"UnaryFunction\")\n * and uses it to return a value.\n * It returns a function that takes one argument, passes it to the first UnaryFunction, and then\n * passes each result on to the next UnaryFunction in turn, returning the final result.\n */\nexport function pipe(...fns: Array<UnaryFunction<any, any>>): UnaryFunction<any, any> {\n return pipeFromArray(fns);\n}\n\n/** @internal */\nexport function pipeFromArray<T, R>(fns: Array<UnaryFunction<T, R>>): UnaryFunction<T, R> {\n if (fns.length === 0) {\n return identity as UnaryFunction<any, any>;\n }\n\n if (fns.length === 1) {\n return fns[0];\n }\n\n return function piped(input: T): R {\n return fns.reduce((prev: any, fn: UnaryFunction<any, any>) => fn(prev), input as any);\n };\n}\n", "import { Operator } from './Operator';\nimport { SafeSubscriber, Subscriber } from './Subscriber';\nimport { isSubscription, Subscription } from './Subscription';\nimport { TeardownLogic, OperatorFunction, Subscribable, Observer } from './types';\nimport { observable as Symbol_observable } from './symbol/observable';\nimport { pipeFromArray } from './util/pipe';\nimport { config } from './config';\nimport { isFunction } from './util/isFunction';\nimport { errorContext } from './util/errorContext';\n\n/**\n * A representation of any set of values over any amount of time. This is the most basic building block\n * of RxJS.\n *\n * @class Observable\n */\nexport class Observable<T> implements Subscribable<T> {\n /**\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n */\n source: Observable<any> | undefined;\n\n /**\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n */\n operator: Operator<any, T> | undefined;\n\n /**\n * @constructor\n * @param {Function} subscribe the function that is called when the Observable is\n * initially subscribed to. 
This function is given a Subscriber, to which new values\n * can be `next`ed, or an `error` method can be called to raise an error, or\n * `complete` can be called to notify of a successful completion.\n */\n constructor(subscribe?: (this: Observable, subscriber: Subscriber) => TeardownLogic) {\n if (subscribe) {\n this._subscribe = subscribe;\n }\n }\n\n // HACK: Since TypeScript inherits static properties too, we have to\n // fight against TypeScript here so Subject can have a different static create signature\n /**\n * Creates a new Observable by calling the Observable constructor\n * @owner Observable\n * @method create\n * @param {Function} subscribe? the subscriber function to be passed to the Observable constructor\n * @return {Observable} a new observable\n * @nocollapse\n * @deprecated Use `new Observable()` instead. Will be removed in v8.\n */\n static create: (...args: any[]) => any = (subscribe?: (subscriber: Subscriber) => TeardownLogic) => {\n return new Observable(subscribe);\n };\n\n /**\n * Creates a new Observable, with this Observable instance as the source, and the passed\n * operator defined as the new observable's operator.\n * @method lift\n * @param operator the operator defining the operation to take on the observable\n * @return a new observable with the Operator applied\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n * If you have implemented an operator using `lift`, it is recommended that you create an\n * operator by simply returning `new Observable()` directly. 
See \"Creating new operators from\n * scratch\" section here: https://rxjs.dev/guide/operators\n */\n lift(operator?: Operator): Observable {\n const observable = new Observable();\n observable.source = this;\n observable.operator = operator;\n return observable;\n }\n\n subscribe(observerOrNext?: Partial> | ((value: T) => void)): Subscription;\n /** @deprecated Instead of passing separate callback arguments, use an observer argument. Signatures taking separate callback arguments will be removed in v8. Details: https://rxjs.dev/deprecations/subscribe-arguments */\n subscribe(next?: ((value: T) => void) | null, error?: ((error: any) => void) | null, complete?: (() => void) | null): Subscription;\n /**\n * Invokes an execution of an Observable and registers Observer handlers for notifications it will emit.\n *\n * Use it when you have all these Observables, but still nothing is happening.\n *\n * `subscribe` is not a regular operator, but a method that calls Observable's internal `subscribe` function. It\n * might be for example a function that you passed to Observable's constructor, but most of the time it is\n * a library implementation, which defines what will be emitted by an Observable, and when it be will emitted. This means\n * that calling `subscribe` is actually the moment when Observable starts its work, not when it is created, as it is often\n * the thought.\n *\n * Apart from starting the execution of an Observable, this method allows you to listen for values\n * that an Observable emits, as well as for when it completes or errors. You can achieve this in two\n * of the following ways.\n *\n * The first way is creating an object that implements {@link Observer} interface. It should have methods\n * defined by that interface, but note that it should be just a regular JavaScript object, which you can create\n * yourself in any way you want (ES6 class, classic function constructor, object literal etc.). 
In particular, do\n * not attempt to use any RxJS implementation details to create Observers - you don't need them. Remember also\n * that your object does not have to implement all methods. If you find yourself creating a method that doesn't\n * do anything, you can simply omit it. Note however, if the `error` method is not provided and an error happens,\n * it will be thrown asynchronously. Errors thrown asynchronously cannot be caught using `try`/`catch`. Instead,\n * use the {@link onUnhandledError} configuration option or use a runtime handler (like `window.onerror` or\n * `process.on('error')`) to be notified of unhandled errors. Because of this, it's recommended that you provide\n * an `error` method to avoid missing thrown errors.\n *\n * The second way is to give up on Observer object altogether and simply provide callback functions in place of its methods.\n * This means you can provide three functions as arguments to `subscribe`, where the first function is the equivalent\n * of a `next` method, the second of an `error` method and the third of a `complete` method. Just as in case of an Observer,\n * if you do not need to listen for something, you can omit a function by passing `undefined` or `null`,\n * since `subscribe` recognizes these functions by where they were placed in the function call. When it comes\n * to the `error` function, as with an Observer, if not provided, errors emitted by an Observable will be thrown asynchronously.\n *\n * You can, however, subscribe with no parameters at all. This may be the case where you're not interested in terminal events\n * and you also handled emissions internally by using operators (e.g. using `tap`).\n *\n * Whichever style of calling `subscribe` you use, in both cases it returns a Subscription object.\n * This object allows you to call `unsubscribe` on it, which in turn will stop the work that an Observable does and will clean\n * up all resources that an Observable used. 
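The subscribe/unsubscribe contract explained above can be modeled in a few lines. This is an illustrative miniature, not the real `Observable`/`Subscription` classes (which handle many more cases); the names `MiniObservable`, `Sink`, and `Teardown` are hypothetical.

```typescript
// A tiny model: subscribing starts the producer, unsubscribing stops it and
// runs teardown, and a cancelled subscription never receives `complete`.
type Teardown = () => void;

interface Sink<T> {
  readonly closed: boolean;
  next(value: T): void;
  complete(): void;
}

class MiniObservable<T> {
  constructor(private producer: (sink: Sink<T>) => Teardown | void) {}

  subscribe(observer: { next(value: T): void; complete(): void }): { unsubscribe(): void } {
    let closed = false;
    let teardown: Teardown | void;
    const sink: Sink<T> = {
      get closed() {
        return closed;
      },
      next(value: T) {
        if (!closed) observer.next(value); // drop values after close
      },
      complete() {
        if (!closed) {
          closed = true;
          observer.complete();
          if (teardown) teardown();
        }
      },
    };
    teardown = this.producer(sink);
    // If the producer completed synchronously, run the teardown it returned.
    if (closed && teardown) {
      teardown();
      teardown = undefined;
    }
    return {
      unsubscribe() {
        if (!closed) {
          closed = true; // stop the work; do NOT call observer.complete()
          if (teardown) teardown();
        }
      },
    };
  }
}
```

The key asymmetry matches the doc comment: `complete` is a signal from the producer, while `unsubscribe` is a cancellation from the consumer, and only the former invokes the `complete` callback.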
Note that cancelling a subscription will not call `complete` callback\n * provided to `subscribe` function, which is reserved for a regular completion signal that comes from an Observable.\n *\n * Remember that callbacks provided to `subscribe` are not guaranteed to be called asynchronously.\n * It is an Observable itself that decides when these functions will be called. For example {@link of}\n * by default emits all its values synchronously. Always check documentation for how given Observable\n * will behave when subscribed and if its default behavior can be modified with a `scheduler`.\n *\n * #### Examples\n *\n * Subscribe with an {@link guide/observer Observer}\n *\n * ```ts\n * import { of } from 'rxjs';\n *\n * const sumObserver = {\n * sum: 0,\n * next(value) {\n * console.log('Adding: ' + value);\n * this.sum = this.sum + value;\n * },\n * error() {\n * // We actually could just remove this method,\n * // since we do not really care about errors right now.\n * },\n * complete() {\n * console.log('Sum equals: ' + this.sum);\n * }\n * };\n *\n * of(1, 2, 3) // Synchronously emits 1, 2, 3 and then completes.\n * .subscribe(sumObserver);\n *\n * // Logs:\n * // 'Adding: 1'\n * // 'Adding: 2'\n * // 'Adding: 3'\n * // 'Sum equals: 6'\n * ```\n *\n * Subscribe with functions ({@link deprecations/subscribe-arguments deprecated})\n *\n * ```ts\n * import { of } from 'rxjs'\n *\n * let sum = 0;\n *\n * of(1, 2, 3).subscribe(\n * value => {\n * console.log('Adding: ' + value);\n * sum = sum + value;\n * },\n * undefined,\n * () => console.log('Sum equals: ' + sum)\n * );\n *\n * // Logs:\n * // 'Adding: 1'\n * // 'Adding: 2'\n * // 'Adding: 3'\n * // 'Sum equals: 6'\n * ```\n *\n * Cancel a subscription\n *\n * ```ts\n * import { interval } from 'rxjs';\n *\n * const subscription = interval(1000).subscribe({\n * next(num) {\n * console.log(num)\n * },\n * complete() {\n * // Will not be called, even when cancelling subscription.\n * console.log('completed!');\n * 
}\n * });\n *\n * setTimeout(() => {\n * subscription.unsubscribe();\n * console.log('unsubscribed!');\n * }, 2500);\n *\n * // Logs:\n * // 0 after 1s\n * // 1 after 2s\n * // 'unsubscribed!' after 2.5s\n * ```\n *\n * @param {Observer|Function} observerOrNext (optional) Either an observer with methods to be called,\n * or the first of three possible handlers, which is the handler for each value emitted from the subscribed\n * Observable.\n * @param {Function} error (optional) A handler for a terminal event resulting from an error. If no error handler is provided,\n * the error will be thrown asynchronously as unhandled.\n * @param {Function} complete (optional) A handler for a terminal event resulting from successful completion.\n * @return {Subscription} a subscription reference to the registered handlers\n * @method subscribe\n */\n subscribe(\n observerOrNext?: Partial<Observer<T>> | ((value: T) => void) | null,\n error?: ((error: any) => void) | null,\n complete?: (() => void) | null\n ): Subscription {\n const subscriber = isSubscriber(observerOrNext) ? observerOrNext : new SafeSubscriber(observerOrNext, error, complete);\n\n errorContext(() => {\n const { operator, source } = this;\n subscriber.add(\n operator\n ? // We're dealing with a subscription in the\n // operator chain to one of our lifted operators.\n operator.call(subscriber, source)\n : source\n ? // If `source` has a value, but `operator` does not, something that\n // had intimate knowledge of our API, like our `Subject`, must have\n // set it. 
We're going to just call `_subscribe` directly.\n this._subscribe(subscriber)\n : // In all other cases, we're likely wrapping a user-provided initializer\n // function, so we need to catch errors and handle them appropriately.\n this._trySubscribe(subscriber)\n );\n });\n\n return subscriber;\n }\n\n /** @internal */\n protected _trySubscribe(sink: Subscriber): TeardownLogic {\n try {\n return this._subscribe(sink);\n } catch (err) {\n // We don't need to return anything in this case,\n // because it's just going to try to `add()` to a subscription\n // above.\n sink.error(err);\n }\n }\n\n /**\n * Used as a NON-CANCELLABLE means of subscribing to an observable, for use with\n * APIs that expect promises, like `async/await`. You cannot unsubscribe from this.\n *\n * **WARNING**: Only use this with observables you *know* will complete. If the source\n * observable does not complete, you will end up with a promise that is hung up, and\n * potentially all of the state of an async function hanging out in memory. 
 * To avoid
 * this situation, look into adding something like {@link timeout}, {@link take},
 * {@link takeWhile}, or {@link takeUntil} amongst others.
 *
 * #### Example
 *
 * ```ts
 * import { interval, take } from 'rxjs';
 *
 * const source$ = interval(1000).pipe(take(4));
 *
 * async function getTotal() {
 *   let total = 0;
 *
 *   await source$.forEach(value => {
 *     total += value;
 *     console.log('observable -> ' + value);
 *   });
 *
 *   return total;
 * }
 *
 * getTotal().then(
 *   total => console.log('Total: ' + total)
 * );
 *
 * // Expected:
 * // 'observable -> 0'
 * // 'observable -> 1'
 * // 'observable -> 2'
 * // 'observable -> 3'
 * // 'Total: 6'
 * ```
 *
 * @param next a handler for each value emitted by the observable
 * @return a promise that either resolves on observable completion or
 * rejects with the handled error
 */
forEach(next: (value: T) => void): Promise<void>;

/**
 * @param next a handler for each value emitted by the observable
 * @param promiseCtor a constructor function used to instantiate the Promise
 * @return a promise that either resolves on observable completion or
 * rejects with the handled error
 * @deprecated Passing a Promise constructor will no longer be available
 * in upcoming versions of RxJS. This is because it adds weight to the library, for very
 * little benefit. If you need this functionality, it is recommended that you either
 * polyfill Promise, or you create an adapter to convert the returned native promise
 * to whatever promise implementation you wanted. Will be removed in v8.
 */
forEach(next: (value: T) => void, promiseCtor: PromiseConstructorLike): Promise<void>;

forEach(next: (value: T) => void, promiseCtor?: PromiseConstructorLike): Promise<void> {
  promiseCtor = getPromiseCtor(promiseCtor);

  return new promiseCtor<void>((resolve, reject) => {
    const subscriber = new SafeSubscriber<T>({
      next: (value) => {
        try {
          next(value);
        } catch (err) {
          reject(err);
          subscriber.unsubscribe();
        }
      },
      error: reject,
      complete: resolve,
    });
    this.subscribe(subscriber);
  }) as Promise<void>;
}

/** @internal */
protected _subscribe(subscriber: Subscriber<any>): TeardownLogic {
  return this.source?.subscribe(subscriber);
}

/**
 * An interop point defined by the es7-observable spec https://github.com/zenparsing/es-observable
 * @method Symbol.observable
 * @return {Observable} this instance of the observable
 */
[Symbol_observable]() {
  return this;
}

/* tslint:disable:max-line-length */
pipe(): Observable<T>;
pipe<A>(op1: OperatorFunction<T, A>): Observable<A>;
pipe<A, B>(op1: OperatorFunction<T, A>, op2: OperatorFunction<A, B>): Observable<B>;
pipe<A, B, C>(op1: OperatorFunction<T, A>, op2: OperatorFunction<A, B>, op3: OperatorFunction<B, C>): Observable<C>;
pipe<A, B, C, D>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>
): Observable<D>;
pipe<A, B, C, D, E>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>,
  op5: OperatorFunction<D, E>
): Observable<E>;
pipe<A, B, C, D, E, F>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>,
  op5: OperatorFunction<D, E>,
  op6: OperatorFunction<E, F>
): Observable<F>;
pipe<A, B, C, D, E, F, G>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>,
  op5: OperatorFunction<D, E>,
  op6: OperatorFunction<E, F>,
  op7: OperatorFunction<F, G>
): Observable<G>;
pipe<A, B, C, D, E, F, G, H>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>,
  op5: OperatorFunction<D, E>,
  op6: OperatorFunction<E, F>,
  op7: OperatorFunction<F, G>,
  op8: OperatorFunction<G, H>
): Observable<H>;
pipe<A, B, C, D, E, F, G, H, I>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>,
  op5: OperatorFunction<D, E>,
  op6: OperatorFunction<E, F>,
  op7: OperatorFunction<F, G>,
  op8: OperatorFunction<G, H>,
  op9: OperatorFunction<H, I>
): Observable<I>;
pipe<A, B, C, D, E, F, G, H, I>(
  op1: OperatorFunction<T, A>,
  op2: OperatorFunction<A, B>,
  op3: OperatorFunction<B, C>,
  op4: OperatorFunction<C, D>,
  op5: OperatorFunction<D, E>,
  op6: OperatorFunction<E, F>,
  op7: OperatorFunction<F, G>,
  op8: OperatorFunction<G, H>,
  op9: OperatorFunction<H, I>,
  ...operations: OperatorFunction<any, any>[]
): Observable<unknown>;
/* tslint:enable:max-line-length */

/**
 * Used to stitch together functional operators into a chain.
 * @method pipe
 * @return {Observable} the Observable result of all of the operators having
 * been called in the order they were passed in.
 *
 * ## Example
 *
 * ```ts
 * import { interval, filter, map, scan } from 'rxjs';
 *
 * interval(1000)
 *   .pipe(
 *     filter(x => x % 2 === 0),
 *     map(x => x + x),
 *     scan((acc, x) => acc + x)
 *   )
 *   .subscribe(x => console.log(x));
 * ```
 */
pipe(...operations: OperatorFunction<any, any>[]): Observable<any> {
  return pipeFromArray(operations)(this);
}

/* tslint:disable:max-line-length */
/** @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. Details: https://rxjs.dev/deprecations/to-promise */
toPromise(): Promise<T | undefined>;
/** @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. Details: https://rxjs.dev/deprecations/to-promise */
toPromise(PromiseCtor: typeof Promise): Promise<T | undefined>;
/** @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8.
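The `pipe(...operations)` implementation above delegates to `pipeFromArray`, which composes the operator functions left to right. A minimal standalone sketch of that reduction, with plain number transforms standing in for RxJS operators (this `pipeFromArray` is a simplified reimplementation for illustration, not the library's):

```typescript
// `pipe(f, g, h)(source)` is equivalent to `h(g(f(source)))`: each operator
// receives the result of the previous one, reduced left to right.
type UnaryFunction<T, R> = (source: T) => R;

function pipeFromArray<T, R>(fns: Array<UnaryFunction<any, any>>): UnaryFunction<T, R> {
  if (fns.length === 0) {
    // Identity when no operators are given, mirroring a bare `pipe()`.
    return ((x: T) => x) as UnaryFunction<any, any>;
  }
  if (fns.length === 1) {
    return fns[0];
  }
  return (input: T): R => fns.reduce((prev: any, fn) => fn(prev), input as any);
}

// Compose plain transforms to show the ordering: addOne runs before double.
const addOne = (n: number) => n + 1;
const double = (n: number) => n * 2;
const piped = pipeFromArray<number, number>([addOne, double]);
const result = piped(3); // (3 + 1) * 2 = 8
```

The type-overload ladder in the declarations above exists only so TypeScript can track the intermediate types `A` through `I` through this same reduction.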
Details: https://rxjs.dev/deprecations/to-promise */
toPromise(PromiseCtor: PromiseConstructorLike): Promise<T | undefined>;
/* tslint:enable:max-line-length */

/**
 * Subscribe to this Observable and get a Promise resolving on
 * `complete` with the last emission (if any).
 *
 * **WARNING**: Only use this with observables you *know* will complete. If the source
 * observable does not complete, you will end up with a promise that is hung up, and
 * potentially all of the state of an async function hanging out in memory. To avoid
 * this situation, look into adding something like {@link timeout}, {@link take},
 * {@link takeWhile}, or {@link takeUntil} amongst others.
 *
 * @method toPromise
 * @param [promiseCtor] a constructor function used to instantiate
 * the Promise
 * @return A Promise that resolves with the last value emitted, or
 * rejects on an error. If there were no emissions, the Promise
 * resolves with undefined.
 * @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. Details: https://rxjs.dev/deprecations/to-promise
 */
toPromise(promiseCtor?: PromiseConstructorLike): Promise<T | undefined> {
  promiseCtor = getPromiseCtor(promiseCtor);

  return new promiseCtor((resolve, reject) => {
    let value: T | undefined;
    this.subscribe(
      (x: T) => (value = x),
      (err: any) => reject(err),
      () => resolve(value)
    );
  }) as Promise<T | undefined>;
}
}

/**
 * Decides between a passed promise constructor from consuming code,
 * a default configured promise constructor, and the native promise
 * constructor and returns it. If nothing can be found, it will throw
 * an error.
 * @param promiseCtor The optional promise constructor passed by consuming code
 */
function getPromiseCtor(promiseCtor: PromiseConstructorLike | undefined) {
  return promiseCtor ?? config.Promise ?? Promise;
}

function isObserver<T>(value: any): value is Observer<T> {
  return value && isFunction(value.next) && isFunction(value.error) && isFunction(value.complete);
}

function isSubscriber<T>(value: any): value is Subscriber<T> {
  return (value && value instanceof Subscriber) || (isObserver(value) && isSubscription(value));
}

// ----- next source file -----

import { Observable } from '../Observable';
import { Subscriber } from '../Subscriber';
import { OperatorFunction } from '../types';
import { isFunction } from './isFunction';

/**
 * Used to determine if an object is an Observable with a lift function.
 */
export function hasLift(source: any): source is { lift: InstanceType<typeof Observable>['lift'] } {
  return isFunction(source?.lift);
}

/**
 * Creates an `OperatorFunction`. Used to define operators throughout the library in a concise way.
 * @param init The logic to connect the liftedSource to the subscriber at the moment of subscription.
 */
export function operate<T, R>(
  init: (liftedSource: Observable<T>, subscriber: Subscriber<R>) => (() => void) | void
): OperatorFunction<T, R> {
  return (source: Observable<T>) => {
    if (hasLift(source)) {
      return source.lift(function (this: Subscriber<R>, liftedSource: Observable<T>) {
        try {
          return init(liftedSource, this);
        } catch (err) {
          this.error(err);
        }
      });
    }
    throw new TypeError('Unable to lift unknown Observable type');
  };
}

// ----- next source file -----

import { Subscriber } from '../Subscriber';

/**
 * Creates an instance of an `OperatorSubscriber`.
 * @param destination The downstream subscriber.
 * @param onNext Handles next values, only called if this subscriber is not stopped or closed. Any
 * error that occurs in this function is caught and sent to the `error` method of this subscriber.
 * @param onError Handles errors from the subscription, any errors that occur in this handler are caught
 * and sent to the `destination` error handler.
 * @param onComplete Handles completion notification from the subscription. Any errors that occur in
 * this handler are sent to the `destination` error handler.
 * @param onFinalize Additional teardown logic here. This will only be called on teardown if the
 * subscriber itself is not already closed. This is called after all other teardown logic is executed.
 */
export function createOperatorSubscriber<T>(
  destination: Subscriber<any>,
  onNext?: (value: T) => void,
  onComplete?: () => void,
  onError?: (err: any) => void,
  onFinalize?: () => void
): Subscriber<T> {
  return new OperatorSubscriber(destination, onNext, onComplete, onError, onFinalize);
}

/**
 * A generic helper for allowing operators to be created with a Subscriber and
 * use closures to capture necessary state from the operator function itself.
 */
export class OperatorSubscriber<T> extends Subscriber<T> {
  /**
   * Creates an instance of an `OperatorSubscriber`.
   * @param destination The downstream subscriber.
   * @param onNext Handles next values, only called if this subscriber is not stopped or closed. Any
   * error that occurs in this function is caught and sent to the `error` method of this subscriber.
   * @param onError Handles errors from the subscription, any errors that occur in this handler are caught
   * and sent to the `destination` error handler.
   * @param onComplete Handles completion notification from the subscription. Any errors that occur in
   * this handler are sent to the `destination` error handler.
   * @param onFinalize Additional finalization logic here. This will only be called on finalization if the
   * subscriber itself is not already closed.
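The try/catch wrapping these docs describe can be sketched in isolation. This toy `wrapNext` (a hypothetical helper for illustration, not part of RxJS) shows the pattern: a throwing `next` handler is converted into an `error` notification on the destination instead of becoming an unhandled synchronous exception:

```typescript
// Minimal observer shape for the sketch.
interface MiniObserver<T> {
  next(value: T): void;
  error(err: any): void;
}

// Wrap a next handler so any exception it throws is routed to
// destination.error, mirroring the guarding described above.
function wrapNext<T>(destination: MiniObserver<T>, onNext: (value: T) => void): (value: T) => void {
  return (value: T) => {
    try {
      onNext(value);
    } catch (err) {
      // Route handler errors downstream instead of throwing synchronously.
      destination.error(err);
    }
  };
}

const seen: any[] = [];
const dest: MiniObserver<number> = {
  next: (v) => seen.push(['next', v]),
  error: (e) => seen.push(['error', (e as Error).message]),
};
const guardedNext = wrapNext(dest, (v: number) => {
  if (v < 0) throw new Error('negative');
  dest.next(v);
});
guardedNext(1);  // delivered normally
guardedNext(-1); // throws inside the handler -> becomes an error notification
```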
   * This is called after all other finalization logic is executed.
   * @param shouldUnsubscribe An optional check to see if an unsubscribe call should truly unsubscribe.
   * NOTE: This currently **ONLY** exists to support the strange behavior of {@link groupBy}, where unsubscription
   * to the resulting observable does not actually disconnect from the source if there are active subscriptions
   * to any grouped observable. (DO NOT EXPOSE OR USE EXTERNALLY!!!)
   */
  constructor(
    destination: Subscriber<any>,
    onNext?: (value: T) => void,
    onComplete?: () => void,
    onError?: (err: any) => void,
    private onFinalize?: () => void,
    private shouldUnsubscribe?: () => boolean
  ) {
    // It's important - for performance reasons - that all of this class's
    // members are initialized and that they are always initialized in the same
    // order. This will ensure that all OperatorSubscriber instances have the
    // same hidden class in V8. This, in turn, will help keep the number of
    // hidden classes involved in property accesses within the base class as
    // low as possible. If the number of hidden classes involved exceeds four,
    // the property accesses will become megamorphic and performance penalties
    // will be incurred - i.e. inline caches won't be used.
    //
    // The reasons for ensuring all instances have the same hidden class are
    // further discussed in this blog post from Benedikt Meurer:
    // https://benediktmeurer.de/2018/03/23/impact-of-polymorphism-on-component-based-frameworks-like-react/
    super(destination);
    this._next = onNext
      ? function (this: OperatorSubscriber<T>, value: T) {
          try {
            onNext(value);
          } catch (err) {
            destination.error(err);
          }
        }
      : super._next;
    this._error = onError
      ? function (this: OperatorSubscriber<T>, err: any) {
          try {
            onError(err);
          } catch (err) {
            // Send any errors that occur down stream.
            destination.error(err);
          } finally {
            // Ensure finalization.
            this.unsubscribe();
          }
        }
      : super._error;
    this._complete = onComplete
      ? function (this: OperatorSubscriber<T>) {
          try {
            onComplete();
          } catch (err) {
            // Send any errors that occur down stream.
            destination.error(err);
          } finally {
            // Ensure finalization.
            this.unsubscribe();
          }
        }
      : super._complete;
  }

  unsubscribe() {
    if (!this.shouldUnsubscribe || this.shouldUnsubscribe()) {
      const { closed } = this;
      super.unsubscribe();
      // Execute additional teardown if we have any and we didn't already do so.
      !closed && this.onFinalize?.();
    }
  }
}

// ----- next source file -----

import { Subscription } from '../Subscription';

interface AnimationFrameProvider {
  schedule(callback: FrameRequestCallback): Subscription;
  requestAnimationFrame: typeof requestAnimationFrame;
  cancelAnimationFrame: typeof cancelAnimationFrame;
  delegate:
    | {
        requestAnimationFrame: typeof requestAnimationFrame;
        cancelAnimationFrame: typeof cancelAnimationFrame;
      }
    | undefined;
}

export const animationFrameProvider: AnimationFrameProvider = {
  // When accessing the delegate, use the variable rather than `this` so that
  // the functions can be called without being bound to the provider.
  schedule(callback) {
    let request = requestAnimationFrame;
    let cancel: typeof cancelAnimationFrame | undefined = cancelAnimationFrame;
    const { delegate } = animationFrameProvider;
    if (delegate) {
      request = delegate.requestAnimationFrame;
      cancel = delegate.cancelAnimationFrame;
    }
    const handle = request((timestamp) => {
      // Clear the cancel function. The request has been fulfilled, so
      // attempting to cancel the request upon unsubscription would be
      // pointless.
      cancel = undefined;
      callback(timestamp);
    });
    return new Subscription(() => cancel?.(handle));
  },
  requestAnimationFrame(...args) {
    const { delegate } = animationFrameProvider;
    return (delegate?.requestAnimationFrame || requestAnimationFrame)(...args);
  },
  cancelAnimationFrame(...args) {
    const { delegate } = animationFrameProvider;
    return (delegate?.cancelAnimationFrame || cancelAnimationFrame)(...args);
  },
  delegate: undefined,
};

// ----- next source file -----

import { createErrorClass } from './createErrorClass';

export interface ObjectUnsubscribedError extends Error {}

export interface ObjectUnsubscribedErrorCtor {
  /**
   * @deprecated Internal implementation detail. Do not construct error instances.
   * Cannot be tagged as internal: https://github.com/ReactiveX/rxjs/issues/6269
   */
  new (): ObjectUnsubscribedError;
}

/**
 * An error thrown when an action is invalid because the object has been
 * unsubscribed.
 *
 * @see {@link Subject}
 * @see {@link BehaviorSubject}
 *
 * @class ObjectUnsubscribedError
 */
export const ObjectUnsubscribedError: ObjectUnsubscribedErrorCtor = createErrorClass(
  (_super) =>
    function ObjectUnsubscribedErrorImpl(this: any) {
      _super(this);
      this.name = 'ObjectUnsubscribedError';
      this.message = 'object unsubscribed';
    }
);

// ----- next source file -----

import { Operator } from './Operator';
import { Observable } from './Observable';
import { Subscriber } from './Subscriber';
import { Subscription, EMPTY_SUBSCRIPTION } from './Subscription';
import { Observer, SubscriptionLike, TeardownLogic } from './types';
import { ObjectUnsubscribedError } from './util/ObjectUnsubscribedError';
import { arrRemove } from './util/arrRemove';
import { errorContext } from './util/errorContext';

/**
 * A Subject is a special type of Observable that allows values to be
 * multicasted to many Observers.
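As a rough illustration of the multicasting just described, a toy subject (a simplified stand-in for the real `Subject` implementation, written for this sketch) fans each `next` value out to every registered observer, iterating over a snapshot of the observer list so that reentrant unsubscription during an emission cannot disturb the loop:

```typescript
type Listener<T> = (value: T) => void;

class MiniSubject<T> {
  private observers: Listener<T>[] = [];

  // Register an observer; returns an unsubscribe function.
  subscribe(fn: Listener<T>): () => void {
    this.observers.push(fn);
    return () => {
      const i = this.observers.indexOf(fn);
      if (i >= 0) this.observers.splice(i, 1);
    };
  }

  next(value: T): void {
    // Snapshot so reentrant subscribe/unsubscribe cannot skip observers
    // mid-iteration (the real Subject caches this as `currentObservers`).
    for (const fn of Array.from(this.observers)) {
      fn(value);
    }
  }
}

const log: string[] = [];
const subject = new MiniSubject<number>();
subject.subscribe((v) => log.push('a:' + v));
const unsubB = subject.subscribe((v) => log.push('b:' + v));
subject.next(1); // both observers see 1
unsubB();
subject.next(2); // only 'a' remains
```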
 * Subjects are like EventEmitters.
 *
 * Every Subject is an Observable and an Observer. You can subscribe to a
 * Subject, and you can call next to feed values as well as error and complete.
 */
export class Subject<T> extends Observable<T> implements SubscriptionLike {
  closed = false;

  private currentObservers: Observer<T>[] | null = null;

  /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */
  observers: Observer<T>[] = [];
  /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */
  isStopped = false;
  /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */
  hasError = false;
  /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */
  thrownError: any = null;

  /**
   * Creates a "subject" by basically gluing an observer to an observable.
   *
   * @nocollapse
   * @deprecated Recommended you do not use. Will be removed at some point in the future. Plans for replacement still under discussion.
   */
  static create: (...args: any[]) => any = <T>(destination: Observer<T>, source: Observable<T>): AnonymousSubject<T> => {
    return new AnonymousSubject<T>(destination, source);
  };

  constructor() {
    // NOTE: This must be here to obscure Observable's constructor.
    super();
  }

  /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */
  lift<R>(operator: Operator<T, R>): Observable<R> {
    const subject = new AnonymousSubject(this, this);
    subject.operator = operator as any;
    return subject as any;
  }

  /** @internal */
  protected _throwIfClosed() {
    if (this.closed) {
      throw new ObjectUnsubscribedError();
    }
  }

  next(value: T) {
    errorContext(() => {
      this._throwIfClosed();
      if (!this.isStopped) {
        if (!this.currentObservers) {
          this.currentObservers = Array.from(this.observers);
        }
        for (const observer of this.currentObservers) {
          observer.next(value);
        }
      }
    });
  }

  error(err: any) {
    errorContext(() => {
      this._throwIfClosed();
      if (!this.isStopped) {
        this.hasError = this.isStopped = true;
        this.thrownError = err;
        const { observers } = this;
        while (observers.length) {
          observers.shift()!.error(err);
        }
      }
    });
  }

  complete() {
    errorContext(() => {
      this._throwIfClosed();
      if (!this.isStopped) {
        this.isStopped = true;
        const { observers } = this;
        while (observers.length) {
          observers.shift()!.complete();
        }
      }
    });
  }

  unsubscribe() {
    this.isStopped = this.closed = true;
    this.observers = this.currentObservers = null!;
  }

  get observed() {
    return this.observers?.length > 0;
  }

  /** @internal */
  protected _trySubscribe(subscriber: Subscriber<T>): TeardownLogic {
    this._throwIfClosed();
    return super._trySubscribe(subscriber);
  }

  /** @internal */
  protected _subscribe(subscriber: Subscriber<T>): Subscription {
    this._throwIfClosed();
    this._checkFinalizedStatuses(subscriber);
    return this._innerSubscribe(subscriber);
  }

  /** @internal */
  protected _innerSubscribe(subscriber: Subscriber<any>) {
    const { hasError, isStopped, observers } = this;
    if (hasError || isStopped) {
      return EMPTY_SUBSCRIPTION;
    }
    this.currentObservers = null;
    observers.push(subscriber);
    return new Subscription(() => {
      this.currentObservers = null;
      arrRemove(observers, subscriber);
    });
  }

  /** @internal */
  protected _checkFinalizedStatuses(subscriber: Subscriber<any>) {
    const { hasError, thrownError, isStopped } = this;
    if (hasError) {
      subscriber.error(thrownError);
    } else if (isStopped) {
      subscriber.complete();
    }
  }

  /**
   * Creates a new Observable with this Subject as the source. You can do this
   * to create custom Observer-side logic of the Subject and conceal it from
   * code that uses the Observable.
   * @return {Observable} Observable that the Subject casts to
   */
  asObservable(): Observable<T> {
    const observable: any = new Observable<T>();
    observable.source = this;
    return observable;
  }
}

/**
 * @class AnonymousSubject
 */
export class AnonymousSubject<T> extends Subject<T> {
  constructor(
    /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */
    public destination?: Observer<T>,
    source?: Observable<T>
  ) {
    super();
    this.source = source;
  }

  next(value: T) {
    this.destination?.next?.(value);
  }

  error(err: any) {
    this.destination?.error?.(err);
  }

  complete() {
    this.destination?.complete?.();
  }

  /** @internal */
  protected _subscribe(subscriber: Subscriber<T>): Subscription {
    return this.source?.subscribe(subscriber) ?? EMPTY_SUBSCRIPTION;
  }
}

// ----- next source file -----

import { Subject } from './Subject';
import { Subscriber } from './Subscriber';
import { Subscription } from './Subscription';

/**
 * A variant of Subject that requires an initial value and emits its current
 * value whenever it is subscribed to.
 *
 * @class BehaviorSubject
 */
export class BehaviorSubject<T> extends Subject<T> {
  constructor(private _value: T) {
    super();
  }

  get value(): T {
    return this.getValue();
  }

  /** @internal */
  protected _subscribe(subscriber: Subscriber<T>): Subscription {
    const subscription = super._subscribe(subscriber);
    !subscription.closed && subscriber.next(this._value);
    return subscription;
  }

  getValue(): T {
    const { hasError, thrownError, _value } = this;
    if (hasError) {
      throw thrownError;
    }
    this._throwIfClosed();
    return _value;
  }

  next(value: T): void {
    super.next((this._value = value));
  }
}

// ----- next source file -----

import { TimestampProvider } from '../types';

interface DateTimestampProvider extends TimestampProvider {
  delegate: TimestampProvider | undefined;
}

export const dateTimestampProvider: DateTimestampProvider = {
  now() {
    // Use the variable rather than `this` so that the function can be called
    // without being bound to the provider.
    return (dateTimestampProvider.delegate || Date).now();
  },
  delegate: undefined,
};

// ----- next source file -----

import { Subject } from './Subject';
import { TimestampProvider } from './types';
import { Subscriber } from './Subscriber';
import { Subscription } from './Subscription';
import { dateTimestampProvider } from './scheduler/dateTimestampProvider';

/**
 * A variant of {@link Subject} that "replays" old values to new subscribers by emitting them when they first subscribe.
 *
 * `ReplaySubject` has an internal buffer that will store a specified number of values that it has observed. Like `Subject`,
 * `ReplaySubject` "observes" values by having them passed to its `next` method. When it observes a value, it will store that
 * value for a time determined by the configuration of the `ReplaySubject`, as passed to its constructor.
 *
 * When a new subscriber subscribes to the `ReplaySubject` instance, it will synchronously emit all values in its buffer in
 * a First-In-First-Out (FIFO) manner. The `ReplaySubject` will also complete, if it has observed completion; and it will
 * error if it has observed an error.
 *
 * There are two main configuration items to be concerned with:
 *
 * 1. `bufferSize` - This will determine how many items are stored in the buffer, defaults to infinite.
 * 2. `windowTime` - The amount of time to hold a value in the buffer before removing it from the buffer.
 *
 * Both configurations may exist simultaneously. So if you would like to buffer a maximum of 3 values, as long as the values
 * are less than 2 seconds old, you could do so with a `new ReplaySubject(3, 2000)`.
 *
 * ### Differences with BehaviorSubject
 *
 * `BehaviorSubject` is similar to `new ReplaySubject(1)`, with a couple of exceptions:
 *
 * 1. `BehaviorSubject` comes "primed" with a single value upon construction.
 * 2. `ReplaySubject` will replay values, even after observing an error, where `BehaviorSubject` will not.
 *
 * @see {@link Subject}
 * @see {@link BehaviorSubject}
 * @see {@link shareReplay}
 */
export class ReplaySubject<T> extends Subject<T> {
  private _buffer: (T | number)[] = [];
  private _infiniteTimeWindow = true;

  /**
   * @param bufferSize The size of the buffer to replay on subscription
   * @param windowTime The amount of time the buffered items will stay buffered
   * @param timestampProvider An object with a `now()` method that provides the current timestamp.
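The interleaved value/timestamp buffering that backs this behavior can be sketched independently. This simplified `trimBuffer` (a toy written for illustration, mirroring the shape of the `_trimBuffer` logic below: values and expiry timestamps interleaved in one flat array as `[v0, t0, v1, t1, ...]`) trims first by count, using `2 * bufferSize` slots, and then by age, scanning every second slot:

```typescript
// buffer holds [value, expiryTimestamp, value, expiryTimestamp, ...].
function trimBuffer(buffer: (number | string)[], bufferSize: number, now: number): void {
  // Trim by count: keep at most bufferSize value/timestamp pairs.
  const max = 2 * bufferSize;
  if (buffer.length > max) {
    buffer.splice(0, buffer.length - max);
  }
  // Trim by age: find the last expired timestamp slot, then drop
  // everything up to and including it.
  let last = 0;
  for (let i = 1; i < buffer.length && (buffer[i] as number) <= now; i += 2) {
    last = i;
  }
  if (last) buffer.splice(0, last + 1);
}

// Values 'a'..'d' buffered with expiry times; bufferSize 3, "now" = 150.
const buf: (number | string)[] = ['a', 100, 'b', 120, 'c', 200, 'd', 300];
trimBuffer(buf, 3, 150);
// Count trim drops 'a'; age trim then drops 'b' (expired at 120 <= 150).
```

Storing pairs in a single flat array avoids allocating a wrapper object per buffered value, at the cost of the stride-2 indexing seen in the subscription loop below.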
   * This is used to
   * calculate the amount of time something has been buffered.
   */
  constructor(
    private _bufferSize = Infinity,
    private _windowTime = Infinity,
    private _timestampProvider: TimestampProvider = dateTimestampProvider
  ) {
    super();
    this._infiniteTimeWindow = _windowTime === Infinity;
    this._bufferSize = Math.max(1, _bufferSize);
    this._windowTime = Math.max(1, _windowTime);
  }

  next(value: T): void {
    const { isStopped, _buffer, _infiniteTimeWindow, _timestampProvider, _windowTime } = this;
    if (!isStopped) {
      _buffer.push(value);
      !_infiniteTimeWindow && _buffer.push(_timestampProvider.now() + _windowTime);
    }
    this._trimBuffer();
    super.next(value);
  }

  /** @internal */
  protected _subscribe(subscriber: Subscriber<T>): Subscription {
    this._throwIfClosed();
    this._trimBuffer();

    const subscription = this._innerSubscribe(subscriber);

    const { _infiniteTimeWindow, _buffer } = this;
    // We use a copy here, so reentrant code does not mutate our array while we're
    // emitting it to a new subscriber.
    const copy = _buffer.slice();
    for (let i = 0; i < copy.length && !subscriber.closed; i += _infiniteTimeWindow ? 1 : 2) {
      subscriber.next(copy[i] as T);
    }

    this._checkFinalizedStatuses(subscriber);

    return subscription;
  }

  private _trimBuffer() {
    const { _bufferSize, _timestampProvider, _buffer, _infiniteTimeWindow } = this;
    // If we don't have an infinite buffer size, and we're over the length,
    // use splice to truncate the old buffer values off. Note that we have to
    // double the size for instances where we're not using an infinite time window
    // because we're storing the values and the timestamps in the same array.
    const adjustedBufferSize = (_infiniteTimeWindow ? 1 : 2) * _bufferSize;
    _bufferSize < Infinity && adjustedBufferSize < _buffer.length && _buffer.splice(0, _buffer.length - adjustedBufferSize);

    // Now, if we're not in an infinite time window, remove all values where the time is
    // older than what is allowed.
    if (!_infiniteTimeWindow) {
      const now = _timestampProvider.now();
      let last = 0;
      // Search the array for the first timestamp that isn't expired and
      // truncate the buffer up to that point.
      for (let i = 1; i < _buffer.length && (_buffer[i] as number) <= now; i += 2) {
        last = i;
      }
      last && _buffer.splice(0, last + 1);
    }
  }
}

// ----- next source file -----

import { Scheduler } from '../Scheduler';
import { Subscription } from '../Subscription';
import { SchedulerAction } from '../types';

/**
 * A unit of work to be executed in a `scheduler`. An action is typically
 * created from within a {@link SchedulerLike} and an RxJS user does not need to concern
 * themselves about creating and manipulating an Action.
 *
 * ```ts
 * class Action<T> extends Subscription {
 *   new (scheduler: Scheduler, work: (state?: T) => void);
 *   schedule(state?: T, delay: number = 0): Subscription;
 * }
 * ```
 *
 * @class Action
 */
export class Action<T> extends Subscription {
  constructor(scheduler: Scheduler, work: (this: SchedulerAction<T>, state?: T) => void) {
    super();
  }
  /**
   * Schedules this action on its parent {@link SchedulerLike} for execution. May be passed
   * some context object, `state`. May happen at some point in the future,
   * according to the `delay` parameter, if specified.
   * @param {T} [state] Some contextual data that the `work` function uses when
   * called by the Scheduler.
   * @param {number} [delay] Time to wait before executing the work, where the
   * time unit is implicit and defined by the Scheduler.
   * @return {void}
   */
  public schedule(state?: T, delay: number = 0): Subscription {
    return this;
  }
}

// ----- next source file -----

import type { TimerHandle } from './timerHandle';
type SetIntervalFunction = (handler: () => void, timeout?: number, ...args: any[]) => TimerHandle;
type ClearIntervalFunction = (handle: TimerHandle) => void;

interface IntervalProvider {
  setInterval: SetIntervalFunction;
  clearInterval: ClearIntervalFunction;
  delegate:
    | {
        setInterval: SetIntervalFunction;
        clearInterval: ClearIntervalFunction;
      }
    | undefined;
}

export const intervalProvider: IntervalProvider = {
  // When accessing the delegate, use the variable rather than `this` so that
  // the functions can be called without being bound to the provider.
  setInterval(handler: () => void, timeout?: number, ...args) {
    const { delegate } = intervalProvider;
    if (delegate?.setInterval) {
      return delegate.setInterval(handler, timeout, ...args);
    }
    return setInterval(handler, timeout, ...args);
  },
  clearInterval(handle) {
    const { delegate } = intervalProvider;
    return (delegate?.clearInterval || clearInterval)(handle as any);
  },
  delegate: undefined,
};

// ----- next source file -----

import { Action } from './Action';
import { SchedulerAction } from '../types';
import { Subscription } from '../Subscription';
import { AsyncScheduler } from './AsyncScheduler';
import { intervalProvider } from './intervalProvider';
import { arrRemove } from '../util/arrRemove';
import { TimerHandle } from './timerHandle';

export class AsyncAction<T> extends Action<T> {
  public id: TimerHandle | undefined;
  public state?: T;
  // @ts-ignore: Property has no initializer and is not definitely assigned
  public delay: number;
  protected pending: boolean = false;

  constructor(protected scheduler: AsyncScheduler, protected work: (this: SchedulerAction<T>, state?: T) => void) {
    super(scheduler, work);
  }

  public schedule(state?: T, delay: number = 0): Subscription {
    if (this.closed) {
      return this;
    }

    // Always replace the current state with the new state.
    this.state = state;

    const id = this.id;
    const scheduler = this.scheduler;

    //
    // Important implementation note:
    //
    // Actions only execute once by default, unless rescheduled from within the
    // scheduled callback. This allows us to implement single and repeat
    // actions via the same code path, without adding API surface area, as well
    // as mimic traditional recursion but across asynchronous boundaries.
    //
    // However, JS runtimes and timers distinguish between intervals achieved by
    // serial `setTimeout` calls vs. a single `setInterval` call. An interval of
    // serial `setTimeout` calls can be individually delayed, which delays
    // scheduling the next `setTimeout`, and so on. `setInterval` attempts to
    // guarantee the interval callback will be invoked more precisely to the
    // interval period, regardless of load.
    //
    // Therefore, we use `setInterval` to schedule single and repeat actions.
    // If the action reschedules itself with the same delay, the interval is not
    // canceled. If the action doesn't reschedule, or reschedules with a
    // different delay, the interval will be canceled after scheduled callback
    // execution.
    //
    if (id != null) {
      this.id = this.recycleAsyncId(scheduler, id, delay);
    }

    // Set the pending flag indicating that this action has been scheduled, or
    // has recursively rescheduled itself.
    this.pending = true;

    this.delay = delay;
    // If this action has already an async Id, don't request a new one.
    this.id = this.id ?? this.requestAsyncId(scheduler, this.id, delay);

    return this;
  }

  protected requestAsyncId(scheduler: AsyncScheduler, _id?: TimerHandle, delay: number = 0): TimerHandle {
    return intervalProvider.setInterval(scheduler.flush.bind(scheduler, this), delay);
  }

  protected recycleAsyncId(_scheduler: AsyncScheduler, id?: TimerHandle, delay: number | null = 0): TimerHandle | undefined {
    // If this action is rescheduled with the same delay time, don't clear the interval id.
    if (delay != null && this.delay === delay && this.pending === false) {
      return id;
    }
    // Otherwise, if the action's delay time is different from the current delay,
    // or the action has been rescheduled before it's executed, clear the interval id.
    if (id != null) {
      intervalProvider.clearInterval(id);
    }

    return undefined;
  }

  /**
   * Immediately executes this action and the `work` it contains.
   * @return {any}
   */
  public execute(state: T, delay: number): any {
    if (this.closed) {
      return new Error('executing a cancelled action');
    }

    this.pending = false;
    const error = this._execute(state, delay);
    if (error) {
      return error;
    } else if (this.pending === false && this.id != null) {
      // Dequeue if the action didn't reschedule itself. Don't call
      // unsubscribe(), because the action could reschedule later.
      // For example:
      // ```
      // scheduler.schedule(function doWork(counter) {
      //   /* ... I'm a busy worker bee ... */
      //   var originalAction = this;
      //   /* wait 100ms before rescheduling the action */
      //   setTimeout(function () {
      //     originalAction.schedule(counter + 1);
      //   }, 100);
      // }, 1000);
      // ```
      this.id = this.recycleAsyncId(this.scheduler, this.id, null);
    }
  }

  protected _execute(state: T, _delay: number): any {
    let errored: boolean = false;
    let errorValue: any;
    try {
      this.work(state);
    } catch (e) {
      errored = true;
      // HACK: Since code elsewhere is relying on the "truthiness" of the
      // return here, we can't have it return "" or 0 or false.
      // TODO: Clean this up when we refactor schedulers mid-version-8 or so.
      errorValue = e ? e : new Error('Scheduled action threw falsy error');
    }
    if (errored) {
      this.unsubscribe();
      return errorValue;
    }
  }

  unsubscribe() {
    if (!this.closed) {
      const { id, scheduler } = this;
      const { actions } = scheduler;

      this.work = this.state = this.scheduler = null!;
      this.pending = false;

      arrRemove(actions, this);
      if (id != null) {
        this.id = this.recycleAsyncId(scheduler, id, null);
      }

      this.delay = null!;
      super.unsubscribe();
    }
  }
}

// ----- next source file -----

import { Action } from './scheduler/Action';
import { Subscription } from './Subscription';
import { SchedulerLike, SchedulerAction } from './types';
import { dateTimestampProvider } from './scheduler/dateTimestampProvider';

/**
 * An execution context and a data structure to order tasks and schedule their
 * execution. Provides a notion of (potentially virtual) time, through the
 * `now()` getter method.
 *
 * Each unit of work in a Scheduler is called an `Action`.
 *
 * ```ts
 * class Scheduler {
 *   now(): number;
 *   schedule(work, delay?, state?): Subscription;
 * }
 * ```
 *
 * @class Scheduler
 * @deprecated Scheduler is an internal implementation detail of RxJS, and
 * should not be used directly. Rather, create your own class and implement
 * {@link SchedulerLike}. Will be made internal in v8.
 */
export class Scheduler implements SchedulerLike {
  public static now: () => number = dateTimestampProvider.now;

  constructor(private schedulerActionCtor: typeof Action, now: () => number = Scheduler.now) {
    this.now = now;
  }

  /**
   * A getter method that returns a number representing the current time
   * (at the time this function was called) according to the scheduler's own
   * internal clock.
   * @return {number} A number that represents the current time. May or may not
   * have a relation to wall-clock time. May or may not refer to a time unit
   * (e.g. milliseconds).
   */
  public now: () => number;

  /**
   * Schedules a function, `work`, for execution. May happen at some point in
   * the future, according to the `delay` parameter, if specified. May be passed
   * some context object, `state`, which will be passed to the `work` function.
   *
   * The given arguments will be processed and stored as an Action object in a
   * queue of actions.
   *
   * @param {function(state: ?T): ?Subscription} work A function representing a
   * task, or some unit of work to be executed by the Scheduler.
   * @param {number} [delay] Time to wait before executing the work, where the
   * time unit is implicit and defined by the Scheduler itself.
   * @param {T} [state] Some contextual data that the `work` function uses when
   * called by the Scheduler.
   * @return {Subscription} A subscription in order to be able to unsubscribe
   * the scheduled work.
   */
  public schedule<T>(work: (this: SchedulerAction<T>, state?: T) => void, delay: number = 0, state?: T): Subscription {
    return new this.schedulerActionCtor<T>(this, work).schedule(state, delay);
  }
}

// ----- next source file -----

import { Scheduler } from '../Scheduler';
import { Action } from './Action';
import { AsyncAction } from './AsyncAction';
import { TimerHandle } from './timerHandle';

export class AsyncScheduler extends Scheduler {
  public actions: Array<AsyncAction<any>> = [];
  /**
   * A flag to indicate whether the Scheduler is currently executing a batch of
   * queued actions.
   * @type {boolean}
   * @internal
   */
  public _active: boolean = false;
  /**
   * An internal ID used to track the latest asynchronous task such as those
   * coming from `setTimeout`, `setInterval`, `requestAnimationFrame`, and
   * others.
   * @type {any}
   * @internal
   */
  public _scheduled: TimerHandle | undefined;

  constructor(SchedulerAction: typeof Action, now: () => number = Scheduler.now) {
    super(SchedulerAction, now);
  }

  public flush(action: AsyncAction<any>): void {
    const { actions } = this;

    if (this._active) {
      actions.push(action);
      return;
    }

    let error: any;
    this._active = true;

    do {
      if ((error = action.execute(action.state, action.delay))) {
        break;
      }
    } while ((action = actions.shift()!)); // exhaust the scheduler queue

    this._active = false;

    if (error) {
      while ((action = actions.shift()!)) {
        action.unsubscribe();
      }
      throw error;
    }
  }
}

// ----- next source file -----

import { AsyncAction } from './AsyncAction';
import { AsyncScheduler } from './AsyncScheduler';

/**
 *
 * Async Scheduler
 *
 * Schedules tasks as if you used setTimeout(task, duration)
 *
 * `async` scheduler schedules tasks asynchronously, by putting them on the JavaScript
 * event loop queue.
It is best used to delay tasks in time or to schedule tasks repeating\n * in intervals.\n *\n * If you just want to \"defer\" task, that is to perform it right after currently\n * executing synchronous code ends (commonly achieved by `setTimeout(deferredTask, 0)`),\n * better choice will be the {@link asapScheduler} scheduler.\n *\n * ## Examples\n * Use async scheduler to delay task\n * ```ts\n * import { asyncScheduler } from 'rxjs';\n *\n * const task = () => console.log('it works!');\n *\n * asyncScheduler.schedule(task, 2000);\n *\n * // After 2 seconds logs:\n * // \"it works!\"\n * ```\n *\n * Use async scheduler to repeat task in intervals\n * ```ts\n * import { asyncScheduler } from 'rxjs';\n *\n * function task(state) {\n * console.log(state);\n * this.schedule(state + 1, 1000); // `this` references currently executing Action,\n * // which we reschedule with new state and delay\n * }\n *\n * asyncScheduler.schedule(task, 3000, 0);\n *\n * // Logs:\n * // 0 after 3s\n * // 1 after 4s\n * // 2 after 5s\n * // 3 after 6s\n * ```\n */\n\nexport const asyncScheduler = new AsyncScheduler(AsyncAction);\n\n/**\n * @deprecated Renamed to {@link asyncScheduler}. Will be removed in v8.\n */\nexport const async = asyncScheduler;\n", "import { AsyncAction } from './AsyncAction';\nimport { Subscription } from '../Subscription';\nimport { QueueScheduler } from './QueueScheduler';\nimport { SchedulerAction } from '../types';\nimport { TimerHandle } from './timerHandle';\n\nexport class QueueAction extends AsyncAction {\n constructor(protected scheduler: QueueScheduler, protected work: (this: SchedulerAction, state?: T) => void) {\n super(scheduler, work);\n }\n\n public schedule(state?: T, delay: number = 0): Subscription {\n if (delay > 0) {\n return super.schedule(state, delay);\n }\n this.delay = delay;\n this.state = state;\n this.scheduler.flush(this);\n return this;\n }\n\n public execute(state: T, delay: number): any {\n return delay > 0 || this.closed ? 
super.execute(state, delay) : this._execute(state, delay);\n }\n\n protected requestAsyncId(scheduler: QueueScheduler, id?: TimerHandle, delay: number = 0): TimerHandle {\n // If delay exists and is greater than 0, or if the delay is null (the\n // action wasn't rescheduled) but was originally scheduled as an async\n // action, then recycle as an async action.\n\n if ((delay != null && delay > 0) || (delay == null && this.delay > 0)) {\n return super.requestAsyncId(scheduler, id, delay);\n }\n\n // Otherwise flush the scheduler starting with this action.\n scheduler.flush(this);\n\n // HACK: In the past, this was returning `void`. However, `void` isn't a valid\n // `TimerHandle`, and generally the return value here isn't really used. So the\n // compromise is to return `0` which is both \"falsy\" and a valid `TimerHandle`,\n // as opposed to refactoring every other instanceo of `requestAsyncId`.\n return 0;\n }\n}\n", "import { AsyncScheduler } from './AsyncScheduler';\n\nexport class QueueScheduler extends AsyncScheduler {\n}\n", "import { QueueAction } from './QueueAction';\nimport { QueueScheduler } from './QueueScheduler';\n\n/**\n *\n * Queue Scheduler\n *\n * Put every next task on a queue, instead of executing it immediately\n *\n * `queue` scheduler, when used with delay, behaves the same as {@link asyncScheduler} scheduler.\n *\n * When used without delay, it schedules given task synchronously - executes it right when\n * it is scheduled. 
However when called recursively, that is when inside the scheduled task,\n * another task is scheduled with queue scheduler, instead of executing immediately as well,\n * that task will be put on a queue and wait for current one to finish.\n *\n * This means that when you execute task with `queue` scheduler, you are sure it will end\n * before any other task scheduled with that scheduler will start.\n *\n * ## Examples\n * Schedule recursively first, then do something\n * ```ts\n * import { queueScheduler } from 'rxjs';\n *\n * queueScheduler.schedule(() => {\n * queueScheduler.schedule(() => console.log('second')); // will not happen now, but will be put on a queue\n *\n * console.log('first');\n * });\n *\n * // Logs:\n * // \"first\"\n * // \"second\"\n * ```\n *\n * Reschedule itself recursively\n * ```ts\n * import { queueScheduler } from 'rxjs';\n *\n * queueScheduler.schedule(function(state) {\n * if (state !== 0) {\n * console.log('before', state);\n * this.schedule(state - 1); // `this` references currently executing Action,\n * // which we reschedule with new state\n * console.log('after', state);\n * }\n * }, 0, 3);\n *\n * // In scheduler that runs recursively, you would expect:\n * // \"before\", 3\n * // \"before\", 2\n * // \"before\", 1\n * // \"after\", 1\n * // \"after\", 2\n * // \"after\", 3\n *\n * // But with queue it logs:\n * // \"before\", 3\n * // \"after\", 3\n * // \"before\", 2\n * // \"after\", 2\n * // \"before\", 1\n * // \"after\", 1\n * ```\n */\n\nexport const queueScheduler = new QueueScheduler(QueueAction);\n\n/**\n * @deprecated Renamed to {@link queueScheduler}. 
Will be removed in v8.\n */\nexport const queue = queueScheduler;\n", "import { AsyncAction } from './AsyncAction';\nimport { AnimationFrameScheduler } from './AnimationFrameScheduler';\nimport { SchedulerAction } from '../types';\nimport { animationFrameProvider } from './animationFrameProvider';\nimport { TimerHandle } from './timerHandle';\n\nexport class AnimationFrameAction extends AsyncAction {\n constructor(protected scheduler: AnimationFrameScheduler, protected work: (this: SchedulerAction, state?: T) => void) {\n super(scheduler, work);\n }\n\n protected requestAsyncId(scheduler: AnimationFrameScheduler, id?: TimerHandle, delay: number = 0): TimerHandle {\n // If delay is greater than 0, request as an async action.\n if (delay !== null && delay > 0) {\n return super.requestAsyncId(scheduler, id, delay);\n }\n // Push the action to the end of the scheduler queue.\n scheduler.actions.push(this);\n // If an animation frame has already been requested, don't request another\n // one. If an animation frame hasn't been requested yet, request one. Return\n // the current animation frame request id.\n return scheduler._scheduled || (scheduler._scheduled = animationFrameProvider.requestAnimationFrame(() => scheduler.flush(undefined)));\n }\n\n protected recycleAsyncId(scheduler: AnimationFrameScheduler, id?: TimerHandle, delay: number = 0): TimerHandle | undefined {\n // If delay exists and is greater than 0, or if the delay is null (the\n // action wasn't rescheduled) but was originally scheduled as an async\n // action, then recycle as an async action.\n if (delay != null ? 
delay > 0 : this.delay > 0) {\n return super.recycleAsyncId(scheduler, id, delay);\n }\n // If the scheduler queue has no remaining actions with the same async id,\n // cancel the requested animation frame and set the scheduled flag to\n // undefined so the next AnimationFrameAction will request its own.\n const { actions } = scheduler;\n if (id != null && actions[actions.length - 1]?.id !== id) {\n animationFrameProvider.cancelAnimationFrame(id as number);\n scheduler._scheduled = undefined;\n }\n // Return undefined so the action knows to request a new async id if it's rescheduled.\n return undefined;\n }\n}\n", "import { AsyncAction } from './AsyncAction';\nimport { AsyncScheduler } from './AsyncScheduler';\n\nexport class AnimationFrameScheduler extends AsyncScheduler {\n public flush(action?: AsyncAction): void {\n this._active = true;\n // The async id that effects a call to flush is stored in _scheduled.\n // Before executing an action, it's necessary to check the action's async\n // id to determine whether it's supposed to be executed in the current\n // flush.\n // Previous implementations of this method used a count to determine this,\n // but that was unsound, as actions that are unsubscribed - i.e. 
cancelled -\n // are removed from the actions array and that can shift actions that are\n // scheduled to be executed in a subsequent flush into positions at which\n // they are executed within the current flush.\n const flushId = this._scheduled;\n this._scheduled = undefined;\n\n const { actions } = this;\n let error: any;\n action = action || actions.shift()!;\n\n do {\n if ((error = action.execute(action.state, action.delay))) {\n break;\n }\n } while ((action = actions[0]) && action.id === flushId && actions.shift());\n\n this._active = false;\n\n if (error) {\n while ((action = actions[0]) && action.id === flushId && actions.shift()) {\n action.unsubscribe();\n }\n throw error;\n }\n }\n}\n", "import { AnimationFrameAction } from './AnimationFrameAction';\nimport { AnimationFrameScheduler } from './AnimationFrameScheduler';\n\n/**\n *\n * Animation Frame Scheduler\n *\n * Perform task when `window.requestAnimationFrame` would fire\n *\n * When `animationFrame` scheduler is used with delay, it will fall back to {@link asyncScheduler} scheduler\n * behaviour.\n *\n * Without delay, `animationFrame` scheduler can be used to create smooth browser animations.\n * It makes sure scheduled task will happen just before next browser content repaint,\n * thus performing animations as efficiently as possible.\n *\n * ## Example\n * Schedule div height animation\n * ```ts\n * // html:
\n * import { animationFrameScheduler } from 'rxjs';\n *\n * const div = document.querySelector('div');\n *\n * animationFrameScheduler.schedule(function(height) {\n * div.style.height = height + \"px\";\n *\n * this.schedule(height + 1); // `this` references currently executing Action,\n * // which we reschedule with new state\n * }, 0, 0);\n *\n * // You will see a div element growing in height\n * ```\n */\n\nexport const animationFrameScheduler = new AnimationFrameScheduler(AnimationFrameAction);\n\n/**\n * @deprecated Renamed to {@link animationFrameScheduler}. Will be removed in v8.\n */\nexport const animationFrame = animationFrameScheduler;\n", "import { Observable } from '../Observable';\nimport { SchedulerLike } from '../types';\n\n/**\n * A simple Observable that emits no items to the Observer and immediately\n * emits a complete notification.\n *\n * Just emits 'complete', and nothing else.\n *\n * ![](empty.png)\n *\n * A simple Observable that only emits the complete notification. It can be used\n * for composing with other Observables, such as in a {@link mergeMap}.\n *\n * ## Examples\n *\n * Log complete notification\n *\n * ```ts\n * import { EMPTY } from 'rxjs';\n *\n * EMPTY.subscribe({\n * next: () => console.log('Next'),\n * complete: () => console.log('Complete!')\n * });\n *\n * // Outputs\n * // Complete!\n * ```\n *\n * Emit the number 7, then complete\n *\n * ```ts\n * import { EMPTY, startWith } from 'rxjs';\n *\n * const result = EMPTY.pipe(startWith(7));\n * result.subscribe(x => console.log(x));\n *\n * // Outputs\n * // 7\n * ```\n *\n * Map and flatten only odd numbers to the sequence `'a'`, `'b'`, `'c'`\n *\n * ```ts\n * import { interval, mergeMap, of, EMPTY } from 'rxjs';\n *\n * const interval$ = interval(1000);\n * const result = interval$.pipe(\n * mergeMap(x => x % 2 === 1 ? 
of('a', 'b', 'c') : EMPTY),\n * );\n * result.subscribe(x => console.log(x));\n *\n * // Results in the following to the console:\n * // x is equal to the count on the interval, e.g. (0, 1, 2, 3, ...)\n * // x will occur every 1000ms\n * // if x % 2 is equal to 1, print a, b, c (each on its own)\n * // if x % 2 is not equal to 1, nothing will be output\n * ```\n *\n * @see {@link Observable}\n * @see {@link NEVER}\n * @see {@link of}\n * @see {@link throwError}\n */\nexport const EMPTY = new Observable((subscriber) => subscriber.complete());\n\n/**\n * @param scheduler A {@link SchedulerLike} to use for scheduling\n * the emission of the complete notification.\n * @deprecated Replaced with the {@link EMPTY} constant or {@link scheduled} (e.g. `scheduled([], scheduler)`). Will be removed in v8.\n */\nexport function empty(scheduler?: SchedulerLike) {\n return scheduler ? emptyScheduled(scheduler) : EMPTY;\n}\n\nfunction emptyScheduled(scheduler: SchedulerLike) {\n return new Observable((subscriber) => scheduler.schedule(() => subscriber.complete()));\n}\n", "import { SchedulerLike } from '../types';\nimport { isFunction } from './isFunction';\n\nexport function isScheduler(value: any): value is SchedulerLike {\n return value && isFunction(value.schedule);\n}\n", "import { SchedulerLike } from '../types';\nimport { isFunction } from './isFunction';\nimport { isScheduler } from './isScheduler';\n\nfunction last(arr: T[]): T | undefined {\n return arr[arr.length - 1];\n}\n\nexport function popResultSelector(args: any[]): ((...args: unknown[]) => unknown) | undefined {\n return isFunction(last(args)) ? args.pop() : undefined;\n}\n\nexport function popScheduler(args: any[]): SchedulerLike | undefined {\n return isScheduler(last(args)) ? args.pop() : undefined;\n}\n\nexport function popNumber(args: any[], defaultValue: number): number {\n return typeof last(args) === 'number' ? args.pop()! 
: defaultValue;\n}\n", "export const isArrayLike = ((x: any): x is ArrayLike => x && typeof x.length === 'number' && typeof x !== 'function');", "import { isFunction } from \"./isFunction\";\n\n/**\n * Tests to see if the object is \"thennable\".\n * @param value the object to test\n */\nexport function isPromise(value: any): value is PromiseLike {\n return isFunction(value?.then);\n}\n", "import { InteropObservable } from '../types';\nimport { observable as Symbol_observable } from '../symbol/observable';\nimport { isFunction } from './isFunction';\n\n/** Identifies an input as being Observable (but not necessary an Rx Observable) */\nexport function isInteropObservable(input: any): input is InteropObservable {\n return isFunction(input[Symbol_observable]);\n}\n", "import { isFunction } from './isFunction';\n\nexport function isAsyncIterable(obj: any): obj is AsyncIterable {\n return Symbol.asyncIterator && isFunction(obj?.[Symbol.asyncIterator]);\n}\n", "/**\n * Creates the TypeError to throw if an invalid object is passed to `from` or `scheduled`.\n * @param input The object that was passed.\n */\nexport function createInvalidObservableTypeError(input: any) {\n // TODO: We should create error codes that can be looked up, so this can be less verbose.\n return new TypeError(\n `You provided ${\n input !== null && typeof input === 'object' ? 'an invalid object' : `'${input}'`\n } where a stream was expected. 
You can provide an Observable, Promise, ReadableStream, Array, AsyncIterable, or Iterable.`\n );\n}\n", "export function getSymbolIterator(): symbol {\n if (typeof Symbol !== 'function' || !Symbol.iterator) {\n return '@@iterator' as any;\n }\n\n return Symbol.iterator;\n}\n\nexport const iterator = getSymbolIterator();\n", "import { iterator as Symbol_iterator } from '../symbol/iterator';\nimport { isFunction } from './isFunction';\n\n/** Identifies an input as being an Iterable */\nexport function isIterable(input: any): input is Iterable {\n return isFunction(input?.[Symbol_iterator]);\n}\n", "import { ReadableStreamLike } from '../types';\nimport { isFunction } from './isFunction';\n\nexport async function* readableStreamLikeToAsyncGenerator(readableStream: ReadableStreamLike): AsyncGenerator {\n const reader = readableStream.getReader();\n try {\n while (true) {\n const { value, done } = await reader.read();\n if (done) {\n return;\n }\n yield value!;\n }\n } finally {\n reader.releaseLock();\n }\n}\n\nexport function isReadableStreamLike(obj: any): obj is ReadableStreamLike {\n // We don't want to use instanceof checks because they would return\n // false for instances from another Realm, like an +

+

The nav2 example is located in the pixi repository.

+
+

Move to the example folder

+
cd pixi/examples/ros2-nav2
+
+

Run the start command

+
pixi run start
+
diff --git a/v0.39.2/features/advanced_tasks/index.html b/v0.39.2/features/advanced_tasks/index.html
new file mode 100644
index 000000000..f2eae6a45

Advanced tasks - Pixi by prefix.dev

Tasks

+ +

When building a package, you often have to do more than just run the code. +Steps like formatting, linting, compiling, testing, benchmarking, etc. are often part of a project. +With pixi tasks, this should become much easier to do.

+

Here are some quick examples

+
pixi.toml
[tasks]
+# Commands as lists so you can also add documentation in between.
+configure = { cmd = [
+    "cmake",
+    # Use the cross-platform Ninja generator
+    "-G",
+    "Ninja",
+    # The source is in the root directory
+    "-S",
+    ".",
+    # We want to build in the .build directory
+    "-B",
+    ".build",
+] }
+
+# Depend on other tasks
+build = { cmd = ["ninja", "-C", ".build"], depends-on = ["configure"] }
+
+# Using environment variables
+run = "python main.py $PIXI_PROJECT_ROOT"
+set = "export VAR=hello && echo $VAR"
+
+# Cross platform file operations
+copy = "cp pixi.toml pixi_backup.toml"
+clean = "rm pixi_backup.toml"
+move = "mv pixi.toml backup.toml"
+
+

Depends on#

+

Just like packages can depend on other packages, our tasks can depend on other tasks. +This allows for complete pipelines to be run with a single command.

+

An obvious example is compiling before running an application.

+

Check out our cpp_sdl example for a running example. In that package we have some tasks that depend on each other, so we can ensure that when you run pixi run start everything is set up as expected.

+
pixi task add configure "cmake -G Ninja -S . -B .build"
+pixi task add build "ninja -C .build" --depends-on configure
+pixi task add start ".build/bin/sdl_example" --depends-on build
+
+

Results in the following lines added to the pixi.toml

+
pixi.toml
[tasks]
+# Configures CMake
+configure = "cmake -G Ninja -S . -B .build"
+# Build the executable but make sure CMake is configured first.
+build = { cmd = "ninja -C .build", depends-on = ["configure"] }
+# Start the built executable
+start = { cmd = ".build/bin/sdl_example", depends-on = ["build"] }
+
+
pixi run start
+
+

The tasks will be executed after each other:

+
    +
  • First configure because it has no dependencies.
  • +
  • Then build as it only depends on configure.
  • +
  • Then start, as all its dependencies have run.
  • +
+

If one of the commands fails (exits with a non-zero code), execution stops and the next task will not be started.

+

With this logic, you can also create aliases, since a task does not have to specify a command.

+
pixi task add fmt ruff
+pixi task add lint pylint
+
+
pixi task alias style fmt lint
+
+

Results in the following pixi.toml.

+
pixi.toml
fmt = "ruff"
+lint = "pylint"
+style = { depends-on = ["fmt", "lint"] }
+
+

Now run both tools with one command.

+
pixi run style
+
+

Working directory#

+

Pixi tasks support the definition of a working directory.

+

cwd stands for Current Working Directory. The directory is relative to the pixi package root, where the pixi.toml file is located.

+

Consider a pixi project structured as follows:

+
├── pixi.toml
+└── scripts
+    └── bar.py
+
+

To add a task to run the bar.py file, use:

+
pixi task add bar "python bar.py" --cwd scripts
+
+

This will add the following line to the manifest file:

+
pixi.toml
[tasks]
+bar = { cmd = "python bar.py", cwd = "scripts" }
+
+

Caching#

+

When you specify inputs and/or outputs to a task, pixi will reuse the result of the task.

+

For the cache, pixi checks that the following are true:

+
    +
  • No package in the environment has changed.
  • +
  • The selected inputs and outputs are the same as the last time the task was + run. We compute fingerprints of all the files selected by the globs and + compare them to the last time the task was run.
  • +
  • The command is the same as the last time the task was run.
  • +
+

If all of these conditions are met, pixi will not run the task again and instead use the existing result.

+
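Conceptually, the fingerprint check amounts to hashing every file matched by the input globs and comparing the result with the stored value. The sketch below uses sha256sum as a stand-in; pixi's actual hashing scheme and storage location are internal details:

```shell
# Hash everything the input glob matches; if the fingerprint is unchanged,
# the cached result can be reused instead of re-running the task.
tmp=$(mktemp -d)
echo 'print("hi")' > "$tmp/main.py"
stored=$(cat "$tmp"/*.py | sha256sum)     # fingerprint from the previous run
current=$(cat "$tmp"/*.py | sha256sum)    # fingerprint recomputed now
[ "$stored" = "$current" ] && echo "cache hit: skipping task"
rm -rf "$tmp"
```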

Inputs and outputs can be specified as globs, which will be expanded to all matching files.

+
pixi.toml
[tasks]
+# This task will only run if the `main.py` file has changed.
+run = { cmd = "python main.py", inputs = ["main.py"] }
+
+# This task will remember the result of the `curl` command and not run it again if the file `data.csv` already exists.
+download_data = { cmd = "curl -o data.csv https://example.com/data.csv", outputs = ["data.csv"] }
+
+# This task will only run if the `src` directory has changed and will remember the result of the `make` command.
+build = { cmd = "make", inputs = ["src/*.cpp", "include/*.hpp"], outputs = ["build/app.exe"] }
+
+

Note: if you want to debug the globs you can use the --verbose flag to see which files are selected.

+
# shows info logs of all files that were selected by the globs
+pixi run -v start
+
+

Environment variables#

+

You can set environment variables for a task. These are treated as "default" values for the variables, since you can overwrite them from the shell.

+

pixi.toml
[tasks]
+echo = { cmd = "echo $ARGUMENT", env = { ARGUMENT = "hello" } }
+
+If you run pixi run echo it will output hello. +When you set the environment variable ARGUMENT before running the task, it will use that value instead.

+
ARGUMENT=world pixi run echo
+ Pixi task (echo in default): echo $ARGUMENT
+world
+
+

These variables are not shared between tasks, so you need to define them for every task you want to use them in.

+
+

Extend instead of overwrite

+

If you use the same environment variable in the value as in the key of the map, you will also overwrite the variable. For example, extending the PATH:

pixi.toml
[tasks]
+echo = { cmd = "echo $PATH", env = { PATH = "/tmp/path:$PATH" } }
+
+This will output /tmp/path:/usr/bin:/bin instead of the original /usr/bin:/bin.

+
+

Clean environment#

+

You can make sure the environment of a task is "pixi only". Here pixi will only include the minimal required environment variables for your platform to run the command in. The environment will contain all variables set by the conda environment, like "CONDA_PREFIX". It will, however, also include some default values from the shell, like: "DISPLAY", "LC_ALL", "LC_TIME", "LC_NUMERIC", "LC_MEASUREMENT", "SHELL", "USER", "USERNAME", "LOGNAME", "HOME", "HOSTNAME", "TMPDIR", "XPC_SERVICE_NAME", "XPC_FLAGS"

+

[tasks]
+clean_command = { cmd = "python run_in_isolated_env.py", clean-env = true}
+
+This setting can also be set from the command line with pixi run --clean-env TASK_NAME.

+
+

clean-env not supported on Windows

+

On Windows it's hard to create a "clean environment", as conda-forge doesn't ship Windows compilers and Windows needs a lot of base variables. This makes the feature not worth implementing, as the number of edge cases would make it unusable.

+
+

Our task runner: deno_task_shell#

+

To support the different operating systems (Windows, macOS and Linux), pixi integrates a shell that can run on all of them. This is deno_task_shell. The task shell is a limited implementation of a Bourne shell interface.

+

Built-in commands#

+

Next to running actual executables like ./myprogram, cmake or python, the shell has some built-in commands.

+
    +
  • cp: Copies files.
  • +
  • mv: Moves files.
  • +
  • rm: Remove files or directories. + Ex: rm -rf [FILE]... - Commonly used to recursively delete files or directories.
  • +
  • mkdir: Makes directories. + Ex. mkdir -p DIRECTORY... - Commonly used to make a directory and all its parents with no error if it exists.
  • +
  • pwd: Prints the name of the current/working directory.
  • +
  • sleep: Delays for a specified amount of time. + Ex. sleep 1 to sleep for 1 second, sleep 0.5 to sleep for half a second, or sleep 1m to sleep a minute
  • +
  • echo: Displays a line of text.
  • +
  • cat: Concatenates files and outputs them on stdout. When no arguments are provided, it reads and outputs stdin.
  • +
  • exit: Causes the shell to exit.
  • +
  • unset: Unsets environment variables.
  • +
  • xargs: Builds arguments from stdin and executes a command.
  • +
+
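Since these built-ins mirror their Unix counterparts, a task body like the following behaves the same under deno_task_shell and a POSIX shell (the file names here are just for illustration):

```shell
mkdir -p out              # create the directory, no error if it exists
echo "hello" > out/a.txt  # write a line of text
cp out/a.txt out/b.txt    # copy the file
cat out/b.txt             # prints: hello
rm -rf out                # recursively clean up
```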

Syntax#

+
    +
  • Boolean list: use && or || to separate two commands.
      +
    • &&: if the command before && succeeds continue with the next command.
    • +
    • ||: if the command before || fails continue with the next command.
    • +
    +
  • +
  • Sequential lists: use ; to run two commands without checking if the first command failed or succeeded.
  • +
  • Environment variables:
      +
    • Set env variable using: export ENV_VAR=value
    • +
    • Use env variable using: $ENV_VAR
    • +
    • unset env variable using unset ENV_VAR
    • +
    +
  • +
  • Shell variables: Shell variables are similar to environment variables, but won’t be exported to spawned commands.
      +
    • Set them: VAR=value
    • +
    • use them: VAR=value && echo $VAR
    • +
    +
  • +
  • Pipelines: pipe the stdout of a command into the stdin of the following command.
      +
    • |: echo Hello | python receiving_app.py
    • +
    • |&: use this to also get the stderr as input.
    • +
    +
  • +
  • Command substitution: $() to use the output of a command as input for another command.
      +
    • python main.py $(git rev-parse HEAD)
    • +
    +
  • +
  • Negate exit code: ! before any command will negate the exit code from 1 to 0 or vice versa.
  • +
  • Redirects: > to redirect the stdout to a file.
      +
    • echo hello > file.txt will put hello in file.txt and overwrite existing text.
    • +
    • python main.py 2> file.txt will put the stderr output in file.txt.
    • +
    • python main.py &> file.txt will put the stderr and stdout in file.txt.
    • +
    • echo hello >> file.txt will append hello to the existing file.txt.
    • +
    +
  • +
  • Glob expansion: * to expand all options.
      +
    • echo *.py will echo all filenames that end with .py
    • +
    • echo **/*.py will echo all filenames that end with .py in this directory and all descendant directories.
    • +
    • echo data[0-9].csv will echo all filenames that have a single number after data and before .csv
    • +
    +
  • +
+
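Several of these constructs can be combined in one task body. This hypothetical snippet (environment variable, pipeline, redirect, boolean list) also runs in a plain POSIX shell:

```shell
# Set an env variable, pipe through `tr`, redirect to a file,
# then chain `cat` and the cleanup with a boolean list.
export GREETING=hello
echo "$GREETING world" | tr 'a-z' 'A-Z' > result.txt
cat result.txt && rm result.txt   # prints: HELLO WORLD
```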

More info in deno_task_shell documentation.

diff --git a/v0.39.2/features/environment/index.html b/v0.39.2/features/environment/index.html
new file mode 100644
index 000000000..400ec4247

Environment - Pixi by prefix.dev

Environments#

+

Pixi is a tool to manage virtual environments. +This document explains what an environment looks like and how to use it.

+

Structure#

+

A pixi environment is located in the .pixi/envs directory of the project by default. +This keeps your machine and your project clean and isolated from each other, and makes it easy to clean up after a project is done. +While this structure is generally recommended, environments can also be stored outside of project directories by enabling detached environments.

+

If you look at the .pixi/envs directory, you will see a directory for each environment: default is the one that is normally used; if you specify a custom environment, a directory with the name you specified will be used.

+
.pixi
+└── envs
+    ├── cuda
+       ├── bin
+       ├── conda-meta
+       ├── etc
+       ├── include
+       ├── lib
+       ...
+    └── default
+        ├── bin
+        ├── conda-meta
+        ├── etc
+        ├── include
+        ├── lib
+        ...
+
+

These directories are conda environments, and you can use them as such, but you must not edit them manually; changes should always go through the pixi.toml. Pixi will always make sure the environment is in sync with the pixi.lock file. If this is not the case, all commands that use the environment will automatically update it, e.g. pixi run, pixi shell.

+

Environment Installation Metadata#

+

On environment installation, pixi will write a small file to the environment that contains some metadata about the installation. This file is called pixi and is located in the conda-meta folder of the environment. This file contains the following information:

+
    +
  • manifest_path: The path to the manifest file that describes the project used to create this environment
  • +
  • environment_name: The name of the environment
  • +
  • pixi_version: The version of pixi that was used to create this environment
  • +
  • environment_lock_file_hash: The hash of the pixi.lock file that was used to create this environment
  • +
+
{
+  "manifest_path": "/home/user/dev/pixi/pixi.toml",
+  "environment_name": "default",
+  "pixi_version": "0.34.0",
+  "environment_lock_file_hash": "4f36ee620f10329d"
+}
+
+

The environment_lock_file_hash is used to check if the environment is in sync with the pixi.lock file. +If the hash of the pixi.lock file is different from the hash in the pixi file, pixi will update the environment.

+
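A rough, self-contained sketch of that staleness check, using a canned metadata file (the field name comes from the example above; the comparison value is hard-coded here, whereas pixi recomputes it from pixi.lock):

```shell
# Write a stand-in for the conda-meta/pixi metadata file,
# then extract the recorded hash and compare it with the "current" one.
tmp=$(mktemp -d)
printf '{\n  "environment_lock_file_hash": "4f36ee620f10329d"\n}\n' > "$tmp/pixi"
recorded=$(grep -o '"environment_lock_file_hash": "[^"]*"' "$tmp/pixi" | cut -d'"' -f4)
current="4f36ee620f10329d"   # pixi would recompute this from pixi.lock
if [ "$recorded" = "$current" ]; then
  echo "environment in sync"   # prints this branch
else
  echo "needs update"
fi
rm -rf "$tmp"
```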

This is used to speed up activation. To trigger a full revalidation instead, pass --revalidate to the pixi run or pixi shell command. A broken environment would typically not be detected by the hash comparison, but a revalidation would reinstall the environment. By default, all lock-file-modifying commands always revalidate, and pixi install always revalidates as well.

+

Cleaning up#

+

If you want to clean up the environments, you can simply delete the .pixi/envs directory, and pixi will recreate the environments when needed.

+
# either:
+rm -rf .pixi/envs
+
+# or per environment:
+rm -rf .pixi/envs/default
+rm -rf .pixi/envs/cuda
+
+

Activation#

+

An environment is nothing more than a set of files installed into a certain location that somewhat mimics a global system install. You need to activate the environment to use it. In the simplest sense, that means adding the bin directory of the environment to the PATH variable. But there is more to it for a conda environment, as activation also sets some environment variables.

+

To activate the environment, there are multiple options:

+
    +
  • Use the pixi shell command to open a shell with the environment activated.
  • +
  • Use the pixi shell-hook command to print the command to activate the environment in your current shell.
  • +
  • Use the pixi run command to run a command in the environment.
  • +
+

The run command is special, as it runs in its own cross-platform shell and can run tasks. More information about tasks can be found in the tasks documentation.

+

Running pixi shell-hook in the pixi project itself gives the following output:

+
export PATH="/home/user/development/pixi/.pixi/envs/default/bin:/home/user/.local/bin:/home/user/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/home/user/.pixi/bin"
+export CONDA_PREFIX="/home/user/development/pixi/.pixi/envs/default"
+export PIXI_PROJECT_NAME="pixi"
+export PIXI_PROJECT_ROOT="/home/user/development/pixi"
+export PIXI_PROJECT_VERSION="0.12.0"
+export PIXI_PROJECT_MANIFEST="/home/user/development/pixi/pixi.toml"
+export CONDA_DEFAULT_ENV="pixi"
+export PIXI_ENVIRONMENT_PLATFORMS="osx-64,linux-64,win-64,osx-arm64"
+export PIXI_ENVIRONMENT_NAME="default"
+export PIXI_PROMPT="(pixi) "
+. "/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-binutils_linux-64.sh"
+. "/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gcc_linux-64.sh"
+. "/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gfortran_linux-64.sh"
+. "/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gxx_linux-64.sh"
+. "/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/libglib_activate.sh"
+. "/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/rust.sh"
+
+

It sets the PATH and some more environment variables. More importantly, it also runs the activation scripts provided by the installed packages. An example of this is the libglib_activate.sh script. Thus, just adding the bin directory to the PATH is not enough.

+

Traditional conda activate-like activation#

+

If you prefer to use the traditional conda activate-like activation, you could use the pixi shell-hook command.

+
$ which python
+python not found
+$ eval "$(pixi shell-hook)"
+$ (default) which python
+/path/to/project/.pixi/envs/default/bin/python
+
+
+

Warning

+

Using the traditional conda activate-like activation is discouraged, as deactivating the environment is not really possible. Use pixi shell instead.

+
+

Using pixi with direnv#

+
+Installing direnv +

Of course, you can use pixi to install direnv globally. We recommend running

+
pixi global install direnv
+
+

to install the latest version of direnv on your computer.

+
+

This allows you to use pixi in combination with direnv. Enter the following into your .envrc file:

+
.envrc
watch_file pixi.lock # (1)!
+eval "$(pixi shell-hook)" # (2)!
+
+
    +
  1. This ensures that every time your pixi.lock changes, direnv invokes the shell-hook again.
  2. +
  2. This installs the environment if needed, and activates it. direnv ensures that the environment is deactivated when you leave the directory.
  4. +
+
$ cd my-project
+direnv: error /my-project/.envrc is blocked. Run `direnv allow` to approve its content
+$ direnv allow
+direnv: loading /my-project/.envrc
+ Project in /my-project is ready to use!
+direnv: export +CONDA_DEFAULT_ENV +CONDA_PREFIX +PIXI_ENVIRONMENT_NAME +PIXI_ENVIRONMENT_PLATFORMS +PIXI_PROJECT_MANIFEST +PIXI_PROJECT_NAME +PIXI_PROJECT_ROOT +PIXI_PROJECT_VERSION +PIXI_PROMPT ~PATH
+$ which python
+/my-project/.pixi/envs/default/bin/python
+$ cd ..
+direnv: unloading
+$ which python
+python not found
+
+

Environment variables#

+

The following environment variables are set by pixi, when using the pixi run, pixi shell, or pixi shell-hook command:

+
    +
  • PIXI_PROJECT_ROOT: The root directory of the project.
  • +
  • PIXI_PROJECT_NAME: The name of the project.
  • +
  • PIXI_PROJECT_MANIFEST: The path to the manifest file (pixi.toml).
  • +
  • PIXI_PROJECT_VERSION: The version of the project.
  • +
  • PIXI_PROMPT: The prompt to use in the shell, also used by pixi shell itself.
  • +
  • PIXI_ENVIRONMENT_NAME: The name of the environment, defaults to default.
  • +
  • PIXI_ENVIRONMENT_PLATFORMS: Comma separated list of platforms supported by the project.
  • +
  • CONDA_PREFIX: The path to the environment. (Used by multiple tools that already understand conda environments)
  • +
  • CONDA_DEFAULT_ENV: The name of the environment. (Used by multiple tools that already understand conda environments)
  • +
  • PATH: We prepend the bin directory of the environment to the PATH variable, so you can use the tools installed in the environment directly.
  • +
  • INIT_CWD: ONLY IN pixi run: The directory where the command was run from.
  • +
+
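A task or script can read these variables like any others. A minimal sketch: the names come from the list above, and the values are simply None when the process was not started through pixi.

```python
import os

# A subset of the variables documented above.
PIXI_VARS = [
    "PIXI_PROJECT_ROOT",
    "PIXI_PROJECT_NAME",
    "PIXI_PROJECT_MANIFEST",
    "PIXI_ENVIRONMENT_NAME",
    "CONDA_PREFIX",
]

def pixi_context() -> dict:
    """Collect the pixi-provided variables; values are None outside pixi run."""
    return {name: os.environ.get(name) for name in PIXI_VARS}
```
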
+

Note

+

Even though these are environment variables, they cannot be overridden; e.g., you cannot change the root of the project by setting PIXI_PROJECT_ROOT in the environment.

+
+

Solving environments#

+

When you run a command that uses the environment, pixi will check if the environment is in sync with the pixi.lock file. If it is not, pixi will solve the environment and update it. This means that pixi will retrieve the best set of packages for the dependency requirements that you specified in the pixi.toml and will put the output of the solve step into the pixi.lock file. Solving is a mathematical problem and can take some time, but we take pride in the way we solve environments, and we are confident that we can solve your environment in a reasonable time. If you want to learn more about the solving process, you can read these:

+ +

Pixi solves both the conda and PyPI dependencies, where the PyPI dependencies use the conda packages as a base, so you can be sure that the packages are compatible with each other. The solving is split between the rattler and uv libraries, which do the heavy lifting; the conda side is executed by our custom SAT solver: resolvo. resolvo is able to solve multiple ecosystems, such as conda and PyPI. It implements the lazy solving process for PyPI packages, which means that it only downloads the metadata of the packages that are needed to solve the environment. It also supports the conda way of solving, where the metadata of all packages is downloaded at once and the solve happens in one go.

+

For the [pypi-dependencies], uv implements sdist building to retrieve the metadata of the packages, and wheel building to install them. For this building step, pixi requires python to be installed first through the conda [dependencies] section of the pixi.toml file. This will always be slower than a pure conda solve, so for the best pixi experience you should stay within the [dependencies] section of the pixi.toml file.

+

Caching packages#

+

Pixi caches all previously downloaded packages in a cache folder. This cache folder is shared between all pixi projects and globally installed tools.

+

Normally, the location would be the following platform-specific default cache folder:

+
    +
  • Linux: $XDG_CACHE_HOME/rattler or $HOME/.cache/rattler
  • +
  • macOS: $HOME/Library/Caches/rattler
  • +
  • Windows: %LOCALAPPDATA%\rattler
  • +
+

This location is configurable by setting the PIXI_CACHE_DIR or RATTLER_CACHE_DIR environment variable.
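The lookup can be sketched as follows. This is a hypothetical helper, not pixi code; in particular, checking PIXI_CACHE_DIR before RATTLER_CACHE_DIR is an assumption, since the documentation does not state which variable wins.

```python
import os
import sys
from pathlib import Path

def default_cache_dir(environ=os.environ) -> Path:
    """Resolve the cache folder: env overrides first, then platform defaults."""
    # Assumed precedence between the two override variables.
    for var in ("PIXI_CACHE_DIR", "RATTLER_CACHE_DIR"):
        if environ.get(var):
            return Path(environ[var])
    if sys.platform == "win32":
        return Path(environ["LOCALAPPDATA"]) / "rattler"
    if sys.platform == "darwin":
        return Path.home() / "Library" / "Caches" / "rattler"
    # Linux: XDG_CACHE_HOME, falling back to ~/.cache
    xdg = environ.get("XDG_CACHE_HOME")
    base = Path(xdg) if xdg else Path.home() / ".cache"
    return base / "rattler"
```
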

+

When you want to clean the cache, you can simply delete the cache directory, and pixi will re-create the cache when needed.

+

The cache contains multiple folders concerning different caches from within pixi.

+
    +
  • pkgs: Contains the downloaded/unpacked conda packages.
  • +
  • repodata: Contains the conda repodata cache.
  • +
  • uv-cache: Contains the uv cache. This includes multiple caches, e.g. built wheels and wheel archives.
  • +
  • http-cache: Contains the conda-pypi mapping cache.
  • +
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/features/global_tools/index.html b/v0.39.2/features/global_tools/index.html new file mode 100644 index 000000000..9b1ed5094 --- /dev/null +++ b/v0.39.2/features/global_tools/index.html @@ -0,0 +1,2306 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Global Tools - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

Pixi Global#

+
+ +
+ +

With pixi global, users can manage globally installed tools in a way that makes them available from any directory. This means that the pixi environment will be placed in a global location, and the tools will be exposed on the system PATH, allowing you to run them from the command line.

+

Basic Usage#

+

Running the following command installs rattler-build on your system.

+
pixi global install rattler-build
+
+

What's great about pixi global is that, by default, it isolates each package in its own environment, exposing only the necessary entry points. This means you don't have to worry about removing a package and accidentally breaking seemingly unrelated packages. This behavior is quite similar to that of pipx.

+

However, there are times when you may want multiple dependencies in the same environment. For instance, while ipython is really useful on its own, it becomes much more useful when numpy and matplotlib are available alongside it.

+

Let's execute the following command:

+
pixi global install ipython --with numpy --with matplotlib
+
+

numpy exposes executables, but since it's added via --with, its executables are not exposed.

+

Importing numpy and matplotlib now works as expected. +

ipython -c 'import numpy; import matplotlib'
+

+

At some point, you might want to install multiple versions of the same package on your system. Since they will all be available on the system PATH, they need to be exposed under different names.

+

Let's check out the following command: +

pixi global install --expose py3=python "python=3.12"
+

+

By passing --expose we specified that we want to expose the executable python under the name py3. The package python ships more executables, but since we explicitly used --expose, they are not auto-exposed.

+

You can run py3 to start the python interpreter. +

py3 -c "print('Hello World')"
+

+

The Global Manifest#

+

Since v0.33.0, pixi has a new manifest file that is created in the global directory. This file contains the list of environments that are installed globally, their dependencies, and their exposed binaries. The manifest can be edited, synced, checked in to a version control system, and shared with others.

+

Running the commands from the section before results in the following manifest: +

version = 1
+
+[envs.rattler-build]
+channels = ["conda-forge"]
+dependencies = { rattler-build = "*" }
+exposed = { rattler-build = "rattler-build" }
+
+[envs.ipython]
+channels = ["conda-forge"]
+dependencies = { ipython = "*", numpy = "*", matplotlib = "*" }
+exposed = { ipython = "ipython", ipython3 = "ipython3" }
+
+[envs.python]
+channels = ["conda-forge"]
+dependencies = { python = "3.12.*" } # (1)!
+exposed = { py3 = "python" } # (2)!
+

+
    +
  1. Dependencies are the packages that will be installed in the environment. You can specify the version or use a wildcard.
  2. +
  2. The exposed binaries are the ones that will be available on the system PATH. In this case, python is exposed under the name py3.
  4. +
+

Manifest locations#

+

The manifest can be found at the following locations, depending on your operating system. Run pixi info to find the manifest currently used on your system.

+
+
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
4$PIXI_HOME/manifests/pixi-global.tomlGlobal manifest in PIXI_HOME.
3$HOME/.pixi/manifests/pixi-global.tomlGlobal manifest in user home directory.
2$XDG_CONFIG_HOME/pixi/manifests/pixi-global.tomlXDG compliant config directory.
1$HOME/.config/pixi/manifests/pixi-global.tomlConfig directory.
+
+
+ + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
3$PIXI_HOME/manifests/pixi-global.tomlGlobal manifest in PIXI_HOME.
2$HOME/.pixi/manifests/pixi-global.tomlGlobal manifest in user home directory.
1$HOME/Library/Application Support/pixi/manifests/pixi-global.tomlConfig directory.
+
+
+ + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
3$PIXI_HOME\manifests/pixi-global.tomlGlobal manifest in PIXI_HOME.
2%USERPROFILE%\.pixi\manifests\pixi-global.tomlGlobal manifest in user home directory.
1%APPDATA%\pixi\manifests\pixi-global.tomlConfig directory.
+
+
+
+
+

Note

+

If multiple locations exist, the manifest with the highest priority will be used.

+
+
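For Linux, the table above could be expressed as a small resolver. This is a hypothetical sketch, not pixi code; the paths mirror the Linux table, highest priority first.

```python
import os
from pathlib import Path

def global_manifest_candidates(environ=os.environ) -> list:
    """Linux candidate paths, highest priority first (per the table above)."""
    tail = Path("manifests") / "pixi-global.toml"
    candidates = []
    if environ.get("PIXI_HOME"):
        candidates.append(Path(environ["PIXI_HOME"]) / tail)
    candidates.append(Path.home() / ".pixi" / tail)
    if environ.get("XDG_CONFIG_HOME"):
        candidates.append(Path(environ["XDG_CONFIG_HOME"]) / "pixi" / tail)
    candidates.append(Path.home() / ".config" / "pixi" / tail)
    return candidates

def active_manifest(environ=os.environ):
    """The first existing candidate wins; None if no manifest exists yet."""
    return next((p for p in global_manifest_candidates(environ) if p.is_file()), None)
```
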

Channels#

+

The channels are the conda channels that will be used to search for the packages. They are ordered by priority: the first one has the highest priority, and if a package is not found in that channel, the next one is used. For example, running:

pixi global install --channel conda-forge --channel bioconda snakemake
+
+Results in the following entry in the manifest: +
[envs.snakemake]
+channels = ["conda-forge", "bioconda"]
+dependencies = { snakemake = "*" }
+exposed = { snakemake = "snakemake" }
+

+

More information on channels can be found here.

+

Automatic Exposed#

+

There is some added automatic behavior: if you install a package with the same name as the environment, it will be exposed under that name, even if the executable is only provided by a dependency of the package. For example, running:

pixi global install ansible
+
+will create the following entry in the manifest: +
[envs.ansible]
+channels = ["conda-forge"]
+dependencies = { ansible = "*" }
+exposed = { ansible = "ansible" } # (1)!
+

+
    +
  1. The ansible binary is exposed even though it is installed by a dependency of ansible, the ansible-core package.
  2. +
+

It's also possible to expose an executable which is located in a nested directory. For example, the dotnet.exe executable is located in a dotnet folder; to expose dotnet you must specify its relative path:

+
pixi global install dotnet --expose dotnet=dotnet\dotnet
+
+

Which will create the following entry in the manifest: +

[envs.dotnet]
+channels = ["conda-forge"]
+dependencies = { dotnet = "*" }
+exposed = { dotnet = 'dotnet\dotnet' }
+

+

Dependencies#

+

Dependencies are the Conda packages that will be installed into your environment. For example, running: +

pixi global install "python<3.12"
+
+creates the following entry in the manifest: +
[envs.python]
+channels = ["conda-forge"]
+dependencies = { python = "<3.12" }
+# ...
+
Typically, you'd specify just the tool you're installing, but you can add more packages if needed. Defining the environment to install into allows you to add multiple dependencies at once. For example, running:
pixi global install --environment my-env git vim python
+
+will create the following entry in the manifest: +
[envs.my-env]
+channels = ["conda-forge"]
+dependencies = { git = "*", vim = "*", python = "*" }
+# ...
+

+

You can add a dependency to an existing environment by running: +

pixi global install --environment my-env package-a package-b
+
These packages will be added as dependencies to the my-env environment, but their binaries won't be auto-exposed.

+

You can remove dependencies by running: +

pixi global remove --environment my-env package-a package-b
+

+

Trampolines#

+

To increase efficiency, pixi uses trampolines: small, specialized binaries that set up the configuration and environment before executing the main binary. The trampoline approach makes it possible to skip running activation scripts, which have a significant performance impact.

+

When you execute a global install binary, a trampoline performs the following sequence of steps:

+
    +
  • Each trampoline first reads a configuration file named after the binary being executed. This configuration file, in JSON format (e.g., python.json), contains key information about how the environment should be set up. The configuration file is stored in .pixi/bin/trampoline_configuration.
  • +
  • Once the configuration is loaded and the environment is set, the trampoline executes the original binary with the correct environment settings.
  • +
  • When installing a new binary, a new trampoline is placed in the .pixi/bin directory and is hard-linked to the .pixi/bin/trampoline_configuration/trampoline_bin. This optimizes storage space and avoids duplication of the same trampoline.
  • +
+
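The hard-link detail is worth a small demonstration. This is not pixi code; it only shows why many entries in .pixi/bin can share a single trampoline binary on disk.

```python
import os
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as d:
    bin_dir = Path(d) / "bin"
    cfg_dir = bin_dir / "trampoline_configuration"
    cfg_dir.mkdir(parents=True)

    # One shared trampoline binary (placeholder bytes, not a real executable)...
    shared = cfg_dir / "trampoline_bin"
    shared.write_bytes(b"\x7fELF placeholder")

    # ...hard-linked once per exposed tool.
    for tool in ("python", "ipython"):
        os.link(shared, bin_dir / tool)

    # All three directory entries point at the same inode, so the
    # trampoline is stored only once on disk.
    inodes = {p.stat().st_ino for p in (shared, bin_dir / "python", bin_dir / "ipython")}
    print(len(inodes))  # 1
```
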

Example: Adding a series of tools at once#

+

Without specifying an environment, you can add multiple tools at once: +

pixi global install pixi-pack rattler-build
+
+This command generates the following entry in the manifest: +
[envs.pixi-pack]
+channels = ["conda-forge"]
+dependencies= { pixi-pack = "*" }
+exposed = { pixi-pack = "pixi-pack" }
+
+[envs.rattler-build]
+channels = ["conda-forge"]
+dependencies = { rattler-build = "*" }
+exposed = { rattler-build = "rattler-build" }
+
This creates two separate, non-interfering environments, while exposing only the minimum required binaries.

+

Example: Creating a Data Science Sandbox Environment#

+

You can create an environment with multiple tools using the following command: +

pixi global install --environment data-science --expose jupyter --expose ipython jupyter numpy pandas matplotlib ipython
+
+This command generates the following entry in the manifest: +
[envs.data-science]
+channels = ["conda-forge"]
+dependencies = { jupyter = "*", ipython = "*" }
+exposed = { jupyter = "jupyter", ipython = "ipython" }
+
+In this setup, both jupyter and ipython are exposed from the data-science environment, allowing you to run: +
> ipython
+# Or
+> jupyter lab
+
+These commands will be available globally, making it easy to access your preferred tools without switching environments.

+

Example: Install packages for a different platform#

+

You can install packages for a different platform using the --platform flag. This is useful when you want, for example, osx-64 packages on osx-arm64. Running this on osx-arm64:

pixi global install --platform osx-64 python
+
+will create the following entry in the manifest: +
[envs.python]
+channels = ["conda-forge"]
+platforms = ["osx-64"]
+dependencies = { python = "*" }
+# ...
+

+

Potential Future Features#

+

PyPI support#

+

We could support packages from PyPI via a command like this:

+
pixi global install --pypi flask
+
+

Lock file#

+

A lock file is less important for global tools. However, there is demand for it, and users who don't care about it should not be negatively impacted.

+

Multiple manifests#

+

We could go for one default manifest, but also parse other manifests in the same directory. The only requirement for a file to be parsed as a manifest is a .toml extension. In order to modify those with the CLI, one would have to add an option --manifest to select the correct one.

+
    +
  • pixi-global.toml: Default
  • +
  • pixi-global-company-tools.toml
  • +
  • pixi-global-from-my-dotfiles.toml
  • +
+

It is unclear whether the first implementation already needs to support this. At the very least, we should put the manifest into its own folder, like ~/.pixi/global/manifests/pixi-global.toml.

+

No activation#

+

The current pixi global install provides the flag --no-activation. When this flag is set, CONDA_PREFIX and PATH will not be set when running the exposed executable. This is useful when installing Python package managers or shells.

+

Assuming that this needs to be set per mapping, one way to expose this functionality would be to allow the following:

+
[envs.pip.exposed]
+pip = { executable = "pip", activation = false }
+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/features/lockfile/index.html b/v0.39.2/features/lockfile/index.html new file mode 100644 index 000000000..fd9144aeb --- /dev/null +++ b/v0.39.2/features/lockfile/index.html @@ -0,0 +1,2021 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Lockfile - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

The pixi.lock lock file#

+
+

A lock file is the protector of the environments, and pixi is the key to unlock it.

+
+

What is a lock file?#

+

A lock file locks the environment in a specific state. Within pixi, a lock file is a description of the packages in an environment. The lock file contains two definitions:

+
    +
  • +

    The environments that are used in the project with their complete set of packages. e.g.:

    +
    environments:
    +    default:
    +        channels:
    +          - url: https://conda.anaconda.org/conda-forge/
    +        packages:
    +            linux-64:
    +            ...
    +            - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.12.2-hab00c5b_0_cpython.conda
    +            ...
    +            osx-64:
    +            ...
    +            - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda
    +            ...
    +
    +
      +
    • +

      The definition of the packages themselves. e.g.:

      +
      - kind: conda
      +  name: python
      +  version: 3.12.2
      +  build: h9f0c242_0_cpython
      +  subdir: osx-64
      +  url: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda
      +  sha256: 7647ac06c3798a182a4bcb1ff58864f1ef81eb3acea6971295304c23e43252fb
      +  md5: 0179b8007ba008cf5bec11f3b3853902
      +  depends:
      +    - bzip2 >=1.0.8,<2.0a0
      +    - libexpat >=2.5.0,<3.0a0
      +    - libffi >=3.4,<4.0a0
      +    - libsqlite >=3.45.1,<4.0a0
      +    - libzlib >=1.2.13,<1.3.0a0
      +    - ncurses >=6.4,<7.0a0
      +    - openssl >=3.2.1,<4.0a0
      +    - readline >=8.2,<9.0a0
      +    - tk >=8.6.13,<8.7.0a0
      +    - tzdata
      +    - xz >=5.2.6,<6.0a0
      +  constrains:
      +    - python_abi 3.12.* *_cp312
      +  license: Python-2.0
      +  size: 14596811
      +  timestamp: 1708118065292
      +
      +
    • +
    +
  • +
+

Why a lock file#

+

Pixi uses the lock file for the following reasons:

+
    +
  • To save a working installation state, without copying the entire environment's data.
  • +
  • To ensure the project configuration is aligned with the installed environment.
  • +
  • To give the user a file that contains all the information about the environment.
  • +
+

This gives you (and your collaborators) a way to truly reproduce the environment you are working in. Using tools such as docker suddenly becomes much less necessary.

+

When is a lock file generated?#

+

A lock file is generated when you install a package. More specifically, a lock file is generated from the solve step of the installation process. The solve will return a list of packages that are to be installed, and the lock file will be generated from this list. This diagram tries to explain the process:

+
graph TD
+    A[Install] --> B[Solve]
+    B --> C[Generate and write lock file]
+    C --> D[Install Packages]
+

How to use a lock file#

+
+

Do not edit the lock file

+

A lock file is a machine-only file and should not be edited by hand.

+
+

That said, the pixi.lock is human-readable, so it's easy to track changes to the environment. We recommend tracking the lock file in git or another version control system. This ensures that the environment is always reproducible and that you can always revert to a working state if something goes wrong. The pixi.lock and the manifest file pixi.toml/pyproject.toml should always be in sync.

+

Running the following commands will check and automatically update the lock file if you changed any dependencies:

+
    +
  • pixi install
  • +
  • pixi run
  • +
  • pixi shell
  • +
  • pixi shell-hook
  • +
  • pixi tree
  • +
  • pixi list
  • +
  • pixi add
  • +
  • pixi remove
  • +
+

All the commands that support the interaction with the lock file also include some lock file usage options:

+
    +
  • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • +
  • --locked: only install if the pixi.lock is up-to-date with the manifest file[^1]. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • +
+
+

Syncing the lock file with the manifest file

+

The lock file is always matched with the whole configuration in the manifest file. +This means that if you change the manifest file, the lock file will be updated. +

flowchart TD
+    C[manifest] --> A[lockfile] --> B[environment]

+
+

Lockfile satisfiability#

+

The lock file is a description of the environment and should always be satisfiable. Satisfiable means that the given manifest file and the created environment are in sync with the lock file. If the lock file is not satisfiable, pixi generates a new lock file automatically.

+

Steps to check if the lock file is satisfiable:

+
    +
  • All environments in the manifest file are in the lock file
  • +
  • All channels in the manifest file are in the lock file
  • +
  • All packages in the manifest file are in the lock file, and the versions in the lock file are compatible with the requirements in the manifest file, for both conda and pypi packages.
      +
    • Conda packages use a matchspec which can match on all the information we store in the lockfile, even timestamp, subdir and license.
    • +
    +
  • +
  • If pypi-dependencies are added, all conda packages that are Python packages in the lock file have a purls field.
  • +
  • All hashes for the pypi editable packages are correct.
  • +
  • There is only a single entry for every package in the lock file.
  • +
+
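A heavily simplified sketch of the first few checks above: the data shapes (plain dicts) are hypothetical, and the real code operates on richer types and also verifies versions, purls, and hashes.

```python
def lock_file_satisfies(manifest: dict, lock: dict) -> bool:
    """Simplified: every environment, channel, and package named in the
    manifest must be present in the lock file."""
    for env_name, env in manifest["environments"].items():
        locked = lock["environments"].get(env_name)
        if locked is None:
            return False  # environment missing from the lock file
        if not set(env["channels"]) <= set(locked["channels"]):
            return False  # a manifest channel is not locked
        locked_names = {pkg["name"] for pkg in locked["packages"]}
        if not set(env["dependencies"]) <= locked_names:
            return False  # a required package is not locked
    return True
```
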

If you want more details, check out the actual code, as this is a simplification of the actual checks.

+

The version of the lock file#

+

The lock file has a version number; this ensures that the lock file is compatible with the local version of pixi.

+
version: 4
+
+

Pixi is backward compatible with the lock file, but not forward compatible. +This means that you can use an older lock file with a newer version of pixi, but not the other way around.
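This compatibility rule boils down to a version gate, sketched below. The supported version constant is hypothetical; it stands for whatever format version a given pixi build understands.

```python
SUPPORTED_LOCK_FILE_VERSION = 4  # hypothetical: the newest format this build writes

def can_read_lock_file(first_line: str) -> bool:
    """pixi.lock starts with `version: N`; older formats are readable,
    newer ones are not (backward, not forward, compatible)."""
    version = int(first_line.partition(":")[2])
    return version <= SUPPORTED_LOCK_FILE_VERSION
```
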

+

Your lock file is big#

+

The lock file can grow quite large, especially if you have a lot of packages installed. This is because the lock file contains all the information about the packages.

+
    +
  1. We try to keep the lock file as small as possible.
  2. +
  2. It's always smaller than a docker image.
  4. +
  3. Downloading the lock file is always faster than downloading the incorrect packages.
  6. +
+

You don't need a lock file because...#

+

If you cannot think of a case where you would benefit from a fast, reproducible environment, then you don't need a lock file.

+

But take note of the following:

+
    +
  • A lock file allows you to run the same environment on different machines, think CI systems.
  • +
  • It also allows you to go back to a working state if you have made a mistake.
  • +
  • It helps other users onboard to your project as they don't have to figure out the environment setup or solve dependency issues.
  • +
+

Removing the lock file#

+

If you want to remove the lock file, you can simply delete it.

+
rm pixi.lock
+
+

This will remove the lock file, and the next time you run a command that requires the lock file, it will be generated again.

+
+

Note

+

This does remove the locked state of the environment, and the environment will be updated to the latest version of the packages.

+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/features/multi_environment/index.html b/v0.39.2/features/multi_environment/index.html new file mode 100644 index 000000000..c06754d05 --- /dev/null +++ b/v0.39.2/features/multi_environment/index.html @@ -0,0 +1,2279 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Multi Environment - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

Multi Environment Support#

+

Motivating Example#

+

There are multiple scenarios where multiple environments are useful.

+
    +
  • Testing of multiple package versions, e.g. py39 and py310 or polars 0.12 and 0.13.
  • +
  • Smaller single tool environments, e.g. lint or docs.
  • +
  • Large developer environments, that combine all the smaller environments, e.g. dev.
  • +
  • Strict supersets of environments, e.g. prod and test-prod where test-prod is a strict superset of prod.
  • +
  • Multiple machines from one project, e.g. a cuda environment and a cpu environment.
  • +
  • And many more. (Feel free to edit this document in our GitHub and add your use case.)
  • +
+

This prepares pixi for use in large projects with multiple use-cases, multiple developers and different CI needs.

+

Design Considerations#

+

There are a few things we wanted to keep in mind in the design:

+
    +
  1. User-friendliness: Pixi is a user-focused tool that goes beyond developers. The feature should have good error reporting and helpful documentation from the start.
  2. +
  2. Keep it simple: Not understanding the multiple environments feature shouldn't prevent a user from using pixi. The feature should be "invisible" for non-multi-env use-cases.
  4. +
  3. No Automatic Combinatorial: To ensure the dependency resolution process remains manageable, the solution should avoid a combinatorial explosion of dependency sets. Environments are therefore user-defined, not automatically inferred by testing a matrix of the features.
  6. +
  4. Single environment Activation: The design should allow only one environment to be active at any given time, simplifying the resolution process and preventing conflicts.
  8. +
  5. Fixed lock files: It's crucial to preserve fixed lock files for consistency and predictability. Solutions must ensure reliability not just for authors but also for end-users, particularly at the time of lock file creation.
  10. +
+

Feature & Environment Set Definitions#

+

Introduce environment sets into the pixi.toml; these describe environments based on features. Introduce features into the pixi.toml that can describe parts of environments. As an environment goes beyond just dependencies, the features should be described by the following fields:

+
    +
  • dependencies: The conda package dependencies
  • +
  • pypi-dependencies: The pypi package dependencies
  • +
  • system-requirements: The system requirements of the environment
  • +
  • activation: The activation information for the environment
  • +
  • platforms: The platforms the environment can be run on.
  • +
  • channels: The channels used to create the environment. Adding the priority field to the channels to allow concatenation of channels instead of overwriting.
  • +
  • target: All the above features but also separated by targets.
  • +
  • tasks: Feature specific tasks, tasks in one environment are selected as default tasks for the environment.
  • +
+
Default features
[dependencies] # short for [feature.default.dependencies]
+python = "*"
+numpy = "==2.3"
+
+[pypi-dependencies] # short for [feature.default.pypi-dependencies]
+pandas = "*"
+
+[system-requirements] # short for [feature.default.system-requirements]
+libc = "2.33"
+
+[activation] # short for [feature.default.activation]
+scripts = ["activate.sh"]
+
+
Different dependencies per feature
[feature.py39.dependencies]
+python = "~=3.9.0"
+[feature.py310.dependencies]
+python = "~=3.10.0"
+[feature.test.dependencies]
+pytest = "*"
+
+
Full set of environment modification in one feature
[feature.cuda]
+dependencies = {cuda = "x.y.z", cudnn = "12.0"}
+pypi-dependencies = {torch = "1.9.0"}
+platforms = ["linux-64", "osx-arm64"]
+activation = {scripts = ["cuda_activation.sh"]}
+system-requirements = {cuda = "12"}
+# Channels concatenate using a priority instead of overwrite, so the default channels are still used.
+# Using the priority the concatenation is controlled, default is 0, the default channels are used last.
+# Highest priority comes first.
+channels = ["nvidia", {channel = "pytorch", priority = -1}] # Results in:  ["nvidia", "conda-forge", "pytorch"] when the default is `conda-forge`
+tasks = { warmup = "python warmup.py" }
+target.osx-arm64 = {dependencies = {mlx = "x.y.z"}}
+
+
Define tasks as defaults of an environment
[feature.test.tasks]
+test = "pytest"
+
+[environments]
+test = ["test"]
+
+# `pixi run test` == `pixi run --environment test test`
+
+

The environment definition should contain the following fields:

+
    +
  • features: Vec<Feature>: The features that are included in the environment set, which is also the default field in the environments.
  • +
  • solve-group: String: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional dependencies. For instance, when testing a production environment with additional test dependencies.
  • +
+
Creating environments from features
[environments]
+# implicit: default = ["default"]
+default = ["py39"] # implicit: default = ["py39", "default"]
+py310 = ["py310"] # implicit: py310 = ["py310", "default"]
+test = ["test"] # implicit: test = ["test", "default"]
+test39 = ["test", "py39"] # implicit: test39 = ["test", "py39", "default"]
+
+
Testing a production environment with additional dependencies
[environments]
+# Creating a `prod` environment which is the minimal set of dependencies used for production.
+prod = {features = ["py39"], solve-group = "prod"}
+# Creating a `test_prod` environment which is the `prod` environment plus the `test` feature.
+test_prod = {features = ["py39", "test"], solve-group = "prod"}
+# Using the `solve-group` to solve the `prod` and `test_prod` environments together
+# Which makes sure the tested environment has the same version of the dependencies as the production environment.
+
+
Creating environments without including the default feature
[dependencies]
+python = "*"
+numpy = "*"
+
+[feature.lint.dependencies]
+pre-commit = "*"
+
+[environments]
+# Create a custom environment which only has the `lint` feature (numpy isn't part of that env).
+lint = {features = ["lint"], no-default-feature = true}
+
+

Lock File Structure#

+

Within the pixi.lock file, a package may now include an additional environments field, specifying the environments to which it belongs. To avoid duplication, a package's environments field may list multiple environments, keeping the lock file minimal in size.

+
- platform: linux-64
+  name: pre-commit
+  version: 3.3.3
+  category: main
+  environments:
+    - dev
+    - test
+    - lint
+  ...:
+- platform: linux-64
+  name: python
+  version: 3.9.3
+  category: main
+  environments:
+    - dev
+    - test
+    - lint
+    - py39
+    - default
+  ...:
+
+

User Interface Environment Activation#

+

Users can manually activate the desired environment via the command line or configuration. This approach guarantees a conflict-free environment by allowing only one feature set to be active at a time. For the user, the CLI would look like this:

+
Default behavior
 pixi run python
+# Runs python in the `default` environment
+
+
Activating a specific environment
 pixi run -e test pytest
+ pixi run --environment test pytest
+# Runs `pytest` in the `test` environment
+
+
Activating a shell in an environment
 pixi shell -e cuda
+pixi shell --environment cuda
+# Starts a shell in the `cuda` environment
+
+
Running any command in an environment
 pixi run -e test any_command
+# Runs any_command in the `test` environment; it doesn't need to be predefined as a task.
+
+

Ambiguous Environment Selection#

+

It's possible to define tasks in multiple environments; in that case, the user should be prompted to select the environment.

+

Here is a simple example of a task-only manifest:

+

pixi.toml
[project]
+name = "test_ambiguous_env"
+channels = []
+platforms = ["linux-64", "win-64", "osx-64", "osx-arm64"]
+
+[tasks]
+default = "echo Default"
+ambi = "echo Ambi::Default"
+[feature.test.tasks]
+test = "echo Test"
+ambi = "echo Ambi::Test"
+
+[feature.dev.tasks]
+dev = "echo Dev"
+ambi = "echo Ambi::Dev"
+
+[environments]
+default = ["test", "dev"]
+test = ["test"]
+dev = ["dev"]
+
+Trying to run the ambi task will prompt the user to select an environment, as it is available in all environments.

+
Interactive selection of environments if task is in multiple environments
 pixi run ambi
+? The task 'ambi' can be run in multiple environments.
+
+Please select an environment to run the task in: ›
+ default # selecting default
+  test
+  dev
+
+ Pixi task (ambi in default): echo Ambi::Test
+Ambi::Test
+
+

As you can see, it runs the task defined in the test feature, but in the default environment. This happens because the ambi task defined in the test feature overwrites the one from the default feature within the default environment. The ambi task under [tasks] is therefore unreachable from any environment.

+

Some other results of running tasks in this example:

 pixi run --environment test ambi
+ Pixi task (ambi in test): echo Ambi::Test
+Ambi::Test
+
+ pixi run --environment dev ambi
+ Pixi task (ambi in dev): echo Ambi::Dev
+Ambi::Dev
+
+# dev is run in the default environment
+ pixi run dev
+ Pixi task (dev in default): echo Dev
+Dev
+
+# dev is run in the dev environment
+ pixi run -e dev dev
+ Pixi task (dev in dev): echo Dev
+Dev
+

+ + +

Real world example use cases#

+
+Polarify test setup +

In polarify they want to test multiple Python versions combined with multiple versions of polars. This is currently done using a matrix in GitHub Actions. This can be replaced by using multiple environments.

+
pixi.toml
[project]
+name = "polarify"
+# ...
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]
+
+[tasks]
+postinstall = "pip install --no-build-isolation --no-deps --disable-pip-version-check -e ."
+
+[dependencies]
+python = ">=3.9"
+pip = "*"
+polars = ">=0.14.24,<0.21"
+
+[feature.py39.dependencies]
+python = "3.9.*"
+[feature.py310.dependencies]
+python = "3.10.*"
+[feature.py311.dependencies]
+python = "3.11.*"
+[feature.py312.dependencies]
+python = "3.12.*"
+[feature.pl017.dependencies]
+polars = "0.17.*"
+[feature.pl018.dependencies]
+polars = "0.18.*"
+[feature.pl019.dependencies]
+polars = "0.19.*"
+[feature.pl020.dependencies]
+polars = "0.20.*"
+
+[feature.test.dependencies]
+pytest = "*"
+pytest-md = "*"
+pytest-emoji = "*"
+hypothesis = "*"
+[feature.test.tasks]
+test = "pytest"
+
+[feature.lint.dependencies]
+pre-commit = "*"
+[feature.lint.tasks]
+lint = "pre-commit run --all"
+
+[environments]
+pl017 = ["pl017", "py39", "test"]
+pl018 = ["pl018", "py39", "test"]
+pl019 = ["pl019", "py39", "test"]
+pl020 = ["pl020", "py39", "test"]
+py39 = ["py39", "test"]
+py310 = ["py310", "test"]
+py311 = ["py311", "test"]
+py312 = ["py312", "test"]
+
+
.github/workflows/test.yml
jobs:
+  tests-per-env:
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        environment: [py311, py312]
+    steps:
+    - uses: actions/checkout@v4
+      - uses: prefix-dev/setup-pixi@v0.5.1
+        with:
+          environments: ${{ matrix.environment }}
+      - name: Run tasks
+        run: |
+          pixi run --environment ${{ matrix.environment }} test
+  tests-with-multiple-envs:
+    runs-on: ubuntu-latest
+    steps:
+    - uses: actions/checkout@v4
+    - uses: prefix-dev/setup-pixi@v0.5.1
+      with:
+       environments: pl017 pl018
+    - run: |
+        pixi run -e pl017 test
+        pixi run -e pl018 test
+
+
+
+Test vs Production example +

This is an example of a project that has a test feature and a prod environment. The prod environment is a production environment that contains the run dependencies. The test feature is a set of dependencies and tasks that we want to put on top of the previously solved prod environment. This is a common use case where we want to test the production environment with additional dependencies.

+

pixi.toml
[project]
+name = "my-app"
+# ...
+channels = ["conda-forge"]
+platforms = ["osx-arm64", "linux-64"]
+
+[tasks]
+postinstall-e = "pip install --no-build-isolation --no-deps --disable-pip-version-check -e ."
+postinstall = "pip install --no-build-isolation --no-deps --disable-pip-version-check ."
+dev = "uvicorn my_app.app:main --reload"
+serve = "uvicorn my_app.app:main"
+
+[dependencies]
+python = ">=3.12"
+pip = "*"
+pydantic = ">=2"
+fastapi = ">=0.105.0"
+sqlalchemy = ">=2,<3"
+uvicorn = "*"
+aiofiles = "*"
+
+[feature.test.dependencies]
+pytest = "*"
+pytest-md = "*"
+pytest-asyncio = "*"
+[feature.test.tasks]
+test = "pytest --md=report.md"
+
+[environments]
+# both default and prod will have exactly the same dependency versions when they share a dependency
+default = {features = ["test"], solve-group = "prod-group"}
+prod = {features = [], solve-group = "prod-group"}
+
+In ci you would run the following commands: +
pixi run postinstall-e && pixi run test
+
+Locally you would run the following command: +
pixi run postinstall-e && pixi run dev
+

+

Then in a Dockerfile you would run the following command: +

Dockerfile
FROM ghcr.io/prefix-dev/pixi:latest # this doesn't exist yet
+WORKDIR /app
+COPY . .
+RUN pixi run --environment prod postinstall
+EXPOSE 8080
+CMD ["/usr/local/bin/pixi", "run", "--environment", "prod", "serve"]
+

+
+
+Multiple machines from one project +

This is an example of an ML project that should be executable on machines that support cuda or mlx, as well as on machines that support neither; we use the cpu feature for the latter.

+
pixi.toml
[project]
+name = "my-ml-project"
+description = "A project that does ML stuff"
+authors = ["Your Name <your.name@gmail.com>"]
+channels = ["conda-forge", "pytorch"]
+# All platforms that are supported by the project as the features will take the intersection of the platforms defined there.
+platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"]
+
+[tasks]
+train-model = "python train.py"
+evaluate-model = "python test.py"
+
+[dependencies]
+python = "3.11.*"
+pytorch = {version = ">=2.0.1", channel = "pytorch"}
+torchvision = {version = ">=0.15", channel = "pytorch"}
+polars = ">=0.20,<0.21"
+matplotlib-base = ">=3.8.2,<3.9"
+ipykernel = ">=6.28.0,<6.29"
+
+[feature.cuda]
+platforms = ["win-64", "linux-64"]
+channels = ["nvidia", {channel = "pytorch", priority = -1}]
+system-requirements = {cuda = "12.1"}
+
+[feature.cuda.tasks]
+train-model = "python train.py --cuda"
+evaluate-model = "python test.py --cuda"
+
+[feature.cuda.dependencies]
+pytorch-cuda = {version = "12.1.*", channel = "pytorch"}
+
+[feature.mlx]
+platforms = ["osx-arm64"]
+# MLX is only available on macOS >=13.5 (>14.0 is recommended)
+system-requirements = {macos = "13.5"}
+
+[feature.mlx.tasks]
+train-model = "python train.py --mlx"
+evaluate-model = "python test.py --mlx"
+
+[feature.mlx.dependencies]
+mlx = ">=0.16.0,<0.17.0"
+
+[feature.cpu]
+platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"]
+
+[environments]
+cuda = ["cuda"]
+mlx = ["mlx"]
+default = ["cpu"]
+
+
Running the project on a cuda machine
pixi run train-model --environment cuda
+# will execute `python train.py --cuda`
+# fails if not on linux-64 or win-64 with cuda 12.1
+
+
Running the project with mlx
pixi run train-model --environment mlx
+# will execute `python train.py --mlx`
+# fails if not on osx-arm64
+
+
Running the project on a machine without cuda or mlx
pixi run train-model
+
+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/features/multi_platform_configuration/index.html b/v0.39.2/features/multi_platform_configuration/index.html new file mode 100644 index 000000000..d79073910 --- /dev/null +++ b/v0.39.2/features/multi_platform_configuration/index.html @@ -0,0 +1,1893 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Multi platform config - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

Multi Platform

+ +

Pixi's vision includes being supported on all major platforms. Sometimes that needs some extra configuration to work well. +On this page, you will learn what you can configure to align better with the platform you are making your application for.

+

Here is an example manifest file that highlights some of the features:

+
+
+
+
pixi.toml
[project]
+# Default project info....
+# A list of platforms you are supporting with your package.
+platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"]
+
+[dependencies]
+python = ">=3.8"
+
+[target.win-64.dependencies]
+# Overwrite the needed python version only on win-64
+python = "3.7"
+
+
+[activation]
+scripts = ["setup.sh"]
+
+[target.win-64.activation]
+# Overwrite activation scripts only for windows
+scripts = ["setup.bat"]
+
+
+
+
pyproject.toml
[tool.pixi.project]
+# Default project info....
+# A list of platforms you are supporting with your package.
+platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"]
+
+[tool.pixi.dependencies]
+python = ">=3.8"
+
+[tool.pixi.target.win-64.dependencies]
+# Overwrite the needed python version only on win-64
+python = "~=3.7.0"
+
+
+[tool.pixi.activation]
+scripts = ["setup.sh"]
+
+[tool.pixi.target.win-64.activation]
+# Overwrite activation scripts only for windows
+scripts = ["setup.bat"]
+
+
+
+
+

Platform definition#

+

The project.platforms defines which platforms your project supports. +When multiple platforms are defined, pixi determines which dependencies to install for each platform individually. +All of this is stored in a lock file.

+

Running pixi install on a platform that is not configured will warn the user that the project is not set up for that platform:

+
 pixi install
+  × the project is not configured for your current platform
+   ╭─[pixi.toml:6:1]
+ 6  channels = ["conda-forge"]
+ 7  platforms = ["osx-64", "osx-arm64", "win-64"]
+   ·             ────────────────┬────────────────
+   ·                             ╰── add 'linux-64' here
+ 8 │
+   ╰────
+  help: The project needs to be configured to support your platform (linux-64).
+
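If the missing platform should be supported, the fix is to extend the `platforms` list in the manifest. A minimal sketch (the channel and other project fields shown here are assumptions, not taken from the error above):

```toml
[project]
channels = ["conda-forge"]
platforms = ["osx-64", "osx-arm64", "win-64", "linux-64"] # 'linux-64' added
```

Depending on your pixi version, the CLI may also be able to do this for you via a command such as `pixi project platform add linux-64`.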
+

Target specifier#

+

With the target specifier, you can overwrite the original configuration specifically for a single platform. If you target a platform in your target specifier that is not specified in your project.platforms, pixi will throw an error.

+

Dependencies#

+

It might happen that you want to install a certain dependency only on a specific platform, or you might want to use a different version on different platforms.

+
pixi.toml
[dependencies]
+python = ">=3.8"
+
+[target.win-64.dependencies]
+msmpi = "*"
+python = "3.8"
+
+

In the above example, we specify that we depend on msmpi only on Windows. We also specifically want python 3.8 when installing on Windows. This overwrites the python dependency from the generic set of dependencies, and does not touch any of the other platforms.

+

You can use pixi's CLI to add these dependencies to the manifest file.

+
pixi add --platform win-64 posix
+
+

This also works for the host and build dependencies.

+
pixi add --host --platform win-64 posix
+pixi add --build --platform osx-64 clang
+
+

This results in the following:

+
pixi.toml
[target.win-64.host-dependencies]
+posix = "1.0.0.*"
+
+[target.osx-64.build-dependencies]
+clang = "16.0.6.*"
+
+

Activation#

+

Pixi's vision is to enable completely cross-platform projects, but you often need to run tools that are not built by your project. Activation scripts are often in this category: the default scripts on Unix are bash scripts, while on Windows they are bat files.

+

To deal with this, you can define your activation scripts using the target definition.

+

pixi.toml
[activation]
+scripts = ["setup.sh", "local_setup.bash"]
+
+[target.win-64.activation]
+scripts = ["setup.bat", "local_setup.bat"]
+
+When this project is run on win-64, it will only execute the target scripts, not the scripts specified in the default activation.scripts.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/features/system_requirements/index.html b/v0.39.2/features/system_requirements/index.html new file mode 100644 index 000000000..fa7d0451b --- /dev/null +++ b/v0.39.2/features/system_requirements/index.html @@ -0,0 +1,1929 @@ + + + + + + + + + + + + + + + + + + + + + + + + + System Requirements - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

System Requirements in pixi#

+

System requirements define the minimal system specifications necessary during dependency resolution for a project. +For instance, specifying a Unix system with a particular minimal libc version ensures that dependencies are compatible +with the project's environment.

+

System specifications are closely related +to virtual packages, allowing for +flexible and accurate dependency management.

+

Default System Requirements#

+

The following configurations outline the default minimal system requirements for different operating systems:

+
+
+
+
# Default system requirements for Linux
+[system-requirements]
+linux = "4.18"
+libc = { family = "glibc", version = "2.28" }
+
+
+
+

Windows currently has no minimal system requirements defined. If your project requires specific Windows configurations, +you should define them accordingly.

+
+
+
# Default system requirements for macOS
+[system-requirements]
+macos = "13.0"
+
+
+
+
# Default system requirements for macOS ARM64
+[system-requirements]
+macos = "13.0"
+
+
+
+
+

Customizing System Requirements#

+

You only need to define system requirements if your project necessitates a different set from the defaults. +This is common when installing environments on older or newer versions of operating systems.

+

Adjusting for Older Systems#

+

If you're encountering an error like:

+
× The current system has a mismatching virtual package. The project requires '__linux' to be at least version '4.18' but the system has version '4.12.14'
+
+

This indicates that the project's system requirements are higher than your current system's specifications. +To resolve this, you can lower the system requirements in your project's configuration:

+
[system-requirements]
+linux = "4.12.14"
+
+

This adjustment informs the dependency resolver to accommodate the older system version.

+

Using CUDA in pixi#

+

To utilize CUDA in your project, you must specify the desired CUDA version in the system-requirements table. +This ensures that CUDA is recognized and appropriately locked into the lock file if necessary.

+

Example Configuration

+
[system-requirements]
+cuda = "12"  # Replace "12" with the specific CUDA version you intend to use
+
+

Setting Environment-Specific System Requirements#

+

This can be set per feature in the manifest file.

+
[feature.cuda.system-requirements]
+cuda = "12"
+
+[environments]
+cuda = ["cuda"]
+
+

Available Override Options#

+

In certain scenarios, you might need to override the system requirements detected on your machine. +This can be particularly useful when working on systems that do not meet the project's default requirements.

+

You can override virtual packages by setting the following environment variables:

+
    +
  • CONDA_OVERRIDE_CUDA
      +
    • Description: Sets the CUDA version.
    • +
    • Usage Example: CONDA_OVERRIDE_CUDA=11
    • +
    +
  • +
  • CONDA_OVERRIDE_GLIBC
      +
    • Description: Sets the glibc version.
    • +
    • Usage Example: CONDA_OVERRIDE_GLIBC=2.28
    • +
    +
  • +
  • CONDA_OVERRIDE_OSX
      +
    • Description: Sets the macOS version.
    • +
    • Usage Example: CONDA_OVERRIDE_OSX=13.0
    • +
    +
  • +
+
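These variables are set in the shell before invoking pixi. A minimal sketch (the version values below are placeholders, and pixi itself is assumed to be on PATH):

```shell
# Pretend the system provides CUDA 12 and glibc 2.28 during resolution.
export CONDA_OVERRIDE_CUDA=12
export CONDA_OVERRIDE_GLIBC=2.28

# Then run the resolution/installation step, e.g.:
#   pixi install
echo "CUDA override: $CONDA_OVERRIDE_CUDA, glibc override: $CONDA_OVERRIDE_GLIBC"
```

Overrides only affect how virtual packages are detected; they do not install the corresponding system libraries.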

Additional Resources#

+

For more detailed information on managing virtual packages and overriding system requirements, refer to +the Conda Documentation.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/ide_integration/devcontainer/index.html b/v0.39.2/ide_integration/devcontainer/index.html new file mode 100644 index 000000000..bccdae72f --- /dev/null +++ b/v0.39.2/ide_integration/devcontainer/index.html @@ -0,0 +1,1784 @@ + + + + + + + + + + + + + + + + + + + + + + + + + VSCode Devcontainer - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

Use pixi inside of a devcontainer#

+

VSCode Devcontainers are a popular tool to develop on a project with a consistent environment. +They are also used in GitHub Codespaces which makes it a great way to develop on a project without having to install anything on your local machine.

+

To use pixi inside of a devcontainer, follow these steps:

+

Create a new directory .devcontainer in the root of your project. +Then, create the following two files in the .devcontainer directory:

+
.devcontainer/Dockerfile
FROM mcr.microsoft.com/devcontainers/base:jammy
+
+ARG PIXI_VERSION=v0.39.2
+
+RUN curl -L -o /usr/local/bin/pixi -fsSL --compressed "https://github.com/prefix-dev/pixi/releases/download/${PIXI_VERSION}/pixi-$(uname -m)-unknown-linux-musl" \
+    && chmod +x /usr/local/bin/pixi \
+    && pixi info
+
+# set some user and workdir settings to work nicely with vscode
+USER vscode
+WORKDIR /home/vscode
+
+RUN echo 'eval "$(pixi completion -s bash)"' >> /home/vscode/.bashrc
+
+
.devcontainer/devcontainer.json
{
+    "name": "my-project",
+    "build": {
+      "dockerfile": "Dockerfile",
+      "context": "..",
+    },
+    "customizations": {
+      "vscode": {
+        "settings": {},
+        "extensions": ["ms-python.python", "charliermarsh.ruff", "GitHub.copilot"]
+      }
+    },
+    "features": {
+      "ghcr.io/devcontainers/features/docker-in-docker:2": {}
+    },
+    "mounts": ["source=${localWorkspaceFolderBasename}-pixi,target=${containerWorkspaceFolder}/.pixi,type=volume"],
+    "postCreateCommand": "sudo chown vscode .pixi && pixi install"
+}
+
+
+

Put .pixi in a mount

+

In the above example, we mount the .pixi directory into a volume. This is needed since the .pixi directory shouldn't be on a case-insensitive filesystem (the default on macOS and Windows) but instead in its own volume. Some conda packages (for example ncurses-feedstock#73) contain files that differ only in case, which leads to errors on case-insensitive filesystems.

+
+

Secrets#

+

If you want to authenticate to a private conda channel, you can add secrets to your devcontainer.

+
.devcontainer/devcontainer.json
{
+    "build": "Dockerfile",
+    "context": "..",
+    "options": [
+        "--secret",
+        "id=prefix_dev_token,env=PREFIX_DEV_TOKEN",
+    ],
+    // ...
+}
+
+
.devcontainer/Dockerfile
# ...
+RUN --mount=type=secret,id=prefix_dev_token,uid=1000 \
+    test -s /run/secrets/prefix_dev_token \
+    && pixi auth login --token "$(cat /run/secrets/prefix_dev_token)" https://repo.prefix.dev
+
+

These secrets need to be present either as an environment variable when starting the devcontainer locally or in your GitHub Codespaces settings under Secrets.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/ide_integration/jupyterlab/index.html b/v0.39.2/ide_integration/jupyterlab/index.html new file mode 100644 index 000000000..d4778e3a4 --- /dev/null +++ b/v0.39.2/ide_integration/jupyterlab/index.html @@ -0,0 +1,1865 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + JupyterLab Integration - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

JupyterLab

+ +

Basic usage#

+

Using JupyterLab with pixi is very simple. You can just create a new pixi project and add the jupyterlab package to it. The full example is provided under the following GitHub link.

+
pixi init
+pixi add jupyterlab
+
+

This will create a new pixi project and add the jupyterlab package to it. You can then start JupyterLab using the +following command:

+
pixi run jupyter lab
+
+

If you want to add more "kernels" to JupyterLab, you can simply add them to your current project – as well as any dependencies from the scientific stack you might need.

+
pixi add bash_kernel ipywidgets matplotlib numpy pandas  # ...
+
+

What kernels are available?#

+

You can easily install more "kernels" for JupyterLab. The conda-forge repository has a number of interesting additional kernels - not just Python!

+ +

Advanced usage#

+ + +

If you want to have only one instance of JupyterLab running but still want per-directory Pixi environments, you can use +one of the kernels provided by the pixi-kernel +package.

+

Configuring JupyterLab#

+

To get started, create a Pixi project, add jupyterlab and pixi-kernel and then start JupyterLab:

+
pixi init
+pixi add jupyterlab pixi-kernel
+pixi run jupyter lab
+
+

This will start JupyterLab and open it in your browser.

+

JupyterLab launcher screen showing Pixi
+Kernel +JupyterLab launcher screen showing Pixi
+Kernel

+

pixi-kernel searches for a manifest file, either pixi.toml or pyproject.toml, in the same directory as your notebook or in any parent directory. When it finds one, it will use the environment specified in the manifest file to start the kernel and run your notebooks.

+

Binder#

+

If you just want to check a JupyterLab environment running in the cloud using pixi-kernel, you can visit +Binder.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/ide_integration/pycharm/index.html b/v0.39.2/ide_integration/pycharm/index.html new file mode 100644 index 000000000..faa579ce9 --- /dev/null +++ b/v0.39.2/ide_integration/pycharm/index.html @@ -0,0 +1,1910 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + PyCharm Integration - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

PyCharm

+ + + +

You can use PyCharm with pixi environments by using the conda shim provided by the pixi-pycharm package.

+

How to use#

+

To get started, add pixi-pycharm to your pixi project.

+
pixi add pixi-pycharm
+
+

This will ensure that the conda shim is installed in your project's environment.

+

Having pixi-pycharm installed, you can now configure PyCharm to use your pixi environments. +Go to the Add Python Interpreter dialog (bottom right corner of the PyCharm window) and select Conda Environment. +Set Conda Executable to the full path of the conda file (on Windows: conda.bat) which is located in .pixi/envs/default/libexec. +You can get the path using the following command:

+
+
+
+
pixi run 'echo $CONDA_PREFIX/libexec/conda'
+
+
+
+
pixi run 'echo $CONDA_PREFIX\\libexec\\conda.bat'
+
+
+
+
+

This is an executable that tricks PyCharm into thinking it's the proper conda executable. +Under the hood it redirects all calls to the corresponding pixi equivalent.

+
+

Use the conda shim from this pixi project

+

Please make sure that this is the conda shim from this pixi project and not another one. +If you use multiple pixi projects, you might have to adjust the path accordingly as PyCharm remembers the path to the conda executable.

+
+

Add Python Interpreter +Add Python Interpreter

+

Having selected the environment, PyCharm will now use the Python interpreter from your pixi environment.

+

PyCharm should now be able to show you the installed packages as well.

+

PyCharm package list +PyCharm package list

+

You can now run your programs and tests as usual.

+

PyCharm run tests +PyCharm run tests

+
+

Mark .pixi as excluded

+

In order for PyCharm to not get confused about the .pixi directory, please mark it as excluded.

+

Mark Directory as excluded 1 +Mark Directory as excluded 1 +Mark Directory as excluded 2 +Mark Directory as excluded 2

+

Also, when using a remote interpreter, you should exclude the .pixi directory on the remote machine. +Instead, you should run pixi install on the remote machine and select the conda shim from there. +Deployment exclude from remote machine +Deployment exclude from remote machine

+
+

Multiple environments#

+

If your project uses multiple environments to test different Python versions or dependencies, you can add multiple environments to PyCharm by specifying Use existing environment in the Add Python Interpreter dialog.

+

Multiple pixi environments +Multiple pixi environments

+

You can then specify the corresponding environment in the bottom right corner of the PyCharm window.

+

Specify environment +Specify environment

+

Multiple pixi projects#

+

When using multiple pixi projects, remember to select the correct Conda Executable for each project, as mentioned above. It might also happen that you have multiple environments with the same name.

+

Multiple default environments +Multiple default environments

+

It is recommended to rename the environments to something unique.

+

Debugging#

+

Logs are written to ~/.cache/pixi-pycharm.log. +You can use them to debug problems. +Please attach the logs when filing a bug report.
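A quick way to inspect the log from the shell — a small sketch using the path stated above; on a fresh setup the log may not exist yet:

```shell
LOG_FILE="$HOME/.cache/pixi-pycharm.log"

# Print the most recent entries if the shim has logged anything yet.
if [ -f "$LOG_FILE" ]; then
  tail -n 20 "$LOG_FILE"
else
  echo "no log yet at $LOG_FILE"
fi
```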

+

Install as an optional dependency#

+

In some cases, you might only want to install pixi-pycharm on your local dev-machines but not in production. +To achieve this, we can use multiple environments.

+
[project]
+name = "multi-env"
+version = "0.1.0"
+requires-python = ">=3.12"
+dependencies = ["numpy"]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["linux-64"]
+
+[tool.pixi.feature.lint.dependencies]
+ruff =  "*"
+
+[tool.pixi.feature.dev.dependencies]
+pixi-pycharm = "*"
+
+[tool.pixi.environments]
+# The production environment is the default feature set.
+# Adding a solve group to make sure the same versions are used in the `default` and `prod` environments.
+prod = { solve-group = "main" }
+
+# Setup the default environment to include the dev features.
+# By using `default` instead of `dev` you'll not have to specify the `--environment` flag when running `pixi run`.
+default = { features = ["dev"], solve-group = "main" }
+
+# The lint environment doesn't need the default feature set but only the `lint` feature
+# and thus can also be excluded from the solve group.
+lint = { features = ["lint"], no-default-feature = true }
+
+

Now you, as a user, can run pixi shell, which will start the default environment. +In production, you then just run pixi run -e prod COMMAND, and the minimal prod environment is installed.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/ide_integration/r_studio/index.html b/v0.39.2/ide_integration/r_studio/index.html new file mode 100644 index 000000000..39a942f83 --- /dev/null +++ b/v0.39.2/ide_integration/r_studio/index.html @@ -0,0 +1,1805 @@ + + + + + + + + + + + + + + + + + + + + + + + + + RStudio - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

Developing R scripts in RStudio#

+

You can use pixi to manage your R dependencies. The conda-forge channel contains a wide range of R packages that can be installed using pixi.

+

Installing R packages#

+

R packages are usually prefixed with r- in the conda-forge channel. To install an R package, you can use the following command:

+
pixi add r-<package-name>
+# for example
+pixi add r-ggplot2
+
+
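As a rule of thumb, the conda-forge name is the lowercased CRAN name with an r- prefix. A tiny helper sketching that mapping (the helper name is hypothetical, and a few packages deviate from this pattern):

```shell
# Hypothetical helper: derive the conda-forge name from a CRAN package name.
# Most R packages follow this lowercase, r-prefixed convention.
cran_to_conda() {
  printf 'r-%s\n' "$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')"
}

cran_to_conda ggplot2   # r-ggplot2
cran_to_conda Matrix    # r-matrix
```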

Using R packages in RStudio#

+

To use the R packages installed by pixi in RStudio, you need to run rstudio from an activated environment. This can be achieved by running RStudio from pixi shell or from a task in the pixi.toml file.

+

Full example#

+

The full example can be found here: RStudio example. +Here is an example of a pixi.toml file that sets up an RStudio task:

+
[project]
+name = "r"
+channels = ["conda-forge"]
+platforms = ["linux-64", "osx-64", "osx-arm64"]
+
+[target.linux.tasks]
+rstudio = "rstudio"
+
+[target.osx.tasks]
+rstudio = "open -a rstudio"
+# or alternatively with the full path:
+# rstudio = "/Applications/RStudio.app/Contents/MacOS/RStudio"
+
+[dependencies]
+r = ">=4.3,<5"
+r-ggplot2 = ">=3.5.0,<3.6"
+
+

Once RStudio has loaded, you can execute the following R code that uses the ggplot2 package:

+
# Load the ggplot2 package
+library(ggplot2)
+
+# Load the built-in 'mtcars' dataset
+data <- mtcars
+
+# Create a scatterplot of 'mpg' vs 'wt'
+ggplot(data, aes(x = wt, y = mpg)) +
+  geom_point() +
+  labs(x = "Weight (1000 lbs)", y = "Miles per Gallon") +
+  ggtitle("Fuel Efficiency vs. Weight")
+
+
+

Note

+

This example assumes that you have installed RStudio system-wide. +We are working on updating RStudio as well as the R interpreter builds on Windows for maximum compatibility with pixi.

+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/index.html b/v0.39.2/index.html new file mode 100644 index 000000000..3866ea05d --- /dev/null +++ b/v0.39.2/index.html @@ -0,0 +1,2179 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Getting Started - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

Getting Started#

+

Pixi with magic wand

+

Pixi is a package management tool for developers. +It allows the developer to install libraries and applications in a reproducible way. +Pixi works cross-platform, on Windows, macOS and Linux.

+

Installation#

+

To install pixi you can run the following command in your terminal:

+
+
+
+
curl -fsSL https://pixi.sh/install.sh | bash
+
+

The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to ~/.pixi/bin. +If this directory does not already exist, the script will create it.

+

The script will also update your ~/.bash_profile to include ~/.pixi/bin in your PATH, allowing you to invoke the pixi command from anywhere.
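The effect of that PATH update can be sketched as follows; the exported line mirrors what the install script appends for the default install location:

```shell
# The line the installer appends to your shell profile (default install location):
BIN_DIR="$HOME/.pixi/bin"
export PATH="$BIN_DIR:$PATH"

# Any command lookup now also searches ~/.pixi/bin:
case ":$PATH:" in
  *":$BIN_DIR:"*) echo "pixi bin dir is on PATH" ;;   # → pixi bin dir is on PATH
  *)              echo "pixi bin dir is missing" ;;
esac
```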

+
+
+

PowerShell: +

iwr -useb https://pixi.sh/install.ps1 | iex
+
+winget: +
winget install prefix-dev.pixi
+
+The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to LocalAppData/pixi/bin. +If this directory does not already exist, the script will create it.

+

The command will also automatically add LocalAppData/pixi/bin to your path allowing you to invoke pixi from anywhere.

+
+
+
+
+

Tip

+

You might need to restart your terminal or source your shell for the changes to take effect.

+
+

You can find more options for the installation script here.

+

Autocompletion#

+

To get autocompletion follow the instructions for your shell. +Afterwards, restart the shell or source the shell config file.

+

Bash (default on most Linux systems)#

+

Add the following to the end of ~/.bashrc:

+
~/.bashrc
eval "$(pixi completion --shell bash)"
+
+

Zsh (default on macOS)#

+

Add the following to the end of ~/.zshrc:

+
~/.zshrc
eval "$(pixi completion --shell zsh)"
+
+

PowerShell (pre-installed on all Windows systems)#

+

Add the following to the end of Microsoft.PowerShell_profile.ps1. +You can check the location of this file by querying the $PROFILE variable in PowerShell. +Typically the path is ~\Documents\PowerShell\Microsoft.PowerShell_profile.ps1 or +~/.config/powershell/Microsoft.PowerShell_profile.ps1 on *nix.

+
(& pixi completion --shell powershell) | Out-String | Invoke-Expression
+
+

Fish#

+

Add the following to the end of ~/.config/fish/config.fish:

+
~/.config/fish/config.fish
pixi completion --shell fish | source
+
+

Nushell#

+

Add the following to the end of your Nushell env file (find it by running $nu.env-path in Nushell):

+
mkdir ~/.cache/pixi
+pixi completion --shell nushell | save -f ~/.cache/pixi/completions.nu
+
+

And add the following to the end of your Nushell configuration (find it by running $nu.config-path):

+
use ~/.cache/pixi/completions.nu *
+
+

Elvish#

+

Add the following to the end of ~/.elvish/rc.elv:

+
~/.elvish/rc.elv
eval (pixi completion --shell elvish | slurp)
+
+

Alternative installation methods#

+

Although we recommend installing pixi through the above method, we also provide additional installation methods.

+

Homebrew#

+

Pixi is available via Homebrew. To install pixi via Homebrew, simply run:

+
brew install pixi
+
+

Windows installer#

+

We provide an msi installer on our GitHub releases page. +The installer will download pixi and add it to the path.

+

Install from source#

+

pixi is 100% written in Rust, and therefore it can be installed, built and tested with cargo. +To start using pixi from a source build run:

+
cargo install --locked --git https://github.com/prefix-dev/pixi.git pixi
+
+

We don't publish to crates.io anymore, so you need to install it from the repository. +The reason for this is that we depend on some unpublished crates, which prevents us from publishing to crates.io.

+

or when you want to make changes use:

+
cargo build
+cargo test
+
+

If you have any issues building because of the dependency on rattler, check out +its compile steps.

+

Installer script options#

+
+
+
+

The installation script has several options that can be manipulated through environment variables.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
VariableDescriptionDefault Value
PIXI_VERSIONThe version of pixi getting installed, can be used to up- or down-grade.latest
PIXI_HOMEThe location of the binary folder.$HOME/.pixi
PIXI_ARCHThe architecture the pixi version was built for.uname -m
PIXI_NO_PATH_UPDATEIf set the $PATH will not be updated to add pixi to it.
TMP_DIRThe temporary directory the script uses to download to and unpack the binary from./tmp
+

For example, on Apple Silicon, you can force the installation of the x86 version: +

curl -fsSL https://pixi.sh/install.sh | PIXI_ARCH=x86_64 bash
+
+Or set the version +
curl -fsSL https://pixi.sh/install.sh | PIXI_VERSION=v0.18.0 bash
+

+
+
+

The installation script has several options that can be manipulated through environment variables.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
VariableEnvironment variableDescriptionDefault Value
PixiVersionPIXI_VERSIONThe version of pixi getting installed, can be used to up- or down-grade.latest
PixiHomePIXI_HOMEThe location of the installation.$Env:USERPROFILE\.pixi
NoPathUpdatePIXI_NO_PATH_UPDATEIf set, the $PATH will not be updated to add pixi to it.
+

For example, set the version using:

+
iwr -useb https://pixi.sh/install.ps1 | iex -Args "-PixiVersion v0.18.0"
+
+
+
+
+

Update#

+

Updating is as simple as installing: rerunning the installation script gets you the latest version.

+

pixi self-update
+
+Or get a specific pixi version using: +
pixi self-update --version x.y.z
+

+
+

Note

+

If you've used a package manager like brew, mamba, conda, paru etc. to install pixi, +you must use that package manager's own update mechanism to update pixi, e.g. brew upgrade pixi.

+
+

Uninstall#

+

To uninstall pixi from your system, simply remove the binary.

+
+
+
+
rm ~/.pixi/bin/pixi
+
+
+
+
$PIXI_BIN = "$Env:LocalAppData\pixi\bin\pixi"; Remove-Item -Path $PIXI_BIN
+
+
+
+
+

After this command, you can still use the tools you installed with pixi. +To remove these as well, just remove the whole ~/.pixi directory and remove the directory from your path.
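Removing the PATH entry the installer added can be sketched like this; the sed pattern is an assumption matching the default install directory, and the demo edits a temporary copy rather than your real shell profile:

```shell
# Demo on a temporary file standing in for ~/.bashrc (so nothing real is touched):
rc="$(mktemp)"
printf '%s\n' 'alias ll="ls -l"' 'export PATH="$HOME/.pixi/bin:$PATH"' > "$rc"

# Delete the PATH line the installer added (pattern matches the default install dir):
sed -i.bak '\#\.pixi/bin#d' "$rc"

cat "$rc"   # only the alias line remains
```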

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/install.ps1 b/v0.39.2/install.ps1 new file mode 100644 index 000000000..9de9f2fe1 --- /dev/null +++ b/v0.39.2/install.ps1 @@ -0,0 +1,207 @@ +<# +.SYNOPSIS + Pixi install script. +.DESCRIPTION + This script is used to install Pixi on Windows from the command line. +.PARAMETER PixiVersion + Specifies the version of Pixi to install. + The default value is 'latest'. You can also specify it by setting the + environment variable 'PIXI_VERSION'. +.PARAMETER PixiHome + Specifies Pixi's home directory. + The default value is '$Env:USERPROFILE\.pixi'. You can also specify it by + setting the environment variable 'PIXI_HOME'. +.PARAMETER NoPathUpdate + If specified, the script will not update the PATH environment variable. +.LINK + https://pixi.sh +.LINK + https://github.com/prefix-dev/pixi +.NOTES + Version: v0.39.2 +#> +param ( + [string] $PixiVersion = 'latest', + [string] $PixiHome = "$Env:USERPROFILE\.pixi", + [switch] $NoPathUpdate +) + +Set-StrictMode -Version Latest + +function Publish-Env { + if (-not ("Win32.NativeMethods" -as [Type])) { + Add-Type -Namespace Win32 -Name NativeMethods -MemberDefinition @" +[DllImport("user32.dll", SetLastError = true, CharSet = CharSet.Auto)] +public static extern IntPtr SendMessageTimeout( + IntPtr hWnd, uint Msg, UIntPtr wParam, string lParam, + uint fuFlags, uint uTimeout, out UIntPtr lpdwResult); +"@ + } + + $HWND_BROADCAST = [IntPtr] 0xffff + $WM_SETTINGCHANGE = 0x1a + $result = [UIntPtr]::Zero + + [Win32.Nativemethods]::SendMessageTimeout($HWND_BROADCAST, + $WM_SETTINGCHANGE, + [UIntPtr]::Zero, + "Environment", + 2, + 5000, + [ref] $result + ) | Out-Null +} + +function Write-Env { + param( + [String] $name, + [String] $val, + [Switch] $global + ) + + $RegisterKey = if ($global) { + Get-Item -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager' + } else { + Get-Item -Path 'HKCU:' + } + + $EnvRegisterKey = $RegisterKey.OpenSubKey('Environment', $true) 
+ if ($null -eq $val) { + $EnvRegisterKey.DeleteValue($name) + } else { + $RegistryValueKind = if ($val.Contains('%')) { + [Microsoft.Win32.RegistryValueKind]::ExpandString + } elseif ($EnvRegisterKey.GetValue($name)) { + $EnvRegisterKey.GetValueKind($name) + } else { + [Microsoft.Win32.RegistryValueKind]::String + } + $EnvRegisterKey.SetValue($name, $val, $RegistryValueKind) + } + Publish-Env +} + +function Get-Env { + param( + [String] $name, + [Switch] $global + ) + + $RegisterKey = if ($global) { + Get-Item -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager' + } else { + Get-Item -Path 'HKCU:' + } + + $EnvRegisterKey = $RegisterKey.OpenSubKey('Environment') + $RegistryValueOption = [Microsoft.Win32.RegistryValueOptions]::DoNotExpandEnvironmentNames + $EnvRegisterKey.GetValue($name, $null, $RegistryValueOption) +} + +function Get-TargetTriple() { + try { + # NOTE: this might return X64 on ARM64 Windows, which is OK since emulation is available. + # It works correctly starting in PowerShell Core 7.3 and Windows PowerShell in Win 11 22H2. + # Ideally this would just be + # [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture + # but that gets a type from the wrong assembly on Windows PowerShell (i.e. 
not Core) + $a = [System.Reflection.Assembly]::LoadWithPartialName("System.Runtime.InteropServices.RuntimeInformation") + $t = $a.GetType("System.Runtime.InteropServices.RuntimeInformation") + $p = $t.GetProperty("OSArchitecture") + # Possible OSArchitecture Values: https://learn.microsoft.com/dotnet/api/system.runtime.interopservices.architecture + # Rust supported platforms: https://doc.rust-lang.org/stable/rustc/platform-support.html + switch ($p.GetValue($null).ToString()) + { + "X86" { return "i686-pc-windows-msvc" } + "X64" { return "x86_64-pc-windows-msvc" } + "Arm" { return "thumbv7a-pc-windows-msvc" } + "Arm64" { return "aarch64-pc-windows-msvc" } + } + } catch { + # The above was added in .NET 4.7.1, so Windows PowerShell in versions of Windows + # prior to Windows 10 v1709 may not have this API. + Write-Verbose "Get-TargetTriple: Exception when trying to determine OS architecture." + Write-Verbose $_ + } + + # This is available in .NET 4.0. We already checked for PS 5, which requires .NET 4.5. + Write-Verbose("Get-TargetTriple: falling back to Is64BitOperatingSystem.") + if ([System.Environment]::Is64BitOperatingSystem) { + return "x86_64-pc-windows-msvc" + } else { + return "i686-pc-windows-msvc" + } +} + +if ($Env:PIXI_VERSION) { + $PixiVersion = $Env:PIXI_VERSION +} + +if ($Env:PIXI_HOME) { + $PixiHome = $Env:PIXI_HOME +} + +if ($Env:PIXI_NO_PATH_UPDATE) { + $NoPathUpdate = $true +} + +# Repository name +$REPO = 'prefix-dev/pixi' +$ARCH = Get-TargetTriple + +if (-not @("x86_64-pc-windows-msvc", "aarch64-pc-windows-msvc") -contains $ARCH) { + throw "ERROR: could not find binaries for this platform ($ARCH)." 
+} + +$BINARY = "pixi-$ARCH" + +if ($PixiVersion -eq 'latest') { + $DOWNLOAD_URL = "https://github.com/$REPO/releases/latest/download/$BINARY.zip" +} else { + $DOWNLOAD_URL = "https://github.com/$REPO/releases/download/$PixiVersion/$BINARY.zip" +} + +$BinDir = Join-Path $PixiHome 'bin' + +Write-Host "This script will automatically download and install Pixi ($PixiVersion) for you." +Write-Host "Getting it from this url: $DOWNLOAD_URL" +Write-Host "The binary will be installed into '$BinDir'" + +$TEMP_FILE = [System.IO.Path]::GetTempFileName() + +try { + Invoke-WebRequest -Uri $DOWNLOAD_URL -OutFile $TEMP_FILE + + # Create the install dir if it doesn't exist + if (!(Test-Path -Path $BinDir)) { + New-Item -ItemType Directory -Path $BinDir | Out-Null + } + + $ZIP_FILE = $TEMP_FILE + ".zip" + Rename-Item -Path $TEMP_FILE -NewName $ZIP_FILE + + # Extract pixi from the downloaded zip file + Expand-Archive -Path $ZIP_FILE -DestinationPath $BinDir -Force +} catch { + Write-Host "Error: '$DOWNLOAD_URL' is not available or failed to download" + exit 1 +} finally { + Remove-Item -Path $ZIP_FILE +} + +# Add pixi to PATH if the folder is not already in the PATH variable +if (!$NoPathUpdate) { + $PATH = Get-Env 'PATH' + if ($PATH -notlike "*$BinDir*") { + Write-Output "Adding $BinDir to PATH" + # For future sessions + Write-Env -name 'PATH' -val "$BinDir;$PATH" + # For current session + $Env:PATH = "$BinDir;$PATH" + Write-Output "You may need to restart your shell" + } else { + Write-Output "$BinDir is already in PATH" + } +} else { + Write-Output "You may need to update your PATH manually to use pixi" +} diff --git a/v0.39.2/install.sh b/v0.39.2/install.sh new file mode 100644 index 000000000..fe9dee837 --- /dev/null +++ b/v0.39.2/install.sh @@ -0,0 +1,162 @@ +#!/usr/bin/env bash +set -euo pipefail +# Version: v0.39.2 + +__wrap__() { + +VERSION="${PIXI_VERSION:-latest}" +PIXI_HOME="${PIXI_HOME:-$HOME/.pixi}" +PIXI_HOME="${PIXI_HOME/#\~/$HOME}" +BIN_DIR="$PIXI_HOME/bin" + 
+REPO="prefix-dev/pixi" +PLATFORM="$(uname -s)" +ARCH="${PIXI_ARCH:-$(uname -m)}" + +if [[ $PLATFORM == "Darwin" ]]; then + PLATFORM="apple-darwin" +elif [[ $PLATFORM == "Linux" ]]; then + PLATFORM="unknown-linux-musl" +elif [[ $(uname -o) == "Msys" ]]; then + PLATFORM="pc-windows-msvc" +fi + +if [[ $ARCH == "arm64" ]] || [[ $ARCH == "aarch64" ]]; then + ARCH="aarch64" +fi + + + +BINARY="pixi-${ARCH}-${PLATFORM}" +EXTENSION="tar.gz" +if [[ $(uname -o) == "Msys" ]]; then + EXTENSION="zip" +fi + +if [[ $VERSION == "latest" ]]; then + DOWNLOAD_URL="https://github.com/${REPO}/releases/latest/download/${BINARY}.${EXTENSION}" +else + DOWNLOAD_URL="https://github.com/${REPO}/releases/download/${VERSION}/${BINARY}.${EXTENSION}" +fi + +printf "This script will automatically download and install Pixi (${VERSION}) for you.\nGetting it from this url: $DOWNLOAD_URL\n" + +if ! hash curl 2> /dev/null && ! hash wget 2> /dev/null; then + echo "error: you need either 'curl' or 'wget' installed for this script." + exit 1 +fi + +if ! hash tar 2> /dev/null; then + echo "error: you do not have 'tar' installed which is required for this script." + exit 1 +fi + +TEMP_FILE="$(mktemp "${TMPDIR:-/tmp}/.pixi_install.XXXXXXXX")" + +cleanup() { + rm -f "$TEMP_FILE" +} + +trap cleanup EXIT + +# Test if stdout is a terminal before showing progress +if [[ ! -t 1 ]]; then + CURL_OPTIONS="--silent" # --no-progress-meter is better, but only available in 7.67+ + WGET_OPTIONS="--no-verbose" +else + CURL_OPTIONS="--no-silent" + WGET_OPTIONS="--show-progress" +fi + +if hash curl 2> /dev/null; then + HTTP_CODE="$(curl -SL $CURL_OPTIONS "$DOWNLOAD_URL" --output "$TEMP_FILE" --write-out "%{http_code}")" + if [[ "${HTTP_CODE}" -lt 200 || "${HTTP_CODE}" -gt 299 ]]; then + echo "error: '${DOWNLOAD_URL}' is not available" + exit 1 + fi +elif hash wget 2> /dev/null; then + if ! 
wget $WGET_OPTIONS --output-document="$TEMP_FILE" "$DOWNLOAD_URL"; then + echo "error: '${DOWNLOAD_URL}' is not available" + exit 1 + fi +fi + +# Check that file was correctly created (https://github.com/prefix-dev/pixi/issues/446) +if [[ ! -s "$TEMP_FILE" ]]; then + echo "error: temporary file ${TEMP_FILE} not correctly created." + echo " As a workaround, you can try set TMPDIR env variable to directory with write permissions." + exit 1 +fi + +# Extract pixi from the downloaded file +mkdir -p "$BIN_DIR" +if [[ "$(uname -o)" == "Msys" ]]; then + unzip "$TEMP_FILE" -d "$BIN_DIR" +else + # Extract to a temporary directory first + TEMP_DIR=$(mktemp -d) + tar -xzf "$TEMP_FILE" -C "$TEMP_DIR" + + # Find and move the `pixi` binary, making sure to handle the case where it's in a subdirectory + if [[ -f "$TEMP_DIR/pixi" ]]; then + mv "$TEMP_DIR/pixi" "$BIN_DIR/" + else + mv "$(find "$TEMP_DIR" -type f -name pixi)" "$BIN_DIR/" + fi + + chmod +x "$BIN_DIR/pixi" + rm -rf "$TEMP_DIR" +fi + +echo "The 'pixi' binary is installed into '${BIN_DIR}'" + +update_shell() { + FILE="$1" + LINE="$2" + + # shell update can be suppressed by `PIXI_NO_PATH_UPDATE` env var + [[ ! -z "${PIXI_NO_PATH_UPDATE:-}" ]] && echo "No path update because PIXI_NO_PATH_UPDATE has a value" && return + + # Create the file if it doesn't exist + if [ -f "$FILE" ]; then + touch "$FILE" + fi + + # Append the line if not already present + if ! grep -Fxq "$LINE" "$FILE" + then + echo "Updating '${FILE}'" + echo "$LINE" >> "$FILE" + echo "Please restart or source your shell." + fi +} + +case "$(basename "$SHELL")" in + bash) + # Default to bashrc as that is used in non login shells instead of the profile. 
+ LINE="export PATH=\"${BIN_DIR}:\$PATH\"" + update_shell ~/.bashrc "$LINE" + ;; + + fish) + LINE="fish_add_path ${BIN_DIR}" + update_shell ~/.config/fish/config.fish "$LINE" + ;; + + zsh) + LINE="export PATH=\"${BIN_DIR}:\$PATH\"" + update_shell ~/.zshrc "$LINE" + ;; + + tcsh) + LINE="set path = ( ${BIN_DIR} \$path )" + update_shell ~/.tcshrc "$LINE" + ;; + + *) + echo "Could not update shell: $(basename "$SHELL")" + echo "Please permanently add '${BIN_DIR}' to your \$PATH to enable the 'pixi' command." + ;; +esac + +}; __wrap__ diff --git a/v0.39.2/overrides/partials/footer.html b/v0.39.2/overrides/partials/footer.html new file mode 100644 index 000000000..784834f80 --- /dev/null +++ b/v0.39.2/overrides/partials/footer.html @@ -0,0 +1,104 @@ + + + + diff --git a/v0.39.2/packaging/index.html b/v0.39.2/packaging/index.html new file mode 100644 index 000000000..1be2928c6 --- /dev/null +++ b/v0.39.2/packaging/index.html @@ -0,0 +1,1803 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Packaging pixi - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

Packaging

+ +

This is a guide for distribution maintainers wanting to package pixi for a different package manager. +Users of pixi can ignore this page.

+

Building#

+

Pixi is written in Rust and compiled using Cargo, which are needed as compile-time dependencies. +At runtime pixi needs no dependencies other than the runtime it was compiled against (libc, ...).

+

To build pixi run +

cargo build --locked --profile dist
+
+Instead of using the predefined dist profile, which is optimized for binary size, you can also pass other options to +let cargo optimize the binary for other metrics.

+

Build-time Options#

+

Pixi provides some compile-time options which can influence the build.

+

TLS#

+

By default, pixi is built with the Rustls TLS implementation. You can compile pixi using the platform-native TLS implementation +by adding --no-default-features --features native-tls to the build command. Note that this might add additional +runtime dependencies, such as OpenSSL on Linux.

+

Self-Update#

+

Pixi has self-update functionality. When pixi is installed using another package manager, one usually doesn't want pixi +to try to update itself, and instead wants it to be updated by the package manager. +For this reason the self-update feature is disabled by default. It can be enabled by adding --features self_update to +the build command.

+

When the self-update feature is disabled and a user tries to run pixi self-update an error message is displayed. This +message can be customized by setting the PIXI_SELF_UPDATE_DISABLED_MESSAGE environment variable at build time to point +the user to the package manager they should be using to update pixi. +

PIXI_SELF_UPDATE_DISABLED_MESSAGE="`self-update` has been disabled for this build. Run `brew upgrade pixi` instead" cargo build --locked --profile dist
+

+

Custom version#

+

You can specify a custom version string to be used in the --version output by setting the PIXI_VERSION environment variable during the build.

+
PIXI_VERSION="HEAD-123456" cargo build --locked --profile dist
+
+

Shell completion#

+

After building pixi you can generate shell autocompletion scripts by running +

pixi completion --shell <SHELL>
+
+and saving the output to a file. +Currently supported shells are bash, elvish, fish, nushell, powershell and zsh.

+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/reference/cli/index.html b/v0.39.2/reference/cli/index.html new file mode 100644 index 000000000..b90457544 --- /dev/null +++ b/v0.39.2/reference/cli/index.html @@ -0,0 +1,4180 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + Commands - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + +
+
+
+ + + +
+
+
+ + + +
+
+ + + + + + + + + + + + +

CLI

+ +

Global options#

+
    +
  • --verbose (-v|vv|vvv) Increase the verbosity of the output messages, the -v|vv|vvv increases the level of verbosity respectively.
  • +
  • --help (-h) Shows help information, use -h to get the short version of the help.
  • +
  • --version (-V): shows the version of pixi that is used.
  • +
  • --quiet (-q): Decreases the amount of output.
  • +
  • --color <COLOR>: Whether the log needs to be colored [env: PIXI_COLOR=] [default: auto] [possible values: always, never, auto]. + Pixi also honors the FORCE_COLOR and NO_COLOR environment variables. + They both take precedence over --color and PIXI_COLOR.
  • +
  • --no-progress: Disables the progress bar.[env: PIXI_NO_PROGRESS] [default: false]
  • +
+

init#

+

This command is used to create a new project. +It initializes a pixi.toml file and also prepares a .gitignore to prevent the environment from being added to git.

+

It also supports the pyproject.toml file: if you have a pyproject.toml file in the directory where you run pixi init, it appends the pixi data to the pyproject.toml instead of creating a new pixi.toml file.

+
Arguments#
+
    +
  1. [PATH]: Where to place the project (defaults to current path) [default: .]
  2. +
+
Options#
+
    +
  • --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
  • +
  • --platform <PLATFORM> (-p): Specify a platform that the project supports. (Allowed to be used more than once)
  • +
  • --import <ENV_FILE> (-i): Import an existing conda environment file, e.g. environment.yml.
  • +
  • --format <FORMAT>: Specify the format of the project file, either pyproject or pixi. [default: pixi]
  • +
  • --scm <SCM>: Specify the SCM used to manage the project with. Possible values: github, gitlab, codeberg. [default: github]
  • +
+
+

Importing an environment.yml

+
+

When importing an environment, the pixi.toml will be created with the dependencies from the environment file. + The pixi.lock will be created when you install the environment. + We don't support git+ URLs as dependencies for pip packages. + For the defaults channel, we use main, r and msys2 as the default channels.

+
pixi init myproject
+pixi init ~/myproject
+pixi init  # Initializes directly in the current directory.
+pixi init --channel conda-forge --channel bioconda myproject
+pixi init --platform osx-64 --platform linux-64 myproject
+pixi init --import environment.yml
+pixi init --format pyproject
+pixi init --format pixi --scm gitlab
+
+

add#

+

Adds dependencies to the manifest file. +It will only add dependencies compatible with the rest of the dependencies in the project. +More info on multi-platform configuration.

+

If the project manifest is a pyproject.toml, by default, adding a pypi dependency will add it to the native project.dependencies array, or to the native dependency-groups table if a feature is specified:

+
    +
  • pixi add --pypi boto3 would add boto3 to the project.dependencies array
  • +
  • pixi add --pypi boto3 --feature aws would add boto3 to the dependency-groups.aws array
  • +
+
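For a pyproject.toml manifest, the two commands above would roughly yield the following entries (a sketch; other keys in the file are omitted):

```toml
[project]
# pixi add --pypi boto3
dependencies = ["boto3"]

[dependency-groups]
# pixi add --pypi boto3 --feature aws
aws = ["boto3"]
```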

Note that if --platform or --editable are specified, the pypi dependency +will be added to the tool.pixi.pypi-dependencies table instead as native +arrays have no support for platform-specific or editable dependencies.

+

These dependencies will be read by pixi as if they had been added to the pixi pypi-dependencies tables of the default or a named feature.

+
Arguments#
+
    +
  1. [SPECS]: The package(s) to add, space separated. The version constraint is optional.
  2. +
+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
  • +
  • --host: Specifies a host dependency, important for building a package.
  • +
  • --build: Specifies a build dependency, important for building a package.
  • +
  • --pypi: Specifies a PyPI dependency, not a conda package. + Parses dependencies as PEP508 requirements, supporting extras and versions. + See configuration for details.
  • +
  • --no-install: Don't install the package to the environment, only add the package to the lock-file.
  • +
  • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
  • +
  • --platform <PLATFORM> (-p): The platform for which the dependency should be added. (Allowed to be used more than once)
  • +
  • --feature <FEATURE> (-f): The feature for which the dependency should be added.
  • +
  • --editable: Specifies an editable dependency; only used in combination with --pypi.
  • +
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • +
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
  • +
+
pixi add numpy # (1)!
+pixi add numpy pandas "pytorch>=1.8" # (2)!
+pixi add "numpy>=1.22,<1.24" # (3)!
+pixi add --manifest-path ~/myproject/pixi.toml numpy # (4)!
+pixi add --host "python>=3.9.0" # (5)!
+pixi add --build cmake # (6)!
+pixi add --platform osx-64 clang # (7)!
+pixi add --no-install numpy # (8)!
+pixi add --no-lockfile-update numpy # (9)!
+pixi add --feature featurex numpy # (10)!
+
+# Add a pypi dependency
+pixi add --pypi requests[security] # (11)!
+pixi add --pypi Django==5.1rc1 # (12)!
+pixi add --pypi "boltons>=24.0.0" --feature lint # (13)!
+pixi add --pypi "boltons @ https://files.pythonhosted.org/packages/46/35/e50d4a115f93e2a3fbf52438435bb2efcf14c11d4fcd6bdcd77a6fc399c9/boltons-24.0.0-py3-none-any.whl" # (14)!
+pixi add --pypi "exchangelib @ git+https://github.com/ecederstrand/exchangelib" # (15)!
+pixi add --pypi "project @ file:///absolute/path/to/project" # (16)!
+pixi add --pypi "project@file:///absolute/path/to/project" --editable # (17)!
+
+
    +
  1. This will add the numpy package to the project with the latest version available for the solved environment.
  2. This will add multiple packages to the project, solving them all together.
  3. This will add the numpy package with a version constraint.
  4. This will add the numpy package to the project of the manifest file at the given path.
  5. This will add the python package as a host dependency. There is currently no different behavior for host dependencies.
  6. This will add the cmake package as a build dependency. There is currently no different behavior for build dependencies.
  7. This will add the clang package only for the osx-64 platform.
  8. This will add the numpy package to the manifest and lockfile, without installing it in an environment.
  9. This will add the numpy package to the manifest without updating the lockfile or installing it in the environment.
  10. This will add the numpy package in the feature featurex.
  11. This will add the requests package as a pypi dependency with the security extra.
  12. This will add the pre-release version of Django to the project as a pypi dependency.
  13. This will add the boltons package in the feature lint as a pypi dependency.
  14. This will add the boltons package with the given url as a pypi dependency.
  15. This will add the exchangelib package with the given git url as a pypi dependency.
  16. This will add the project package with the given file url as a pypi dependency.
  17. This will add the project package with the given file url as an editable package as a pypi dependency.
+
+

Tip

+

If you want to use a non-default pinning strategy, you can set it using pixi's configuration.

pixi config set pinning-strategy no-pin --global
+
The default is semver, which pins dependencies to the latest major version, or to the minor version for v0 releases.

+
+

Note

+

There is an exception to this rule: when you add a package we consider non-semver, the minor strategy is used instead. These are the packages currently treated as non-semver: Python, Rust, Julia, GCC, GXX, GFortran, NodeJS, Deno, R, R-Base, Perl.

+
+
+
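For illustration, here is roughly what the default semver strategy writes to the manifest. The exact versions depend on what the solver finds, and some-v0-package is a hypothetical name:

```toml
[dependencies]
# `pixi add numpy` pins up to the next major version:
numpy = ">=2.1.2,<3"
# python is on the non-semver list, so it is pinned to the minor version:
python = ">=3.13.0,<3.14"
# a hypothetical v0 package is also pinned to the minor version:
some-v0-package = ">=0.4.1,<0.5"
```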

install#

+

Installs an environment based on the manifest file. +If there is no pixi.lock file or it is not up-to-date with the manifest file, it will (re-)generate the lock file.

+

pixi install only installs one environment at a time; if you have multiple environments, you can select the right one with the --environment flag. If you don't provide an environment, the default environment will be installed.

+

Running pixi install is not required before running other commands. All commands that interact with the environment first run the install step if the environment is not ready, to make sure you always run in a correct state, e.g. pixi run, pixi shell, pixi shell-hook, pixi add, and pixi remove, to name a few.

+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --frozen: install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --environment <ENVIRONMENT> (-e): The environment to install; if none is provided, the default environment will be used.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi install
+pixi install --manifest-path ~/myproject/pixi.toml
+pixi install --frozen
+pixi install --locked
+pixi install --environment lint
+pixi install -e lint
+
+

update#

+

The update command checks if there are newer versions of the dependencies and updates the pixi.lock file and environments accordingly. +It will only update the lock file if the dependencies in the manifest file are still compatible with the new versions.

+
Arguments#
+
    +
  1. [PACKAGES]... The packages to update, space separated. If no packages are provided, all packages will be updated.
+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --environment <ENVIRONMENT> (-e): The environment to update; if none is provided, all environments are updated.
  • --platform <PLATFORM> (-p): The platform for which the dependencies should be updated.
  • --dry-run (-n): Only show the changes that would be made, without actually updating the lock file or environment.
  • --no-install: Don't install the (solve) environment needed for solving pypi-dependencies.
  • --json: Output the changes in JSON format.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi update numpy
+pixi update numpy pandas
+pixi update --manifest-path ~/myproject/pixi.toml numpy
+pixi update --environment lint python
+pixi update -e lint -e schema -e docs pre-commit
+pixi update --platform osx-arm64 mlx
+pixi update -p linux-64 -p osx-64 numpy
+pixi update --dry-run
+pixi update --no-install boto3
+
+

upgrade#

+

The upgrade command checks if there are newer versions of the dependencies and upgrades them in the manifest file. update updates dependencies in the lock file while still fulfilling the version requirements set in the manifest. upgrade loosens the requirements for the given packages, updates the lock file, and adapts the manifest accordingly.

+
Arguments#
+
    +
  1. [PACKAGES]... The packages to upgrade, space separated. If no packages are provided, all packages will be upgraded.
+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --feature <FEATURE> (-e): The feature to upgrade; if none is provided, the default feature will be used.
  • --no-install: Don't install the (solve) environment needed for solving pypi-dependencies.
  • --json: Output the changes in JSON format.
  • --dry-run (-n): Only show the changes that would be made, without actually updating the manifest, lock file, or environment.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi upgrade
+pixi upgrade numpy
+pixi upgrade numpy pandas
+pixi upgrade --manifest-path ~/myproject/pixi.toml numpy
+pixi upgrade --feature lint python
+pixi upgrade --json
+pixi upgrade --dry-run
+
+
+

Note

+

The pixi upgrade command will only update versions, except when you specify the exact package name (pixi upgrade numpy).

+

Then it will remove all fields, apart from:

+
    +
  • build field containing a wildcard *
  • channel
  • file_name
  • url
  • subdir
+
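As a sketch of that behavior (the version numbers and the channel field here are illustrative):

```toml
# Before `pixi upgrade numpy`:
[dependencies]
numpy = { version = ">=1.26,<1.27", channel = "conda-forge" }

# After: the version requirement is loosened to the newly solved release,
# while the `channel` field is kept:
[dependencies]
numpy = { version = ">=2.1.2,<3", channel = "conda-forge" }
```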
+

run#

+

The run command first checks if the environment is ready to use. If you haven't run pixi install yet, the run command will do that for you. The custom tasks defined in the manifest file are also available through the run command.

+

You cannot run pixi run source setup.bash, because source is neither one of the deno_task_shell commands nor an executable.

+
Arguments#
+
    +
  1. [TASK]... The task you want to run in the project's environment; this can also be a normal command. All arguments after the task will be passed to the task.
+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --frozen: install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --environment <ENVIRONMENT> (-e): The environment to run the task in; if none is provided, the default environment will be used or a selector will be given to select the right environment.
  • --clean-env: Run the task in a clean environment; this removes all environment variables of the shell environment except for the ones pixi sets. This doesn't work on Windows.
  • --force-activate: (default, except in experimental mode) Force the activation of the environment, even if the environment is already activated.
  • --revalidate: Revalidate the full environment, instead of checking the lock file hash. more info
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi run python
+pixi run cowpy "Hey pixi user"
+pixi run --manifest-path ~/myproject/pixi.toml python
+pixi run --frozen python
+pixi run --locked python
+# If you have specified a custom task in the pixi.toml you can run it with run as well
+pixi run build
+# Extra arguments will be passed to the tasks command.
+pixi run task argument1 argument2
+
+# If you have multiple environments you can select the right one with the --environment flag.
+pixi run --environment cuda python
+
+# THIS DOESN'T WORK ON WINDOWS
+# If you want to run a command in a clean environment you can use the --clean-env flag.
+# The PATH should only contain the pixi environment here.
+pixi run --clean-env "echo \$PATH"
+
+
+

Info

+

In pixi, deno_task_shell is the underlying runner of the run command. Check out their documentation for the syntax and available commands. This is done so that run commands work across all platforms.

+
+
+

Cross environment tasks

+

If you're using the depends-on feature of the tasks, the tasks will be run in the order you specified them. +The depends-on can be used cross environment, e.g. you have this pixi.toml:

+
+pixi.toml +
[tasks]
+start = { cmd = "python start.py", depends-on = ["build"] }
+
+[feature.build.tasks]
+build = "cargo build"
+[feature.build.dependencies]
+rust = ">=1.74"
+
+[environments]
+build = ["build"]
+
+

Then you're able to run the build task from the build environment and the start task from the default environment, by only calling:

pixi run start
+

+
+
+

exec#

+

Runs a command in a temporary environment disconnected from any project. +This can be useful to quickly test out a certain package or version.

+

Temporary environments are cached. If the same command is run again, the same environment will be reused.

+
+Cleaning temporary environments +

Currently, temporary environments can only be cleaned up manually. +Environments for pixi exec are stored under cached-envs-v0/ in the cache directory. +Run pixi info to find the cache directory.

+
+
Arguments#
+
    +
  1. <COMMAND>: The command to run.
+

Options#

+
    +
  • --spec <SPECS> (-s): Matchspecs of packages to install. If this is not provided, the package is guessed from the command.
  • --channel <CHANNELS> (-c): The channel to install the packages from. If not specified, the default channel is used.
  • --force-reinstall: If specified, a new environment is always created, even if one already exists.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi exec python
+
+# Add a constraint to the python version
+pixi exec -s python=3.9 python
+
+# Run ipython and include the py-rattler package in the environment
+pixi exec -s ipython -s py-rattler ipython
+
+# Force reinstall to recreate the environment and get the latest package versions
+pixi exec --force-reinstall -s ipython -s py-rattler ipython
+
+

remove#

+

Removes dependencies from the manifest file.

+

If the project manifest is a pyproject.toml, removing a pypi dependency with the --pypi flag will remove it from either

+
    +
  • the native pyproject project.dependencies array or the native project.optional-dependencies table (if a feature is specified)
  • pixi pypi-dependencies tables of the default or a named feature (if a feature is specified)
+
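For instance, in a hypothetical pyproject.toml layout, pixi remove --pypi requests would look for the dependency in these places:

```toml
[project]
name = "my-project"
dependencies = ["requests>=2.31"]   # native pyproject dependencies array

[project.optional-dependencies]
lint = ["ruff"]                     # native optional dependencies (feature `lint`)

[tool.pixi.pypi-dependencies]
requests = ">=2.31"                 # pixi pypi-dependencies table
```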
Arguments#
+
    +
  1. <DEPS>...: List of dependencies you wish to remove from the project.
+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --host: Specifies a host dependency, important for building a package.
  • --build: Specifies a build dependency, important for building a package.
  • --pypi: Specifies a PyPI dependency, not a conda package.
  • --platform <PLATFORM> (-p): The platform from which the dependency should be removed.
  • --feature <FEATURE> (-f): The feature from which the dependency should be removed.
  • --no-install: Don't install the environment, only remove the package from the lock-file and manifest.
  • --no-lockfile-update: Don't update the lock-file; implies the --no-install flag.
+
pixi remove numpy
+pixi remove numpy pandas pytorch
+pixi remove --manifest-path ~/myproject/pixi.toml numpy
+pixi remove --host python
+pixi remove --build cmake
+pixi remove --pypi requests
+pixi remove --platform osx-64 --build clang
+pixi remove --feature featurex clang
+pixi remove --feature featurex --platform osx-64 clang
+pixi remove --feature featurex --platform osx-64 --build clang
+pixi remove --no-install numpy
+
+

task#

+

If you want to make a shorthand for a specific command, you can add a task for it.

+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
+

task add#

+

Add a task to the manifest file, use --depends-on to add tasks you want to run before this task, e.g. build before an execute task.

+
Arguments#
+
    +
  1. <NAME>: The name of the task.
  2. <COMMAND>: The command to run. This can be more than one word.
+
+

Info

+

If you use $ for environment variables, they will be resolved by your shell before they are added to the task. If you want a literal $ in the task, you need to escape it with a \, e.g. echo \$HOME.

+
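To see the difference in a plain shell (using echo as a stand-in, since the expansion happens before pixi is even invoked):

```shell
# Unescaped: the shell expands $HOME before the command line reaches pixi.
echo "echo $HOME"    # prints e.g. `echo /home/user`
# Escaped: \$HOME survives as a literal $HOME in the stored task.
echo "echo \$HOME"   # prints `echo $HOME`
```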
+
Options#
+
    +
  • --platform <PLATFORM> (-p): the platform for which this task should be added.
  • --feature <FEATURE> (-f): the feature for which the task is added; if none is provided, the task is added to the default feature.
  • --depends-on <DEPENDS_ON>: the task it depends on, to be run before the one you're adding.
  • --cwd <CWD>: the working directory for the task, relative to the root of the project.
  • --env <ENV>: the environment variables as key=value pairs for the task; can be used multiple times, e.g. --env "VAR1=VALUE1" --env "VAR2=VALUE2".
  • --description <DESCRIPTION>: a description of the task.
+
pixi task add cow cowpy "Hello User"
+pixi task add tls ls --cwd tests
+pixi task add test cargo t --depends-on build
+pixi task add build-osx "METAL=1 cargo build" --platform osx-64
+pixi task add train python train.py --feature cuda
+pixi task add publish-pypi "hatch publish --yes --repo main" --feature build --env HATCH_CONFIG=config/hatch.toml --description "Publish the package to pypi"
+
+

This adds the following to the manifest file:

+
[tasks]
+cow = "cowpy \"Hello User\""
+tls = { cmd = "ls", cwd = "tests" }
+test = { cmd = "cargo t", depends-on = ["build"] }
+
+[target.osx-64.tasks]
+build-osx = "METAL=1 cargo build"
+
+[feature.cuda.tasks]
+train = "python train.py"
+
+[feature.build.tasks]
+publish-pypi = { cmd = "hatch publish --yes --repo main", env = { HATCH_CONFIG = "config/hatch.toml" }, description = "Publish the package to pypi" }
+
+

Which you can then run with the run command:

+
pixi run cow
+# Extra arguments will be passed to the tasks command.
+pixi run test --test test1
+
+

task remove#

+

Remove one or more tasks from the manifest file.

+
Arguments#
+
    +
  • <NAMES>: The names of the tasks, space separated.
+
Options#
+
    +
  • --platform <PLATFORM> (-p): the platform for which this task is removed.
  • --feature <FEATURE> (-f): the feature for which the task is removed.
+
pixi task remove cow
+pixi task remove --platform linux-64 test
+pixi task remove --feature cuda task
+
+

task alias#

+

Create an alias for a task.

+
Arguments#
+
    +
  1. <ALIAS>: The alias name.
  2. <DEPENDS_ON>: The names of the tasks you want to execute on this alias; order counts, the first one runs first.
+
Options#
+
    +
  • --platform <PLATFORM> (-p): the platform for which this alias is created.
+
pixi task alias test-all test-py test-cpp test-rust
+pixi task alias --platform linux-64 test test-linux
+pixi task alias moo cow
+
+

task list#

+

List all tasks in the project.

+
Options#
+
    +
  • --environment (-e): the environment's task list; if none is provided, the default tasks will be listed.
  • --summary (-s): list the tasks per environment.
+
pixi task list
+pixi task list --environment cuda
+pixi task list --summary
+
+

list#

+

List the project's packages. Highlighted packages are explicit dependencies.

+
Arguments#
+
    +
  1. [REGEX]: List only packages matching a regular expression (optional).
+
Options#
+
    +
  • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform.
  • --json: Whether to output in JSON format.
  • --json-pretty: Whether to output in pretty JSON format.
  • --sort-by <SORT_BY>: Sorting strategy [default: name] [possible values: size, name, type].
  • --explicit (-x): Only list the packages that are explicitly added to the manifest file.
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
  • --environment (-e): The environment whose packages to list; if none is provided, the default environment's packages will be listed.
  • --frozen: install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --no-install: Don't install the environment for pypi solving; only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
  • --no-lockfile-update: Don't update the lock-file; implies the --no-install flag.
  • --no-progress: Hide all progress bars; always turned on if stderr is not a terminal [env: PIXI_NO_PROGRESS=].
+
pixi list
+pixi list py
+pixi list --json-pretty
+pixi list --explicit
+pixi list --sort-by size
+pixi list --platform win-64
+pixi list --environment cuda
+pixi list --frozen
+pixi list --locked
+pixi list --no-install
+
+

Output will look like this, where python will be green as it is the package that was explicitly added to the manifest file:

+
 pixi list
+ Package           Version     Build               Size       Kind   Source
+ _libgcc_mutex     0.1         conda_forge         2.5 KiB    conda  _libgcc_mutex-0.1-conda_forge.tar.bz2
+ _openmp_mutex     4.5         2_gnu               23.1 KiB   conda  _openmp_mutex-4.5-2_gnu.tar.bz2
+ bzip2             1.0.8       hd590300_5          248.3 KiB  conda  bzip2-1.0.8-hd590300_5.conda
+ ca-certificates   2023.11.17  hbcca054_0          150.5 KiB  conda  ca-certificates-2023.11.17-hbcca054_0.conda
+ ld_impl_linux-64  2.40        h41732ed_0          688.2 KiB  conda  ld_impl_linux-64-2.40-h41732ed_0.conda
+ libexpat          2.5.0       hcb278e6_1          76.2 KiB   conda  libexpat-2.5.0-hcb278e6_1.conda
+ libffi            3.4.2       h7f98852_5          56.9 KiB   conda  libffi-3.4.2-h7f98852_5.tar.bz2
+ libgcc-ng         13.2.0      h807b86a_4          755.7 KiB  conda  libgcc-ng-13.2.0-h807b86a_4.conda
+ libgomp           13.2.0      h807b86a_4          412.2 KiB  conda  libgomp-13.2.0-h807b86a_4.conda
+ libnsl            2.0.1       hd590300_0          32.6 KiB   conda  libnsl-2.0.1-hd590300_0.conda
+ libsqlite         3.44.2      h2797004_0          826 KiB    conda  libsqlite-3.44.2-h2797004_0.conda
+ libuuid           2.38.1      h0b41bf4_0          32.8 KiB   conda  libuuid-2.38.1-h0b41bf4_0.conda
+ libxcrypt         4.4.36      hd590300_1          98 KiB     conda  libxcrypt-4.4.36-hd590300_1.conda
+ libzlib           1.2.13      hd590300_5          60.1 KiB   conda  libzlib-1.2.13-hd590300_5.conda
+ ncurses           6.4         h59595ed_2          863.7 KiB  conda  ncurses-6.4-h59595ed_2.conda
+ openssl           3.2.0       hd590300_1          2.7 MiB    conda  openssl-3.2.0-hd590300_1.conda
+ python            3.12.1      hab00c5b_1_cpython  30.8 MiB   conda  python-3.12.1-hab00c5b_1_cpython.conda
+ readline          8.2         h8228510_1          274.9 KiB  conda  readline-8.2-h8228510_1.conda
+ tk                8.6.13      noxft_h4845f30_101  3.2 MiB    conda  tk-8.6.13-noxft_h4845f30_101.conda
+ tzdata            2023d       h0c530f3_0          116.8 KiB  conda  tzdata-2023d-h0c530f3_0.conda
+ xz                5.2.6       h166bdaf_0          408.6 KiB  conda  xz-5.2.6-h166bdaf_0.tar.bz2
+
+

tree#

+

Display the project's packages in a tree. Highlighted packages are those specified in the manifest.

+

The package tree can also be inverted (-i), to see which packages require a specific dependency.

+
Arguments#
+
    +
  • [REGEX]: optional regex of the dependencies to filter the tree to, or the dependencies to start with when inverting the tree.
+
Options#
+
    +
  • --invert (-i): Invert the dependency tree; that is, given a REGEX pattern that matches some packages, show all the packages that depend on those.
  • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform.
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
  • --environment (-e): The environment whose packages to list; if none is provided, the default environment's packages will be listed.
  • --frozen: install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --no-install: Don't install the environment for pypi solving; only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
  • --no-lockfile-update: Don't update the lock-file; implies the --no-install flag.
  • --no-progress: Hide all progress bars; always turned on if stderr is not a terminal [env: PIXI_NO_PROGRESS=].
+
pixi tree
+pixi tree pre-commit
+pixi tree -i yaml
+pixi tree --environment docs
+pixi tree --platform win-64
+
+
+

Warning

+

Use -v to show which pypi packages are not yet parsed correctly. The extras and markers parsing is still under development.

+
+

Output will look like this, where direct packages in the manifest file will be green. +Once a package has been displayed once, the tree won't continue to recurse through its dependencies (compare the first time python appears, vs the rest), and it will instead be marked with a star (*).

+

Version numbers are colored by the package type, yellow for Conda packages and blue for PyPI.

+
 pixi tree
+├── pre-commit v3.3.3
+   ├── cfgv v3.3.1
+      └── python v3.12.2
+          ├── bzip2 v1.0.8
+          ├── libexpat v2.6.2
+          ├── libffi v3.4.2
+          ├── libsqlite v3.45.2
+             └── libzlib v1.2.13
+          ├── libzlib v1.2.13 (*)
+          ├── ncurses v6.4.20240210
+          ├── openssl v3.2.1
+          ├── readline v8.2
+             └── ncurses v6.4.20240210 (*)
+          ├── tk v8.6.13
+             └── libzlib v1.2.13 (*)
+          └── xz v5.2.6
+   ├── identify v2.5.35
+      └── python v3.12.2 (*)
+...
+└── tbump v6.9.0
+...
+    └── tomlkit v0.12.4
+        └── python v3.12.2 (*)
+
+

A regex pattern can be specified to filter the tree to just those packages that have a specific direct or transitive dependency:

+
 pixi tree pre-commit
+└── pre-commit v3.3.3
+    ├── virtualenv v20.25.1
+       ├── filelock v3.13.1
+          └── python v3.12.2
+              ├── libexpat v2.6.2
+              ├── readline v8.2
+                 └── ncurses v6.4.20240210
+              ├── libsqlite v3.45.2
+                 └── libzlib v1.2.13
+              ├── bzip2 v1.0.8
+              ├── libzlib v1.2.13 (*)
+              ├── libffi v3.4.2
+              ├── tk v8.6.13
+                 └── libzlib v1.2.13 (*)
+              ├── xz v5.2.6
+              ├── ncurses v6.4.20240210 (*)
+              └── openssl v3.2.1
+       ├── platformdirs v4.2.0
+          └── python v3.12.2 (*)
+       ├── distlib v0.3.8
+          └── python v3.12.2 (*)
+       └── python v3.12.2 (*)
+    ├── pyyaml v6.0.1
+...
+
+

Additionally, the tree can be inverted, and it can show which packages depend on a regex pattern. +The packages specified in the manifest will also be highlighted (in this case cffconvert and pre-commit would be).

+
 pixi tree -i yaml
+
+ruamel.yaml v0.18.6
+├── pykwalify v1.8.0
+   └── cffconvert v2.0.0
+└── cffconvert v2.0.0
+
+pyyaml v6.0.1
+└── pre-commit v3.3.3
+
+ruamel.yaml.clib v0.2.8
+└── ruamel.yaml v0.18.6
+    ├── pykwalify v1.8.0
+       └── cffconvert v2.0.0
+    └── cffconvert v2.0.0
+
+yaml v0.2.5
+└── pyyaml v6.0.1
+    └── pre-commit v3.3.3
+
+

shell#

+

This command starts a new shell in the project's environment. +To exit the pixi shell, simply run exit.

+
Options#
+
    +
  • --change-ps1 <true or false>: When set to false, the (pixi) prefix in the shell prompt is removed (default: true). The default behavior can be configured globally.
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --frozen: install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --no-install: Don't install the environment, only activate the environment.
  • --no-lockfile-update: Don't update the lock-file; implies the --no-install flag.
  • --environment <ENVIRONMENT> (-e): The environment to activate the shell in; if none is provided, the default environment will be used or a selector will be given to select the right environment.
  • --no-progress: Hide all progress bars; always turned on if stderr is not a terminal [env: PIXI_NO_PROGRESS=].
  • --force-activate: (default, except in experimental mode) Force the activation of the environment, even if the environment is already activated.
  • --revalidate: Revalidate the full environment, instead of checking the lock file hash. more info
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi shell
+exit
+pixi shell --manifest-path ~/myproject/pixi.toml
+exit
+pixi shell --frozen
+exit
+pixi shell --locked
+exit
+pixi shell --environment cuda
+exit
+
+

shell-hook#

+

This command prints the activation script of an environment.

+
Options#
+
    +
  • --shell <SHELL> (-s): The shell for which the activation script should be printed. Defaults to the current shell. Currently supported variants: [bash, zsh, xonsh, cmd, powershell, fish, nushell]
  • --manifest-path: the path to the manifest file; by default it searches for one in the parent directories.
  • --frozen: install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --environment <ENVIRONMENT> (-e): The environment to activate; if none is provided, the default environment will be used or a selector will be given to select the right environment.
  • --json: Print all environment variables that are exported by running the activation script as JSON. When specifying this option, --shell is ignored.
  • --force-activate: (default, except in experimental mode) Force the activation of the environment, even if the environment is already activated.
  • --revalidate: Revalidate the full environment, instead of checking the lock file hash. more info
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of CPU threads.
+
pixi shell-hook
+pixi shell-hook --shell bash
+pixi shell-hook --shell zsh
+pixi shell-hook -s powershell
+pixi shell-hook --manifest-path ~/myproject/pixi.toml
+pixi shell-hook --frozen
+pixi shell-hook --locked
+pixi shell-hook --environment cuda
+pixi shell-hook --json
+
+

Example use-case: getting rid of the pixi executable in a Docker container.

+
pixi shell-hook --shell bash > /etc/profile.d/pixi.sh
+rm ~/.pixi/bin/pixi # Now the environment will be activated without the need for the pixi executable.
+
+ +

search#

Search for a package; the output lists the latest version of the package.

+
Arguments#
+
    +
  1. <PACKAGE>: Name of the package to search for; it's possible to use wildcards (*).
+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
  • --limit <LIMIT> (-l): optionally limit the number of search results.
  • --platform <PLATFORM> (-p): specify a platform that you want to search for. (default: current platform)
+
pixi search pixi
+pixi search --limit 30 "py*"
+# search in a different channel and for a specific platform
+pixi search -c robostack --platform linux-64 "plotjuggler*"
+
+

self-update#

+

Update pixi to the latest version or a specific version. If pixi was installed using another package manager this feature might not +be available and pixi should be updated using the package manager used to install it.

+
Options#
+
    +
  • --version <VERSION>: The desired version (to downgrade or upgrade to). Update to the latest version if not specified.
+
pixi self-update
+pixi self-update --version 0.13.0
+
+

info#

+

Shows helpful information about the pixi installation, cache directories, disk usage, and more. +More information here.

+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
  • --extended: extend the information with slower queries to the system, like directory sizes.
  • --json: Get a machine-readable version of the information as output.
+
pixi info
+pixi info --json --extended
+
+

clean#

+

Clean the parts of your system which are touched by pixi. Defaults to cleaning the environments and task cache. Use the cache subcommand to clean the cache.

+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default pixi searches for one in the parent directories.
  • +
  • --environment <ENVIRONMENT> (-e): The environment to clean, if none are provided all environments will be removed.
  • +
+
pixi clean
+
+

clean cache#

+

Clean the pixi cache on your system.

+
Options#
+
    +
  • --pypi: Clean the pypi cache.
  • +
  • --conda: Clean the conda cache.
  • +
  • --mapping: Clean the mapping cache.
  • +
  • --exec: Clean the exec cache.
  • +
  • --repodata: Clean the repodata cache.
  • +
  • --yes: Skip the confirmation prompt.
  • +
+
pixi clean cache # clean all pixi caches
+pixi clean cache --pypi # clean only the pypi cache
+pixi clean cache --conda # clean only the conda cache
+pixi clean cache --mapping # clean only the mapping cache
+pixi clean cache --exec # clean only the `exec` cache
+pixi clean cache --repodata # clean only the `repodata` cache
+pixi clean cache --yes # skip the confirmation prompt
+
+

upload#

+

Upload a package to a prefix.dev channel

+
Arguments#
+
    +
  1. <HOST>: The host + channel to upload to.
  2. +
  3. <PACKAGE_FILE>: The package file to upload.
  4. +
+
pixi upload https://prefix.dev/api/v1/upload/my_channel my_package.conda
+
+

auth#

+

This command is used to authenticate the user's access to remote hosts such as prefix.dev or anaconda.org for private channels.

+

auth login#

+

Store authentication information for a given host.

+
+

Tip

+

The host is the real hostname, not a channel.

+
+
Arguments#
+
    +
  1. <HOST>: The host to authenticate with.
  2. +
+
Options#
+
    +
  • --token <TOKEN>: The token to use for authentication with prefix.dev.
  • +
  • --username <USERNAME>: The username to use for basic HTTP authentication.
  • +
  • --password <PASSWORD>: The password to use for basic HTTP authentication.
  • +
  • --conda-token <CONDA_TOKEN>: The token to use on anaconda.org / quetz authentication.
  • +
+
pixi auth login repo.prefix.dev --token pfx_JQEV-m_2bdz-D8NSyRSaAndHANx0qHjq7f2iD
+pixi auth login anaconda.org --conda-token ABCDEFGHIJKLMNOP
+pixi auth login https://myquetz.server --username john --password xxxxxx
+
+

auth logout#

+

Remove authentication information for a given host.

+
Arguments#
+
    +
  1. <HOST>: The host to authenticate with.
  2. +
+
pixi auth logout <HOST>
+pixi auth logout repo.prefix.dev
+pixi auth logout anaconda.org
+
+

config#

+

Use this command to manage the configuration.

+
Options#
+
    +
  • --system (-s): Specify management scope to system configuration.
  • +
  • --global (-g): Specify management scope to global configuration.
  • +
  • --local (-l): Specify management scope to local configuration.
  • +
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default pixi searches for one in the parent directories.
  • +
+

Check out the pixi configuration for more information about the locations.

+

config edit#

+

Edit the configuration file in the default editor.

+
Arguments#
+
    +
  1. [EDITOR]: The editor to use; defaults to the EDITOR environment variable, or nano on Unix and notepad on Windows.
  2. +
+
pixi config edit --system
+pixi config edit --local
+pixi config edit -g
+pixi config edit --global code
+pixi config edit --system vim
+
+

config list#

+

List the configuration.

+
Arguments#
+
    +
  1. [KEY]: The key to list the value of. (all if not provided)
  2. +
+
Options#
+
    +
  • --json: Output the configuration in JSON format.
  • +
+
pixi config list default-channels
+pixi config list --json
+pixi config list --system
+pixi config list -g
+
+

config prepend#

+

Prepend a value to a list configuration key.

+
Arguments#
+
    +
  1. <KEY>: The key to prepend the value to.
  2. +
  3. <VALUE>: The value to prepend.
  4. +
+
pixi config prepend default-channels conda-forge
+
+

config append#

+

Append a value to a list configuration key.

+
Arguments#
+
    +
  1. <KEY>: The key to append the value to.
  2. +
  3. <VALUE>: The value to append.
  4. +
+
pixi config append default-channels robostack
+pixi config append default-channels bioconda --global
+
+
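
Assuming the local configuration previously contained only conda-forge, the first command above would leave the config.toml entry looking like this (illustrative):

```toml
# config.toml after `pixi config append default-channels robostack`,
# assuming the list previously contained only conda-forge
default-channels = ["conda-forge", "robostack"]
```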

config set#

+

Set a configuration key to a value.

+
Arguments#
+
    +
  1. <KEY>: The key to set the value of.
  2. +
  3. [VALUE]: The value to set. (if not provided, the key will be removed)
  4. +
+
pixi config set default-channels '["conda-forge", "bioconda"]'
+pixi config set --global mirrors '{"https://conda.anaconda.org/": ["https://prefix.dev/conda-forge"]}'
+pixi config set repodata-config.disable-zstd true --system
+pixi config set --global detached-environments "/opt/pixi/envs"
+pixi config set detached-environments false
+
+

config unset#

+

Unset a configuration key.

+
Arguments#
+
    +
  1. <KEY>: The key to unset.
  2. +
+
pixi config unset default-channels
+pixi config unset --global mirrors
+pixi config unset repodata-config.disable-zstd --system
+
+

global#

+

global is the main entry point for the part of pixi that operates at the global (system) level. All commands in this section manage global installations of packages and environments through the global manifest. More info on the global manifest can be found here.

+
+

Tip

+

Binaries and environments installed globally are stored in ~/.pixi by default; this can be changed by setting the PIXI_HOME environment variable.

+
+

global add#

+

Adds dependencies to a global environment, without exposing the binaries of those packages to the system by default.

+
Arguments#
+
    +
  1. [PACKAGE]: The packages to add; this accepts the matchspec format. (e.g. python=3.9.*, python [version='3.11.0', build_number=1])
  2. +
+
Options#
+
    +
  • --environment <ENVIRONMENT> (-e): The environment to install the package into.
  • +
  • --expose <EXPOSE>: A mapping from name to the binary to expose to the system.
  • +
+
pixi global add python=3.9.* --environment my-env
+pixi global add python=3.9.* --expose py39=python3.9 --environment my-env
+pixi global add numpy matplotlib --environment my-env
+pixi global add numpy matplotlib --expose np=python3.9 --environment my-env
+
+

global edit#

+

Edit the global manifest file in the default editor.

+

It will try to use the EDITOR environment variable; if not set, it uses nano on Unix systems and notepad on Windows.

+
Arguments#
+
    +
  1. <EDITOR>: The editor to use. (optional) +
    pixi global edit
    +pixi global edit code
    +pixi global edit vim
    +
  2. +
+

global install#

+

This command installs package(s) into their own environment and adds their binaries to PATH, allowing you to access them anywhere on your system without activating the environment.

+
Arguments#
+

  1. [PACKAGE]: The package(s) to install; this can also include a version constraint.

+
Options#
+
    +
  • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
  • +
  • --platform <PLATFORM> (-p): specify a platform that you want to install the package for. (default: current platform)
  • +
  • --environment <ENVIRONMENT> (-e): The environment to install the package into. (default: name of the tool)
  • +
  • --expose <EXPOSE>: A mapping from name to the binary to expose to the system. (default: name of the tool)
  • +
  • --with <WITH>: Add additional dependencies to the environment. Their executables will not be exposed.
  • +
+
pixi global install ruff
+# Multiple packages can be installed at once
+pixi global install starship rattler-build
+# Specify the channel(s)
+pixi global install --channel conda-forge --channel bioconda trackplot
+# Or in a more concise form
+pixi global install -c conda-forge -c bioconda trackplot
+
+# Support full conda matchspec
+pixi global install python=3.9.*
+pixi global install "python [version='3.11.0', build_number=1]"
+pixi global install "python [version='3.11.0', build=he550d4f_1_cpython]"
+pixi global install python=3.11.0=h10a6764_1_cpython
+
+# Install for a specific platform, only useful on osx-arm64
+pixi global install --platform osx-64 ruff
+
+# Install a package with all its executables exposed, together with additional packages that don't expose anything
+pixi global install ipython --with numpy --with scipy
+
+# Install into a specific environment name and expose all executables
+pixi global install --environment data-science ipython jupyterlab numpy matplotlib
+
+# Expose the binary under a different name
+pixi global install --expose "py39=python3.9" "python=3.9.*"
+
+
+

Tip

+

Running osx-64 on Apple Silicon will install the Intel binary but run it using Rosetta +

pixi global install --platform osx-64 ruff
+

+
+

After using global install, you can use the package you installed anywhere on your system.

+

global uninstall#

+

Uninstalls global environments. This removes the environment and all its dependencies from the global installation, and also removes the related binaries from the system.

+
Arguments#
+
    +
  1. [ENVIRONMENT]: The environments to uninstall.
  2. +
+
pixi global uninstall my-env
+pixi global uninstall pixi-pack rattler-build
+
+

global remove#

+

Removes a package from a global environment.

+
Arguments#
+
    +
  1. [PACKAGE]: The packages to remove.
  2. +
+
Options#
+
    +
  • --environment <ENVIRONMENT> (-e): The environment to remove the package from.
  • +
+
pixi global remove -e my-env package1 package2
+
+

global list#

+

This command shows the currently installed global environments, including the binaries that come with them. A globally installed environment can expose multiple binaries, which are listed in the command output.

+
Options#
+
    +
  • --environment <ENVIRONMENT> (-e): The environment to list.
  • +
+

Dependencies and exposed binaries are only shown if they differ from the environment name. Here is an example with a few installed packages:

+

pixi global list
+
+Results in: +
Global environments at /home/user/.pixi:
+├── gh: 2.57.0
+├── pixi-pack: 0.1.8
+├── python: 3.11.0
+│   └─ exposes: 2to3, 2to3-3.11, idle3, idle3.11, pydoc, pydoc3, pydoc3.11, python, python3, python3-config, python3.1, python3.11, python3.11-config
+├── rattler-build: 0.22.0
+├── ripgrep: 14.1.0
+│   └─ exposes: rg
+├── vim: 9.1.0611
+│   └─ exposes: ex, rview, rvim, view, vim, vimdiff, vimtutor, xxd
+└── zoxide: 0.9.6
+

+

Here is an example of listing a single environment:

pixi g list -e pixi-pack
+
+Results in: +
The 'pixi-pack' environment has 8 packages:
+Package          Version    Build        Size
+_libgcc_mutex    0.1        conda_forge  2.5 KiB
+_openmp_mutex    4.5        2_gnu        23.1 KiB
+ca-certificates  2024.8.30  hbcca054_0   155.3 KiB
+libgcc           14.1.0     h77fa898_1   826.5 KiB
+libgcc-ng        14.1.0     h69a702a_1   50.9 KiB
+libgomp          14.1.0     h77fa898_1   449.4 KiB
+openssl          3.3.2      hb9d3cd8_0   2.8 MiB
+pixi-pack        0.1.8      hc762bcd_0   4.3 MiB
+Package          Version    Build        Size
+
+Exposes:
+pixi-pack
+Channels:
+conda-forge
+Platform: linux-64
+

+

global sync#

+

As the global manifest can be edited manually, this command syncs the installed global environments with the current state of the global manifest. You can modify the manifest in $HOME/.pixi/manifests/pixi-global.toml.

+
pixi global sync
+
+
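
For illustration, a global manifest describing one environment might look roughly like this (a sketch; the exact schema may differ between pixi versions):

```toml
# Sketch of a pixi-global.toml entry; field names may vary by version
version = 1

[envs.vim]
channels = ["conda-forge"]

[envs.vim.dependencies]
vim = "*"

[envs.vim.exposed]
vim = "vim"
vimdiff = "vimdiff"
```

Running pixi global sync after editing this file brings the installed environments in line with it.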

global expose#

+

Modify the exposed binaries of a global environment.

+

global expose add#

+

Add exposed binaries from an environment to your global environment.

+
Arguments#
+
    +
  1. [MAPPING]: The binaries to expose (e.g. python), or a mapping to expose a binary under a different name (e.g. py310=python3.10). The mapping has the form exposed_name=binary_name, where the exposed name is the one you can use in the terminal and the binary name is the name of the binary in the environment.
  2. +
+
Options#
+
    +
  • --environment <ENVIRONMENT> (-e): The environment to expose the binaries from.
  • +
+
pixi global expose add python --environment my-env
+pixi global expose add py310=python3.10 --environment python
+
+

global expose remove#

+

Remove exposed binaries from the global environment.

+
Arguments#
+
    +
  1. [EXPOSED_NAME]: The binaries to remove from the main global environment.
  2. +
+
pixi global expose remove python
+pixi global expose remove py310 python3
+
+

global update#

+

Update all global environments, or specify the environment(s) to update to the latest version.

+
Arguments#
+
    +
  1. [ENVIRONMENT]: The environment(s) to update.
  2. +
+
pixi global update
+pixi global update pixi-pack
+pixi global update bat rattler-build
+
+

project#

+

This subcommand allows you to modify the project configuration through the command line interface.

+
Options#
+
    +
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default pixi searches for one in the parent directories.
  • +
+

project channel add#

+

Add channels to the channel list in the project configuration. When you add channels, they are tested for existence, added to the lock file, and the environment is reinstalled.

+
Arguments#
+
    +
  1. <CHANNEL>: The channels to add, name or URL.
  2. +
+
Options#
+
    +
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • +
  • --feature <FEATURE> (-f): The feature for which the channel is added.
  • +
  • --prepend: Prepend the channel to the list of channels.
  • +
+
pixi project channel add robostack
+pixi project channel add bioconda conda-forge robostack
+pixi project channel add file:///home/user/local_channel
+pixi project channel add https://repo.prefix.dev/conda-forge
+pixi project channel add --no-install robostack
+pixi project channel add --feature cuda nvidia
+pixi project channel add --prepend pytorch
+
+
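
After pixi project channel add robostack, the channels list in the manifest would look something like this (illustrative, assuming conda-forge was already present):

```toml
# pixi.toml after `pixi project channel add robostack` (illustrative)
[project]
channels = ["conda-forge", "robostack"]
platforms = ["linux-64"]
```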

project channel list#

+

List the channels in the manifest file.

+
Options#
+
    +
  • --urls: Show the URLs of the channels instead of the names.
  • +
+
$ pixi project channel list
+Environment: default
+- conda-forge
+
+$ pixi project channel list --urls
+Environment: default
+- https://conda.anaconda.org/conda-forge/
+
+

project channel remove#

+

Remove channels from the channel list in the manifest file.

+
Arguments#
+
    +
  1. <CHANNEL>...: The channels to remove, name(s) or URL(s).
  2. +
+
Options#
+
    +
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • +
  • --feature <FEATURE> (-f): The feature for which the channel is removed.
  • +
+
pixi project channel remove conda-forge
+pixi project channel remove https://conda.anaconda.org/conda-forge/
+pixi project channel remove --no-install conda-forge
+pixi project channel remove --feature cuda nvidia
+
+

project description get#

+

Get the project description.

+
$ pixi project description get
+Package management made easy!
+
+

project description set#

+

Set the project description.

+
Arguments#
+
    +
  1. <DESCRIPTION>: The description to set.
  2. +
+
pixi project description set "my new description"
+
+

project environment add#

+

Add an environment to the manifest file.

+
Arguments#
+
    +
  1. <NAME>: The name of the environment to add.
  2. +
+
Options#
+
    +
  • -f, --feature <FEATURES>: Features to add to the environment.
  • +
  • --solve-group <SOLVE_GROUP>: The solve-group to add the environment to.
  • +
  • --no-default-feature: Don't include the default feature in the environment.
  • +
  • --force: Update the manifest even if the environment already exists.
  • +
+
pixi project environment add env1 --feature feature1 --feature feature2
+pixi project environment add env2 -f feature1 --solve-group test
+pixi project environment add env3 -f feature1 --no-default-feature
+pixi project environment add env3 -f feature1 --force
+
+

project environment remove#

+

Remove an environment from the manifest file.

+
Arguments#
+
    +
  1. <NAME>: The name of the environment to remove.
  2. +
+
pixi project environment remove env1
+
+

project environment list#

+

List the environments in the manifest file.

+
pixi project environment list
+
+

project export conda-environment#

+

Exports a conda environment.yml file. The file can be used to create a conda environment using conda/mamba:

+
pixi project export conda-environment environment.yml
+mamba create --name <env> --file environment.yml
+
+
Arguments#
+
    +
  1. <OUTPUT_PATH>: Optional path to render environment.yml to. Otherwise it will be printed to standard out.
  2. +
+
Options#
+
    +
  • --environment <ENVIRONMENT> (-e): Environment to render.
  • +
  • --platform <PLATFORM> (-p): The platform to render.
  • +
+
pixi project export conda-environment --environment lint
+pixi project export conda-environment --platform linux-64 environment.linux-64.yml
+
+

project export conda-explicit-spec#

+

Render a platform-specific conda explicit specification file for an environment. The file can then be used to create a conda environment using conda/mamba:

+
mamba create --name <env> --file <explicit spec file>
+
+

As the explicit specification file format does not support pypi-dependencies, use the --ignore-pypi-errors option to ignore those dependencies.

+
Arguments#
+
    +
  1. <OUTPUT_DIR>: Output directory for rendered explicit environment spec files.
  2. +
+
Options#
+
    +
  • --environment <ENVIRONMENT> (-e): Environment to render. Can be repeated for multiple envs. Defaults to all environments.
  • +
  • --platform <PLATFORM> (-p): The platform to render. Can be repeated for multiple platforms. Defaults to all platforms available for selected environments.
  • +
  • --ignore-pypi-errors: PyPI dependencies are not supported in the conda explicit spec file. This flag allows creating the spec file even if PyPI dependencies are present.
  • +
+
pixi project export conda-explicit-spec output
+pixi project export conda-explicit-spec -e default -e test -p linux-64 output
+
+

project name get#

+

Get the project name.

+
$ pixi project name get
+my project name
+
+

project name set#

+

Set the project name.

+
Arguments#
+
    +
  1. <NAME>: The name to set.
  2. +
+
pixi project name set "my new project name"
+
+

project platform add#

+

Add platform(s) to the manifest file and update the lock file.

+
Arguments#
+
    +
  1. <PLATFORM>...: The platforms to add.
  2. +
+
Options#
+
    +
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • +
  • --feature <FEATURE> (-f): The feature for which the platform will be added.
  • +
+
pixi project platform add win-64
+pixi project platform add --feature test win-64
+
+
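
For example, pixi project platform add win-64 would extend the platforms list in the manifest roughly like this (illustrative):

```toml
# pixi.toml after `pixi project platform add win-64` (illustrative)
[project]
platforms = ["linux-64", "win-64"]
```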

project platform list#

+

List the platforms in the manifest file.

+
$ pixi project platform list
+osx-64
+linux-64
+win-64
+osx-arm64
+
+

project platform remove#

+

Remove platform(s) from the manifest file and update the lock file.

+
Arguments#
+
    +
  1. <PLATFORM>...: The platforms to remove.
  2. +
+
Options#
+
    +
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • +
  • --feature <FEATURE> (-f): The feature for which the platform will be removed.
  • +
+
pixi project platform remove win-64
+pixi project platform remove --feature test win-64
+
+

project version get#

+

Get the project version.

+
$ pixi project version get
+0.11.0
+
+

project version set#

+

Set the project version.

+
Arguments#
+
    +
  1. <VERSION>: The version to set.
  2. +
+
pixi project version set "0.13.0"
+
+

project version {major|minor|patch}#

+

Bump the project version to {MAJOR|MINOR|PATCH}.

+
pixi project version major
+pixi project version minor
+pixi project version patch
+
+
+
+
    +
  1. +

    An up-to-date lock file means that the dependencies in the lock file are allowed by the dependencies in the manifest file. For example:

    +
      +
    • a manifest with python = ">= 3.11" is up-to-date with a name: python, version: 3.11.0 in the pixi.lock.
    • +
    • a manifest with python = ">= 3.12" is not up-to-date with a name: python, version: 3.11.0 in the pixi.lock.
    • +
    +

    Being up-to-date does not mean that the lock file holds the latest version available on the channel for the given dependency. 

    +
  2. +
+
+ + + + + + + + + + + + + + + + +
+
+ + + + + +
+ + + +
+ + + + + + +
+
+
+
+ +
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/reference/pixi_configuration/index.html b/v0.39.2/reference/pixi_configuration/index.html new file mode 100644 index 000000000..bbd5f068b --- /dev/null +++ b/v0.39.2/reference/pixi_configuration/index.html @@ -0,0 +1,2389 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Pixi Configuration - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ +
+ + + + + + + + +
+ + +
+ +
+ + + + + + +
+
+ + + +
+
+
+ + + + + +
+
+
+ + + + + + + +
+
+ + + + + + + + + + + + +

The configuration of pixi itself#

+

Apart from the project-specific configuration, pixi supports configuration options that are not required for the project to work but are local to the machine. The configuration is loaded in the following order:

+
+
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
7Command line arguments (--tls-no-verify, --change-ps1=false, etc.)Configuration via command line arguments
6your_project/.pixi/config.tomlProject-specific configuration
5$PIXI_HOME/config.tomlGlobal configuration in PIXI_HOME.
4$HOME/.pixi/config.tomlGlobal configuration in the user home directory.
3$XDG_CONFIG_HOME/pixi/config.tomlXDG compliant user-specific configuration
2$HOME/.config/pixi/config.tomlUser-specific configuration
1/etc/pixi/config.tomlSystem-wide configuration
+
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
6Command line arguments (--tls-no-verify, --change-ps1=false, etc.)Configuration via command line arguments
5your_project/.pixi/config.tomlProject-specific configuration
4$PIXI_HOME/config.tomlGlobal configuration in PIXI_HOME.
3$HOME/.pixi/config.tomlGlobal configuration in the user home directory.
2$HOME/Library/Application Support/pixi/config.tomlUser-specific configuration
1/etc/pixi/config.tomlSystem-wide configuration
+
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
6Command line arguments (--tls-no-verify, --change-ps1=false, etc.)Configuration via command line arguments
5your_project\.pixi\config.tomlProject-specific configuration
4%PIXI_HOME%\config.tomlGlobal configuration in PIXI_HOME.
3%USERPROFILE%\.pixi\config.tomlGlobal configuration in the user home directory.
2%APPDATA%\pixi\config.tomlUser-specific configuration
1C:\ProgramData\pixi\config.tomlSystem-wide configuration
+
+
+
+
+

Note

+

The highest priority wins. If a configuration file is found in a higher priority location, the values from the configuration read from lower priority locations are overwritten.

+
+
+

Note

+

To find the locations where pixi looks for configuration files, run pixi with -vv.

+
+

Configuration options#

+
+Casing In Configuration +

In pixi 0.20.1 and older, the global configuration used snake_case; we've changed to kebab-case for consistency with the rest of the configuration. We still support the old snake_case spelling for these older configuration options:

+
    +
  • default_channels
  • +
  • change_ps1
  • +
  • tls_no_verify
  • +
  • authentication_override_file
  • +
  • mirrors and sub-options
  • +
  • repodata-config and sub-options
  • +
+
+

The following reference describes all available configuration options.

+

default-channels#

+

The default channels to select when running pixi init or pixi global install. This defaults to only conda-forge.

config.toml
default-channels = ["conda-forge"]
+

+
+

Note

+

The default-channels are only used when initializing a new project. Once initialized the channels are used from the project manifest.

+
+

change-ps1#

+

When set to false, the (pixi) prefix in the shell prompt is removed. This applies to the pixi shell subcommand. You can override this from the CLI with --change-ps1.

+
config.toml
change-ps1 = true
+
+

tls-no-verify#

+

When set to true, the TLS certificates are not verified.

+
+

Warning

+

This is a security risk and should only be used for testing purposes or internal networks.

+
+

You can override this from the CLI with --tls-no-verify.

+
config.toml
tls-no-verify = false
+
+

authentication-override-file#

+

Override where the authentication information is loaded from. Usually, we try to load authentication data from the keyring, and only use a JSON file as a fallback. This option allows you to force the use of a JSON file. Read more in the authentication section.

config.toml
authentication-override-file = "/path/to/your/override.json"
+

+

detached-environments#

+

The directory where pixi stores the project environments, which would normally be placed in the .pixi/envs folder in a project's root. It doesn't affect the environments built for pixi global. The location of environments created for a pixi global installation can be controlled using the PIXI_HOME environment variable.

+
+

Warning

+

We recommend against using this because any environment created for a project is no longer placed in the same folder as the project. This creates a disconnect between the project and its environments, and manual cleanup of the environments is required when deleting the project.

+

However, in some cases, this option can still be very useful, for instance to:

+
    +
  • force the installation on a specific filesystem/drive.
  • +
  • install environments locally but keep the project on a network drive.
  • +
  • let a system-administrator have more control over all environments on a system.
  • +
+
+

This field can consist of two types of input.

+
    +
  • A boolean value, true or false, which enables or disables the feature respectively. (Not the quoted strings "true" or "false"; these are read as false.)
  • +
  • A string value, which will be the absolute path to the directory where the environments will be stored.
  • +
+

config.toml
detached-environments = true
+
+or: +
config.toml
detached-environments = "/opt/pixi/envs"
+

+

The environments will be stored in the cache directory when this option is true. When you specify a custom path, the environments will be stored in that directory.

+

The resulting directory structure will look like this: +

/opt/pixi/envs
+├── pixi-6837172896226367631
+   └── envs
+└── NAME_OF_PROJECT-HASH_OF_ORIGINAL_PATH
+    ├── envs # the runnable environments
+    └── solve-group-envs # If there are solve groups
+

+

pinning-strategy#

+

The strategy to use for pinning dependencies when running pixi add. The default is semver, but you can set the following:

+
    +
  • no-pin: No pinning, resulting in an unconstrained dependency (*).
  • +
  • semver: Pinning to the latest version that satisfies the semver constraint, resulting in a pin to major for most versions and to minor for v0 versions.
  • +
  • exact-version: Pinning to the exact version, 1.2.3 -> ==1.2.3.
  • +
  • major: Pinning to the major version, 1.2.3 -> >=1.2.3, <2.
  • +
  • minor: Pinning to the minor version, 1.2.3 -> >=1.2.3, <1.3.
  • +
  • latest-up: Pinning to the latest version, 1.2.3 -> >=1.2.3.
  • +
+
config.toml
pinning-strategy = "no-pin"
+
+

mirrors#

+

Configuration for conda channel-mirrors, more info below.

+
config.toml
[mirrors]
+# redirect all requests for conda-forge to the prefix.dev mirror
+"https://conda.anaconda.org/conda-forge" = ["https://prefix.dev/conda-forge"]
+
+# redirect all requests for bioconda to one of the three listed mirrors
+# Note: for repodata we try the first mirror first.
+"https://conda.anaconda.org/bioconda" = [
+  "https://conda.anaconda.org/bioconda",
+  # OCI registries are also supported
+  "oci://ghcr.io/channel-mirrors/bioconda",
+  "https://prefix.dev/bioconda",
+]
+
+

repodata-config#

+

Configuration for repodata fetching. +

config.toml
[repodata-config]
+# disable fetching of jlap, bz2 or zstd repodata files.
+# This should only be used for specific old versions of artifactory and other non-compliant
+# servers.
+disable-bzip2 = true   # don't try to download repodata.json.bz2
+disable-jlap = true    # don't try to download repodata.jlap
+disable-sharded = true # don't try to download sharded repodata
+disable-zstd = true    # don't try to download repodata.json.zst
+

+

The above settings can be overridden on a per-channel basis by specifying a channel prefix in the configuration. +

config.toml
[repodata-config."https://prefix.dev"]
+disable-sharded = false
+

+

pypi-config#

+

To set up defaults for the usage of PyPI registries, you can use the following configuration options:

+
    +
  • index-url: The default index URL to use for PyPI packages. This will be added to a manifest file on a pixi init.
  • +
  • extra-index-urls: A list of additional URLs to use for PyPI packages. This will be added to a manifest file on a pixi init.
  • +
  • keyring-provider: Allows the use of the keyring python package to store and retrieve credentials.
  • +
  • allow-insecure-host: Allow insecure connections to host.
  • +
+
config.toml
[pypi-config]
+# Main index url
+index-url = "https://pypi.org/simple"
+# list of additional urls
+extra-index-urls = ["https://pypi.org/simple2"]
+# can be "subprocess" or "disabled"
+keyring-provider = "subprocess"
+# allow insecure connections to host
+allow-insecure-host = ["localhost:8080"]
+
+
+

index-url and extra-index-urls are not globals

+

Unlike pip, these settings, with the exception of keyring-provider, will only modify the pixi.toml/pyproject.toml file and are not globally interpreted when not present in the manifest. This is because we want to keep the manifest file as complete and reproducible as possible.

+
+
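
For instance, with the index-url above configured, a subsequent pixi init would record something like this in the generated manifest (illustrative; the pypi-options table is the manifest-side counterpart):

```toml
# pixi.toml written by `pixi init` when pypi-config sets an index-url (illustrative)
[pypi-options]
index-url = "https://pypi.org/simple"
```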

concurrency#

+

Configure multiple settings to limit or extend the concurrency of pixi. +

config.toml
[concurrency]
+# The maximum number of concurrent downloads
+# Defaults to 50 as that was found to be a good balance between speed and stability
+downloads = 5
+
+# The maximum number of concurrent dependency resolves
+# Defaults to a heuristic based on the number of cores on the system
+solves = 2
+
+Set them through the CLI with: +
pixi config set concurrency.solves 1
+pixi config set concurrency.downloads 12
+

+

Experimental#

+

This allows the user to set specific experimental features that are not yet stable.

+

Please write a GitHub issue and add the flag experimental to the issue if you find issues with the feature you activated.

+

Caching environment activations#

+

Turn this feature on from configuration with the following command: +

# For all your projects
+pixi config set experimental.use-environment-activation-cache true --global
+
+# For a specific project
+pixi config set experimental.use-environment-activation-cache true --local
+

+

This will cache the environment activation in the .pixi/activation-env-v0 folder in the project root. It creates a JSON file for each environment that is activated, which is used to activate the environment in the future.

> tree .pixi/activation-env-v0/
+.pixi/activation-env-v0/
+├── activation_default.json
+└── activation_lint.json
+
+> cat  .pixi/activation-env-v0/activation_lint.json
+{"hash":"8d8344e0751d377a","environment_variables":{<ENVIRONMENT_VARIABLES_USED_IN_ACTIVATION>}}
+

+
    +
  • The hash is a hash of that environment's data in the pixi.lock, plus some important information about the environment activation, like [activation.scripts] and [activation.env] from the manifest file.
  • +
  • The environment_variables are the environment variables that are set when activating the environment.
  • +
+

You can ignore the cache by running: +

pixi run/shell/shell-hook --force-activate
+

+

Set the configuration with: +

config.toml
[experimental]
+# Enable the use of the environment activation cache
+use-environment-activation-cache = true
+

+
+

Why is this experimental?

+
+

This feature is experimental because the cache invalidation is very tricky, +and we don't want to disturb users that are not affected by activation times.

+

Mirror configuration#

+

You can configure mirrors for conda channels. We expect that mirrors are exact +copies of the original channel. The implementation will look for the mirror key +(a URL) in the mirrors section of the configuration file and replace the +original URL with the mirror URL.

+

To also include the original URL, you have to repeat it in the list of mirrors.
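For example, to keep the original channel available as a fallback behind a mirror, repeat its URL in the list (the mirror URL here is illustrative):

```toml
[mirrors]
"https://conda.anaconda.org/conda-forge" = [
  # Tried first
  "https://prefix.dev/conda-forge",
  # The original URL, tried second
  "https://conda.anaconda.org/conda-forge",
]
```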

+

The mirrors are prioritized based on the order of the list. We attempt to fetch +the repodata (the most important file) from the first mirror in the list. The +repodata contains all the SHA256 hashes of the individual packages, so it is +important to get this file from a trusted source.

+

You can also specify mirrors for an entire "host", e.g.

+
config.toml
[mirrors]
+"https://conda.anaconda.org" = ["https://prefix.dev/"]
+
+

This will forward all requests for channels on anaconda.org to prefix.dev. +Channels that are not currently mirrored on prefix.dev will fail in the above example.

+

OCI Mirrors#

+

You can also specify mirrors on the OCI registry. There is a public mirror on +the GitHub container registry (ghcr.io) that is maintained by the conda-forge +team. You can use it like this:

+
config.toml
[mirrors]
+"https://conda.anaconda.org/conda-forge" = [
+  "oci://ghcr.io/channel-mirrors/conda-forge",
+]
+
+

The GHCR mirror also contains bioconda packages. You can search the available +packages on GitHub.


Pixi Manifest

+ +

The pixi.toml is the project manifest, also known as the pixi project configuration file.

+

A TOML file is structured into different tables. +This document explains the usage of the different tables. +For more technical documentation, check pixi on docs.rs.

+
+

Tip

+

We also support the pyproject.toml file. It has the same structure as the pixi.toml file, except that you need to prepend the tables with tool.pixi instead of just the table name. +For example, the [project] table becomes [tool.pixi.project]. +There are also some small extras available in the pyproject.toml file; check out the pyproject.toml documentation for more information.

+
+
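As an illustrative sketch of this prefixing (the task name is an assumption, not from the original), a pixi configuration inside a pyproject.toml could look like:

```toml
[tool.pixi.project]
channels = ["conda-forge"]
platforms = ["linux-64"]

# [tasks] in pixi.toml becomes [tool.pixi.tasks] here
[tool.pixi.tasks]
test = "pytest"
```

Note that in a pyproject.toml, pixi can pick up metadata such as the project name from the standard [project] table.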

Manifest discovery#

+

The manifest can be found at the following locations.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
PriorityLocationComments
6--manifest-pathCommand-line argument
5pixi.tomlIn your current working directory.
4pyproject.tomlIn your current working directory.
3pixi.toml or pyproject.tomlIterate through all parent directories. The first discovered manifest is used.
1$PIXI_PROJECT_MANIFESTIf $PIXI_IN_SHELL is set. This happens with pixi shell or pixi run.
+
+

Note

+

If multiple locations exist, the manifest with the highest priority will be used.

+
+

The project table#

+

The minimally required information in the project table is:

+
[project]
+channels = ["conda-forge"]
+name = "project-name"
+platforms = ["linux-64"]
+
+

name#

+

The name of the project.

+
name = "project-name"
+
+

channels#

+

This is a list that defines the channels used to fetch the packages from. +If you want to use channels hosted on anaconda.org you only need to use the name of the channel directly.

+
channels = ["conda-forge", "robostack", "bioconda", "nvidia", "pytorch"]
+
+

Channels situated on the file system are also supported with absolute file paths:

+
channels = ["conda-forge", "file:///home/user/staged-recipes/build_artifacts"]
+
+

To access private or public channels on prefix.dev or Quetz use the url including the hostname:

+
channels = ["conda-forge", "https://repo.prefix.dev/channel-name"]
+
+

platforms#

+

Defines the list of platforms that the project supports. +Pixi solves the dependencies for all these platforms and puts them in the lock file (pixi.lock).

+
platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"]
+
+

The available platforms are listed here: link

+
+

Special macOS behavior

+

macOS has two platforms: osx-64 for Intel Macs and osx-arm64 for Apple Silicon Macs. +To support both, include both in your platforms list. +Fallback: If osx-arm64 can't resolve, use osx-64. +Running osx-64 on Apple Silicon uses Rosetta for Intel binaries.

+
+

version (optional)#

+

The version of the project. +This should be a valid version based on the conda Version Spec. +See the version documentation for an explanation of what is allowed in a Version Spec.

+
version = "1.2.3"
+
+

authors (optional)#

+

This is a list of authors of the project.

+
authors = ["John Doe <j.doe@prefix.dev>", "Marie Curie <mss1867@gmail.com>"]
+
+

description (optional)#

+

This should contain a short description of the project.

+
description = "A simple description"
+
+

license (optional)#

+

The license as a valid SPDX string (e.g. MIT AND Apache-2.0)

+
license = "MIT"
+
+

license-file (optional)#

+

Relative path to the license file.

+
license-file = "LICENSE.md"
+
+

readme (optional)#

+

Relative path to the README file.

+
readme = "README.md"
+
+

homepage (optional)#

+

URL of the project homepage.

+
homepage = "https://pixi.sh"
+
+

repository (optional)#

+

URL of the project source repository.

+
repository = "https://github.com/prefix-dev/pixi"
+
+

documentation (optional)#

+

URL of the project documentation.

+
documentation = "https://pixi.sh"
+
+

conda-pypi-map (optional)#

+

A mapping from channel name or URL to the location of a mapping file, which can be a URL or a path. +The mapping file should be JSON, mapping conda_name to pypi_package_name. +Example:

+
local/robostack_mapping.json
{
+  "jupyter-ros": "my-name-from-mapping",
+  "boltons": "boltons-pypi"
+}
+
+

If conda-forge is not present in conda-pypi-map, pixi will use the prefix.dev mapping for it.

+
conda-pypi-map = { "conda-forge" = "https://example.com/mapping", "https://repo.prefix.dev/robostack" = "local/robostack_mapping.json"}
+
+

channel-priority (optional)#

+

This is the setting for the priority of the channels in the solver step.

+

Options:

+
  • strict: Default. The channels are used in the order they are defined in the channels list. Only packages from the first channel that has the package are used. This ensures that different variants of a single package are not mixed from different channels. Using packages from different incompatible channels like conda-forge and main can lead to hard-to-debug ABI incompatibilities.

    We strongly recommend not switching away from the default.

  • disabled: There is no priority; all package variants from all channels are considered per package name and solved as one. Care should be taken when using this option. Since package variants can come from any channel in this mode, packages might not be compatible, which can cause hard-to-debug ABI incompatibilities.

    We strongly discourage using this option.

channel-priority = "disabled"
+
+
+

channel-priority = "disabled" is a security risk

+

Disabling channel priority may lead to unpredictable dependency resolutions. +This is a possible security risk as it may lead to packages being installed from unexpected channels. +It's advisable to maintain the default strict setting and order channels thoughtfully. +If necessary, specify a channel directly for a dependency. +

[project]
+# Putting conda-forge first solves most issues
+channels = ["conda-forge", "channel-name"]
+[dependencies]
+package = {version = "*", channel = "channel-name"}
+

+
+

The tasks table#

+

Tasks are a way to automate certain custom commands in your project. +For example, a lint or format step. +Tasks in a pixi project are essentially cross-platform shell commands, with a unified syntax across platforms. +For more in-depth information, check the Advanced tasks documentation. +Pixi's tasks are run in a pixi environment using pixi run and are executed using the deno_task_shell.

+
[tasks]
+simple = "echo This is a simple task"
+cmd = { cmd="echo Same as a simple task but now more verbose"}
+depending = { cmd="echo run after simple", depends-on="simple"}
+alias = { depends-on=["depending"]}
+download = { cmd="curl -o file.txt https://example.com/file.txt" , outputs=["file.txt"]}
+build = { cmd="npm build", cwd="frontend", inputs=["frontend/package.json", "frontend/*.js"]}
+run = { cmd="python run.py $ARGUMENT", env={ ARGUMENT="value" }}
+format = { cmd="black $INIT_CWD" } # runs black where you run pixi run format
+clean-env = { cmd = "python isolated.py", clean-env = true} # Only on Unix!
+
+

You can modify this table using pixi task.

+
+

Note

+

Specify different tasks for different platforms using the target table

+
+
+

Info

+

If you want to hide a task from showing up with pixi task list or pixi info, you can prefix the name with _. +For example, if you want to hide depending, you can rename it to _depending.

+
+
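As a sketch of this convention (the task names and commands are illustrative, not from the original), a hidden helper task could look like:

```toml
[tasks]
lint = { depends-on = ["_check-format"] }
# Prefixed with `_`, so hidden from `pixi task list` and `pixi info`
_check-format = "black --check ."
```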

The system-requirements table#

+

The system requirements are used to define minimal system specifications used during dependency resolution.

+

For example, we can define a unix system with a specific minimal libc version. +

[system-requirements]
+libc = "2.28"
+
+or make the project depend on a specific version of cuda: +
[system-requirements]
+cuda = "12"
+

+

The options are:

+
  • linux: The minimal version of the Linux kernel.
  • libc: The minimal version of the libc library. Also allows specifying the family of the libc library, e.g. libc = { family="glibc", version="2.28" }.
  • macos: The minimal version of the macOS operating system.
  • cuda: The minimal version of the CUDA library.
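Several of these options can be combined in a single table; the versions below are illustrative, not recommendations:

```toml
[system-requirements]
linux = "4.18"
libc = { family = "glibc", version = "2.28" }
cuda = "12"
```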

More information in the system requirements documentation.

+

The pypi-options table#

+

The pypi-options table is used to define options that are specific to PyPI registries. +These options can be specified either at the root level, which adds them to the default feature, or at the feature level, which creates a union of these options when the features are included in the environment.
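As an illustrative sketch of this union behavior (the feature name and index URLs are assumptions for illustration), options defined at the root and in a feature are merged when the feature is part of an environment:

```toml
[pypi-options]
extra-index-urls = ["https://example.com/simple"]

[feature.cuda.pypi-options]
extra-index-urls = ["https://download.pytorch.org/whl/cu118"]

[environments]
# This environment sees both extra index URLs
cuda = ["cuda"]
```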

+

The options that can be defined are:

+
  • index-url: replaces the main index URL.
  • extra-index-urls: adds extra index URLs.
  • find-links: similar to the --find-links option in pip.
  • no-build-isolation: disables build isolation; can only be set per package.
  • index-strategy: allows specifying the index strategy to use.

These options are explained in the sections below. Most of them are taken directly, or with slight modifications, from the uv settings. If any you need are missing, feel free to create an issue requesting them.

+

Alternative registries#

+
+

Strict Index Priority

+

Unlike pip, because we make use of uv, we have a strict index priority: the first index where a package can be found is used. +The order is determined by the order in the TOML file, with the extra-index-urls preferred over the index-url. Read more about this in the uv docs.

+
+

Often you might want to use an alternative or extra index for your project. This can be done by adding the pypi-options table to your pixi.toml file, the following options are available:

+
  • index-url: replaces the main index URL. If this is not set, the default index https://pypi.org/simple is used. Only one index-url can be defined per environment.
  • extra-index-urls: adds extra index URLs. The URLs are used in the order they are defined and are preferred over the index-url. These are merged across features into an environment.
  • find-links: either a path, {path = './links'}, or a URL, {url = 'https://example.com/links'}. This is similar to the --find-links option in pip. These are merged across features into an environment.

An example:

+
[pypi-options]
+index-url = "https://pypi.org/simple"
+extra-index-urls = ["https://example.com/simple"]
+find-links = [{path = './links'}]
+
+

There are some examples in the pixi repository, that make use of this feature.

+
+

Authentication Methods

+

To read about existing authentication methods for private registries, please check the PyPI Authentication section.

+
+

No Build Isolation#

+

Even though build isolation is a good default, you can choose not to isolate the build for a certain package name; this allows the build to access the pixi environment. +This is convenient if you want to use torch or something similar in your build process.

+
[dependencies]
+pytorch = "2.4.0"
+
+[pypi-options]
+no-build-isolation = ["detectron2"]
+
+[pypi-dependencies]
+detectron2 = { git = "https://github.com/facebookresearch/detectron2.git", rev = "5b72c27ae39f99db75d43f18fd1312e1ea934e60"}
+
+
+

Conda dependencies define the build environment

+

To use no-build-isolation effectively, use conda dependencies to define the build environment. These are installed before the PyPI dependencies are resolved; this way, they are available during the build process. In the example above, adding torch as a PyPI dependency would be ineffective, as it would not yet be installed during the PyPI resolution phase.

+
+

Index Strategy#

+

The strategy to use when resolving against multiple index URLs. Description modified from the uv documentation:

+

By default, uv, and thus pixi, will stop at the first index on which a given package is available, and limit resolutions to those present on that first index (first-match). This prevents dependency confusion attacks, whereby an attacker can upload a malicious package under the same name to a secondary index.

+
+

One index strategy per environment

+

Only one index-strategy can be defined per environment or solve group; otherwise, an error will be shown.

+
+

Possible values:#

+
  • "first-index": Only use results from the first index that returns a match for a given package name.
  • "unsafe-first-match": Search for every package name across all indexes, exhausting the versions from the first index before moving on to the next. This means that if package a is available on indexes x and y, the version from x is preferred, unless you've requested a package version that is only available on y.
  • "unsafe-best-match": Search for every package name across all indexes, preferring the best version found. If a package version is in multiple indexes, only the entry from the first index is considered. So given indexes x and y that both contain package a, the best version from either x or y is taken, but should that version be available on both indexes, x is preferred.
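The strategy is set in the pypi-options table, for example:

```toml
[pypi-options]
index-strategy = "unsafe-best-match"
```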
+

PyPI only

+

The index-strategy only changes PyPI package resolution and not conda package resolution.

+
+

The dependencies table(s)#

+

This section defines what dependencies you would like to use for your project.

+

There are multiple dependencies tables. +The default is [dependencies], which are dependencies that are shared across platforms.

+

Dependencies are defined using a VersionSpec. +A VersionSpec combines a Version with an optional operator.

+

Some examples are:

+
# Use this exact package version
+package0 = "1.2.3"
+# Use 1.2.3 up to, but not including, 1.3.0
+package1 = "~=1.2.3"
+# Use a version greater than 1.2 and at most 1.4
+package2 = ">1.2,<=1.4"
+# Use a version of at least 1.2.3, or below 1.0.0
+package3 = ">=1.2.3|<1.0.0"
+
+

Dependencies can also be defined as a mapping, using a matchspec:

+
package0 = { version = ">=1.2.3", channel="conda-forge" }
+package1 = { version = ">=1.2.3", build="py34_0" }
+
+
+

Tip

+

The dependencies can be easily added using the pixi add command line. +Running add for an existing dependency will replace it with the newest version it can use.

+
+
+

Note

+

To specify different dependencies for different platforms use the target table

+
+

dependencies#

+

Add any conda package dependency that you want to install into the environment. +Don't forget to add the channel to the project table should you use anything other than conda-forge. +Even if a dependency defines a channel, that channel should be added to the project.channels list.

+
[dependencies]
+python = ">3.9,<=3.11"
+rust = "1.72"
+pytorch-cpu = { version = "~=1.1", channel = "pytorch" }
+
+

pypi-dependencies#

+
+Details regarding the PyPI integration +

We use uv, which is a new fast pip replacement written in Rust.

+

We integrate uv as a library, so we use the uv resolver, to which we pass the conda packages as 'locked'. +This prevents uv from installing these dependencies itself and ensures it uses the exact versions of these packages in the resolution. +This is unique amongst conda-based package managers, which usually just call pip from a subprocess.

+

The uv resolution is included in the lock file directly.

+
+

Pixi directly supports depending on PyPI packages; the PyPA calls a distributed package a 'distribution'. +There are Source and Binary distributions, both +of which are supported by pixi. +These distributions are installed into the environment after the conda environment has been resolved and installed. +PyPI packages are not indexed on prefix.dev but can be viewed on pypi.org.

+
+

Important considerations

+
  • Stability: PyPI packages might be less stable than their conda counterparts. Prefer using conda packages in the dependencies table where possible.
+

Version specification:#

+

These dependencies don't follow the conda matchspec specification. +The version is a string specification of the version according to the PyPA version specifiers (PEP 440). +Additionally, a list of extras can be included, which are essentially optional dependencies. +Note that this version is distinct from the conda MatchSpec type. +See the example below to see how this is used in practice:

+
[dependencies]
+# When using pypi-dependencies, python is needed to resolve pypi dependencies
+# make sure to include this
+python = ">=3.6"
+
+[pypi-dependencies]
+fastapi = "*"  # This means any version (the wildcard `*` is a pixi addition, not part of the specification)
+pre-commit = "~=3.5.0" # This is a single version specifier
+# Using the toml map allows the user to add `extras`
+pandas = { version = ">=1.0.0", extras = ["dataframe", "sql"]}
+
+# git dependencies
+# With ssh
+flask = { git = "ssh://git@github.com/pallets/flask" }
+# With https and a specific revision
+requests = { git = "https://github.com/psf/requests.git", rev = "0106aced5faa299e6ede89d1230bd6784f2c3660" }
+# TODO: will support later -> branch = '' or tag = '' to specify a branch or tag
+
+# You can also directly add a source dependency from a path; tip: keep this relative to the root of the project.
+minimal-project = { path = "./minimal-project", editable = true}
+
+# You can also use a direct url, to either a `.tar.gz` or `.zip`, or a `.whl` file
+click = { url = "https://github.com/pallets/click/releases/download/8.1.7/click-8.1.7-py3-none-any.whl" }
+
+# You can also just use the default git repo; it will check out the default branch
+pytest = { git = "https://github.com/pytest-dev/pytest.git"}
+
+

Full specification#

+

The full specification of the PyPI dependencies that pixi supports can be split into the following fields:

+
extras#
+

A list of extras to install with the package, e.g. ["dataframe", "sql"]. +The extras field can be combined with all other version specifiers, as it is an addition to the version specifier.

+
pandas = { version = ">=1.0.0", extras = ["dataframe", "sql"]}
+pytest = { git = "URL", extras = ["dev"]}
+black = { url = "URL", extras = ["cli"]}
+minimal-project = { path = "./minimal-project", editable = true, extras = ["dev"]}
+
+
version#
+

The version of the package to install, e.g. ">=1.0.0", or "*", which stands for any version (the wildcard is pixi-specific). +version is the default field, so using no inline table ({}) will default to this field.

+
py-rattler = "*"
+ruff = "~=1.0.0"
+pytest = {version = "*", extras = ["dev"]}
+
+
index#
+

The index parameter allows you to specify the URL of a custom package index for the installation of a specific package. +This feature is useful when you want to ensure that a package is retrieved from a particular source, rather than from the default index.

+

For example, to use some other than the official Python Package Index (PyPI) at https://pypi.org/simple, you can use the index parameter:

+
torch = { version = "*", index = "https://download.pytorch.org/whl/cu118" }
+
+

This is useful for PyTorch specifically, as the registries are pinned to different CUDA versions.

+
git#
+

A git repository to install from. +This supports both https:// and ssh:// URLs.

+

Use git in combination with rev or subdirectory:

+
  • rev: A specific revision to install, e.g. rev = "0106aced5faa299e6ede89d1230bd6784f2c3660"
  • subdirectory: A subdirectory to install from, e.g. subdirectory = "src" or subdirectory = "src/packagex"
# Note don't forget the `ssh://` or `https://` prefix!
+pytest = { git = "https://github.com/pytest-dev/pytest.git"}
+requests = { git = "https://github.com/psf/requests.git", rev = "0106aced5faa299e6ede89d1230bd6784f2c3660" }
+py-rattler = { git = "ssh://git@github.com/mamba-org/rattler.git", subdirectory = "py-rattler" }
+
+
path#
+

A local path to install from, e.g. path = "./path/to/package". +We advise keeping your path projects within the project, and using a relative path.

+

Set editable to true to install in editable mode, e.g. editable = true. This is highly recommended, as a path dependency is hard to reinstall if you're not using editable mode.

+
minimal-project = { path = "./minimal-project", editable = true}
+
+
url#
+

A URL to a wheel or sdist, to install it directly from that URL.

+
pandas = {url = "https://files.pythonhosted.org/packages/3d/59/2afa81b9fb300c90531803c0fd43ff4548074fa3e8d0f747ef63b3b5e77a/pandas-2.2.1.tar.gz"}
+
+
+Did you know you can use: add --pypi? +

Use the --pypi flag with the add command to quickly add PyPI packages from the CLI. +E.g. pixi add --pypi flask

+

This does not support all the features of the pypi-dependencies table yet.

+
+

Source dependencies (sdist)#

+

The Source Distribution Format (sdist for short) is a source-based format that a package can publish alongside the binary wheel format. +Because these distributions need to be built, they need a Python executable to do so. +This is why Python needs to be present in the conda environment. +Sdists usually depend on system packages to be built, especially when compiling C/C++-based Python bindings. +Think, for example, of Python SDL2 bindings depending on the C library SDL2. +To help build these dependencies, we activate the conda environment that includes these PyPI dependencies before resolving. +This way, when a source distribution depends on gcc, for example, it is used from the conda environment instead of the system.

+

host-dependencies#

+

This table contains dependencies that are needed to build your project but which should not be included when your project is installed as part of another project. +In other words, these dependencies are available during the build but are no longer available when your project is installed. +Dependencies listed in this table are installed for the architecture of the target machine.

+
[host-dependencies]
+python = "~=3.10.3"
+
+

Typical examples of host dependencies are:

+
  • Base interpreters: a Python package would list python here and an R package would list mro-base or r-base.
  • Libraries your project links against during compilation, like openssl, rapidjson, or xtensor.

build-dependencies#

+

This table contains dependencies that are needed to build the project. +Different from dependencies and host-dependencies these packages are installed for the architecture of the build machine. +This enables cross-compiling from one machine architecture to another.

+
[build-dependencies]
+cmake = "~=3.24"
+
+

Typical examples of build dependencies are:

+
  • Compilers, which are invoked on the build machine but generate code for the target machine. If the project is cross-compiled, the architectures of the build and target machines might differ.
  • cmake, which is invoked on the build machine to generate additional code or project files, which are then included in the compilation process.
+

Info

+

The build target refers to the machine that will execute the build. +Programs and libraries installed by these dependencies will be executed on the build machine.

+

For example, if you compile on a MacBook with an Apple Silicon chip but target Linux x86_64 then your build platform is osx-arm64 and your host platform is linux-64.

+
+

The activation table#

+

The activation table is used for specialized activation operations that need to be run when the environment is activated.

+

There are two types of activation operations a user can modify in the manifest:

+
  • scripts: A list of scripts that are run when the environment is activated.
  • env: A mapping of environment variables that are set when the environment is activated.

These activation operations will be run before the pixi run and pixi shell commands.

+
+

Note

+

The activation operations are run by the system shell interpreter, as they run before an environment is available. +This means they run in cmd.exe on Windows and bash on Linux and macOS (Unix). +Only .sh, .bash and .bat files are supported.

+

And the environment variables are set in the shell that is running the activation script, thus take note when using e.g. $ or %.

+

If you have scripts or env variable per platform use the target table.

+
+
[activation]
+scripts = ["env_setup.sh"]
+env = { ENV_VAR = "value" }
+
+# To support windows platforms as well add the following
+[target.win-64.activation]
+scripts = ["env_setup.bat"]
+
+[target.linux-64.activation.env]
+ENV_VAR = "linux-value"
+
+# You can also reference existing environment variables, but this has
+# to be done separately for unix-like operating systems and Windows
+[target.unix.activation.env]
+ENV_VAR = "$OTHER_ENV_VAR/unix-value"
+
+[target.win.activation.env]
+ENV_VAR = "%OTHER_ENV_VAR%\\windows-value"
+
+

The target table#

+

The target table allows for platform-specific configuration, +letting you define different sets of tasks or dependencies per platform.

+

The target table is currently implemented for the following sub-tables:

+ +

The target table is defined using [target.PLATFORM.SUB-TABLE]. +E.g [target.linux-64.dependencies]

+

The platform can be any of:

+
  • win, osx, linux or unix (unix matches both linux and osx)
  • or any of the (more) specific target platforms, e.g. linux-64, osx-arm64

The sub-table can be any of those specified above.

+

To make it a bit clearer, let's look at the example below. +Currently, pixi combines top-level tables like dependencies with the target-specific ones into a single set, which, in the case of dependencies, can both add and overwrite dependencies. +In the example below, cmake is used for all targets, but on osx-64 or osx-arm64 a different version of python will be selected.

+
[dependencies]
+cmake = "3.26.4"
+python = "3.10"
+
+[target.osx.dependencies]
+python = "3.11"
+
+

Here are some more examples:

+
[target.win-64.activation]
+scripts = ["setup.bat"]
+
+[target.win-64.dependencies]
+msmpi = "~=10.1.1"
+
+[target.win-64.build-dependencies]
+vs2022_win-64 = "19.36.32532"
+
+[target.win-64.tasks]
+tmp = "echo $TEMP"
+
+[target.osx-64.dependencies]
+clang = ">=16.0.6"
+
+

The feature and environments tables#

+

The feature table allows you to define features that can be used to create different [environments]. +The [environments] table allows you to define different environments. The design is explained in this design document.

+
Simplest example
[feature.test.dependencies]
+pytest = "*"
+
+[environments]
+test = ["test"]
+
+

This will create an environment called test that has pytest installed.

+

The feature table#

+

The feature table allows you to define the following fields per feature.

+
  • dependencies: Same as the dependencies.
  • pypi-dependencies: Same as the pypi-dependencies.
  • pypi-options: Same as the pypi-options.
  • system-requirements: Same as the system-requirements.
  • activation: Same as the activation.
  • platforms: Same as the platforms. Unless overridden, the platforms of the feature will be those defined at project level.
  • channels: Same as the channels. Unless overridden, the channels of the feature will be those defined at project level.
  • channel-priority: Same as the channel-priority.
  • target: Same as the target.
  • tasks: Same as the tasks.

These tables are all also available without the feature prefix. +When used without the prefix, we call them the default feature. This is a protected name you cannot use for your own feature.

+
Cuda feature table example
[feature.cuda]
+activation = {scripts = ["cuda_activation.sh"]}
+# Results in:  ["nvidia", "conda-forge"] when the default is `conda-forge`
+channels = ["nvidia"]
+dependencies = {cuda = "x.y.z", cudnn = "12.0"}
+pypi-dependencies = {torch = "==1.9.0"}
+platforms = ["linux-64", "osx-arm64"]
+system-requirements = {cuda = "12"}
+tasks = { warmup = "python warmup.py" }
+target.osx-arm64 = {dependencies = {mlx = "x.y.z"}}
+
+
Cuda feature table example but written as separate tables
[feature.cuda.activation]
+scripts = ["cuda_activation.sh"]
+
+[feature.cuda.dependencies]
+cuda = "x.y.z"
+cudnn = "12.0"
+
+[feature.cuda.pypi-dependencies]
+torch = "==1.9.0"
+
+[feature.cuda.system-requirements]
+cuda = "12"
+
+[feature.cuda.tasks]
+warmup = "python warmup.py"
+
+[feature.cuda.target.osx-arm64.dependencies]
+mlx = "x.y.z"
+
+# Channels and Platforms are not available as separate tables as they are implemented as lists
+[feature.cuda]
+channels = ["nvidia"]
+platforms = ["linux-64", "osx-arm64"]
+
+

The environments table#

+

The [environments] table allows you to define environments that are created using the features defined in the [feature] tables.

+

The environments table is defined using the following fields:

+
- features: The features that are included in the environment. Unless no-default-feature is set to true, the default feature is implicitly included in the environment.
- solve-group: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional ones, for instance when testing a production environment with additional test dependencies. Dependencies shared by a solve group resolve to the same version in every environment of that group, while each environment installs only its own subset of the group's dependency set.
- no-default-feature: Whether to exclude the default feature from the environment. The default is false, which includes the default feature.
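The solve-group mechanism can be sketched with a hypothetical manifest (feature and package names are illustrative, not from the source):

```toml
# Hypothetical sketch: `prod` and `test` share a solve group, so `numpy`
# resolves to the same version in both environments; only `test` also
# installs `pytest`.
[feature.prod.dependencies]
numpy = "*"

[feature.test.dependencies]
pytest = "*"

[environments]
prod = { features = ["prod"], solve-group = "main" }
test = { features = ["prod", "test"], solve-group = "main" }
```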

Full environments table specification

```toml
[environments]
test = { features = ["test"], solve-group = "test" }
prod = { features = ["prod"], solve-group = "test" }
lint = { features = ["lint"], no-default-feature = true }
```

As shown in the example above, in the simplest of cases it is possible to define an environment only by listing its features:

Simplest example

```toml
[environments]
test = ["test"]
```

is equivalent to

Simplest example expanded

```toml
[environments]
test = { features = ["test"] }
```

When an environment comprises several features (including the default feature):

- The activation and tasks of the environment are the union of the activation and tasks of all its features.
- The dependencies and pypi-dependencies of the environment are the union of the dependencies and pypi-dependencies of all its features. This means that if several features define a requirement for the same package, the requirements are combined. Beware of conflicting requirements across features added to the same environment.
- The system-requirements of the environment are the union of the system-requirements of all its features. If multiple features specify a requirement for the same system package, the highest version is chosen.
- The channels of the environment are the union of the channels of all its features. Channel priorities can be specified in each feature to ensure channels are considered in the right order in the environment.
- The platforms of the environment are the intersection of the platforms of all its features. Be aware that the platforms supported by a feature (including the default feature) default to the platforms defined at project level unless overridden in the feature. This means it is usually a good idea to set the project platforms to all platforms it can support across its environments.
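The platform intersection rule can be illustrated with a hypothetical manifest (project, feature, and platform choices are illustrative):

```toml
# Hypothetical sketch: the project supports three platforms, but the
# `cuda` feature only two, so any environment that includes `cuda`
# supports only the intersection: linux-64 and win-64.
[project]
name = "example"
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64", "win-64"]

[feature.cuda]
platforms = ["linux-64", "win-64"]

[environments]
gpu = ["cuda"]
```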

Preview features#

Pixi sometimes introduces new features that are not yet stable but that we would like users to test out. These are called preview features. Preview features are disabled by default and can be enabled by setting the preview field in the project manifest. The preview field is an array of strings that specifies the preview features to enable, or the boolean value true to enable all preview features.

An example of a preview feature in the project manifest:

Example preview features in the project manifest

```toml
[project]
name = "foo"
channels = []
platforms = []
preview = ["new-resolve"]
```
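Since the preview field also accepts a boolean, all preview features can be enabled at once rather than listed individually; a minimal sketch:

```toml
[project]
name = "foo"
channels = []
platforms = []
# Enable every preview feature instead of naming each one.
preview = true
```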

Preview features in the documentation will be marked as such on the relevant pages.


Global configuration#


The global configuration options are documented in the global configuration section.

+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/reference/project_configuration/index.html b/v0.39.2/reference/project_configuration/index.html new file mode 100644 index 000000000..1aea2b718 --- /dev/null +++ b/v0.39.2/reference/project_configuration/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/v0.39.2/schema/manifest/schema.json b/v0.39.2/schema/manifest/schema.json new file mode 100644 index 000000000..f624df24c --- /dev/null +++ b/v0.39.2/schema/manifest/schema.json @@ -0,0 +1,1702 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "$id": "https://pixi.sh/v0.39.2/schema/manifest/schema.json", + "title": "`pixi.toml` manifest file", + "description": "The configuration for a [`pixi`](https://pixi.sh) project.", + "type": "object", + "additionalProperties": false, + "oneOf": [ + { + "required": [ + "project" + ] + }, + { + "required": [ + "workspace" + ] + } + ], + "properties": { + "$schema": { + "title": "Schema", + "description": "The schema identifier for the project's configuration", + "type": "string", + "default": "https://pixi.sh/v0.39.2/schema/manifest/schema.json", + "format": "uri-reference" + }, + "activation": { + "$ref": "#/$defs/Activation", + "description": "The scripts used on the activation of the project" + }, + "build-backend": { + "title": "Build-Backend", + "description": "Configuration for the build backend.", + "type": "object" + }, + "build-dependencies": { + "title": "Build-Dependencies", + "description": "The build `conda` dependencies, used in the build process", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "build-system": { + "$ref": "#/$defs/BuildSystem", + "description": "The build-system used to build the package." 
+ }, + "dependencies": { + "title": "Dependencies", + "description": "The `conda` dependencies, consisting of a package name and a requirement in [MatchSpec](https://github.com/conda/conda/blob/078e7ee79381060217e1ec7f9b0e9cf80ecc8f3f/conda/models/match_spec.py) format", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "environments": { + "title": "Environments", + "description": "The environments of the project, defined as a full object or a list of feature names.", + "type": "object", + "patternProperties": { + "^[a-z\\d\\-]+$": { + "anyOf": [ + { + "$ref": "#/$defs/Environment" + }, + { + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + } + ] + } + } + }, + "feature": { + "title": "Feature", + "description": "The features of the project", + "type": "object", + "additionalProperties": { + "$ref": "#/$defs/Feature" + } + }, + "host-dependencies": { + "title": "Host-Dependencies", + "description": "The host `conda` dependencies, used in the build process", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + }, + "examples": [ + { + "python": ">=3.8" + } + ] + }, + "package": { + "$ref": "#/$defs/Package", + "description": "The package's metadata information" + }, + "project": { + "$ref": "#/$defs/Workspace", + "description": "The project's metadata information" + }, + "pypi-dependencies": { + "title": "Pypi-Dependencies", + "description": "The PyPI dependencies", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/PyPIVersion" + }, + { + "$ref": "#/$defs/PyPIGitBranchRequirement" + }, + { + "$ref": "#/$defs/PyPIGitTagRequirement" + }, + { + "$ref": "#/$defs/PyPIGitRevRequirement" + }, + { + "$ref": "#/$defs/PyPIPathRequirement" + }, + { + "$ref": 
"#/$defs/PyPIUrlRequirement" + } + ] + } + }, + "pypi-options": { + "$ref": "#/$defs/PyPIOptions", + "description": "Options related to PyPI indexes, on the default feature" + }, + "run-dependencies": { + "title": "Run-Dependencies", + "description": "The run-dependencies for the [package]", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "system-requirements": { + "$ref": "#/$defs/SystemRequirements", + "description": "The system requirements of the project" + }, + "target": { + "title": "Target", + "description": "The targets of the project", + "type": "object", + "additionalProperties": { + "$ref": "#/$defs/Target" + }, + "examples": [ + { + "linux": { + "dependencies": { + "python": "3.8" + } + } + } + ] + }, + "tasks": { + "title": "Tasks", + "description": "The tasks of the project", + "type": "object", + "patternProperties": { + "^[^\\s\\$]+$": { + "anyOf": [ + { + "$ref": "#/$defs/TaskInlineTable" + }, + { + "type": "string", + "minLength": 1 + } + ] + } + } + }, + "tool": { + "title": "Tool", + "description": "Third-party tool configurations, ignored by pixi", + "type": "object" + }, + "workspace": { + "$ref": "#/$defs/Workspace", + "description": "The workspace's metadata information" + } + }, + "$defs": { + "Activation": { + "title": "Activation", + "description": "A description of steps performed when an environment is activated", + "type": "object", + "additionalProperties": false, + "properties": { + "env": { + "title": "Env", + "description": "A map of environment variables to values, used in the activation of the environment. These will be set in the shell. Thus these variables are shell specific. 
Using '$' might not expand to a value in different shells.", + "type": "object", + "additionalProperties": { + "type": "string", + "minLength": 1 + }, + "examples": [ + { + "key": "value" + }, + { + "ARGUMENT": "value" + } + ] + }, + "scripts": { + "title": "Scripts", + "description": "The scripts to run when the environment is activated", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + }, + "examples": [ + "activate.sh", + "activate.bat" + ] + } + } + }, + "BuildBackend": { + "title": "BuildBackend", + "type": "object", + "additionalProperties": false, + "properties": { + "branch": { + "title": "Branch", + "description": "A git branch to use", + "type": "string", + "minLength": 1 + }, + "build": { + "title": "Build", + "description": "The build string of the package", + "type": "string", + "minLength": 1 + }, + "build-number": { + "title": "Build-Number", + "description": "The build number of the package, can be a spec like `>=1` or `<=10` or `1`", + "type": "string", + "minLength": 1 + }, + "channel": { + "title": "Channel", + "description": "The channel the packages needs to be fetched from", + "type": "string", + "minLength": 1, + "examples": [ + "conda-forge", + "pytorch", + "https://repo.prefix.dev/conda-forge" + ] + }, + "file-name": { + "title": "File-Name", + "description": "The file name of the package", + "type": "string", + "minLength": 1 + }, + "git": { + "title": "Git", + "description": "The git URL to the repo", + "type": "string", + "minLength": 1 + }, + "md5": { + "title": "Md5", + "description": "The md5 hash of the package", + "type": "string", + "pattern": "^[a-fA-F0-9]{32}$" + }, + "name": { + "title": "Name", + "description": "The name of the build backend package", + "type": "string", + "minLength": 1 + }, + "path": { + "title": "Path", + "description": "The path to the package", + "type": "string", + "minLength": 1 + }, + "rev": { + "title": "Rev", + "description": "A git SHA revision to use", + "type": "string", + 
"minLength": 1 + }, + "sha256": { + "title": "Sha256", + "description": "The sha256 hash of the package", + "type": "string", + "pattern": "^[a-fA-F0-9]{64}$" + }, + "subdir": { + "title": "Subdir", + "description": "The subdir of the package, also known as platform", + "type": "string", + "minLength": 1 + }, + "tag": { + "title": "Tag", + "description": "A git tag to use", + "type": "string", + "minLength": 1 + }, + "url": { + "title": "Url", + "description": "The URL to the package", + "type": "string", + "minLength": 1 + }, + "version": { + "title": "Version", + "description": "The version of the package in [MatchSpec](https://github.com/conda/conda/blob/078e7ee79381060217e1ec7f9b0e9cf80ecc8f3f/conda/models/match_spec.py) format", + "type": "string", + "minLength": 1 + } + } + }, + "BuildSystem": { + "title": "BuildSystem", + "type": "object", + "required": [ + "build-backend" + ], + "additionalProperties": false, + "properties": { + "additional-dependencies": { + "title": "Additional-Dependencies", + "description": "Additional dependencies to install alongside the build backend", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "build-backend": { + "$ref": "#/$defs/BuildBackend", + "description": "The build backend to instantiate" + }, + "channels": { + "title": "Channels", + "description": "The `conda` channels that are used to fetch the build backend from", + "type": "array", + "items": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "type": "string", + "format": "uri", + "minLength": 1 + }, + { + "$ref": "#/$defs/ChannelInlineTable" + } + ] + } + } + } + }, + "ChannelInlineTable": { + "title": "ChannelInlineTable", + "description": "A precise description of a `conda` channel, with an optional priority.", + "type": "object", + "required": [ + "channel" + ], + "additionalProperties": false, + "properties": { + "channel": { + 
"title": "Channel", + "description": "The channel the packages needs to be fetched from", + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "type": "string", + "format": "uri", + "minLength": 1 + } + ] + }, + "priority": { + "title": "Priority", + "description": "The priority of the channel", + "type": "integer" + } + } + }, + "ChannelPriority": { + "title": "ChannelPriority", + "description": "The priority of the channel.", + "type": "string", + "enum": [ + "disabled", + "strict" + ] + }, + "Environment": { + "title": "Environment", + "description": "A composition of the dependencies of features which can be activated to run tasks or provide a shell", + "type": "object", + "additionalProperties": false, + "properties": { + "features": { + "title": "Features", + "description": "The features that define the environment", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + "no-default-feature": { + "title": "No-Default-Feature", + "description": "Whether to add the default feature to this environment", + "type": "boolean", + "default": false + }, + "solve-group": { + "title": "Solve-Group", + "description": "The group name for environments that should be solved together", + "type": "string", + "minLength": 1 + } + } + }, + "Feature": { + "title": "Feature", + "description": "A composable aspect of the project which can contribute dependencies and tasks to an environment", + "type": "object", + "additionalProperties": false, + "properties": { + "activation": { + "$ref": "#/$defs/Activation", + "description": "The scripts used on the activation of environments using this feature" + }, + "build-dependencies": { + "title": "Build-Dependencies", + "description": "The build `conda` dependencies, used in the build process", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "channel-priority": { + "$ref": 
"#/$defs/ChannelPriority", + "description": "The type of channel priority that is used in the solve.- 'strict': only take the package from the channel it exist in first.- 'disabled': group all dependencies together as if there is no channel difference.", + "examples": [ + "strict", + "disabled" + ] + }, + "channels": { + "title": "Channels", + "description": "The `conda` channels that can be considered when solving environments containing this feature", + "type": "array", + "items": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "type": "string", + "format": "uri", + "minLength": 1 + }, + { + "$ref": "#/$defs/ChannelInlineTable" + } + ] + } + }, + "dependencies": { + "title": "Dependencies", + "description": "The `conda` dependencies, consisting of a package name and a requirement in [MatchSpec](https://github.com/conda/conda/blob/078e7ee79381060217e1ec7f9b0e9cf80ecc8f3f/conda/models/match_spec.py) format", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "host-dependencies": { + "title": "Host-Dependencies", + "description": "The host `conda` dependencies, used in the build process", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + }, + "examples": [ + { + "python": ">=3.8" + } + ] + }, + "platforms": { + "title": "Platforms", + "description": "The platforms that the feature supports: a union of all features combined in one environment is used for the environment.", + "type": "array", + "items": { + "$ref": "#/$defs/Platform" + } + }, + "pypi-dependencies": { + "title": "Pypi-Dependencies", + "description": "The PyPI dependencies of this feature", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/PyPIVersion" + }, + { + "$ref": 
"#/$defs/PyPIGitBranchRequirement" + }, + { + "$ref": "#/$defs/PyPIGitTagRequirement" + }, + { + "$ref": "#/$defs/PyPIGitRevRequirement" + }, + { + "$ref": "#/$defs/PyPIPathRequirement" + }, + { + "$ref": "#/$defs/PyPIUrlRequirement" + } + ] + } + }, + "pypi-options": { + "$ref": "#/$defs/PyPIOptions", + "description": "Options related to PyPI indexes for this feature" + }, + "system-requirements": { + "$ref": "#/$defs/SystemRequirements", + "description": "The system requirements of this feature" + }, + "target": { + "title": "Target", + "description": "Machine-specific aspects of this feature", + "type": "object", + "additionalProperties": { + "$ref": "#/$defs/Target" + }, + "examples": [ + { + "linux": { + "dependencies": { + "python": "3.8" + } + } + } + ] + }, + "tasks": { + "title": "Tasks", + "description": "The tasks provided by this feature", + "type": "object", + "patternProperties": { + "^[^\\s\\$]+$": { + "anyOf": [ + { + "$ref": "#/$defs/TaskInlineTable" + }, + { + "type": "string", + "minLength": 1 + } + ] + } + } + } + } + }, + "FindLinksPath": { + "title": "FindLinksPath", + "description": "The path to the directory containing packages", + "type": "object", + "additionalProperties": false, + "properties": { + "path": { + "title": "Path", + "description": "Path to the directory of packages", + "type": "string", + "minLength": 1, + "examples": [ + "./links" + ] + } + } + }, + "FindLinksURL": { + "title": "FindLinksURL", + "description": "The URL to the html file containing href-links to packages", + "type": "object", + "additionalProperties": false, + "properties": { + "url": { + "title": "Url", + "description": "URL to html file with href-links to packages", + "type": "string", + "minLength": 1, + "examples": [ + "https://simple-index-is-here.com" + ] + } + } + }, + "LibcFamily": { + "title": "LibcFamily", + "type": "object", + "additionalProperties": false, + "properties": { + "family": { + "title": "Family", + "description": "The family of the 
`libc`", + "type": "string", + "minLength": 1, + "examples": [ + "glibc", + "musl" + ] + }, + "version": { + "title": "Version", + "description": "The version of `libc`", + "anyOf": [ + { + "type": "number" + }, + { + "type": "string", + "minLength": 1 + } + ] + } + } + }, + "MatchspecTable": { + "title": "MatchspecTable", + "description": "A precise description of a `conda` package version.", + "type": "object", + "additionalProperties": false, + "properties": { + "branch": { + "title": "Branch", + "description": "A git branch to use", + "type": "string", + "minLength": 1 + }, + "build": { + "title": "Build", + "description": "The build string of the package", + "type": "string", + "minLength": 1 + }, + "build-number": { + "title": "Build-Number", + "description": "The build number of the package, can be a spec like `>=1` or `<=10` or `1`", + "type": "string", + "minLength": 1 + }, + "channel": { + "title": "Channel", + "description": "The channel the packages needs to be fetched from", + "type": "string", + "minLength": 1, + "examples": [ + "conda-forge", + "pytorch", + "https://repo.prefix.dev/conda-forge" + ] + }, + "file-name": { + "title": "File-Name", + "description": "The file name of the package", + "type": "string", + "minLength": 1 + }, + "git": { + "title": "Git", + "description": "The git URL to the repo", + "type": "string", + "minLength": 1 + }, + "md5": { + "title": "Md5", + "description": "The md5 hash of the package", + "type": "string", + "pattern": "^[a-fA-F0-9]{32}$" + }, + "path": { + "title": "Path", + "description": "The path to the package", + "type": "string", + "minLength": 1 + }, + "rev": { + "title": "Rev", + "description": "A git SHA revision to use", + "type": "string", + "minLength": 1 + }, + "sha256": { + "title": "Sha256", + "description": "The sha256 hash of the package", + "type": "string", + "pattern": "^[a-fA-F0-9]{64}$" + }, + "subdir": { + "title": "Subdir", + "description": "The subdir of the package, also known as 
platform", + "type": "string", + "minLength": 1 + }, + "tag": { + "title": "Tag", + "description": "A git tag to use", + "type": "string", + "minLength": 1 + }, + "url": { + "title": "Url", + "description": "The URL to the package", + "type": "string", + "minLength": 1 + }, + "version": { + "title": "Version", + "description": "The version of the package in [MatchSpec](https://github.com/conda/conda/blob/078e7ee79381060217e1ec7f9b0e9cf80ecc8f3f/conda/models/match_spec.py) format", + "type": "string", + "minLength": 1 + } + } + }, + "Package": { + "title": "Package", + "description": "The package's metadata information.", + "type": "object", + "additionalProperties": false, + "properties": { + "authors": { + "title": "Authors", + "description": "The authors of the project", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + }, + "examples": [ + "John Doe " + ] + }, + "description": { + "title": "Description", + "description": "A short description of the project", + "type": "string", + "minLength": 1 + }, + "documentation": { + "title": "Documentation", + "description": "The URL of the documentation of the project", + "type": "string", + "format": "uri", + "minLength": 1 + }, + "homepage": { + "title": "Homepage", + "description": "The URL of the homepage of the project", + "type": "string", + "format": "uri", + "minLength": 1 + }, + "license": { + "title": "License", + "description": "The license of the project; we advise using an [SPDX](https://spdx.org/licenses/) identifier.", + "type": "string", + "minLength": 1 + }, + "license-file": { + "title": "License-File", + "description": "The path to the license file of the project", + "type": "string", + "pattern": "^[^\\\\]+$" + }, + "name": { + "title": "Name", + "description": "The name of the package", + "type": "string", + "minLength": 1 + }, + "readme": { + "title": "Readme", + "description": "The path to the readme file of the project", + "type": "string", + "pattern": "^[^\\\\]+$" + }, + 
"repository": { + "title": "Repository", + "description": "The URL of the repository of the project", + "type": "string", + "format": "uri", + "minLength": 1 + }, + "version": { + "title": "Version", + "description": "The version of the project; we advise use of [SemVer](https://semver.org)", + "type": "string", + "minLength": 1, + "examples": [ + "1.2.3" + ] + } + } + }, + "Platform": { + "title": "Platform", + "description": "A supported operating system and processor architecture pair.", + "type": "string", + "enum": [ + "emscripten-wasm32", + "linux-32", + "linux-64", + "linux-aarch64", + "linux-armv6l", + "linux-armv7l", + "linux-ppc64", + "linux-ppc64le", + "linux-riscv32", + "linux-riscv64", + "linux-s390x", + "noarch", + "osx-64", + "osx-arm64", + "unknown", + "wasi-wasm32", + "win-32", + "win-64", + "win-arm64", + "zos-z" + ] + }, + "PyPIGitBranchRequirement": { + "title": "PyPIGitBranchRequirement", + "type": "object", + "additionalProperties": false, + "properties": { + "branch": { + "title": "Branch", + "description": "A `git` branch to use", + "type": "string", + "minLength": 1 + }, + "extras": { + "title": "Extras", + "description": "The [PEP 508 extras](https://peps.python.org/pep-0508/#extras) of the package", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + "git": { + "title": "Git", + "description": "The `git` URL to the repo e.g https://github.com/prefix-dev/pixi", + "type": "string", + "minLength": 1 + }, + "subdirectory": { + "title": "Subdirectory", + "description": "The subdirectory in the repo, a path from the root of the repo.", + "type": "string", + "minLength": 1 + } + } + }, + "PyPIGitRevRequirement": { + "title": "PyPIGitRevRequirement", + "type": "object", + "additionalProperties": false, + "properties": { + "extras": { + "title": "Extras", + "description": "The [PEP 508 extras](https://peps.python.org/pep-0508/#extras) of the package", + "type": "array", + "items": { + "type": "string", + "minLength": 1 
+ } + }, + "git": { + "title": "Git", + "description": "The `git` URL to the repo e.g https://github.com/prefix-dev/pixi", + "type": "string", + "minLength": 1 + }, + "rev": { + "title": "Rev", + "description": "A `git` SHA revision to use", + "type": "string", + "minLength": 1 + }, + "subdirectory": { + "title": "Subdirectory", + "description": "The subdirectory in the repo, a path from the root of the repo.", + "type": "string", + "minLength": 1 + } + } + }, + "PyPIGitTagRequirement": { + "title": "PyPIGitTagRequirement", + "type": "object", + "additionalProperties": false, + "properties": { + "extras": { + "title": "Extras", + "description": "The [PEP 508 extras](https://peps.python.org/pep-0508/#extras) of the package", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + "git": { + "title": "Git", + "description": "The `git` URL to the repo e.g https://github.com/prefix-dev/pixi", + "type": "string", + "minLength": 1 + }, + "subdirectory": { + "title": "Subdirectory", + "description": "The subdirectory in the repo, a path from the root of the repo.", + "type": "string", + "minLength": 1 + }, + "tag": { + "title": "Tag", + "description": "A `git` tag to use", + "type": "string", + "minLength": 1 + } + } + }, + "PyPIOptions": { + "title": "PyPIOptions", + "description": "Options that determine the behavior of PyPI package resolution and installation", + "type": "object", + "additionalProperties": false, + "properties": { + "extra-index-urls": { + "title": "Extra-Index-Urls", + "description": "Additional PyPI registries that should be used as extra indexes", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + }, + "examples": [ + [ + "https://pypi.org/simple" + ] + ] + }, + "find-links": { + "title": "Find-Links", + "description": "Paths to directory containing", + "type": "array", + "items": { + "anyOf": [ + { + "$ref": "#/$defs/FindLinksPath" + }, + { + "$ref": "#/$defs/FindLinksURL" + } + ] + }, + "examples": [ + [ 
+ "https://pypi.org/simple" + ] + ] + }, + "index-strategy": { + "title": "Index-Strategy", + "description": "The strategy to use when resolving packages from multiple indexes", + "anyOf": [ + { + "const": "first-index" + }, + { + "const": "unsafe-first-match" + }, + { + "const": "unsafe-best-match" + } + ], + "examples": [ + "first-index", + "unsafe-first-match", + "unsafe-best-match" + ] + }, + "index-url": { + "title": "Index-Url", + "description": "PyPI registry that should be used as the primary index", + "type": "string", + "minLength": 1, + "examples": [ + "https://pypi.org/simple" + ] + }, + "no-build-isolation": { + "title": "No-Build-Isolation", + "description": "Packages that should NOT be isolated during the build process", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + }, + "examples": [ + [ + "numpy" + ] + ] + } + } + }, + "PyPIPathRequirement": { + "title": "PyPIPathRequirement", + "type": "object", + "additionalProperties": false, + "properties": { + "editable": { + "title": "Editable", + "description": "If `true` the package will be installed as editable", + "type": "boolean" + }, + "extras": { + "title": "Extras", + "description": "The [PEP 508 extras](https://peps.python.org/pep-0508/#extras) of the package", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + "path": { + "title": "Path", + "description": "A path to a local source or wheel", + "type": "string", + "minLength": 1 + }, + "subdirectory": { + "title": "Subdirectory", + "description": "The subdirectory in the repo, a path from the root of the repo.", + "type": "string", + "minLength": 1 + } + } + }, + "PyPIUrlRequirement": { + "title": "PyPIUrlRequirement", + "type": "object", + "additionalProperties": false, + "properties": { + "extras": { + "title": "Extras", + "description": "The [PEP 508 extras](https://peps.python.org/pep-0508/#extras) of the package", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, 
+ "url": { + "title": "Url", + "description": "A URL to a remote source or wheel", + "type": "string", + "minLength": 1 + } + } + }, + "PyPIVersion": { + "title": "PyPIVersion", + "type": "object", + "additionalProperties": false, + "properties": { + "extras": { + "title": "Extras", + "description": "The [PEP 508 extras](https://peps.python.org/pep-0508/#extras) of the package", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + "index": { + "title": "Index", + "description": "The index to fetch the package from", + "type": "string", + "minLength": 1 + }, + "version": { + "title": "Version", + "description": "The version of the package in [PEP 440](https://www.python.org/dev/peps/pep-0440/) format", + "type": "string", + "minLength": 1 + } + } + }, + "SystemRequirements": { + "title": "SystemRequirements", + "description": "Platform-specific requirements", + "type": "object", + "additionalProperties": false, + "properties": { + "archspec": { + "title": "Archspec", + "description": "The architecture the project supports", + "type": "string", + "minLength": 1 + }, + "cuda": { + "title": "Cuda", + "description": "The minimum version of CUDA", + "anyOf": [ + { + "type": "number" + }, + { + "type": "string", + "minLength": 1 + } + ] + }, + "libc": { + "title": "Libc", + "description": "The minimum version of `libc`", + "anyOf": [ + { + "$ref": "#/$defs/LibcFamily" + }, + { + "type": "number" + }, + { + "type": "string", + "minLength": 1 + } + ] + }, + "linux": { + "title": "Linux", + "description": "The minimum version of the Linux kernel", + "anyOf": [ + { + "type": "number", + "exclusiveMinimum": 0.0 + }, + { + "type": "string", + "minLength": 1 + } + ] + }, + "macos": { + "title": "Macos", + "description": "The minimum version of MacOS", + "anyOf": [ + { + "type": "number", + "exclusiveMinimum": 0.0 + }, + { + "type": "string", + "minLength": 1 + } + ] + }, + "unix": { + "title": "Unix", + "description": "Whether the project supports 
UNIX", + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "string", + "minLength": 1 + } + ], + "examples": [ + "true" + ] + } + } + }, + "Target": { + "title": "Target", + "description": "A machine-specific configuration of dependencies and tasks", + "type": "object", + "additionalProperties": false, + "properties": { + "activation": { + "$ref": "#/$defs/Activation", + "description": "The scripts used on the activation of the project for this target" + }, + "build-dependencies": { + "title": "Build-Dependencies", + "description": "The build `conda` dependencies, used in the build process", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "dependencies": { + "title": "Dependencies", + "description": "The `conda` dependencies, consisting of a package name and a requirement in [MatchSpec](https://github.com/conda/conda/blob/078e7ee79381060217e1ec7f9b0e9cf80ecc8f3f/conda/models/match_spec.py) format", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + } + }, + "host-dependencies": { + "title": "Host-Dependencies", + "description": "The host `conda` dependencies, used in the build process", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/MatchspecTable" + } + ] + }, + "examples": [ + { + "python": ">=3.8" + } + ] + }, + "pypi-dependencies": { + "title": "Pypi-Dependencies", + "description": "The PyPI dependencies for this target", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "$ref": "#/$defs/PyPIVersion" + }, + { + "$ref": "#/$defs/PyPIGitBranchRequirement" + }, + { + "$ref": "#/$defs/PyPIGitTagRequirement" + }, + { + "$ref": "#/$defs/PyPIGitRevRequirement" + }, + { + "$ref": 
"#/$defs/PyPIPathRequirement" + }, + { + "$ref": "#/$defs/PyPIUrlRequirement" + } + ] + } + }, + "tasks": { + "title": "Tasks", + "description": "The tasks of the target", + "type": "object", + "patternProperties": { + "^[^\\s\\$]+$": { + "anyOf": [ + { + "$ref": "#/$defs/TaskInlineTable" + }, + { + "type": "string", + "minLength": 1 + } + ] + } + } + } + } + }, + "TaskInlineTable": { + "title": "TaskInlineTable", + "description": "A precise definition of a task.", + "type": "object", + "additionalProperties": false, + "properties": { + "clean-env": { + "title": "Clean-Env", + "description": "Whether to run in a clean environment, removing all environment variables except those defined in `env` and by pixi itself.", + "type": "boolean" + }, + "cmd": { + "title": "Cmd", + "description": "A shell command to run the task in the limited, but cross-platform `bash`-like `deno_task_shell`. See the documentation for [supported syntax](https://pixi.sh/latest/features/advanced_tasks/#syntax)", + "anyOf": [ + { + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + { + "type": "string", + "minLength": 1 + } + ] + }, + "cwd": { + "title": "Cwd", + "description": "The working directory to run the task", + "type": "string", + "pattern": "^[^\\\\]+$" + }, + "depends-on": { + "title": "Depends-On", + "description": "The tasks that this task depends on. Environment variables will **not** be expanded.", + "anyOf": [ + { + "type": "array", + "items": { + "description": "A valid task name.", + "type": "string", + "pattern": "^[^\\s\\$]+$" + } + }, + { + "description": "A valid task name.", + "type": "string", + "pattern": "^[^\\s\\$]+$" + } + ] + }, + "depends_on": { + "title": "Depends On", + "description": "The tasks that this task depends on. Environment variables will **not** be expanded. 
Deprecated in favor of `depends-on` from v0.21.0 onward.", + "anyOf": [ + { + "type": "array", + "items": { + "description": "A valid task name.", + "type": "string", + "pattern": "^[^\\s\\$]+$" + } + }, + { + "description": "A valid task name.", + "type": "string", + "pattern": "^[^\\s\\$]+$" + } + ] + }, + "description": { + "title": "Description", + "description": "A short description of the task", + "type": "string", + "minLength": 1, + "examples": [ + "Build the project" + ] + }, + "env": { + "title": "Env", + "description": "A map of environment variables to values, used in the task, these will be overwritten by the shell.", + "type": "object", + "additionalProperties": { + "type": "string", + "minLength": 1 + }, + "examples": [ + { + "key": "value" + }, + { + "ARGUMENT": "value" + } + ] + }, + "inputs": { + "title": "Inputs", + "description": "A list of `.gitignore`-style glob patterns that should be watched for changes before this command is run. Environment variables _will_ be expanded.", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + }, + "outputs": { + "title": "Outputs", + "description": "A list of `.gitignore`-style glob patterns that are generated by this command. 
Environment variables _will_ be expanded.", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + } + } + } + }, + "Workspace": { + "title": "Workspace", + "description": "The project's metadata information.", + "type": "object", + "required": [ + "platforms" + ], + "additionalProperties": false, + "properties": { + "authors": { + "title": "Authors", + "description": "The authors of the project", + "type": "array", + "items": { + "type": "string", + "minLength": 1 + }, + "examples": [ + "John Doe " + ] + }, + "build-variants": { + "title": "Build-Variants", + "description": "The build variants of the project", + "type": "object", + "additionalProperties": { + "type": "array", + "items": { + "type": "string" + } + } + }, + "channel-priority": { + "$ref": "#/$defs/ChannelPriority", + "description": "The type of channel priority that is used in the solve.- 'strict': only take the package from the channel it exist in first.- 'disabled': group all dependencies together as if there is no channel difference.", + "examples": [ + "strict", + "disabled" + ] + }, + "channels": { + "title": "Channels", + "description": "The `conda` channels that can be used in the project. 
Unless overridden by `priority`, the first channel listed will be preferred.", + "type": "array", + "items": { + "anyOf": [ + { + "type": "string", + "minLength": 1 + }, + { + "type": "string", + "format": "uri", + "minLength": 1 + }, + { + "$ref": "#/$defs/ChannelInlineTable" + } + ] + } + }, + "conda-pypi-map": { + "title": "Conda-Pypi-Map", + "description": "The `conda` to PyPI mapping configuration", + "type": "object", + "additionalProperties": { + "anyOf": [ + { + "type": "string", + "format": "uri", + "minLength": 1 + }, + { + "type": "string", + "minLength": 1 + } + ] + } + }, + "description": { + "title": "Description", + "description": "A short description of the project", + "type": "string", + "minLength": 1 + }, + "documentation": { + "title": "Documentation", + "description": "The URL of the documentation of the project", + "type": "string", + "format": "uri", + "minLength": 1 + }, + "homepage": { + "title": "Homepage", + "description": "The URL of the homepage of the project", + "type": "string", + "format": "uri", + "minLength": 1 + }, + "license": { + "title": "License", + "description": "The license of the project; we advise using an [SPDX](https://spdx.org/licenses/) identifier.", + "type": "string", + "minLength": 1 + }, + "license-file": { + "title": "License-File", + "description": "The path to the license file of the project", + "type": "string", + "pattern": "^[^\\\\]+$" + }, + "name": { + "title": "Name", + "description": "The name of the project; we advise use of the name of the repository", + "type": "string", + "minLength": 1 + }, + "platforms": { + "title": "Platforms", + "description": "The platforms that the project supports", + "type": "array", + "items": { + "$ref": "#/$defs/Platform" + } + }, + "preview": { + "title": "Preview", + "description": "Defines the enabling of preview features of the project", + "anyOf": [ + { + "type": "array", + "items": { + "anyOf": [ + { + "description": "Enables building of source records", + "const": 
"pixi-build" + }, + { + "type": "string" + } + ] + } + }, + { + "type": "boolean" + } + ] + }, + "pypi-options": { + "$ref": "#/$defs/PyPIOptions", + "description": "Options related to PyPI indexes for this project" + }, + "readme": { + "title": "Readme", + "description": "The path to the readme file of the project", + "type": "string", + "pattern": "^[^\\\\]+$" + }, + "repository": { + "title": "Repository", + "description": "The URL of the repository of the project", + "type": "string", + "format": "uri", + "minLength": 1 + }, + "version": { + "title": "Version", + "description": "The version of the project; we advise use of [SemVer](https://semver.org)", + "type": "string", + "minLength": 1, + "examples": [ + "1.2.3" + ] + } + } + } + } +} diff --git a/v0.39.2/search/search_index.json b/v0.39.2/search/search_index.json new file mode 100644 index 000000000..33e4e3812 --- /dev/null +++ b/v0.39.2/search/search_index.json @@ -0,0 +1 @@ +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Getting Started","text":"

Pixi is a package management tool for developers. It allows you to install libraries and applications in a reproducible way, and it works cross-platform on Windows, macOS, and Linux.

"},{"location":"#installation","title":"Installation","text":"

To install pixi you can run the following command in your terminal:

Linux & macOSWindows
curl -fsSL https://pixi.sh/install.sh | bash\n

The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to ~/.pixi/bin. If this directory does not already exist, the script will create it.

The script will also update your ~/.bash_profile to include ~/.pixi/bin in your PATH, allowing you to invoke the pixi command from anywhere.
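If the pixi command isn't found after installation, you can apply the same PATH change manually for the current session. A minimal sketch, assuming the default install location (the installer normally persists the equivalent line in ~/.bash_profile for you):

```shell
# Put pixi's bin directory on PATH for the current shell session
# (the installer writes an equivalent line to ~/.bash_profile)
export PATH="$HOME/.pixi/bin:$PATH"

# Confirm the directory is now on PATH
case ":$PATH:" in
  *":$HOME/.pixi/bin:"*) echo "pixi bin dir is on PATH" ;;
  *)                     echo "pixi bin dir is NOT on PATH" ;;
esac
```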

PowerShell:

iwr -useb https://pixi.sh/install.ps1 | iex\n
winget:
winget install prefix-dev.pixi\n
The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to LocalAppData/pixi/bin. If this directory does not already exist, the script will create it.

The command will also automatically add LocalAppData/pixi/bin to your PATH, allowing you to invoke pixi from anywhere.

Tip

You might need to restart your terminal or source your shell for the changes to take effect.

You can find more options for the installation script here.

"},{"location":"#autocompletion","title":"Autocompletion","text":"

To get autocompletion follow the instructions for your shell. Afterwards, restart the shell or source the shell config file.

"},{"location":"#bash-default-on-most-linux-systems","title":"Bash (default on most Linux systems)","text":"

Add the following to the end of ~/.bashrc:

~/.bashrc
eval \"$(pixi completion --shell bash)\"\n
"},{"location":"#zsh-default-on-macos","title":"Zsh (default on macOS)","text":"

Add the following to the end of ~/.zshrc:

~/.zshrc
eval \"$(pixi completion --shell zsh)\"\n
"},{"location":"#powershell-pre-installed-on-all-windows-systems","title":"PowerShell (pre-installed on all Windows systems)","text":"

Add the following to the end of Microsoft.PowerShell_profile.ps1. You can check the location of this file by querying the $PROFILE variable in PowerShell. Typically the path is ~\Documents\PowerShell\Microsoft.PowerShell_profile.ps1 or ~/.config/powershell/Microsoft.PowerShell_profile.ps1 on *nix.

(& pixi completion --shell powershell) | Out-String | Invoke-Expression\n
"},{"location":"#fish","title":"Fish","text":"

Add the following to the end of ~/.config/fish/config.fish:

~/.config/fish/config.fish
pixi completion --shell fish | source\n
"},{"location":"#nushell","title":"Nushell","text":"

Add the following to the end of your Nushell env file (find it by running $nu.env-path in Nushell):

mkdir ~/.cache/pixi\npixi completion --shell nushell | save -f ~/.cache/pixi/completions.nu\n

And add the following to the end of your Nushell configuration (find it by running $nu.config-path):

use ~/.cache/pixi/completions.nu *\n
"},{"location":"#elvish","title":"Elvish","text":"

Add the following to the end of ~/.elvish/rc.elv:

~/.elvish/rc.elv
eval (pixi completion --shell elvish | slurp)\n
"},{"location":"#alternative-installation-methods","title":"Alternative installation methods","text":"

Although we recommend installing pixi through the above method, we also provide additional installation methods.

"},{"location":"#homebrew","title":"Homebrew","text":"

Pixi is available via Homebrew. To install pixi via Homebrew, simply run:

brew install pixi\n
"},{"location":"#windows-installer","title":"Windows installer","text":"

We provide an msi installer on our GitHub releases page. The installer will download pixi and add it to the path.

"},{"location":"#install-from-source","title":"Install from source","text":"

pixi is 100% written in Rust, and therefore it can be installed, built and tested with cargo. To start using pixi from a source build run:

cargo install --locked --git https://github.com/prefix-dev/pixi.git pixi\n

We don't publish to crates.io anymore, so you need to install it from the repository. The reason for this is that we depend on some unpublished crates, which prevents us from publishing to crates.io.

or when you want to make changes use:

cargo build\ncargo test\n

If you have any issues building because of the dependency on rattler, check out its compile steps.

"},{"location":"#installer-script-options","title":"Installer script options","text":"Linux & macOSWindows

The installation script has several options that can be manipulated through environment variables.

| Variable | Description | Default Value |
|---|---|---|
| PIXI_VERSION | The version of pixi getting installed, can be used to up- or down-grade. | latest |
| PIXI_HOME | The location of the binary folder. | $HOME/.pixi |
| PIXI_ARCH | The architecture the pixi version was built for. | uname -m |
| PIXI_NO_PATH_UPDATE | If set, the $PATH will not be updated to add pixi to it. | |
| TMP_DIR | The temporary directory the script uses to download to and unpack the binary from. | /tmp |

For example, on Apple Silicon, you can force the installation of the x86 version:

curl -fsSL https://pixi.sh/install.sh | PIXI_ARCH=x86_64 bash\n
Or set the version
curl -fsSL https://pixi.sh/install.sh | PIXI_VERSION=v0.18.0 bash\n

The installation script has several options that can be manipulated through environment variables.

| Variable | Environment variable | Description | Default Value |
|---|---|---|---|
| PixiVersion | PIXI_VERSION | The version of pixi getting installed, can be used to up- or down-grade. | latest |
| PixiHome | PIXI_HOME | The location of the installation. | $Env:USERPROFILE\.pixi |
| NoPathUpdate | | If set, the $PATH will not be updated to add pixi to it. | |

For example, set the version using:

iwr -useb https://pixi.sh/install.ps1 | iex -Args \"-PixiVersion v0.18.0\"\n
"},{"location":"#update","title":"Update","text":"

Updating is as simple as installing: rerunning the installation script gets you the latest version.

pixi self-update\n
Or get a specific pixi version using:
pixi self-update --version x.y.z\n

Note

If you've used a package manager like brew, mamba, conda, paru, etc. to install pixi, you must use that package manager's update mechanism, e.g. brew upgrade pixi.

"},{"location":"#uninstall","title":"Uninstall","text":"

To uninstall pixi from your system, simply remove the binary.

Linux & macOSWindows
rm ~/.pixi/bin/pixi\n
$PIXI_BIN = \"$Env:LocalAppData\\pixi\\bin\\pixi\"; Remove-Item -Path $PIXI_BIN\n

After this command, you can still use the tools you installed with pixi. To remove these as well, just remove the whole ~/.pixi directory and remove the directory from your path.
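The full cleanup described above can be sketched as follows, assuming the default PIXI_HOME of ~/.pixi (adjust if you customized the install location):

```shell
# Remove the pixi binary plus all tools installed with `pixi global`
rm -rf "$HOME/.pixi"

# Finally, delete the PATH line the installer added to your shell
# profile (e.g. ~/.bash_profile), which looks like:
#   export PATH="$HOME/.pixi/bin:$PATH"
```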

"},{"location":"Community/","title":"Community","text":"

When you want to show your users and contributors that they can use pixi in your repo, you can use the following badge:

[![Pixi Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/prefix-dev/pixi/main/assets/badge/v0.json)](https://pixi.sh)\n

Customize your badge

To further customize the look and feel of your badge, you can add &style=<custom-style> at the end of the URL. See the documentation on shields.io for more info.

"},{"location":"Community/#built-using-pixi","title":"Built using Pixi","text":"
  • Deltares:
    • Ribasim: Water resources model
    • Ribasim-NL: Ribasim water resources modeling in the Netherlands
    • iMOD Python: Make massive MODFLOW models
    • iMOD Coupler: Application for coupling hydrological kernels
    • iMOD Documentation: Documentation of the iMOD suite.
    • Xugrid: Xarray and unstructured grids
    • Numba celltree: Celltree data structure for searching for points, lines, boxes, and cells (convex polygons) in a two dimensional unstructured mesh.
    • QGIS-Tim: QGIS plugin and utilities for TimML multi-layer analytic element model
    • Pandamesh: From geodataframe to mesh
    • Wflow: Hydrological modeling framework
    • HydroMT: Automated and reproducible model building and analysis
    • HydroMT SFINCS: SFINCS plugin for HydroMT
    • PyFlwDir: Fast methods to work with hydro- and topography data in pure Python.
  • USGS:
    • MODFLOW 6: USGS modular hydrological model
  • QuantCo:
    • glum: High performance Python GLMs with all the features!
    • tabmat: Efficient matrix representations for working with tabular data
    • pixi-pack: A tool to pack and unpack conda environments created with pixi
    • polarify: Simplifying conditional Polars Expressions with Python \ud83d\udc0d \ud83d\udc3b\u200d\u2744\ufe0f
    • copier-template-python-open-source: Copier template for python projects using pixi
    • datajudge: Assessing whether data from database complies with reference information
    • ndonnx: ONNX-backed array library that is compliant with the Array API standard
    • multiregex: Quickly match many regexes against a string
    • slim-trees: Pickle your ML models more efficiently for deployment \ud83d\ude80
    • sqlcompyre: Compare SQL tables and databases
    • metalearners: MetaLearners for CATE estimation
    • tabulardelta: Simplify table comparisons
    • pydiverse.pipedag: A library for data pipeline orchestration optimizing high development iteration speed
    • pydiverse.transform: Pipe based dataframe manipulation library that can also transform data on SQL databases
  • pixi-pycharm: Conda shim for PyCharm that proxies pixi
  • pixi-diff-to-markdown: Generate markdown summaries from pixi update
  • jiaxiyang/cpp_project_guideline: Guide the way beginners make their c++ projects.
  • hex-inc/vegafusion: Serverside scaling of Vega and Altair visualizations in Rust, Python, WASM, and Java
  • pablovela5620/arxiv-researcher: Summarize PDFs and arXiv papers with Langchain and Nougat \ud83e\udd89
  • HaoZeke/xtsci-dist: Incremental scipy port using xtensor
  • jslorrma/keyrings.artifacts: Keyring backend that provides authentication for publishing or consuming Python packages to or from Azure Artifacts feeds within Azure DevOps
  • LFortran: A modern cross-platform Fortran compiler
  • Rerun: Rerun is an SDK for building time aware visualizations of multimodal data.
  • conda-auth: a conda plugin providing more secure authentication support to conda.
  • py-rattler: Build your own conda environment manager using the python wrapper of our Rattler backend.
  • array-api-extra: Extra array functions built on top of the Python array API standard.
"},{"location":"FAQ/","title":"Frequently asked questions","text":""},{"location":"FAQ/#what-is-the-difference-with-conda-mamba-poetry-pip","title":"What is the difference with conda, mamba, poetry, pip","text":"Tool Installs python Builds packages Runs predefined tasks Has lock files builtin Fast Use without python Conda \u2705 \u274c \u274c \u274c \u274c \u274c Mamba \u2705 \u274c \u274c \u274c \u2705 \u2705 Pip \u274c \u2705 \u274c \u274c \u274c \u274c Pixi \u2705 \ud83d\udea7 \u2705 \u2705 \u2705 \u2705 Poetry \u274c \u2705 \u274c \u2705 \u274c \u274c"},{"location":"FAQ/#why-the-name-pixi","title":"Why the name pixi","text":"

Starting with the name prefix, we iterated until we had a name that was easy to pronounce, spell and remember. There also wasn't a CLI tool using that name yet, unlike px, pex, pax, etc. We think it sparks curiosity and fun; if you don't agree, I'm sorry, but you can always alias it to whatever you like.

Linux & macOSWindows
alias not_pixi=\"pixi\"\n

PowerShell:

New-Alias -Name not_pixi -Value pixi\n

"},{"location":"FAQ/#where-is-pixi-build","title":"Where is pixi build","text":"

TL;DR: It's coming we promise!

pixi build is going to be the subcommand that can generate a conda package out of a pixi project. This requires a solid build tool, which we're creating with rattler-build; it will be used as a library in pixi.

"},{"location":"basic_usage/","title":"Basic usage","text":"

Ensure you've got pixi set up. If running pixi doesn't show the help, see the getting started instructions.

pixi\n

Initialize a new project and navigate to the project directory.

pixi init pixi-hello-world\ncd pixi-hello-world\n

Add the dependencies you would like to use.

pixi add python\n

Create a file named hello_world.py in the directory and paste the following code into the file.

hello_world.py
def hello():\n    print(\"Hello World, to the new revolution in package management.\")\n\nif __name__ == \"__main__\":\n    hello()\n

Run the code inside the environment.

pixi run python hello_world.py\n

You can also put this run command in a task.

pixi task add hello python hello_world.py\n

After adding the task, you can run the task using its name.

pixi run hello\n

Use the shell command to activate the environment and start a new shell in there.

pixi shell\npython\nexit()\n

You've just learned the basic features of pixi:

  1. initializing a project,
  2. adding a dependency,
  3. adding a task and executing it,
  4. running a program.

Feel free to play around with what you just learned like adding more tasks, dependencies or code.

Happy coding!

"},{"location":"basic_usage/#use-pixi-as-a-global-installation-tool","title":"Use pixi as a global installation tool","text":"

Use pixi to install tools on your machine.

Some notable examples:

# Awesome cross shell prompt, huge tip when using pixi!\npixi global install starship\n\n# Want to try a different shell?\npixi global install fish\n\n# Install other prefix.dev tools\npixi global install rattler-build\n\n# Install a multi package environment\npixi global install --environment data-science-env --expose python --expose jupyter python jupyter numpy pandas\n
"},{"location":"basic_usage/#use-pixi-in-github-actions","title":"Use pixi in GitHub Actions","text":"

You can use pixi in GitHub Actions to install dependencies and run commands. It supports automatic caching of your environments.

- uses: prefix-dev/setup-pixi@v0.5.1\n- run: pixi run cowpy \"Thanks for using pixi\"\n

See the GitHub Actions for more details.

"},{"location":"packaging/","title":"Packaging pixi","text":"

This is a guide for distribution maintainers wanting to package pixi for a different package manager. Users of pixi can ignore this page.

"},{"location":"packaging/#building","title":"Building","text":"

Pixi is written in Rust and compiled using Cargo, which are needed as compile-time dependencies. At runtime pixi needs no dependencies other than the runtime it was compiled against (libc, ...).

To build pixi run

cargo build --locked --profile dist\n
Instead of using the predefined dist profile, which is optimized for binary size, you can also pass other options to let cargo optimize the binary for other metrics.

"},{"location":"packaging/#build-time-options","title":"Build-time Options","text":"

Pixi provides some compile-time options which can influence the build:

"},{"location":"packaging/#tls","title":"TLS","text":"

By default, pixi is built with the Rustls TLS implementation. You can compile pixi against the platform-native TLS implementation by adding --no-default-features --features native-tls to the build command. Note that this might add additional runtime dependencies, such as OpenSSL on Linux.
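As a sketch, a native-TLS build combines these flags with the dist profile from above (feature names as described in this section):

```shell
# Build pixi against the platform-native TLS stack instead of rustls;
# on Linux this may add OpenSSL as a runtime dependency
cargo build --locked --profile dist --no-default-features --features native-tls
```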

"},{"location":"packaging/#self-update","title":"Self-Update","text":"

Pixi has a self-update functionality. When pixi is installed using another package manager, one usually doesn't want pixi to try to update itself, but instead to let it be updated by the package manager. For this reason, the self-update feature is disabled by default. It can be enabled by adding --features self_update to the build command.

When the self-update feature is disabled and a user tries to run pixi self-update an error message is displayed. This message can be customized by setting the PIXI_SELF_UPDATE_DISABLED_MESSAGE environment variable at build time to point the user to the package manager they should be using to update pixi.

PIXI_SELF_UPDATE_DISABLED_MESSAGE=\"`self-update` has been disabled for this build. Run `brew upgrade pixi` instead\" cargo build --locked --profile dist\n

"},{"location":"packaging/#custom-version","title":"Custom version","text":"

You can specify a custom version string to be used in the --version output by setting the PIXI_VERSION environment variable during the build.

PIXI_VERSION=\"HEAD-123456\" cargo build --locked --profile dist\n
"},{"location":"packaging/#shell-completion","title":"Shell completion","text":"

After building pixi you can generate shell autocompletion scripts by running

pixi completion --shell <SHELL>\n
and saving the output to a file. Currently supported shells are bash, elvish, fish, nushell, powershell and zsh.
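When packaging, the generated scripts are typically installed into each shell's standard completion directory. A sketch for a Linux package build (the destination paths below are common defaults and may differ per distribution):

```shell
# Generate the completion scripts with the freshly built binary
pixi completion --shell bash > pixi.bash
pixi completion --shell zsh  > _pixi
pixi completion --shell fish > pixi.fish

# Stage them into the package root ($DESTDIR) at the usual locations
install -Dm644 pixi.bash "$DESTDIR/usr/share/bash-completion/completions/pixi"
install -Dm644 _pixi     "$DESTDIR/usr/share/zsh/site-functions/_pixi"
install -Dm644 pixi.fish "$DESTDIR/usr/share/fish/vendor_completions.d/pixi.fish"
```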

"},{"location":"vision/","title":"Vision","text":"

We created pixi because we want to have a cargo/npm/yarn-like package management experience for conda. We really love what the conda packaging ecosystem achieves, but we think that the user experience can be improved a lot. Modern package managers like cargo have shown us how great a package manager can be. We want to bring that experience to the conda ecosystem.

"},{"location":"vision/#pixi-values","title":"Pixi values","text":"

We want to make pixi a great experience for everyone, so we have a few values that we want to uphold:

  1. Fast. We want to have a fast package manager, that is able to solve the environment in a few seconds.
  2. User Friendly. We want to have a package manager that puts user-friendliness on the front line, providing easy, accessible, and intuitive commands that have the element of least surprise.
  3. Isolated Environment. We want to have isolated environments, that are reproducible and easy to share. Ideally, it should run on all common platforms. The Conda packaging system provides an excellent base for this.
  4. Single Tool. We want to integrate most common uses when working on a development project with Pixi, so it should support at least dependency management, command management, building and uploading packages. You should not need to reach for another external tool for this.
  5. Fun. It should be fun to use pixi and not cause frustrations; you should not need to think about it much, and it should generally just get out of your way.
"},{"location":"vision/#conda","title":"Conda","text":"

We are building on top of the conda packaging ecosystem, which means that we have a huge number of packages available for different platforms on conda-forge. We believe the conda packaging ecosystem provides a solid base to manage your dependencies. Conda-forge is community-maintained and very open to contributions. It is widely used in data science and scientific computing, robotics, and other fields, and it has a proven track record.

"},{"location":"vision/#target-languages","title":"Target languages","text":"

Essentially, we are language-agnostic: we are targeting any language that can be installed with conda, including C++, Python, Rust, Zig, etc. But we do believe the Python ecosystem can benefit from a good package manager that is based on conda, so we are trying to provide an alternative to existing solutions there. We also think we can provide a good solution for C++ projects, as there are a lot of libraries available on conda-forge today. Pixi also truly shines when using it for multi-language projects, e.g. a mix of C++ and Python, because we provide a nice way to build everything up to and including system-level packages.

"},{"location":"advanced/authentication/","title":"Authenticate pixi with a server","text":"

You can authenticate pixi with a server like prefix.dev, a private quetz instance or anaconda.org. Different servers use different authentication methods. In this documentation page, we detail how you can authenticate against the different servers and where the authentication information is stored.

Usage: pixi auth login [OPTIONS] <HOST>\n\nArguments:\n  <HOST>  The host to authenticate with (e.g. repo.prefix.dev)\n\nOptions:\n      --token <TOKEN>              The token to use (for authentication with prefix.dev)\n      --username <USERNAME>        The username to use (for basic HTTP authentication)\n      --password <PASSWORD>        The password to use (for basic HTTP authentication)\n      --conda-token <CONDA_TOKEN>  The token to use on anaconda.org / quetz authentication\n  -v, --verbose...                 More output per occurrence\n  -q, --quiet...                   Less output per occurrence\n  -h, --help                       Print help\n

The different options are \"token\", \"conda-token\" and \"username + password\".

The token variant implements a standard "Bearer Token" authentication as is used on the prefix.dev platform. A Bearer Token is sent with every request as an additional header of the form Authorization: Bearer <TOKEN>.
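As a sketch, this is what that header looks like when constructed by hand (the token value is the placeholder example used later on this page, and the repodata URL is hypothetical):

```shell
TOKEN="pfx_jj8WDzvnuTHEGdAhwRZMC1Ag8gSto8"  # placeholder token

# The extra header pixi attaches to every request:
printf 'Authorization: Bearer %s\n' "$TOKEN"

# Equivalent manual request with curl:
# curl -H "Authorization: Bearer $TOKEN" https://repo.prefix.dev/<channel>/noarch/repodata.json
```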

The conda-token option is used on anaconda.org and can be used with a quetz server. With this option, the token is sent as part of the URL following this scheme: conda.anaconda.org/t/<TOKEN>/conda-forge/linux-64/....

The last option, username & password, is used for "Basic HTTP Authentication". This is the equivalent of adding http://user:password@myserver.com/.... This authentication method can be configured quite easily with a reverse proxy such as NGINX or Apache and is thus commonly used in self-hosted systems.

"},{"location":"advanced/authentication/#examples","title":"Examples","text":"

Login to prefix.dev:

pixi auth login prefix.dev --token pfx_jj8WDzvnuTHEGdAhwRZMC1Ag8gSto8\n

Login to anaconda.org:

pixi auth login anaconda.org --conda-token xy-72b914cc-c105-4ec7-a969-ab21d23480ed\n

Login to a basic HTTP secured server:

pixi auth login myserver.com --username user --password password\n
"},{"location":"advanced/authentication/#where-does-pixi-store-the-authentication-information","title":"Where does pixi store the authentication information?","text":"

The storage location for the authentication information is system-dependent. By default, pixi tries to use the keychain to store this sensitive information securely on your machine.

On Windows, the credentials are stored in the \"credentials manager\". Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

On macOS, the passwords are stored in the keychain. To access the password, you can use the Keychain Access program that comes pre-installed on macOS. Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

On Linux, one can use GNOME Keyring (or just Keyring) to access credentials that are securely stored by libsecret. Searching for rattler should list all the credentials stored by pixi and other rattler-based programs.

"},{"location":"advanced/authentication/#fallback-storage","title":"Fallback storage","text":"

If you run on a server with none of the aforementioned keychains available, then pixi falls back to storing the credentials in an insecure JSON file located at ~/.rattler/credentials.json.
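Because this fallback file holds credentials in plain text, it is worth keeping its permissions restrictive. A sketch (the file only exists once pixi has stored something there):

```shell
# Tighten permissions on the plain-text fallback credential store
# (only present when no OS keychain was available)
CRED_FILE="$HOME/.rattler/credentials.json"
if [ -f "$CRED_FILE" ]; then
    chmod 600 "$CRED_FILE"
fi
```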

"},{"location":"advanced/authentication/#override-the-authentication-storage","title":"Override the authentication storage","text":"

You can use the RATTLER_AUTH_FILE environment variable to override the default location of the credentials file. When this environment variable is set, it provides the only source of authentication data that is used by pixi.

E.g.

export RATTLER_AUTH_FILE=$HOME/credentials.json\n# You can also specify the file in the command line\npixi global install --auth-file $HOME/credentials.json ...\n

The JSON should follow the following format:

{\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHTTP\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"conda.anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

Note: if you use a wildcard in the host, any subdomain will match (e.g. *.prefix.dev also matches repo.prefix.dev).

Lastly you can set the authentication override file in the global configuration file.

"},{"location":"advanced/authentication/#pypi-authentication","title":"PyPI authentication","text":"

Currently, we support the following methods for authenticating against PyPI:

  1. keyring authentication.
  2. .netrc file authentication.

We want to add more methods in the future, so if you have a specific method you would like to see, please let us know.

"},{"location":"advanced/authentication/#keyring-authentication","title":"Keyring authentication","text":"

Currently, pixi supports the uv method of authentication through the python keyring library.

"},{"location":"advanced/authentication/#installing-keyring","title":"Installing keyring","text":"

To install keyring you can use pixi global install:

Basic AuthGoogle Artifact RegistryAzure DevOps Artifacts
pixi global install keyring\n
pixi global install keyring --with keyrings.google-artifactregistry-auth\n
pixi global install keyring --with keyring.artifacts\n

For other registries, you will need to adapt these instructions to add the right keyring backend.

"},{"location":"advanced/authentication/#configuring-your-project-to-use-keyring","title":"Configuring your project to use keyring","text":"Basic AuthGoogle Artifact RegistryAzure DevOps Artifacts

Use keyring to store your credentials e.g:

keyring set https://my-index/simple your_username\n# prompt will appear for your password\n

Add the following configuration to your pixi manifest, making sure to include your_username@ in the URL of the registry:

[pypi-options]\nindex-url = \"https://your_username@custom-registry.com/simple\"\n

After making sure you are logged in, for instance by running gcloud auth login, add the following configuration to your pixi manifest:

[pypi-options]\nextra-index-urls = [\"https://oauth2accesstoken@<location>-python.pkg.dev/<project>/<repository>/simple\"]\n

Note

To find this URL more easily, you can use the gcloud command:

gcloud artifacts print-settings python --project=<project> --repository=<repository> --location=<location>\n

After following the keyring.artifacts instructions and making sure that keyring works correctly, add the following configuration to your pixi manifest:

[pypi-options]\nextra-index-urls = [\"https://VssSessionToken@pkgs.dev.azure.com/{organization}/{project}/_packaging/{feed}/pypi/simple/\"]\n
"},{"location":"advanced/authentication/#installing-your-environment","title":"Installing your environment","text":"

Either configure your Global Config, or use the flag --pypi-keyring-provider which can either be set to subprocess (activated) or disabled:

# From an existing pixi project\npixi install --pypi-keyring-provider subprocess\n
"},{"location":"advanced/authentication/#netrc-file","title":".netrc file","text":"

pixi allows you to access private registries securely by authenticating with credentials stored in a .netrc file.

  • The .netrc file can be stored in your home directory ($HOME/.netrc for Unix-like systems)
  • or in the user profile directory on Windows (%HOME%\\_netrc).
  • You can also set a different location for it using the NETRC environment variable, e.g. export NETRC=/my/custom/location/.netrc before running pixi install.

In the .netrc file, you store authentication details like this:

machine registry-name\nlogin admin\npassword admin\n
For more details, you can access the .netrc docs.
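As a concrete sketch, the following writes a .netrc to a custom location with restrictive permissions and points the NETRC variable at it. The machine/login/password values are the placeholders from the example above:

```shell
# Write a .netrc to a private location and tell pixi where to find it.
netrc_dir="$(mktemp -d)"
cat > "$netrc_dir/netrc" <<'EOF'
machine registry-name
login admin
password admin
EOF
chmod 600 "$netrc_dir/netrc"   # credentials should not be world-readable
export NETRC="$netrc_dir/netrc"
```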

"},{"location":"advanced/channel_priority/","title":"Channel Logic","text":"

All logic that decides which dependencies can be installed from which channel comes down to the instructions we give the solver.

The actual code for this lives in the rattler_solve crate, which can be hard to read. Therefore, this document continues with simplified flow charts.

"},{"location":"advanced/channel_priority/#channel-specific-dependencies","title":"Channel specific dependencies","text":"

When a user defines a channel per dependency, the solver needs to know the other channels are unusable for this dependency.

[project]\nchannels = [\"conda-forge\", \"my-channel\"]\n\n[dependencies]\npackagex = { version = \"*\", channel = \"my-channel\" }\n
In the packagex example, the solver will understand that the package is only available in my-channel and will not look for it in conda-forge.

The flowchart of the logic that excludes all other channels:

flowchart TD\n    A[Start] --> B[Given a Dependency]\n    B --> C{Channel Specific Dependency?}\n    C -->|Yes| D[Exclude All Other Channels for This Package]\n    C -->|No| E{Any Other Dependencies?}\n    E -->|Yes| B\n    E -->|No| F[End]\n    D --> E
"},{"location":"advanced/channel_priority/#channel-priority","title":"Channel priority","text":"

Channel priority is dictated by the order in the project.channels array, where the first channel is the highest priority. For instance:

[project]\nchannels = [\"conda-forge\", \"my-channel\", \"your-channel\"]\n
If the package is found in conda-forge, the solver will not look for it in my-channel and your-channel, because those channels are excluded for that package. If the package is not found in conda-forge, the solver will look for it in my-channel; if it is found there, your-channel is excluded for this package. This diagram explains the logic:
flowchart TD\n    A[Start] --> B[Given a Dependency]\n    B --> C{Loop Over Channels}\n    C --> D{Package in This Channel?}\n    D -->|No| C\n    D -->|Yes| E{\"This the first channel\n     for this package?\"}\n    E -->|Yes| F[Include Package in Candidates]\n    E -->|No| G[Exclude Package from Candidates]\n    F --> H{Any Other Channels?}\n    G --> H\n    H -->|Yes| C\n    H -->|No| I{Any Other Dependencies?}\n    I -->|No| J[End]\n    I -->|Yes| B

This method ensures the solver only adds a package to the candidates if it's found in the highest priority channel available. If you have 10 channels and the package is found in the 5th channel it will exclude the next 5 channels from the candidates if they also contain the package.
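The priority walk described above can be sketched in shell. This is illustrative only; the real logic is implemented in the solver, and AVAILABLE here is a fake repodata lookup:

```shell
# For each dependency, walk channels in priority order and take the first
# channel that provides the package; later channels are excluded for it.
AVAILABLE=" my-channel/packagex conda-forge/python my-channel/python "

pick_channel() {
  pkg="$1"; shift
  for channel in "$@"; do
    case "$AVAILABLE" in
      *" $channel/$pkg "*) echo "$channel"; return 0 ;;
    esac
  done
  echo "not found"
}

pick_channel python conda-forge my-channel    # -> conda-forge
pick_channel packagex conda-forge my-channel  # -> my-channel
```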

"},{"location":"advanced/channel_priority/#use-case-pytorch-and-nvidia-with-conda-forge","title":"Use case: pytorch and nvidia with conda-forge","text":"

A common use case is to use pytorch with nvidia drivers, while also needing the conda-forge channel for the main dependencies.

[project]\nchannels = [\"nvidia/label/cuda-11.8.0\", \"nvidia\", \"conda-forge\", \"pytorch\"]\nplatforms = [\"linux-64\"]\n\n[dependencies]\ncuda = {version = \"*\", channel=\"nvidia/label/cuda-11.8.0\"}\npytorch = {version = \"2.0.1.*\", channel=\"pytorch\"}\ntorchvision = {version = \"0.15.2.*\", channel=\"pytorch\"}\npytorch-cuda = {version = \"11.8.*\", channel=\"pytorch\"}\npython = \"3.10.*\"\n
What this will do is get as much as possible from the nvidia/label/cuda-11.8.0 channel, which is actually only the cuda package.

Then it will get all packages from the nvidia channel, which is a little more and some packages overlap the nvidia and conda-forge channel. Like the cuda-cudart package, which will now only be retrieved from the nvidia channel because of the priority logic.

Then it will get the packages from the conda-forge channel, which is the main channel for the dependencies.

But the user only wants the pytorch packages from the pytorch channel, which is why pytorch is added last and the dependencies are added as channel specific dependencies.

We don't define the pytorch channel before conda-forge because we want to get as much as possible from the conda-forge as the pytorch channel is not always shipping the best versions of all packages.

For example, it also ships the ffmpeg package, but only an old version which doesn't work with the newer pytorch versions. Thus breaking the installation if we would skip the conda-forge channel for ffmpeg with the priority logic.

"},{"location":"advanced/channel_priority/#force-a-specific-channel-priority","title":"Force a specific channel priority","text":"

If you want to force a specific priority for a channel, you can use the priority (int) key in the channel definition. The higher the number, the higher the priority. Unspecified priorities default to 0, but the position in the array still counts as a priority, where the first entry in the list has the highest priority.

This priority definition is mostly important for multiple environments with different channel priorities, as by default feature channels are prepended to the project channels.

[project]\nname = \"test_channel_priority\"\nplatforms = [\"linux-64\", \"osx-64\", \"win-64\", \"osx-arm64\"]\nchannels = [\"conda-forge\"]\n\n[feature.a]\nchannels = [\"nvidia\"]\n\n[feature.b]\nchannels = [ \"pytorch\", {channel = \"nvidia\", priority = 1}]\n\n[feature.c]\nchannels = [ \"pytorch\", {channel = \"nvidia\", priority = -1}]\n\n[environments]\na = [\"a\"]\nb = [\"b\"]\nc = [\"c\"]\n
This example creates four environments, a, b, c, and the default environment, which will have the following channel order:

Environment | Resulting channels order
default     | conda-forge
a           | nvidia, conda-forge
b           | nvidia, pytorch, conda-forge
c           | pytorch, conda-forge, nvidia

Check priority result with pixi info

Using pixi info you can check the priority of the channels in the environment.

pixi info\nEnvironments\n------------\n       Environment: default\n          Features: default\n          Channels: conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: a\n          Features: a, default\n          Channels: nvidia, conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: b\n          Features: b, default\n          Channels: nvidia, pytorch, conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: c\n          Features: c, default\n          Channels: pytorch, conda-forge, nvidia\nDependency count: 0\nTarget platforms: linux-64\n

"},{"location":"advanced/explain_info_command/","title":"Info command","text":"

pixi info prints out useful information to debug a situation or to get an overview of your machine/project. This information can also be retrieved in json format using the --json flag, which can be useful for programmatically reading it.

Running pixi info in the pixi repo
\u279c pixi info\n      Pixi version: 0.13.0\n          Platform: linux-64\n  Virtual packages: __unix=0=0\n                  : __linux=6.5.12=0\n                  : __glibc=2.36=0\n                  : __cuda=12.3=0\n                  : __archspec=1=x86_64\n         Cache dir: /home/user/.cache/rattler/cache\n      Auth storage: /home/user/.rattler/credentials.json\n\nProject\n------------\n           Version: 0.13.0\n     Manifest file: /home/user/development/pixi/pixi.toml\n      Last updated: 25-01-2024 10:29:08\n\nEnvironments\n------------\ndefault\n          Features: default\n          Channels: conda-forge\n  Dependency count: 10\n      Dependencies: pre-commit, rust, openssl, pkg-config, git, mkdocs, mkdocs-material, pillow, cairosvg, compilers\n  Target platforms: linux-64, osx-arm64, win-64, osx-64\n             Tasks: docs, test-all, test, build, lint, install, build-docs\n
"},{"location":"advanced/explain_info_command/#global-info","title":"Global info","text":"

The first part of the info output is information that is always available and tells you what pixi can read on your machine.

"},{"location":"advanced/explain_info_command/#platform","title":"Platform","text":"

This defines the platform you're currently on according to pixi. If this is incorrect, please file an issue on the pixi repo.

"},{"location":"advanced/explain_info_command/#virtual-packages","title":"Virtual packages","text":"

The virtual packages that pixi can find on your machine.

In the Conda ecosystem, you can depend on virtual packages. These packages aren't real dependencies that are going to be installed, but rather are used in the solve step to determine if a package can be installed on the machine. A simple example: when a package depends on CUDA drivers being present on the host machine, it can express that by depending on the __cuda virtual package. In that case, if pixi cannot find the __cuda virtual package on your machine, the installation will fail.

"},{"location":"advanced/explain_info_command/#cache-dir","title":"Cache dir","text":"

The directory where pixi stores its cache. Check out the cache documentation for more information.

"},{"location":"advanced/explain_info_command/#auth-storage","title":"Auth storage","text":"

Check the authentication documentation

"},{"location":"advanced/explain_info_command/#cache-size","title":"Cache size","text":"

[requires --extended]

The size of the previously mentioned \"Cache dir\" in Mebibytes.

"},{"location":"advanced/explain_info_command/#project-info","title":"Project info","text":"

Everything below Project is info about the project you're currently in. This info is only available if your path has a manifest file.

"},{"location":"advanced/explain_info_command/#manifest-file","title":"Manifest file","text":"

The path to the manifest file that describes the project.

"},{"location":"advanced/explain_info_command/#last-updated","title":"Last updated","text":"

The last time the lock file was updated, either manually or by pixi itself.

"},{"location":"advanced/explain_info_command/#environment-info","title":"Environment info","text":"

The environment info defined per environment. If you don't have any environments defined, this will only show the default environment.

"},{"location":"advanced/explain_info_command/#features","title":"Features","text":"

This lists which features are enabled in the environment. For the default environment, this is only default.

"},{"location":"advanced/explain_info_command/#channels","title":"Channels","text":"

The list of channels used in this environment.

"},{"location":"advanced/explain_info_command/#dependency-count","title":"Dependency count","text":"

The number of dependencies defined for this environment (not the number of installed dependencies).

"},{"location":"advanced/explain_info_command/#dependencies","title":"Dependencies","text":"

The list of dependencies defined for this environment.

"},{"location":"advanced/explain_info_command/#target-platforms","title":"Target platforms","text":"

The platforms the project has defined.

"},{"location":"advanced/github_actions/","title":"GitHub Action","text":"

We created prefix-dev/setup-pixi to facilitate using pixi in CI.

"},{"location":"advanced/github_actions/#usage","title":"Usage","text":"
- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    pixi-version: v0.39.2\n    cache: true\n    auth-host: prefix.dev\n    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}\n- run: pixi run test\n

Pin your action versions

Since pixi is not yet stable, the API of this action may change between minor versions. Please pin the versions of this action to a specific version (i.e., prefix-dev/setup-pixi@v0.8.0) to avoid breaking changes. You can automatically update the version of this action by using Dependabot.

Put the following in your .github/dependabot.yml file to enable Dependabot for your GitHub Actions:

.github/dependabot.yml
version: 2\nupdates:\n  - package-ecosystem: github-actions\n    directory: /\n    schedule:\n      interval: monthly # (1)!\n    groups:\n      dependencies:\n        patterns:\n          - \"*\"\n
  1. or daily, weekly
"},{"location":"advanced/github_actions/#features","title":"Features","text":"

To see all available input arguments, see the action.yml file in setup-pixi. The most important features are described below.

"},{"location":"advanced/github_actions/#caching","title":"Caching","text":"

The action supports caching of the pixi environment. By default, caching is enabled if a pixi.lock file is present. It will then use the pixi.lock file to generate a hash of the environment and cache it. If the cache is hit, the action will skip the installation and use the cached environment. You can specify the behavior by setting the cache input argument.

Customize your cache key

If you need to customize your cache-key, you can use the cache-key input argument. This will be the prefix of the cache key. The full cache key will be <cache-key><conda-arch>-<hash>.
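The shape of the full cache key can be sketched as follows. The "pixi-" prefix, the arch value, and the 16-character hash length are illustrative assumptions; setup-pixi's exact hashing may differ:

```shell
# Sketch of the cache key shape <cache-key><conda-arch>-<hash>.
lockfile="$(mktemp)"
printf 'version: 5\n' > "$lockfile"        # stand-in for pixi.lock
prefix="pixi-"                             # the cache-key input
conda_arch="linux-64"
hash="$(sha256sum "$lockfile" | cut -c1-16)"
cache_key="${prefix}${conda_arch}-${hash}"
echo "$cache_key"
```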

Only save caches on main

To avoid hitting the 10 GB cache size limit too quickly, you might want to restrict when the cache is saved. This can be done by setting the cache-write argument.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    cache: true\n    cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}\n
"},{"location":"advanced/github_actions/#multiple-environments","title":"Multiple environments","text":"

With pixi, you can create multiple environments for different requirements. You can also specify which environment(s) you want to install by setting the environments input argument. This will install all environments that are specified and cache them.

[project]\nname = \"my-package\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"]\n\n[dependencies]\npython = \">=3.11\"\npip = \"*\"\npolars = \">=0.14.24,<0.21\"\n\n[feature.py311.dependencies]\npython = \"3.11.*\"\n[feature.py312.dependencies]\npython = \"3.12.*\"\n\n[environments]\npy311 = [\"py311\"]\npy312 = [\"py312\"]\n
"},{"location":"advanced/github_actions/#multiple-environments-using-a-matrix","title":"Multiple environments using a matrix","text":"

The following example will install the py311 and py312 environments in different jobs.

test:\n  runs-on: ubuntu-latest\n  strategy:\n    matrix:\n      environment: [py311, py312]\n  steps:\n  - uses: actions/checkout@v4\n  - uses: prefix-dev/setup-pixi@v0.8.0\n    with:\n      environments: ${{ matrix.environment }}\n
"},{"location":"advanced/github_actions/#install-multiple-environments-in-one-job","title":"Install multiple environments in one job","text":"

The following example will install both the py311 and the py312 environment on the runner.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    environments: >- # (1)!\n      py311\n      py312\n- run: |\n  pixi run -e py311 test\n  pixi run -e py312 test\n
  1. separated by spaces, equivalent to

    environments: py311 py312\n

Caching behavior if you don't specify environments

If you don't specify any environment, the default environment will be installed and cached, even if you use other environments.

"},{"location":"advanced/github_actions/#authentication","title":"Authentication","text":"

There are currently three ways to authenticate with pixi:

  • using a token
  • using a username and password
  • using a conda-token

For more information, see Authentication.

Handle secrets with care

Please only store sensitive information using GitHub secrets. Do not store them in your repository. When your sensitive information is stored in a GitHub secret, you can access it using the ${{ secrets.SECRET_NAME }} syntax. These secrets will always be masked in the logs.

"},{"location":"advanced/github_actions/#token","title":"Token","text":"

Specify the token using the auth-token input argument. This form of authentication (bearer token in the request headers) is mainly used at prefix.dev.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: prefix.dev\n    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}\n
"},{"location":"advanced/github_actions/#username-and-password","title":"Username and password","text":"

Specify the username and password using the auth-username and auth-password input arguments. This form of authentication (HTTP Basic Auth) is used in some enterprise environments with artifactory for example.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: custom-artifactory.com\n    auth-username: ${{ secrets.PIXI_USERNAME }}\n    auth-password: ${{ secrets.PIXI_PASSWORD }}\n
"},{"location":"advanced/github_actions/#conda-token","title":"Conda-token","text":"

Specify the conda-token using the conda-token input argument. This form of authentication (token is encoded in URL: https://my-quetz-instance.com/t/<token>/get/custom-channel) is used at anaconda.org or with quetz instances.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: anaconda.org # (1)!\n    conda-token: ${{ secrets.CONDA_TOKEN }}\n
  1. or my-quetz-instance.com
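For reference, this is how the token ends up encoded in the channel URL (both the token and the host below are placeholders):

```shell
# Build a quetz/anaconda.org style channel URL with an embedded token.
token="xy-00000000"
host="my-quetz-instance.com"
channel_url="https://${host}/t/${token}/get/custom-channel"
echo "$channel_url"
```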
"},{"location":"advanced/github_actions/#custom-shell-wrapper","title":"Custom shell wrapper","text":"

setup-pixi allows you to run commands inside the pixi environment by specifying a custom shell wrapper with shell: pixi run bash -e {0}. This can be useful if you want to run commands inside the pixi environment, but don't want to prefix each one with pixi run.

- run: | # (1)!\n    python --version\n    pip install --no-deps -e .\n  shell: pixi run bash -e {0}\n
  1. everything here will be run inside of the pixi environment

You can even run Python scripts like this:

- run: | # (1)!\n    import my_package\n    print(\"Hello world!\")\n  shell: pixi run python {0}\n
  1. everything here will be run inside of the pixi environment

If you want to use PowerShell, you need to specify -Command as well.

- run: | # (1)!\n    python --version | Select-String \"3.11\"\n  shell: pixi run pwsh -Command {0} # pwsh works on all platforms\n
  1. everything here will be run inside of the pixi environment

How does it work under the hood?

Under the hood, the shell: xyz {0} option is implemented by creating a temporary script file and calling xyz with that script file as an argument. This file does not have the executable bit set, so you cannot use shell: pixi run {0} directly but instead have to use shell: pixi run bash {0}. There are some custom shells provided by GitHub that have slightly different behavior, see jobs.<job_id>.steps[*].shell in the documentation. See the official documentation and ADR 0277 for more information about how the shell: input works in GitHub Actions.
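A minimal local reproduction of that mechanism, outside of GitHub Actions and with a hypothetical step body:

```shell
# The step body is written to a temporary, non-executable script file,
# and the configured shell command is invoked with it as its argument --
# analogous to what `shell: pixi run bash -e {0}` does.
step_body='greeting="hello from the step"; echo "$greeting"'
script="$(mktemp)"                # plays the role of {0}
printf '%s\n' "$step_body" > "$script"
# No executable bit is set, so the interpreter must be named explicitly:
bash -e "$script"
```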

"},{"location":"advanced/github_actions/#one-off-shell-wrapper-using-pixi-exec","title":"One-off shell wrapper using pixi exec","text":"

With pixi exec, you can also run a one-off command inside a temporary pixi environment.

- run: | # (1)!\n    zstd --version\n  shell: pixi exec --spec zstd -- bash -e {0}\n
  1. everything here will be run inside of the temporary pixi environment
- run: | # (1)!\n    import ruamel.yaml\n    # ...\n  shell: pixi exec --spec python=3.11.* --spec ruamel.yaml -- python {0}\n
  1. everything here will be run inside of the temporary pixi environment

See here for more information about pixi exec.

"},{"location":"advanced/github_actions/#environment-activation","title":"Environment activation","text":"

Instead of using a custom shell wrapper, you can also make all pixi-installed binaries available to subsequent steps by \"activating\" the installed environment in the currently running job. To this end, setup-pixi adds all environment variables set when executing pixi run to $GITHUB_ENV and, similarly, adds all path modifications to $GITHUB_PATH. As a result, all installed binaries can be accessed without having to call pixi run.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    activate-environment: true\n

If you are installing multiple environments, you will need to specify the name of the environment that you want to be activated.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    environments: >-\n      py311\n      py312\n    activate-environment: py311\n

Activating an environment may be more useful than using a custom shell wrapper as it allows non-shell based steps to access binaries on the path. However, be aware that this option augments the environment of your job.
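The activation mechanism can be mimicked locally. In a real job, $GITHUB_ENV and $GITHUB_PATH are files provided by the runner; the variable name and environment path below are illustrative:

```shell
# Appending `KEY=value` lines to $GITHUB_ENV exports variables to later
# steps; appending directories to $GITHUB_PATH prepends them to PATH.
GITHUB_ENV="$(mktemp)"
GITHUB_PATH="$(mktemp)"
env_prefix="/home/runner/work/repo/.pixi/envs/default"   # illustrative
echo "CONDA_PREFIX=$env_prefix" >> "$GITHUB_ENV"
echo "$env_prefix/bin" >> "$GITHUB_PATH"
cat "$GITHUB_ENV"
```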

"},{"location":"advanced/github_actions/#-frozen-and-locked","title":"--frozen and --locked","text":"

You can specify whether setup-pixi should run pixi install --frozen or pixi install --locked depending on the frozen or the locked input argument. See the official documentation for more information about the --frozen and --locked flags.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    locked: true\n    # or\n    frozen: true\n

If you don't specify anything, the default behavior is to run pixi install --locked if a pixi.lock file is present and pixi install otherwise.
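That default can be sketched as follows (we only print the command that would run, rather than invoking pixi):

```shell
# Decide between `pixi install --locked` and plain `pixi install`
# based on the presence of a pixi.lock file in the given directory.
choose_install() {
  if [ -f "$1/pixi.lock" ]; then
    echo "pixi install --locked"
  else
    echo "pixi install"
  fi
}

project="$(mktemp -d)"
choose_install "$project"        # -> pixi install
touch "$project/pixi.lock"
choose_install "$project"        # -> pixi install --locked
```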

"},{"location":"advanced/github_actions/#debugging","title":"Debugging","text":"

There are two types of debug logging that you can enable.

"},{"location":"advanced/github_actions/#debug-logging-of-the-action","title":"Debug logging of the action","text":"

The first one is the debug logging of the action itself. This can be enabled by re-running the action in debug mode:

Debug logging documentation

For more information about debug logging in GitHub Actions, see the official documentation.

"},{"location":"advanced/github_actions/#debug-logging-of-pixi","title":"Debug logging of pixi","text":"

The second type is the debug logging of the pixi executable. This can be specified by setting the log-level input.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    log-level: vvv # (1)!\n
  1. One of q, default, v, vv, or vvv.

If nothing is specified, log-level will default to default or vv depending on if debug logging is enabled for the action.

"},{"location":"advanced/github_actions/#self-hosted-runners","title":"Self-hosted runners","text":"

On self-hosted runners, it may happen that some files are persisted between jobs. This can lead to problems or secrets getting leaked between job runs. To avoid this, you can use the post-cleanup input to specify the post cleanup behavior of the action (i.e., what happens after all your commands have been executed).

If you set post-cleanup to true, the action will delete the following files:

  • .pixi environment
  • the pixi binary
  • the rattler cache
  • other rattler files in ~/.rattler

If nothing is specified, post-cleanup will default to true.

On self-hosted runners, you also might want to alter the default pixi install location to a temporary location. You can use pixi-bin-path: ${{ runner.temp }}/bin/pixi to do this.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    post-cleanup: true\n    pixi-bin-path: ${{ runner.temp }}/bin/pixi # (1)!\n
  1. ${{ runner.temp }}\\Scripts\\pixi.exe on Windows

You can also use a preinstalled local version of pixi on the runner by not setting any of the pixi-version, pixi-url or pixi-bin-path inputs. This action will then try to find a local version of pixi in the runner's PATH.

"},{"location":"advanced/github_actions/#using-the-pyprojecttoml-as-a-manifest-file-for-pixi","title":"Using the pyproject.toml as a manifest file for pixi.","text":"

setup-pixi will automatically pick up the pyproject.toml if it contains a [tool.pixi.project] section and no pixi.toml. This can be overwritten by setting the manifest-path input argument.

- uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    manifest-path: pyproject.toml\n
"},{"location":"advanced/github_actions/#more-examples","title":"More examples","text":"

If you want to see more examples, you can take a look at the GitHub Workflows of the setup-pixi repository.

"},{"location":"advanced/production_deployment/","title":"Bringing pixi to production","text":"

You can bring pixi projects into production by either containerizing it using tools like Docker or by using quantco/pixi-pack.

@pavelzw from QuantCo wrote a blog post about bringing pixi to production. You can read it here.

"},{"location":"advanced/production_deployment/#docker","title":"Docker","text":"

We provide a simple docker image at pixi-docker that contains the pixi executable on top of different base images.

The images are available on ghcr.io/prefix-dev/pixi.

There are different tags for different base images available:

  • latest - based on ubuntu:jammy
  • focal - based on ubuntu:focal
  • bullseye - based on debian:bullseye
  • jammy-cuda-12.2.2 - based on nvidia/cuda:12.2.2-jammy
  • ... and more

All tags

For all tags, take a look at the build script.

"},{"location":"advanced/production_deployment/#example-usage","title":"Example usage","text":"

The following example uses the pixi docker image as a base image for a multi-stage build. It also makes use of pixi shell-hook to not rely on pixi being installed in the production container.

More examples

For more examples, take a look at pavelzw/pixi-docker-example.

FROM ghcr.io/prefix-dev/pixi:0.39.2 AS build\n\n# copy source code, pixi.toml and pixi.lock to the container\nWORKDIR /app\nCOPY . .\n# install dependencies to `/app/.pixi/envs/prod`\n# use `--locked` to ensure the lockfile is up to date with pixi.toml\nRUN pixi install --locked -e prod\n# create the shell-hook bash script to activate the environment\nRUN pixi shell-hook -e prod -s bash > /shell-hook\nRUN echo \"#!/bin/bash\" > /app/entrypoint.sh\nRUN cat /shell-hook >> /app/entrypoint.sh\n# extend the shell-hook script to run the command passed to the container\nRUN echo 'exec \"$@\"' >> /app/entrypoint.sh\n\nFROM ubuntu:24.04 AS production\nWORKDIR /app\n# only copy the production environment into prod container\n# please note that the \"prefix\" (path) needs to stay the same as in the build container\nCOPY --from=build /app/.pixi/envs/prod /app/.pixi/envs/prod\nCOPY --from=build --chmod=0755 /app/entrypoint.sh /app/entrypoint.sh\n# copy your project code into the container as well\nCOPY ./my_project /app/my_project\n\nEXPOSE 8000\nENTRYPOINT [ \"/app/entrypoint.sh\" ]\n# run your app inside the pixi environment\nCMD [ \"uvicorn\", \"my_project:app\", \"--host\", \"0.0.0.0\" ]\n
"},{"location":"advanced/production_deployment/#pixi-pack","title":"pixi-pack","text":"

pixi-pack is a simple tool that takes a pixi environment and packs it into a compressed archive that can be shipped to the target machine.

It can be installed via

pixi global install pixi-pack\n

Or by downloading our pre-built binaries from the releases page.

Instead of installing pixi-pack globally, you can also use pixi exec to run pixi-pack in a temporary environment:

pixi exec pixi-pack pack\npixi exec pixi-pack unpack environment.tar\n

You can pack an environment with

pixi-pack pack --manifest-file pixi.toml --environment prod --platform linux-64\n

This will create an environment.tar file that contains all conda packages required to create the environment.

# environment.tar\n| pixi-pack.json\n| environment.yml\n| channel\n|    \u251c\u2500\u2500 noarch\n|    |    \u251c\u2500\u2500 tzdata-2024a-h0c530f3_0.conda\n|    |    \u251c\u2500\u2500 ...\n|    |    \u2514\u2500\u2500 repodata.json\n|    \u2514\u2500\u2500 linux-64\n|         \u251c\u2500\u2500 ca-certificates-2024.2.2-hbcca054_0.conda\n|         \u251c\u2500\u2500 ...\n|         \u2514\u2500\u2500 repodata.json\n
"},{"location":"advanced/production_deployment/#unpacking-an-environment","title":"Unpacking an environment","text":"

With pixi-pack unpack environment.tar, you can unpack the environment on your target system. This will create a new conda environment in ./env that contains all packages specified in your pixi.toml. It also creates an activate.sh (or activate.bat on Windows) file that lets you activate the environment without needing to have conda or micromamba installed.

"},{"location":"advanced/production_deployment/#cross-platform-packs","title":"Cross-platform packs","text":"

Since pixi-pack just downloads the .conda and .tar.bz2 files from the conda repositories, you can trivially create packs for different platforms.

pixi-pack pack --platform win-64\n

You can only unpack a pack on a system that has the same platform as the pack was created for.

"},{"location":"advanced/production_deployment/#inject-additional-packages","title":"Inject additional packages","text":"

You can inject additional packages into the environment that are not specified in pixi.lock by using the --inject flag:

pixi-pack pack --inject local-package-1.0.0-hbefa133_0.conda --manifest-pack pixi.toml\n

This can be particularly useful if you build the project itself and want to include the built package in the environment but still want to use pixi.lock from the project.

"},{"location":"advanced/production_deployment/#unpacking-without-pixi-pack","title":"Unpacking without pixi-pack","text":"

If you don't have pixi-pack available on your target system, you can still install the environment if you have conda or micromamba available. Just unarchive the environment.tar, then you have a local channel on your system where all necessary packages are available. Next to this local channel, you will find an environment.yml file that contains the environment specification. You can then install the environment using conda or micromamba:

tar -xvf environment.tar\nmicromamba create -p ./env --file environment.yml\n# or\nconda env create -p ./env --file environment.yml\n

The environment.yml and repodata.json files are only for this use case; pixi-pack unpack does not use them.

"},{"location":"advanced/pyproject_toml/","title":"pyproject.toml in pixi","text":"

We support the use of the pyproject.toml as our manifest file in pixi. This allows the user to keep one file with all configuration. The pyproject.toml file is a standard for Python projects. We don't advise using the pyproject.toml file for anything other than Python projects; the pixi.toml is better suited for other types of projects.

"},{"location":"advanced/pyproject_toml/#initial-setup-of-the-pyprojecttoml-file","title":"Initial setup of the pyproject.toml file","text":"

When you already have a pyproject.toml file in your project, you can run pixi init in that folder. Pixi will automatically

  • Add a [tool.pixi.project] section to the file, with the platform and channel information required by pixi;
  • Add the current project as an editable pypi dependency;
  • Add some defaults to the .gitignore and .gitattributes files.

If you do not have an existing pyproject.toml file, you can run pixi init --format pyproject in your project folder. In that case, pixi will create a pyproject.toml manifest from scratch with some sane defaults.

"},{"location":"advanced/pyproject_toml/#python-dependency","title":"Python dependency","text":"

The pyproject.toml file supports the requires-python field. Pixi understands that field and automatically adds the version to the dependencies.

This is an example of a pyproject.toml file with the requires-python field, which will be used as the python dependency:

pyproject.toml
[project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n

Which is equivalent to:

equivalent pixi.toml
[project]\nname = \"my_project\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[dependencies]\npython = \">=3.9\"\n
"},{"location":"advanced/pyproject_toml/#dependency-section","title":"Dependency section","text":"

The pyproject.toml file supports the dependencies field. Pixi understands that field and automatically adds the dependencies to the project as [pypi-dependencies].

This is an example of a pyproject.toml file with the dependencies field:

pyproject.toml
[project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n

Which is equivalent to:

equivalent pixi.toml
[project]\nname = \"my_project\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[pypi-dependencies]\nnumpy = \"*\"\npandas = \"*\"\nmatplotlib = \"*\"\n\n[dependencies]\npython = \">=3.9\"\n

You can overwrite these with conda dependencies by adding them to the dependencies field:

pyproject.toml
[project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tool.pixi.dependencies]\nnumpy = \"*\"\npandas = \"*\"\nmatplotlib = \"*\"\n

This would result in the conda dependencies being installed and the pypi dependencies being ignored, as pixi takes conda dependencies over pypi dependencies.

"},{"location":"advanced/pyproject_toml/#optional-dependencies","title":"Optional dependencies","text":"

If your python project includes groups of optional dependencies, pixi will automatically interpret them as pixi features of the same name with the associated pypi-dependencies.

You can add them to pixi environments manually, or use pixi init to set up the project, which will create one environment per feature. Self-references to other groups of optional dependencies are also handled.

For instance, imagine you have a project folder with a pyproject.toml file similar to:

[project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[project.optional-dependencies]\ntest = [\"pytest\"]\nall = [\"package2\",\"my_project[test]\"]\n

Running pixi init in that project folder will transform the pyproject.toml file into:

[project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[project.optional-dependencies]\ntest = [\"pytest\"]\nall = [\"package2\",\"my_project[test]\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # if executed on linux\n\n[tool.pixi.environments]\ndefault = {features = [], solve-group = \"default\"}\ntest = {features = [\"test\"], solve-group = \"default\"}\nall = {features = [\"all\", \"test\"], solve-group = \"default\"}\n

In this example, three environments will be created by pixi:

  • default with 'package1' as pypi dependency
  • test with 'package1' and 'pytest' as pypi dependencies
  • all with 'package1', 'package2' and 'pytest' as pypi dependencies

All environments will be solved together, as indicated by the common solve-group, and added to the lock file. You can edit the [tool.pixi.environments] section manually to adapt it to your use case (e.g. if you do not need a particular environment).
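The expansion of optional-dependency groups, including self-references like my_project[test], can be sketched as follows. This is a hypothetical illustration of the idea, not pixi's actual implementation:

```python
import re

name = "my_project"
base = ["package1"]
optional = {
    "test": ["pytest"],
    "all": ["package2", "my_project[test]"],
}

def expand(group: str) -> list[str]:
    """Flatten a group, following self-references to other groups."""
    packages = []
    for dep in optional[group]:
        m = re.fullmatch(rf"{name}\[(\w+)\]", dep)
        if m:
            packages += expand(m.group(1))  # self-reference: recurse into that group
        else:
            packages.append(dep)
    return packages

environments = {"default": list(base)}
for group in optional:
    environments[group] = base + expand(group)

print(environments)
# {'default': ['package1'], 'test': ['package1', 'pytest'],
#  'all': ['package1', 'package2', 'pytest']}
```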

"},{"location":"advanced/pyproject_toml/#dependency-groups","title":"Dependency groups","text":"

If your python project includes dependency groups, pixi will automatically interpret them as pixi features of the same name with the associated pypi-dependencies.

You can add them to pixi environments manually, or use pixi init to set up the project, which will create one environment per dependency group.

For instance, imagine you have a project folder with a pyproject.toml file similar to:

[project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[dependency-groups]\ntest = [\"pytest\"]\ndocs = [\"sphinx\"]\ndev = [{include-group = \"test\"}, {include-group = \"docs\"}]\n

Running pixi init in that project folder will transform the pyproject.toml file into:

[project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[dependency-groups]\ntest = [\"pytest\"]\ndocs = [\"sphinx\"]\ndev = [{include-group = \"test\"}, {include-group = \"docs\"}]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # if executed on linux\n\n[tool.pixi.environments]\ndefault = {features = [], solve-group = \"default\"}\ntest = {features = [\"test\"], solve-group = \"default\"}\ndocs = {features = [\"docs\"], solve-group = \"default\"}\ndev = {features = [\"dev\"], solve-group = \"default\"}\n

In this example, four environments will be created by pixi:

  • default with 'package1' as pypi dependency
  • test with 'package1' and 'pytest' as pypi dependencies
  • docs with 'package1', 'sphinx' as pypi dependencies
  • dev with 'package1', 'sphinx' and 'pytest' as pypi dependencies

All environments will be solved together, as indicated by the common solve-group, and added to the lock file. You can edit the [tool.pixi.environments] section manually to adapt it to your use case (e.g. if you do not need a particular environment).

"},{"location":"advanced/pyproject_toml/#example","title":"Example","text":"

As the pyproject.toml file supports the full pixi spec with [tool.pixi] prepended, an example would look like this:

pyproject.toml
[project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n    \"ruff\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tool.pixi.dependencies]\ncompilers = \"*\"\ncmake = \"*\"\n\n[tool.pixi.tasks]\nstart = \"python my_project/main.py\"\nlint = \"ruff lint\"\n\n[tool.pixi.system-requirements]\ncuda = \"11.0\"\n\n[tool.pixi.feature.test.dependencies]\npytest = \"*\"\n\n[tool.pixi.feature.test.tasks]\ntest = \"pytest\"\n\n[tool.pixi.environments]\ntest = [\"test\"]\n
"},{"location":"advanced/pyproject_toml/#build-system-section","title":"Build-system section","text":"

The pyproject.toml file normally contains a [build-system] section. Pixi will use this section to build and install the project if it is added as a pypi path dependency.

If the pyproject.toml file does not contain any [build-system] section, pixi will fall back to uv's default, which is equivalent to the below:

pyproject.toml
[build-system]\nrequires = [\"setuptools >= 40.8.0\"]\nbuild-backend = \"setuptools.build_meta:__legacy__\"\n

Including a [build-system] section is highly recommended. If you are not sure of the build-backend you want to use, including the [build-system] section below in your pyproject.toml is a good starting point. pixi init --format pyproject defaults to hatchling. The advantages of hatchling over setuptools are outlined on its website.

pyproject.toml
[build-system]\nbuild-backend = \"hatchling.build\"\nrequires = [\"hatchling\"]\n
"},{"location":"advanced/updates_github_actions/","title":"Update lockfiles with GitHub Actions","text":"

You can leverage GitHub Actions in combination with pavelzw/pixi-diff-to-markdown to automatically update your lockfiles similar to dependabot or renovate in other ecosystems.

Dependabot/Renovate support for pixi

You can track native Dependabot support for pixi in dependabot/dependabot-core #2227 and for Renovate in renovatebot/renovate #2213.

"},{"location":"advanced/updates_github_actions/#how-to-use","title":"How to use","text":"

To get started, create a new GitHub Actions workflow file in your repository.

.github/workflows/update-lockfiles.yml
name: Update lockfiles\n\npermissions: # (1)!\n  contents: write\n  pull-requests: write\n\non:\n  workflow_dispatch:\n  schedule:\n    - cron: 0 5 1 * * # (2)!\n\njobs:\n  pixi-update:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Set up pixi\n        uses: prefix-dev/setup-pixi@v0.8.1\n        with:\n          run-install: false\n      - name: Update lockfiles\n        run: |\n          set -o pipefail\n          pixi update --json | pixi exec pixi-diff-to-markdown >> diff.md\n      - name: Create pull request\n        uses: peter-evans/create-pull-request@v7\n        with:\n          token: ${{ secrets.GITHUB_TOKEN }}\n          commit-message: Update pixi lockfile\n          title: Update pixi lockfile\n          body-path: diff.md\n          branch: update-pixi\n          base: main\n          labels: pixi\n          delete-branch: true\n          add-paths: pixi.lock\n
  1. Needed for peter-evans/create-pull-request
  2. Runs at 05:00, on day 1 of the month

In order for this workflow to work, you need to set \"Allow GitHub Actions to create and approve pull requests\" to true in your repository settings (in \"Actions\" -> \"General\").

Tip

If you don't have any pypi-dependencies, you can use pixi update --json --no-install to speed up diff generation.

"},{"location":"advanced/updates_github_actions/#triggering-ci-in-automated-prs","title":"Triggering CI in automated PRs","text":"

In order to prevent accidental recursive GitHub Workflow runs, GitHub decided not to trigger any workflows on automated PRs when using the default GITHUB_TOKEN. There are a couple of ways to work around this limitation. You can find excellent documentation for this in peter-evans/create-pull-request.

"},{"location":"advanced/updates_github_actions/#customizing-the-summary","title":"Customizing the summary","text":"

You can customize the summary by either using command-line-arguments of pixi-diff-to-markdown or by specifying the configuration in pixi.toml under [tool.pixi-diff-to-markdown]. See the pixi-diff-to-markdown documentation or run pixi-diff-to-markdown --help for more information.

"},{"location":"advanced/updates_github_actions/#using-reusable-workflows","title":"Using reusable workflows","text":"

If you want to use the same workflow in multiple repositories in your GitHub organization, you can create a reusable workflow. You can find more information in the GitHub documentation.

"},{"location":"examples/cpp-sdl/","title":"SDL example","text":"

The cpp-sdl example is located in the pixi repository.

git clone https://github.com/prefix-dev/pixi.git\n

Move to the example folder

cd pixi/examples/cpp-sdl\n

Run the start command

pixi run start\n

Using the depends-on feature, you only needed to run the start task, but under the hood it is running the following tasks.

# Configure the CMake project\npixi run configure\n\n# Build the executable\npixi run build\n\n# Start the build executable\npixi run start\n
"},{"location":"examples/opencv/","title":"Opencv example","text":"

The opencv example is located in the pixi repository.

git clone https://github.com/prefix-dev/pixi.git\n

Move to the example folder

cd pixi/examples/opencv\n
"},{"location":"examples/opencv/#face-detection","title":"Face detection","text":"

Run the start command to start the face detection algorithm.

pixi run start\n

The screen that starts should look like this:

Check out the webcam_capture.py to see how we detect a face.

"},{"location":"examples/opencv/#camera-calibration","title":"Camera Calibration","text":"

In addition to face detection, a camera calibration example is also included.

You'll need a checkerboard for this to work. Print this:

Then run

pixi run calibrate\n

To take a picture for calibration, press SPACE. Do this approximately 10 times with the checkerboard in view of the camera.

After that, press ESC, which will start the calibration.

When the calibration is done, the camera will be used again to find the distance to the checkerboard.

"},{"location":"examples/ros2-nav2/","title":"Navigation 2 example","text":"

The nav2 example is located in the pixi repository.

git clone https://github.com/prefix-dev/pixi.git\n

Move to the example folder

cd pixi/examples/ros2-nav2\n

Run the start command

pixi run start\n
"},{"location":"features/advanced_tasks/","title":"Advanced tasks","text":"

When building a package, you often have to do more than just run the code. Steps like formatting, linting, compiling, testing, benchmarking, etc. are often part of a project. With pixi tasks, this should become much easier to do.

Here are some quick examples

pixi.toml
[tasks]\n# Commands as lists so you can also add documentation in between.\nconfigure = { cmd = [\n    \"cmake\",\n    # Use the cross-platform Ninja generator\n    \"-G\",\n    \"Ninja\",\n    # The source is in the root directory\n    \"-S\",\n    \".\",\n    # We wanna build in the .build directory\n    \"-B\",\n    \".build\",\n] }\n\n# Depend on other tasks\nbuild = { cmd = [\"ninja\", \"-C\", \".build\"], depends-on = [\"configure\"] }\n\n# Using environment variables\nrun = \"python main.py $PIXI_PROJECT_ROOT\"\nset = \"export VAR=hello && echo $VAR\"\n\n# Cross platform file operations\ncopy = \"cp pixi.toml pixi_backup.toml\"\nclean = \"rm pixi_backup.toml\"\nmove = \"mv pixi.toml backup.toml\"\n
"},{"location":"features/advanced_tasks/#depends-on","title":"Depends on","text":"

Just like packages can depend on other packages, our tasks can depend on other tasks. This allows for complete pipelines to be run with a single command.

An obvious example is compiling before running an application.

Check out our cpp_sdl example for a running example. In that package we have some tasks that depend on each other, so we can ensure that when you run pixi run start everything is set up as expected.

pixi task add configure \"cmake -G Ninja -S . -B .build\"\npixi task add build \"ninja -C .build\" --depends-on configure\npixi task add start \".build/bin/sdl_example\" --depends-on build\n

Results in the following lines added to the pixi.toml

pixi.toml
[tasks]\n# Configures CMake\nconfigure = \"cmake -G Ninja -S . -B .build\"\n# Build the executable but make sure CMake is configured first.\nbuild = { cmd = \"ninja -C .build\", depends-on = [\"configure\"] }\n# Start the built executable\nstart = { cmd = \".build/bin/sdl_example\", depends-on = [\"build\"] }\n
pixi run start\n

The tasks will be executed after each other:

  • First configure because it has no dependencies.
  • Then build as it only depends on configure.
  • Then start as all its dependencies are run.

If one of the commands fails (exits with a non-zero code), it will stop and the next one will not be started.

With this logic, you can also create aliases, as a task does not have to specify a command.

pixi task add fmt ruff\npixi task add lint pylint\n
pixi task alias style fmt lint\n

Results in the following pixi.toml.

pixi.toml
fmt = \"ruff\"\nlint = \"pylint\"\nstyle = { depends-on = [\"fmt\", \"lint\"] }\n

Now run both tools with one command.

pixi run style\n
"},{"location":"features/advanced_tasks/#working-directory","title":"Working directory","text":"

Pixi tasks support the definition of a working directory.

cwd\" stands for Current Working Directory. The directory is relative to the pixi package root, where the pixi.toml file is located.

Consider a pixi project structured as follows:

\u251c\u2500\u2500 pixi.toml\n\u2514\u2500\u2500 scripts\n    \u2514\u2500\u2500 bar.py\n

To add a task to run the bar.py file, use:

pixi task add bar \"python bar.py\" --cwd scripts\n

This will add the following line to the manifest file:

pixi.toml
[tasks]\nbar = { cmd = \"python bar.py\", cwd = \"scripts\" }\n
"},{"location":"features/advanced_tasks/#caching","title":"Caching","text":"

When you specify inputs and/or outputs to a task, pixi will reuse the result of the task.

For the cache, pixi checks that the following are true:

  • No package in the environment has changed.
  • The selected inputs and outputs are the same as the last time the task was run. We compute fingerprints of all the files selected by the globs and compare them to the last time the task was run.
  • The command is the same as the last time the task was run.

If all of these conditions are met, pixi will not run the task again and instead use the existing result.
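The fingerprint comparison behind this cache check can be sketched like this. This is a hypothetical illustration of the idea, not pixi's actual hashing scheme:

```python
import glob
import hashlib
import os
import tempfile
from pathlib import Path

def fingerprint(root: str, patterns: list[str]) -> str:
    """Combined hash of all files selected by the globs, in a stable order."""
    digest = hashlib.sha256()
    for pattern in patterns:
        for path in sorted(glob.glob(os.path.join(root, pattern), recursive=True)):
            p = Path(path)
            if p.is_file():
                digest.update(p.name.encode())
                digest.update(p.read_bytes())
    return digest.hexdigest()

with tempfile.TemporaryDirectory() as root:
    Path(root, "main.py").write_text("print('hello')\n")
    first = fingerprint(root, ["*.py"])
    unchanged = fingerprint(root, ["*.py"])  # same inputs -> same fingerprint -> cache hit
    Path(root, "main.py").write_text("print('bye')\n")
    changed = fingerprint(root, ["*.py"])    # inputs changed -> rerun the task

print(first == unchanged)  # True
print(first == changed)    # False
```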

Inputs and outputs can be specified as globs, which will be expanded to all matching files.

pixi.toml
[tasks]\n# This task will only run if the `main.py` file has changed.\nrun = { cmd = \"python main.py\", inputs = [\"main.py\"] }\n\n# This task will remember the result of the `curl` command and not run it again if the file `data.csv` already exists.\ndownload_data = { cmd = \"curl -o data.csv https://example.com/data.csv\", outputs = [\"data.csv\"] }\n\n# This task will only run if the `src` directory has changed and will remember the result of the `make` command.\nbuild = { cmd = \"make\", inputs = [\"src/*.cpp\", \"include/*.hpp\"], outputs = [\"build/app.exe\"] }\n

Note: if you want to debug the globs you can use the --verbose flag to see which files are selected.

# shows info logs of all files that were selected by the globs\npixi run -v start\n
"},{"location":"features/advanced_tasks/#environment-variables","title":"Environment variables","text":"

You can set environment variables for a task. These are seen as \"default\" values for the variables as you can overwrite them from the shell.

pixi.toml

[tasks]\necho = { cmd = \"echo $ARGUMENT\", env = { ARGUMENT = \"hello\" } }\n
If you run pixi run echo it will output hello. When you set the environment variable ARGUMENT before running the task, it will use that value instead.

ARGUMENT=world pixi run echo\n\u2728 Pixi task (echo in default): echo $ARGUMENT\nworld\n

These variables are not shared over tasks, so you need to define these for every task you want to use them in.

Extend instead of overwrite

If you use the same environment variable in the value as in the key of the map, the existing value is included, so the variable is extended rather than overwritten. For example, prepending to PATH: pixi.toml

[tasks]\necho = { cmd = \"echo $PATH\", env = { PATH = \"/tmp/path:$PATH\" } }\n
This will output /tmp/path:/usr/bin:/bin instead of the original /usr/bin:/bin.

"},{"location":"features/advanced_tasks/#clean-environment","title":"Clean environment","text":"

You can make sure the environment of a task is \"pixi only\". Here pixi will only include the minimal required environment variables for your platform to run the command in. The environment will contain all variables set by the conda environment, like \"CONDA_PREFIX\". It will, however, also include some default values from the shell, like: \"DISPLAY\", \"LC_ALL\", \"LC_TIME\", \"LC_NUMERIC\", \"LC_MEASUREMENT\", \"SHELL\", \"USER\", \"USERNAME\", \"LOGNAME\", \"HOME\", \"HOSTNAME\", \"TMPDIR\", \"XPC_SERVICE_NAME\", \"XPC_FLAGS\"

[tasks]\nclean_command = { cmd = \"python run_in_isolated_env.py\", clean-env = true}\n
This setting can also be set from the command line with pixi run --clean-env TASK_NAME.

clean-env not supported on Windows

On Windows it's hard to create a \"clean environment\" as conda-forge doesn't ship Windows compilers and Windows needs a lot of base variables. This makes the feature not worth implementing, as the number of edge cases would make it unusable.

"},{"location":"features/advanced_tasks/#our-task-runner-deno_task_shell","title":"Our task runner: deno_task_shell","text":"

To support the different operating systems (Windows, macOS, and Linux), pixi integrates a shell that can run on all of them. This is deno_task_shell. The task shell is a limited implementation of a Bourne-shell interface.

"},{"location":"features/advanced_tasks/#built-in-commands","title":"Built-in commands","text":"

Next to running actual executables like ./myprogram, cmake or python, the shell has some built-in commands.

  • cp: Copies files.
  • mv: Moves files.
  • rm: Remove files or directories. Ex: rm -rf [FILE]... - Commonly used to recursively delete files or directories.
  • mkdir: Makes directories. Ex. mkdir -p DIRECTORY... - Commonly used to make a directory and all its parents with no error if it exists.
  • pwd: Prints the name of the current/working directory.
  • sleep: Delays for a specified amount of time. Ex. sleep 1 to sleep for 1 second, sleep 0.5 to sleep for half a second, or sleep 1m to sleep a minute
  • echo: Displays a line of text.
  • cat: Concatenates files and outputs them on stdout. When no arguments are provided, it reads and outputs stdin.
  • exit: Causes the shell to exit.
  • unset: Unsets environment variables.
  • xargs: Builds arguments from stdin and executes a command.
"},{"location":"features/advanced_tasks/#syntax","title":"Syntax","text":"
  • Boolean list: use && or || to separate two commands.
    • &&: if the command before && succeeds continue with the next command.
    • ||: if the command before || fails continue with the next command.
  • Sequential lists: use ; to run two commands without checking if the first command failed or succeeded.
  • Environment variables:
    • Set env variable using: export ENV_VAR=value
    • Use env variable using: $ENV_VAR
    • unset env variable using unset ENV_VAR
  • Shell variables: Shell variables are similar to environment variables, but won\u2019t be exported to spawned commands.
    • Set them: VAR=value
    • use them: VAR=value && echo $VAR
  • Pipelines: Use the stdout output of a command as the stdin of the following command
    • |: echo Hello | python receiving_app.py
    • |&: use this to also get the stderr as input.
  • Command substitution: $() to use the output of a command as input for another command.
    • python main.py $(git rev-parse HEAD)
  • Negate exit code: ! before any command will negate the exit code from 1 to 0 or vice versa.
  • Redirects: > to redirect the stdout to a file.
    • echo hello > file.txt will put hello in file.txt and overwrite existing text.
    • python main.py 2> file.txt will put the stderr output in file.txt.
    • python main.py &> file.txt will put the stderr and stdout in file.txt.
    • echo hello >> file.txt will append hello to the existing file.txt.
  • Glob expansion: * to expand all options.
    • echo *.py will echo all filenames that end with .py
    • echo **/*.py will echo all filenames that end with .py in this directory and all descendant directories.
    • echo data[0-9].csv will echo all filenames that have a single number after data and before .csv

More info in deno_task_shell documentation.

"},{"location":"features/environment/","title":"Environments","text":"

Pixi is a tool to manage virtual environments. This document explains what an environment looks like and how to use it.

"},{"location":"features/environment/#structure","title":"Structure","text":"

A pixi environment is located in the .pixi/envs directory of the project by default. This keeps your machine and your project clean and isolated from each other, and makes it easy to clean up after a project is done. While this structure is generally recommended, environments can also be stored outside of project directories by enabling detached environments.

If you look at the .pixi/envs directory, you will see a directory for each environment. The default one is normally used; if you specify a custom environment, the name you specified will be used instead.

.pixi\n\u2514\u2500\u2500 envs\n    \u251c\u2500\u2500 cuda\n    \u2502   \u251c\u2500\u2500 bin\n    \u2502   \u251c\u2500\u2500 conda-meta\n    \u2502   \u251c\u2500\u2500 etc\n    \u2502   \u251c\u2500\u2500 include\n    \u2502   \u251c\u2500\u2500 lib\n    \u2502   ...\n    \u2514\u2500\u2500 default\n        \u251c\u2500\u2500 bin\n        \u251c\u2500\u2500 conda-meta\n        \u251c\u2500\u2500 etc\n        \u251c\u2500\u2500 include\n        \u251c\u2500\u2500 lib\n        ...\n

These directories are conda environments, and you can use them as such, but you cannot manually edit them; changes should always go through the pixi.toml. Pixi will always make sure the environment is in sync with the pixi.lock file. If this is not the case, then all the commands that use the environment will automatically update the environment, e.g. pixi run, pixi shell.

"},{"location":"features/environment/#environment-installation-metadata","title":"Environment Installation Metadata","text":"

On environment installation, pixi will write a small file to the environment that contains some metadata about the installation. This file is called pixi and is located in the conda-meta folder of the environment. This file contains the following information:

  • manifest_path: The path to the manifest file that describes the project used to create this environment
  • environment_name: The name of the environment
  • pixi_version: The version of pixi that was used to create this environment
  • environment_lock_file_hash: The hash of the pixi.lock file that was used to create this environment
{\n  \"manifest_path\": \"/home/user/dev/pixi/pixi.toml\",\n  \"environment_name\": \"default\",\n  \"pixi_version\": \"0.34.0\",\n  \"environment_lock_file_hash\": \"4f36ee620f10329d\"\n}\n

The environment_lock_file_hash is used to check if the environment is in sync with the pixi.lock file. If the hash of the pixi.lock file is different from the hash in the pixi file, pixi will update the environment.
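The comparison can be sketched as follows. This is a hypothetical illustration; pixi's real hash format may differ (a truncated sha256 stands in here):

```python
import hashlib

def lock_file_hash(lock_contents: bytes) -> str:
    # Assumption: a truncated sha256 stands in for pixi's actual hash format.
    return hashlib.sha256(lock_contents).hexdigest()[:16]

# Metadata as stored in <env>/conda-meta/pixi (shape taken from the example above).
metadata = {
    "manifest_path": "/home/user/dev/pixi/pixi.toml",
    "environment_name": "default",
    "pixi_version": "0.34.0",
    "environment_lock_file_hash": lock_file_hash(b"version: 6\n..."),
}

def is_up_to_date(metadata: dict, lock_contents: bytes) -> bool:
    """Compare the stored hash with the current pixi.lock contents."""
    return metadata["environment_lock_file_hash"] == lock_file_hash(lock_contents)

print(is_up_to_date(metadata, b"version: 6\n..."))  # True  -> skip reinstall
print(is_up_to_date(metadata, b"version: 7\n..."))  # False -> update the environment
```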

This hash comparison is used to speed up activation. To trigger a full revalidation, pass --revalidate to the pixi run or pixi shell command. A broken environment would typically not be found with a hash comparison, but a revalidation would reinstall the environment. By default, all lock-file-modifying commands always revalidate, and pixi install always revalidates as well.

"},{"location":"features/environment/#cleaning-up","title":"Cleaning up","text":"

If you want to clean up the environments, you can simply delete the .pixi/envs directory, and pixi will recreate the environments when needed.

# either:\nrm -rf .pixi/envs\n\n# or per environment:\nrm -rf .pixi/envs/default\nrm -rf .pixi/envs/cuda\n
"},{"location":"features/environment/#activation","title":"Activation","text":"

An environment is nothing more than a set of files that are installed into a certain location, somewhat mimicking a global system install. You need to activate the environment to use it. In the simplest sense, that means adding the bin directory of the environment to the PATH variable. But there is more to it in a conda environment, as it also sets some environment variables.

To do the activation we have multiple options:

  • Use the pixi shell command to open a shell with the environment activated.
  • Use the pixi shell-hook command to print the command to activate the environment in your current shell.
  • Use the pixi run command to run a command in the environment.

The run command is special, as it runs its own cross-platform shell and has the ability to run tasks. More information about tasks can be found in the tasks documentation.

Running pixi shell-hook in the pixi project, you would get the following output:

export PATH=\"/home/user/development/pixi/.pixi/envs/default/bin:/home/user/.local/bin:/home/user/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/home/user/.pixi/bin\"\nexport CONDA_PREFIX=\"/home/user/development/pixi/.pixi/envs/default\"\nexport PIXI_PROJECT_NAME=\"pixi\"\nexport PIXI_PROJECT_ROOT=\"/home/user/development/pixi\"\nexport PIXI_PROJECT_VERSION=\"0.12.0\"\nexport PIXI_PROJECT_MANIFEST=\"/home/user/development/pixi/pixi.toml\"\nexport CONDA_DEFAULT_ENV=\"pixi\"\nexport PIXI_ENVIRONMENT_PLATFORMS=\"osx-64,linux-64,win-64,osx-arm64\"\nexport PIXI_ENVIRONMENT_NAME=\"default\"\nexport PIXI_PROMPT=\"(pixi) \"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-binutils_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gcc_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gfortran_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gxx_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/libglib_activate.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/rust.sh\"\n

It sets the PATH and some more environment variables. But more importantly, it also runs activation scripts that are provided by the installed packages. An example of this is the libglib_activate.sh script. Thus, just adding the bin directory to the PATH is not enough.

"},{"location":"features/environment/#traditional-conda-activate-like-activation","title":"Traditional conda activate-like activation","text":"

If you prefer to use the traditional conda activate-like activation, you could use the pixi shell-hook command.

$ which python\npython not found\n$ eval \"$(pixi shell-hook)\"\n$ (default) which python\n/path/to/project/.pixi/envs/default/bin/python\n

Warning

It is not encouraged to use the traditional conda activate-like activation, as deactivating the environment is not really possible. Use pixi shell instead.

"},{"location":"features/environment/#using-pixi-with-direnv","title":"Using pixi with direnv","text":"Installing direnv

Of course you can use pixi to install direnv globally. We recommend running

pixi global install direnv\n

to install the latest version of direnv on your computer.

This allows you to use pixi in combination with direnv. Enter the following into your .envrc file:

.envrc
watch_file pixi.lock # (1)!\neval \"$(pixi shell-hook)\" # (2)!\n
  1. This ensures that every time your pixi.lock changes, direnv invokes the shell-hook again.
  2. This installs if needed, and activates the environment. direnv ensures that the environment is deactivated when you leave the directory.
$ cd my-project\ndirenv: error /my-project/.envrc is blocked. Run `direnv allow` to approve its content\n$ direnv allow\ndirenv: loading /my-project/.envrc\n\u2714 Project in /my-project is ready to use!\ndirenv: export +CONDA_DEFAULT_ENV +CONDA_PREFIX +PIXI_ENVIRONMENT_NAME +PIXI_ENVIRONMENT_PLATFORMS +PIXI_PROJECT_MANIFEST +PIXI_PROJECT_NAME +PIXI_PROJECT_ROOT +PIXI_PROJECT_VERSION +PIXI_PROMPT ~PATH\n$ which python\n/my-project/.pixi/envs/default/bin/python\n$ cd ..\ndirenv: unloading\n$ which python\npython not found\n
"},{"location":"features/environment/#environment-variables","title":"Environment variables","text":"

The following environment variables are set by pixi, when using the pixi run, pixi shell, or pixi shell-hook command:

  • PIXI_PROJECT_ROOT: The root directory of the project.
  • PIXI_PROJECT_NAME: The name of the project.
  • PIXI_PROJECT_MANIFEST: The path to the manifest file (pixi.toml).
  • PIXI_PROJECT_VERSION: The version of the project.
  • PIXI_PROMPT: The prompt to use in the shell, also used by pixi shell itself.
  • PIXI_ENVIRONMENT_NAME: The name of the environment, defaults to default.
  • PIXI_ENVIRONMENT_PLATFORMS: Comma separated list of platforms supported by the project.
  • CONDA_PREFIX: The path to the environment. (Used by multiple tools that already understand conda environments)
  • CONDA_DEFAULT_ENV: The name of the environment. (Used by multiple tools that already understand conda environments)
  • PATH: We prepend the bin directory of the environment to the PATH variable, so you can use the tools installed in the environment directly.
  • INIT_CWD: ONLY IN pixi run: The directory where the command was run from.

Note

Even though these are environment variables, they cannot be overridden. For example, you cannot change the root of the project by setting PIXI_PROJECT_ROOT in the environment.

"},{"location":"features/environment/#solving-environments","title":"Solving environments","text":"

When you run a command that uses the environment, pixi checks whether the environment is in sync with the pixi.lock file. If it is not, pixi solves the environment and updates it. That is, pixi computes the best set of packages for the dependency requirements you specified in the pixi.toml and writes the output of this solve step to the pixi.lock file. Solving is a mathematical problem and can take some time, but we take pride in the way we solve environments, and we are confident that we can solve yours in a reasonable time. If you want to learn more about the solving process, you can read these:

  • Rattler(conda) resolver blog
  • UV(PyPI) resolver blog

Pixi solves both the conda and PyPI dependencies, where the PyPI dependencies use the conda packages as a base, so you can be sure that the packages are compatible with each other. The solvers are split between the rattler and uv libraries, which do the heavy lifting of the solving process using our custom SAT solver: resolvo. resolvo is able to solve multiple ecosystems, like conda and PyPI. It implements a lazy solving process for PyPI packages, which means that it only downloads the metadata of the packages that are needed to solve the environment. It also supports the conda way of solving, which downloads the metadata of all the packages at once and then solves in one go.

For the [pypi-dependencies], uv implements sdist building to retrieve the metadata of the packages, and wheel building to install them. For this build step, pixi requires python to be installed first via the (conda) [dependencies] section of the pixi.toml file. This will always be slower than a pure conda solve, so for the best pixi experience you should stay within the [dependencies] section of the pixi.toml file.

"},{"location":"features/environment/#caching-packages","title":"Caching packages","text":"

Pixi caches all previously downloaded packages in a cache folder. This cache folder is shared between all pixi projects and globally installed tools.

Normally the location would be the following platform-specific default cache folder:

  • Linux: $XDG_CACHE_HOME/rattler or $HOME/.cache/rattler
  • macOS: $HOME/Library/Caches/rattler
  • Windows: %LOCALAPPDATA%\\rattler

This location is configurable by setting the PIXI_CACHE_DIR or RATTLER_CACHE_DIR environment variable.
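The lookup order described above can be sketched as follows. This is an illustrative Python sketch of the precedence, not pixi's actual implementation:

```python
import sys
from pathlib import Path

def pixi_cache_dir(env: dict) -> Path:
    """Resolve the cache directory using the documented precedence."""
    # Explicit overrides win; PIXI_CACHE_DIR is checked before RATTLER_CACHE_DIR.
    for var in ("PIXI_CACHE_DIR", "RATTLER_CACHE_DIR"):
        if env.get(var):
            return Path(env[var])
    # Otherwise fall back to the platform-specific default listed above.
    if sys.platform == "win32":
        return Path(env["LOCALAPPDATA"]) / "rattler"
    if sys.platform == "darwin":
        return Path(env["HOME"]) / "Library" / "Caches" / "rattler"
    xdg = env.get("XDG_CACHE_HOME")
    base = Path(xdg) if xdg else Path(env["HOME"]) / ".cache"
    return base / "rattler"
```

The platform default is only consulted when neither environment variable is set.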

When you want to clean the cache, you can simply delete the cache directory, and pixi will re-create the cache when needed.

The cache contains multiple folders concerning different caches from within pixi.

  • pkgs: Contains the downloaded/unpacked conda packages.
  • repodata: Contains the conda repodata cache.
  • uv-cache: Contains the uv cache. This includes multiple caches, e.g. built wheels and wheel archives.
  • http-cache: Contains the conda-pypi mapping cache.
"},{"location":"features/global_tools/","title":"Pixi Global","text":"

Pixi global demo

With pixi global, users can manage globally installed tools in a way that makes them available from any directory. This means that the pixi environment will be placed in a global location, and the tools will be exposed to the system PATH, allowing you to run them from the command line.

"},{"location":"features/global_tools/#basic-usage","title":"Basic Usage","text":"

Running the following command installs rattler-build on your system.

pixi global install rattler-build\n

What's great about pixi global is that, by default, it isolates each package in its own environment, exposing only the necessary entry points. This means you don't have to worry about removing a package and accidentally breaking seemingly unrelated packages. This behavior is quite similar to that of pipx.

However, there are times when you may want multiple dependencies in the same environment. For instance, while ipython is really useful on its own, it becomes much more useful when numpy and matplotlib are available to it.

Let's execute the following command:

pixi global install ipython --with numpy --with matplotlib\n

numpy exposes executables, but since it was added via --with, its executables are not exposed.

Importing numpy and matplotlib now works as expected.

ipython -c 'import numpy; import matplotlib'\n

At some point, you might want to install multiple versions of the same package on your system. Since they will all be available on the system PATH, they need to be exposed under different names.

Let's check out the following command:

pixi global install --expose py3=python \"python=3.12\"\n

By specifying --expose we declared that we want to expose the executable python under the name py3. The python package ships more executables, but since we used --expose explicitly, they are not auto-exposed.

You can run py3 to start the python interpreter.

py3 -c \"print('Hello World')\"\n

"},{"location":"features/global_tools/#the-global-manifest","title":"The Global Manifest","text":"

Since v0.33.0, pixi has a new manifest file that is created in the global directory. It contains the list of globally installed environments, their dependencies, and their exposed binaries. The manifest can be edited, synced, checked into a version control system, and shared with others.

Running the commands from the previous section results in the following manifest:

version = 1\n\n[envs.rattler-build]\nchannels = [\"conda-forge\"]\ndependencies = { rattler-build = \"*\" }\nexposed = { rattler-build = \"rattler-build\" }\n\n[envs.ipython]\nchannels = [\"conda-forge\"]\ndependencies = { ipython = \"*\", numpy = \"*\", matplotlib = \"*\" }\nexposed = { ipython = \"ipython\", ipython3 = \"ipython3\" }\n\n[envs.python]\nchannels = [\"conda-forge\"]\ndependencies = { python = \"3.12.*\" } # (1)!\nexposed = { py3 = \"python\" } # (2)!\n

  1. Dependencies are the packages that will be installed in the environment. You can specify the version or use a wildcard.
  2. The exposed binaries are the ones that will be available in the system path. In this case, python is exposed under the name py3.
"},{"location":"features/global_tools/#manifest-locations","title":"Manifest locations","text":"

The manifest can be found at the following locations, depending on your operating system. Run pixi info to find the currently used manifest on your system.

Linux
  • Priority 4: $PIXI_HOME/manifests/pixi-global.toml (global manifest in PIXI_HOME)
  • Priority 3: $HOME/.pixi/manifests/pixi-global.toml (user home directory)
  • Priority 2: $XDG_CONFIG_HOME/pixi/manifests/pixi-global.toml (XDG compliant config directory)
  • Priority 1: $HOME/.config/pixi/manifests/pixi-global.toml (config directory)

macOS
  • Priority 3: $PIXI_HOME/manifests/pixi-global.toml (global manifest in PIXI_HOME)
  • Priority 2: $HOME/.pixi/manifests/pixi-global.toml (user home directory)
  • Priority 1: $HOME/Library/Application Support/pixi/manifests/pixi-global.toml (config directory)

Windows
  • Priority 3: $PIXI_HOME\manifests\pixi-global.toml (global manifest in PIXI_HOME)
  • Priority 2: %USERPROFILE%\.pixi\manifests\pixi-global.toml (user home directory)
  • Priority 1: %APPDATA%\pixi\manifests\pixi-global.toml (config directory)

Note

If multiple locations exist, the manifest with the highest priority will be used.
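This priority resolution can be sketched as follows, assuming the Linux locations listed above (this is an illustration, not pixi's actual code; in particular, the fallback when no manifest exists yet is an assumption of this sketch):

```python
from pathlib import Path

def manifest_candidates(env: dict) -> list[Path]:
    """Candidate manifest paths on Linux, highest priority first."""
    paths = []
    if env.get("PIXI_HOME"):  # priority 4
        paths.append(Path(env["PIXI_HOME"]) / "manifests" / "pixi-global.toml")
    home = Path(env["HOME"])
    paths.append(home / ".pixi" / "manifests" / "pixi-global.toml")  # priority 3
    if env.get("XDG_CONFIG_HOME"):  # priority 2
        paths.append(Path(env["XDG_CONFIG_HOME"]) / "pixi" / "manifests" / "pixi-global.toml")
    paths.append(home / ".config" / "pixi" / "manifests" / "pixi-global.toml")  # priority 1
    return paths

def active_manifest(env: dict) -> Path:
    candidates = manifest_candidates(env)
    for p in candidates:
        if p.is_file():  # use the highest-priority manifest that exists
            return p
    return candidates[0]  # illustrative fallback when none exists yet
```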

"},{"location":"features/global_tools/#channels","title":"Channels","text":"

The channels are the conda channels that will be used to search for packages. They are ordered by priority: the first channel has the highest priority, and if a package is not found there, the next one is used. For example, running:

pixi global install --channel conda-forge --channel bioconda snakemake\n
Results in the following entry in the manifest:
[envs.snakemake]\nchannels = [\"conda-forge\", \"bioconda\"]\ndependencies = { snakemake = \"*\" }\nexposed = { snakemake = \"snakemake\" }\n

More information on channels can be found here.
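The fallback behavior can be illustrated with a small sketch; the index structure here is a hypothetical stand-in for real channel repodata:

```python
def find_package(channels: list[str], index: dict, name: str):
    """Return (channel, record) for the first channel that carries `name`.

    `index` maps channel name -> {package name: record}; earlier channels
    have higher priority, matching the documented lookup order.
    """
    for ch in channels:
        if name in index.get(ch, {}):
            return ch, index[ch][name]
    return None  # package not found in any channel
```

With channels = ["conda-forge", "bioconda"], a package present in both is taken from conda-forge, while one only present in bioconda falls through to it.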

"},{"location":"features/global_tools/#automatic-exposed","title":"Automatic Exposed","text":"

There is some additional automatic behavior: if you install a package with the same name as the environment, it will be exposed under that name, even if the binary is only provided by a dependency of the package. For example, running:

pixi global install ansible\n
will create the following entry in the manifest:
[envs.ansible]\nchannels = [\"conda-forge\"]\ndependencies = { ansible = \"*\" }\nexposed = { ansible = \"ansible\" } # (1)!\n

  1. The ansible binary is exposed even though it is installed by a dependency of ansible, the ansible-core package.

It's also possible to expose an executable that is located in a nested directory. For example, the dotnet.exe executable is located in a dotnet folder, so to expose dotnet you must specify its relative path:

pixi global install dotnet --expose dotnet=dotnet\\dotnet\n

Which will create the following entry in the manifest:

[envs.dotnet]\nchannels = [\"conda-forge\"]\ndependencies = { dotnet = \"*\" }\nexposed = { dotnet = 'dotnet\\dotnet' }\n

"},{"location":"features/global_tools/#dependencies","title":"Dependencies","text":"

Dependencies are the Conda packages that will be installed into your environment. For example, running:

pixi global install \"python<3.12\"\n
creates the following entry in the manifest:
[envs.python]\nchannels = [\"conda-forge\"]\ndependencies = { python = \"<3.12\" }\n# ...\n
Typically, you'd specify just the tool you're installing, but you can add more packages if needed. Specifying the environment to install into allows you to add multiple dependencies at once. For example, running:
pixi global install --environment my-env git vim python\n
will create the following entry in the manifest:
[envs.my-env]\nchannels = [\"conda-forge\"]\ndependencies = { git = \"*\", vim = \"*\", python = \"*\" }\n# ...\n

You can add a dependency to an existing environment by running:

pixi global install --environment my-env package-a package-b\n
These packages will be added as dependencies to the my-env environment, but their binaries won't be auto-exposed.

You can remove dependencies by running:

pixi global remove --environment my-env package-a package-b\n

"},{"location":"features/global_tools/#trampolines","title":"Trampolines","text":"

To increase efficiency, pixi uses trampolines: small, specialized binary files that manage configuration and environment setup before executing the main binary. This approach lets pixi skip running activation scripts, which have a significant performance impact.

When you execute a globally installed binary, the trampoline performs the following sequence of steps:

  • Each trampoline first reads a configuration file named after the binary being executed. This configuration file, in JSON format (e.g., python.json), contains key information about how the environment should be set up. The configuration file is stored in .pixi/bin/trampoline_configuration.
  • Once the configuration is loaded and the environment is set, the trampoline executes the original binary with the correct environment settings.
  • When installing a new binary, a new trampoline is placed in the .pixi/bin directory and is hard-linked to .pixi/bin/trampoline_configuration/trampoline_bin. This saves storage space and avoids duplicating the same trampoline.
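The steps above can be sketched roughly as follows. The JSON keys used here ("exe" and "env") are assumptions for illustration; the real configuration format is internal to pixi:

```python
import json
import os
from pathlib import Path

def load_trampoline(config_path: Path, base_env: dict) -> tuple[str, dict]:
    """Read the per-binary JSON config and compute the process environment.

    The keys used here ("exe", "env") are illustrative assumptions.
    """
    config = json.loads(config_path.read_text())
    env = dict(base_env)
    env.update(config.get("env", {}))  # stored activation variables win
    return config["exe"], env

def run_via_trampoline(bin_name: str, pixi_bin: Path) -> None:
    """Illustrative trampoline flow: load config, then exec the real binary."""
    config = pixi_bin / "trampoline_configuration" / f"{bin_name}.json"
    exe, env = load_trampoline(config, dict(os.environ))
    os.execve(exe, [exe], env)  # replace this process with the real binary
```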
"},{"location":"features/global_tools/#example-adding-a-series-of-tools-at-once","title":"Example: Adding a series of tools at once","text":"

Without specifying an environment, you can add multiple tools at once:

pixi global install pixi-pack rattler-build\n
This command generates the following entry in the manifest:
[envs.pixi-pack]\nchannels = [\"conda-forge\"]\ndependencies= { pixi-pack = \"*\" }\nexposed = { pixi-pack = \"pixi-pack\" }\n\n[envs.rattler-build]\nchannels = [\"conda-forge\"]\ndependencies = { rattler-build = \"*\" }\nexposed = { rattler-build = \"rattler-build\" }\n
This creates two separate, non-interfering environments, while exposing only the minimum required binaries.

"},{"location":"features/global_tools/#example-creating-a-data-science-sandbox-environment","title":"Example: Creating a Data Science Sandbox Environment","text":"

You can create an environment with multiple tools using the following command:

pixi global install --environment data-science --expose jupyter --expose ipython jupyter numpy pandas matplotlib ipython\n
This command generates the following entry in the manifest:
[envs.data-science]\nchannels = [\"conda-forge\"]\ndependencies = { jupyter = \"*\", ipython = \"*\" }\nexposed = { jupyter = \"jupyter\", ipython = \"ipython\" }\n
In this setup, both jupyter and ipython are exposed from the data-science environment, allowing you to run:
> ipython\n# Or\n> jupyter lab\n
These commands will be available globally, making it easy to access your preferred tools without switching environments.

"},{"location":"features/global_tools/#example-install-packages-for-a-different-platform","title":"Example: Install packages for a different platform","text":"

You can install packages for a different platform using the --platform flag. This is useful when you want to install packages for a different platform, such as osx-64 packages on osx-arm64. For example, running this on osx-arm64:

pixi global install --platform osx-64 python\n
will create the following entry in the manifest:
[envs.python]\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-64\"]\ndependencies = { python = \"*\" }\n# ...\n

"},{"location":"features/global_tools/#potential-future-features","title":"Potential Future Features","text":""},{"location":"features/global_tools/#pypi-support","title":"PyPI support","text":"

We could support packages from PyPI via a command like this:

pixi global install --pypi flask\n
"},{"location":"features/global_tools/#lock-file","title":"Lock file","text":"

A lock file is less important for global tools. However, there is demand for one, and users who don't care about it should not be negatively impacted.

"},{"location":"features/global_tools/#multiple-manifests","title":"Multiple manifests","text":"

We could go for one default manifest, but also parse other manifests in the same directory. The only requirement for a file to be parsed as a manifest is a .toml extension. In order to modify those with the CLI, one would have to add a --manifest option to select the correct one.

  • pixi-global.toml: Default
  • pixi-global-company-tools.toml
  • pixi-global-from-my-dotfiles.toml

It is unclear whether the first implementation already needs to support this. At the very least, we should put the manifest into its own folder, like ~/.pixi/global/manifests/pixi-global.toml.

"},{"location":"features/global_tools/#no-activation","title":"No activation","text":"

The current pixi global install features --no-activation. When this flag is set, CONDA_PREFIX and PATH will not be set when running the exposed executable. This is useful when installing Python package managers or shells.

Assuming that this needs to be set per mapping, one way to expose this functionality would be to allow the following:

[envs.pip.exposed]\npip = { executable = \"pip\", activation = false }\n
"},{"location":"features/lockfile/","title":"The pixi.lock lock file","text":"

A lock file is the protector of the environments, and pixi is the key to unlock it.

"},{"location":"features/lockfile/#what-is-a-lock-file","title":"What is a lock file?","text":"

A lock file locks the environment in a specific state. Within pixi a lock file is a description of the packages in an environment. The lock file contains two definitions:

  • The environments that are used in the project with their complete set of packages. e.g.:

    environments:\n    default:\n        channels:\n          - url: https://conda.anaconda.org/conda-forge/\n        packages:\n            linux-64:\n            ...\n            - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.12.2-hab00c5b_0_cpython.conda\n            ...\n            osx-64:\n            ...\n            - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda\n            ...\n
    • The definition of the packages themselves. e.g.:

      - kind: conda\n  name: python\n  version: 3.12.2\n  build: h9f0c242_0_cpython\n  subdir: osx-64\n  url: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda\n  sha256: 7647ac06c3798a182a4bcb1ff58864f1ef81eb3acea6971295304c23e43252fb\n  md5: 0179b8007ba008cf5bec11f3b3853902\n  depends:\n    - bzip2 >=1.0.8,<2.0a0\n    - libexpat >=2.5.0,<3.0a0\n    - libffi >=3.4,<4.0a0\n    - libsqlite >=3.45.1,<4.0a0\n    - libzlib >=1.2.13,<1.3.0a0\n    - ncurses >=6.4,<7.0a0\n    - openssl >=3.2.1,<4.0a0\n    - readline >=8.2,<9.0a0\n    - tk >=8.6.13,<8.7.0a0\n    - tzdata\n    - xz >=5.2.6,<6.0a0\n  constrains:\n    - python_abi 3.12.* *_cp312\n  license: Python-2.0\n  size: 14596811\n  timestamp: 1708118065292\n
"},{"location":"features/lockfile/#why-a-lock-file","title":"Why a lock file","text":"

Pixi uses the lock file for the following reasons:

  • To save a working installation state, without copying the entire environment's data.
  • To ensure the project configuration is aligned with the installed environment.
  • To give the user a file that contains all the information about the environment.

This gives you (and your collaborators) a way to truly reproduce the environment you are working in. Using tools such as docker suddenly becomes much less necessary.

"},{"location":"features/lockfile/#when-is-a-lock-file-generated","title":"When is a lock file generated?","text":"

A lock file is generated when you install a package. More specifically, it is generated from the solve step of the installation process. The solve returns the list of packages to install, and the lock file is generated from that list. This diagram illustrates the process:

graph TD\n    A[Install] --> B[Solve]\n    B --> C[Generate and write lock file]\n    C --> D[Install Packages]
"},{"location":"features/lockfile/#how-to-use-a-lock-file","title":"How to use a lock file","text":"

Do not edit the lock file

A lock file is a machine-only file and should not be edited by hand.

That said, the pixi.lock is human-readable, so it's easy to track the changes in the environment. We recommend you track the lock file in git or other version control systems. This will ensure that the environment is always reproducible and that you can always revert back to a working state, in case something goes wrong. The pixi.lock and the manifest file pixi.toml/pyproject.toml should always be in sync.

Running the following commands will check and automatically update the lock file if you changed any dependencies:

  • pixi install
  • pixi run
  • pixi shell
  • pixi shell-hook
  • pixi tree
  • pixi list
  • pixi add
  • pixi remove

All the commands that support the interaction with the lock file also include some lock file usage options:

  • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: only install if the pixi.lock is up-to-date with the manifest file[^1]. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.

Syncing the lock file with the manifest file

The lock file is always matched with the whole configuration in the manifest file. This means that if you change the manifest file, the lock file will be updated.

flowchart TD\n    C[manifest] --> A[lockfile] --> B[environment]

"},{"location":"features/lockfile/#lockfile-satisfiability","title":"Lockfile satisfiability","text":"

The lock file is a description of the environment, and it should always be satisfiable. Satisfiable means that the given manifest file and the created environment are in sync with the lockfile. If the lock file is not satisfiable, pixi will generate a new lock file automatically.

Steps to check if the lock file is satisfiable:

  • All environments in the manifest file are in the lock file
  • All channels in the manifest file are in the lock file
  • All packages in the manifest file are in the lock file, and the versions in the lock file are compatible with the requirements in the manifest file, for both conda and pypi packages.
    • Conda packages use a matchspec which can match on all the information we store in the lockfile, even timestamp, subdir and license.
  • If pypi-dependencies are added, all conda packages that are Python packages in the lock file have a purls field.
  • All hashes for the pypi editable packages are correct.
  • There is only a single entry for every package in the lock file.

If you want more details, check out the actual code, as this description is a simplification.
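As a rough illustration of the first few checks, consider the sketch below. Dependency specs are modeled as plain predicates here, whereas pixi uses full conda matchspecs and also handles PyPI packages, purls, and hashes:

```python
def lock_is_satisfiable(manifest: dict, lock: dict) -> bool:
    """Heavily simplified version of the satisfiability checks listed above."""
    for env_name, env in manifest["environments"].items():
        locked = lock["environments"].get(env_name)
        if locked is None:
            return False  # every manifest environment must be in the lock file
        if set(env["channels"]) != set(locked["channels"]):
            return False  # channels must match
        locked_pkgs = {p["name"]: p["version"] for p in locked["packages"]}
        for name, spec in env["dependencies"].items():
            if name not in locked_pkgs:
                return False  # every requested package must be locked
            if not spec(locked_pkgs[name]):
                return False  # locked version must satisfy the requirement
    return True
```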

"},{"location":"features/lockfile/#the-version-of-the-lock-file","title":"The version of the lock file","text":"

The lock file has a version number; this ensures that the lock file is compatible with the local version of pixi.

version: 4\n

Pixi is backward compatible with the lock file, but not forward compatible. This means that you can use an older lock file with a newer version of pixi, but not the other way around.
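In other words, the compatibility check is a simple comparison. A sketch, where the supported version number is just the example value from the snippet above:

```python
SUPPORTED_LOCK_VERSION = 4  # example value, taken from the snippet above

def lock_version_supported(lock_version: int) -> bool:
    """Backward compatible: older lock files are fine, newer ones are not."""
    return lock_version <= SUPPORTED_LOCK_VERSION
```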

"},{"location":"features/lockfile/#your-lock-file-is-big","title":"Your lock file is big","text":"

The lock file can grow quite large, especially if you have a lot of packages installed. This is because the lock file contains all the information about the packages.

  1. We try to keep the lock file as small as possible.
  2. It's always smaller than a docker image.
  3. Downloading the lock file is always faster than downloading the incorrect packages.
"},{"location":"features/lockfile/#you-dont-need-a-lock-file-because","title":"You don't need a lock file because...","text":"

If you cannot think of a case where you would benefit from a fast reproducible environment, then you don't need a lock file.

But take note of the following:

  • A lock file allows you to run the same environment on different machines, think CI systems.
  • It also allows you to go back to a working state if you have made a mistake.
  • It helps other users onboard to your project as they don't have to figure out the environment setup or solve dependency issues.
"},{"location":"features/lockfile/#removing-the-lock-file","title":"Removing the lock file","text":"

If you want to remove the lock file, you can simply delete it.

rm pixi.lock\n

This will remove the lock file, and the next time you run a command that requires the lock file, it will be generated again.

Note

This does remove the locked state of the environment, and the environment will be updated to the latest version of the packages.

"},{"location":"features/multi_environment/","title":"Multi Environment Support","text":""},{"location":"features/multi_environment/#motivating-example","title":"Motivating Example","text":"

There are multiple scenarios where multiple environments are useful.

  • Testing of multiple package versions, e.g. py39 and py310 or polars 0.12 and 0.13.
  • Smaller single tool environments, e.g. lint or docs.
  • Large developer environments, that combine all the smaller environments, e.g. dev.
  • Strict supersets of environments, e.g. prod and test-prod where test-prod is a strict superset of prod.
  • Multiple machines from one project, e.g. a cuda environment and a cpu environment.
  • And many more. (Feel free to edit this document in our GitHub and add your use case.)

This prepares pixi for use in large projects with multiple use-cases, multiple developers and different CI needs.

"},{"location":"features/multi_environment/#design-considerations","title":"Design Considerations","text":"

There are a few things we wanted to keep in mind in the design:

  1. User-friendliness: Pixi is a user-focused tool that goes beyond developers. The feature should have good error reporting and helpful documentation from the start.
  2. Keep it simple: Not understanding the multiple environments feature shouldn't limit users from using pixi. The feature should be "invisible" for non-multi-env use-cases.
  3. No Automatic Combinatorial: To keep the dependency resolution process manageable, the solution should avoid a combinatorial explosion of dependency sets. Environments are therefore user-defined, not automatically inferred by testing a matrix of features.
  4. Single environment Activation: The design should allow only one environment to be active at any given time, simplifying the resolution process and preventing conflicts.
  5. Fixed lock files: It's crucial to preserve fixed lock files for consistency and predictability. Solutions must ensure reliability not just for authors but also for end-users, particularly at the time of lock file creation.
"},{"location":"features/multi_environment/#feature-environment-set-definitions","title":"Feature & Environment Set Definitions","text":"

Introduce environment sets into the pixi.toml; these describe environments based on features. Introduce features into the pixi.toml that can describe parts of environments. As an environment goes beyond just dependencies, a feature can include the following fields:

  • dependencies: The conda package dependencies
  • pypi-dependencies: The pypi package dependencies
  • system-requirements: The system requirements of the environment
  • activation: The activation information for the environment
  • platforms: The platforms the environment can be run on.
  • channels: The channels used to create the environment. Adding the priority field to the channels to allow concatenation of channels instead of overwriting.
  • target: All the above features but also separated by targets.
  • tasks: Feature specific tasks, tasks in one environment are selected as default tasks for the environment.
Default features
[dependencies] # short for [feature.default.dependencies]\npython = \"*\"\nnumpy = \"==2.3\"\n\n[pypi-dependencies] # short for [feature.default.pypi-dependencies]\npandas = \"*\"\n\n[system-requirements] # short for [feature.default.system-requirements]\nlibc = \"2.33\"\n\n[activation] # short for [feature.default.activation]\nscripts = [\"activate.sh\"]\n
Different dependencies per feature
[feature.py39.dependencies]\npython = \"~=3.9.0\"\n[feature.py310.dependencies]\npython = \"~=3.10.0\"\n[feature.test.dependencies]\npytest = \"*\"\n
Full set of environment modification in one feature
[feature.cuda]\ndependencies = {cuda = \"x.y.z\", cudnn = \"12.0\"}\npypi-dependencies = {torch = \"1.9.0\"}\nplatforms = [\"linux-64\", \"osx-arm64\"]\nactivation = {scripts = [\"cuda_activation.sh\"]}\nsystem-requirements = {cuda = \"12\"}\n# Channels concatenate using a priority instead of overwrite, so the default channels are still used.\n# Using the priority the concatenation is controlled, default is 0, the default channels are used last.\n# Highest priority comes first.\nchannels = [\"nvidia\", {channel = \"pytorch\", priority = -1}] # Results in:  [\"nvidia\", \"conda-forge\", \"pytorch\"] when the default is `conda-forge`\ntasks = { warmup = \"python warmup.py\" }\ntarget.osx-arm64 = {dependencies = {mlx = \"x.y.z\"}}\n
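The channel concatenation rule from the comments above could be sketched like this. The exact tie-breaking between feature and default channels at equal priority is an assumption of this sketch:

```python
def merge_channels(feature_channels: list, default_channels: list[str]) -> list[str]:
    """Concatenate feature and default channels by priority, highest first.

    Plain strings get priority 0; dict entries carry an explicit priority.
    Default channels also sit at priority 0 but, by assumption, sort after
    feature channels on ties.
    """
    entries = []
    for i, ch in enumerate(feature_channels):
        if isinstance(ch, dict):
            entries.append((ch["priority"], 0, i, ch["channel"]))
        else:
            entries.append((0, 0, i, ch))
    for i, ch in enumerate(default_channels):
        entries.append((0, 1, i, ch))
    entries.sort(key=lambda e: (-e[0], e[1], e[2]))  # highest priority first
    seen, out = set(), []
    for _, _, _, ch in entries:  # drop duplicates, keep first occurrence
        if ch not in seen:
            seen.add(ch)
            out.append(ch)
    return out
```

With the cuda feature above and conda-forge as the project default, this yields ["nvidia", "conda-forge", "pytorch"], matching the comment in the snippet.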
Define tasks as defaults of an environment
[feature.test.tasks]\ntest = \"pytest\"\n\n[environments]\ntest = [\"test\"]\n\n# `pixi run test` == `pixi run --environment test test`\n

The environment definition should contain the following fields:

  • features: Vec<Feature>: The features that are included in the environment set, which is also the default field in the environments.
  • solve-group: String: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional dependencies. For instance when testing a production environment with additional test dependencies.
Creating environments from features
[environments]\n# implicit: default = [\"default\"]\ndefault = [\"py39\"] # implicit: default = [\"py39\", \"default\"]\npy310 = [\"py310\"] # implicit: py310 = [\"py310\", \"default\"]\ntest = [\"test\"] # implicit: test = [\"test\", \"default\"]\ntest39 = [\"test\", \"py39\"] # implicit: test39 = [\"test\", \"py39\", \"default\"]\n
Testing a production environment with additional dependencies
[environments]\n# Creating a `prod` environment which is the minimal set of dependencies used for production.\nprod = {features = [\"py39\"], solve-group = \"prod\"}\n# Creating a `test_prod` environment which is the `prod` environment plus the `test` feature.\ntest_prod = {features = [\"py39\", \"test\"], solve-group = \"prod\"}\n# Using the `solve-group` to solve the `prod` and `test_prod` environments together\n# Which makes sure the tested environment has the same version of the dependencies as the production environment.\n
Creating environments without including the default feature
[dependencies]\npython = \"*\"\nnumpy = \"*\"\n\n[feature.lint.dependencies]\npre-commit = \"*\"\n\n[environments]\n# Create a custom environment which only has the `lint` feature (numpy isn't part of that env).\nlint = {features = [\"lint\"], no-default-feature = true}\n
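The implicit handling of the default feature shown in these examples can be sketched as follows (an illustration of the rules, not pixi's implementation):

```python
def resolve_features(env_spec) -> list[str]:
    """Expand an [environments] entry into its full feature list.

    Plain lists implicitly append the "default" feature; the table form can
    opt out with no-default-feature, as in the `lint` example above.
    """
    if isinstance(env_spec, list):
        features, no_default = list(env_spec), False
    else:
        features = list(env_spec["features"])
        no_default = env_spec.get("no-default-feature", False)
    if not no_default and "default" not in features:
        features.append("default")
    return features
```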
"},{"location":"features/multi_environment/#lock-file-structure","title":"lock file Structure","text":"

Within the pixi.lock file, a package may now include an additional environments field, specifying the environments to which it belongs. To avoid duplication, a package's environments field may list multiple environments, keeping the lock file minimal in size.

- platform: linux-64\n  name: pre-commit\n  version: 3.3.3\n  category: main\n  environments:\n    - dev\n    - test\n    - lint\n  ...:\n- platform: linux-64\n  name: python\n  version: 3.9.3\n  category: main\n  environments:\n    - dev\n    - test\n    - lint\n    - py39\n    - default\n  ...:\n
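Reading the per-environment package list back out of such a lock file is then a simple filter; a sketch:

```python
def packages_for_env(lock_packages: list[dict], env: str) -> list[str]:
    """List package names locked for one environment.

    Each package record carries an `environments` field (as in the snippet
    above), so shared packages appear only once in the lock file.
    """
    return [p["name"] for p in lock_packages if env in p["environments"]]
```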
"},{"location":"features/multi_environment/#user-interface-environment-activation","title":"User Interface Environment Activation","text":"

Users can manually activate the desired environment via the command line or configuration. This approach guarantees a conflict-free environment by allowing only one feature set to be active at a time. For the user, the CLI looks like this:

Default behavior
\u279c pixi run python\n# Runs python in the `default` environment\n
Activating a specific environment
\u279c pixi run -e test pytest\n\u279c pixi run --environment test pytest\n# Runs `pytest` in the `test` environment\n
Activating a shell in an environment
\u279c pixi shell -e cuda\n\u279c pixi shell --environment cuda\n# Starts a shell in the `cuda` environment\n
Running any command in an environment
\u279c pixi run -e test any_command\n# Runs any_command in the `test` environment, which doesn't need to be predefined as a task.\n
"},{"location":"features/multi_environment/#ambiguous-environment-selection","title":"Ambiguous Environment Selection","text":"

It's possible to define tasks in multiple environments; in that case, the user is prompted to select the environment.

Here is a simple example of a task only manifest:

pixi.toml

[project]\nname = \"test_ambiguous_env\"\nchannels = []\nplatforms = [\"linux-64\", \"win-64\", \"osx-64\", \"osx-arm64\"]\n\n[tasks]\ndefault = \"echo Default\"\nambi = \"echo Ambi::Default\"\n[feature.test.tasks]\ntest = \"echo Test\"\nambi = \"echo Ambi::Test\"\n\n[feature.dev.tasks]\ndev = \"echo Dev\"\nambi = \"echo Ambi::Dev\"\n\n[environments]\ndefault = [\"test\", \"dev\"]\ntest = [\"test\"]\ndev = [\"dev\"]\n
Trying to run the ambi task will prompt the user to select an environment, as the task is available in all environments.

Interactive selection of environments if task is in multiple environments
\u279c pixi run ambi\n? The task 'ambi' can be run in multiple environments.\n\nPlease select an environment to run the task in: \u203a\n\u276f default # selecting default\n  test\n  dev\n\n\u2728 Pixi task (ambi in default): echo Ambi::Test\nAmbi::Test\n

As you can see, it runs the task defined in the test feature, even though the default environment was selected. This happens because the ambi task defined in the test feature overwrites the one defined in [tasks], so the tasks.default version is unreachable from any environment.

Some other results running in this example:

\u279c pixi run --environment test ambi\n\u2728 Pixi task (ambi in test): echo Ambi::Test\nAmbi::Test\n\n\u279c pixi run --environment dev ambi\n\u2728 Pixi task (ambi in dev): echo Ambi::Dev\nAmbi::Dev\n\n# dev is run in the default environment\n\u279c pixi run dev\n\u2728 Pixi task (dev in default): echo Dev\nDev\n\n# dev is run in the dev environment\n\u279c pixi run -e dev dev\n\u2728 Pixi task (dev in dev): echo Dev\nDev\n

"},{"location":"features/multi_environment/#important-links","title":"Important links","text":"
  • Initial writeup of the proposal: GitHub Gist by 0xbe7a
  • GitHub project: #10
"},{"location":"features/multi_environment/#real-world-example-use-cases","title":"Real world example use cases","text":"Polarify test setup

In polarify they want to test multiple Python versions combined with multiple versions of polars. This is currently done using a matrix in GitHub Actions, which can be replaced by multiple environments.

pixi.toml
[project]\nname = \"polarify\"\n# ...\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tasks]\npostinstall = \"pip install --no-build-isolation --no-deps --disable-pip-version-check -e .\"\n\n[dependencies]\npython = \">=3.9\"\npip = \"*\"\npolars = \">=0.14.24,<0.21\"\n\n[feature.py39.dependencies]\npython = \"3.9.*\"\n[feature.py310.dependencies]\npython = \"3.10.*\"\n[feature.py311.dependencies]\npython = \"3.11.*\"\n[feature.py312.dependencies]\npython = \"3.12.*\"\n[feature.pl017.dependencies]\npolars = \"0.17.*\"\n[feature.pl018.dependencies]\npolars = \"0.18.*\"\n[feature.pl019.dependencies]\npolars = \"0.19.*\"\n[feature.pl020.dependencies]\npolars = \"0.20.*\"\n\n[feature.test.dependencies]\npytest = \"*\"\npytest-md = \"*\"\npytest-emoji = \"*\"\nhypothesis = \"*\"\n[feature.test.tasks]\ntest = \"pytest\"\n\n[feature.lint.dependencies]\npre-commit = \"*\"\n[feature.lint.tasks]\nlint = \"pre-commit run --all\"\n\n[environments]\npl017 = [\"pl017\", \"py39\", \"test\"]\npl018 = [\"pl018\", \"py39\", \"test\"]\npl019 = [\"pl019\", \"py39\", \"test\"]\npl020 = [\"pl020\", \"py39\", \"test\"]\npy39 = [\"py39\", \"test\"]\npy310 = [\"py310\", \"test\"]\npy311 = [\"py311\", \"test\"]\npy312 = [\"py312\", \"test\"]\n
.github/workflows/test.yml
jobs:\n  tests-per-env:\n    runs-on: ubuntu-latest\n    strategy:\n      matrix:\n        environment: [py311, py312]\n    steps:\n    - uses: actions/checkout@v4\n      - uses: prefix-dev/setup-pixi@v0.5.1\n        with:\n          environments: ${{ matrix.environment }}\n      - name: Run tasks\n        run: |\n          pixi run --environment ${{ matrix.environment }} test\n  tests-with-multiple-envs:\n    runs-on: ubuntu-latest\n    steps:\n    - uses: actions/checkout@v4\n    - uses: prefix-dev/setup-pixi@v0.5.1\n      with:\n       environments: pl017 pl018\n    - run: |\n        pixi run -e pl017 test\n        pixi run -e pl018 test\n
Test vs Production example

This is an example of a project with a test feature and a prod environment. The prod environment is a production environment that contains the run dependencies. The test feature is a set of dependencies and tasks that we want to put on top of the previously solved prod environment. This is a common use case where we want to test the production environment with additional dependencies.

pixi.toml

[project]\nname = \"my-app\"\n# ...\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-arm64\", \"linux-64\"]\n\n[tasks]\npostinstall-e = \"pip install --no-build-isolation --no-deps --disable-pip-version-check -e .\"\npostinstall = \"pip install --no-build-isolation --no-deps --disable-pip-version-check .\"\ndev = \"uvicorn my_app.app:main --reload\"\nserve = \"uvicorn my_app.app:main\"\n\n[dependencies]\npython = \">=3.12\"\npip = \"*\"\npydantic = \">=2\"\nfastapi = \">=0.105.0\"\nsqlalchemy = \">=2,<3\"\nuvicorn = \"*\"\naiofiles = \"*\"\n\n[feature.test.dependencies]\npytest = \"*\"\npytest-md = \"*\"\npytest-asyncio = \"*\"\n[feature.test.tasks]\ntest = \"pytest --md=report.md\"\n\n[environments]\n# both default and prod will have exactly the same dependency versions when they share a dependency\ndefault = {features = [\"test\"], solve-group = \"prod-group\"}\nprod = {features = [], solve-group = \"prod-group\"}\n
In CI you would run the following commands:
pixi run postinstall-e && pixi run test\n
Locally you would run the following command:
pixi run postinstall-e && pixi run dev\n

Then in a Dockerfile you would run the following command: Dockerfile

FROM ghcr.io/prefix-dev/pixi:latest # this doesn't exist yet\nWORKDIR /app\nCOPY . .\nRUN pixi run --environment prod postinstall\nEXPOSE 8080\nCMD [\"/usr/local/bin/pixi\", \"run\", \"--environment\", \"prod\", \"serve\"]\n

Multiple machines from one project

This is an example of an ML project that should be executable on a machine that supports cuda and mlx. It should also be executable on machines that support neither; for those we use the cpu feature.

pixi.toml
[project]\nname = \"my-ml-project\"\ndescription = \"A project that does ML stuff\"\nauthors = [\"Your Name <your.name@gmail.com>\"]\nchannels = [\"conda-forge\", \"pytorch\"]\n# All platforms that are supported by the project as the features will take the intersection of the platforms defined there.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[tasks]\ntrain-model = \"python train.py\"\nevaluate-model = \"python test.py\"\n\n[dependencies]\npython = \"3.11.*\"\npytorch = {version = \">=2.0.1\", channel = \"pytorch\"}\ntorchvision = {version = \">=0.15\", channel = \"pytorch\"}\npolars = \">=0.20,<0.21\"\nmatplotlib-base = \">=3.8.2,<3.9\"\nipykernel = \">=6.28.0,<6.29\"\n\n[feature.cuda]\nplatforms = [\"win-64\", \"linux-64\"]\nchannels = [\"nvidia\", {channel = \"pytorch\", priority = -1}]\nsystem-requirements = {cuda = \"12.1\"}\n\n[feature.cuda.tasks]\ntrain-model = \"python train.py --cuda\"\nevaluate-model = \"python test.py --cuda\"\n\n[feature.cuda.dependencies]\npytorch-cuda = {version = \"12.1.*\", channel = \"pytorch\"}\n\n[feature.mlx]\nplatforms = [\"osx-arm64\"]\n# MLX is only available on macOS >=13.5 (>14.0 is recommended)\nsystem-requirements = {macos = \"13.5\"}\n\n[feature.mlx.tasks]\ntrain-model = \"python train.py --mlx\"\nevaluate-model = \"python test.py --mlx\"\n\n[feature.mlx.dependencies]\nmlx = \">=0.16.0,<0.17.0\"\n\n[feature.cpu]\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[environments]\ncuda = [\"cuda\"]\nmlx = [\"mlx\"]\ndefault = [\"cpu\"]\n
Running the project on a cuda machine
pixi run train-model --environment cuda\n# will execute `python train.py --cuda`\n# fails if not on linux-64 or win-64 with cuda 12.1\n
Running the project with mlx
pixi run train-model --environment mlx\n# will execute `python train.py --mlx`\n# fails if not on osx-arm64\n
Running the project on a machine without cuda or mlx
pixi run train-model\n
"},{"location":"features/multi_platform_configuration/","title":"Multi platform config","text":"

Pixi's vision includes being supported on all major platforms. Sometimes that needs some extra configuration to work well. On this page, you will learn what you can configure to align better with the platform you are making your application for.

Here is an example manifest file that highlights some of the features:

pixi.toml
[project]\n# Default project info....\n# A list of platforms you are supporting with your package.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[dependencies]\npython = \">=3.8\"\n\n[target.win-64.dependencies]\n# Overwrite the needed python version only on win-64\npython = \"3.7\"\n\n\n[activation]\nscripts = [\"setup.sh\"]\n\n[target.win-64.activation]\n# Overwrite activation scripts only for windows\nscripts = [\"setup.bat\"]\n
pyproject.toml
[tool.pixi.project]\n# Default project info....\n# A list of platforms you are supporting with your package.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[tool.pixi.dependencies]\npython = \">=3.8\"\n\n[tool.pixi.target.win-64.dependencies]\n# Overwrite the needed python version only on win-64\npython = \"~=3.7.0\"\n\n\n[tool.pixi.activation]\nscripts = [\"setup.sh\"]\n\n[tool.pixi.target.win-64.activation]\n# Overwrite activation scripts only for windows\nscripts = [\"setup.bat\"]\n
"},{"location":"features/multi_platform_configuration/#platform-definition","title":"Platform definition","text":"

The project.platforms defines which platforms your project supports. When multiple platforms are defined, pixi determines which dependencies to install for each platform individually. All of this is stored in a lock file.
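For reference, a minimal sketch of such a platform declaration in pixi.toml (the project name is illustrative):

```toml
[project]
name = "my-project"               # illustrative name
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64", "win-64"]
```

With this in place, pixi solves and locks the dependencies for each of the three listed platforms separately.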

Running pixi install on a platform that is not configured will warn the user that the project is not set up for that platform:

\u276f pixi install\n  \u00d7 the project is not configured for your current platform\n   \u256d\u2500[pixi.toml:6:1]\n 6 \u2502 channels = [\"conda-forge\"]\n 7 \u2502 platforms = [\"osx-64\", \"osx-arm64\", \"win-64\"]\n   \u00b7             \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n   \u00b7                             \u2570\u2500\u2500 add 'linux-64' here\n 8 \u2502\n   \u2570\u2500\u2500\u2500\u2500\n  help: The project needs to be configured to support your platform (linux-64).\n
"},{"location":"features/multi_platform_configuration/#target-specifier","title":"Target specifier","text":"

With the target specifier, you can overwrite the original configuration specifically for a single platform. If you are targeting a specific platform in your target specifier that was not specified in your project.platforms then pixi will throw an error.

"},{"location":"features/multi_platform_configuration/#dependencies","title":"Dependencies","text":"

It might happen that you want to install a certain dependency only on a specific platform, or you might want to use a different version on different platforms.

pixi.toml
[dependencies]\npython = \">=3.8\"\n\n[target.win-64.dependencies]\nmsmpi = \"*\"\npython = \"3.8\"\n

In the above example, we specify that we depend on msmpi only on Windows. We also specifically want python 3.8 when installing on Windows. This overwrites the dependencies from the generic set for that platform only; the other platforms are not touched.

You can use pixi's CLI to add these dependencies to the manifest file.

pixi add --platform win-64 posix\n

This also works for the host and build dependencies.

pixi add --host --platform win-64 posix\npixi add --build --platform osx-64 clang\n

Which results in this.

pixi.toml
[target.win-64.host-dependencies]\nposix = \"1.0.0.*\"\n\n[target.osx-64.build-dependencies]\nclang = \"16.0.6.*\"\n
"},{"location":"features/multi_platform_configuration/#activation","title":"Activation","text":"

Pixi's vision is to enable completely cross-platform projects, but you often need to run tools that aren't themselves cross-platform. Generated activation scripts are often in this category: the default scripts on Unix are bash, and on Windows they are bat.

To deal with this, you can define your activation scripts using the target definition.

pixi.toml

[activation]\nscripts = [\"setup.sh\", \"local_setup.bash\"]\n\n[target.win-64.activation]\nscripts = [\"setup.bat\", \"local_setup.bat\"]\n
When this project is run on win-64, it will only execute the target scripts, not the scripts specified in the default activation.scripts.

"},{"location":"features/system_requirements/","title":"System Requirements in pixi","text":"

System requirements define the minimal system specifications necessary during dependency resolution for a project. For instance, specifying a Unix system with a particular minimal libc version ensures that dependencies are compatible with the project's environment.

System specifications are closely related to virtual packages, allowing for flexible and accurate dependency management.

"},{"location":"features/system_requirements/#default-system-requirements","title":"Default System Requirements","text":"

The following configurations outline the default minimal system requirements for different operating systems:

LinuxWindowsosx-64osx-arm64
# Default system requirements for Linux\n[system-requirements]\nlinux = \"4.18\"\nlibc = { family = \"glibc\", version = \"2.28\" }\n

Windows currently has no minimal system requirements defined. If your project requires specific Windows configurations, you should define them accordingly.

# Default system requirements for macOS\n[system-requirements]\nmacos = \"13.0\"\n
# Default system requirements for macOS ARM64\n[system-requirements]\nmacos = \"13.0\"\n
"},{"location":"features/system_requirements/#customizing-system-requirements","title":"Customizing System Requirements","text":"

You only need to define system requirements if your project necessitates a different set from the defaults. This is common when installing environments on older or newer versions of operating systems.

"},{"location":"features/system_requirements/#adjusting-for-older-systems","title":"Adjusting for Older Systems","text":"

If you're encountering an error like:

\u00d7 The current system has a mismatching virtual package. The project requires '__linux' to be at least version '4.18' but the system has version '4.12.14'\n

This indicates that the project's system requirements are higher than your current system's specifications. To resolve this, you can lower the system requirements in your project's configuration:

[system-requirements]\nlinux = \"4.12.14\"\n

This adjustment informs the dependency resolver to accommodate the older system version.

"},{"location":"features/system_requirements/#using-cuda-in-pixi","title":"Using CUDA in pixi","text":"

To utilize CUDA in your project, you must specify the desired CUDA version in the system-requirements table. This ensures that CUDA is recognized and appropriately locked into the lock file if necessary.

Example Configuration

[system-requirements]\ncuda = \"12\"  # Replace \"12\" with the specific CUDA version you intend to use\n
"},{"location":"features/system_requirements/#setting-system-requirements-environment-specific","title":"Setting System Requirements environment specific","text":"

This can be set per feature in the manifest file.

[feature.cuda.system-requirements]\ncuda = \"12\"\n\n[environments]\ncuda = [\"cuda\"]\n
"},{"location":"features/system_requirements/#available-override-options","title":"Available Override Options","text":"

In certain scenarios, you might need to override the system requirements detected on your machine. This can be particularly useful when working on systems that do not meet the project's default requirements.

You can override virtual packages by setting the following environment variables:

  • CONDA_OVERRIDE_CUDA
    • Description: Sets the CUDA version.
    • Usage Example: CONDA_OVERRIDE_CUDA=11
  • CONDA_OVERRIDE_GLIBC
    • Description: Sets the glibc version.
    • Usage Example: CONDA_OVERRIDE_GLIBC=2.28
  • CONDA_OVERRIDE_OSX
    • Description: Sets the macOS version.
    • Usage Example: CONDA_OVERRIDE_OSX=13.0
"},{"location":"features/system_requirements/#additional-resources","title":"Additional Resources","text":"

For more detailed information on managing virtual packages and overriding system requirements, refer to the Conda Documentation.

"},{"location":"ide_integration/devcontainer/","title":"Use pixi inside of a devcontainer","text":"

VSCode Devcontainers are a popular tool to develop on a project with a consistent environment. They are also used in GitHub Codespaces which makes it a great way to develop on a project without having to install anything on your local machine.

To use pixi inside of a devcontainer, follow these steps:

Create a new directory .devcontainer in the root of your project. Then, create the following two files in the .devcontainer directory:

.devcontainer/Dockerfile
FROM mcr.microsoft.com/devcontainers/base:jammy\n\nARG PIXI_VERSION=v0.39.2\n\nRUN curl -L -o /usr/local/bin/pixi -fsSL --compressed \"https://github.com/prefix-dev/pixi/releases/download/${PIXI_VERSION}/pixi-$(uname -m)-unknown-linux-musl\" \\\n    && chmod +x /usr/local/bin/pixi \\\n    && pixi info\n\n# set some user and workdir settings to work nicely with vscode\nUSER vscode\nWORKDIR /home/vscode\n\nRUN echo 'eval \"$(pixi completion -s bash)\"' >> /home/vscode/.bashrc\n
.devcontainer/devcontainer.json
{\n    \"name\": \"my-project\",\n    \"build\": {\n      \"dockerfile\": \"Dockerfile\",\n      \"context\": \"..\",\n    },\n    \"customizations\": {\n      \"vscode\": {\n        \"settings\": {},\n        \"extensions\": [\"ms-python.python\", \"charliermarsh.ruff\", \"GitHub.copilot\"]\n      }\n    },\n    \"features\": {\n      \"ghcr.io/devcontainers/features/docker-in-docker:2\": {}\n    },\n    \"mounts\": [\"source=${localWorkspaceFolderBasename}-pixi,target=${containerWorkspaceFolder}/.pixi,type=volume\"],\n    \"postCreateCommand\": \"sudo chown vscode .pixi && pixi install\"\n}\n

Put .pixi in a mount

In the above example, we mount the .pixi directory into a volume. This is needed since the .pixi directory shouldn't be on a case-insensitive filesystem (the default on macOS and Windows) but instead in its own volume. Some conda packages (for example ncurses-feedstock#73) contain files that differ only in case, which leads to errors on case-insensitive filesystems.

"},{"location":"ide_integration/devcontainer/#secrets","title":"Secrets","text":"

If you want to authenticate to a private conda channel, you can add secrets to your devcontainer.

.devcontainer/devcontainer.json
{\n    \"build\": \"Dockerfile\",\n    \"context\": \"..\",\n    \"options\": [\n        \"--secret\",\n        \"id=prefix_dev_token,env=PREFIX_DEV_TOKEN\",\n    ],\n    // ...\n}\n
.devcontainer/Dockerfile
# ...\nRUN --mount=type=secret,id=prefix_dev_token,uid=1000 \\\n    test -s /run/secrets/prefix_dev_token \\\n    && pixi auth login --token \"$(cat /run/secrets/prefix_dev_token)\" https://repo.prefix.dev\n

These secrets need to be present either as an environment variable when starting the devcontainer locally or in your GitHub Codespaces settings under Secrets.

"},{"location":"ide_integration/jupyterlab/","title":"JupyterLab Integration","text":""},{"location":"ide_integration/jupyterlab/#basic-usage","title":"Basic usage","text":"

Using JupyterLab with pixi is very simple. You can just create a new pixi project and add the jupyterlab package to it. The full example is provided under the following GitHub link.

pixi init\npixi add jupyterlab\n

This will create a new pixi project and add the jupyterlab package to it. You can then start JupyterLab using the following command:

pixi run jupyter lab\n

If you want to add more \"kernels\" to JupyterLab, you can simply add them to your current project \u2013 as well as any dependencies from the scientific stack you might need.

pixi add bash_kernel ipywidgets matplotlib numpy pandas  # ...\n
"},{"location":"ide_integration/jupyterlab/#what-kernels-are-available","title":"What kernels are available?","text":"

You can easily install more \"kernels\" for JupyterLab. The conda-forge repository has a number of interesting additional kernels - not just Python!

  • bash_kernel A kernel for bash
  • xeus-cpp A C++ kernel based on the new clang-repl
  • xeus-cling A C++ kernel based on the slightly older Cling
  • xeus-lua A Lua kernel
  • xeus-sql A kernel for SQL
  • r-irkernel An R kernel
"},{"location":"ide_integration/jupyterlab/#advanced-usage","title":"Advanced usage","text":"

If you want to have only one instance of JupyterLab running but still want per-directory Pixi environments, you can use one of the kernels provided by the pixi-kernel package.

"},{"location":"ide_integration/jupyterlab/#configuring-jupyterlab","title":"Configuring JupyterLab","text":"

To get started, create a Pixi project, add jupyterlab and pixi-kernel and then start JupyterLab:

pixi init\npixi add jupyterlab pixi-kernel\npixi run jupyter lab\n

This will start JupyterLab and open it in your browser.

pixi-kernel searches for a manifest file, either pixi.toml or pyproject.toml, in the same directory as your notebook or in any parent directory. When it finds one, it will use the environment specified in the manifest file to start the kernel and run your notebooks.

"},{"location":"ide_integration/jupyterlab/#binder","title":"Binder","text":"

If you just want to check a JupyterLab environment running in the cloud using pixi-kernel, you can visit Binder.

"},{"location":"ide_integration/pycharm/","title":"PyCharm Integration","text":"

You can use PyCharm with pixi environments by using the conda shim provided by the pixi-pycharm package.

"},{"location":"ide_integration/pycharm/#how-to-use","title":"How to use","text":"

To get started, add pixi-pycharm to your pixi project.

pixi add pixi-pycharm\n

This will ensure that the conda shim is installed in your project's environment.

Having pixi-pycharm installed, you can now configure PyCharm to use your pixi environments. Go to the Add Python Interpreter dialog (bottom right corner of the PyCharm window) and select Conda Environment. Set Conda Executable to the full path of the conda file (on Windows: conda.bat) which is located in .pixi/envs/default/libexec. You can get the path using the following command:

Linux & macOSWindows
pixi run 'echo $CONDA_PREFIX/libexec/conda'\n
pixi run 'echo $CONDA_PREFIX\\\\libexec\\\\conda.bat'\n

This is an executable that tricks PyCharm into thinking it's the proper conda executable. Under the hood it redirects all calls to the corresponding pixi equivalent.

Use the conda shim from this pixi project

Please make sure that this is the conda shim from this pixi project and not another one. If you use multiple pixi projects, you might have to adjust the path accordingly as PyCharm remembers the path to the conda executable.

Having selected the environment, PyCharm will now use the Python interpreter from your pixi environment.

PyCharm should now be able to show you the installed packages as well.

You can now run your programs and tests as usual.

Mark .pixi as excluded

In order for PyCharm to not get confused about the .pixi directory, please mark it as excluded.

Also, when using a remote interpreter, you should exclude the .pixi directory on the remote machine. Instead, you should run pixi install on the remote machine and select the conda shim from there.

"},{"location":"ide_integration/pycharm/#multiple-environments","title":"Multiple environments","text":"

If your project uses multiple environments to test different Python versions or dependencies, you can add multiple environments to PyCharm by selecting Use existing environment in the Add Python Interpreter dialog.

You can then specify the corresponding environment in the bottom right corner of the PyCharm window.

"},{"location":"ide_integration/pycharm/#multiple-pixi-projects","title":"Multiple pixi projects","text":"

When using multiple pixi projects, remember to select the correct Conda Executable for each project as mentioned above. It might also happen that you have multiple environments with the same name.

It is recommended to rename the environments to something unique.

"},{"location":"ide_integration/pycharm/#debugging","title":"Debugging","text":"

Logs are written to ~/.cache/pixi-pycharm.log. You can use them to debug problems. Please attach the logs when filing a bug report.

"},{"location":"ide_integration/pycharm/#install-as-an-optional-dependency","title":"Install as an optional dependency","text":"

In some cases, you might only want to install pixi-pycharm on your local dev-machines but not in production. To achieve this, we can use multiple environments.

[project]\nname = \"multi-env\"\nversion = \"0.1.0\"\nrequires-python = \">=3.12\"\ndependencies = [\"numpy\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"]\n\n[tool.pixi.feature.lint.dependencies]\nruff =  \"*\"\n\n[tool.pixi.feature.dev.dependencies]\npixi-pycharm = \"*\"\n\n[tool.pixi.environments]\n# The production environment is the default feature set.\n# Adding a solve group to make sure the same versions are used in the `default` and `prod` environments.\nprod = { solve-group = \"main\" }\n\n# Setup the default environment to include the dev features.\n# By using `default` instead of `dev` you'll not have to specify the `--environment` flag when running `pixi run`.\ndefault = { features = [\"dev\"], solve-group = \"main\" }\n\n# The lint environment doesn't need the default feature set but only the `lint` feature\n# and thus can also be excluded from the solve group.\nlint = { features = [\"lint\"], no-default-feature = true }\n

Now you as a user can run pixi shell, which will start the default environment. In production, you then just run pixi run -e prod COMMAND, and the minimal prod environment is installed.

"},{"location":"ide_integration/r_studio/","title":"Developing R scripts in RStudio","text":"

You can use pixi to manage your R dependencies. The conda-forge channel contains a wide range of R packages that can be installed using pixi.

"},{"location":"ide_integration/r_studio/#installing-r-packages","title":"Installing R packages","text":"

R packages are usually prefixed with r- in the conda-forge channel. To install an R package, you can use the following command:

pixi add r-<package-name>\n# for example\npixi add r-ggplot2\n
"},{"location":"ide_integration/r_studio/#using-r-packages-in-rstudio","title":"Using R packages in RStudio","text":"

To use the R packages installed by pixi in RStudio, you need to run rstudio from an activated environment. This can be achieved by running RStudio from pixi shell or from a task in the pixi.toml file.

"},{"location":"ide_integration/r_studio/#full-example","title":"Full example","text":"

The full example can be found here: RStudio example. Here is an example of a pixi.toml file that sets up an RStudio task:

[project]\nname = \"r\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[target.linux.tasks]\nrstudio = \"rstudio\"\n\n[target.osx.tasks]\nrstudio = \"open -a rstudio\"\n# or alternatively with the full path:\n# rstudio = \"/Applications/RStudio.app/Contents/MacOS/RStudio\"\n\n[dependencies]\nr = \">=4.3,<5\"\nr-ggplot2 = \">=3.5.0,<3.6\"\n

Once RStudio has loaded, you can execute the following R code that uses the ggplot2 package:

# Load the ggplot2 package\nlibrary(ggplot2)\n\n# Load the built-in 'mtcars' dataset\ndata <- mtcars\n\n# Create a scatterplot of 'mpg' vs 'wt'\nggplot(data, aes(x = wt, y = mpg)) +\n  geom_point() +\n  labs(x = \"Weight (1000 lbs)\", y = \"Miles per Gallon\") +\n  ggtitle(\"Fuel Efficiency vs. Weight\")\n

Note

This example assumes that you have installed RStudio system-wide. We are working on updating RStudio as well as the R interpreter builds on Windows for maximum compatibility with pixi.

"},{"location":"reference/cli/","title":"Commands","text":""},{"location":"reference/cli/#global-options","title":"Global options","text":"
  • --verbose (-v|-vv|-vvv): Increases the verbosity of the output messages; -v, -vv and -vvv raise the verbosity level respectively.
  • --help (-h) Shows help information, use -h to get the short version of the help.
  • --version (-V): shows the version of pixi that is used.
  • --quiet (-q): Decreases the amount of output.
  • --color <COLOR>: Whether the log needs to be colored [env: PIXI_COLOR=] [default: auto] [possible values: always, never, auto]. Pixi also honors the FORCE_COLOR and NO_COLOR environment variables. They both take precedence over --color and PIXI_COLOR.
  • --no-progress: Disables the progress bar.[env: PIXI_NO_PROGRESS] [default: false]
"},{"location":"reference/cli/#init","title":"init","text":"

This command is used to create a new project. It initializes a pixi.toml file and also prepares a .gitignore to prevent the environment from being added to git.

It also supports the pyproject.toml file: if you have a pyproject.toml file in the directory where you run pixi init, pixi appends its configuration to that file instead of creating a new pixi.toml.

"},{"location":"reference/cli/#arguments","title":"Arguments","text":"
  1. [PATH]: Where to place the project (defaults to current path) [default: .]
"},{"location":"reference/cli/#options","title":"Options","text":"
  • --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
  • --platform <PLATFORM> (-p): Specify a platform that the project supports. (Allowed to be used more than once)
  • --import <ENV_FILE> (-i): Import an existing conda environment file, e.g. environment.yml.
  • --format <FORMAT>: Specify the format of the project file, either pyproject or pixi. [default: pixi]
  • --scm <SCM>: Specify the SCM used to manage the project with. Possible values: github, gitlab, codeberg. [default: github]

Importing an environment.yml

When importing an environment, the pixi.toml will be created with the dependencies from the environment file. The pixi.lock will be created when you install the environment. We don't support git+ URLs as dependencies for pip packages. For the defaults channel we use main, r and msys2 as the default channels.
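As an illustration of the mapping, a hedged sketch of the pixi.toml that an import might produce for an environment file requesting python 3.11 and numpy from conda-forge (the project name, platform, and version pins are assumptions, not actual pixi output):

```toml
[project]
name = "myproject"               # illustrative
channels = ["conda-forge"]
platforms = ["linux-64"]

[dependencies]
python = "3.11.*"
numpy = "*"
```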

pixi init myproject\npixi init ~/myproject\npixi init  # Initializes directly in the current directory.\npixi init --channel conda-forge --channel bioconda myproject\npixi init --platform osx-64 --platform linux-64 myproject\npixi init --import environment.yml\npixi init --format pyproject\npixi init --format pixi --scm gitlab\n
"},{"location":"reference/cli/#add","title":"add","text":"

Adds dependencies to the manifest file. It will only add dependencies compatible with the rest of the dependencies in the project. More info on multi-platform configuration.

If the project manifest is a pyproject.toml, by default, adding a pypi dependency will add it to the native project.dependencies array, or to the native dependency-groups table if a feature is specified:

  • pixi add --pypi boto3 would add boto3 to the project.dependencies array
  • pixi add --pypi boto3 --feature aws would add boto3 to the dependency-groups.aws array

Note that if --platform or --editable are specified, the pypi dependency will be added to the tool.pixi.pypi-dependencies table instead as native arrays have no support for platform-specific or editable dependencies.

These dependencies will be read by pixi as if they had been added to the pixi pypi-dependencies tables of the default or a named feature.

"},{"location":"reference/cli/#arguments_1","title":"Arguments","text":"
  1. [SPECS]: The package(s) to add, space separated. The version constraint is optional.
"},{"location":"reference/cli/#options_1","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --host: Specifies a host dependency, important for building a package.
  • --build: Specifies a build dependency, important for building a package.
  • --pypi: Specifies a PyPI dependency, not a conda package. Parses dependencies as PEP508 requirements, supporting extras and versions. See configuration for details.
  • --no-install: Don't install the package to the environment, only add the package to the lock-file.
  • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
  • --platform <PLATFORM> (-p): The platform for which the dependency should be added. (Allowed to be used more than once)
  • --feature <FEATURE> (-f): The feature for which the dependency should be added.
  • --editable: Specifies an editable dependency; only used in combination with --pypi.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi add numpy # (1)!\npixi add numpy pandas \"pytorch>=1.8\" # (2)!\npixi add \"numpy>=1.22,<1.24\" # (3)!\npixi add --manifest-path ~/myproject/pixi.toml numpy # (4)!\npixi add --host \"python>=3.9.0\" # (5)!\npixi add --build cmake # (6)!\npixi add --platform osx-64 clang # (7)!\npixi add --no-install numpy # (8)!\npixi add --no-lockfile-update numpy # (9)!\npixi add --feature featurex numpy # (10)!\n\n# Add a pypi dependency\npixi add --pypi requests[security] # (11)!\npixi add --pypi Django==5.1rc1 # (12)!\npixi add --pypi \"boltons>=24.0.0\" --feature lint # (13)!\npixi add --pypi \"boltons @ https://files.pythonhosted.org/packages/46/35/e50d4a115f93e2a3fbf52438435bb2efcf14c11d4fcd6bdcd77a6fc399c9/boltons-24.0.0-py3-none-any.whl\" # (14)!\npixi add --pypi \"exchangelib @ git+https://github.com/ecederstrand/exchangelib\" # (15)!\npixi add --pypi \"project @ file:///absolute/path/to/project\" # (16)!\npixi add --pypi \"project@file:///absolute/path/to/project\" --editable # (17)!\n
  1. This will add the numpy package to the project with the latest version available for the solved environment.
  2. This will add multiple packages to the project, solving them all together.
  3. This will add the numpy package with the given version constraint.
  4. This will add the numpy package to the project of the manifest file at the given path.
  5. This will add the python package as a host dependency. There is currently no different behavior for host dependencies.
  6. This will add the cmake package as a build dependency. There is currently no different behavior for build dependencies.
  7. This will add the clang package only for the osx-64 platform.
  8. This will add the numpy package to the manifest and lockfile, without installing it in an environment.
  9. This will add the numpy package to the manifest without updating the lockfile or installing it in the environment.
  10. This will add the numpy package in the feature featurex.
  11. This will add the requests package as a pypi dependency with the security extra.
  12. This will add the pre-release version of Django to the project as a pypi dependency.
  13. This will add the boltons package in the feature lint as a pypi dependency.
  14. This will add the boltons package with the given URL as a pypi dependency.
  15. This will add the exchangelib package with the given git URL as a pypi dependency.
  16. This will add the project package with the given file URL as a pypi dependency.
  17. This will add the project package with the given file URL as an editable pypi dependency.

Tip

If you want to use a non-default pinning strategy, you can set it using pixi's configuration.

pixi config set pinning-strategy no-pin --global\n
The default is semver, which pins dependencies below the next major version (or below the next minor version for v0 releases).

Note

There is an exception to this rule: when you add a package we define as non-semver, the minor strategy is used instead. These are the packages we define as non-semver: Python, Rust, Julia, GCC, GXX, GFortran, NodeJS, Deno, R, R-Base, Perl
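As an illustration of the two strategies (version numbers are hypothetical), adding numpy and python with the default configuration produces constraints along these lines:

[dependencies]\n# semver strategy: pin below the next major version\nnumpy = \">=1.26.4,<2\"\n# python is on the non-semver list, so it is pinned below the next minor version\npython = \">=3.12.1,<3.13\"\n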

"},{"location":"reference/cli/#install","title":"install","text":"

Installs an environment based on the manifest file. If there is no pixi.lock file or it is not up-to-date with the manifest file, it will (re-)generate the lock file.

pixi install only installs one environment at a time, if you have multiple environments you can select the right one with the --environment flag. If you don't provide an environment, the default environment will be installed.

Running pixi install is not required before running other commands, as all commands interacting with the environment will first run the install command if the environment is not ready, to make sure you always run in a correct state. E.g. pixi run, pixi shell, pixi shell-hook, pixi add, pixi remove, to name a few.

"},{"location":"reference/cli/#options_2","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --frozen: Install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --environment <ENVIRONMENT> (-e): The environment to install, if none are provided the default environment will be used.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi install\npixi install --manifest-path ~/myproject/pixi.toml\npixi install --frozen\npixi install --locked\npixi install --environment lint\npixi install -e lint\n
"},{"location":"reference/cli/#update","title":"update","text":"

The update command checks if there are newer versions of the dependencies and updates the pixi.lock file and environments accordingly. It will only update the lock file if the dependencies in the manifest file are still compatible with the new versions.

"},{"location":"reference/cli/#arguments_2","title":"Arguments","text":"
  1. [PACKAGES]...: The packages to update, space separated. If no packages are provided, all packages will be updated.
"},{"location":"reference/cli/#options_3","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --environment <ENVIRONMENT> (-e): The environment to update, if none is provided all the environments are updated.
  • --platform <PLATFORM> (-p): The platform for which the dependencies should be updated.
  • --dry-run (-n): Only show the changes that would be made, without actually updating the lock file or environment.
  • --no-install: Don't install the (solve) environment needed for solving pypi-dependencies.
  • --json: Output the changes in json format.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi update numpy\npixi update numpy pandas\npixi update --manifest-path ~/myproject/pixi.toml numpy\npixi update --environment lint python\npixi update -e lint -e schema -e docs pre-commit\npixi update --platform osx-arm64 mlx\npixi update -p linux-64 -p osx-64 numpy\npixi update --dry-run\npixi update --no-install boto3\n
"},{"location":"reference/cli/#upgrade","title":"upgrade","text":"

The upgrade command checks if there are newer versions of the dependencies and upgrades them in the manifest file. update updates dependencies in the lock file while still fulfilling the version requirements set in the manifest; upgrade loosens the requirements for the given packages, updates the lock file, and adapts the manifest accordingly.

"},{"location":"reference/cli/#arguments_3","title":"Arguments","text":"
  1. [PACKAGES]...: The packages to upgrade, space separated. If no packages are provided, all packages will be upgraded.
"},{"location":"reference/cli/#options_4","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --feature <FEATURE> (-e): The feature to upgrade, if none is provided the default feature will be used.
  • --no-install: Don't install the (solve) environment needed for solving pypi-dependencies.
  • --json: Output the changes in json format.
  • --dry-run (-n): Only show the changes that would be made, without actually updating the manifest, lock file, or environment.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi upgrade\npixi upgrade numpy\npixi upgrade numpy pandas\npixi upgrade --manifest-path ~/myproject/pixi.toml numpy\npixi upgrade --feature lint python\npixi upgrade --json\npixi upgrade --dry-run\n

Note

The pixi upgrade command will only update versions, unless you specify the exact package name (pixi upgrade numpy).

In that case it will remove all fields, apart from:

  • build field containing a wildcard *
  • channel
  • file_name
  • url
  • subdir.
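As a hypothetical sketch (version numbers are illustrative), upgrading a fully pinned dependency loosens the version requirement while keeping the fields listed above:

# before: numpy = { version = \">=1.24,<1.25\", build = \"*\", channel = \"conda-forge\" }\n# after `pixi upgrade numpy`:\nnumpy = { version = \">=2.1,<3\", build = \"*\", channel = \"conda-forge\" }\n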
"},{"location":"reference/cli/#run","title":"run","text":"

The run command first checks if the environment is ready to use. If you didn't run pixi install, the run command will do that for you. The custom tasks defined in the manifest file are also available through the run command.

You cannot run pixi run source setup.bash, as source is not one of the deno_task_shell commands and is not an executable.

"},{"location":"reference/cli/#arguments_4","title":"Arguments","text":"
  1. [TASK]... The task you want to run in the project's environment; this can also be a normal command. All arguments after the task will be passed to the task.
"},{"location":"reference/cli/#options_5","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --frozen: Install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --environment <ENVIRONMENT> (-e): The environment to run the task in, if none are provided the default environment will be used or a selector will be given to select the right environment.
  • --clean-env: Run the task in a clean environment; this will remove all environment variables of the shell environment except for the ones pixi sets. This doesn't work on Windows.
  • --force-activate: (default, except in experimental mode) Force the activation of the environment, even if the environment is already activated.
  • --revalidate: Revalidate the full environment, instead of checking the lock file hash. more info
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi run python\npixi run cowpy \"Hey pixi user\"\npixi run --manifest-path ~/myproject/pixi.toml python\npixi run --frozen python\npixi run --locked python\n# If you have specified a custom task in the pixi.toml you can run it with run as well\npixi run build\n# Extra arguments will be passed to the tasks command.\npixi run task argument1 argument2\n\n# If you have multiple environments you can select the right one with the --environment flag.\npixi run --environment cuda python\n\n# THIS DOESN'T WORK ON WINDOWS\n# If you want to run a command in a clean environment you can use the --clean-env flag.\n# The PATH should only contain the pixi environment here.\npixi run --clean-env \"echo \\$PATH\"\n

Info

In pixi, the deno_task_shell is the underlying runner of the run command. Check out their documentation for the syntax and available commands. This is done so that run commands can be run across all platforms.

Cross environment tasks

If you're using the depends-on feature of the tasks, the tasks will be run in the order you specified them. The depends-on can be used cross environment, e.g. you have this pixi.toml:

pixi.toml
[tasks]\nstart = { cmd = \"python start.py\", depends-on = [\"build\"] }\n\n[feature.build.tasks]\nbuild = \"cargo build\"\n[feature.build.dependencies]\nrust = \">=1.74\"\n\n[environments]\nbuild = [\"build\"]\n

Then you're able to run build from the build environment and start from the default environment, by only calling:

pixi run start\n

"},{"location":"reference/cli/#exec","title":"exec","text":"

Runs a command in a temporary environment disconnected from any project. This can be useful to quickly test out a certain package or version.

Temporary environments are cached. If the same command is run again, the same environment will be reused.

Cleaning temporary environments

Currently, temporary environments can only be cleaned up manually. Environments for pixi exec are stored under cached-envs-v0/ in the cache directory. Run pixi info to find the cache directory.

"},{"location":"reference/cli/#arguments_5","title":"Arguments","text":"
  1. <COMMAND>: The command to run.
"},{"location":"reference/cli/#options_6","title":"Options:","text":"
  • --spec <SPECS> (-s): Matchspecs of packages to install. If this is not provided, the package is guessed from the command.
  • --channel <CHANNELS> (-c): The channel to install the packages from. If not specified the default channel is used.
  • --force-reinstall: If specified, a new environment is always created, even if one already exists.
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi exec python\n\n# Add a constraint to the python version\npixi exec -s python=3.9 python\n\n# Run ipython and include the py-rattler package in the environment\npixi exec -s ipython -s py-rattler ipython\n\n# Force reinstall to recreate the environment and get the latest package versions\npixi exec --force-reinstall -s ipython -s py-rattler ipython\n
"},{"location":"reference/cli/#remove","title":"remove","text":"

Removes dependencies from the manifest file.

If the project manifest is a pyproject.toml, removing a pypi dependency with the --pypi flag will remove it from either

  • the native pyproject project.dependencies array, or the native project.optional-dependencies table if a feature is specified
  • the pixi pypi-dependencies tables of the default or a named feature
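For illustration (package name and version are hypothetical), removing a pypi dependency from a feature of a pyproject.toml manifest deletes the matching entry:

# before `pixi remove --pypi boto3 --feature aws`:\n[project.optional-dependencies]\naws = [\"boto3>=1.34\"]\n# after, the boto3 entry is removed from this table\n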
"},{"location":"reference/cli/#arguments_6","title":"Arguments","text":"
  1. <DEPS>...: List of dependencies you wish to remove from the project.
"},{"location":"reference/cli/#options_7","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --host: Specifies a host dependency, important for building a package.
  • --build: Specifies a build dependency, important for building a package.
  • --pypi: Specifies a PyPI dependency, not a conda package.
  • --platform <PLATFORM> (-p): The platform from which the dependency should be removed.
  • --feature <FEATURE> (-f): The feature from which the dependency should be removed.
  • --no-install: Don't install the environment, only remove the package from the lock-file and manifest.
  • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
pixi remove numpy\npixi remove numpy pandas pytorch\npixi remove --manifest-path ~/myproject/pixi.toml numpy\npixi remove --host python\npixi remove --build cmake\npixi remove --pypi requests\npixi remove --platform osx-64 --build clang\npixi remove --feature featurex clang\npixi remove --feature featurex --platform osx-64 clang\npixi remove --feature featurex --platform osx-64 --build clang\npixi remove --no-install numpy\n
"},{"location":"reference/cli/#task","title":"task","text":"

If you want to make a shorthand for a specific command, you can add a task for it.

"},{"location":"reference/cli/#options_8","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
"},{"location":"reference/cli/#task-add","title":"task add","text":"

Add a task to the manifest file; use --depends-on to add tasks you want to run before this task, e.g. build before an execute task.

"},{"location":"reference/cli/#arguments_7","title":"Arguments","text":"
  1. <NAME>: The name of the task.
  2. <COMMAND>: The command to run. This can be more than one word.

Info

If you use $ for environment variables, they will be resolved before being added to the task. If you want to use $ in the task, you need to escape it with a \, e.g. echo \$HOME.
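For illustration (task name is hypothetical), escaping keeps the variable unexpanded in the stored task:

pixi task add show-home \"echo \\$HOME\"\n

This adds the following to the manifest file:

[tasks]\nshow-home = \"echo $HOME\"\n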

"},{"location":"reference/cli/#options_9","title":"Options","text":"
  • --platform <PLATFORM> (-p): the platform for which this task should be added.
  • --feature <FEATURE> (-f): the feature for which the task is added, if none is provided the task is added to the default feature.
  • --depends-on <DEPENDS_ON>: the task it depends on, which will be run before the one you're adding.
  • --cwd <CWD>: the working directory for the task relative to the root of the project.
  • --env <ENV>: the environment variables as key=value pairs for the task, can be used multiple times, e.g. --env \"VAR1=VALUE1\" --env \"VAR2=VALUE2\".
  • --description <DESCRIPTION>: a description of the task.
pixi task add cow cowpy \"Hello User\"\npixi task add tls ls --cwd tests\npixi task add test cargo t --depends-on build\npixi task add build-osx \"METAL=1 cargo build\" --platform osx-64\npixi task add train python train.py --feature cuda\npixi task add publish-pypi \"hatch publish --yes --repo main\" --feature build --env HATCH_CONFIG=config/hatch.toml --description \"Publish the package to pypi\"\n

This adds the following to the manifest file:

[tasks]\ncow = \"cowpy \\\"Hello User\\\"\"\ntls = { cmd = \"ls\", cwd = \"tests\" }\ntest = { cmd = \"cargo t\", depends-on = [\"build\"] }\n\n[target.osx-64.tasks]\nbuild-osx = \"METAL=1 cargo build\"\n\n[feature.cuda.tasks]\ntrain = \"python train.py\"\n\n[feature.build.tasks]\npublish-pypi = { cmd = \"hatch publish --yes --repo main\", env = { HATCH_CONFIG = \"config/hatch.toml\" }, description = \"Publish the package to pypi\" }\n

Which you can then run with the run command:

pixi run cow\n# Extra arguments will be passed to the tasks command.\npixi run test --test test1\n
"},{"location":"reference/cli/#task-remove","title":"task remove","text":"

Remove one or more tasks from the manifest file.

"},{"location":"reference/cli/#arguments_8","title":"Arguments","text":"
  • <NAMES>: The names of the tasks, space separated.
"},{"location":"reference/cli/#options_10","title":"Options","text":"
  • --platform <PLATFORM> (-p): the platform for which this task is removed.
  • --feature <FEATURE> (-f): the feature for which the task is removed.
pixi task remove cow\npixi task remove --platform linux-64 test\npixi task remove --feature cuda task\n
"},{"location":"reference/cli/#task-alias","title":"task alias","text":"

Create an alias for a task.

"},{"location":"reference/cli/#arguments_9","title":"Arguments","text":"
  1. <ALIAS>: The alias name
  2. <DEPENDS_ON>: The names of the tasks you want to execute on this alias, order counts, first one runs first.
"},{"location":"reference/cli/#options_11","title":"Options","text":"
  • --platform <PLATFORM> (-p): the platform for which this alias is created.
pixi task alias test-all test-py test-cpp test-rust\npixi task alias --platform linux-64 test test-linux\npixi task alias moo cow\n
"},{"location":"reference/cli/#task-list","title":"task list","text":"

List all tasks in the project.

"},{"location":"reference/cli/#options_12","title":"Options","text":"
  • --environment (-e): the environment whose tasks to list, if none is provided the default environment's tasks will be listed.
  • --summary (-s): list the tasks per environment.
pixi task list\npixi task list --environment cuda\npixi task list --summary\n
"},{"location":"reference/cli/#list","title":"list","text":"

List the project's packages. Highlighted packages are explicit dependencies.

"},{"location":"reference/cli/#arguments_10","title":"Arguments","text":"
  1. [REGEX]: List only packages matching a regular expression (optional).
"},{"location":"reference/cli/#options_13","title":"Options","text":"
  • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform.
  • --json: Whether to output in json format.
  • --json-pretty: Whether to output in pretty json format.
  • --sort-by <SORT_BY>: Sorting strategy [default: name] [possible values: size, name, type]
  • --explicit (-x): Only list the packages that are explicitly added to the manifest file.
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --environment (-e): The environment's packages to list, if none is provided the default environment's packages will be listed.
  • --frozen: Install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --no-install: Don't install the environment for pypi solving, only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
  • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
  • --no-progress: Hide all progress bars, always turned on if stderr is not a terminal [env: PIXI_NO_PROGRESS=]
pixi list\npixi list py\npixi list --json-pretty\npixi list --explicit\npixi list --sort-by size\npixi list --platform win-64\npixi list --environment cuda\npixi list --frozen\npixi list --locked\npixi list --no-install\n

Output will look like this, where python will be green as it is the package that was explicitly added to the manifest file:

\u279c pixi list\n Package           Version     Build               Size       Kind   Source\n _libgcc_mutex     0.1         conda_forge         2.5 KiB    conda  _libgcc_mutex-0.1-conda_forge.tar.bz2\n _openmp_mutex     4.5         2_gnu               23.1 KiB   conda  _openmp_mutex-4.5-2_gnu.tar.bz2\n bzip2             1.0.8       hd590300_5          248.3 KiB  conda  bzip2-1.0.8-hd590300_5.conda\n ca-certificates   2023.11.17  hbcca054_0          150.5 KiB  conda  ca-certificates-2023.11.17-hbcca054_0.conda\n ld_impl_linux-64  2.40        h41732ed_0          688.2 KiB  conda  ld_impl_linux-64-2.40-h41732ed_0.conda\n libexpat          2.5.0       hcb278e6_1          76.2 KiB   conda  libexpat-2.5.0-hcb278e6_1.conda\n libffi            3.4.2       h7f98852_5          56.9 KiB   conda  libffi-3.4.2-h7f98852_5.tar.bz2\n libgcc-ng         13.2.0      h807b86a_4          755.7 KiB  conda  libgcc-ng-13.2.0-h807b86a_4.conda\n libgomp           13.2.0      h807b86a_4          412.2 KiB  conda  libgomp-13.2.0-h807b86a_4.conda\n libnsl            2.0.1       hd590300_0          32.6 KiB   conda  libnsl-2.0.1-hd590300_0.conda\n libsqlite         3.44.2      h2797004_0          826 KiB    conda  libsqlite-3.44.2-h2797004_0.conda\n libuuid           2.38.1      h0b41bf4_0          32.8 KiB   conda  libuuid-2.38.1-h0b41bf4_0.conda\n libxcrypt         4.4.36      hd590300_1          98 KiB     conda  libxcrypt-4.4.36-hd590300_1.conda\n libzlib           1.2.13      hd590300_5          60.1 KiB   conda  libzlib-1.2.13-hd590300_5.conda\n ncurses           6.4         h59595ed_2          863.7 KiB  conda  ncurses-6.4-h59595ed_2.conda\n openssl           3.2.0       hd590300_1          2.7 MiB    conda  openssl-3.2.0-hd590300_1.conda\n python            3.12.1      hab00c5b_1_cpython  30.8 MiB   conda  python-3.12.1-hab00c5b_1_cpython.conda\n readline          8.2         h8228510_1          274.9 KiB  conda  readline-8.2-h8228510_1.conda\n tk                8.6.13      noxft_h4845f30_101  3.2 MiB    conda  tk-8.6.13-noxft_h4845f30_101.conda\n tzdata            2023d       h0c530f3_0          116.8 KiB  conda  tzdata-2023d-h0c530f3_0.conda\n xz                5.2.6       h166bdaf_0          408.6 KiB  conda  xz-5.2.6-h166bdaf_0.tar.bz2\n
"},{"location":"reference/cli/#tree","title":"tree","text":"

Display the project's packages in a tree. Highlighted packages are those specified in the manifest.

The package tree can also be inverted (-i), to see which packages require a specific dependency.

"},{"location":"reference/cli/#arguments_11","title":"Arguments","text":"
  • [REGEX]: Optional regex of which dependencies to filter the tree to, or which dependencies to start with when inverting the tree.
"},{"location":"reference/cli/#options_14","title":"Options","text":"
  • --invert (-i): Invert the dependency tree; that is, given a REGEX pattern that matches some packages, show all the packages that depend on those.
  • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform.
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file, by default it searches for one in the parent directories.
  • --environment (-e): The environment's packages to list, if none is provided the default environment's packages will be listed.
  • --frozen: Install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --no-install: Don't install the environment for pypi solving, only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
  • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
  • --no-progress: Hide all progress bars, always turned on if stderr is not a terminal [env: PIXI_NO_PROGRESS=]
pixi tree\npixi tree pre-commit\npixi tree -i yaml\npixi tree --environment docs\npixi tree --platform win-64\n

Warning

Use -v to show which pypi packages are not yet parsed correctly. The extras and markers parsing is still under development.

Output will look like this, where direct packages in the manifest file will be green. Once a package has been displayed once, the tree won't continue to recurse through its dependencies (compare the first time python appears, vs the rest), and it will instead be marked with a star (*).

Version numbers are colored by the package type, yellow for Conda packages and blue for PyPI.

\u279c pixi tree\n\u251c\u2500\u2500 pre-commit v3.3.3\n\u2502   \u251c\u2500\u2500 cfgv v3.3.1\n\u2502   \u2502   \u2514\u2500\u2500 python v3.12.2\n\u2502   \u2502       \u251c\u2500\u2500 bzip2 v1.0.8\n\u2502   \u2502       \u251c\u2500\u2500 libexpat v2.6.2\n\u2502   \u2502       \u251c\u2500\u2500 libffi v3.4.2\n\u2502   \u2502       \u251c\u2500\u2500 libsqlite v3.45.2\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13\n\u2502   \u2502       \u251c\u2500\u2500 libzlib v1.2.13 (*)\n\u2502   \u2502       \u251c\u2500\u2500 ncurses v6.4.20240210\n\u2502   \u2502       \u251c\u2500\u2500 openssl v3.2.1\n\u2502   \u2502       \u251c\u2500\u2500 readline v8.2\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 ncurses v6.4.20240210 (*)\n\u2502   \u2502       \u251c\u2500\u2500 tk v8.6.13\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13 (*)\n\u2502   \u2502       \u2514\u2500\u2500 xz v5.2.6\n\u2502   \u251c\u2500\u2500 identify v2.5.35\n\u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n...\n\u2514\u2500\u2500 tbump v6.9.0\n...\n    \u2514\u2500\u2500 tomlkit v0.12.4\n        \u2514\u2500\u2500 python v3.12.2 (*)\n

A regex pattern can be specified to filter the tree to just those that show a specific direct, or transitive dependency:

\u279c pixi tree pre-commit\n\u2514\u2500\u2500 pre-commit v3.3.3\n    \u251c\u2500\u2500 virtualenv v20.25.1\n    \u2502   \u251c\u2500\u2500 filelock v3.13.1\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2\n    \u2502   \u2502       \u251c\u2500\u2500 libexpat v2.6.2\n    \u2502   \u2502       \u251c\u2500\u2500 readline v8.2\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 ncurses v6.4.20240210\n    \u2502   \u2502       \u251c\u2500\u2500 libsqlite v3.45.2\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13\n    \u2502   \u2502       \u251c\u2500\u2500 bzip2 v1.0.8\n    \u2502   \u2502       \u251c\u2500\u2500 libzlib v1.2.13 (*)\n    \u2502   \u2502       \u251c\u2500\u2500 libffi v3.4.2\n    \u2502   \u2502       \u251c\u2500\u2500 tk v8.6.13\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13 (*)\n    \u2502   \u2502       \u251c\u2500\u2500 xz v5.2.6\n    \u2502   \u2502       \u251c\u2500\u2500 ncurses v6.4.20240210 (*)\n    \u2502   \u2502       \u2514\u2500\u2500 openssl v3.2.1\n    \u2502   \u251c\u2500\u2500 platformdirs v4.2.0\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u2502   \u251c\u2500\u2500 distlib v0.3.8\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u251c\u2500\u2500 pyyaml v6.0.1\n...\n

Additionally, the tree can be inverted, and it can show which packages depend on a regex pattern. The packages specified in the manifest will also be highlighted (in this case cffconvert and pre-commit would be).

\u279c pixi tree -i yaml\n\nruamel.yaml v0.18.6\n\u251c\u2500\u2500 pykwalify v1.8.0\n\u2502   \u2514\u2500\u2500 cffconvert v2.0.0\n\u2514\u2500\u2500 cffconvert v2.0.0\n\npyyaml v6.0.1\n\u2514\u2500\u2500 pre-commit v3.3.3\n\nruamel.yaml.clib v0.2.8\n\u2514\u2500\u2500 ruamel.yaml v0.18.6\n    \u251c\u2500\u2500 pykwalify v1.8.0\n    \u2502   \u2514\u2500\u2500 cffconvert v2.0.0\n    \u2514\u2500\u2500 cffconvert v2.0.0\n\nyaml v0.2.5\n\u2514\u2500\u2500 pyyaml v6.0.1\n    \u2514\u2500\u2500 pre-commit v3.3.3\n
"},{"location":"reference/cli/#shell","title":"shell","text":"

This command starts a new shell in the project's environment. To exit the pixi shell, simply run exit.

"},{"location":"reference/cli/#options_15","title":"Options","text":"
  • --change-ps1 <true or false>: When set to false, the (pixi) prefix in the shell prompt is removed (default: true). The default behavior can be configured globally.
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.
  • --frozen: Install the environment as defined in the lock file; doesn't update pixi.lock even if it isn't up-to-date with the manifest file. Can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. Can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --no-install: Don't install the environment; only activate it.
  • --no-lockfile-update: Don't update the lock file; implies --no-install.
  • --environment <ENVIRONMENT> (-e): The environment to activate the shell in; if none is provided, the default environment is used, or a selector is shown to pick the right environment.
  • --no-progress: Hide all progress bars, always turned on if stderr is not a terminal [env: PIXI_NO_PROGRESS=]
  • --force-activate: (default, except in experimental mode) Force the activation of the environment, even if the environment is already activated.
  • --revalidate: Revalidate the full environment, instead of checking lock file hash. more info
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi shell\nexit\npixi shell --manifest-path ~/myproject/pixi.toml\nexit\npixi shell --frozen\nexit\npixi shell --locked\nexit\npixi shell --environment cuda\nexit\n
"},{"location":"reference/cli/#shell-hook","title":"shell-hook","text":"

This command prints the activation script of an environment.

"},{"location":"reference/cli/#options_16","title":"Options","text":"
  • --shell <SHELL> (-s): The shell for which the activation script should be printed. Defaults to the current shell. Currently supported variants: [bash, zsh, xonsh, cmd, powershell, fish, nushell]
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.
  • --frozen: Install the environment as defined in the lock file; doesn't update pixi.lock even if it isn't up-to-date with the manifest file. Can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
  • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. Can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
  • --environment <ENVIRONMENT> (-e): The environment to activate; if none is provided, the default environment is used, or a selector is shown to pick the right environment.
  • --json: Print all environment variables that are exported by running the activation script as JSON. When specifying this option, --shell is ignored.
  • --force-activate: (default, except in experimental mode) Force the activation of the environment, even if the environment is already activated.
  • --revalidate: Revalidate the full environment, instead of checking lock file hash. more info
  • --concurrent-downloads: The number of concurrent downloads to use when installing packages. Defaults to 50.
  • --concurrent-solves: The number of concurrent solves to use when installing packages. Defaults to the number of cpu threads.
pixi shell-hook\npixi shell-hook --shell bash\npixi shell-hook --shell zsh\npixi shell-hook -s powershell\npixi shell-hook --manifest-path ~/myproject/pixi.toml\npixi shell-hook --frozen\npixi shell-hook --locked\npixi shell-hook --environment cuda\npixi shell-hook --json\n

Example use case: getting rid of the pixi executable in a Docker container.

pixi shell-hook --shell bash > /etc/profile.d/pixi.sh\nrm ~/.pixi/bin/pixi # Now the environment will be activated without the need for the pixi executable.\n
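Outside of Docker, the same hook can be evaluated in place to activate the environment for the rest of a script, without removing the pixi executable (a sketch, assuming bash, pixi on PATH, and a pixi project in the current directory):

```shell
#!/usr/bin/env bash
# Activate the project's default environment for the rest of this script.
# Assumes pixi is on PATH and a manifest exists in (a parent of) $PWD.
eval "$(pixi shell-hook --shell bash)"

# Tools from the environment are now on PATH.
python --version
```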
"},{"location":"reference/cli/#search","title":"search","text":"

Search for a package; the output lists the latest version of the package.

"},{"location":"reference/cli/#arguments_12","title":"Arguments","text":"
  1. <PACKAGE>: Name of the package to search for; wildcards (*) are supported.
"},{"location":"reference/cli/#options_17","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.
  • --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. (May be used more than once.)
  • --limit <LIMIT> (-l): Optionally limit the number of search results.
  • --platform <PLATFORM> (-p): Specify a platform to search for. (default: current platform)
pixi search pixi\npixi search --limit 30 \"py*\"\n# search in a different channel and for a specific platform\npixi search -c robostack --platform linux-64 \"plotjuggler*\"\n
"},{"location":"reference/cli/#self-update","title":"self-update","text":"

Update pixi to the latest version or a specific version. If pixi was installed using another package manager, this feature might not be available; in that case, update pixi using the package manager that installed it.

"},{"location":"reference/cli/#options_18","title":"Options","text":"
  • --version <VERSION>: The desired version (to downgrade or upgrade to). Update to the latest version if not specified.
pixi self-update\npixi self-update --version 0.13.0\n
"},{"location":"reference/cli/#info","title":"info","text":"

Shows helpful information about the pixi installation, cache directories, disk usage, and more. More information here.

"},{"location":"reference/cli/#options_19","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.
  • --extended: Extend the information with slower queries to the system, such as directory sizes.
  • --json: Get a machine-readable version of the information as output.
pixi info\npixi info --json --extended\n
"},{"location":"reference/cli/#clean","title":"clean","text":"

Clean the parts of your system that are touched by pixi. Defaults to cleaning the environments and task cache. Use the cache subcommand to clean the cache.

"},{"location":"reference/cli/#options_20","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.
  • --environment <ENVIRONMENT> (-e): The environment to clean; if none is provided, all environments will be removed.
pixi clean\n
"},{"location":"reference/cli/#clean-cache","title":"clean cache","text":"

Clean the pixi cache on your system.

"},{"location":"reference/cli/#options_21","title":"Options","text":"
  • --pypi: Clean the pypi cache.
  • --conda: Clean the conda cache.
  • --mapping: Clean the mapping cache.
  • --exec: Clean the exec cache.
  • --repodata: Clean the repodata cache.
  • --yes: Skip the confirmation prompt.
pixi clean cache # clean all pixi caches\npixi clean cache --pypi # clean only the pypi cache\npixi clean cache --conda # clean only the conda cache\npixi clean cache --mapping # clean only the mapping cache\npixi clean cache --exec # clean only the `exec` cache\npixi clean cache --repodata # clean only the `repodata` cache\npixi clean cache --yes # skip the confirmation prompt\n
"},{"location":"reference/cli/#upload","title":"upload","text":"

Upload a package to a prefix.dev channel.

"},{"location":"reference/cli/#arguments_13","title":"Arguments","text":"
  1. <HOST>: The host + channel to upload to.
  2. <PACKAGE_FILE>: The package file to upload.
pixi upload https://prefix.dev/api/v1/upload/my_channel my_package.conda\n
"},{"location":"reference/cli/#auth","title":"auth","text":"

This command is used to authenticate the user's access to remote hosts such as prefix.dev or anaconda.org for private channels.

"},{"location":"reference/cli/#auth-login","title":"auth login","text":"

Store authentication information for given host.

Tip

The host is the actual hostname, not a channel.

"},{"location":"reference/cli/#arguments_14","title":"Arguments","text":"
  1. <HOST>: The host to authenticate with.
"},{"location":"reference/cli/#options_22","title":"Options","text":"
  • --token <TOKEN>: The token to use for authentication with prefix.dev.
  • --username <USERNAME>: The username to use for basic HTTP authentication.
  • --password <PASSWORD>: The password to use for basic HTTP authentication.
  • --conda-token <CONDA_TOKEN>: The token to use on anaconda.org / quetz authentication.
pixi auth login repo.prefix.dev --token pfx_JQEV-m_2bdz-D8NSyRSaAndHANx0qHjq7f2iD\npixi auth login anaconda.org --conda-token ABCDEFGHIJKLMNOP\npixi auth login https://myquetz.server --username john --password xxxxxx\n
"},{"location":"reference/cli/#auth-logout","title":"auth logout","text":"

Remove authentication information for a given host.

"},{"location":"reference/cli/#arguments_15","title":"Arguments","text":"
  1. <HOST>: The host to remove authentication information for.
pixi auth logout <HOST>\npixi auth logout repo.prefix.dev\npixi auth logout anaconda.org\n
"},{"location":"reference/cli/#config","title":"config","text":"

Use this command to manage the configuration.

"},{"location":"reference/cli/#options_23","title":"Options","text":"
  • --system (-s): Specify management scope to system configuration.
  • --global (-g): Specify management scope to global configuration.
  • --local (-l): Specify management scope to local configuration.
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.

Check out the pixi configuration for more information about the locations.

"},{"location":"reference/cli/#config-edit","title":"config edit","text":"

Edit the configuration file in the default editor.

"},{"location":"reference/cli/#arguments_16","title":"Arguments","text":"
  1. [EDITOR]: The editor to use; defaults to the EDITOR environment variable, or nano on Unix and notepad on Windows.
pixi config edit --system\npixi config edit --local\npixi config edit -g\npixi config edit --global code\npixi config edit --system vim\n
"},{"location":"reference/cli/#config-list","title":"config list","text":"

List the configuration.

"},{"location":"reference/cli/#arguments_17","title":"Arguments","text":"
  1. [KEY]: The key to list the value of. (all if not provided)
"},{"location":"reference/cli/#options_24","title":"Options","text":"
  • --json: Output the configuration in JSON format.
pixi config list default-channels\npixi config list --json\npixi config list --system\npixi config list -g\n
"},{"location":"reference/cli/#config-prepend","title":"config prepend","text":"

Prepend a value to a list configuration key.

"},{"location":"reference/cli/#arguments_18","title":"Arguments","text":"
  1. <KEY>: The key to prepend the value to.
  2. <VALUE>: The value to prepend.
pixi config prepend default-channels conda-forge\n
"},{"location":"reference/cli/#config-append","title":"config append","text":"

Append a value to a list configuration key.

"},{"location":"reference/cli/#arguments_19","title":"Arguments","text":"
  1. <KEY>: The key to append the value to.
  2. <VALUE>: The value to append.
pixi config append default-channels robostack\npixi config append default-channels bioconda --global\n
"},{"location":"reference/cli/#config-set","title":"config set","text":"

Set a configuration key to a value.

"},{"location":"reference/cli/#arguments_20","title":"Arguments","text":"
  1. <KEY>: The key to set the value of.
  2. [VALUE]: The value to set. (if not provided, the key will be removed)
pixi config set default-channels '[\"conda-forge\", \"bioconda\"]'\npixi config set --global mirrors '{\"https://conda.anaconda.org/\": [\"https://prefix.dev/conda-forge\"]}'\npixi config set repodata-config.disable-zstd true --system\npixi config set --global detached-environments \"/opt/pixi/envs\"\npixi config set detached-environments false\n
"},{"location":"reference/cli/#config-unset","title":"config unset","text":"

Unset a configuration key.

"},{"location":"reference/cli/#arguments_21","title":"Arguments","text":"
  1. <KEY>: The key to unset.
pixi config unset default-channels\npixi config unset --global mirrors\npixi config unset repodata-config.disable-zstd --system\n
"},{"location":"reference/cli/#global","title":"global","text":"

Global is the main entry point for the part of pixi that operates at the global (system) level. All commands in this section manage global installations of packages and environments through the global manifest. More info on the global manifest can be found here.

Tip

Binaries and environments installed globally are stored in ~/.pixi by default; this can be changed by setting the PIXI_HOME environment variable.
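As a sketch, the storage location can be redirected per invocation via PIXI_HOME (the /opt/pixi path is illustrative):

```shell
# Install a tool into a custom global location instead of ~/.pixi.
# Assumes pixi is on PATH; /opt/pixi is an arbitrary example directory.
PIXI_HOME=/opt/pixi pixi global install ruff

# The exposed binaries then land under $PIXI_HOME/bin, so that
# directory must be on PATH for them to be found.
```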

"},{"location":"reference/cli/#global-add","title":"global add","text":"

Adds dependencies to a global environment, without exposing the binaries of those packages to the system by default.

"},{"location":"reference/cli/#arguments_22","title":"Arguments","text":"
  1. [PACKAGE]: The packages to add; this accepts the matchspec format. (e.g. python=3.9.*, python [version='3.11.0', build_number=1])
"},{"location":"reference/cli/#options_25","title":"Options","text":"
  • --environment <ENVIRONMENT> (-e): The environment to install the package into.
  • --expose <EXPOSE>: A mapping from name to the binary to expose to the system.
pixi global add python=3.9.* --environment my-env\npixi global add python=3.9.* --expose py39=python3.9 --environment my-env\npixi global add numpy matplotlib --environment my-env\npixi global add numpy matplotlib --expose np=python3.9 --environment my-env\n
"},{"location":"reference/cli/#global-edit","title":"global edit","text":"

Edit the global manifest file in the default editor.

Will try to use the EDITOR environment variable; if not set, it will use nano on Unix systems and notepad on Windows.

"},{"location":"reference/cli/#arguments_23","title":"Arguments","text":"
  1. <EDITOR>: The editor to use. (optional)
    pixi global edit\npixi global edit code\npixi global edit vim\n
"},{"location":"reference/cli/#global-install","title":"global install","text":"

This command installs package(s) into their own environment and adds the binaries to PATH, allowing you to access them anywhere on your system without activating the environment.

"},{"location":"reference/cli/#arguments_24","title":"Arguments","text":"

  1. [PACKAGE]: The package(s) to install, this can also be a version constraint.

"},{"location":"reference/cli/#options_26","title":"Options","text":"
  • --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. (May be used more than once.)
  • --platform <PLATFORM> (-p): Specify a platform to install the package for. (default: current platform)
  • --environment <ENVIRONMENT> (-e): The environment to install the package into. (default: name of the tool)
  • --expose <EXPOSE>: A mapping from name to the binary to expose to the system. (default: name of the tool)
  • --with <WITH>: Add additional dependencies to the environment. Their executables will not be exposed.
pixi global install ruff\n# Multiple packages can be installed at once\npixi global install starship rattler-build\n# Specify the channel(s)\npixi global install --channel conda-forge --channel bioconda trackplot\n# Or in a more concise form\npixi global install -c conda-forge -c bioconda trackplot\n\n# Support full conda matchspec\npixi global install python=3.9.*\npixi global install \"python [version='3.11.0', build_number=1]\"\npixi global install \"python [version='3.11.0', build=he550d4f_1_cpython]\"\npixi global install python=3.11.0=h10a6764_1_cpython\n\n# Install for a specific platform, only useful on osx-arm64\npixi global install --platform osx-64 ruff\n\n# Install a package with all its executables exposed, together with additional packages that don't expose anything\npixi global install ipython --with numpy --with scipy\n\n# Install into a specific environment name and expose all executables\npixi global install --environment data-science ipython jupyterlab numpy matplotlib\n\n# Expose the binary under a different name\npixi global install --expose \"py39=python3.9\" \"python=3.9.*\"\n

Tip

Running osx-64 on Apple Silicon will install the Intel binary but run it using Rosetta.

pixi global install --platform osx-64 ruff\n

After using global install, you can use the package you installed anywhere on your system.

"},{"location":"reference/cli/#global-uninstall","title":"global uninstall","text":"

Uninstalls global environments. This removes the environment and all its dependencies, and also removes the related binaries from the system.

"},{"location":"reference/cli/#arguments_25","title":"Arguments","text":"
  1. [ENVIRONMENT]: The environments to uninstall.
pixi global uninstall my-env\npixi global uninstall pixi-pack rattler-build\n
"},{"location":"reference/cli/#global-remove","title":"global remove","text":"

Removes a package from a global environment.

"},{"location":"reference/cli/#arguments_26","title":"Arguments","text":"
  1. [PACKAGE]: The packages to remove.
"},{"location":"reference/cli/#options_27","title":"Options","text":"
  • --environment <ENVIRONMENT> (-e): The environment to remove the package from.
pixi global remove -e my-env package1 package2\n
"},{"location":"reference/cli/#global-list","title":"global list","text":"

This command shows the currently installed global environments, including the binaries that come with them. A globally installed package/environment can contain multiple exposed binaries, which are listed in the command output.

"},{"location":"reference/cli/#options_28","title":"Options","text":"
  • --environment <ENVIRONMENT> (-e): The environment to list.

We only show the dependencies and exposed binaries of an environment if they differ from the environment name. Here is an example with a few installed packages:

pixi global list\n
Results in:
Global environments at /home/user/.pixi:\n\u251c\u2500\u2500 gh: 2.57.0\n\u251c\u2500\u2500 pixi-pack: 0.1.8\n\u251c\u2500\u2500 python: 3.11.0\n\u2502   \u2514\u2500 exposes: 2to3, 2to3-3.11, idle3, idle3.11, pydoc, pydoc3, pydoc3.11, python, python3, python3-config, python3.1, python3.11, python3.11-config\n\u251c\u2500\u2500 rattler-build: 0.22.0\n\u251c\u2500\u2500 ripgrep: 14.1.0\n\u2502   \u2514\u2500 exposes: rg\n\u251c\u2500\u2500 vim: 9.1.0611\n\u2502   \u2514\u2500 exposes: ex, rview, rvim, view, vim, vimdiff, vimtutor, xxd\n\u2514\u2500\u2500 zoxide: 0.9.6\n

Here is an example listing of a single environment:

pixi g list -e pixi-pack\n
Results in:
The 'pixi-pack' environment has 8 packages:\nPackage          Version    Build        Size\n_libgcc_mutex    0.1        conda_forge  2.5 KiB\n_openmp_mutex    4.5        2_gnu        23.1 KiB\nca-certificates  2024.8.30  hbcca054_0   155.3 KiB\nlibgcc           14.1.0     h77fa898_1   826.5 KiB\nlibgcc-ng        14.1.0     h69a702a_1   50.9 KiB\nlibgomp          14.1.0     h77fa898_1   449.4 KiB\nopenssl          3.3.2      hb9d3cd8_0   2.8 MiB\npixi-pack        0.1.8      hc762bcd_0   4.3 MiB\nPackage          Version    Build        Size\n\nExposes:\npixi-pack\nChannels:\nconda-forge\nPlatform: linux-64\n

"},{"location":"reference/cli/#global-sync","title":"global sync","text":"

As the global manifest can be manually edited, this command will sync the global manifest with the current state of the global environment. You can modify the manifest in $HOME/manifests/pixi_global.toml.

pixi global sync\n
"},{"location":"reference/cli/#global-expose","title":"global expose","text":"

Modify the exposed binaries of a global environment.

"},{"location":"reference/cli/#global-expose-add","title":"global expose add","text":"

Add exposed binaries from an environment to your global environment.

"},{"location":"reference/cli/#arguments_27","title":"Arguments","text":"
  1. [MAPPING]: The binaries to expose (e.g. python), or a mapping to expose a binary under a different name (e.g. py310=python3.10). The mapping has the form exposed_name=binary_name, where the exposed name is the one you will be able to use in the terminal and the binary name is the name of the binary in the environment.
"},{"location":"reference/cli/#options_29","title":"Options","text":"
  • --environment <ENVIRONMENT> (-e): The environment to expose the binaries from.
pixi global expose add python --environment my-env\npixi global expose add py310=python3.10 --environment python\n
"},{"location":"reference/cli/#global-expose-remove","title":"global expose remove","text":"

Remove exposed binaries from the global environment.

"},{"location":"reference/cli/#arguments_28","title":"Arguments","text":"
  1. [EXPOSED_NAME]: The binaries to remove from the main global environment.
pixi global expose remove python\npixi global expose remove py310 python3\n
"},{"location":"reference/cli/#global-update","title":"global update","text":"

Update all environments, or specify environments to update to the latest version.

"},{"location":"reference/cli/#arguments_29","title":"Arguments","text":"
  1. [ENVIRONMENT]: The environment(s) to update.
pixi global update\npixi global update pixi-pack\npixi global update bat rattler-build\n
"},{"location":"reference/cli/#project","title":"project","text":"

This subcommand allows you to modify the project configuration through the command line interface.

"},{"location":"reference/cli/#options_30","title":"Options","text":"
  • --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default, pixi searches for one in the parent directories.
"},{"location":"reference/cli/#project-channel-add","title":"project channel add","text":"

Add channels to the channel list in the project configuration. When you add channels, the channels are tested for existence, added to the lock file and the environment is reinstalled.

"},{"location":"reference/cli/#arguments_30","title":"Arguments","text":"
  1. <CHANNEL>: The channels to add, name or URL.
"},{"location":"reference/cli/#options_31","title":"Options","text":"
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • --feature <FEATURE> (-f): The feature for which the channel is added.
  • --prepend: Prepend the channel to the list of channels.
pixi project channel add robostack\npixi project channel add bioconda conda-forge robostack\npixi project channel add file:///home/user/local_channel\npixi project channel add https://repo.prefix.dev/conda-forge\npixi project channel add --no-install robostack\npixi project channel add --feature cuda nvidia\npixi project channel add --prepend pytorch\n
"},{"location":"reference/cli/#project-channel-list","title":"project channel list","text":"

List the channels in the manifest file.

"},{"location":"reference/cli/#options_32","title":"Options","text":"
  • --urls: Show the URLs of the channels instead of the names.
$ pixi project channel list\nEnvironment: default\n- conda-forge\n\n$ pixi project channel list --urls\nEnvironment: default\n- https://conda.anaconda.org/conda-forge/\n
"},{"location":"reference/cli/#project-channel-remove","title":"project channel remove","text":"

Remove channels from the manifest file.

"},{"location":"reference/cli/#arguments_31","title":"Arguments","text":"
  1. <CHANNEL>...: The channels to remove, name(s) or URL(s).
"},{"location":"reference/cli/#options_33","title":"Options","text":"
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • --feature <FEATURE> (-f): The feature for which the channel is removed.
pixi project channel remove conda-forge\npixi project channel remove https://conda.anaconda.org/conda-forge/\npixi project channel remove --no-install conda-forge\npixi project channel remove --feature cuda nvidia\n
"},{"location":"reference/cli/#project-description-get","title":"project description get","text":"

Get the project description.

$ pixi project description get\nPackage management made easy!\n
"},{"location":"reference/cli/#project-description-set","title":"project description set","text":"

Set the project description.

"},{"location":"reference/cli/#arguments_32","title":"Arguments","text":"
  1. <DESCRIPTION>: The description to set.
pixi project description set \"my new description\"\n
"},{"location":"reference/cli/#project-environment-add","title":"project environment add","text":"

Add an environment to the manifest file.

"},{"location":"reference/cli/#arguments_33","title":"Arguments","text":"
  1. <NAME>: The name of the environment to add.
"},{"location":"reference/cli/#options_34","title":"Options","text":"
  • -f, --feature <FEATURES>: Features to add to the environment.
  • --solve-group <SOLVE_GROUP>: The solve-group to add the environment to.
  • --no-default-feature: Don't include the default feature in the environment.
  • --force: Update the manifest even if the environment already exists.
pixi project environment add env1 --feature feature1 --feature feature2\npixi project environment add env2 -f feature1 --solve-group test\npixi project environment add env3 -f feature1 --no-default-feature\npixi project environment add env3 -f feature1 --force\n
"},{"location":"reference/cli/#project-environment-remove","title":"project environment remove","text":"

Remove an environment from the manifest file.

"},{"location":"reference/cli/#arguments_34","title":"Arguments","text":"
  1. <NAME>: The name of the environment to remove.
pixi project environment remove env1\n
"},{"location":"reference/cli/#project-environment-list","title":"project environment list","text":"

List the environments in the manifest file.

pixi project environment list\n
"},{"location":"reference/cli/#project-export-conda-environment","title":"project export conda-environment","text":"

Exports a conda environment.yml file. The file can be used to create a conda environment using conda/mamba:

pixi project export conda-environment environment.yml\nmamba create --name <env> --file environment.yml\n
"},{"location":"reference/cli/#arguments_35","title":"Arguments","text":"
  1. <OUTPUT_PATH>: Optional path to render environment.yml to. Otherwise it will be printed to standard out.
"},{"location":"reference/cli/#options_35","title":"Options","text":"
  • --environment <ENVIRONMENT> (-e): Environment to render.
  • --platform <PLATFORM> (-p): The platform to render.
pixi project export conda-environment --environment lint\npixi project export conda-environment --platform linux-64 environment.linux-64.yml\n
"},{"location":"reference/cli/#project-export-conda-explicit-spec","title":"project export conda-explicit-spec","text":"

Render a platform-specific conda explicit specification file for an environment. The file can then be used to create a conda environment using conda/mamba:

mamba create --name <env> --file <explicit spec file>\n

As the explicit specification file format does not support pypi-dependencies, use the --ignore-pypi-errors option to ignore those dependencies.

"},{"location":"reference/cli/#arguments_36","title":"Arguments","text":"
  1. <OUTPUT_DIR>: Output directory for rendered explicit environment spec files.
"},{"location":"reference/cli/#options_36","title":"Options","text":"
  • --environment <ENVIRONMENT> (-e): Environment to render. Can be repeated for multiple envs. Defaults to all environments.
  • --platform <PLATFORM> (-p): The platform to render. Can be repeated for multiple platforms. Defaults to all platforms available for selected environments.
  • --ignore-pypi-errors: PyPI dependencies are not supported in the conda explicit spec file. This flag allows creating the spec file even if PyPI dependencies are present.
pixi project export conda-explicit-spec output\npixi project export conda-explicit-spec -e default -e test -p linux-64 output\n
"},{"location":"reference/cli/#project-name-get","title":"project name get","text":"

Get the project name.

$ pixi project name get\nmy project name\n
"},{"location":"reference/cli/#project-name-set","title":"project name set","text":"

Set the project name.

"},{"location":"reference/cli/#arguments_37","title":"Arguments","text":"
  1. <NAME>: The name to set.
pixi project name set \"my new project name\"\n
"},{"location":"reference/cli/#project-platform-add","title":"project platform add","text":"

Adds platform(s) to the manifest file and updates the lock file.

"},{"location":"reference/cli/#arguments_38","title":"Arguments","text":"
  1. <PLATFORM>...: The platforms to add.
"},{"location":"reference/cli/#options_37","title":"Options","text":"
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • --feature <FEATURE> (-f): The feature for which the platform will be added.
pixi project platform add win-64\npixi project platform add --feature test win-64\n
"},{"location":"reference/cli/#project-platform-list","title":"project platform list","text":"

List the platforms in the manifest file.

$ pixi project platform list\nosx-64\nlinux-64\nwin-64\nosx-arm64\n
"},{"location":"reference/cli/#project-platform-remove","title":"project platform remove","text":"

Removes platform(s) from the manifest file and updates the lock file.

"},{"location":"reference/cli/#arguments_39","title":"Arguments","text":"
  1. <PLATFORM>...: The platforms to remove.
"},{"location":"reference/cli/#options_38","title":"Options","text":"
  • --no-install: do not update the environment, only add changed packages to the lock-file.
  • --feature <FEATURE> (-f): The feature for which the platform will be removed.
pixi project platform remove win-64\npixi project platform remove --feature test win-64\n
"},{"location":"reference/cli/#project-version-get","title":"project version get","text":"

Get the project version.

$ pixi project version get\n0.11.0\n
"},{"location":"reference/cli/#project-version-set","title":"project version set","text":"

Set the project version.

"},{"location":"reference/cli/#arguments_40","title":"Arguments","text":"
  1. <VERSION>: The version to set.
pixi project version set \"0.13.0\"\n
"},{"location":"reference/cli/#project-version-majorminorpatch","title":"project version {major|minor|patch}","text":"

Bump the project version to {MAJOR|MINOR|PATCH}.

pixi project version major\npixi project version minor\npixi project version patch\n
  1. An up-to-date lock file means that the dependencies in the lock file are allowed by the dependencies in the manifest file. For example:

    • a manifest with python = \">= 3.11\" is up-to-date with a name: python, version: 3.11.0 in the pixi.lock.
    • a manifest with python = \">= 3.12\" is not up-to-date with a name: python, version: 3.11.0 in the pixi.lock.

    Being up-to-date does not mean that the lock file holds the latest version available on the channel for the given dependency.
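The distinction shows up in how --frozen and --locked behave (a sketch using pixi install; assumes pixi is on PATH and a project exists in the current directory):

```shell
# --locked: fail if pixi.lock no longer satisfies the manifest,
# otherwise install exactly what the lock file specifies.
pixi install --locked

# --frozen: install exactly what pixi.lock specifies, even if the
# manifest has since changed and the lock file is out of date.
pixi install --frozen
```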

"},{"location":"reference/pixi_configuration/","title":"The configuration of pixi itself","text":"

Apart from the project specific configuration pixi supports configuration options which are not required for the project to work but are local to the machine. The configuration is loaded in the following order:

LinuxmacOSWindows Priority Location Comments 7 Command line arguments (--tls-no-verify, --change-ps1=false, etc.) Configuration via command line arguments 6 your_project/.pixi/config.toml Project-specific configuration 5 $PIXI_HOME/config.toml Global configuration in PIXI_HOME. 4 $HOME/.pixi/config.toml Global configuration in the user home directory. 3 $XDG_CONFIG_HOME/pixi/config.toml XDG compliant user-specific configuration 2 $HOME/.config/pixi/config.toml User-specific configuration 1 /etc/pixi/config.toml System-wide configuration Priority Location Comments 6 Command line arguments (--tls-no-verify, --change-ps1=false, etc.) Configuration via command line arguments 5 your_project/.pixi/config.toml Project-specific configuration 4 $PIXI_HOME/config.toml Global configuration in PIXI_HOME. 3 $HOME/.pixi/config.toml Global configuration in the user home directory. 2 $HOME/Library/Application Support/pixi/config.toml User-specific configuration 1 /etc/pixi/config.toml System-wide configuration Priority Location Comments 6 Command line arguments (--tls-no-verify, --change-ps1=false, etc.) Configuration via command line arguments 5 your_project\\.pixi\\config.toml Project-specific configuration 4 %PIXI_HOME%\\config.toml Global configuration in PIXI_HOME. 3 %USERPROFILE%\\.pixi\\config.toml Global configuration in the user home directory. 2 %APPDATA%\\pixi\\config.toml User-specific configuration 1 C:\\ProgramData\\pixi\\config.toml System-wide configuration

Note

The highest priority wins. If a configuration file is found in a higher priority location, the values from the configuration read from lower priority locations are overwritten.
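As a hypothetical illustration of this layering (the channel values are placeholders; the paths and priorities come from the Linux table above):

```toml
# your_project/.pixi/config.toml (priority 6 on Linux, project-specific)
# This value overrides a `default-channels` set in, e.g.,
# $HOME/.pixi/config.toml (priority 4) or /etc/pixi/config.toml (priority 1).
default-channels = ["conda-forge", "bioconda"]
```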

Note

To find the locations where pixi looks for configuration files, run pixi with -vv.

"},{"location":"reference/pixi_configuration/#configuration-options","title":"Configuration options","text":"Casing In Configuration

In versions of pixi 0.20.1 and older, the global configuration used snake_case; we've changed to kebab-case for consistency with the rest of the configuration. We still support the old snake_case spelling for the following options:

  • default_channels
  • change_ps1
  • tls_no_verify
  • authentication_override_file
  • mirrors and sub-options
  • repodata-config and sub-options

The following reference describes all available configuration options.

"},{"location":"reference/pixi_configuration/#default-channels","title":"default-channels","text":"

The default channels to select when running pixi init or pixi global install. This defaults to only conda-forge. config.toml

default-channels = [\"conda-forge\"]\n

Note

The default-channels are only used when initializing a new project. Once initialized the channels are used from the project manifest.

"},{"location":"reference/pixi_configuration/#change-ps1","title":"change-ps1","text":"

When set to false, the (pixi) prefix in the shell prompt is removed. This applies to the pixi shell subcommand. You can override this from the CLI with --change-ps1.

config.toml
change-ps1 = true\n
"},{"location":"reference/pixi_configuration/#tls-no-verify","title":"tls-no-verify","text":"

When set to true, the TLS certificates are not verified.

Warning

This is a security risk and should only be used for testing purposes or internal networks.

You can override this from the CLI with --tls-no-verify.

config.toml
tls-no-verify = false\n
"},{"location":"reference/pixi_configuration/#authentication-override-file","title":"authentication-override-file","text":"

Override from where the authentication information is loaded. Usually, we try to use the keyring to load authentication data from, and only use a JSON file as a fallback. This option allows you to force the use of a JSON file. Read more in the authentication section. config.toml

authentication-override-file = \"/path/to/your/override.json\"\n

"},{"location":"reference/pixi_configuration/#detached-environments","title":"detached-environments","text":"

The directory where pixi stores the project environments, which would normally be placed in the .pixi/envs folder in a project's root. It doesn't affect the environments built for pixi global. The location of environments created for a pixi global installation can be controlled using the PIXI_HOME environment variable.

Warning

We recommend against using this because any environment created for a project is no longer placed in the same folder as the project. This creates a disconnect between the project and its environments and manual cleanup of the environments is required when deleting the project.

However, in some cases, this option can still be very useful, for instance to:

  • force the installation on a specific filesystem/drive.
  • install environments locally but keep the project on a network drive.
  • let a system-administrator have more control over all environments on a system.

This field can consist of two types of input.

  • A boolean value, true or false, which will enable or disable the feature respectively. (Not the quoted strings \"true\" or \"false\"; those are read as false.)
  • A string value, which will be the absolute path to the directory where the environments will be stored.

config.toml

detached-environments = true\n
or: config.toml
detached-environments = \"/opt/pixi/envs\"\n

The environments will be stored in the cache directory when this option is true. When you specify a custom path, the environments will be stored in that directory.

The resulting directory structure will look like this:

/opt/pixi/envs\n\u251c\u2500\u2500 pixi-6837172896226367631\n\u2502   \u2514\u2500\u2500 envs\n\u2514\u2500\u2500 NAME_OF_PROJECT-HASH_OF_ORIGINAL_PATH\n    \u251c\u2500\u2500 envs # the runnable environments\n    \u2514\u2500\u2500 solve-group-envs # If there are solve groups\n

"},{"location":"reference/pixi_configuration/#pinning-strategy","title":"pinning-strategy","text":"

The strategy to use for pinning dependencies when running pixi add. The default is semver, but you can set the following:

  • no-pin: No pinning, resulting in an unconstrained dependency. *
  • semver: Pinning to the latest version that satisfies the semver constraint, resulting in a pin to major for most versions and to minor for v0 versions.
  • exact-version: Pinning to the exact version, 1.2.3 -> ==1.2.3.
  • major: Pinning to the major version, 1.2.3 -> >=1.2.3, <2.
  • minor: Pinning to the minor version, 1.2.3 -> >=1.2.3, <1.3.
  • latest-up: Pinning to the latest version, 1.2.3 -> >=1.2.3.
config.toml
pinning-strategy = \"no-pin\"\n
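The bullet list above can be sketched as a small function (an illustrative model, not pixi's code; the constraint formatting follows the examples in the list):

```python
# Illustrative sketch of what each pinning-strategy produces
# for a freshly added dependency at a given version.
def pin(version: str, strategy: str) -> str:
    major, minor, _patch = (int(x) for x in version.split("."))
    if strategy == "no-pin":
        return "*"
    if strategy == "exact-version":
        return f"=={version}"
    if strategy == "major":
        return f">={version}, <{major + 1}"
    if strategy == "minor":
        return f">={version}, <{major}.{minor + 1}"
    if strategy == "latest-up":
        return f">={version}"
    if strategy == "semver":
        # pin to major normally, to minor for v0 versions
        return pin(version, "minor" if major == 0 else "major")
    raise ValueError(strategy)

print(pin("1.2.3", "semver"))  # >=1.2.3, <2
print(pin("0.2.3", "semver"))  # >=0.2.3, <0.3
```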
"},{"location":"reference/pixi_configuration/#mirrors","title":"mirrors","text":"

Configuration for conda channel-mirrors, more info below.

config.toml
[mirrors]\n# redirect all requests for conda-forge to the prefix.dev mirror\n\"https://conda.anaconda.org/conda-forge\" = [\"https://prefix.dev/conda-forge\"]\n\n# redirect all requests for bioconda to one of the three listed mirrors\n# Note: for repodata we try the first mirror first.\n\"https://conda.anaconda.org/bioconda\" = [\n  \"https://conda.anaconda.org/bioconda\",\n  # OCI registries are also supported\n  \"oci://ghcr.io/channel-mirrors/bioconda\",\n  \"https://prefix.dev/bioconda\",\n]\n
"},{"location":"reference/pixi_configuration/#repodata-config","title":"repodata-config","text":"

Configuration for repodata fetching. config.toml

[repodata-config]\n# disable fetching of jlap, bz2 or zstd repodata files.\n# This should only be used for specific old versions of artifactory and other non-compliant\n# servers.\ndisable-bzip2 = true   # don't try to download repodata.json.bz2\ndisable-jlap = true    # don't try to download repodata.jlap\ndisable-sharded = true # don't try to download sharded repodata\ndisable-zstd = true    # don't try to download repodata.json.zst\n

The above settings can be overridden on a per-channel basis by specifying a channel prefix in the configuration. config.toml

[repodata-config.\"https://prefix.dev\"]\ndisable-sharded = false\n

"},{"location":"reference/pixi_configuration/#pypi-config","title":"pypi-config","text":"

Use this table to set up defaults for the usage of PyPI registries. You can use the following configuration options:

  • index-url: The default index URL to use for PyPI packages. This will be added to a manifest file on a pixi init.
  • extra-index-urls: A list of additional URLs to use for PyPI packages. This will be added to a manifest file on a pixi init.
  • keyring-provider: Allows the use of the keyring python package to store and retrieve credentials.
  • allow-insecure-host: Allow insecure connections to host.
config.toml
[pypi-config]\n# Main index url\nindex-url = \"https://pypi.org/simple\"\n# list of additional urls\nextra-index-urls = [\"https://pypi.org/simple2\"]\n# can be \"subprocess\" or \"disabled\"\nkeyring-provider = \"subprocess\"\n# allow insecure connections to host\nallow-insecure-host = [\"localhost:8080\"]\n

index-url and extra-index-urls are not globals

Unlike pip, these settings, with the exception of keyring-provider, will only modify the pixi.toml/pyproject.toml file and are not globally interpreted when not present in the manifest. This is because we want to keep the manifest file as complete and reproducible as possible.

"},{"location":"reference/pixi_configuration/#concurrency","title":"concurrency","text":"

Configure multiple settings to limit or extend the concurrency of pixi. config.toml

[concurrency]\n# The maximum number of concurrent downloads\n# Defaults to 50 as that was found to be a good balance between speed and stability\ndownloads = 5\n\n# The maximum number of concurrent dependency resolves\n# Defaults to a heuristic based on the number of cores on the system\nsolves = 2\n
Set them through the CLI with:
pixi config set concurrency.solves 1\npixi config set concurrency.downloads 12\n

"},{"location":"reference/pixi_configuration/#experimental","title":"Experimental","text":"

This allows the user to set specific experimental features that are not yet stable.

Please open a GitHub issue and tag it with experimental if you find issues with the feature you activated.

"},{"location":"reference/pixi_configuration/#caching-environment-activations","title":"Caching environment activations","text":"

Turn this feature on from configuration with the following command:

# For all your projects\npixi config set experimental.use-environment-activation-cache true --global\n\n# For a specific project\npixi config set experimental.use-environment-activation-cache true --local\n

This will cache the environment activation in the .pixi/activation-env-v0 folder in the project root. It will create a JSON file for each environment that is activated, which will be used to activate the environment in the future.

> tree .pixi/activation-env-v0/\n.pixi/activation-env-v0/\n\u251c\u2500\u2500 activation_default.json\n\u2514\u2500\u2500 activation_lint.json\n\n> cat  .pixi/activation-env-v0/activation_lint.json\n{\"hash\":\"8d8344e0751d377a\",\"environment_variables\":{<ENVIRONMENT_VARIABLES_USED_IN_ACTIVATION>}}\n

  • The hash is a hash of the data on that environment in the pixi.lock, plus some important information on the environment activation. Like [activation.scripts] and [activation.env] from the manifest file.
  • The environment_variables are the environment variables that are set when activating the environment.

You can ignore the cache by running:

pixi run/shell/shell-hook --force-activate\n

Set the configuration with: config.toml

[experimental]\n# Enable the use of the environment activation cache\nuse-environment-activation-cache = true\n

Why is this experimental?

This feature is experimental because the cache invalidation is very tricky, and we don't want to disturb users that are not affected by activation times.

"},{"location":"reference/pixi_configuration/#mirror-configuration","title":"Mirror configuration","text":"

You can configure mirrors for conda channels. We expect that mirrors are exact copies of the original channel. The implementation will look for the mirror key (a URL) in the mirrors section of the configuration file and replace the original URL with the mirror URL.

To also include the original URL, you have to repeat it in the list of mirrors.

The mirrors are prioritized based on the order of the list. We attempt to fetch the repodata (the most important file) from the first mirror in the list. The repodata contains all the SHA256 hashes of the individual packages, so it is important to get this file from a trusted source.

You can also specify mirrors for an entire \"host\", e.g.

config.toml
[mirrors]\n\"https://conda.anaconda.org\" = [\"https://prefix.dev/\"]\n

This will forward all requests for channels on anaconda.org to prefix.dev. Channels that are not currently mirrored on prefix.dev will fail in the above example.

"},{"location":"reference/pixi_configuration/#oci-mirrors","title":"OCI Mirrors","text":"

You can also specify mirrors on an OCI registry. There is a public mirror on the GitHub container registry (ghcr.io) that is maintained by the conda-forge team. You can use it like this:

config.toml
[mirrors]\n\"https://conda.anaconda.org/conda-forge\" = [\n  \"oci://ghcr.io/channel-mirrors/conda-forge\",\n]\n

The GHCR mirror also contains bioconda packages. You can search the available packages on GitHub.

"},{"location":"reference/pixi_manifest/","title":"Manifest","text":"

The pixi.toml is the project manifest, also known as the pixi project configuration file.

A toml file is structured in different tables. This document will explain the usage of the different tables. For more technical documentation check pixi on docs.rs.

Tip

We also support the pyproject.toml file. It has the same structure as the pixi.toml file, except that you need to prepend the tables with tool.pixi instead of just the table name. For example, the [project] table becomes [tool.pixi.project]. There are also some small extras that are available in the pyproject.toml file; check out the pyproject.toml documentation for more information.

"},{"location":"reference/pixi_manifest/#manifest-discovery","title":"Manifest discovery","text":"

The manifest can be found at the following locations.

Priority Location Comments 6 --manifest-path Command-line argument 5 pixi.toml In your current working directory. 4 pyproject.toml In your current working directory. 3 pixi.toml or pyproject.toml Iterate through all parent directories. The first discovered manifest is used. 1 $PIXI_PROJECT_MANIFEST If $PIXI_IN_SHELL is set. This happens with pixi shell or pixi run.

Note

If multiple locations exist, the manifest with the highest priority will be used.
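The directory walk described in the table can be sketched as follows (a simplified model that ignores --manifest-path and $PIXI_PROJECT_MANIFEST; in each directory, pixi.toml is tried before pyproject.toml, per the priorities above):

```python
import os
import tempfile

# Simplified model of manifest discovery: check the current directory for
# pixi.toml, then pyproject.toml, then walk up through the parents.
def find_manifest(start: str):
    directory = os.path.abspath(start)
    while True:
        for name in ("pixi.toml", "pyproject.toml"):
            candidate = os.path.join(directory, name)
            if os.path.isfile(candidate):
                return candidate
        parent = os.path.dirname(directory)
        if parent == directory:  # reached the filesystem root
            return None
        directory = parent

# demo: a manifest two levels above the working directory is still found
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "src", "deep"))
    open(os.path.join(root, "pixi.toml"), "w").close()
    found = find_manifest(os.path.join(root, "src", "deep"))
    print(found.endswith("pixi.toml"))  # True
```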

"},{"location":"reference/pixi_manifest/#the-project-table","title":"The project table","text":"

The minimally required information in the project table is:

[project]\nchannels = [\"conda-forge\"]\nname = \"project-name\"\nplatforms = [\"linux-64\"]\n
"},{"location":"reference/pixi_manifest/#name","title":"name","text":"

The name of the project.

name = \"project-name\"\n
"},{"location":"reference/pixi_manifest/#channels","title":"channels","text":"

This is a list that defines the channels used to fetch the packages from. If you want to use channels hosted on anaconda.org you only need to use the name of the channel directly.

channels = [\"conda-forge\", \"robostack\", \"bioconda\", \"nvidia\", \"pytorch\"]\n

Channels situated on the file system are also supported with absolute file paths:

channels = [\"conda-forge\", \"file:///home/user/staged-recipes/build_artifacts\"]\n

To access private or public channels on prefix.dev or Quetz use the url including the hostname:

channels = [\"conda-forge\", \"https://repo.prefix.dev/channel-name\"]\n
"},{"location":"reference/pixi_manifest/#platforms","title":"platforms","text":"

Defines the list of platforms that the project supports. Pixi solves the dependencies for all these platforms and puts them in the lock file (pixi.lock).

platforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n

The available platforms are listed here: link

Special macOS behavior

macOS has two platforms: osx-64 for Intel Macs and osx-arm64 for Apple Silicon Macs. To support both, include both in your platforms list. As a fallback, if osx-arm64 can't resolve, use osx-64; running osx-64 on Apple Silicon uses Rosetta to execute the Intel binaries.

"},{"location":"reference/pixi_manifest/#version-optional","title":"version (optional)","text":"

The version of the project. This should be a valid version based on the conda Version Spec. See the version documentation, for an explanation of what is allowed in a Version Spec.

version = \"1.2.3\"\n
"},{"location":"reference/pixi_manifest/#authors-optional","title":"authors (optional)","text":"

This is a list of authors of the project.

authors = [\"John Doe <j.doe@prefix.dev>\", \"Marie Curie <mss1867@gmail.com>\"]\n
"},{"location":"reference/pixi_manifest/#description-optional","title":"description (optional)","text":"

This should contain a short description of the project.

description = \"A simple description\"\n
"},{"location":"reference/pixi_manifest/#license-optional","title":"license (optional)","text":"

The license as a valid SPDX string (e.g. MIT AND Apache-2.0)

license = \"MIT\"\n
"},{"location":"reference/pixi_manifest/#license-file-optional","title":"license-file (optional)","text":"

Relative path to the license file.

license-file = \"LICENSE.md\"\n
"},{"location":"reference/pixi_manifest/#readme-optional","title":"readme (optional)","text":"

Relative path to the README file.

readme = \"README.md\"\n
"},{"location":"reference/pixi_manifest/#homepage-optional","title":"homepage (optional)","text":"

URL of the project homepage.

homepage = \"https://pixi.sh\"\n
"},{"location":"reference/pixi_manifest/#repository-optional","title":"repository (optional)","text":"

URL of the project source repository.

repository = \"https://github.com/prefix-dev/pixi\"\n
"},{"location":"reference/pixi_manifest/#documentation-optional","title":"documentation (optional)","text":"

URL of the project documentation.

documentation = \"https://pixi.sh\"\n
"},{"location":"reference/pixi_manifest/#conda-pypi-map-optional","title":"conda-pypi-map (optional)","text":"

Mapping of channel name or URL to location of mapping that can be URL/Path. Mapping should be structured in json format where conda_name: pypi_package_name. Example:

local/robostack_mapping.json
{\n  \"jupyter-ros\": \"my-name-from-mapping\",\n  \"boltons\": \"boltons-pypi\"\n}\n

If conda-forge is not present in conda-pypi-map pixi will use prefix.dev mapping for it.

conda-pypi-map = { \"conda-forge\" = \"https://example.com/mapping\", \"https://repo.prefix.dev/robostack\" = \"local/robostack_mapping.json\"}\n
"},{"location":"reference/pixi_manifest/#channel-priority-optional","title":"channel-priority (optional)","text":"

This is the setting for the priority of the channels in the solver step.

Options:

  • strict: Default, The channels are used in the order they are defined in the channels list. Only packages from the first channel that has the package are used. This ensures that different variants for a single package are not mixed from different channels. Using packages from different incompatible channels like conda-forge and main can lead to hard to debug ABI incompatibilities.

    We strongly recommend not switching away from the default.

  • disabled: There is no priority; all package variants from all channels are grouped per package name and solved as one. Care should be taken when using this option. Since package variants can come from any channel in this mode, packages might not be compatible, which can cause hard to debug ABI incompatibilities.

    We strongly discourage using this option.

channel-priority = \"disabled\"\n

channel-priority = \"disabled\" is a security risk

Disabling channel priority may lead to unpredictable dependency resolutions. This is a possible security risk as it may lead to packages being installed from unexpected channels. It's advisable to maintain the default strict setting and order channels thoughtfully. If necessary, specify a channel directly for a dependency.

[project]\n# Putting conda-forge first solves most issues\nchannels = [\"conda-forge\", \"channel-name\"]\n[dependencies]\npackage = {version = \"*\", channel = \"channel-name\"}\n

"},{"location":"reference/pixi_manifest/#the-tasks-table","title":"The tasks table","text":"

Tasks are a way to automate certain custom commands in your project. For example, a lint or format step. Tasks in a pixi project are essentially cross-platform shell commands, with a unified syntax across platforms. For more in-depth information, check the Advanced tasks documentation. Pixi's tasks are run in a pixi environment using pixi run and are executed using the deno_task_shell.

[tasks]\nsimple = \"echo This is a simple task\"\ncmd = { cmd=\"echo Same as a simple task but now more verbose\"}\ndepending = { cmd=\"echo run after simple\", depends-on=\"simple\"}\nalias = { depends-on=[\"depending\"]}\ndownload = { cmd=\"curl -o file.txt https://example.com/file.txt\" , outputs=[\"file.txt\"]}\nbuild = { cmd=\"npm build\", cwd=\"frontend\", inputs=[\"frontend/package.json\", \"frontend/*.js\"]}\nrun = { cmd=\"python run.py $ARGUMENT\", env={ ARGUMENT=\"value\" }}\nformat = { cmd=\"black $INIT_CWD\" } # runs black where you run pixi run format\nclean-env = { cmd = \"python isolated.py\", clean-env = true} # Only on Unix!\n

You can modify this table using pixi task.

Note

Specify different tasks for different platforms using the target table

Info

If you want to hide a task from showing up with pixi task list or pixi info, you can prefix the name with _. For example, if you want to hide depending, you can rename it to _depending.

"},{"location":"reference/pixi_manifest/#the-system-requirements-table","title":"The system-requirements table","text":"

The system requirements are used to define minimal system specifications used during dependency resolution.

For example, we can define a unix system with a specific minimal libc version.

[system-requirements]\nlibc = \"2.28\"\n
or make the project depend on a specific version of cuda:
[system-requirements]\ncuda = \"12\"\n

The options are:

  • linux: The minimal version of the linux kernel.
  • libc: The minimal version of the libc library. Also allows specifying the family of the libc library. e.g. libc = { family=\"glibc\", version=\"2.28\" }
  • macos: The minimal version of the macOS operating system.
  • cuda: The minimal version of the CUDA library.

More information in the system requirements documentation.

"},{"location":"reference/pixi_manifest/#the-pypi-options-table","title":"The pypi-options table","text":"

The pypi-options table is used to define options that are specific to PyPI registries. These options can be specified either at the root level, which adds them to the default feature, or at the feature level, which creates a union of these options when the features are included in the environment.

The options that can be defined are:

  • index-url: replaces the main index url.
  • extra-index-urls: adds an extra index url.
  • find-links: similar to --find-links option in pip.
  • no-build-isolation: disables build isolation, can only be set per package.
  • index-strategy: allows for specifying the index strategy to use.

These options are explained in the sections below. Most of these options are taken directly, or with slight modifications, from the uv settings. If any option you need is missing, feel free to create an issue requesting it.

"},{"location":"reference/pixi_manifest/#alternative-registries","title":"Alternative registries","text":"

Strict Index Priority

Unlike pip, because we make use of uv, we have a strict index priority. This means that the first index on which a package can be found is used. The order is determined by the order in the toml file, where the extra-index-urls are preferred over the index-url. Read more about this in the uv docs.

Often you might want to use an alternative or extra index for your project. This can be done by adding the pypi-options table to your pixi.toml file. The following options are available:

  • index-url: replaces the main index url. If this is not set the default index used is https://pypi.org/simple. Only one index-url can be defined per environment.
  • extra-index-urls: adds an extra index url. The urls are used in the order they are defined. And are preferred over the index-url. These are merged across features into an environment.
  • find-links: which can either be a path {path = './links'} or a url {url = 'https://example.com/links'}. This is similar to the --find-links option in pip. These are merged across features into an environment.

An example:

[pypi-options]\nindex-url = \"https://pypi.org/simple\"\nextra-index-urls = [\"https://example.com/simple\"]\nfind-links = [{path = './links'}]\n

There are some examples in the pixi repository that make use of this feature.

Authentication Methods

To read about existing authentication methods for private registries, please check the PyPI Authentication section.

"},{"location":"reference/pixi_manifest/#no-build-isolation","title":"No Build Isolation","text":"

Even though build isolation is a good default, you can choose not to isolate the build for a certain package name; this allows the build to access the pixi environment. This is convenient if you want to use torch or something similar in your build process.

[dependencies]\npytorch = \"2.4.0\"\n\n[pypi-options]\nno-build-isolation = [\"detectron2\"]\n\n[pypi-dependencies]\ndetectron2 = { git = \"https://github.com/facebookresearch/detectron2.git\", rev = \"5b72c27ae39f99db75d43f18fd1312e1ea934e60\"}\n

Conda dependencies define the build environment

To use no-build-isolation effectively, use conda dependencies to define the build environment. These are installed before the PyPI dependencies are resolved, so that they are available during the build process. In the example above, adding torch as a PyPI dependency would be ineffective, as it would not yet be installed during the PyPI resolution phase.

"},{"location":"reference/pixi_manifest/#index-strategy","title":"Index Strategy","text":"

The strategy to use when resolving against multiple index URLs. Description modified from the uv documentation:

By default, uv, and thus pixi, will stop at the first index on which a given package is available and limit resolutions to those present on that first index (first-match). This prevents dependency confusion attacks, whereby an attacker can upload a malicious package under the same name to a secondary index.

One index strategy per environment

Only one index-strategy can be defined per environment or solve-group; otherwise, an error will be shown.

"},{"location":"reference/pixi_manifest/#possible-values","title":"Possible values:","text":"
  • \"first-index\": Only use results from the first index that returns a match for a given package name
  • \"unsafe-first-match\": Search for every package name across all indexes, exhausting the versions from the first index before moving on to the next. Meaning if the package a is available on index x and y, it will prefer the version from x unless you've requested a package version that is only available on y.
  • \"unsafe-best-match\": Search for every package name across all indexes, preferring the best version found. If a package version is in multiple indexes, only look at the entry for the first index. So given index, x and y that both contain package a, it will take the best version from either x or y, but should that version be available on both indexes it will prefer x.

PyPI only

The index-strategy only changes PyPI package resolution and not conda package resolution.

"},{"location":"reference/pixi_manifest/#the-dependencies-tables","title":"The dependencies table(s)","text":"

This section defines what dependencies you would like to use for your project.

There are multiple dependencies tables. The default is [dependencies], which are dependencies that are shared across platforms.

Dependencies are defined using a VersionSpec. A VersionSpec combines a Version with an optional operator.

Some examples are:

# Use this exact package version\npackage0 = \"1.2.3\"\n# Use 1.2.3 up to 1.3.0\npackage1 = \"~=1.2.3\"\n# Use larger than 1.2 lower and equal to 1.4\npackage2 = \">1.2,<=1.4\"\n# Bigger or equal than 1.2.3 or lower not including 1.0.0\npackage3 = \">=1.2.3|<1.0.0\"\n

Dependencies can also be defined as a mapping where it is using a matchspec:

package0 = { version = \">=1.2.3\", channel=\"conda-forge\" }\npackage1 = { version = \">=1.2.3\", build=\"py34_0\" }\n

Tip

The dependencies can be easily added using the pixi add command line. Running add for an existing dependency will replace it with the newest version it can use.

Note

To specify different dependencies for different platforms use the target table

"},{"location":"reference/pixi_manifest/#dependencies","title":"dependencies","text":"

Add any conda package dependency that you want to install into the environment. Don't forget to add the channel to the project table should you use anything different than conda-forge. Even if the dependency defines a channel, that channel should be added to the project.channels list.

[dependencies]\npython = \">3.9,<=3.11\"\nrust = \"1.72\"\npytorch-cpu = { version = \"~=1.1\", channel = \"pytorch\" }\n
"},{"location":"reference/pixi_manifest/#pypi-dependencies","title":"pypi-dependencies","text":"Details regarding the PyPI integration

We use uv, which is a new fast pip replacement written in Rust.

We integrate uv as a library, so we use the uv resolver, to which we pass the conda packages as 'locked'. This disallows uv from installing these dependencies itself and ensures it uses the exact versions of these packages in the resolution. This is unique amongst conda-based package managers, which usually just call pip from a subprocess.

The uv resolution is included in the lock file directly.

Pixi directly supports depending on PyPI packages; the PyPA calls a distributed package a 'distribution'. There are source and binary distributions, both of which are supported by pixi. These distributions are installed into the environment after the conda environment has been resolved and installed. PyPI packages are not indexed on prefix.dev but can be viewed on pypi.org.

Important considerations

  • Stability: PyPI packages might be less stable than their conda counterparts. Prefer using conda packages in the dependencies table where possible.
"},{"location":"reference/pixi_manifest/#version-specification","title":"Version specification:","text":"

These dependencies don't follow the conda matchspec specification. The version is a string specification of the version according to PEP 440/PyPA. Additionally, a list of extras can be included, which are essentially optional dependencies. Note that this version is distinct from the conda MatchSpec type. See the example below to see how this is used in practice:

[dependencies]\n# When using pypi-dependencies, python is needed to resolve pypi dependencies\n# make sure to include this\npython = \">=3.6\"\n\n[pypi-dependencies]\nfastapi = \"*\"  # This means any version (the wildcard `*` is a pixi addition, not part of the specification)\npre-commit = \"~=3.5.0\" # This is a single version specifier\n# Using the toml map allows the user to add `extras`\npandas = { version = \">=1.0.0\", extras = [\"dataframe\", \"sql\"]}\n\n# git dependencies\n# With ssh\nflask = { git = \"ssh://git@github.com/pallets/flask\" }\n# With https and a specific revision\nrequests = { git = \"https://github.com/psf/requests.git\", rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\" }\n# TODO: will support later -> branch = '' or tag = '' to specify a branch or tag\n\n# You can also directly add a source dependency from a path, tip: keep this relative to the root of the project.\nminimal-project = { path = \"./minimal-project\", editable = true}\n\n# You can also use a direct url, to either a `.tar.gz` or `.zip`, or a `.whl` file\nclick = { url = \"https://github.com/pallets/click/releases/download/8.1.7/click-8.1.7-py3-none-any.whl\" }\n\n# You can also just use the default git repo, it will check out the default branch\npytest = { git = \"https://github.com/pytest-dev/pytest.git\"}\n
"},{"location":"reference/pixi_manifest/#full-specification","title":"Full specification","text":"

The full specification of PyPI dependencies that pixi supports can be split into the following fields:

"},{"location":"reference/pixi_manifest/#extras","title":"extras","text":"

A list of extras to install with the package, e.g. [\"dataframe\", \"sql\"]. The extras field works with all other version specifiers, as it is an addition to the version specifier.

pandas = { version = \">=1.0.0\", extras = [\"dataframe\", \"sql\"]}\npytest = { git = \"URL\", extras = [\"dev\"]}\nblack = { url = \"URL\", extras = [\"cli\"]}\nminimal-project = { path = \"./minimal-project\", editable = true, extras = [\"dev\"]}\n
"},{"location":"reference/pixi_manifest/#version","title":"version","text":"

The version of the package to install, e.g. \">=1.0.0\", or * which stands for any version (the wildcard is pixi-specific). version is the default field, so omitting the inline table ({}) will default to this field.

py-rattler = \"*\"\nruff = \"~=1.0.0\"\npytest = {version = \"*\", extras = [\"dev\"]}\n
"},{"location":"reference/pixi_manifest/#index","title":"index","text":"

The index parameter allows you to specify the URL of a custom package index for the installation of a specific package. This feature is useful when you want to ensure that a package is retrieved from a particular source, rather than from the default index.

For example, to use an index other than the official Python Package Index (PyPI) at https://pypi.org/simple, you can use the index parameter:

torch = { version = \"*\", index = \"https://download.pytorch.org/whl/cu118\" }\n

This is useful for PyTorch specifically, as the registries are pinned to different CUDA versions.

"},{"location":"reference/pixi_manifest/#git","title":"git","text":"

A git repository to install from. This supports both https:// and ssh:// URLs.

Use git in combination with rev or subdirectory:

  • rev: A specific revision to install, e.g. rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\"
  • subdirectory: A subdirectory to install from. subdirectory = \"src\" or subdirectory = \"src/packagex\"
# Note don't forget the `ssh://` or `https://` prefix!\npytest = { git = \"https://github.com/pytest-dev/pytest.git\"}\nrequests = { git = \"https://github.com/psf/requests.git\", rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\" }\npy-rattler = { git = \"ssh://git@github.com/mamba-org/rattler.git\", subdirectory = \"py-rattler\" }\n
"},{"location":"reference/pixi_manifest/#path","title":"path","text":"

A local path to install from, e.g. path = \"./path/to/package\". We advise keeping your path dependencies inside the project and using relative paths.

Set editable to true to install in editable mode (e.g. editable = true). This is highly recommended, as the package is hard to reinstall if you're not using editable mode.

minimal-project = { path = \"./minimal-project\", editable = true}\n
"},{"location":"reference/pixi_manifest/#url","title":"url","text":"

A URL to install a wheel or sdist directly.

pandas = {url = \"https://files.pythonhosted.org/packages/3d/59/2afa81b9fb300c90531803c0fd43ff4548074fa3e8d0f747ef63b3b5e77a/pandas-2.2.1.tar.gz\"}\n
Did you know you can use: add --pypi?

Use the --pypi flag with the add command to quickly add PyPI packages from the CLI, e.g. pixi add --pypi flask.

This does not support all the features of the pypi-dependencies table yet.

"},{"location":"reference/pixi_manifest/#source-dependencies-sdist","title":"Source dependencies (sdist)","text":"

The Source Distribution Format (sdist for short) is a source-based format that a package can publish alongside the binary wheel format. Because these distributions need to be built, they need a Python executable to do so. This is why python needs to be present in the conda environment. Sdists often depend on system packages to be built, especially when compiling C/C++-based Python bindings. Think, for example, of Python SDL2 bindings depending on the C library SDL2. To help build these dependencies, we activate the conda environment that includes these PyPI dependencies before resolving. This way, when a source distribution depends on gcc, for example, it is used from the conda environment instead of the system.
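To illustrate, a manifest can supply the build tools an sdist needs from conda. The following is a minimal sketch; the specific package names (gcc, sdl2, pysdl2) are illustrative assumptions, not taken from this page:

```toml
# Sketch: provide sdist build requirements via conda dependencies.
[dependencies]
python = ">=3.9"   # needed to build and resolve PyPI packages
gcc = "*"          # compiler picked up from the activated conda environment
sdl2 = "*"         # C library the bindings below would link against

[pypi-dependencies]
pysdl2 = "*"       # built against the activated conda environment above
```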

"},{"location":"reference/pixi_manifest/#host-dependencies","title":"host-dependencies","text":"

This table contains dependencies that are needed to build your project but which should not be included when your project is installed as part of another project. In other words, these dependencies are available during the build but are no longer available when your project is installed. Dependencies listed in this table are installed for the architecture of the target machine.

[host-dependencies]\npython = \"~=3.10.3\"\n

Typical examples of host dependencies are:

  • Base interpreters: a Python package would list python here and an R package would list mro-base or r-base.
  • Libraries your project links against during compilation like openssl, rapidjson, or xtensor.
"},{"location":"reference/pixi_manifest/#build-dependencies","title":"build-dependencies","text":"

This table contains dependencies that are needed to build the project. Different from dependencies and host-dependencies, these packages are installed for the architecture of the build machine. This enables cross-compiling from one machine architecture to another.

[build-dependencies]\ncmake = \"~=3.24\"\n

Typical examples of build dependencies are:

  • Compilers are invoked on the build machine, but they generate code for the target machine. If the project is cross-compiled, the architecture of the build and target machine might differ.
  • cmake is invoked on the build machine to generate additional code or project files, which are then included in the compilation process.

Info

The build target refers to the machine that will execute the build. Programs and libraries installed by these dependencies will be executed on the build machine.

For example, if you compile on a MacBook with an Apple Silicon chip but target Linux x86_64 then your build platform is osx-arm64 and your host platform is linux-64.
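The example above can be sketched as a manifest; this is a hypothetical combination of the two tables shown earlier, assuming an osx-arm64 build machine targeting linux-64:

```toml
# Sketch: cross-compiling from osx-arm64 (build) to linux-64 (host/target).
[build-dependencies]
cmake = "~=3.24"   # runs on the build machine (osx-arm64)

[host-dependencies]
openssl = "*"      # installed for the target architecture (linux-64)
```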

"},{"location":"reference/pixi_manifest/#the-activation-table","title":"The activation table","text":"

The activation table is used for specialized activation operations that need to be run when the environment is activated.

There are two types of activation operations a user can modify in the manifest:

  • scripts: A list of scripts that are run when the environment is activated.
  • env: A mapping of environment variables that are set when the environment is activated.

These activation operations will be run before the pixi run and pixi shell commands.

Note

The activation operations are run by the system shell interpreter, as they run before an environment is available. This means that they run with cmd.exe on Windows and bash on Linux and macOS (Unix). Only .sh, .bash and .bat files are supported.

The environment variables are set in the shell that is running the activation script, so take note when using e.g. $ or %.

If you have scripts or environment variables per platform, use the target table.

[activation]\nscripts = [\"env_setup.sh\"]\nenv = { ENV_VAR = \"value\" }\n\n# To support windows platforms as well add the following\n[target.win-64.activation]\nscripts = [\"env_setup.bat\"]\n\n[target.linux-64.activation.env]\nENV_VAR = \"linux-value\"\n\n# You can also reference existing environment variables, but this has\n# to be done separately for unix-like operating systems and Windows\n[target.unix.activation.env]\nENV_VAR = \"$OTHER_ENV_VAR/unix-value\"\n\n[target.win.activation.env]\nENV_VAR = \"%OTHER_ENV_VAR%\\\\windows-value\"\n
"},{"location":"reference/pixi_manifest/#the-target-table","title":"The target table","text":"

The target table allows for platform-specific configuration, letting you define different sets of tasks or dependencies per platform.

The target table is currently implemented for the following sub-tables:

  • activation
  • dependencies
  • tasks

The target table is defined using [target.PLATFORM.SUB-TABLE], e.g. [target.linux-64.dependencies].

The platform can be any of:

  • win, osx, linux or unix (unix matches linux and osx)
  • or any of the (more) specific target platforms, e.g. linux-64, osx-arm64

The sub-table can be any of those specified above.

To make this a bit clearer, let's look at the example below. Currently, pixi combines the top-level tables like dependencies with the target-specific ones into a single set, which, in the case of dependencies, can add or overwrite dependencies. In the example below, cmake is used for all targets, but on osx-64 or osx-arm64 a different version of python will be selected.

[dependencies]\ncmake = \"3.26.4\"\npython = \"3.10\"\n\n[target.osx.dependencies]\npython = \"3.11\"\n

Here are some more examples:

[target.win-64.activation]\nscripts = [\"setup.bat\"]\n\n[target.win-64.dependencies]\nmsmpi = \"~=10.1.1\"\n\n[target.win-64.build-dependencies]\nvs2022_win-64 = \"19.36.32532\"\n\n[target.win-64.tasks]\ntmp = \"echo $TEMP\"\n\n[target.osx-64.dependencies]\nclang = \">=16.0.6\"\n
"},{"location":"reference/pixi_manifest/#the-feature-and-environments-tables","title":"The feature and environments tables","text":"

The feature table allows you to define features that can be used to create different [environments]. The [environments] table allows you to define different environments. The design is explained in the this design document.

Simplest example
[feature.test.dependencies]\npytest = \"*\"\n\n[environments]\ntest = [\"test\"]\n

This will create an environment called test that has pytest installed.

"},{"location":"reference/pixi_manifest/#the-feature-table","title":"The feature table","text":"

The feature table allows you to define the following fields per feature.

  • dependencies: Same as the dependencies.
  • pypi-dependencies: Same as the pypi-dependencies.
  • pypi-options: Same as the pypi-options.
  • system-requirements: Same as the system-requirements.
  • activation: Same as the activation.
  • platforms: Same as the platforms. Unless overridden, the platforms of the feature will be those defined at project level.
  • channels: Same as the channels. Unless overridden, the channels of the feature will be those defined at project level.
  • channel-priority: Same as the channel-priority.
  • target: Same as the target.
  • tasks: Same as the tasks.

These tables are all also available without the feature prefix; when used that way, they define what we call the default feature. This is a protected name that you cannot use for your own feature.

Cuda feature table example
[feature.cuda]\nactivation = {scripts = [\"cuda_activation.sh\"]}\n# Results in:  [\"nvidia\", \"conda-forge\"] when the default is `conda-forge`\nchannels = [\"nvidia\"]\ndependencies = {cuda = \"x.y.z\", cudnn = \"12.0\"}\npypi-dependencies = {torch = \"==1.9.0\"}\nplatforms = [\"linux-64\", \"osx-arm64\"]\nsystem-requirements = {cuda = \"12\"}\ntasks = { warmup = \"python warmup.py\" }\ntarget.osx-arm64 = {dependencies = {mlx = \"x.y.z\"}}\n
Cuda feature table example but written as separate tables
[feature.cuda.activation]\nscripts = [\"cuda_activation.sh\"]\n\n[feature.cuda.dependencies]\ncuda = \"x.y.z\"\ncudnn = \"12.0\"\n\n[feature.cuda.pypi-dependencies]\ntorch = \"==1.9.0\"\n\n[feature.cuda.system-requirements]\ncuda = \"12\"\n\n[feature.cuda.tasks]\nwarmup = \"python warmup.py\"\n\n[feature.cuda.target.osx-arm64.dependencies]\nmlx = \"x.y.z\"\n\n# Channels and Platforms are not available as separate tables as they are implemented as lists\n[feature.cuda]\nchannels = [\"nvidia\"]\nplatforms = [\"linux-64\", \"osx-arm64\"]\n
"},{"location":"reference/pixi_manifest/#the-environments-table","title":"The environments table","text":"

The [environments] table allows you to define environments that are created using the features defined in the [feature] tables.

The environments table is defined using the following fields:

  • features: The features that are included in the environment. Unless no-default-feature is set to true, the default feature is implicitly included in the environment.
  • solve-group: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional ones, for instance when testing a production environment with additional test dependencies. These dependencies will then be the same version in all environments that share a solve group, but each environment contains a different subset of the solve group's dependency set.
  • no-default-feature: Whether to include the default feature in that environment. The default is false, to include the default feature.

Full environments table specification

[environments]\ntest = {features = [\"test\"], solve-group = \"test\"}\nprod = {features = [\"prod\"], solve-group = \"test\"}\nlint = {features = [\"lint\"], no-default-feature = true}\n
As shown in the example above, in the simplest of cases, it is possible to define an environment only by listing its features:

Simplest example
[environments]\ntest = [\"test\"]\n

is equivalent to

Simplest example expanded
[environments]\ntest = {features = [\"test\"]}\n

When an environment comprises several features (including the default feature):

  • The activation and tasks of the environment are the union of the activation and tasks of all its features.
  • The dependencies and pypi-dependencies of the environment are the union of the dependencies and pypi-dependencies of all its features. This means that if several features define a requirement for the same package, both requirements will be combined. Beware of conflicting requirements across features added to the same environment.
  • The system-requirements of the environment are the union of the system-requirements of all its features. If multiple features specify a requirement for the same system package, the highest version is chosen.
  • The channels of the environment are the union of the channels of all its features. Channel priorities can be specified in each feature, to ensure channels are considered in the right order in the environment.
  • The platforms of the environment are the intersection of the platforms of all its features. Be aware that the platforms supported by a feature (including the default feature) will be considered to be the platforms defined at project level (unless overridden in the feature). This means that it is usually a good idea to set the project platforms to all the platforms it can support across its environments.
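The union and intersection rules above can be illustrated with a small manifest. This is a hedged sketch (feature names and versions illustrative), not taken from this page:

```toml
[project]
platforms = ["linux-64", "osx-arm64", "win-64"]

[dependencies]
numpy = "*"                 # part of the default feature

[feature.lint.dependencies]
ruff = "*"

[feature.cuda]
platforms = ["linux-64", "win-64"]   # this feature excludes osx-arm64

[environments]
# dependencies: union  -> numpy + ruff
# platforms: intersection -> linux-64, win-64
dev = ["lint", "cuda"]
```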
"},{"location":"reference/pixi_manifest/#preview-features","title":"Preview features","text":"

Pixi sometimes introduces new features that are not yet stable, but that we would like for users to test out. These features are called preview features. Preview features are disabled by default and can be enabled by setting the preview field in the project manifest. The preview field is an array of strings that specify the preview features to enable, or the boolean value true to enable all preview features.

An example of a preview feature in the project manifest:

Example preview features in the project manifest
[project]\nname = \"foo\"\nchannels = []\nplatforms = []\npreview = [\"new-resolve\"]\n

Preview features in the documentation will be marked as such on the relevant pages.

"},{"location":"reference/pixi_manifest/#global-configuration","title":"Global configuration","text":"

The global configuration options are documented in the global configuration section.

"},{"location":"switching_from/conda/","title":"Transitioning from the conda or mamba to pixi","text":"

Welcome to the guide designed to ease your transition from conda or mamba to pixi. This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. With pixi, you'll experience a project-based workflow, enhancing your development process, and allowing for easy sharing of your work.

"},{"location":"switching_from/conda/#why-pixi","title":"Why Pixi?","text":"

Pixi builds upon the foundation of the conda ecosystem, introducing a project-centric approach rather than focusing solely on environments. This shift towards projects offers a more organized and efficient way to manage dependencies and run code, tailored to modern development practices.

"},{"location":"switching_from/conda/#key-differences-at-a-glance","title":"Key Differences at a Glance","text":"
| Task | Conda/Mamba | Pixi |
|---|---|---|
| Installation | Requires an installer | Download and add to path (See installation) |
| Creating an Environment | conda create -n myenv -c conda-forge python=3.8 | pixi init myenv followed by pixi add python=3.8 |
| Activating an Environment | conda activate myenv | pixi shell within the project directory |
| Deactivating an Environment | conda deactivate | exit from the pixi shell |
| Running a Task | conda run -n myenv python my_program.py | pixi run python my_program.py (See run) |
| Installing a Package | conda install numpy | pixi add numpy |
| Uninstalling a Package | conda remove numpy | pixi remove numpy |

No base environment

Conda has a base environment, which is the default environment when you start a new shell. Pixi does not have a base environment and requires you to install the tools you need in a project or globally. Using pixi global install bat will install bat in a global environment, which is not the same as the base environment in conda.

Activating pixi environment in the current shell

For some advanced use-cases, you can activate the environment in the current shell. This uses pixi shell-hook, which prints the activation script that can be used to activate the environment in the current shell without invoking pixi itself.

~/myenv > eval \"$(pixi shell-hook)\"\n

"},{"location":"switching_from/conda/#environment-vs-project","title":"Environment vs Project","text":"

Conda and mamba focus on managing environments, while pixi emphasizes projects. In pixi, a project is a folder containing a manifest file (pixi.toml/pyproject.toml) that describes the project, a pixi.lock lock file that describes the exact dependencies, and a .pixi folder that contains the environment.

This project-centric approach allows for easy sharing and collaboration, as the project folder contains all the necessary information to recreate the environment. It manages more than one environment for more than one platform in a single project, and allows for easy switching between them. (See multiple environments)
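As a concrete illustration of such a project folder, a minimal pixi.toml manifest might look like the following (names and versions are illustrative, not from this page):

```toml
# A minimal pixi.toml sketch for a project folder
[project]
name = "my-project"
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64"]

[dependencies]
python = "3.12.*"

[tasks]
start = "python main.py"
```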

"},{"location":"switching_from/conda/#global-environments","title":"Global environments","text":"

conda installs all environments in one global location. When this is important to you for filesystem reasons, you can use the detached-environments feature of pixi.

pixi config set detached-environments true\n# or a specific location\npixi config set detached-environments /path/to/envs\n
This doesn't allow you to activate the environments using pixi shell -n, but it will make the environments install into the same folder.

pixi does have the pixi global command to install tools on your machine. (See global) This is not a replacement for conda but works the same as pipx and condax. It creates a single isolated environment for the given requirement and installs the binaries into the global path.

pixi global install bat\nbat pixi.toml\n

Never install pip with pixi global

Installations with pixi global get their own isolated environment. Installing pip with pixi global will create a new isolated environment with its own pip binary. Using that pip binary will install packages into the pip environment, making them unreachable from anywhere, as you can't activate it.

"},{"location":"switching_from/conda/#automated-switching","title":"Automated switching","text":"

With pixi you can import environment.yml files into a pixi project. (See import)

pixi init --import environment.yml\n
This will create a new project with the dependencies from the environment.yml file.

Exporting your environment

If you are working with conda users or systems, you can export your environment to an environment.yml file to share it.

pixi project export conda-environment\n
Additionally you can export a conda explicit specification.

"},{"location":"switching_from/conda/#troubleshooting","title":"Troubleshooting","text":"

Encountering issues? Here are solutions to some common problems you may run into when you're used to the conda workflow:

  • Dependency is excluded because due to strict channel priority not using this option from: 'https://conda.anaconda.org/conda-forge/' This error occurs when the package is in multiple channels. pixi uses a strict channel priority. See channel priority for more information.
  • pixi global install pip, pip doesn't work. pip is installed in the global isolated environment. Use pixi add pip in a project to install pip in the project environment and use that project.
  • pixi global install <Any Library> -> import <Any Library> -> ModuleNotFoundError: No module named '<Any Library>' The library is installed in the global isolated environment. Use pixi add <Any Library> in a project to install the library in the project environment and use that project.
"},{"location":"switching_from/poetry/","title":"Transitioning from poetry to pixi","text":"

Welcome to the guide designed to ease your transition from poetry to pixi. This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. With pixi, you'll experience a project-based workflow similar to poetry while including the conda ecosystem and allowing for easy sharing of your work.

"},{"location":"switching_from/poetry/#why-pixi","title":"Why Pixi?","text":"

Poetry is most likely the closest tool to pixi in terms of project management in the Python ecosystem. On top of the PyPI ecosystem, pixi adds the power of the conda ecosystem, allowing for a more flexible and powerful environment management.

"},{"location":"switching_from/poetry/#quick-look-at-the-differences","title":"Quick look at the differences","text":"
| Task | Poetry | Pixi |
|---|---|---|
| Creating an Environment | poetry new myenv | pixi init myenv |
| Running a Task | poetry run which python | pixi run which python (pixi uses a built-in cross-platform shell for run, where poetry uses your shell) |
| Installing a Package | poetry add numpy | pixi add numpy adds the conda variant; pixi add --pypi numpy adds the PyPI variant |
| Uninstalling a Package | poetry remove numpy | pixi remove numpy removes the conda variant; pixi remove --pypi numpy removes the PyPI variant |
| Building a package | poetry build | We've yet to implement package building and publishing |
| Publishing a package | poetry publish | We've yet to implement package building and publishing |
| Reading the pyproject.toml | [tool.poetry] | [tool.pixi] |
| Defining dependencies | [tool.poetry.dependencies] | [tool.pixi.dependencies] for conda; [tool.pixi.pypi-dependencies] or [project.dependencies] for PyPI dependencies |
| Dependency definition | numpy = "^1.2.3", numpy = "~1.2.3", numpy = "*" | numpy = ">=1.2.3 <2.0.0", numpy = ">=1.2.3 <1.3.0", numpy = "*" |
| Lock file | poetry.lock | pixi.lock |
| Environment directory | ~/.cache/pypoetry/virtualenvs/myenv | ./.pixi (defaults to the project folder; move this using detached-environments) |
"},{"location":"switching_from/poetry/#support-both-poetry-and-pixi-in-my-project","title":"Support both poetry and pixi in my project","text":"

You can allow users to use poetry and pixi in the same project; they will not touch each other's parts of the configuration or system. It's best to duplicate the dependencies, basically making an exact copy of tool.poetry.dependencies into tool.pixi.pypi-dependencies. Make sure that python is only defined in tool.pixi.dependencies and not in tool.pixi.pypi-dependencies.
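The duplication described above might look like the following pyproject.toml fragment. This is a sketch under the stated rules; package names and version ranges are illustrative assumptions:

```toml
# Sketch: duplicated dependency lists so poetry and pixi coexist.
[tool.poetry.dependencies]
python = "^3.11"
numpy = "^1.26"

[tool.pixi.dependencies]
# python comes from conda here, *not* from pypi-dependencies
python = ">=3.11"

[tool.pixi.pypi-dependencies]
numpy = ">=1.26,<2"
```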

Mixing pixi and poetry

It's possible to use poetry in pixi environments, but we advise against it. Pixi supports PyPI dependencies in a different way than poetry does, and mixing them can lead to unexpected behavior. Since you can only use one package manager at a time, it's best to stick to one.

If using poetry on top of a pixi project, you'll always need to install the poetry environment after the pixi environment, and let pixi handle the python and poetry installation.

"},{"location":"tutorials/python/","title":"Tutorial: Doing Python development with Pixi","text":"

In this tutorial, we will show you how to create a simple Python project with pixi. We will show some of the features that pixi provides that are currently not part of tools like pdm or poetry.

"},{"location":"tutorials/python/#why-is-this-useful","title":"Why is this useful?","text":"

Pixi builds upon the conda ecosystem, which allows you to create a Python environment with all the dependencies you need. This is especially useful when you are working with multiple Python interpreters and bindings to C and C++ libraries. For example, GDAL from PyPI does not have binary C dependencies, but the conda package does. On the other hand, some packages are only available through PyPI, which pixi can also install for you. The best of both worlds; let's give it a go!

"},{"location":"tutorials/python/#pixitoml-and-pyprojecttoml","title":"pixi.toml and pyproject.toml","text":"

We support two manifest formats: pyproject.toml and pixi.toml. In this tutorial, we will use the pyproject.toml format because it is the most common format for Python projects.

"},{"location":"tutorials/python/#lets-get-started","title":"Let's get started","text":"

Let's start out by creating a new project that uses a pyproject.toml file.

pixi init pixi-py --format pyproject\n

This creates a project with the following structure:

\u251c\u2500\u2500 src\n\u2502   \u2514\u2500\u2500 pixi_py\n\u2502       \u2514\u2500\u2500 __init__.py\n\u2514\u2500\u2500 pyproject.toml\n

The pyproject.toml for the project looks like this:

[project]\nname = \"pixi-py\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [{name = \"Tim de Jager\", email = \"tim@prefix.dev\"}]\nrequires-python = \">= 3.11\"\ndependencies = []\n\n[build-system]\nbuild-backend = \"hatchling.build\"\nrequires = [\"hatchling\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-arm64\"]\n\n[tool.pixi.pypi-dependencies]\npixi-py = { path = \".\", editable = true }\n\n[tool.pixi.tasks]\n

This project uses a src-layout, but pixi supports both flat- and src-layouts.

"},{"location":"tutorials/python/#whats-in-the-pyprojecttoml","title":"What's in the pyproject.toml?","text":"

Okay, so let's have a look at what sections have been added and how we can modify the pyproject.toml.

These first entries were added to the pyproject.toml file:

# Main pixi entry\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\n# This is your machine platform by default\nplatforms = [\"osx-arm64\"]\n

The channels and platforms are added to the [tool.pixi.project] section. Channels like conda-forge manage packages similar to PyPI but allow for different packages across languages. The keyword platforms determines what platform the project supports.

The pixi_py package itself is added as an editable dependency. This means that the package is installed in editable mode, so you can make changes to the package and see the changes reflected in the environment, without having to re-install the environment.

# Editable installs\n[tool.pixi.pypi-dependencies]\npixi-py = { path = \".\", editable = true }\n

In pixi, unlike some other package managers, this is explicitly stated in the pyproject.toml file. The main reason is so that you can choose which environment this package should be included in.

"},{"location":"tutorials/python/#managing-both-conda-and-pypi-dependencies-in-pixi","title":"Managing both conda and PyPI dependencies in pixi","text":"

Our projects usually depend on other packages.

$ pixi add black\nAdded black\n

This will result in the following addition to the pyproject.toml:

# Dependencies\n[tool.pixi.dependencies]\nblack = \">=24.4.2,<24.5\"\n

But we can also be strict about the version that should be used with pixi add black=24, resulting in

[tool.pixi.dependencies]\nblack = \"24.*\"\n

Now, let's add some optional dependencies:

pixi add --pypi --feature test pytest\n

Which results in the following fields added to the pyproject.toml:

[project.optional-dependencies]\ntest = [\"pytest\"]\n

After we have added the optional dependencies to the pyproject.toml, pixi automatically creates a feature, which can contain a collection of dependencies, tasks, channels, and more.

Sometimes there are packages that aren't available on conda channels but are published on PyPI. We can add these as well, which pixi will solve together with the default dependencies.

$ pixi add black --pypi\nAdded black\nAdded these as pypi-dependencies.\n

which results in the addition to the dependencies key in the pyproject.toml

dependencies = [\"black\"]\n

When using pypi-dependencies, you can make use of the optional-dependencies that other packages make available. For example, black makes the cli extra available, which can be added with the --pypi flag:

$ pixi add black[cli] --pypi\nAdded black[cli]\nAdded these as pypi-dependencies.\n

which updates the dependencies entry to

dependencies = [\"black[cli]\"]\n
Optional dependencies in pixi.toml

This tutorial focuses on the use of the pyproject.toml, but in case you're curious, the pixi.toml would contain the following entry after the installation of a PyPI package including an optional dependency:

[pypi-dependencies]\nblack = { version = \"*\", extras = [\"cli\"] }\n

"},{"location":"tutorials/python/#installation-pixi-install","title":"Installation: pixi install","text":"

Now let's install the project with pixi install:

$ pixi install\n\u2714 The default environment has been installed.\n

We now have a new directory called .pixi in the project root. This directory contains the environment that was created when we ran pixi install. The environment is a conda environment that contains the dependencies that we specified in the pyproject.toml file. We can also install the test environment with pixi install -e test. We can use these environments for executing code.

We also have a new file called pixi.lock in the project root. This file contains the exact versions of the dependencies that were installed in the environment across platforms.

"},{"location":"tutorials/python/#whats-in-the-environment","title":"What's in the environment?","text":"

Using pixi list, you can see what's in the environment; this is essentially a nicer view of the lock file:

$ pixi list\nPackage          Version       Build               Size       Kind   Source\nbzip2            1.0.8         h93a5062_5          119.5 KiB  conda  bzip2-1.0.8-h93a5062_5.conda\nblack            24.4.2                            3.8 MiB    pypi   black-24.4.2-cp312-cp312-win_amd64.http.whl\nca-certificates  2024.2.2      hf0a4a13_0          152.1 KiB  conda  ca-certificates-2024.2.2-hf0a4a13_0.conda\nlibexpat         2.6.2         hebf3989_0          62.2 KiB   conda  libexpat-2.6.2-hebf3989_0.conda\nlibffi           3.4.2         h3422bc3_5          38.1 KiB   conda  libffi-3.4.2-h3422bc3_5.tar.bz2\nlibsqlite        3.45.2        h091b4b1_0          806 KiB    conda  libsqlite-3.45.2-h091b4b1_0.conda\nlibzlib          1.2.13        h53f4e23_5          47 KiB     conda  libzlib-1.2.13-h53f4e23_5.conda\nncurses          6.4.20240210  h078ce10_0          801 KiB    conda  ncurses-6.4.20240210-h078ce10_0.conda\nopenssl          3.2.1         h0d3ecfb_1          2.7 MiB    conda  openssl-3.2.1-h0d3ecfb_1.conda\npython           3.12.3        h4a7b5fc_0_cpython  12.6 MiB   conda  python-3.12.3-h4a7b5fc_0_cpython.conda\nreadline         8.2           h92ec313_1          244.5 KiB  conda  readline-8.2-h92ec313_1.conda\ntk               8.6.13        h5083fa2_1          3 MiB      conda  tk-8.6.13-h5083fa2_1.conda\ntzdata           2024a         h0c530f3_0          117 KiB    conda  tzdata-2024a-h0c530f3_0.conda\npixi-py          0.1.0                                        pypi   . (editable)\nxz               5.2.6         h57fd34a_0          230.2 KiB  conda  xz-5.2.6-h57fd34a_0.tar.bz2\n

Python

The Python interpreter is also installed in the environment. This is because the Python interpreter version is read from the requires-python field in the pyproject.toml file. This is used to determine the Python version to install in the environment. This way, pixi automatically manages/bootstraps the Python interpreter for you, so no more brew, apt or other system install steps.
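As a sketch of what this looks like (assuming a standard PEP 621 pyproject.toml; the exact version bounds are up to you):

```toml
[project]
name = "pixi-py"
# pixi reads this field to decide which Python interpreter
# to install into the environment
requires-python = ">=3.11"
```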

Here, you can see the different conda and PyPI packages listed. As you can see, the pixi-py package that we are working on is installed in editable mode. Every environment in pixi is isolated but reuses files that are hard-linked from a central cache directory. This means that you can have multiple environments with the same packages but only have the individual files stored once on disk.

We can create the default and test environments, based on our own test feature from the optional dependencies:

pixi project environment add default --solve-group default\npixi project environment add test --feature test --solve-group default\n

Which results in:

# Environments\n[tool.pixi.environments]\ndefault = { solve-group = \"default\" }\ntest = { features = [\"test\"], solve-group = \"default\" }\n
Solve Groups

Solve groups are a way to group dependencies together. This is useful when you have multiple environments that share the same dependencies. For example, maybe pytest is a dependency that influences the dependencies of the default environment. By putting these in the same solve group, you ensure that the versions in test and default are exactly the same.

The default environment is created when you run pixi install. The test environment is created from the optional dependencies in the pyproject.toml file. You can execute commands in this environment with, for example, pixi run -e test python.

"},{"location":"tutorials/python/#getting-code-to-run","title":"Getting code to run","text":"

Let's add some code to the pixi-py package. We will add a new function to the src/pixi_py/__init__.py file:

from rich import print\n\ndef hello():\n    return \"Hello, [bold magenta]World[/bold magenta]!\", \":vampire:\"\n\ndef say_hello():\n    print(*hello())\n

Now add the rich dependency from PyPI using: pixi add --pypi rich.

Let's see if this works by running:

pixi r python -c \"import pixi_py; pixi_py.say_hello()\"\nHello, World! \ud83e\udddb\n
Slow?

This might be slow (about 2 minutes) the first time because pixi installs the project, but it will be near-instant the second time.

Pixi runs the self-installed Python interpreter. Then, we are importing the pixi_py package, which is installed in editable mode. The code calls the say_hello function that we just added. And it works! Cool!
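To illustrate the tuple unpacking that say_hello relies on, here is a plain-Python sketch (without rich installed, so the [bold magenta] markup and the :vampire: shortcode are printed literally rather than rendered):

```python
def hello():
    # Returns a 2-tuple: (message with rich markup, emoji shortcode)
    return "Hello, [bold magenta]World[/bold magenta]!", ":vampire:"

def say_hello():
    # The * unpacks the tuple into two positional arguments,
    # equivalent to print(message, emoji)
    print(*hello())

say_hello()
```

With rich's print imported instead of the builtin, the markup is rendered in color and the shortcode becomes an actual emoji.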

"},{"location":"tutorials/python/#testing-this-code","title":"Testing this code","text":"

Okay, so let's add a test for this function. Let's add a tests/test_me.py file in the root of the project.

Giving us the following project structure:

.\n\u251c\u2500\u2500 pixi.lock\n\u251c\u2500\u2500 src\n\u2502   \u2514\u2500\u2500 pixi_py\n\u2502       \u2514\u2500\u2500 __init__.py\n\u251c\u2500\u2500 pyproject.toml\n\u2514\u2500\u2500 tests/test_me.py\n
from pixi_py import hello\n\ndef test_pixi_py():\n    assert hello() == (\"Hello, [bold magenta]World[/bold magenta]!\", \":vampire:\")\n

Let's add an easy task for running the tests.

$ pixi task add --feature test test \"pytest\"\n\u2714 Added task `test`: pytest .\n

So pixi has a task system to make it easy to run commands, similar to npm scripts or something you would specify in a Justfile.
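Under the hood, the task add command writes the task into the manifest; for a pyproject.toml-based project with a test feature, the result looks roughly like this (a sketch — the exact table layout may differ):

```toml
[tool.pixi.feature.test.tasks]
test = "pytest"
```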

Pixi tasks

Tasks are a powerful pixi feature that runs in a cross-platform shell. You can do caching, dependencies and more. Read more about tasks in the tasks section.

$ pixi r test\n\u2728 Pixi task (test): pytest .\n================================================================================================= test session starts =================================================================================================\nplatform darwin -- Python 3.12.2, pytest-8.1.1, pluggy-1.4.0\nrootdir: /private/tmp/pixi-py\nconfigfile: pyproject.toml\ncollected 1 item\n\ntest_me.py .                                                                                                                                                                                                    [100%]\n\n================================================================================================== 1 passed in 0.00s =================================================================================================\n

Neat! It seems to be working!

"},{"location":"tutorials/python/#test-vs-default-environment","title":"Test vs Default environment","text":"

Let's compare the output of the test and default environments...

pixi list -e test\n# vs. default environment\npixi list\n

We see that the test environment has:

package          version       build               size       kind   source\n...\npytest           8.1.1                             1.1 mib    pypi   pytest-8.1.1-py3-none-any.whl\n...\n

However, the default environment is missing this package. This way, you can fine-tune your environments to only have the packages that are needed for that environment. E.g. you could also have a dev environment that has pytest and ruff installed, but you could omit these from the prod environment. There is a docker example that shows how to set up a minimal prod environment and copy from there.
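A sketch of such a setup in pyproject.toml (the feature and environment names here are illustrative, not part of the tutorial project):

```toml
# Hypothetical "dev" feature with tools you would omit from prod
[tool.pixi.feature.dev.pypi-dependencies]
pytest = "*"
ruff = "*"

[tool.pixi.environments]
prod = { solve-group = "main" }
dev = { features = ["dev"], solve-group = "main" }
```

Sharing a solve group keeps the package versions in dev and prod identical, so what you test is what you ship.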

"},{"location":"tutorials/python/#replacing-pypi-packages-with-conda-packages","title":"Replacing PyPI packages with conda packages","text":"

One last thing: pixi provides the ability for PyPI packages to depend on conda packages. Let's confirm this with pixi list:

$ pixi list\nPackage          Version       Build               Size       Kind   Source\n...\npygments         2.17.2                            4.1 MiB    pypi   pygments-2.17.2-py3-none-any.http.whl\n...\n

Let's explicitly add pygments, a dependency of the rich package, to the pyproject.toml file:

pixi add pygments\n

This will add the following to the pyproject.toml file:

[tool.pixi.dependencies]\npygments = \">=2.17.2,<2.18\"\n

We can now see that the pygments package is installed as a conda package.

$ pixi list\nPackage          Version       Build               Size       Kind   Source\n...\npygments         2.17.2        pyhd8ed1ab_0        840.3 KiB  conda  pygments-2.17.2-pyhd8ed1ab_0.conda\n

This way, PyPI dependencies and conda dependencies can be mixed and matched to seamlessly interoperate.

$  pixi r python -c \"import pixi_py; pixi_py.say_hello()\"\nHello, World! \ud83e\udddb\n

And it still works!

"},{"location":"tutorials/python/#conclusion","title":"Conclusion","text":"

In this tutorial, you've seen how easy it is to use a pyproject.toml to manage your pixi dependencies and environments. We have also explored how to use PyPI and conda dependencies seamlessly together in the same project, and how to install optional dependencies to manage Python packages.

Hopefully, this provides a flexible and powerful way to manage your Python projects and a fertile base for further exploration with Pixi.

Thanks for reading! Happy Coding \ud83d\ude80

Any questions? Feel free to reach out or share this tutorial on X, join our Discord, send us an e-mail or follow our GitHub.

"},{"location":"tutorials/ros2/","title":"Tutorial: Develop a ROS 2 package with pixi","text":"

In this tutorial, we will show you how to develop a ROS 2 package using pixi. The tutorial is written to be executed from top to bottom; skipping steps might result in errors.

The audience for this tutorial is developers who are familiar with ROS 2 and who are interested in trying pixi for their development workflow.

"},{"location":"tutorials/ros2/#prerequisites","title":"Prerequisites","text":"
  • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. The crux of this tutorial is to show that you only need pixi!
  • On Windows, it's advised to enable Developer mode. Go to Settings -> Update & Security -> For developers -> Developer mode.

If you're new to pixi, you can check out the basic usage guide. This will teach you the basics of a pixi project within 3 minutes.

"},{"location":"tutorials/ros2/#create-a-pixi-project","title":"Create a pixi project.","text":"
pixi init my_ros2_project -c robostack-staging -c conda-forge\ncd my_ros2_project\n

It should have created a directory structure like this:

my_ros2_project\n\u251c\u2500\u2500 .gitattributes\n\u251c\u2500\u2500 .gitignore\n\u2514\u2500\u2500 pixi.toml\n

The pixi.toml file is the manifest file for your project. It should look like this:

pixi.toml
[project]\nname = \"my_ros2_project\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [\"User Name <user.name@email.url>\"]\nchannels = [\"robostack-staging\", \"conda-forge\"]\n# Your project can support multiple platforms, the current platform will be automatically added.\nplatforms = [\"linux-64\"]\n\n[tasks]\n\n[dependencies]\n

The channels you added to the init command are repositories of packages; you can search these repositories through our prefix.dev website. The platforms are the systems you want to support. In pixi you can support multiple platforms, but you have to define which ones, so pixi can check whether they are supported by your dependencies. The rest of the fields you can fill in as you see fit.

"},{"location":"tutorials/ros2/#add-ros-2-dependencies","title":"Add ROS 2 dependencies","text":"

To use a pixi project you don't need any dependencies on your system; all the dependencies you need should be added through pixi, so other users can use your project without any issues.

Let's start with the turtlesim example

pixi add ros-humble-desktop ros-humble-turtlesim\n

This will add the ros-humble-desktop and ros-humble-turtlesim packages to your manifest. Depending on your internet speed this might take a minute, as it will also install ROS in your project folder (.pixi).

Now run the turtlesim example.

pixi run ros2 run turtlesim turtlesim_node\n

Or use the shell command to start an activated environment in your terminal.

pixi shell\nros2 run turtlesim turtlesim_node\n

Congratulations, you have ROS 2 running on your machine with pixi!

Some more fun with the turtle

To control the turtle you can run the following command in a new terminal

cd my_ros2_project\npixi run ros2 run turtlesim turtle_teleop_key\n
Now you can control the turtle with the arrow keys on your keyboard.

"},{"location":"tutorials/ros2/#add-a-custom-python-node","title":"Add a custom Python node","text":"

As ROS works with custom nodes, let's add a custom node to our project.

pixi run ros2 pkg create --build-type ament_python --destination-directory src --node-name my_node my_package\n

To build the package we need some more dependencies:

pixi add colcon-common-extensions \"setuptools<=58.2.0\"\n

Add the created initialization script for the ros workspace to your manifest file.

Then run the build command

pixi run colcon build\n

This will create a sourceable script in the install folder. You can source this script through an activation script to use your custom node. Normally this would be the script you add to your .bashrc, but instead you tell pixi to use it by adding the following to pixi.toml:

Linux & macOSWindows pixi.toml
[activation]\nscripts = [\"install/setup.sh\"]\n
pixi.toml
[activation]\nscripts = [\"install/setup.bat\"]\n
Multi platform support

You can add multiple activation scripts for different platforms, so you can support multiple platforms with one project. Use the following example to add support for both Linux and Windows, using the target syntax.

[project]\nplatforms = [\"linux-64\", \"win-64\"]\n\n[activation]\nscripts = [\"install/setup.sh\"]\n[target.win-64.activation]\nscripts = [\"install/setup.bat\"]\n

Now you can run your custom node with the following command

pixi run ros2 run my_package my_node\n
"},{"location":"tutorials/ros2/#simplify-the-user-experience","title":"Simplify the user experience","text":"

In pixi we have a feature called tasks; this allows you to define a task in your manifest file and run it with a simple command. Let's add a task to run the turtlesim example and the custom node.

pixi task add sim \"ros2 run turtlesim turtlesim_node\"\npixi task add build \"colcon build --symlink-install\"\npixi task add hello \"ros2 run my_package my_node\"\n

Now you can run these tasks by simply running

pixi run sim\npixi run build\npixi run hello\n
Advanced task usage

Tasks are a powerful feature in pixi.

  • You can add depends-on to the tasks to create a task chain.
  • You can add cwd to the tasks to run the task in a different directory from the root of the project.
  • You can add inputs and outputs to the tasks to create a task that only runs when the inputs are changed.
  • You can use the target syntax to run specific tasks on specific machines.
[tasks]\nsim = \"ros2 run turtlesim turtlesim_node\"\nbuild = {cmd = \"colcon build --symlink-install\", inputs = [\"src\"]}\nhello = { cmd = \"ros2 run my_package my_node\", depends-on = [\"build\"] }\n
"},{"location":"tutorials/ros2/#build-a-c-node","title":"Build a C++ node","text":"

To build a C++ node, you need to add ament_cmake and some other build dependencies to your manifest file.

pixi add ros-humble-ament-cmake-auto compilers pkg-config cmake ninja\n

Now you can create a C++ node with the following command

pixi run ros2 pkg create --build-type ament_cmake --destination-directory src --node-name my_cpp_node my_cpp_package\n

Now you can build it again and run it with the following commands

# Passing arguments to the build command to build with Ninja, add them to the manifest if you want to default to ninja.\npixi run build --cmake-args -G Ninja\npixi run ros2 run my_cpp_package my_cpp_node\n
Tip

Add the cpp task to the manifest file to simplify the user experience.

pixi task add hello-cpp \"ros2 run my_cpp_package my_cpp_node\"\n
"},{"location":"tutorials/ros2/#conclusion","title":"Conclusion","text":"

In this tutorial, we showed you how to create a Python & CMake ROS 2 project using pixi. We also showed you how to add dependencies to your project using pixi, and how to run your project using pixi run. This way you can make sure that your project is reproducible on all your machines that have pixi installed.

"},{"location":"tutorials/ros2/#show-off-your-work","title":"Show Off Your Work!","text":"

Finished with your project? We'd love to see what you've created! Share your work on social media using the hashtag #pixi and tag us @prefix_dev. Let's inspire the community together!

"},{"location":"tutorials/ros2/#frequently-asked-questions","title":"Frequently asked questions","text":""},{"location":"tutorials/ros2/#what-happens-with-rosdep","title":"What happens with rosdep?","text":"

Currently, we don't support rosdep in a pixi environment, so you'll have to add the packages using pixi add. rosdep will call conda install which isn't supported in a pixi environment.

"},{"location":"tutorials/rust/","title":"Tutorial: Develop a Rust package using pixi","text":"

In this tutorial, we will show you how to develop a Rust package using pixi. The tutorial is written to be executed from top to bottom; skipping steps might result in errors.

The audience for this tutorial is developers who are familiar with Rust and cargo and who are interested in trying pixi for their development workflow. The benefit within a Rust workflow is that you lock both Rust and the C/system dependencies your project might be using. E.g., tokio users will almost certainly use openssl.

If you're new to pixi, you can check out the basic usage guide. This will teach you the basics of a pixi project within 3 minutes.

"},{"location":"tutorials/rust/#prerequisites","title":"Prerequisites","text":"
  • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. The crux of this tutorial is to show that you only need pixi!
"},{"location":"tutorials/rust/#create-a-pixi-project","title":"Create a pixi project.","text":"
pixi init my_rust_project\ncd my_rust_project\n

It should have created a directory structure like this:

my_rust_project\n\u251c\u2500\u2500 .gitattributes\n\u251c\u2500\u2500 .gitignore\n\u2514\u2500\u2500 pixi.toml\n

The pixi.toml file is the manifest file for your project. It should look like this:

pixi.toml
[project]\nname = \"my_rust_project\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [\"User Name <user.name@email.url>\"]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # (1)!\n\n[tasks]\n\n[dependencies]\n
  1. The platforms is set to your system's platform by default. You can change it to any platform you want to support. e.g. [\"linux-64\", \"osx-64\", \"osx-arm64\", \"win-64\"].
"},{"location":"tutorials/rust/#add-rust-dependencies","title":"Add Rust dependencies","text":"

To use a pixi project you don't need any dependencies on your system; all the dependencies you need should be added through pixi, so other users can use your project without any issues.

pixi add rust\n

This will add the rust package to your pixi.toml file under [dependencies], which includes the Rust toolchain and cargo.
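After this command, your manifest contains an entry along these lines (the version pin here is illustrative; pixi records the latest compatible release at add-time):

```toml
[dependencies]
rust = ">=1.77.0,<1.78"  # hypothetical pin chosen when the package was added
```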

"},{"location":"tutorials/rust/#add-a-cargo-project","title":"Add a cargo project","text":"

Now that you have rust installed, you can create a cargo project in your pixi project.

pixi run cargo init\n

pixi run is pixi's way to run commands in the pixi environment; it will make sure that the environment is set up correctly for the command to run. It runs its own cross-platform shell; if you want more information, check out the tasks documentation. You can also activate the environment in your own shell by running pixi shell; after that you don't need pixi run ... anymore.

Now we can build a cargo project using pixi.

pixi run cargo build\n
To simplify the build process, you can add a build task to your pixi.toml file using the following command:
pixi task add build \"cargo build\"\n
This creates the following field in the pixi.toml file: pixi.toml
[tasks]\nbuild = \"cargo build\"\n

And now you can build your project using:

pixi run build\n

You can also run your project using:

pixi run cargo run\n
Which you can simplify with a task again.
pixi task add start \"cargo run\"\n

So you should get the following output:

pixi run start\nHello, world!\n

Congratulations, you have a Rust project running on your machine with pixi!

"},{"location":"tutorials/rust/#next-steps-why-is-this-useful-when-there-is-rustup","title":"Next steps, why is this useful when there is rustup?","text":"

Cargo is not a binary package manager, but a source-based package manager. This means that you need to have the Rust compiler installed on your system to use it, and possibly other dependencies that are not provided by cargo. For example, you might need to install openssl or libssl-dev on your system to build a package. This is the case for pixi as well, but pixi will install these dependencies in your project folder, so you don't have to worry about them.

Add the following dependencies to your cargo project:

pixi run cargo add git2\n

If your system is not preconfigured to build C and does not have the libssl-dev package installed, you will not be able to build the project:

pixi run build\n...\nCould not find directory of OpenSSL installation, and this `-sys` crate cannot\nproceed without this knowledge. If OpenSSL is installed and this crate had\ntrouble finding it,  you can set the `OPENSSL_DIR` environment variable for the\ncompilation process.\n\nMake sure you also have the development packages of openssl installed.\nFor example, `libssl-dev` on Ubuntu or `openssl-devel` on Fedora.\n\nIf you're in a situation where you think the directory *should* be found\nautomatically, please open a bug at https://github.com/sfackler/rust-openssl\nand include information about your system as well as this message.\n\n$HOST = x86_64-unknown-linux-gnu\n$TARGET = x86_64-unknown-linux-gnu\nopenssl-sys = 0.9.102\n\n\nIt looks like you're compiling on Linux and also targeting Linux. Currently this\nrequires the `pkg-config` utility to find OpenSSL but unfortunately `pkg-config`\ncould not be found. If you have OpenSSL installed you can likely fix this by\ninstalling `pkg-config`.\n...\n
You can fix this by adding the necessary dependencies for building git2, with pixi:
pixi add openssl pkg-config compilers\n

Now you should be able to build your project again:

pixi run build\n...\n   Compiling git2 v0.18.3\n   Compiling my_rust_project v0.1.0 (/my_rust_project)\n    Finished dev [unoptimized + debuginfo] target(s) in 7.44s\n     Running `target/debug/my_rust_project`\n

"},{"location":"tutorials/rust/#extra-add-more-tasks","title":"Extra: Add more tasks","text":"

You can add more tasks to your pixi.toml file to simplify your workflow.

For example, you can add a test task to run your tests:

pixi task add test \"cargo test\"\n

And you can add a clean task to clean your project:

pixi task add clean \"cargo clean\"\n

You can add a formatting task to your project:

pixi task add fmt \"cargo fmt\"\n

You can extend these tasks to run multiple commands with the use of the depends-on field.

pixi task add lint \"cargo clippy\" --depends-on fmt\n
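The resulting task table in pixi.toml then looks roughly like this (a sketch of the manifest after the commands above):

```toml
[tasks]
test = "cargo test"
clean = "cargo clean"
fmt = "cargo fmt"
lint = { cmd = "cargo clippy", depends-on = ["fmt"] }
```

Running pixi run lint will first run the fmt task and then clippy.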

"},{"location":"tutorials/rust/#conclusion","title":"Conclusion","text":"

In this tutorial, we showed you how to create a Rust project using pixi. We also showed you how to add dependencies to your project using pixi. This way you can make sure that your project is reproducible on any system that has pixi installed.

"},{"location":"tutorials/rust/#show-off-your-work","title":"Show Off Your Work!","text":"

Finished with your project? We'd love to see what you've created! Share your work on social media using the hashtag #pixi and tag us @prefix_dev. Let's inspire the community together!

"},{"location":"CHANGELOG/","title":"Changelog","text":"

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

"},{"location":"CHANGELOG/#0392-2024-12-11","title":"[0.39.2] - 2024-12-11","text":"

Patch release to fix the binary generation in CI.

"},{"location":"CHANGELOG/#0391-2024-12-09","title":"[0.39.1] - 2024-12-09","text":""},{"location":"CHANGELOG/#added","title":"Added","text":"
  • Add proper unit testing for PyPI installation and fix re-installation issues by @tdejager in #2617
  • Add detailed json output for task list by @jjjermiah in #2608
  • Add pixi project name CLI by @LiamConnors in #2649
"},{"location":"CHANGELOG/#changed","title":"Changed","text":"
  • Use fs-err in more places by @Hofer-Julian in #2636
"},{"location":"CHANGELOG/#documentation","title":"Documentation","text":"
  • Remove tclf from community.md\ud83d\udcd1 by @KarelZe in #2619
  • Update contributing guide by @LiamConnors in #2650
  • Update clean cache CLI doc by @LiamConnors in #2657
"},{"location":"CHANGELOG/#fixed","title":"Fixed","text":"
  • Color formatting detection on stdout by @blmaier in #2613
  • Use correct dependency location for pixi upgrade by @Hofer-Julian in #2472
  • Regression detached-environments not used by @ruben-arts in #2627
  • Allow configuring pypi insecure host by @zen-xu in #2521#2622
"},{"location":"CHANGELOG/#refactor","title":"Refactor","text":"
  • Rework CI and use cargo-dist for releases by @baszalmstra in #2566
"},{"location":"CHANGELOG/#pixi-build-preview-work","title":"pixi build Preview work","text":"
  • Refactor to [build-system.build-backend] by @baszalmstra in #2601
  • Remove ipc override from options and give it manually to test by @wolfv in #2629
  • Pixi build trigger rebuild by @Hofer-Julian in #2641
  • Add variant config to [workspace.build-variants] by @wolfv in #2634
  • Add request coalescing for isolated tools by @nichmor in #2589
  • Add example using rich and pixi-build-python and remove flask by @Hofer-Julian in #2638
  • (simple) build tool override by @wolfv in #2620
  • Add caching of build tool installation by @nichmor in #2637
"},{"location":"CHANGELOG/#new-contributors","title":"New Contributors","text":"
  • @blmaier made their first contribution in #2613
"},{"location":"CHANGELOG/#0390-2024-12-02","title":"[0.39.0] - 2024-12-02","text":""},{"location":"CHANGELOG/#highlights","title":"\u2728 Highlights","text":"
  • We now have a new concurrency configuration in the pixi.toml file. This allows you to set the number of concurrent solves or downloads that can be run at the same time.
  • We changed the way pixi searches for a pixi manifest. Whereas it previously first considered the activated pixi shell, it now searches first in the current directory and its parent directories. more info
  • The lockfile format is changed to make it slightly smaller and support source dependencies.
"},{"location":"CHANGELOG/#added_1","title":"Added","text":"
  • Add concurrency configuration by @ruben-arts in #2569
"},{"location":"CHANGELOG/#changed_1","title":"Changed","text":"
  • Add XDG_CONFIG_HOME/.config to search of pixi global manifest path by @hoxbro in #2547
  • Let pixi global sync collect errors rather than returning early by @Hofer-Julian in #2586
  • Allow configuring pypi insecure host by @zen-xu in #2521
  • Reorder manifest discovery logic by @Hofer-Julian in #2564
"},{"location":"CHANGELOG/#documentation_1","title":"Documentation","text":"
  • Improve pixi manifest by @Hofer-Julian in #2596
"},{"location":"CHANGELOG/#fixed_1","title":"Fixed","text":"
  • pixi global list failing for empty environments by @Hofer-Julian in #2571
  • Macos activation cargo vars by @ruben-arts in #2578
  • Trampoline without corresponding json breaking by @Hofer-Julian in #2576
  • Ensure pinning strategy is not affected by non-semver packages by @seowalex in #2580
  • Pypi installs happening every time by @tdejager in #2587
  • pixi global report formatting by @Hofer-Julian in #2595
  • Improve test speed and support win-arm64 by @baszalmstra in #2597
  • Update Task::Alias to return command description by @jjjermiah in #2607
"},{"location":"CHANGELOG/#refactor_1","title":"Refactor","text":"
  • Split install pypi into module and files by @tdejager in #2590
  • PyPI installation traits + deduplication by @tdejager in #2599
"},{"location":"CHANGELOG/#pixi-build","title":"Pixi build","text":"

We've merged in the main pixi build feature branch. This is a big change but shouldn't have affected any of the current functionality. If you notice any issues, please let us know.

It can be turned on with preview = \"pixi-build\" in your pixi.toml file. It's under heavy development, so expect breaking changes in that feature for now.

  • Preview of pixi build and workspaces by @tdejager in #2250
  • Build recipe yaml directly by @wolfv in #2568
"},{"location":"CHANGELOG/#new-contributors_1","title":"New Contributors","text":"
  • @seowalex made their first contribution in #2580
"},{"location":"CHANGELOG/#0380-2024-11-26","title":"[0.38.0] - 2024-11-26","text":""},{"location":"CHANGELOG/#highlights_1","title":"\u2728 Highlights","text":"
  • Specify pypi-index per pypi-dependency
    [pypi-dependencies]\npytorch ={ version = \"*\", index = \"https://download.pytorch.org/whl/cu118\" }\n
  • [dependency-groups] (PEP735) support in pyproject.toml
    [dependency-groups]\ntest = [\"pytest\"]\ndocs = [\"sphinx\"]\ndev = [{include-group = \"test\"}, {include-group = \"docs\"}]\n\n[tool.pixi.environments]\ndev = [\"dev\"]\n
  • Much improved pixi search output!
"},{"location":"CHANGELOG/#added_2","title":"Added","text":"
  • Add pypi index by @nichmor in #2416
  • Implement PEP735 support by @olivier-lacroix in #2448
  • Extends manifest to allow for preview features by @tdejager in #2489
  • Add versions/build list to pixi search output by @delsner in #2440
  • Expose nested executables in pixi global by @bahugo in #2362
"},{"location":"CHANGELOG/#fixed_2","title":"Fixed","text":"
  • Always print a warning when config is invalid by @Hofer-Julian in #2508
  • Incorrectly saving absolute base as path component by @tdejager in #2501
  • Keep the case when getting the executable in pixi global by @wolfv in #2528
  • Install script on win-arm64 by @baszalmstra in #2538
  • Trampoline installation on pixi global update by @nichmor in #2530
  • Update the PATH env var with dynamic elements on pixi global by @wolfv in #2541
  • Correct ppc64le arch by @wolfv in #2540
"},{"location":"CHANGELOG/#performance","title":"Performance","text":"
  • Experimental environment activation cache by @ruben-arts in #2367
"},{"location":"CHANGELOG/#documentation_2","title":"Documentation","text":"
  • Update project structure in Python tutorial by @LiamConnors in #2506
  • Fix typo in pixi project export conda-environment by @nmarticorena in #2533
  • Fix wrong use of underscores in pixi project export by @traversaro in #2539
  • Adapt completion instructions by @Hofer-Julian in #2561
"},{"location":"CHANGELOG/#new-contributors_2","title":"New Contributors","text":"
  • @nmarticorena made their first contribution in #2533
  • @delsner made their first contribution in #2440
"},{"location":"CHANGELOG/#0370-2024-11-18","title":"[0.37.0] - 2024-11-18","text":""},{"location":"CHANGELOG/#highlights_2","title":"\u2728 Highlights","text":"

We now allow the use of prefix.dev channels with sharded repodata:

Running pixi search rubin-env using hyperfine on the default versus our channels gives these results:

Cache Status  Channel                                 Mean [ms]  Relative
With cache    https://prefix.dev/conda-forge          69.3       1.00
Without       https://prefix.dev/conda-forge          389.5      5.62
With cache    https://conda.anaconda.org/conda-forge  1043.3     15.06
Without       https://conda.anaconda.org/conda-forge  2420.3     34.94
"},{"location":"CHANGELOG/#breaking","title":"Breaking","text":"
  • Make sure that [activation.env] are not completely overridden by [target. tables, by @hameerabbasi in #2396
"},{"location":"CHANGELOG/#changed_2","title":"Changed","text":"
  • Allow using sharded repodata by @baszalmstra in #2467
"},{"location":"CHANGELOG/#documentation_3","title":"Documentation","text":"
  • Update ros2.md turtlesim section by @nbbrooks in #2442
  • Update pycharm.md to show optional installation by @plainerman in #2487
  • Fix typo in documentation by @saraedum in #2496
  • Update pixi install output by @LiamConnors in #2495
"},{"location":"CHANGELOG/#fixed_3","title":"Fixed","text":"
  • Incorrect python version was used in some parts of the solve by @tdejager in #2481
  • Wrong description on pixi upgrade by @notPlancha in #2483
  • Extra test for mismatch in python versions by @tdejager in #2485
  • Keep build in pixi upgrade by @ruben-arts in #2476
"},{"location":"CHANGELOG/#new-contributors_3","title":"New Contributors","text":"
  • @saraedum made their first contribution in #2496
  • @plainerman made their first contribution in #2487
  • @hameerabbasi made their first contribution in #2396
  • @nbbrooks made their first contribution in #2442
"},{"location":"CHANGELOG/#0360-2024-11-07","title":"[0.36.0] - 2024-11-07","text":""},{"location":"CHANGELOG/#highlights_3","title":"\u2728 Highlights","text":"
  • You can now pixi upgrade your project dependencies.
  • We've done a performance improvement on the prefix validation check, thus faster pixi run startup times.
"},{"location":"CHANGELOG/#added_3","title":"Added","text":"
  • Add powerpc64le target to trampoline by @ruben-arts in #2419
  • Add trampoline tests again by @Hofer-Julian in #2420
  • Add pixi upgrade by @Hofer-Julian in #2368
  • Add platform fallback win-64 for win-arm64 by @chawyehsu in #2427
  • Add --prepend option for pixi project channel add by @mrswastik-robot in #2447
"},{"location":"CHANGELOG/#documentation_4","title":"Documentation","text":"
  • Fix cli basic usage example by @lucascolley in #2432
  • Update python tutorial by @LiamConnors in #2452
  • Improve pixi global docs by @Hofer-Julian in #2437
"},{"location":"CHANGELOG/#fixed_4","title":"Fixed","text":"
  • Use --silent instead of --no-progress-meter for old curl by @jaimergp in #2428
  • Search should return latest package across all platforms by @nichmor in #2424
  • Trampoline unwraps by @ruben-arts in #2422
  • PyPI Index usage (regression in v0.35.0) by @tdejager in #2465
  • PyPI git dependencies (regression in v0.35.0) by @wolfv in #2438
  • Tolerate pixi file errors (regression in v0.35.0) by @jvenant in #2457
  • Make sure tasks are fetched for best platform by @jjjermiah in #2446
"},{"location":"CHANGELOG/#performance_1","title":"Performance","text":"
  • Quick prefix validation check by @ruben-arts in #2400
"},{"location":"CHANGELOG/#new-contributors_4","title":"New Contributors","text":"
  • @jvenant made their first contribution in #2457
  • @mrswastik-robot made their first contribution in #2447
  • @LiamConnors made their first contribution in #2452
"},{"location":"CHANGELOG/#0350-2024-11-05","title":"[0.35.0] - 2024-11-05","text":""},{"location":"CHANGELOG/#highlights_4","title":"\u2728 Highlights","text":"

Binaries exposed by pixi global are now actual executables instead of scripts, resulting in a significant speedup and better compatibility with other tools.

"},{"location":"CHANGELOG/#added_4","title":"Added","text":"
  • Add language packages with minor pinning by default by @ruben-arts in #2310
  • Add grouping for exposing and removing by @nichmor in #2387
  • Add trampoline for pixi global by @Hofer-Julian and @nichmor in #2381
  • Adding SCM option for init command by @alvgaona in #2342
  • Create .pixi/.gitignore containing * by @maresb in #2361
"},{"location":"CHANGELOG/#changed_3","title":"Changed","text":"
  • Use the same package cache folder by @nichmor in #2335
  • Disable progress in non tty by @ruben-arts in #2308
  • Improve global install reporting by @Hofer-Julian in #2395
  • Suggest fix in platform error message by @maurosilber in #2404
  • Upgrading uv to 0.4.30 by @tdejager in #2372
"},{"location":"CHANGELOG/#documentation_5","title":"Documentation","text":"
  • Add pybind11 example by @alvgaona in #2324
  • Replace build with uv in pybind11 example by @alvgaona in #2341
  • Fix incorrect statement about env location by @opcode81 in #2370
"},{"location":"CHANGELOG/#fixed_5","title":"Fixed","text":"
  • Global update reporting by @Hofer-Julian in #2352
  • Correctly display unrequested environments on task list by @jjjermiah in #2402
"},{"location":"CHANGELOG/#refactor_2","title":"Refactor","text":"
  • Use built in string methods by @KGrewal1 in #2348
  • Reorganize integration tests by @Hofer-Julian in #2408
  • Reimplement barrier cell on OnceLock by @KGrewal1 in #2347
"},{"location":"CHANGELOG/#new-contributors_5","title":"New Contributors","text":"
  • @maurosilber made their first contribution in #2404
  • @opcode81 made their first contribution in #2370
  • @alvgaona made their first contribution in #2342
"},{"location":"CHANGELOG/#0340-2024-10-21","title":"[0.34.0] - 2024-10-21","text":""},{"location":"CHANGELOG/#highlights_5","title":"\u2728 Highlights","text":"
  • pixi global install now takes a flag --with, inspired by uv tool install. If you want to add extra dependencies without exposing their binaries, you can now run pixi global install ipython --with numpy --with matplotlib
  • Improved the output of pixi global subcommands
  • Many bug fixes
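The highlighted --with flag boils down to:

```shell
# Install ipython and expose its binaries, while adding numpy and
# matplotlib to the same environment without exposing their binaries
pixi global install ipython --with numpy --with matplotlib
```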
"},{"location":"CHANGELOG/#added_5","title":"Added","text":"
  • Add timeouts by @Hofer-Julian in #2311
"},{"location":"CHANGELOG/#changed_4","title":"Changed","text":"
  • Global update should add new executables by @nichmor in #2298
  • Add pixi global install --with by @Hofer-Julian in #2332
"},{"location":"CHANGELOG/#documentation_6","title":"Documentation","text":"
  • Document where pixi-global.toml can be found by @Hofer-Julian in #2304
  • Add ros noetic example by @ruben-arts in #2271
  • Add nichita and julian to CITATION.cff by @Hofer-Julian in #2327
  • Improve keyring documentation to use pixi global by @olivier-lacroix in #2318
"},{"location":"CHANGELOG/#fixed_6","title":"Fixed","text":"
  • pixi global upgrade-all error message by @Hofer-Julian in #2296
  • Select correct run environment by @ruben-arts in #2301
  • Adapt channels to work with newest rattler-build version by @Hofer-Julian in #2306
  • Hide obsolete commands in help page of pixi global by @chawyehsu in #2320
  • Typecheck all tests by @Hofer-Julian in #2328
"},{"location":"CHANGELOG/#refactor_3","title":"Refactor","text":"
  • Improve upload errors by @ruben-arts in #2303
"},{"location":"CHANGELOG/#new-contributors_6","title":"New Contributors","text":"
  • @gerlero made their first contribution in #2300
"},{"location":"CHANGELOG/#0330-2024-10-16","title":"[0.33.0] - 2024-10-16","text":""},{"location":"CHANGELOG/#highlights_6","title":"\u2728 Highlights","text":"

This is the first release with the new pixi global implementation. It's a full reimplementation of pixi global, which now uses a manifest file just like pixi projects. This way you can declare your environments and check them into a VCS.

It also brings features like adding dependencies to a global environment, and exposing multiple binaries from the same environment that are not part of the main installed packages.

Test it out with:

# Normal feature\npixi global install ipython\n\n# New features\npixi global install \\\n    --environment science \\           # Define the environment name\n    --expose scipython=ipython \\      # Expose binaries under custom names\n    ipython scipy                     # Define multiple dependencies for one environment\n

This should result in a manifest in $HOME/.pixi/manifests/pixi-global.toml:

version = 1\n\n[envs.ipython]\nchannels = [\"conda-forge\"]\ndependencies = { ipython = \"*\" }\nexposed = { ipython = \"ipython\", ipython3 = \"ipython3\" }\n\n[envs.science]\nchannels = [\"conda-forge\"]\ndependencies = { ipython = \"*\", scipy = \"*\" }\nexposed = { scipython = \"ipython\" }\n

"},{"location":"CHANGELOG/#documentation_7","title":"\ud83d\udcd6 Documentation","text":"

Check out the updated documentation on this new feature:
  • Main documentation on this tag: https://pixi.sh/v0.33.0/
  • Global CLI documentation: https://pixi.sh/v0.33.0/reference/cli/#global
  • The implementation documentation: https://pixi.sh/v0.33.0/features/global_tools/
  • The initial design proposal: https://pixi.sh/v0.33.0/design_proposals/pixi_global_manifest/

"},{"location":"CHANGELOG/#0322-2024-10-16","title":"[0.32.2] - 2024-10-16","text":""},{"location":"CHANGELOG/#highlights_7","title":"\u2728 Highlights","text":"
  • pixi self-update will only work on the binaries from the GitHub releases, avoiding accidentally breaking the installation.
  • We now support gcs:// conda registries.
  • No more broken PowerShell after using pixi shell.
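A gcs:// channel can be referenced like any other channel in the manifest. A minimal sketch, where the project name and bucket path are hypothetical:

```toml
[project]
name = "my-project"                     # hypothetical project name
channels = ["gcs://my-bucket/channel"]  # hypothetical gcs:// registry path
platforms = ["linux-64"]
```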
"},{"location":"CHANGELOG/#changed_5","title":"Changed","text":"
  • Add support for gcs:// conda registries by @clement-chaneching in #2263
"},{"location":"CHANGELOG/#documentation_8","title":"Documentation","text":"
  • Small fixes in tutorials/python.md by @carschandler in #2252
  • Update pixi list docs by @Hofer-Julian in #2269
"},{"location":"CHANGELOG/#fixed_7","title":"Fixed","text":"
  • Bind ctrl c listener so that it doesn't interfere on powershell by @wolfv in #2260
  • Explicitly run default environment by @ruben-arts in #2273
  • Parse env name on adding by @ruben-arts in #2279
"},{"location":"CHANGELOG/#refactor_4","title":"Refactor","text":"
  • Make self-update a compile time feature by @freundTech in #2213
"},{"location":"CHANGELOG/#new-contributors_7","title":"New Contributors","text":"
  • @clement-chaneching made their first contribution in #2263
  • @freundTech made their first contribution in #2213
"},{"location":"CHANGELOG/#0321-2024-10-08","title":"[0.32.1] - 2024-10-08","text":""},{"location":"CHANGELOG/#fixes","title":"Fixes","text":"
  • Bump Rust version to 1.81 by @wolfv in #2227
"},{"location":"CHANGELOG/#documentation_9","title":"Documentation","text":"
  • Pixi-pack, docker, devcontainer by @pavelzw in #2220
"},{"location":"CHANGELOG/#0320-2024-10-08","title":"[0.32.0] - 2024-10-08","text":""},{"location":"CHANGELOG/#highlights_8","title":"\u2728 Highlights","text":"

The biggest fix in this release is the move to the latest rattler, as it came with some major bug fixes for macOS and Rust 1.81 compatibility.

"},{"location":"CHANGELOG/#changed_6","title":"Changed","text":"
  • Correctly implement total ordering for dependency provider by @tdejager in rattler/#892
"},{"location":"CHANGELOG/#fixed_8","title":"Fixed","text":"
  • Fixed self-clobber issue when up/down grading packages by @wolfv in rattler/#893
  • Check environment name before returning not found print by @ruben-arts in #2198
  • Turn off symlink follow for task cache by @ruben-arts in #2209
"},{"location":"CHANGELOG/#0310-2024-10-03","title":"[0.31.0] - 2024-10-03","text":""},{"location":"CHANGELOG/#highlights_9","title":"\u2728 Highlights","text":"

Thanks to our maintainer @baszalmstra! He sped up the resolver for all cases we could think of in #2162. Check out the times it takes to solve the environments in our test set:

"},{"location":"CHANGELOG/#added_6","title":"Added","text":"
  • Add nodefaults to imported conda envs by @ruben-arts in #2097
  • Add newline to .gitignore by @ruben-arts in #2095
  • Add --no-activation option to prevent env activation during global install/upgrade by @183amir in #1980
  • Add --priority arg to project channel add by @minrk in #2086
"},{"location":"CHANGELOG/#changed_7","title":"Changed","text":"
  • Use pixi spec for conda environment yml by @ruben-arts in #2096
  • Update rattler by @nichmor in #2120
  • Update README.md by @ruben-arts in #2129
  • Follow symlinks while walking files by @0xbe7a in #2141
"},{"location":"CHANGELOG/#documentation_10","title":"Documentation","text":"
  • Adapt wording in pixi global proposal by @Hofer-Julian in #2098
  • Community: add array-api-extra by @lucascolley in #2107
  • pixi global mention no-activation by @Hofer-Julian in #2109
  • Add minimal constructor example by @bollwyvl in #2102
  • Update global manifest install by @Hofer-Julian in #2128
  • Add description for pixi update --json by @scottamain in #2160
  • Fixes backticks for doc strings by @rachfop in #2174
"},{"location":"CHANGELOG/#fixed_9","title":"Fixed","text":"
  • Sort exported conda explicit spec topologically by @synapticarbors in #2101
  • --import env_file breaks channel priority by @fecet in #2113
  • Allow exact yanked pypi packages by @nichmor in #2116
  • Check if files are same in self-update by @apoorvkh in #2132
  • get_or_insert_nested_table by @Hofer-Julian in #2167
  • Improve install.sh PATH handling and general robustness by @Arcitec in #2189
  • Output tasks on pixi run without input by @ruben-arts in #2193
"},{"location":"CHANGELOG/#performance_2","title":"Performance","text":"
  • Significantly speed up conda resolution by @baszalmstra in #2162
"},{"location":"CHANGELOG/#new-contributors_8","title":"New Contributors","text":"
  • @Arcitec made their first contribution in #2189
  • @rachfop made their first contribution in #2174
  • @scottamain made their first contribution in #2160
  • @apoorvkh made their first contribution in #2132
  • @0xbe7a made their first contribution in #2141
  • @fecet made their first contribution in #2113
  • @minrk made their first contribution in #2086
  • @183amir made their first contribution in #1980
  • @lucascolley made their first contribution in #2107
"},{"location":"CHANGELOG/#0300-2024-09-19","title":"[0.30.0] - 2024-09-19","text":""},{"location":"CHANGELOG/#highlights_10","title":"\u2728 Highlights","text":"

I want to thank @synapticarbors and @abkfenris for starting the work on pixi project export. Pixi now supports exporting a conda environment.yml file and a conda explicit specification file. This is a great addition to the project and will help users share their projects with non-pixi users.
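The new export commands can be used roughly as follows; the subcommand names and output handling are a sketch, so check pixi project export --help for the exact interface:

```shell
# Export a conda environment.yml for the project
pixi project export conda-environment > environment.yml

# Export a conda explicit specification file
pixi project export conda-explicit-spec ./spec-output/
```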

"},{"location":"CHANGELOG/#added_7","title":"Added","text":"
  • Export conda explicit specification file from project by @synapticarbors in #1873
  • Add flag to pixi search by @Hofer-Julian in #2018
  • Adds the ability to set the index strategy by @tdejager in #1986
  • Export conda environment.yml by @abkfenris in #2003
"},{"location":"CHANGELOG/#changed_8","title":"Changed","text":"
  • Improve examples/docker by @jennydaman in #1965
  • Minimal pre-commit tasks by @Hofer-Julian in #1984
  • Improve error and feedback when target does not exist by @tdejager in #1961
  • Move the rectangle using a mouse in SDL by @certik in #2069
"},{"location":"CHANGELOG/#documentation_11","title":"Documentation","text":"
  • Update cli.md by @xela-95 in #2047
  • Update system-requirements information by @ruben-arts in #2079
  • Append to file syntax in task docs by @nicornk in #2013
  • Change documentation of pixi upload to refer to correct API endpoint by @traversaro in #2074
"},{"location":"CHANGELOG/#testing","title":"Testing","text":"
  • Add downstream nerfstudio test by @tdejager in #1996
  • Run pytests in parallel by @tdejager in #2027
  • Testing common wheels by @tdejager in #2031
"},{"location":"CHANGELOG/#fixed_10","title":"Fixed","text":"
  • Lock file is always outdated for pypi path dependencies by @nichmor in #2039
  • Fix error message for export conda explicit spec by @synapticarbors in #2048
  • Use conda-pypi-map for feature channels by @nichmor in #2038
  • Constrain feature platforms in schema by @bollwyvl in #2055
  • Split tag creation functions by @tdejager in #2062
  • Tree print to pipe by @ruben-arts in #2064
  • subdirectory in pypi url by @ruben-arts in #2065
  • Create a GUI application on Windows, not Console by @certik in #2067
  • Make dashes underscores in python package names by @ruben-arts in #2073
  • Give better errors on broken pyproject.toml by @ruben-arts in #2075
"},{"location":"CHANGELOG/#refactor_5","title":"Refactor","text":"
  • Stop duplicating strip_channel_alias from rattler by @Hofer-Julian in #2017
  • Follow-up wheels tests by @Hofer-Julian in #2063
  • Integration test suite by @Hofer-Julian in #2081
  • Remove psutils by @Hofer-Julian in #2083
  • Add back older caching method by @tdejager in #2046
  • Release script by @Hofer-Julian in #1978
  • Activation script by @Hofer-Julian in #2014
  • Pins python version in add_pypi_functionality by @tdejager in #2040
  • Improve the lock_file_usage flags and behavior. by @ruben-arts in #2078
  • Move matrix to workflow that it is used in by @tdejager in #1987
  • Refactor manifest into more generic approach by @nichmor in #2015
"},{"location":"CHANGELOG/#new-contributors_9","title":"New Contributors","text":"
  • @certik made their first contribution in #2069
  • @xela-95 made their first contribution in #2047
  • @nicornk made their first contribution in #2013
  • @jennydaman made their first contribution in #1965
"},{"location":"CHANGELOG/#0290-2024-09-04","title":"[0.29.0] - 2024-09-04","text":""},{"location":"CHANGELOG/#highlights_11","title":"\u2728 Highlights","text":"
  • Add build-isolation options, for more details check out our docs
  • Allow to use virtual package overrides from environment variables (PR)
  • Many bug fixes
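Virtual package overrides follow the conda CONDA_OVERRIDE_* convention; a hedged sketch (check the linked PR for the exact set of variables that is honored):

```shell
# Pretend CUDA 12.0 and glibc 2.28 are present when solving/installing
CONDA_OVERRIDE_CUDA=12.0 CONDA_OVERRIDE_GLIBC=2.28 pixi install
```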
"},{"location":"CHANGELOG/#added_8","title":"Added","text":"
  • Add build-isolation options by @tdejager in #1909
  • Add release script by @Hofer-Julian in #1971
"},{"location":"CHANGELOG/#changed_9","title":"Changed","text":"
  • Use rustls-tls instead of native-tls per default by @Hofer-Julian in #1929
  • Upgrade to uv 0.3.4 by @tdejager in #1936
  • Upgrade to uv 0.4.0 by @tdejager in #1944
  • Better error for when the target or platform are missing by @tdejager in #1959
  • Improve integration tests by @Hofer-Julian in #1958
  • Improve release script by @Hofer-Julian in #1974
"},{"location":"CHANGELOG/#fixed_11","title":"Fixed","text":"
  • Update env variables in installation docs by @lev112 in #1937
  • Always overwrite when pixi adding the dependency by @ruben-arts in #1935
  • Typo in schema.json by @SobhanMP in #1948
  • Using file url as mapping by @nichmor in #1930
  • Offline mapping should not request by @nichmor in #1968
  • pixi init for pyproject.toml by @Hofer-Julian in #1947
  • Use two in memory indexes, for resolve and builds by @tdejager in #1969
  • Minor issues and todos by @KGrewal1 in #1963
"},{"location":"CHANGELOG/#refactor_6","title":"Refactor","text":"
  • Improve integration tests by @Hofer-Julian in #1942
"},{"location":"CHANGELOG/#new-contributors_10","title":"New Contributors","text":"
  • @SobhanMP made their first contribution in #1948
  • @lev112 made their first contribution in #1937
"},{"location":"CHANGELOG/#0282-2024-08-28","title":"[0.28.2] - 2024-08-28","text":""},{"location":"CHANGELOG/#changed_10","title":"Changed","text":"
  • Use mold on linux by @Hofer-Julian in #1914
"},{"location":"CHANGELOG/#documentation_12","title":"Documentation","text":"
  • Fix global manifest by @Hofer-Julian in #1912
  • Document azure keyring usage by @tdejager in #1913
"},{"location":"CHANGELOG/#fixed_12","title":"Fixed","text":"
  • Let init add dependencies independent of target and don't install by @ruben-arts in #1916
  • Enable use of manylinux wheeltags once again by @tdejager in #1925
  • The bigger runner by @ruben-arts in #1902
"},{"location":"CHANGELOG/#0281-2024-08-26","title":"[0.28.1] - 2024-08-26","text":""},{"location":"CHANGELOG/#changed_11","title":"Changed","text":"
  • Uv upgrade to 0.3.2 by @tdejager in #1900
"},{"location":"CHANGELOG/#documentation_13","title":"Documentation","text":"
  • Add keyrings.artifacts to the list of project built with pixi by @jslorrma in #1908
"},{"location":"CHANGELOG/#fixed_13","title":"Fixed","text":"
  • Use default indexes if none were given by the lockfile by @ruben-arts in #1910
"},{"location":"CHANGELOG/#new-contributors_11","title":"New Contributors","text":"
  • @jslorrma made their first contribution in #1908
"},{"location":"CHANGELOG/#0280-2024-08-22","title":"[0.28.0] - 2024-08-22","text":""},{"location":"CHANGELOG/#highlights_12","title":"\u2728 Highlights","text":"
  • Bug Fixes: Major fixes in general but especially for PyPI installation issues and better error messaging.
  • Compatibility: Default Linux version downgraded to 4.18 for broader support.
  • New Features: Added INIT_CWD in pixi run, improved logging, and more cache options.
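A task can read INIT_CWD, the directory pixi run was invoked from; the task name below is illustrative:

```toml
# In pixi.toml: echo the directory `pixi run show-cwd` was invoked from
[tasks]
show-cwd = "echo Invoked from: $INIT_CWD"
```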
"},{"location":"CHANGELOG/#added_9","title":"Added","text":"
  • Add INIT_CWD to activated env pixi run by @ruben-arts in #1798
  • Add context to error when parsing conda-meta files by @baszalmstra in #1854
  • Add some logging for when packages are actually overridden by conda by @tdejager in #1874
  • Add package when extra is added by @ruben-arts in #1856
"},{"location":"CHANGELOG/#changed_12","title":"Changed","text":"
  • Use new gateway to get the repodata for global install by @nichmor in #1767
  • Pixi global proposal by @Hofer-Julian in #1757
  • Upgrade to new uv 0.2.37 by @tdejager in #1829
  • Use new gateway for pixi search by @nichmor in #1819
  • Extend pixi clean cache with more cache options by @ruben-arts in #1872
  • Downgrade __linux default to 4.18 by @ruben-arts in #1887
"},{"location":"CHANGELOG/#documentation_14","title":"Documentation","text":"
  • Fix instructions for update github actions by @Hofer-Julian in #1774
  • Fix fish completion script by @dennis-wey in #1789
  • Expands the environment variable examples in the reference section by @travishathaway in #1779
  • Community feedback pixi global by @Hofer-Julian in #1800
  • Additions to the pixi global proposal by @Hofer-Julian in #1803
  • Stop using invalid environment name in pixi global proposal by @Hofer-Julian in #1826
  • Extend pixi global proposal by @Hofer-Julian in #1861
  • Make channels required in pixi global manifest by @Hofer-Julian in #1868
  • Fix linux minimum version in project_configuration docs by @traversaro in #1888
"},{"location":"CHANGELOG/#fixed_14","title":"Fixed","text":"
  • Try to increase rlimit by @baszalmstra in #1766
  • Add test for invalid environment names by @Hofer-Julian in #1825
  • Show global config in info command by @ruben-arts in #1807
  • Correct documentation of PIXI_ENVIRONMENT_PLATFORMS by @traversaro in #1842
  • Format in docs/features/environment.md by @cdeil in #1846
  • Make proper use of NamedChannelOrUrl by @ruben-arts in #1820
  • Trait impl override by @baszalmstra in #1848
  • Tame pixi search by @baszalmstra in #1849
  • Fix pixi tree -i duplicate output by @baszalmstra in #1847
  • Improve spec parsing error messages by @baszalmstra in #1786
  • Parse matchspec from CLI Lenient by @baszalmstra in #1852
  • Improve parsing of pypi-dependencies by @baszalmstra in #1851
  • Don't enforce system requirements for task tests by @baszalmstra in #1855
  • Satisfy when there are no pypi packages in the lockfile by @ruben-arts in #1862
  • Ssh url should not contain colon by @baszalmstra in #1865
  • find-links with manifest-path by @baszalmstra in #1864
  • Increase stack size in debug mode on windows by @baszalmstra in #1867
  • Solve-group-envs should reside in .pixi folder by @baszalmstra in #1866
  • Move package-override logging by @tdejager in #1883
  • Pinning logic for minor and major by @baszalmstra in #1885
  • Docs manifest tests by @ruben-arts in #1879
"},{"location":"CHANGELOG/#refactor_7","title":"Refactor","text":"
  • Encapsulate channel resolution logic for CLI by @olivier-lacroix in #1781
  • Move to pub(crate) fn in order to detect and remove unused functions by @Hofer-Julian in #1805
  • Only compile TaskNode::full_command for tests by @Hofer-Julian in #1809
  • Derive Default for more structs by @Hofer-Julian in #1824
  • Rename get_up_to_date_prefix to update_prefix by @Hofer-Julian in #1837
  • Make HasSpecs implementation more functional by @Hofer-Julian in #1863
"},{"location":"CHANGELOG/#new-contributors_12","title":"New Contributors","text":"
  • @cdeil made their first contribution in #1846
"},{"location":"CHANGELOG/#0271-2024-08-09","title":"[0.27.1] - 2024-08-09","text":""},{"location":"CHANGELOG/#documentation_15","title":"Documentation","text":"
  • Fix mlx feature in \"multiple machines\" example by @rgommers in #1762
  • Update some of the cli and add osx rosetta mention by @ruben-arts in #1760
  • Fix typo by @pavelzw in #1771
"},{"location":"CHANGELOG/#fixed_15","title":"Fixed","text":"
  • User agent string was wrong by @wolfv in #1759
  • Dont accidentally wipe pyproject.toml on init by @ruben-arts in #1775
"},{"location":"CHANGELOG/#refactor_8","title":"Refactor","text":"
  • Add pixi_spec crate by @baszalmstra in #1741
"},{"location":"CHANGELOG/#new-contributors_13","title":"New Contributors","text":"
  • @rgommers made their first contribution in #1762
"},{"location":"CHANGELOG/#0270-2024-08-07","title":"[0.27.0] - 2024-08-07","text":""},{"location":"CHANGELOG/#highlights_13","title":"\u2728 Highlights","text":"

This release contains a lot of refactoring and improvements to the codebase, in preparation for future features and improvements. Along with that, we've fixed a ton of bugs. To make sure we're not breaking anything, we've added a lot of tests and CI checks. But let us know if you find any issues!

As a reminder, you can update pixi using pixi self-update and move to a specific version, including backwards, with pixi self-update --version 0.27.0.

"},{"location":"CHANGELOG/#added_10","title":"Added","text":"
  • Add pixi run completion for fish shell by @dennis-wey in #1680
"},{"location":"CHANGELOG/#changed_13","title":"Changed","text":"
  • Move examples from setuptools to hatchling by @Hofer-Julian in #1692
  • Let pixi init create hatchling pyproject.toml by @Hofer-Julian in #1693
  • Make [project] table optional for pyproject.toml manifests by @olivier-lacroix in #1732
"},{"location":"CHANGELOG/#documentation_16","title":"Documentation","text":"
  • Improve the fish completions location by @tdejager in #1647
  • Explain why we use hatchling by @Hofer-Julian
  • Update install CLI doc now that the update command exist by @olivier-lacroix in #1690
  • Mention pixi exec in GHA docs by @pavelzw in #1724
  • Update to correct spelling by @ahnsn in #1730
  • Ensure hatchling is used everywhere in documentation by @olivier-lacroix in #1733
  • Add readme to WASM example by @wolfv in #1703
  • Fix typo by @pavelzw in #1660
  • Fix typo by @DimitriPapadopoulos in #1743
  • Fix typo by @SeaOtocinclus in #1651
"},{"location":"CHANGELOG/#testing_1","title":"Testing","text":"
  • Added script and tasks for testing examples by @tdejager in #1671
  • Add simple integration tests by @ruben-arts in #1719
"},{"location":"CHANGELOG/#fixed_16","title":"Fixed","text":"
  • Prepend pixi to path instead of appending by @vigneshmanick in #1644
  • Add manifest tests and run them in ci by @ruben-arts in #1667
  • Use hashed pypi mapping by @baszalmstra in #1663
  • Depend on pep440_rs from crates.io and use replace by @baszalmstra in #1698
  • pixi add with more than just package name and version by @ruben-arts in #1704
  • Ignore pypi logic on non pypi projects by @ruben-arts in #1705
  • Fix and refactor --no-lockfile-update by @ruben-arts in #1683
  • Changed example to use hatchling by @tdejager in #1729
  • Todo clean up by @KGrewal1 in #1735
  • Allow for init to pixi.toml when pyproject.toml is available. by @ruben-arts in #1640
  • Test on macos-13 by @ruben-arts in #1739
  • Make sure pixi vars are available before activation.env vars are by @ruben-arts in #1740
  • Authenticate exec package download by @olivier-lacroix in #1751
"},{"location":"CHANGELOG/#refactor_9","title":"Refactor","text":"
  • Extract pixi_manifest by @baszalmstra in #1656
  • Delay channel config url evaluation by @baszalmstra in #1662
  • Split out pty functionality by @tdejager in #1678
  • Make project manifest loading DRY and consistent by @olivier-lacroix in #1688
  • Refactor channel add and remove CLI commands by @olivier-lacroix in #1689
  • Refactor pixi::consts and pixi::config into separate crates by @tdejager in #1684
  • Move dependencies to pixi_manifest by @tdejager in #1700
  • Moved pypi environment modifiers by @tdejager in #1699
  • Split HasFeatures by @tdejager in #1712
  • Move, splits and renames the HasFeatures trait by @tdejager in #1717
  • Merge utils by @tdejager in #1718
  • Move fancy to its own crate by @tdejager in #1722
  • Move config to repodata functions by @tdejager in #1723
  • Move pypi-mapping to its own crate by @tdejager in #1725
  • Split utils into 2 crates by @tdejager in #1736
  • Add progress bar as a crate by @nichmor in #1727
  • Split up pixi_manifest lib by @tdejager in #1661
"},{"location":"CHANGELOG/#new-contributors_14","title":"New Contributors","text":"
  • @DimitriPapadopoulos made their first contribution in #1743
  • @KGrewal1 made their first contribution in #1735
  • @ahnsn made their first contribution in #1730
  • @dennis-wey made their first contribution in #1680
"},{"location":"CHANGELOG/#0261-2024-07-22","title":"[0.26.1] - 2024-07-22","text":""},{"location":"CHANGELOG/#fixed_17","title":"Fixed","text":"
  • Make sure we also build the msi installer by @ruben-arts in #1645
"},{"location":"CHANGELOG/#0260-2024-07-19","title":"[0.26.0] - 2024-07-19","text":""},{"location":"CHANGELOG/#highlights_14","title":"\u2728 Highlights","text":"
  • Specify how pixi pins your dependencies with the pinning-strategy in the config, e.g. semver -> >=1.2.3,<2 and no-pin -> *. #1516
  • Specify how pixi solves multiple channels with channel-priority in the manifest. #1631
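A sketch of the two new knobs; other values exist, so see the linked PRs for the full list:

```toml
# In the pixi config (e.g. the global config.toml): how `pixi add` pins
pinning-strategy = "no-pin"    # default is "semver"

# In pixi.toml, under [project]: how multiple channels are resolved
# channel-priority = "strict"  # or "disabled"
```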
"},{"location":"CHANGELOG/#added_11","title":"Added","text":"
  • Add short options to config location flags by @ruben-arts in #1586
  • Add a file guard to indicate if an environment is being installed by @baszalmstra in #1593
  • Add pinning-strategy to the configuration by @ruben-arts in #1516
  • Add channel-priority to the manifest and solve by @ruben-arts in #1631
  • Add nushell completion by @Hofer-Julian in #1599
  • Add nushell completions for pixi run by @Hofer-Julian in #1627
  • Add completion for pixi run --environment for nushell by @Hofer-Julian in #1636
"},{"location":"CHANGELOG/#changed_14","title":"Changed","text":"
  • Upgrade uv 0.2.18 by @tdejager in #1540
  • Refactor pyproject.toml parser by @nichmor in #1592
  • Interactive warning for packages in pixi global install by @ruben-arts in #1626
"},{"location":"CHANGELOG/#documentation_17","title":"Documentation","text":"
  • Add WASM example with JupyterLite by @wolfv in #1623
  • Added LLM example by @ytjhai in #1545
  • Add note to mark directory as excluded in pixi-pycharm by @pavelzw in #1579
  • Add changelog to docs by @vigneshmanick in #1574
  • Updated the values of the system requirements by @tdejager in #1575
  • Tell cargo install which bin to install by @ruben-arts in #1584
  • Update conflict docs for cargo add by @Hofer-Julian in #1600
  • Revert \"Update conflict docs for cargo add \" by @Hofer-Julian in #1605
  • Add reference documentation for the exec command by @baszalmstra in #1587
  • Add transitioning docs for poetry and conda by @ruben-arts in #1624
  • Add pixi-pack by @pavelzw in #1629
  • Use '-' instead of '_' for package name by @olivier-lacroix in #1628
"},{"location":"CHANGELOG/#fixed_18","title":"Fixed","text":"
  • Flaky task test by @tdejager in #1581
  • Pass command line arguments verbatim by @baszalmstra in #1582
  • Run clippy on all targets by @Hofer-Julian in #1588
  • Pre-commit install pixi task by @Hofer-Julian in #1590
  • Add clap_complete_nushell to dependencies by @Hofer-Julian in #1625
  • Write to stdout for machine readable output by @Hofer-Julian in #1639
"},{"location":"CHANGELOG/#refactor_10","title":"Refactor","text":"
  • Migrate to workspace by @baszalmstra in #1597
"},{"location":"CHANGELOG/#removed","title":"Removed","text":"
  • Remove double manifest warning by @tdejager in #1580
"},{"location":"CHANGELOG/#new-contributors_15","title":"New Contributors","text":"
  • @ytjhai made their first contribution in #1545
"},{"location":"CHANGELOG/#0250-2024-07-05","title":"[0.25.0] - 2024-07-05","text":""},{"location":"CHANGELOG/#highlights_15","title":"\u2728 Highlights","text":"
  • The pixi exec command executes commands in temporary environments, useful for testing in short-lived sessions.
  • We've bumped the default system-requirements to higher defaults: glibc (2.17 -> 2.28), osx-64 (10.15 -> 13.0), osx-arm64 (11.0 -> 13.0). Let us know if this causes any issues. To keep the previous values, please use a system-requirements table; this is explained here
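To keep the pre-0.25.0 defaults, a system-requirements table along these lines should work; check the system-requirements docs for the exact key names:

```toml
# In pixi.toml: opt back into the previous defaults
[system-requirements]
libc = { family = "glibc", version = "2.17" }
macos = "10.15"
```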
"},{"location":"CHANGELOG/#changed_15","title":"Changed","text":"
  • Bump system requirements by @wolfv in #1553
  • Better error when exec is missing a cmd by @tdejager in #1565
  • Make exec use authenticated client by @tdejager in #1568
"},{"location":"CHANGELOG/#documentation_18","title":"Documentation","text":"
  • Automatic updating using github actions by @pavelzw in #1456
  • Describe the --change-ps1 option for pixi shell by @Yura52 in #1536
  • Add some other quantco repos by @pavelzw in #1542
  • Add example using geos-rs by @Hofer-Julian in #1563
"},{"location":"CHANGELOG/#fixed_19","title":"Fixed","text":"
  • Tiny error in basic_usage.md by @Sjouks in #1513
  • Lazy initialize client by @baszalmstra in #1511
  • URL typos in rtd examples by @kklein in #1538
  • Fix satisfiability for short sha hashes by @tdejager in #1530
  • Wrong path passed to dynamic check by @tdejager in #1552
  • Don't error if no tasks is available on platform by @hoxbro in #1550
"},{"location":"CHANGELOG/#refactor_11","title":"Refactor","text":"
  • Add to use update code by @baszalmstra in #1508
"},{"location":"CHANGELOG/#new-contributors_16","title":"New Contributors","text":"
  • @kklein made their first contribution in #1538
  • @Yura52 made their first contribution in #1536
  • @Sjouks made their first contribution in #1513
"},{"location":"CHANGELOG/#0242-2024-06-14","title":"[0.24.2] - 2024-06-14","text":""},{"location":"CHANGELOG/#documentation_19","title":"Documentation","text":"
  • Add readthedocs examples by @bollwyvl in #1423
  • Fix typo in project_configuration.md by @RaulPL in #1502
"},{"location":"CHANGELOG/#fixed_20","title":"Fixed","text":"
  • Too many shell variables in activation of pixi shell by @ruben-arts in #1507
"},{"location":"CHANGELOG/#0241-2024-06-12","title":"[0.24.1] - 2024-06-12","text":""},{"location":"CHANGELOG/#fixed_21","title":"Fixed","text":"
  • Replace http code %2b with + by @ruben-arts in #1500
"},{"location":"CHANGELOG/#0240-2024-06-12","title":"[0.24.0] - 2024-06-12","text":""},{"location":"CHANGELOG/#highlights_16","title":"\u2728 Highlights","text":"
  • You can now run in a more isolated environment on unix machines, using pixi run --clean-env TASK_NAME.
  • You can now easily clean your environment with pixi clean, or the cache with pixi clean cache.
"},{"location":"CHANGELOG/#added_12","title":"Added","text":"
  • Add pixi clean command by @ruben-arts in #1325
  • Add --clean-env flag to tasks and run command by @ruben-arts in #1395
  • Add description field to task by @jjjermiah in #1479
  • Add pixi file to the environment to add pixi specific details by @ruben-arts in #1495
"},{"location":"CHANGELOG/#changed_16","title":"Changed","text":"
  • Project environment cli by @baszalmstra in #1433
  • Update task list console output by @vigneshmanick in #1443
  • Upgrade uv by @tdejager in #1436
  • Sort packages in list_global_packages by @dhirschfeld in #1458
  • Added test for special chars wheel filename by @tdejager in #1454
"},{"location":"CHANGELOG/#documentation_20","title":"Documentation","text":"
  • Improve multi env tasks documentation by @ruben-arts in #1494
"},{"location":"CHANGELOG/#fixed_22","title":"Fixed","text":"
  • Use the activated environment when running a task by @tdejager in #1461
  • Fix authentication pypi-deps for download from lockfile by @tdejager in #1460
  • Display channels correctly in pixi info by @ruben-arts in #1459
  • Render help for --frozen by @ruben-arts in #1468
  • Don't record purl for non conda-forge channels by @nichmor in #1451
  • Use best_platform to verify the run platform by @ruben-arts in #1472
  • Creation of parent dir of symlink by @ruben-arts in #1483
  • pixi install --all output missing newline by @vigneshmanick in #1487
  • Don't error on already existing dependency by @ruben-arts in #1449
  • Remove debug true in release by @ruben-arts in #1477
"},{"location":"CHANGELOG/#new-contributors_17","title":"New Contributors","text":"
  • @dhirschfeld made their first contribution in #1458

Full commit history

"},{"location":"CHANGELOG/#0230-2024-05-27","title":"[0.23.0] - 2024-05-27","text":""},{"location":"CHANGELOG/#highlights_17","title":"\u2728 Highlights","text":"
  • This release adds two new commands pixi config and pixi update
    • pixi config allows you to edit, set, unset, append, prepend and list your local/global or system configuration.
    • pixi update re-solves the full lockfile, or use pixi update PACKAGE to only update PACKAGE, making sure your project uses the latest versions that the manifest allows for.
"},{"location":"CHANGELOG/#added_13","title":"Added","text":"
  • Add pixi config command by @chawyehsu in #1339
  • Add pixi list --explicit flag command by @jjjermiah in #1403
  • Add [activation.env] table for environment variables by @ruben-arts in #1156
  • Allow installing multiple envs, including --all at once by @tdejager in #1413
  • Add pixi update command to re-solve the lockfile by @baszalmstra in #1431 (fixes 20 :thumbsup:)
  • Add detached-environments to the config, move environments outside the project folder by @ruben-arts in #1381 (fixes 11 :thumbsup:)
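The new [activation.env] table from #1156 can be sketched like this (the variable names are hypothetical, for illustration only):

```toml
# Sketch: environment variables set whenever the environment is activated
[activation.env]
MY_TOOL_HOME = "$PIXI_PROJECT_ROOT/tools"
LOG_LEVEL = "debug"
```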
"},{"location":"CHANGELOG/#changed_17","title":"Changed","text":"
  • Use the gateway to fetch repodata by @baszalmstra in #1307
  • Switch to compressed mapping by @nichmor in #1335
  • Warn on pypi conda clobbering by @nichmor in #1353
  • Align remove arguments with add by @olivier-lacroix in #1406
  • Add backward compat logic for older lock files by @nichmor in #1425
"},{"location":"CHANGELOG/#documentation_21","title":"Documentation","text":"
  • Fix small screen by removing getting started section. by @ruben-arts in #1393
  • Improve caching docs by @ruben-arts in #1422
  • Add example, python library using gcp upload by @tdejager in #1380
  • Correct typos with --no-lockfile-update. by @tobiasraabe in #1396
"},{"location":"CHANGELOG/#fixed_23","title":"Fixed","text":"
  • Trim channel url when filter packages_for_prefix_mapping by @zen-xu in #1391
  • Use the right channels when upgrading global packages by @olivier-lacroix in #1326
  • Fish prompt display looks wrong in tide by @tfriedel in #1424
  • Use local mapping instead of remote by @nichmor in #1430
"},{"location":"CHANGELOG/#refactor_12","title":"Refactor","text":"
  • Remove unused fetch_sparse_repodata by @olivier-lacroix in #1411
  • Remove project level method that are per environment by @olivier-lacroix in #1412
  • Update lockfile functionality for reusability by @baszalmstra in #1426
"},{"location":"CHANGELOG/#new-contributors_18","title":"New Contributors","text":"
  • @tfriedel made their first contribution in #1424
  • @jjjermiah made their first contribution in #1403
  • @tobiasraabe made their first contribution in #1396

Full commit history

"},{"location":"CHANGELOG/#0220-2024-05-13","title":"[0.22.0] - 2024-05-13","text":""},{"location":"CHANGELOG/#highlights_18","title":"\u2728 Highlights","text":"
  • Support for source pypi dependencies through the cli:
    • pixi add --pypi 'package @ package.whl', perfect for adding freshly built wheels to your environment in CI.
    • pixi add --pypi 'package_from_git @ git+https://github.com/org/package.git', to add a package from a git repository.
    • pixi add --pypi 'package_from_path @ file:///path/to/package' --editable, to add a package from a local path.
"},{"location":"CHANGELOG/#added_14","title":"Added","text":"
  • Implement more functions for pixi add --pypi by @wolfv in #1244
"},{"location":"CHANGELOG/#documentation_22","title":"Documentation","text":"
  • Update install cli doc by @vigneshmanick in #1336
  • Replace empty default example with no-default-feature by @beenje in #1352
  • Document the add & remove cli behaviour with pyproject.toml manifest by @olivier-lacroix in #1338
  • Add environment activation to GitHub actions docs by @pavelzw in #1371
  • Clarify in CLI that run can also take commands by @twrightsman in #1368
"},{"location":"CHANGELOG/#fixed_24","title":"Fixed","text":"
  • Automated update of install script in pixi.sh by @ruben-arts in #1351
  • Wrong description on pixi project help by @notPlancha in #1358
  • Don't need a python interpreter when not having pypi dependencies. by @ruben-arts in #1366
  • Don't error on not editable not path by @ruben-arts in #1365
  • Align shell-hook cli with shell by @ruben-arts in #1364
  • Only write prefix file if needed by @ruben-arts in #1363
"},{"location":"CHANGELOG/#refactor_13","title":"Refactor","text":"
  • Lock-file resolve functionality in separated modules by @tdejager in #1337
  • Use generic for RepoDataRecordsByName and PypiRecordsByName by @olivier-lacroix in #1341
"},{"location":"CHANGELOG/#new-contributors_19","title":"New Contributors","text":"
  • @twrightsman made their first contribution in #1368
  • @notPlancha made their first contribution in #1358
  • @vigneshmanick made their first contribution in #1336

Full commit history

"},{"location":"CHANGELOG/#0211-2024-05-07","title":"[0.21.1] - 2024-05-07","text":""},{"location":"CHANGELOG/#fixed_25","title":"Fixed","text":"
  • Use read timeout, not global timeout by @wolfv in #1329
  • Channel priority logic by @ruben-arts in #1332

Full commit history

"},{"location":"CHANGELOG/#0210-2024-05-06","title":"[0.21.0] - 2024-05-06","text":""},{"location":"CHANGELOG/#highlights_19","title":"\u2728 Highlights","text":"
  • This release adds support for configuring PyPI settings globally, to use alternative PyPI indexes and load credentials with keyring.
  • We now support cross-platform running, for osx-64 on osx-arm64 and wasm environments.
  • There is now a no-default-feature option to simplify usage of environments.
"},{"location":"CHANGELOG/#added_15","title":"Added","text":"
  • Add pypi config for global local config file + keyring support by @wolfv in #1279
  • Allow for cross-platform running, for osx-64 on osx-arm64 and wasm environments by @wolfv in #1020
"},{"location":"CHANGELOG/#changed_18","title":"Changed","text":"
  • Add no-default-feature option to environments by @olivier-lacroix in #1092
  • Add /etc/pixi/config.toml to global configuration search paths by @pavelzw in #1304
  • Change global config fields to kebab-case by @tdejager in #1308
  • Show all available tasks with task list by @Hoxbro in #1286
  • Allow to emit activation environment variables as JSON by @borchero in #1317
  • Use locked pypi packages as preferences in the pypi solve to get minimally updating lock files by @ruben-arts in #1320
  • Allow to upgrade several global packages at once by @olivier-lacroix in #1324
"},{"location":"CHANGELOG/#documentation_23","title":"Documentation","text":"
  • Typo in tutorials python by @carschandler in #1297
  • Python Tutorial: Dependencies, PyPI, Order, Grammar by @JesperDramsch in #1313
"},{"location":"CHANGELOG/#fixed_26","title":"Fixed","text":"
  • Schema version and add it to tbump by @ruben-arts in #1284
  • Make integration test fail in ci and fix ssh issue by @ruben-arts in #1301
  • Automate adding install scripts to the docs by @ruben-arts in #1302
  • Do not always request for prefix mapping by @nichmor in #1300
  • Align CLI aliases and add missing by @ruben-arts in #1316
  • Alias depends_on to depends-on by @ruben-arts in #1310
  • Add error if channel or platform doesn't exist on remove by @ruben-arts in #1315
  • Allow spec in pixi q instead of only name by @ruben-arts in #1314
  • Remove dependency on sysroot for linux by @ruben-arts in #1319
  • Fix linking symlink issue, by updating to the latest rattler by @baszalmstra in #1327
"},{"location":"CHANGELOG/#refactor_14","title":"Refactor","text":"
  • Use IndexSet instead of Vec for collections of unique elements by @olivier-lacroix in #1289
  • Use generics over PyPiDependencies and CondaDependencies by @olivier-lacroix in #1303
"},{"location":"CHANGELOG/#new-contributors_20","title":"New Contributors","text":"
  • @borchero made their first contribution in #1317
  • @JesperDramsch made their first contribution in #1313
  • @Hoxbro made their first contribution in #1286
  • @carschandler made their first contribution in #1297

Full commit history

"},{"location":"CHANGELOG/#0201-2024-04-26","title":"[0.20.1] - 2024-04-26","text":""},{"location":"CHANGELOG/#highlights_20","title":"\u2728 Highlights","text":"
  • Big improvements on the pypi-editable installs.
"},{"location":"CHANGELOG/#fixed_27","title":"Fixed","text":"
  • Editable non-satisfiable by @baszalmstra in #1251
  • Satisfiability with pypi extras by @baszalmstra in #1253
  • Change global install activation script permission from 0o744 -> 0o755 by @zen-xu in #1250
  • Avoid creating Empty TOML tables by @olivier-lacroix in #1270
  • Uses the special-case uv path handling for both built and source by @tdejager in #1263
  • Modify test before attempting to write to .bash_profile in install.sh by @bruchim-cisco in #1267
  • Parse properly 'default' as environment Cli argument by @olivier-lacroix in #1247
  • Apply schema.json normalization, add to docs by @bollwyvl in #1265
  • Improve absolute path satisfiability by @tdejager in #1252
  • Improve parse deno error and make task a required field in the cli by @ruben-arts in #1260
"},{"location":"CHANGELOG/#new-contributors_21","title":"New Contributors","text":"
  • @bollwyvl made their first contribution in #1265
  • @bruchim-cisco made their first contribution in #1267
  • @zen-xu made their first contribution in #1250

Full commit history

"},{"location":"CHANGELOG/#0200-2024-04-19","title":"[0.20.0] - 2024-04-19","text":""},{"location":"CHANGELOG/#highlights_21","title":"\u2728 Highlights","text":"
  • We now support env variables in the task definition; these can also be used as default values for parameters in your task, which you can overwrite with your shell's env variables, e.g. task = { cmd = \"task to run\", env = { VAR=\"value1\", PATH=\"my/path:$PATH\" } }
  • We made a big effort on fixing issues and improving documentation!
"},{"location":"CHANGELOG/#added_16","title":"Added","text":"
  • Add env to the tasks to specify tasks specific environment variables by @wolfv in https://github.com/prefix-dev/pixi/pull/972
"},{"location":"CHANGELOG/#changed_19","title":"Changed","text":"
  • Add --pyproject option to pixi init with a pyproject.toml by @olivier-lacroix in #1188
  • Upgrade to new uv version 0.1.32 by @tdejager in #1208
"},{"location":"CHANGELOG/#documentation_24","title":"Documentation","text":"
  • Document pixi.lock by @ruben-arts in #1209
  • Document channel priority definition by @ruben-arts in #1234
  • Add rust tutorial including openssl example by @ruben-arts in #1155
  • Add python tutorial to documentation by @tdejager in #1179
  • Add JupyterLab integration docs by @renan-r-santos in #1147
  • Add Windows support for PyCharm integration by @pavelzw in #1192
  • Setup_pixi for local pixi installation by @ytausch in #1181
  • Update pypi docs by @Hofer-Julian in #1215
  • Fix order of --no-deps when pip installing in editable mode by @glemaitre in #1220
  • Fix frozen documentation by @ruben-arts in #1167
"},{"location":"CHANGELOG/#fixed_28","title":"Fixed","text":"
  • Small typo in list cli by @tdejager in #1169
  • Issue with invalid solve group by @baszalmstra in #1190
  • Improve error on parsing lockfile by @ruben-arts in #1180
  • Replace _ with - when creating environments from features by @wolfv in #1203
  • Prevent duplicate direct dependencies in tree by @abkfenris in #1184
  • Use project root directory instead of task.working_directory for base dir when hashing by @wolfv in #1202
  • Do not leak env vars from bat scripts in cmd.exe by @wolfv in #1205
  • Make file globbing behave more as expected by @wolfv in #1204
  • Fix for using file::// in pyproject.toml dependencies by @tdejager in #1196
  • Improve pypi version conversion in pyproject.toml dependencies by @wolfv in #1201
  • Update to the latest rattler by @wolfv in #1235
"},{"location":"CHANGELOG/#breaking_1","title":"BREAKING","text":"
  • task = { cmd = \"task to run\", cwd = \"folder\", inputs = \"input.txt\", output = \"output.txt\"}: where input.txt and output.txt were previously relative to folder, they are now relative to the project root. This changed in: #1202
  • task = { cmd = \"task to run\", inputs = \"input.txt\"}: this previously matched all input.txt files; now it only matches the ones in the project root. This changed in: #1204
"},{"location":"CHANGELOG/#new-contributors_22","title":"New Contributors","text":"
  • @glemaitre made their first contribution in #1220

Full commit history

"},{"location":"CHANGELOG/#0191-2024-04-11","title":"[0.19.1] - 2024-04-11","text":""},{"location":"CHANGELOG/#highlights_22","title":"\u2728 Highlights","text":"

This fixes the issue where pixi would generate broken environments/lockfiles when a mapping for a brand-new version of a package is missing.

"},{"location":"CHANGELOG/#changed_20","title":"Changed","text":"
  • Add fallback mechanism for missing mapping by @nichmor in #1166

Full commit history

"},{"location":"CHANGELOG/#0190-2024-04-10","title":"[0.19.0] - 2024-04-10","text":""},{"location":"CHANGELOG/#highlights_23","title":"\u2728 Highlights","text":"
  • This release adds a new pixi tree command to show the dependency tree of the project.
  • Pixi now persists the manifest and environment when activating a shell, so you can use pixi as if you are in that folder while in the shell.
"},{"location":"CHANGELOG/#added_17","title":"Added","text":"
  • pixi tree command to show dependency tree by @abkfenris in #1069
  • Persistent shell manifests by @abkfenris in #1080
  • Add to pypi in feature (pixi add --feature test --pypi package) by @ruben-arts in #1135
  • Use new mapping by @nichmor in #888
  • --no-progress to disable all progress bars by @baszalmstra in #1105
  • Create a table if channel is specified (pixi add conda-forge::rattler-build) by @baszalmstra in #1079
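As a sketch of the last item, pixi add conda-forge::rattler-build would produce a manifest entry along these lines (the exact version spec may differ):

```toml
# Sketch: dependency written as an inline table when a channel is specified
[dependencies]
rattler-build = { version = "*", channel = "conda-forge" }
```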
"},{"location":"CHANGELOG/#changed_21","title":"Changed","text":"
  • Add the project itself as an editable dependency by @olivier-lacroix in #1084
  • Get tool.pixi.project.name from project.name by @olivier-lacroix in #1112
  • Create features and environments from extras by @olivier-lacroix in #1077
  • PyPI support comes out of Beta by @olivier-lacroix in #1120
  • Enable to force PIXI_ARCH for pixi installation by @beenje in #1129
  • Improve tool.pixi.project detection logic by @olivier-lacroix in #1127
  • Add purls for packages if adding pypi dependencies by @nichmor in #1148
  • Add env name if not default to tree and list commands by @ruben-arts in #1145
"},{"location":"CHANGELOG/#documentation_25","title":"Documentation","text":"
  • Add MODFLOW 6 to community docs by @Hofer-Julian in #1125
  • Addition of ros2 tutorial by @ruben-arts in #1116
  • Improve install script docs by @ruben-arts in #1136
  • More structured table of content by @tdejager in #1142
"},{"location":"CHANGELOG/#fixed_29","title":"Fixed","text":"
  • Amend syntax in conda-meta/history to prevent conda.history.History.parse() error by @jaimergp in #1117
  • Fix docker example and include pyproject.toml by @tdejager in #1121
"},{"location":"CHANGELOG/#new-contributors_23","title":"New Contributors","text":"
  • @abkfenris made their first contribution in #1069
  • @beenje made their first contribution in #1129
  • @jaimergp made their first contribution in #1117

Full commit history

"},{"location":"CHANGELOG/#0180-2024-04-02","title":"[0.18.0] - 2024-04-02","text":""},{"location":"CHANGELOG/#highlights_24","title":"\u2728 Highlights","text":"
  • This release adds support for pyproject.toml, now pixi reads from the [tool.pixi] table.
  • We now support editable PyPI dependencies, and PyPI source dependencies, including git, path, and url dependencies.

[!TIP] These new features are part of the ongoing effort to make pixi more flexible, powerful, and comfortable for Python users. They are still in progress, so expect more improvements soon; please report any issues you encounter and follow our next releases!

"},{"location":"CHANGELOG/#added_18","title":"Added","text":"
  • Support for pyproject.toml by @olivier-lacroix in #999
  • Support for PyPI source dependencies by @tdejager in #985
  • Support for editable PyPI dependencies by @tdejager in #1044
"},{"location":"CHANGELOG/#changed_22","title":"Changed","text":"
  • XDG_CONFIG_HOME and XDG_CACHE_HOME compliance by @chawyehsu in #1050
  • Build pixi for windows arm by @baszalmstra in #1053
  • Platform literals by @baszalmstra in #1054
  • Cli docs: --user is actually --username
  • Fixed error in auth example (CLI docs) by @ytausch in #1076
"},{"location":"CHANGELOG/#documentation_26","title":"Documentation","text":"
  • Add lockfile update description in preparation for pixi update by @ruben-arts in #1073
  • zsh may be used for installation on macOS by @pya in #1091
  • Fix typo in pixi auth documentation by @ytausch in #1076
  • Add rstudio to the IDE integration docs by @wolfv in #1144
"},{"location":"CHANGELOG/#fixed_30","title":"Fixed","text":"
  • Test failure on riscv64 by @hack3ric in #1045
  • Validation test was testing on a wrong pixi.toml by @ruben-arts in #1056
  • Pixi list shows path and editable by @baszalmstra in #1100
  • Docs ci by @ruben-arts in #1074
  • Add error for unsupported pypi dependencies by @baszalmstra in #1052
  • Interactively delete environment when it was relocated by @baszalmstra in #1102
  • Allow solving for different platforms by @baszalmstra in #1101
  • Don't allow extra keys in pypi requirements by @baszalmstra in #1104
  • Solve when moving dependency from conda to pypi by @baszalmstra in #1099
"},{"location":"CHANGELOG/#new-contributors_24","title":"New Contributors","text":"
  • @pya made their first contribution in #1091
  • @ytausch made their first contribution in #1076
  • @hack3ric made their first contribution in #1045
  • @olivier-lacroix made their first contribution in #999
  • @henryiii made their first contribution in #1063

Full commit history

"},{"location":"CHANGELOG/#0171-2024-03-21","title":"[0.17.1] - 2024-03-21","text":""},{"location":"CHANGELOG/#highlights_25","title":"\u2728 Highlights","text":"

A quick bug-fix release for pixi list.

"},{"location":"CHANGELOG/#documentation_27","title":"Documentation","text":"
  • Fix typo by @pavelzw in #1028
"},{"location":"CHANGELOG/#fixed_31","title":"Fixed","text":"
  • Remove the need for a python interpreter in pixi list by @baszalmstra in #1033
"},{"location":"CHANGELOG/#0170-2024-03-19","title":"[0.17.0] - 2024-03-19","text":""},{"location":"CHANGELOG/#highlights_26","title":"\u2728 Highlights","text":"
  • This release greatly improves pixi global commands, thanks to @chawyehsu!
  • We now support global (or local) configuration for pixi's own behavior, including mirrors and OCI registries.
  • We support channel mirrors for corporate environments!
  • Faster task execution thanks to caching \ud83d\ude80 Tasks that already executed successfully can be skipped based on the hash of the inputs and outputs.
  • PyCharm and GitHub Actions integration thanks to @pavelzw \u2013 read more about it in the docs!
"},{"location":"CHANGELOG/#added_19","title":"Added","text":"
  • Add citation file by @ruben-arts in #908
  • Add a pixi badge by @ruben-arts in #961
  • Add deserialization of pypi source dependencies from toml by @ruben-arts and @wolfv in #895 #984
  • Implement mirror and OCI settings by @wolfv in #988
  • Implement inputs and outputs hash based task skipping by @wolfv in #933
"},{"location":"CHANGELOG/#changed_23","title":"Changed","text":"
  • Refined global upgrade commands by @chawyehsu in #948
  • Global upgrade supports matchspec by @chawyehsu in #962
  • Improve pixi search with platform selection and making limit optional by @wolfv in #979
  • Implement global config options by @wolfv in #960 #1015 #1019
  • Update auth to use rattler cli by @kassoulait by @ruben-arts in #986
"},{"location":"CHANGELOG/#documentation_28","title":"Documentation","text":"
  • Remove cache: true from setup-pixi by @pavelzw in #950
  • Add GitHub Actions documentation by @pavelzw in #955
  • Add PyCharm documentation by @pavelzw in #974
  • Mention watch_file in direnv usage by @pavelzw in #983
  • Add tip to help users when no PROFILE file exists by @ruben-arts in #991
  • Move yaml comments into mkdocs annotations by @pavelzw in #1003
  • Fix --env and extend actions examples by @ruben-arts in #1005
  • Add Wflow to projects built with pixi by @Hofer-Julian in #1006
  • Removed linenums to avoid buggy visualization by @ruben-arts in #1002
  • Fix typos by @pavelzw in #1016
"},{"location":"CHANGELOG/#fixed_32","title":"Fixed","text":"
  • Pypi dependencies not being removed by @tdejager in #952
  • Permissions for lint pr by @ruben-arts in #852
  • Install Windows executable with install.sh in Git Bash by @jdblischak in #966
  • Proper scanning of the conda-meta folder for json entries by @wolfv in #971
  • Global shim scripts for Windows by @wolfv in #975
  • Correct fish prompt by @wolfv in #981
  • Prefix_file rename by @ruben-arts in #959
  • Conda transitive dependencies of pypi packages are properly extracted by @baszalmstra in #967
  • Make tests more deterministic and use single * for glob expansion by @wolfv in #987
  • Create conda-meta/history file by @pavelzw in #995
  • Pypi dependency parsing was too lenient by @wolfv in #984
  • Add reactivation of the environment in pixi shell by @wolfv in #982
  • Add tool to strict json schema by @ruben-arts in #969
"},{"location":"CHANGELOG/#new-contributors_25","title":"New Contributors","text":"
  • @jdblischak made their first contribution in #966
  • @kassoulait made their first contribution in #986

Full commit history

"},{"location":"CHANGELOG/#0161-2024-03-11","title":"[0.16.1] - 2024-03-11","text":""},{"location":"CHANGELOG/#fixed_33","title":"Fixed","text":"
  • Parse lockfile matchspecs lenient, fixing bug introduced in 0.16.0 by @ruben-arts in #951

Full commit history

"},{"location":"CHANGELOG/#0160-2024-03-09","title":"[0.16.0] - 2024-03-09","text":""},{"location":"CHANGELOG/#highlights_27","title":"\u2728 Highlights","text":"
  • This release removes rip and adds uv as the PyPI resolver and installer.
"},{"location":"CHANGELOG/#added_20","title":"Added","text":"
  • Add tcsh install support by @obust in #898
  • Add user agent to pixi http client by @baszalmstra in #892
  • Add a schema for the pixi.toml by @ruben-arts in #936
"},{"location":"CHANGELOG/#changed_24","title":"Changed","text":"
  • Switch from rip to uv by @tdejager in #863
  • Move uv options into context by @tdejager in #911
  • Add Deltares projects to Community.md by @Hofer-Julian in #920
  • Upgrade to uv 0.1.16, updated for changes in the API by @tdejager in #935
"},{"location":"CHANGELOG/#fixed_34","title":"Fixed","text":"
  • Made the uv re-install logic a bit more clear by @tdejager in #894
  • Avoid duplicate pip dependency while importing environment.yaml by @sumanth-manchala in #890
  • Handle custom channels when importing from env yaml by @sumanth-manchala in #901
  • Pip editable installs getting uninstalled by @renan-r-santos in #902
  • Highlight pypi deps in pixi list by @sumanth-manchala in #907
  • Default to the default environment if possible by @ruben-arts in #921
  • Switching channels by @baszalmstra in #923
  • Use correct name of the channel on adding by @ruben-arts in #928
  • Turn back on jlap for faster repodata fetching by @ruben-arts in #937
  • Remove dists site-packages's when python interpreter changes by @tdejager in #896
"},{"location":"CHANGELOG/#new-contributors_26","title":"New Contributors","text":"
  • @obust made their first contribution in #898
  • @renan-r-santos made their first contribution in #902

Full commit history

"},{"location":"CHANGELOG/#0152-2024-02-29","title":"[0.15.2] - 2024-02-29","text":""},{"location":"CHANGELOG/#changed_25","title":"Changed","text":"
  • Add more info to a failure of activation by @ruben-arts in #873
"},{"location":"CHANGELOG/#fixed_35","title":"Fixed","text":"
  • Improve global list UX when there is no global env dir created by @sumanth-manchala in #865
  • Update rattler to v0.19.0 by @AliPiccioniQC in #885
  • Error on pixi run if platform is not supported by @ruben-arts in #878
"},{"location":"CHANGELOG/#new-contributors_27","title":"New Contributors","text":"
  • @sumanth-manchala made their first contribution in #865
  • @AliPiccioniQC made their first contribution in #885

Full commit history

"},{"location":"CHANGELOG/#0151-2024-02-26","title":"[0.15.1] - 2024-02-26","text":""},{"location":"CHANGELOG/#added_21","title":"Added","text":"
  • Add prefix to project info json output by @baszalmstra in #859
"},{"location":"CHANGELOG/#changed_26","title":"Changed","text":"
  • New pixi global list display format by @chawyehsu in #723
  • Add direnv usage by @pavelzw in #845
  • Add docker example by @pavelzw in #846
  • Install/remove multiple packages globally by @chawyehsu in #854
"},{"location":"CHANGELOG/#fixed_36","title":"Fixed","text":"
  • Prefix file in init --import by @ruben-arts in #855
  • Environment and feature names in pixi info --json by @baszalmstra in #857

Full commit history

"},{"location":"CHANGELOG/#0150-2024-02-23","title":"[0.15.0] - 2024-02-23","text":""},{"location":"CHANGELOG/#highlights_28","title":"\u2728 Highlights","text":"
  • [pypi-dependencies] now get built in the created environment, so they use the conda-installed build tools.
  • pixi init --import env.yml to import an existing conda environment file.
  • [target.unix.dependencies] to specify dependencies for unix systems instead of per platform.

[!WARNING] This version's build failed; use v0.15.1

"},{"location":"CHANGELOG/#added_22","title":"Added","text":"
  • pass environment variables during pypi resolution and install (#818)
  • skip micromamba style selector lines and warn about them (#830)
  • add import yml flag (#792)
  • check duplicate dependencies (#717)
  • (ci) check conventional PR title (#820)
  • add --feature to pixi add (#803)
  • add windows, macos, linux and unix to targets (#832)
"},{"location":"CHANGELOG/#fixed_37","title":"Fixed","text":"
  • cache and retry pypi name mapping (#839)
  • check duplicates while adding dependencies (#829)
  • logic PIXI_NO_PATH_UPDATE variable (#822)
"},{"location":"CHANGELOG/#other","title":"Other","text":"
  • add mike to the documentation and update looks (#809)
  • add instructions for installing on Alpine Linux (#828)
  • more error reporting in self-update (#823)
  • disabled jlap for now (#836)

Full commit history

"},{"location":"CHANGELOG/#0140-2024-02-15","title":"[0.14.0] - 2024-02-15","text":""},{"location":"CHANGELOG/#highlights_29","title":"\u2728 Highlights","text":"

Now, solve-groups can be used in [environments] to ensure dependency alignment across different environments without simultaneous installation. This feature is particularly beneficial for managing identical dependencies in test and production environments. Example configuration:

[environments]
test = { features = ["prod", "test"], solve-groups = ["group1"] }
prod = { features = ["prod"], solve-groups = ["group1"] }
This setup simplifies managing dependencies that must be consistent across test and production.

"},{"location":"CHANGELOG/#added_23","title":"Added","text":"
  • Add index field to pypi requirements by @vlad-ivanov-name in #784
  • Add -f/--feature to the pixi project platform command by @ruben-arts in #785
  • Warn user when unused features are defined by @ruben-arts in #762
  • Disambiguate tasks interactive by @baszalmstra in #766
  • Solve groups for conda by @baszalmstra in #783
  • Pypi solve groups by @baszalmstra in #802
  • Enable reflinks by @baszalmstra in #729
"},{"location":"CHANGELOG/#changed_27","title":"Changed","text":"
  • Add environment name to the progress by @ruben-arts in #788
  • Set color scheme by @ruben-arts in #773
  • Update lock on pixi list by @ruben-arts in #775
  • Use default env if task available in it. by @ruben-arts in #772
  • Color environment name in install step by @ruben-arts in #795
"},{"location":"CHANGELOG/#fixed_38","title":"Fixed","text":"
  • Running cuda env and using those tasks. by @ruben-arts in #764
  • Make svg a gif by @ruben-arts in #782
  • Fmt by @ruben-arts
  • Check for correct platform in task env creation by @ruben-arts in #759
  • Remove using source name by @ruben-arts in #765
  • Auto-guessing of the shell in the shell-hook by @ruben-arts in https://github.com/prefix-dev/pixi/pull/811
  • sdist with direct references by @nichmor in https://github.com/prefix-dev/pixi/pull/813
"},{"location":"CHANGELOG/#miscellaneous","title":"Miscellaneous","text":"
  • Add slim-trees to community projects by @pavelzw in #760
  • Add test to default env in polarify example
  • Add multiple machine example by @ruben-arts in #757
  • Add more documentation on environments by @ruben-arts in #790
  • Update rip and rattler by @wolfv in #798
  • Rattler 0.18.0 by @baszalmstra in #805
  • Rip 0.8.0 by @nichmor in #806
  • Fix authentication path by @pavelzw in #796
  • Initial addition of integration test by @ruben-arts in https://github.com/prefix-dev/pixi/pull/804
"},{"location":"CHANGELOG/#new-contributors_28","title":"New Contributors","text":"
  • @vlad-ivanov-name made their first contribution in #784
  • @nichmor made their first contribution in #806

Full commit history

"},{"location":"CHANGELOG/#0130-2024-02-01","title":"[0.13.0] - 2024-02-01","text":""},{"location":"CHANGELOG/#highlights_30","title":"\u2728 Highlights","text":"

This release is packed with features! The major ones are:

  • We added support for multiple environments. :tada: Check out the documentation
  • We added support for sdist installation, which greatly increases the number of packages that can be installed from PyPI. :rocket:

[!IMPORTANT]

Renaming of PIXI_PACKAGE_* variables:

PIXI_PACKAGE_ROOT -> PIXI_PROJECT_ROOT
PIXI_PACKAGE_NAME -> PIXI_PROJECT_NAME
PIXI_PACKAGE_MANIFEST -> PIXI_PROJECT_MANIFEST
PIXI_PACKAGE_VERSION -> PIXI_PROJECT_VERSION
PIXI_PACKAGE_PLATFORMS -> PIXI_ENVIRONMENT_PLATFORMS
Check documentation here: https://pixi.sh/environment/
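The renamed variables can be used directly in tasks; a minimal sketch (task name and command are illustrative, not from the release notes):

```toml
[tasks]
# Print the project root and manifest path using the renamed variables.
where-am-i = "echo $PIXI_PROJECT_ROOT && echo $PIXI_PROJECT_MANIFEST"
```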

[!IMPORTANT]

The .pixi/env/ folder has been moved to accommodate multiple environments. If you only have one environment it is now named .pixi/envs/default.

"},{"location":"CHANGELOG/#added_24","title":"Added","text":"
  • Add support for multiple environments:
    • Update to rattler lock v4 by @baszalmstra in #698
    • Multi-env installation and usage by @baszalmstra in #721
    • Update all environments in the lock-file when requesting an environment by @baszalmstra in #711
    • Run tasks in the env they are defined by @baszalmstra in #731
    • polarify use-case as an example by @ruben-arts in #735
    • Make environment name parsing strict by @ruben-arts in #673
    • Use named environments (only \"default\" for now) by @ruben-arts in #674
    • Use task graph instead of traversal by @baszalmstra in #725
    • Multi env documentation by @ruben-arts in #703
    • pixi info -e/--environment option by @ruben-arts in #676
    • pixi channel add -f/--feature option by @ruben-arts in #700
    • pixi channel remove -f/--feature option by @ruben-arts in #706
    • pixi remove -f/--feature option by @ruben-arts in #680
    • pixi task list -e/--environment option by @ruben-arts in #694
    • pixi task remove -f/--feature option by @ruben-arts in #694
    • pixi install -e/--environment option by @ruben-arts in #722
  • Support for sdists in pypi-dependencies by @tdejager in #664
  • Add pre-release support to pypi-dependencies by @tdejager in #716
  • Support adding dependencies for project's unsupported platforms by @orhun in #668
  • Add pixi list command by @hadim in #665
  • Add pixi shell-hook command by @orhun in #672#679 #684
  • Use env variable to configure locked, frozen and color by @hadim in #726
  • pixi self-update by @hadim in #675
  • Add PIXI_NO_PATH_UPDATE for PATH update suppression by @chawyehsu in #692
  • Set the cache directory by @ruben-arts in #683
"},{"location":"CHANGELOG/#changed_28","title":"Changed","text":"
  • Use consistent naming for tests module by @orhun in #678
  • Install pixi and add to the path in docker example by @ruben-arts in #743
  • Simplify the deserializer of PyPiRequirement by @orhun in #744
  • Use tabwriter instead of comfy_table by @baszalmstra in #745
  • Document environment variables by @ruben-arts in #746
"},{"location":"CHANGELOG/#fixed_39","title":"Fixed","text":"
  • Quote part of the task that has brackets ([ or ]) by @JafarAbdi in #677
  • Package clobber and __pycache__ removal issues by @wolfv in #573
  • Non-global reqwest client by @tdejager in #693
  • Fix broken pipe error during search by @orhun in #699
  • Make pixi search result correct by @chawyehsu in #713
  • Allow the tasks for all platforms to be shown in pixi info by @ruben-arts in #728
  • Flaky tests while installing pypi dependencies by @baszalmstra in #732
  • Linux install script by @mariusvniekerk in #737
  • Download wheels in parallel to avoid deadlock by @baszalmstra in #752
"},{"location":"CHANGELOG/#new-contributors_29","title":"New Contributors","text":"
  • @JafarAbdi made their first contribution in #677
  • @mariusvniekerk made their first contribution in #737

Full commit history

"},{"location":"CHANGELOG/#0120-2024-01-15","title":"[0.12.0] - 2024-01-15","text":""},{"location":"CHANGELOG/#highlights_31","title":"\u2728 Highlights","text":"
  • Some great community contributions, pixi global upgrade, pixi project version commands, a PIXI_HOME variable.
  • A ton of refactor work to prepare for the multi-environment feature.
    • Note that there are no extra environments created yet, but you can just specify them in the pixi.toml file already.
    • Next we'll build the actual environments.
"},{"location":"CHANGELOG/#added_25","title":"Added","text":"
  • Add global upgrade command to pixi by @trueleo in #614
  • Add configurable PIXI_HOME by @chawyehsu in #627
  • Add --pypi option to pixi remove by @marcelotrevisani in https://github.com/prefix-dev/pixi/pull/602
  • PrioritizedChannels to specify channel priority by @ruben-arts in https://github.com/prefix-dev/pixi/pull/658
  • Add project version {major,minor,patch} CLIs by @hadim in https://github.com/prefix-dev/pixi/pull/633
"},{"location":"CHANGELOG/#changed_29","title":"Changed","text":"
  • Refactored project model using targets, features and environments by @baszalmstra in https://github.com/prefix-dev/pixi/pull/616
  • Move code from Project to Environment by @baszalmstra in #630
  • Refactored system-requirements from Environment by @baszalmstra in #632
  • Extract activation.scripts into Environment by @baszalmstra in #659
  • Extract pypi-dependencies from Environment by @baszalmstra in https://github.com/prefix-dev/pixi/pull/656
  • De-serialization of features and environments by @ruben-arts in https://github.com/prefix-dev/pixi/pull/636
"},{"location":"CHANGELOG/#fixed_40","title":"Fixed","text":"
  • Make install.sh also work with wget if curl is not available by @wolfv in #644
  • Use source build for rattler by @ruben-arts
  • Check for pypi-dependencies before amending the pypi purls by @ruben-arts in #661
  • Don't allow the use of reflinks by @ruben-arts in #662
"},{"location":"CHANGELOG/#removed_1","title":"Removed","text":"
  • Remove windows and unix system requirements by @baszalmstra in #635
"},{"location":"CHANGELOG/#documentation_29","title":"Documentation","text":"
  • Document the channel logic by @ruben-arts in https://github.com/prefix-dev/pixi/pull/610
  • Update the instructions for installing on Arch Linux by @orhun in https://github.com/prefix-dev/pixi/pull/653
  • Update Community.md by @KarelZe in https://github.com/prefix-dev/pixi/pull/654
  • Replace contributions.md with contributing.md and make it more standardized by @ruben-arts in https://github.com/prefix-dev/pixi/pull/649
  • Remove windows and unix system requirements by @baszalmstra in https://github.com/prefix-dev/pixi/pull/635
  • Add CODE_OF_CONDUCT.md by @ruben-arts in https://github.com/prefix-dev/pixi/pull/648
  • Removed remaining .ps1 references by @bahugo in https://github.com/prefix-dev/pixi/pull/643
"},{"location":"CHANGELOG/#new-contributors_30","title":"New Contributors","text":"
  • @marcelotrevisani made their first contribution in https://github.com/prefix-dev/pixi/pull/602
  • @trueleo made their first contribution in https://github.com/prefix-dev/pixi/pull/614
  • @bahugo made their first contribution in https://github.com/prefix-dev/pixi/pull/643
  • @KarelZe made their first contribution in https://github.com/prefix-dev/pixi/pull/654

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.11.0...v0.12.0

"},{"location":"CHANGELOG/#0111-2024-01-06","title":"[0.11.1] - 2024-01-06","text":""},{"location":"CHANGELOG/#fixed_41","title":"Fixed","text":"
  • Upgrading rattler to fix pixi auth in #642
"},{"location":"CHANGELOG/#0110-2024-01-05","title":"[0.11.0] - 2024-01-05","text":""},{"location":"CHANGELOG/#highlights_32","title":"\u2728 Highlights","text":"
  • Lots of important fixes and preparations for the PyPI sdist and multi-environment features
  • Lots of new contributors helping pixi improve!
"},{"location":"CHANGELOG/#added_26","title":"Added","text":"
  • Add new commands for pixi project {version|channel|platform|description} by @hadim in #579
  • Add dependabot.yml by @pavelzw in #606
"},{"location":"CHANGELOG/#changed_30","title":"Changed","text":"
  • winget-releaser gets correct identifier by @ruben-arts in #561
  • Task run code by @baszalmstra in #556
  • No ps1 in activation scripts by @ruben-arts in #563
  • Changed some names for clarity by @tdejager in #568
  • Change font and make it dark mode by @ruben-arts in #576
  • Moved pypi installation into its own module by @tdejager in #589
  • Move alpha to beta feature and toggle it off with env var by @ruben-arts in #604
  • Improve UX activation scripts by @ruben-arts in #560
  • Add sanity check by @tdejager in #569
  • Refactor manifest by @ruben-arts in #572
  • Improve search by @Johnwillliam in #578
  • Split pypi and conda solve steps by @tdejager in #601
"},{"location":"CHANGELOG/#fixed_42","title":"Fixed","text":"
  • Save file after lockfile is correctly updated by @ruben-arts in #555
  • Limit the number of concurrent solves by @baszalmstra in #571
  • Use project virtual packages in add command by @msegado in #609
  • Improved mapped dependency by @ruben-arts in #574
"},{"location":"CHANGELOG/#documentation_30","title":"Documentation","text":"
  • Change font and make it dark mode by @ruben-arts in #576
  • typo: no ps1 in activation scripts by @ruben-arts in #563
  • Document adding CUDA to system-requirements by @ruben-arts in #595
  • Multi env proposal documentation by @ruben-arts in #584
  • Fix multiple typos in configuration.md by @SeaOtocinclus in #608
  • Add multiple machines from one project example by @pavelzw in #605
"},{"location":"CHANGELOG/#new-contributors_31","title":"New Contributors","text":"
  • @hadim made their first contribution in #579
  • @msegado made their first contribution in #609
  • @Johnwillliam made their first contribution in #578
  • @SeaOtocinclus made their first contribution in #608

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.10.0...v0.11.0

"},{"location":"CHANGELOG/#0100-2023-12-8","title":"[0.10.0] - 2023-12-8","text":""},{"location":"CHANGELOG/#highlights_33","title":"Highlights","text":"
  • Better pypi-dependencies support; you can now install even more PyPI packages.
  • pixi add --pypi command to add a pypi package to your project.
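As a sketch of the new workflow (package name and version spec are illustrative), running pixi add --pypi flask would add an entry like this to pixi.toml:

```toml
[pypi-dependencies]
flask = ">=3.0"   # illustrative spec; pixi picks a range based on the resolved version
```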
"},{"location":"CHANGELOG/#added_27","title":"Added","text":"
  • Use range (>=1.2.3, <1.3) when adding requirement, instead of 1.2.3.* by @baszalmstra in https://github.com/prefix-dev/pixi/pull/536
  • Update rip to fix by @tdejager in https://github.com/prefix-dev/pixi/pull/543
    • Better Bytecode compilation (.pyc) support by @baszalmstra
    • Recognize .data directory headers by @baszalmstra
  • Also print arguments given to a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/545
  • Add pixi add --pypi command by @ruben-arts in https://github.com/prefix-dev/pixi/pull/539
"},{"location":"CHANGELOG/#fixed_43","title":"Fixed","text":"
  • space in global install path by @ruben-arts in https://github.com/prefix-dev/pixi/pull/513
  • Glibc version/family parsing by @baszalmstra in https://github.com/prefix-dev/pixi/pull/535
  • Use build and host specs while getting the best version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/538
"},{"location":"CHANGELOG/#miscellaneous_1","title":"Miscellaneous","text":"
  • docs: add update manual by @ruben-arts in https://github.com/prefix-dev/pixi/pull/521
  • add lightgbm demo by @partrita in https://github.com/prefix-dev/pixi/pull/492
  • Update documentation link by @williamjamir in https://github.com/prefix-dev/pixi/pull/525
  • Update Community.md by @jiaxiyang in https://github.com/prefix-dev/pixi/pull/527
  • Add winget releaser by @ruben-arts in https://github.com/prefix-dev/pixi/pull/547
  • Custom rerun-sdk example, force driven graph of pixi.lock by @ruben-arts in https://github.com/prefix-dev/pixi/pull/548
  • Better document pypi part by @ruben-arts in https://github.com/prefix-dev/pixi/pull/546
"},{"location":"CHANGELOG/#new-contributors_32","title":"New Contributors","text":"
  • @partrita made their first contribution in https://github.com/prefix-dev/pixi/pull/492
  • @williamjamir made their first contribution in https://github.com/prefix-dev/pixi/pull/525
  • @jiaxiyang made their first contribution in https://github.com/prefix-dev/pixi/pull/527

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.1...v0.10.0

"},{"location":"CHANGELOG/#091-2023-11-29","title":"[0.9.1] - 2023-11-29","text":""},{"location":"CHANGELOG/#highlights_34","title":"Highlights","text":"
  • PyPI's scripts are now fixed. For example: https://github.com/prefix-dev/pixi/issues/516
"},{"location":"CHANGELOG/#fixed_44","title":"Fixed","text":"
  • Remove attr (unused) and update all dependencies by @wolfv in https://github.com/prefix-dev/pixi/pull/510
  • Remove empty folders on python uninstall by @baszalmstra in https://github.com/prefix-dev/pixi/pull/512
  • Bump rip to add scripts by @baszalmstra in https://github.com/prefix-dev/pixi/pull/517

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.0...v0.9.1

"},{"location":"CHANGELOG/#090-2023-11-28","title":"[0.9.0] - 2023-11-28","text":""},{"location":"CHANGELOG/#highlights_35","title":"Highlights","text":"
  • You can now run pixi remove or pixi rm to remove a package from the environment
  • Fix pip install -e issue that was created by release v0.8.0 : https://github.com/prefix-dev/pixi/issues/507
"},{"location":"CHANGELOG/#added_28","title":"Added","text":"
  • pixi remove command by @Wackyator in https://github.com/prefix-dev/pixi/pull/483
"},{"location":"CHANGELOG/#fixed_45","title":"Fixed","text":"
  • Install entrypoints for [pypi-dependencies] @baszalmstra in https://github.com/prefix-dev/pixi/pull/508
  • Only uninstall pixi installed packages by @baszalmstra in https://github.com/prefix-dev/pixi/pull/509

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.8.0...v0.9.0

"},{"location":"CHANGELOG/#080-2023-11-27","title":"[0.8.0] - 2023-11-27","text":""},{"location":"CHANGELOG/#highlights_36","title":"Highlights","text":"
  • \ud83c\udf89\ud83d\udc0d[pypi-dependencies] ALPHA RELEASE\ud83d\udc0d\ud83c\udf89, you can now add PyPI dependencies to your pixi project.
  • UX of pixi run has been improved with better errors and showing what task is run.

[!NOTE] [pypi-dependencies] support is still incomplete; missing functionality is listed here: https://github.com/orgs/prefix-dev/projects/6. Our intent is not to reach 100% feature parity with pip; our goal is that you only need pixi to manage both conda and PyPI packages.

"},{"location":"CHANGELOG/#added_29","title":"Added","text":"
  • Bump rattler @ruben-arts in https://github.com/prefix-dev/pixi/pull/496
  • Implement lock-file satisfiability with pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/494
  • List pixi tasks when command not found is returned by @ruben-arts in https://github.com/prefix-dev/pixi/pull/488
  • Show which command is run as a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/491 && https://github.com/prefix-dev/pixi/pull/493
  • Add progress info to conda install by @baszalmstra in https://github.com/prefix-dev/pixi/pull/470
  • Install pypi dependencies (alpha) by @baszalmstra in https://github.com/prefix-dev/pixi/pull/452
"},{"location":"CHANGELOG/#fixed_46","title":"Fixed","text":"
  • Add install scripts to pixi.sh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/458 && https://github.com/prefix-dev/pixi/pull/459 && https://github.com/prefix-dev/pixi/pull/460
  • Fix RECORD not found issue by @baszalmstra in https://github.com/prefix-dev/pixi/pull/495
  • Actually add to the .gitignore and give better errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/490
  • Support macOS for pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/478
  • Custom pypi-dependencies type by @ruben-arts in https://github.com/prefix-dev/pixi/pull/471
  • pypi-dependencies parsing errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/479
  • Progress issues by @baszalmstra in https://github.com/prefix-dev/pixi/pull/4
"},{"location":"CHANGELOG/#miscellaneous_2","title":"Miscellaneous","text":"
  • Example: ctypes by @liquidcarbon in https://github.com/prefix-dev/pixi/pull/441
  • Mention the AUR package by @orhun in https://github.com/prefix-dev/pixi/pull/464
  • Update rerun example by @ruben-arts in https://github.com/prefix-dev/pixi/pull/489
  • Document pypi-dependencies by @ruben-arts in https://github.com/prefix-dev/pixi/pull/481
  • Ignore docs paths on rust workflow by @ruben-arts in https://github.com/prefix-dev/pixi/pull/482
  • Fix flaky tests, run serially by @baszalmstra in https://github.com/prefix-dev/pixi/pull/477
"},{"location":"CHANGELOG/#new-contributors_33","title":"New Contributors","text":"
  • @liquidcarbon made their first contribution in https://github.com/prefix-dev/pixi/pull/441
  • @orhun made their first contribution in https://github.com/prefix-dev/pixi/pull/464

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.7.0...v0.8.0

"},{"location":"CHANGELOG/#070-2023-11-14","title":"[0.7.0] - 2023-11-14","text":""},{"location":"CHANGELOG/#highlights_37","title":"Highlights","text":"
  • Channel priority: with channels = ["conda-forge", "pytorch"], packages found in conda-forge will never be taken from pytorch.
  • Channel-specific dependencies: pytorch = { version = "*", channel = "pytorch" }
  • Autocompletion on pixi run <TABTAB>
  • Moved all pixi documentation into this repo, try it with pixi run docs!
  • Lots of new contributors!
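The two channel features from the highlights combine naturally in a manifest; a hedged sketch (project name and pins are illustrative):

```toml
[project]
name = "torch-demo"                      # illustrative
channels = ["conda-forge", "pytorch"]    # conda-forge has priority over pytorch
platforms = ["linux-64"]

[dependencies]
python = "3.11.*"                                  # resolved from conda-forge
pytorch = { version = "*", channel = "pytorch" }   # always resolved from the pytorch channel
```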
"},{"location":"CHANGELOG/#added_30","title":"Added","text":"
  • Bump rattler to its newest version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/395. Some notable changes:
    • Add channel priority (if a package is found in the first listed channel it will not be looked for in the other channels).
    • Fix JLAP using wrong hash.
    • Lockfile forward compatibility error.
  • Add nushell support by @wolfv in https://github.com/prefix-dev/pixi/pull/360
  • Autocomplete tasks on pixi run for bash and zsh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/390
  • Add prefix location file to avoid copy error by @ruben-arts in https://github.com/prefix-dev/pixi/pull/422
  • Channel specific dependencies python = { version = "*", channel = "conda-forge" } by @ruben-arts in https://github.com/prefix-dev/pixi/pull/439
"},{"location":"CHANGELOG/#changed_31","title":"Changed","text":"
  • project.version as optional field in the pixi.toml by @ruben-arts in https://github.com/prefix-dev/pixi/pull/400
"},{"location":"CHANGELOG/#fixed_47","title":"Fixed","text":"
  • Deny unknown fields in pixi.toml to help users find errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/396
  • install.sh to create dot file if not present by @humphd in https://github.com/prefix-dev/pixi/pull/408
  • Ensure order of repodata fetches by @baszalmstra in https://github.com/prefix-dev/pixi/pull/405
  • Strip Linux binaries by @baszalmstra in https://github.com/prefix-dev/pixi/pull/414
  • Sort task list by @ruben-arts in https://github.com/prefix-dev/pixi/pull/431
  • Fix global install path on windows by @ruben-arts in https://github.com/prefix-dev/pixi/pull/449
  • Let PIXI_BIN_PATH use backslashes by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/442
  • Print more informative error if created file is empty by @traversaro in https://github.com/prefix-dev/pixi/pull/447
"},{"location":"CHANGELOG/#docs","title":"Docs","text":"
  • Move to mkdocs with all documentation by @ruben-arts in https://github.com/prefix-dev/pixi/pull/435
  • Fix typing errors by @FarukhS52 in https://github.com/prefix-dev/pixi/pull/426
  • Add social cards to the pages by @ruben-arts in https://github.com/prefix-dev/pixi/pull/445
  • Enhance README.md: Added Table of Contents, Grammar Improvements by @adarsh-jha-dev in https://github.com/prefix-dev/pixi/pull/421
  • Adding conda-auth to community examples by @travishathaway in https://github.com/prefix-dev/pixi/pull/433
  • Minor grammar correction by @tylere in https://github.com/prefix-dev/pixi/pull/406
  • Make capitalization of tab titles consistent by @tylere in https://github.com/prefix-dev/pixi/pull/407
"},{"location":"CHANGELOG/#new-contributors_34","title":"New Contributors","text":"
  • @tylere made their first contribution in https://github.com/prefix-dev/pixi/pull/406
  • @humphd made their first contribution in https://github.com/prefix-dev/pixi/pull/408
  • @adarsh-jha-dev made their first contribution in https://github.com/prefix-dev/pixi/pull/421
  • @FarukhS52 made their first contribution in https://github.com/prefix-dev/pixi/pull/426
  • @travishathaway made their first contribution in https://github.com/prefix-dev/pixi/pull/433
  • @traversaro made their first contribution in https://github.com/prefix-dev/pixi/pull/447

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.6.0...v0.7.0

"},{"location":"CHANGELOG/#060-2023-10-17","title":"[0.6.0] - 2023-10-17","text":""},{"location":"CHANGELOG/#highlights_38","title":"Highlights","text":"

This release fixes some bugs and adds the --cwd option to the tasks.

"},{"location":"CHANGELOG/#fixed_48","title":"Fixed","text":"
  • Improve shell prompts by @ruben-arts in https://github.com/prefix-dev/pixi/pull/385 https://github.com/prefix-dev/pixi/pull/388
  • Change --frozen logic to error when there is no lockfile by @ruben-arts in https://github.com/prefix-dev/pixi/pull/373
  • Don't remove the '.11' from 'python3.11' binary file name by @ruben-arts in https://github.com/prefix-dev/pixi/pull/366
"},{"location":"CHANGELOG/#changed_32","title":"Changed","text":"
  • Update rerun example to v0.9.1 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/389
"},{"location":"CHANGELOG/#added_31","title":"Added","text":"
  • Add the current working directory (--cwd) in pixi tasks by @ruben-arts in https://github.com/prefix-dev/pixi/pull/380
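A minimal sketch of a task using the new option (task name and command are illustrative):

```toml
[tasks]
# Without cwd, tasks run from the project root; cwd makes this one run inside docs/.
build-docs = { cmd = "mkdocs build", cwd = "docs" }
```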

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.5.0...v0.6.0

"},{"location":"CHANGELOG/#050-2023-10-03","title":"[0.5.0] - 2023-10-03","text":""},{"location":"CHANGELOG/#highlights_39","title":"Highlights","text":"

We rebuilt pixi shell, fixing the fact that your rc file would overrule the environment activation.

"},{"location":"CHANGELOG/#fixed_49","title":"Fixed","text":"
  • Change how shell works and make activation more robust by @wolfv in https://github.com/prefix-dev/pixi/pull/316
  • Documentation: use quotes in cli by @pavelzw in https://github.com/prefix-dev/pixi/pull/367
"},{"location":"CHANGELOG/#added_32","title":"Added","text":"
  • Create or append to the .gitignore and .gitattributes files by @ruben-arts in https://github.com/prefix-dev/pixi/pull/359
  • Add --locked and --frozen to getting an up-to-date prefix by @ruben-arts in https://github.com/prefix-dev/pixi/pull/363
  • Documentation: improvement/update by @ruben-arts in https://github.com/prefix-dev/pixi/pull/355
  • Example: how to build a docker image using pixi by @ruben-arts in https://github.com/prefix-dev/pixi/pull/353 & https://github.com/prefix-dev/pixi/pull/365
  • Update to the newest rattler by @baszalmstra in https://github.com/prefix-dev/pixi/pull/361
  • Periodic cargo upgrade --all --incompatible by @wolfv in https://github.com/prefix-dev/pixi/pull/358

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.4.0...v0.5.0

"},{"location":"CHANGELOG/#040-2023-09-22","title":"[0.4.0] - 2023-09-22","text":""},{"location":"CHANGELOG/#highlights_40","title":"Highlights","text":"

This release adds the start of a new cli command pixi project which will allow users to interact with the project configuration from the command line.

"},{"location":"CHANGELOG/#fixed_50","title":"Fixed","text":"
  • Align with latest rattler version 0.9.0 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/350
"},{"location":"CHANGELOG/#added_33","title":"Added","text":"
  • Add codespell (config, workflow) to catch typos + catch and fix some of those by @yarikoptic in https://github.com/prefix-dev/pixi/pull/329
  • remove atty and use stdlib by @wolfv in https://github.com/prefix-dev/pixi/pull/337
  • xtsci-dist to Community.md by @HaoZeke in https://github.com/prefix-dev/pixi/pull/339
  • ribasim to Community.md by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/340
  • LFortran to Community.md by @wolfv in https://github.com/prefix-dev/pixi/pull/341
  • Give tip to resolve virtual package issue by @ruben-arts in https://github.com/prefix-dev/pixi/pull/348
  • pixi project channel add subcommand by @baszalmstra and @ruben-arts in https://github.com/prefix-dev/pixi/pull/347
"},{"location":"CHANGELOG/#new-contributors_35","title":"New Contributors","text":"
  • @yarikoptic made their first contribution in https://github.com/prefix-dev/pixi/pull/329
  • @HaoZeke made their first contribution in https://github.com/prefix-dev/pixi/pull/339

Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.3.0...v0.4.0

"},{"location":"CHANGELOG/#030-2023-09-11","title":"[0.3.0] - 2023-09-11","text":""},{"location":"CHANGELOG/#highlights_41","title":"Highlights","text":"

This release fixes a lot of issues encountered by the community as well as some awesome community contributions like the addition of pixi global list and pixi global remove.

"},{"location":"CHANGELOG/#fixed_51","title":"Fixed","text":"
  • Properly detect CUDA on Linux using our build binaries, by @baszalmstra (#290)
  • Package names are now case-insensitive, by @baszalmstra (#285)
  • Issue with starts-with and compatibility operator, by @tdejager (#296)
  • Lock files are now consistently sorted, by @baszalmstra (#295 & #307)
  • Improved xonsh detection and powershell env-var escaping, by @wolfv (#307)
  • system-requirements are properly filtered by platform, by @ruben-arts (#299)
  • Powershell completion install script, by @chawyehsu (#325)
  • Simplified and improved shell quoting, by @baszalmstra (#313)
  • Issue where platform specific subdirs were required, by @baszalmstra (#333)
  • thread 'tokio-runtime-worker' has overflowed its stack issue, by @baszalmstra (#28)
"},{"location":"CHANGELOG/#added_34","title":"Added","text":"
  • Certificates from the OS certificate store are now used, by @baszalmstra (#310)
  • pixi global list and pixi global remove commands, by @cjfuller (#318)
"},{"location":"CHANGELOG/#changed_33","title":"Changed","text":"
  • --manifest-path must point to a pixi.toml file, by @baszalmstra (#324)
"},{"location":"CHANGELOG/#020-2023-08-22","title":"[0.2.0] - 2023-08-22","text":""},{"location":"CHANGELOG/#highlights_42","title":"Highlights","text":"
  • Added pixi search command to search for packages, by @Wackyator. (#244)
  • Added target specific tasks, eg. [target.win-64.tasks], by @ruben-arts. (#269)
  • Flaky install caused by the download of packages, by @baszalmstra. (#281)
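The target-specific tasks from the highlights can be sketched like this (commands are illustrative):

```toml
[tasks]
clean = "rm -rf build"          # default task for all platforms

[target.win-64.tasks]
clean = "rmdir /s /q build"     # overrides the default on win-64
```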
"},{"location":"CHANGELOG/#fixed_52","title":"Fixed","text":"
  • Install instructions, by @baszalmstra. (#258)
  • Typo in getting started, by @RaulPL. (#266)
  • Don't execute alias tasks, by @baszalmstra. (#274)
"},{"location":"CHANGELOG/#added_35","title":"Added","text":"
  • Rerun example, by @ruben-arts. (#236)
  • Reduction of pixi's binary size, by @baszalmstra (#256)
  • Updated pixi banner, including webp file for faster loading, by @baszalmstra. (#257)
  • Set linguist attributes for pixi.lock automatically, by @spenserblack. (#265)
  • Contribution manual for pixi, by @ruben-arts. (#268)
  • GitHub issue templates, by @ruben-arts. (#271)
  • Links to prefix.dev in readme, by @tdejager. (#279)
"},{"location":"CHANGELOG/#010-2023-08-11","title":"[0.1.0] - 2023-08-11","text":"

As this is our first Semantic Versioning release, we are moving from the prototype phase to the development phase, as semver describes. A 0.x release can be anything from a new major feature to a breaking change, while 0.0.x releases will be bugfixes or small improvements.

"},{"location":"CHANGELOG/#highlights_43","title":"Highlights","text":"
  • Update to the latest rattler version, by @baszalmstra. (#249)
"},{"location":"CHANGELOG/#fixed_53","title":"Fixed","text":"
  • Only add shebang to activation scripts on unix platforms, by @baszalmstra. (#250)
  • Use official crates.io releases for all dependencies, by @baszalmstra. (#252)
"},{"location":"CHANGELOG/#008-2023-08-01","title":"[0.0.8] - 2023-08-01","text":""},{"location":"CHANGELOG/#highlights_44","title":"Highlights","text":"
  • Much better error printing using miette, by @baszalmstra. (#211)
  • You can now use pixi on aarch64-linux, by @pavelzw. (#233)
  • Use the Rust port of libsolv as the default solver, by @ruben-arts. (#209)
"},{"location":"CHANGELOG/#added_36","title":"Added","text":"
  • Add mention to condax in the docs, by @maresb. (#207)
  • Add brew installation instructions, by @wolfv. (#208)
  • Add activation.scripts to the pixi.toml to configure environment activation, by @ruben-arts. (#217)
  • Add pixi upload command to upload packages to prefix.dev, by @wolfv. (#127)
  • Add more metadata fields to the pixi.toml, by @wolfv. (#218)
  • Add pixi task list to show all tasks in the project, by @tdejager. (#228)
  • Add --color to configure the colors in the output, by @baszalmstra. (#243)
  • Examples, ROS2 Nav2, JupyterLab and QGIS, by @ruben-arts.
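The new activation.scripts configuration from this release can be sketched as follows (script names are hypothetical):

```toml
[activation]
scripts = ["env_setup.sh"]      # hypothetical script run on environment activation

[target.win-64.activation]
scripts = ["env_setup.bat"]     # platform-specific variant
```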
"},{"location":"CHANGELOG/#fixed_54","title":"Fixed","text":"
  • Add trailing newline to pixi.toml and .gitignore, by @pavelzw. (#216)
  • Deny unknown fields and rename license-file in pixi.toml, by @wolfv. (#220)
  • Overwrite PS1 variable when going into a pixi shell, by @ruben-arts. (#201)
"},{"location":"CHANGELOG/#changed_34","title":"Changed","text":"
  • Install environment when adding a dependency using pixi add, by @baszalmstra. (#213)
  • Improve and speedup CI, by @baszalmstra. (#241)
"},{"location":"CHANGELOG/#007-2023-07-11","title":"[0.0.7] - 2023-07-11","text":""},{"location":"CHANGELOG/#highlights_45","title":"Highlights","text":"
  • Transitioned the run subcommand to use the deno_task_shell for improved cross-platform functionality. More details in the Deno Task Runner documentation.
  • Added an info subcommand to retrieve system-specific information understood by pixi.
"},{"location":"CHANGELOG/#breaking-changes","title":"BREAKING CHANGES","text":"
  • [commands] in the pixi.toml is now called [tasks]. (#177)
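The rename is a mechanical change in the manifest (task content is illustrative):

```toml
# Before 0.0.7:
# [commands]
# start = "python main.py"

# From 0.0.7 onwards:
[tasks]
start = "python main.py"
```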
"},{"location":"CHANGELOG/#added_37","title":"Added","text":"
  • The pixi info command to get more system information by @wolfv in (#158)
  • Documentation on how to use the cli by @ruben-arts in (#160)
  • Use the deno_task_shell to execute commands in pixi run by @baszalmstra in (#173)
  • Use new solver backend from rattler by @baszalmstra in (#178)
  • The pixi command command to the cli by @tdejager in (#177)
  • Documentation on how to use the pixi auth command by @wolfv in (#183)
  • Use the newest rattler 0.6.0 by @baszalmstra in (#185)
  • Build with pixi section to the documentation by @tdejager in (#196)
"},{"location":"CHANGELOG/#fixed_55","title":"Fixed","text":"
  • Running tasks sequentially when using depends_on by @tdejager in (#161)
  • Don't add PATH variable where it is already set by @baszalmstra in (#169)
  • Fix README by @Hofer-Julian in (#182)
  • Fix Ctrl+C signal in pixi run by @tdejager in (#190)
  • Add the correct license information to the lockfiles by @wolfv in (#191)
"},{"location":"CHANGELOG/#006-2023-06-30","title":"[0.0.6] - 2023-06-30","text":""},{"location":"CHANGELOG/#highlights_46","title":"Highlights","text":"

Improving the reliability is important to us, so we added an integration testing framework, we can now test as close as possible to the CLI level using cargo.

"},{"location":"CHANGELOG/#added_38","title":"Added","text":"
  • An integration test harness, to test as close as possible to the user experience but in rust. (#138, #140, #156)
  • Add different levels of dependencies in preparation for pixi build, allowing host- and build- dependencies (#149)
"},{"location":"CHANGELOG/#fixed_56","title":"Fixed","text":"
  • Use correct folder name on pixi init (#144)
  • Fix windows cli installer (#152)
  • Fix global install path variable (#147)
  • Fix macOS binary notarization (#153)
"},{"location":"CHANGELOG/#005-2023-06-26","title":"[0.0.5] - 2023-06-26","text":"

Fixing Windows installer build in CI. (#145)

"},{"location":"CHANGELOG/#004-2023-06-26","title":"[0.0.4] - 2023-06-26","text":""},{"location":"CHANGELOG/#highlights_47","title":"Highlights","text":"

A new command, auth which can be used to authenticate the host of the package channels. A new command, shell which can be used to start a shell in the pixi environment of a project. A refactor of the install command which is changed to global install and the install command now installs a pixi project if you run it in the directory. Platform specific dependencies using [target.linux-64.dependencies] instead of [dependencies] in the pixi.toml

Lots and lots of fixes and improvements to make it easier for this user, where bumping to the new version of rattler helped a lot.

"},{"location":"CHANGELOG/#added_39","title":"Added","text":"
  • Platform specific dependencies and helpful error reporting on pixi.toml issues(#111)
  • Windows installer, which is very useful for users that want to start using pixi on windows. (#114)
  • shell command to use the pixi environment without pixi run. (#116)
  • Verbosity options using -v, -vv, -vvv (#118)
  • auth command to be able to login or logout of a host like repo.prefix.dev if you're using private channels. (#120)
  • New examples: CPP sdl: #121, Opencv camera calibration #125
  • Apple binary signing and notarization. (#137)
"},{"location":"CHANGELOG/#changed_35","title":"Changed","text":"
  • pixi install moved to pixi global install and pixi install became the installation of a project using the pixi.toml (#124)
"},{"location":"CHANGELOG/#fixed_57","title":"Fixed","text":"
  • pixi run uses default shell (#119)
  • pixi add command is fixed. (#132)
  • Community issues fixed: #70, #72, #90, #92, #94, #96
"}]} \ No newline at end of file diff --git a/v0.39.2/sitemap.xml b/v0.39.2/sitemap.xml new file mode 100644 index 000000000..93797ea4c --- /dev/null +++ b/v0.39.2/sitemap.xml @@ -0,0 +1,183 @@ + + + + https://prefix-dev.github.io/pixi/v0.39.2/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/Community/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/FAQ/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/basic_usage/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/packaging/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/vision/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/authentication/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/channel_priority/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/explain_info_command/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/github_actions/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/production_deployment/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/pyproject_toml/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/advanced/updates_github_actions/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/examples/cpp-sdl/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/examples/opencv/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/examples/ros2-nav2/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/features/advanced_tasks/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/features/environment/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/features/global_tools/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/features/lockfile/ + 2024-12-11 + daily + + + 
https://prefix-dev.github.io/pixi/v0.39.2/features/multi_environment/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/features/multi_platform_configuration/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/features/system_requirements/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/ide_integration/devcontainer/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/ide_integration/jupyterlab/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/ide_integration/pycharm/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/ide_integration/r_studio/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/reference/cli/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/reference/pixi_configuration/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/reference/pixi_manifest/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/switching_from/conda/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/switching_from/poetry/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/tutorials/python/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/tutorials/ros2/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/tutorials/rust/ + 2024-12-11 + daily + + + https://prefix-dev.github.io/pixi/v0.39.2/CHANGELOG/ + 2024-12-11 + daily + + \ No newline at end of file diff --git a/v0.39.2/sitemap.xml.gz b/v0.39.2/sitemap.xml.gz new file mode 100644 index 000000000..d8a6c7242 Binary files /dev/null and b/v0.39.2/sitemap.xml.gz differ diff --git a/v0.39.2/source_files/pixi_config_tomls/detached_environments_path_config.toml b/v0.39.2/source_files/pixi_config_tomls/detached_environments_path_config.toml new file mode 100644 index 000000000..d7e347630 --- /dev/null +++ b/v0.39.2/source_files/pixi_config_tomls/detached_environments_path_config.toml @@ 
-0,0 +1,3 @@ +# --8<-- [start:detached-environments-path] +detached-environments = "/opt/pixi/envs" +# --8<-- [end:detached-environments-path] diff --git a/v0.39.2/source_files/pixi_config_tomls/main_config.toml b/v0.39.2/source_files/pixi_config_tomls/main_config.toml new file mode 100644 index 000000000..d1bac8af4 --- /dev/null +++ b/v0.39.2/source_files/pixi_config_tomls/main_config.toml @@ -0,0 +1,83 @@ + +# --8<-- [start:default-channels] +default-channels = ["conda-forge"] +# --8<-- [end:default-channels] + +# --8<-- [start:change-ps1] +change-ps1 = true +# --8<-- [end:change-ps1] + +# --8<-- [start:tls-no-verify] +tls-no-verify = false +# --8<-- [end:tls-no-verify] + +# --8<-- [start:authentication-override-file] +authentication-override-file = "/path/to/your/override.json" +# --8<-- [end:authentication-override-file] + +# --8<-- [start:detached-environments] +detached-environments = true +# --8<-- [end:detached-environments] + +# --8<-- [start:pinning-strategy] +pinning-strategy = "no-pin" +# --8<-- [end:pinning-strategy] + +# --8<-- [start:repodata-config] +[repodata-config] +# disable fetching of jlap, bz2 or zstd repodata files. +# This should only be used for specific old versions of artifactory and other non-compliant +# servers. 
+disable-bzip2 = true # don't try to download repodata.json.bz2 +disable-jlap = true # don't try to download repodata.jlap +disable-sharded = true # don't try to download sharded repodata +disable-zstd = true # don't try to download repodata.json.zst +# --8<-- [end:repodata-config] +# --8<-- [start:prefix-repodata-config] +[repodata-config."https://prefix.dev"] +disable-sharded = false +# --8<-- [end:prefix-repodata-config] + +# --8<-- [start:pypi-config] +[pypi-config] +# Main index url +index-url = "https://pypi.org/simple" +# list of additional urls +extra-index-urls = ["https://pypi.org/simple2"] +# can be "subprocess" or "disabled" +keyring-provider = "subprocess" +# allow insecure connections to host +allow-insecure-host = ["localhost:8080"] +# --8<-- [end:pypi-config] + +# --8<-- [start:concurrency] +[concurrency] +# The maximum number of concurrent downloads +# Defaults to 50 as that was found to be a good balance between speed and stability +downloads = 5 + +# The maximum number of concurrent dependency resolves +# Defaults to a heuristic based on the number of cores on the system +solves = 2 +# --8<-- [end:concurrency] + +# --8<-- [start:experimental] +[experimental] +# Enable the use of the environment activation cache +use-environment-activation-cache = true +# --8<-- [end:experimental] + +# --8<-- [start:mirrors] +[mirrors] +# redirect all requests for conda-forge to the prefix.dev mirror +"https://conda.anaconda.org/conda-forge" = ["https://prefix.dev/conda-forge"] + +# redirect all requests for bioconda to one of the three listed mirrors +# Note: for repodata we try the first mirror first. 
+"https://conda.anaconda.org/bioconda" = [ + "https://conda.anaconda.org/bioconda", + # OCI registries are also supported + "oci://ghcr.io/channel-mirrors/bioconda", + "https://prefix.dev/bioconda", +] +# --8<-- [end:mirrors] diff --git a/v0.39.2/source_files/pixi_config_tomls/mirror_prefix_config.toml b/v0.39.2/source_files/pixi_config_tomls/mirror_prefix_config.toml new file mode 100644 index 000000000..26a19bb6f --- /dev/null +++ b/v0.39.2/source_files/pixi_config_tomls/mirror_prefix_config.toml @@ -0,0 +1,4 @@ +# --8<-- [start:mirrors] +[mirrors] +"https://conda.anaconda.org" = ["https://prefix.dev/"] +# --8<-- [end:mirrors] diff --git a/v0.39.2/source_files/pixi_config_tomls/oci_config.toml b/v0.39.2/source_files/pixi_config_tomls/oci_config.toml new file mode 100644 index 000000000..943d28e0b --- /dev/null +++ b/v0.39.2/source_files/pixi_config_tomls/oci_config.toml @@ -0,0 +1,6 @@ +# --8<-- [start:oci-mirrors] +[mirrors] +"https://conda.anaconda.org/conda-forge" = [ + "oci://ghcr.io/channel-mirrors/conda-forge", +] +# --8<-- [end:oci-mirrors] diff --git a/v0.39.2/source_files/pixi_tomls/lots_of_channels.toml b/v0.39.2/source_files/pixi_tomls/lots_of_channels.toml new file mode 100644 index 000000000..41dae2174 --- /dev/null +++ b/v0.39.2/source_files/pixi_tomls/lots_of_channels.toml @@ -0,0 +1,12 @@ +[project] +name = "lots_of_channels" +platforms = [] +# --8<-- [start:project_channels_long] +channels = ["conda-forge", "robostack", "bioconda", "nvidia", "pytorch"] +# --8<-- [end:project_channels_long] + + +[feature.path_channels] +# --8<-- [start:project_channels_path] +channels = ["conda-forge", "file:///home/user/staged-recipes/build_artifacts"] +# --8<-- [end:project_channels_path] diff --git a/v0.39.2/source_files/pixi_tomls/main_pixi.toml b/v0.39.2/source_files/pixi_tomls/main_pixi.toml new file mode 100644 index 000000000..11550c506 --- /dev/null +++ b/v0.39.2/source_files/pixi_tomls/main_pixi.toml @@ -0,0 +1,41 @@ +[project] +# --8<-- 
[start:project_name] +name = "project-name" +# --8<-- [end:project_name] +# --8<-- [start:project_channels] +channels = ["conda-forge", "https://repo.prefix.dev/channel-name"] +# --8<-- [end:project_channels] +# --8<-- [start:project_platforms] +platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"] +# --8<-- [end:project_platforms] +# --8<-- [start:project_version] +version = "1.2.3" +# --8<-- [end:project_version] +# --8<-- [start:project_authors] +authors = ["John Doe ", "Marie Curie "] +# --8<-- [end:project_authors] +# --8<-- [start:project_description] +description = "A simple description" +# --8<-- [end:project_description] +# --8<-- [start:project_license] +license = "MIT" +# --8<-- [end:project_license] + +# Untestable because it's a path to a must have file. +# --8<-- [start:project_license_file] +# license-file = "LICENSE.md" +# --8<-- [end:project_license_file] +# --8<-- [start:project_readme] +# readme = "README.md" +# --8<-- [end:project_readme] + + +# --8<-- [start:project_homepage] +homepage = "https://pixi.sh" +# --8<-- [end:project_homepage] +# --8<-- [start:project_repository] +repository = "https://github.com/prefix-dev/pixi" +# --8<-- [end:project_repository] +# --8<-- [start:project_documentation] +documentation = "https://pixi.sh" +# --8<-- [end:project_documentation] diff --git a/v0.39.2/source_files/pixi_tomls/simple_pixi.toml b/v0.39.2/source_files/pixi_tomls/simple_pixi.toml new file mode 100644 index 000000000..6aa519742 --- /dev/null +++ b/v0.39.2/source_files/pixi_tomls/simple_pixi.toml @@ -0,0 +1,7 @@ + +# --8<-- [start:project] +[project] +channels = ["conda-forge"] +name = "project-name" +platforms = ["linux-64"] +# --8<-- [end:project] diff --git a/v0.39.2/stylesheets/extra.css b/v0.39.2/stylesheets/extra.css new file mode 100644 index 000000000..66be165cc --- /dev/null +++ b/v0.39.2/stylesheets/extra.css @@ -0,0 +1,96 @@ +.md-header__topic { + font-family: 'Dosis', sans-serif; +} + +[data-md-color-primary=prefix] { + 
--md-primary-fg-color: #F9C405; + --md-primary-fg-color--light: #ffee57; + --md-primary-fg-color--dark: #F9C405; + --md-primary-bg-color: #000000de; + --md-primary-bg-color--light: #0000008a +} + +[data-md-color-accent=prefix] { + --md-accent-fg-color: #fa0; + --md-accent2-fg-color: #eab308; + --md-accent-fg-color--transparent: #ffaa001a; + --md-accent-bg-color: #000000de; + --md-accent-bg-color--light: #0000008a +} + + +[data-md-color-primary=prefix-light] { + --md-primary-fg-color: #000000de; + --md-primary-fg-color--light: #ffee57; + --md-primary-fg-color--dark: #F9C405; + --md-primary-bg-color: #F9C405; + --md-primary-bg-color--light: #F9C405; + --md-code-bg-color: rgba(0, 0, 0, 0.04); +} + +[data-md-color-accent=prefix-light] { + --md-accent-fg-color: #2e2400; + --md-accent2-fg-color: #19116f; + --md-accent-fg-color--transparent: #ffaa001a; + --md-accent-bg-color: #000000de; + --md-accent-bg-color--light: #0000008a +} + +.md-typeset a { + color: var(--md-accent2-fg-color); +} + +.md-nav__item .md-nav__link--active, .md-nav__item .md-nav__link--active code { + color: var(--md-accent-fg-color); + font-weight: bold; +} + +.md-header__topic:first-child { + font-weight: normal; +} + +.md-typeset h1 { + color: var(--md-accent-fg-color); +} +.md-typeset h1, .md-typeset h2, .md-typeset h3, .md-typeset h4, .md-typeset h5, .md-typeset h6 { + font-family: 'Dosis', sans-serif; + font-weight: 500; + color: var(--md-accent-fg-color); +} + +.md-typeset p { + /* kerning */ + text-rendering: optimizeLegibility; +} + +:root > * { + --md-code-hl-string-color: var(--md-accent-fg-color); +} + +.md-header__button.md-logo { + padding: 0; + margin: 0; +} + +.md-header__button.md-logo img, .md-header__button.md-logo svg { + height: 2.1rem; +} + +[dir=ltr] .md-header__title { + margin-left: 0.5rem; +} + +.md-footer-meta__item--prefix-logo { + height: 2rem; + margin: 0.5rem 0; + display: flex; +} + +.md-footer-meta__item--prefix-logo img { + height: 100%; +} + +table code { + 
white-space: nowrap; + word-break: keep-all; +} diff --git a/v0.39.2/switching_from/conda/index.html b/v0.39.2/switching_from/conda/index.html new file mode 100644 index 000000000..9f6075681 --- /dev/null +++ b/v0.39.2/switching_from/conda/index.html @@ -0,0 +1,1925 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Conda/Mamba - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Transitioning from conda or mamba to pixi#

+

Welcome to the guide designed to ease your transition from conda or mamba to pixi. +This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. +With pixi, you'll experience a project-based workflow, enhancing your development process, and allowing for easy sharing of your work.

+

Why Pixi?#

+

Pixi builds upon the foundation of the conda ecosystem, introducing a project-centric approach rather than focusing solely on environments. +This shift towards projects offers a more organized and efficient way to manage dependencies and run code, tailored to modern development practices.

+

Key Differences at a Glance#

| Task | Conda/Mamba | Pixi |
| --- | --- | --- |
| Installation | Requires an installer | Download and add to path (See installation) |
| Creating an Environment | `conda create -n myenv -c conda-forge python=3.8` | `pixi init myenv` followed by `pixi add python=3.8` |
| Activating an Environment | `conda activate myenv` | `pixi shell` within the project directory |
| Deactivating an Environment | `conda deactivate` | `exit` from the pixi shell |
| Running a Task | `conda run -n myenv python my_program.py` | `pixi run python my_program.py` (See run) |
| Installing a Package | `conda install numpy` | `pixi add numpy` |
| Uninstalling a Package | `conda remove numpy` | `pixi remove numpy` |

No base environment

+

Conda has a base environment, which is the default environment when you start a new shell. Pixi has no base environment; instead, you install the tools you need either in a project or globally. Running pixi global install bat installs bat into its own global environment, which is not the same as the base environment in conda.

+
+
+Activating pixi environment in the current shell +

For some advanced use cases, you can activate the environment in the current shell. This uses pixi shell-hook, which prints the activation script; that script can then be used to activate the environment in the current shell without invoking pixi itself. 

~/myenv > eval "$(pixi shell-hook)"
+

+
+

Environment vs Project#

+

Conda and mamba focus on managing environments, while pixi emphasizes projects. In pixi, a project is a folder containing a manifest file (pixi.toml or pyproject.toml) that describes the project, a pixi.lock lock file that describes the exact dependencies, and a .pixi folder that contains the environment.

+

This project-centric approach allows for easy sharing and collaboration, as the project folder contains all the necessary information to recreate the environment. A single project can manage multiple environments for multiple platforms and makes it easy to switch between them. (See multiple environments)

+

Global environments#

+

conda installs all environments in one global location. If keeping environments in one location matters to you, for example for filesystem reasons, you can use pixi's detached-environments feature. 

pixi config set detached-environments true
+# or a specific location
+pixi config set detached-environments /path/to/envs
+
This doesn't let you activate the environments using pixi shell -n, but it does make pixi install all environments into the same folder.

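Under the hood, pixi config set writes this setting to a pixi config.toml file. A sketch of the resulting entry (the exact config file path varies by system and by whether you set it globally or per project — check the pixi configuration reference for your setup):

```toml
# pixi config.toml (sketch) — all environments will be installed under this path
detached-environments = "/opt/pixi/envs"
```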
+

pixi does have the pixi global command to install tools on your machine. (See global) This is not a replacement for conda; it works like pipx and condax: it creates a single isolated environment for the given requirement and installs the binaries into the global path. 

pixi global install bat
+bat pixi.toml
+

+
+

Never install pip with pixi global

+
+

Installations with pixi global get their own isolated environment. Installing pip with pixi global will create a new isolated environment with its own pip binary. Using that pip binary will install packages into the pip environment, making them unreachable from anywhere, as you can't activate that environment.

+

Automated switching#

+

With pixi you can import environment.yml files into a pixi project. (See import) +

pixi init --import environment.yml
+
+This will create a new project with the dependencies from the environment.yml file.

+
+Exporting your environment +

If you are working with Conda users or systems, you can export your environment to an environment.yml file to share it. 

pixi project export conda-environment
+
Additionally, you can export a conda explicit specification.

+
+

Troubleshooting#

+

Encountering issues? Here are solutions to some common problems when you're coming from the conda workflow:

+
  • Dependency is excluded because due to strict channel priority not using this option from: 'https://conda.anaconda.org/conda-forge/' — this error occurs when the package is in multiple channels. pixi uses a strict channel priority. See channel priority for more information.
  • pixi global install pip, then pip doesn't work — pip is installed in its own isolated global environment. Use pixi add pip in a project to install pip in the project environment and use that project.
  • pixi global install <Any Library> -> import <Any Library> -> ModuleNotFoundError: No module named '<Any Library>' — the library is installed in an isolated global environment. Use pixi add <Any Library> in a project to install the library in the project environment and use that project.
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/switching_from/poetry/index.html b/v0.39.2/switching_from/poetry/index.html new file mode 100644 index 000000000..a00fa127a --- /dev/null +++ b/v0.39.2/switching_from/poetry/index.html @@ -0,0 +1,1843 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Poetry - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Transitioning from poetry to pixi#

+

Welcome to the guide designed to ease your transition from poetry to pixi. +This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. +With pixi, you'll experience a project-based workflow similar to poetry while including the conda ecosystem and allowing for easy sharing of your work.

+

Why Pixi?#

+

In the Python ecosystem, poetry is probably the closest tool to pixi in terms of project management. On top of the PyPI ecosystem, pixi adds the power of the conda ecosystem, allowing for more flexible and powerful environment management.

+

Quick look at the differences#

| Task | Poetry | Pixi |
| --- | --- | --- |
| Creating an Environment | `poetry new myenv` | `pixi init myenv` |
| Running a Task | `poetry run which python` | `pixi run which python` (pixi uses a built-in cross-platform shell for `run`, where poetry uses your shell) |
| Installing a Package | `poetry add numpy` | `pixi add numpy` adds the conda variant; `pixi add --pypi numpy` adds the PyPI variant |
| Uninstalling a Package | `poetry remove numpy` | `pixi remove numpy` removes the conda variant; `pixi remove --pypi numpy` removes the PyPI variant |
| Building a package | `poetry build` | We've yet to implement package building and publishing |
| Publishing a package | `poetry publish` | We've yet to implement package building and publishing |
| Reading the pyproject.toml | `[tool.poetry]` | `[tool.pixi]` |
| Defining dependencies | `[tool.poetry.dependencies]` | `[tool.pixi.dependencies]` for conda, `[tool.pixi.pypi-dependencies]` or `[project.dependencies]` for PyPI dependencies |
| Dependency definition | `numpy = "^1.2.3"`<br>`numpy = "~1.2.3"`<br>`numpy = "*"` | `numpy = ">=1.2.3 <2.0.0"`<br>`numpy = ">=1.2.3 <1.3.0"`<br>`numpy = "*"` |
| Lock file | `poetry.lock` | `pixi.lock` |
| Environment directory | `~/.cache/pypoetry/virtualenvs/myenv` | `./.pixi` (defaults to the project folder; move it using detached-environments) |

Support both poetry and pixi in my project#

+

You can let users use both poetry and pixi in the same project; the tools will not touch each other's parts of the configuration or system. It's best to duplicate the dependencies, essentially copying tool.poetry.dependencies into tool.pixi.pypi-dependencies. Make sure python is defined only in tool.pixi.dependencies and not in tool.pixi.pypi-dependencies.
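A minimal sketch of such a dual setup in pyproject.toml (the package names and version specifiers here are illustrative, not prescriptive):

```toml
[tool.poetry.dependencies]
python = "^3.11"
numpy = "^1.26"

# Mirrored for pixi; note that python lives only under tool.pixi.dependencies
[tool.pixi.dependencies]
python = "3.11.*"

[tool.pixi.pypi-dependencies]
numpy = ">=1.26,<2"
```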

+
+

Mixing pixi and poetry

+

It's possible to use poetry inside pixi environments, but we advise against it. Pixi supports PyPI dependencies differently than poetry does, and mixing the two can lead to unexpected behavior. Since you can only use one package manager at a time, it's best to stick to one.

+

If you use poetry on top of a pixi project, always install the poetry environment after the pixi environment, and let pixi handle the Python and poetry installation.

+
+ + + + + + + + + + \ No newline at end of file diff --git a/v0.39.2/tutorials/python/index.html b/v0.39.2/tutorials/python/index.html new file mode 100644 index 000000000..c9303bdcb --- /dev/null +++ b/v0.39.2/tutorials/python/index.html @@ -0,0 +1,2234 @@ + + + + + + + + + + + + + + + + + + + + + + + + + Python - Pixi by prefix.dev + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Tutorial: Doing Python development with Pixi#

+

In this tutorial, we will show you how to create a simple Python project with pixi. We will show some of the features pixi provides that are currently not part of pdm, poetry, and similar tools.

+

Why is this useful?#

+

Pixi builds upon the conda ecosystem, which allows you to create a Python environment with all the dependencies you need. This is especially useful when you are working with multiple Python interpreters and bindings to C and C++ libraries. For example, GDAL from PyPI does not have binary C dependencies, but the conda package does. On the other hand, some packages are only available through PyPI, which pixi can also install for you. Best of both worlds; let's give it a go!

+

pixi.toml and pyproject.toml#

+

We support two manifest formats: pyproject.toml and pixi.toml. +In this tutorial, we will use the pyproject.toml format because it is the most common format for Python projects.

+

Let's get started#

+

Let's start out by creating a new project that uses a pyproject.toml file.

+
pixi init pixi-py --format pyproject
+
+

This creates a project with the following structure:

+
├── src
+   └── pixi_py
+       └── __init__.py
+└── pyproject.toml
+
+

The pyproject.toml for the project looks like this:

+
[project]
+name = "pixi-py"
+version = "0.1.0"
+description = "Add a short description here"
+authors = [{name = "Tim de Jager", email = "tim@prefix.dev"}]
+requires-python = ">= 3.11"
+dependencies = []
+
+[build-system]
+build-backend = "hatchling.build"
+requires = ["hatchling"]
+
+[tool.pixi.project]
+channels = ["conda-forge"]
+platforms = ["osx-arm64"]
+
+[tool.pixi.pypi-dependencies]
+pixi-py = { path = ".", editable = true }
+
+[tool.pixi.tasks]
+
+

This project uses a src-layout, but pixi supports both flat- and src-layouts.

+

What's in the pyproject.toml?#

+

Okay, so let's have a look at what sections have been added and how we can modify the pyproject.toml.

+

These first entries were added to the pyproject.toml file:

+
# Main pixi entry
+[tool.pixi.project]
+channels = ["conda-forge"]
+# This is your machine platform by default
+platforms = ["osx-arm64"]
+
+

The channels and platforms are added to the [tool.pixi.project] section. Channels like conda-forge manage packages similarly to PyPI, but allow for packages across different languages. The platforms key determines which platforms the project supports.
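For example, to let the project solve for more platforms than the machine it was initialized on, you could extend the list (platform names as used elsewhere in the pixi docs):

```toml
[tool.pixi.project]
channels = ["conda-forge"]
platforms = ["osx-arm64", "linux-64", "win-64"]
```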

+

The pixi_py package itself is added as an editable dependency. +This means that the package is installed in editable mode, so you can make changes to the package and see the changes reflected in the environment, without having to re-install the environment.

+
# Editable installs
+[tool.pixi.pypi-dependencies]
+pixi-py = { path = ".", editable = true }
+
+

In pixi, unlike in many other package managers, this is stated explicitly in the pyproject.toml file. The main reason is that this lets you choose which environments the package should be included in.
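For illustration, assuming you use pixi's features and environments, the editable install could be scoped to a single environment (the dev feature and environment names here are hypothetical):

```toml
# Install the package in editable mode only in the "dev" environment
[tool.pixi.feature.dev.pypi-dependencies]
pixi-py = { path = ".", editable = true }

[tool.pixi.environments]
dev = ["dev"]
```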

+

Managing both conda and PyPI dependencies in pixi#

+

Our projects usually depend on other packages.

+
$ pixi add black
+Added black
+
+

This will result in the following addition to the pyproject.toml:

+
# Dependencies
+[tool.pixi.dependencies]
+black = ">=24.4.2,<24.5"
+
+

But we can also be strict about the version that should be used with pixi add black=24, resulting in

+
[tool.pixi.dependencies]
+black = "24.*"
+
+

Now, let's add some optional dependencies:

+
pixi add --pypi --feature test pytest
+
+

Which results in the following fields added to the pyproject.toml: +

[project.optional-dependencies]
+test = ["pytest"]
+

+

After we have added the optional dependencies to the pyproject.toml, pixi automatically creates a feature, which can contain a collection of dependencies, tasks, channels, and more.
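To actually solve and install that feature, it is typically mapped to an environment; a sketch (the test environment name mirrors the feature created above and matches the pixi install -e test command used later in this tutorial):

```toml
[tool.pixi.environments]
test = ["test"]
```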

+

Sometimes there are packages that aren't available on conda channels but are published on PyPI. +We can add these as well, which pixi will solve together with the default dependencies.

+
$ pixi add black --pypi
+Added black
+Added these as pypi-dependencies.
+
+

which results in the addition to the dependencies key in the pyproject.toml

+
dependencies = ["black"]
+
+

When using pypi-dependencies, you can make use of the optional dependencies (extras) that other packages make available. For example, black provides a cli extra, which can be added with the --pypi flag:

+
$ pixi add black[cli] --pypi
+Added black[cli]
+Added these as pypi-dependencies.
+
+

which updates the dependencies entry to

+
dependencies = ["black[cli]"]
+
+
+Optional dependencies in pixi.toml +

This tutorial focuses on the use of the pyproject.toml, but in case you're curious, the pixi.toml would contain the following entry after the installation of a PyPI package including an optional dependency: +

[pypi-dependencies]
+black = { version = "*", extras = ["cli"] }
+

+
+

Installation: pixi install#

+

Now let's install the project with pixi install:

+
$ pixi install
+ The default environment has been installed.
+
+

We now have a new directory called .pixi in the project root. +This directory contains the environment that was created when we ran pixi install. +The environment is a conda environment that contains the dependencies that we specified in the pyproject.toml file. +We can also install the test environment with pixi install -e test. +We can use these environments for executing code.

+

We also have a new file called pixi.lock in the project root. +This file contains the exact versions of the dependencies that were installed in the environment across platforms.

+

What's in the environment?#

+

Using pixi list, you can see what's in the environment; this is essentially a nicer view of the lock file:

+
$ pixi list
+Package          Version       Build               Size       Kind   Source
+bzip2            1.0.8         h93a5062_5          119.5 KiB  conda  bzip2-1.0.8-h93a5062_5.conda
+black            24.4.2                            3.8 MiB    pypi   black-24.4.2-cp312-cp312-win_amd64.http.whl
+ca-certificates  2024.2.2      hf0a4a13_0          152.1 KiB  conda  ca-certificates-2024.2.2-hf0a4a13_0.conda
+libexpat         2.6.2         hebf3989_0          62.2 KiB   conda  libexpat-2.6.2-hebf3989_0.conda
+libffi           3.4.2         h3422bc3_5          38.1 KiB   conda  libffi-3.4.2-h3422bc3_5.tar.bz2
+libsqlite        3.45.2        h091b4b1_0          806 KiB    conda  libsqlite-3.45.2-h091b4b1_0.conda
+libzlib          1.2.13        h53f4e23_5          47 KiB     conda  libzlib-1.2.13-h53f4e23_5.conda
+ncurses          6.4.20240210  h078ce10_0          801 KiB    conda  ncurses-6.4.20240210-h078ce10_0.conda
+openssl          3.2.1         h0d3ecfb_1          2.7 MiB    conda  openssl-3.2.1-h0d3ecfb_1.conda
+python           3.12.3        h4a7b5fc_0_cpython  12.6 MiB   conda  python-3.12.3-h4a7b5fc_0_cpython.conda
+readline         8.2           h92ec313_1          244.5 KiB  conda  readline-8.2-h92ec313_1.conda
+tk               8.6.13        h5083fa2_1          3 MiB      conda  tk-8.6.13-h5083fa2_1.conda
+tzdata           2024a         h0c530f3_0          117 KiB    conda  tzdata-2024a-h0c530f3_0.conda
+pixi-py          0.1.0                                        pypi   . (editable)
+xz               5.2.6         h57fd34a_0          230.2 KiB  conda  xz-5.2.6-h57fd34a_0.tar.bz2
+
+
+

Python

+

The Python interpreter is also installed in the environment. +This is because the Python interpreter version is read from the requires-python field in the pyproject.toml file. +This is used to determine the Python version to install in the environment. +This way, pixi automatically manages/bootstraps the Python interpreter for you, so no more brew, apt or other system install steps.

+
+
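For reference, the requires-python field that drives this interpreter choice lives in the pyproject.toml; a minimal sketch might look like this (the exact bound is hypothetical and should match your project):

```toml
[project]
name = "pixi-py"
requires-python = ">= 3.11"
```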

Here, you can see the different conda and PyPI packages listed. +As you can see, the pixi-py package that we are working on is installed in editable mode. +Every environment in pixi is isolated but reuses files that are hard-linked from a central cache directory. +This means that you can have multiple environments with the same packages but only have the individual files stored once on disk.

+

We can create the default and test environments based on our own test feature defined in the optional dependencies:

+
pixi project environment add default --solve-group default
+pixi project environment add test --feature test --solve-group default
+
+

Which results in:

+
# Environments
+[tool.pixi.environments]
+default = { solve-group = "default" }
+test = { features = ["test"], solve-group = "default" }
+
+
+Solve Groups +

Solve groups are a way to group dependencies together. +This is useful when you have multiple environments that share the same dependencies. +For example, maybe pytest is a dependency that influences the dependencies of the default environment. +By putting these in the same solve group, you ensure that the versions in test and default are exactly the same.

+
+

The default environment is created when you run pixi install. +The test environment is created from the optional dependencies in the pyproject.toml file. +You can execute commands in this environment with e.g. pixi run -e test python.
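The optional dependencies that feed the test feature are declared in the pyproject.toml along these lines (a sketch, assuming pytest is the only test dependency):

```toml
[project.optional-dependencies]
test = ["pytest"]
```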

+

Getting code to run#

+

Let's add some code to the pixi-py package. +We will add a new function to the src/pixi_py/__init__.py file:

+
from rich import print
+
+def hello():
+    return "Hello, [bold magenta]World[/bold magenta]!", ":vampire:"
+
+def say_hello():
+    print(*hello())
+
+

Now add the rich dependency from PyPI using: pixi add --pypi rich.

+

Let's see if this works by running:

+
pixi r python -c "import pixi_py; pixi_py.say_hello()"
+Hello, World! 🧛
+
+
+Slow? +

This might be slow (2 minutes) the first time because pixi installs the project, but it will be near-instant the second time.

+
+

Pixi runs the self-installed Python interpreter. +Then, we import the pixi_py package, which is installed in editable mode. +The code calls the say_hello function that we just added. +And it works! Cool!

+

Testing this code#

+

Okay, so let's add a test for this function. +Let's add a tests/test_me.py file in the root of the project.

+

Giving us the following project structure:

+
.
+├── pixi.lock
+├── src
+   └── pixi_py
+       └── __init__.py
+├── pyproject.toml
+└── tests/test_me.py
+
+
from pixi_py import hello
+
+def test_pixi_py():
+    assert hello() == ("Hello, [bold magenta]World[/bold magenta]!", ":vampire:")
+
+

Let's add an easy task for running the tests.

+
$ pixi task add --feature test test "pytest"
+ Added task `test`: pytest .
+
+

So pixi has a task system to make it easy to run commands, +similar to npm scripts or something you would specify in a Justfile.
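The pixi task add command above writes a task entry into the pyproject.toml, scoped to the test feature because of the --feature flag. A sketch of what that entry might look like (worth double-checking against your own file):

```toml
[tool.pixi.feature.test.tasks]
test = "pytest"
```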

+
+Pixi tasks +

Tasks are actually a pretty cool pixi feature that is powerful and runs in a cross-platform shell. +You can do caching, dependencies and more. +Read more about tasks in the tasks section.

+
+
$ pixi r test
+ Pixi task (test): pytest .
+================================================================================================= test session starts =================================================================================================
+platform darwin -- Python 3.12.2, pytest-8.1.1, pluggy-1.4.0
+rootdir: /private/tmp/pixi-py
+configfile: pyproject.toml
+collected 1 item
+
+test_me.py .                                                                                                                                                                                                    [100%]
+
+================================================================================================== 1 passed in 0.00s =================================================================================================
+
+

Neat! It seems to be working!

+

Test vs Default environment#

+

Let's compare the output of the test and default environments...

+
pixi list -e test
+# vs. default environment
+pixi list
+
+

We see that the test environment has:

+
Package          Version       Build               Size       Kind   Source
+...
+pytest           8.1.1                             1.1 MiB    pypi   pytest-8.1.1-py3-none-any.whl
+...
+
+

However, the default environment is missing this package. +This way, you can fine-tune your environments to only have the packages that are needed for that environment. +E.g. you could also have a dev environment that has pytest and ruff installed, but you could omit these from the prod environment. +There is a docker example that shows how to set up a minimal prod environment and copy from there.
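As a sketch of that idea, a manifest could declare a dev feature and keep it out of the production environment (the feature and environment names here are illustrative, not from the tutorial project):

```toml
[project.optional-dependencies]
dev = ["pytest", "ruff"]

[tool.pixi.environments]
prod = { solve-group = "prod" }
dev = { features = ["dev"], solve-group = "prod" }
```

Putting both environments in the same solve group keeps the shared package versions identical between them.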

+

Replacing PyPI packages with conda packages#

+

Lastly, pixi allows PyPI packages to depend on conda packages. +Let's confirm this with pixi list:

+
$ pixi list
+Package          Version       Build               Size       Kind   Source
+...
+pygments         2.17.2                            4.1 MiB    pypi   pygments-2.17.2-py3-none-any.http.whl
+...
+
+

Let's explicitly add pygments, a dependency of the rich package, to the pyproject.toml file.

+
pixi add pygments
+
+

This will add the following to the pyproject.toml file:

+
[tool.pixi.dependencies]
+pygments = ">=2.17.2,<2.18"
+
+

We can see that the pygments package is now installed as a conda package.

+
$ pixi list
+Package          Version       Build               Size       Kind   Source
+...
+pygments         2.17.2        pyhd8ed1ab_0        840.3 KiB  conda  pygments-2.17.2-pyhd8ed1ab_0.conda
+
+

This way, PyPI dependencies and conda dependencies can be mixed and matched to seamlessly interoperate.

+
$  pixi r python -c "import pixi_py; pixi_py.say_hello()"
+Hello, World! 🧛
+
+

And it still works!

+

Conclusion#

+

In this tutorial, you've seen how easy it is to use a pyproject.toml to manage your pixi dependencies and environments. +We have also explored how to use PyPI and conda dependencies seamlessly together in the same project and install optional dependencies to manage Python packages.

+

Hopefully, this provides a flexible and powerful way to manage your Python projects and a fertile base for further exploration with Pixi.

+

Thanks for reading! Happy Coding 🚀

+

Any questions? Feel free to reach out or share this tutorial on X, join our Discord, send us an e-mail or follow our GitHub.

ROS 2 - Pixi by prefix.dev

Tutorial: Develop a ROS 2 package with pixi#

+

In this tutorial, we will show you how to develop a ROS 2 package using pixi. +The tutorial is written to be executed from top to bottom, missing steps might result in errors.

+

The audience for this tutorial is developers who are familiar with ROS 2 and are interested in trying pixi for their development workflow.

+

Prerequisites#

+
    +
  • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. + The crux of this tutorial is to show that you only need pixi!
  • +
  • On Windows, it's advised to enable Developer mode. Go to Settings -> Update & Security -> For developers -> Developer mode.
  • +
+
+

If you're new to pixi, you can check out the basic usage guide. +This will teach you the basics of a pixi project within 3 minutes.

+
+

Create a pixi project.#

+
pixi init my_ros2_project -c robostack-staging -c conda-forge
+cd my_ros2_project
+
+

It should have created a directory structure like this:

+
my_ros2_project
+├── .gitattributes
+├── .gitignore
+└── pixi.toml
+
+

The pixi.toml file is the manifest file for your project. It should look like this:

+
pixi.toml
[project]
+name = "my_ros2_project"
+version = "0.1.0"
+description = "Add a short description here"
+authors = ["User Name <user.name@email.url>"]
+channels = ["robostack-staging", "conda-forge"]
+# Your project can support multiple platforms, the current platform will be automatically added.
+platforms = ["linux-64"]
+
+[tasks]
+
+[dependencies]
+
+

The channels you added to the init command are repositories of packages; you can search these repositories through our prefix.dev website. +The platforms are the systems you want to support. In pixi you can support multiple platforms, but you have to define which ones, so pixi can check that your dependencies are available for them. +For the rest of the fields, you can fill them in as you see fit.

+

Add ROS 2 dependencies#

+

To use a pixi project you don't need any dependencies on your system; all the dependencies you need should be added through pixi, so other users can use your project without any issues.

+

Let's start with the turtlesim example

+
pixi add ros-humble-desktop ros-humble-turtlesim
+
+

This will add the ros-humble-desktop and ros-humble-turtlesim packages to your manifest. +Depending on your internet speed this might take a minute, as it will also install ROS in your project folder (.pixi).
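After the add command, the [dependencies] table in pixi.toml contains pinned version ranges along these lines (the exact bounds shown here are illustrative; they depend on what was resolved at install time):

```toml
[dependencies]
ros-humble-desktop = ">=0.10.0,<0.11"
ros-humble-turtlesim = ">=1.4.2,<1.5"
```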

+

Now run the turtlesim example.

+
pixi run ros2 run turtlesim turtlesim_node
+
+

Or use the shell command to start an activated environment in your terminal.

+
pixi shell
+ros2 run turtlesim turtlesim_node
+
+

Congratulations, you have ROS 2 running on your machine with pixi!

+
+Some more fun with the turtle +

To control the turtle you can run the following command in a new terminal: +

cd my_ros2_project
+pixi run ros2 run turtlesim turtle_teleop_key
+
+Now you can control the turtle with the arrow keys on your keyboard.

+
+

Turtlesim control

+

Add a custom Python node#

+

As ROS works with custom nodes, let's add a custom node to our project.

+
pixi run ros2 pkg create --build-type ament_python --destination-directory src --node-name my_node my_package
+
+

To build the package we need some more dependencies:

+
pixi add colcon-common-extensions "setuptools<=58.2.0"
+
+

Add the created initialization script for the ROS workspace to your manifest file.

+

Then run the build command

+
pixi run colcon build
+
+

This will create a sourceable script in the install folder; you can source this script through an activation script to use your custom node. +Normally this would be the script you add to your .bashrc, but instead you tell pixi to use it by adding the following to pixi.toml:

+
+
+
+
pixi.toml
[activation]
+scripts = ["install/setup.sh"]
+
+
+
+
pixi.toml
[activation]
+scripts = ["install/setup.bat"]
+
+
+
+
+
+Multi platform support +

You can add multiple activation scripts for different platforms, so you can support multiple platforms with one project. +Use the following example to add support for both Linux and Windows, using the target syntax.

+
[project]
+platforms = ["linux-64", "win-64"]
+
+[activation]
+scripts = ["install/setup.sh"]
+[target.win-64.activation]
+scripts = ["install/setup.bat"]
+
+
+

Now you can run your custom node with the following command

+
pixi run ros2 run my_package my_node
+
+

Simplify the user experience#

+

In pixi we have a feature called tasks; this allows you to define a task in your manifest file and run it with a simple command. +Let's add tasks to run the turtlesim example, the build, and the custom node.

+
pixi task add sim "ros2 run turtlesim turtlesim_node"
+pixi task add build "colcon build --symlink-install"
+pixi task add hello "ros2 run my_package my_node"
+
+

Now you can run these tasks by simply running:

+
pixi run sim
+pixi run build
+pixi run hello
+
+
+Advanced task usage +

Tasks are a powerful feature in pixi.

+
    +
  • You can add depends-on to the tasks to create a task chain.
  • +
  • You can add cwd to the tasks to run the task in a different directory from the root of the project.
  • +
  • You can add inputs and outputs to the tasks to create a task that only runs when the inputs are changed.
  • +
  • You can use the target syntax to run specific tasks on specific machines.
  • +
+
+
[tasks]
+sim = "ros2 run turtlesim turtlesim_node"
+build = {cmd = "colcon build --symlink-install", inputs = ["src"]}
+hello = { cmd = "ros2 run my_package my_node", depends-on = ["build"] }
+
+

Build a C++ node#

+

To build a C++ node you need to add ament_cmake and some other build dependencies to your manifest file.

+
pixi add ros-humble-ament-cmake-auto compilers pkg-config cmake ninja
+
+

Now you can create a C++ node with the following command

+
pixi run ros2 pkg create --build-type ament_cmake --destination-directory src --node-name my_cpp_node my_cpp_package
+
+

Now you can build it again and run it with the following commands

+
# Passing arguments to the build command to build with Ninja, add them to the manifest if you want to default to ninja.
+pixi run build --cmake-args -G Ninja
+pixi run ros2 run my_cpp_package my_cpp_node
+
+
+Tip +

Add the cpp task to the manifest file to simplify the user experience.

+
pixi task add hello-cpp "ros2 run my_cpp_package my_cpp_node"
+
+
+

Conclusion#

+

In this tutorial, we showed you how to create a Python & CMake ROS2 project using pixi. +We also showed you how to add dependencies to your project using pixi, and how to run your project using pixi run. +This way you can make sure that your project is reproducible on all your machines that have pixi installed.

+

Show Off Your Work!#

+

Finished with your project? +We'd love to see what you've created! +Share your work on social media using the hashtag #pixi and tag us @prefix_dev. +Let's inspire the community together!

+

Frequently asked questions#

+

What happens with rosdep?#

+

Currently, we don't support rosdep in a pixi environment, so you'll have to add the packages using pixi add. +rosdep would call conda install, which isn't supported in a pixi environment.

Rust - Pixi by prefix.dev

Tutorial: Develop a Rust package using pixi#

+

In this tutorial, we will show you how to develop a Rust package using pixi. +The tutorial is written to be executed from top to bottom, missing steps might result in errors.

+

The audience for this tutorial is developers who are familiar with Rust and cargo and are interested in trying pixi for their development workflow. +The benefit within a Rust workflow is that you lock both Rust and the C/system dependencies your project might be using. E.g. tokio users will almost certainly use openssl.

+
+

If you're new to pixi, you can check out the basic usage guide. +This will teach you the basics of a pixi project within 3 minutes.

+
+

Prerequisites#

+
    +
  • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. + The crux of this tutorial is to show that you only need pixi!
  • +
+

Create a pixi project.#

+
pixi init my_rust_project
+cd my_rust_project
+
+

It should have created a directory structure like this:

+
my_rust_project
+├── .gitattributes
+├── .gitignore
+└── pixi.toml
+
+

The pixi.toml file is the manifest file for your project. It should look like this:

+
pixi.toml
[project]
+name = "my_rust_project"
+version = "0.1.0"
+description = "Add a short description here"
+authors = ["User Name <user.name@email.url>"]
+channels = ["conda-forge"]
+platforms = ["linux-64"] # (1)!
+
+[tasks]
+
+[dependencies]
+
+
    +
  1. The platforms field is set to your system's platform by default. You can change it to any platforms you want to support, e.g. ["linux-64", "osx-64", "osx-arm64", "win-64"].
  2. +
+

Add Rust dependencies#

+

To use a pixi project you don't need any dependencies on your system; all the dependencies you need should be added through pixi, so other users can use your project without any issues. +

pixi add rust
+

+

This will add the rust package to your pixi.toml file under [dependencies]. +This includes the Rust toolchain and cargo.
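The resulting entry looks something like this (the version range is illustrative; it reflects whatever Rust release was current when you ran the command):

```toml
[dependencies]
rust = ">=1.77.0,<1.78"
```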

+

Add a cargo project#

+

Now that you have Rust installed, you can create a cargo project in your pixi project. +

pixi run cargo init
+

+

pixi run is pixi's way to run commands in the pixi environment; it will make sure that the environment is set up correctly for the command to run. +It runs its own cross-platform shell; if you want more information, check out the tasks documentation. +You can also activate the environment in your own shell by running pixi shell; after that you don't need pixi run ... anymore.

+

Now we can build a cargo project using pixi. +

pixi run cargo build
+
+To simplify the build process, you can add a build task to your pixi.toml file using the following command: +
pixi task add build "cargo build"
+
+Which creates this field in the pixi.toml file: +
pixi.toml
[tasks]
+build = "cargo build"
+

+

And now you can build your project using: +

pixi run build
+

+

You can also run your project using: +

pixi run cargo run
+
+Which you can simplify with a task again. +
pixi task add start "cargo run"
+

+

So you should get the following output: +

pixi run start
+Hello, world!
+

+

Congratulations, you have a Rust project running on your machine with pixi!

+

Next steps, why is this useful when there is rustup?#

+

Cargo is not a binary package manager, but a source-based package manager. +This means that you need to have the Rust compiler installed on your system to use it. +And possibly other dependencies that are not included in the cargo package manager. +For example, you might need to install openssl or libssl-dev on your system to build a package. +This is the case for pixi as well, but pixi will install these dependencies in your project folder, so you don't have to worry about them.

+

Add the following dependencies to your cargo project: +

pixi run cargo add git2
+

+

If your system is not preconfigured to build C code and does not have the libssl-dev package installed, you will not be able to build the project: +

pixi run build
+...
+Could not find directory of OpenSSL installation, and this `-sys` crate cannot
+proceed without this knowledge. If OpenSSL is installed and this crate had
+trouble finding it,  you can set the `OPENSSL_DIR` environment variable for the
+compilation process.
+
+Make sure you also have the development packages of openssl installed.
+For example, `libssl-dev` on Ubuntu or `openssl-devel` on Fedora.
+
+If you're in a situation where you think the directory *should* be found
+automatically, please open a bug at https://github.com/sfackler/rust-openssl
+and include information about your system as well as this message.
+
+$HOST = x86_64-unknown-linux-gnu
+$TARGET = x86_64-unknown-linux-gnu
+openssl-sys = 0.9.102
+
+
+It looks like you're compiling on Linux and also targeting Linux. Currently this
+requires the `pkg-config` utility to find OpenSSL but unfortunately `pkg-config`
+could not be found. If you have OpenSSL installed you can likely fix this by
+installing `pkg-config`.
+...
+
You can fix this by adding the necessary dependencies for building git2 with pixi: +
pixi add openssl pkg-config compilers
+

+

Now you should be able to build your project again: +

pixi run build
+...
+   Compiling git2 v0.18.3
+   Compiling my_rust_project v0.1.0 (/my_rust_project)
+    Finished dev [unoptimized + debuginfo] target(s) in 7.44s
+     Running `target/debug/my_rust_project`
+

+

Extra: Add more tasks#

+

You can add more tasks to your pixi.toml file to simplify your workflow.

+

For example, you can add a test task to run your tests: +

pixi task add test "cargo test"
+

+

And you can add a clean task to clean your project: +

pixi task add clean "cargo clean"
+

+

You can add a formatting task to your project: +

pixi task add fmt "cargo fmt"
+

+

You can extend these tasks to run multiple commands with the use of the depends-on field. +

pixi task add lint "cargo clippy" --depends-on fmt
+
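After adding all of the tasks above, the [tasks] table in pixi.toml should look roughly like this:

```toml
[tasks]
build = "cargo build"
start = "cargo run"
test = "cargo test"
clean = "cargo clean"
fmt = "cargo fmt"
lint = { cmd = "cargo clippy", depends-on = ["fmt"] }
```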

+

Conclusion#

+

In this tutorial, we showed you how to create a Rust project using pixi. +We also showed you how to add dependencies to your project using pixi. +This way you can make sure that your project is reproducible on any system that has pixi installed.

+

Show Off Your Work!#

+

Finished with your project? +We'd love to see what you've created! +Share your work on social media using the hashtag #pixi and tag us @prefix_dev. +Let's inspire the community together!

Pixi Vision - Pixi by prefix.dev

Vision#

+

We created pixi because we want to have a cargo/npm/yarn-like package management experience for conda. We really love what the conda packaging ecosystem achieves, but we think that the user experience can be improved a lot. +Modern package managers like cargo have shown us how great a package manager can be. We want to bring that experience to the conda ecosystem.

+

Pixi values#

+

We want to make pixi a great experience for everyone, so we have a few values that we want to uphold:

+
    +
  1. Fast. We want to have a fast package manager, that is able to solve the environment in a few seconds.
  2. +
  3. User Friendly. We want to have a package manager that puts user friendliness front and center, providing easy, accessible and intuitive commands that follow the principle of least surprise.
  4. +
  5. Isolated Environment. We want to have isolated environments, that are reproducible and easy to share. Ideally, it should run on all common platforms. The Conda packaging system provides an excellent base for this.
  6. +
  7. Single Tool. We want to integrate most common uses when working on a development project with Pixi, so it should support at least dependency management, command management, building and uploading packages. You should not need to reach to another external tool for this.
  8. +
  9. Fun. It should be fun to use pixi and not cause frustrations, you should not need to think about it a lot and it should generally just get out of your way.
  10. +
+

Conda#

+

We are building on top of the conda packaging ecosystem, which means we have a huge number of packages available for different platforms on conda-forge. We believe the conda packaging ecosystem provides a solid base to manage your dependencies. Conda-forge is community-maintained, very open to contributions, widely used in data science, scientific computing, robotics and other fields, and has a proven track record.

+

Target languages#

+

Essentially, we are language-agnostic; we are targeting any language that can be installed with conda, including C++, Python, Rust, Zig, etc. +But we do believe the Python ecosystem can benefit from a good package manager that is based on conda, +so we are trying to provide an alternative to existing solutions there. +We also think we can provide a good solution for C++ projects, as there are a lot of libraries available on conda-forge today. +Pixi also truly shines when using it for multi-language projects, e.g. a mix of C++ and Python, because we provide a nice way to build everything up to and including +system-level packages.

git clone https://github.com/prefix-dev/pixi.git
+