
Commit

Merge branch 'PyPSA:master' into master
martacki authored Nov 7, 2024
2 parents 8140062 + 35050d9 commit 21a97ba
Showing 30 changed files with 1,281 additions and 403 deletions.
27 changes: 15 additions & 12 deletions .github/workflows/test.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -23,18 +23,18 @@ concurrency:
jobs:
run-tests:
name: OS
runs-on: ${{ matrix.os }}-latest
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
os: [ubuntu, macos, windows]
include:
- os: ubuntu-latest
env_file: envs/ubuntu-pinned.yaml
- os: macos-latest
- os: ubuntu
env_file: envs/linux-pinned.yaml
- os: macos
env_file: envs/macos-pinned.yaml
- os: windows-latest
- os: windows
env_file: envs/windows-pinned.yaml
runs-on: ${{ matrix.os }}

defaults:
run:
Expand Down Expand Up @@ -75,16 +75,19 @@ jobs:
make test
- name: Upload artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: logs-${{ matrix.inhouse }}
name: results-${{ matrix.os }}
path: |
logs
.snakemake/log
results
retention-days: 3

run-tests-on-dev-deps:
name: Inhouse
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
Expand All @@ -93,7 +96,6 @@ jobs:
- atlite
- powerplantmatching
- linopy
runs-on: ubuntu-latest

defaults:
run:
Expand All @@ -105,7 +107,7 @@ jobs:
# Only run checks if package is not pinned
- name: Check if inhouse package is pinned
run: |
grep_line=$(grep -- '- ${{ matrix.inhouse }}' envs/ubuntu-pinned.yaml)
grep_line=$(grep -- '- ${{ matrix.inhouse }}' envs/environment.yaml)
if [[ $grep_line == *"<"* || $grep_line == *"=="* ]]; then
echo "pinned=true" >> $GITHUB_ENV
else
Expand Down Expand Up @@ -137,12 +139,12 @@ jobs:
uses: actions/cache@v4
with:
path: ${{ env.CONDA }}/envs
key: conda-${{ runner.os }}-${{ runner.arch }}-${{ matrix.inhouse }}-hashFiles("envs/ubuntu-pinned.yaml")
key: conda-${{ runner.os }}-${{ runner.arch }}-${{ matrix.inhouse }}-hashFiles("envs/linux-pinned.yaml")
id: cache-env

- name: Update environment
if: env.pinned == 'false' && steps.cache-env.outputs.cache-hit != 'true'
run: conda env update -n pypsa-eur -f envs/ubuntu-pinned.yaml
run: conda env update -n pypsa-eur -f envs/linux-pinned.yaml

- name: Install inhouse packages from master
if: env.pinned == 'false'
Expand All @@ -158,8 +160,9 @@ jobs:
if: env.pinned == 'false'
uses: actions/upload-artifact@v4
with:
name: logs-${{ matrix.inhouse }}
name: results-inhouse-${{ matrix.inhouse }}
path: |
logs
.snakemake/log
results
retention-days: 3
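The pinned-package check in the workflow above decides via a `grep` over the environment file: a dependency counts as pinned when its spec line contains `<` or `==`. The same decision logic can be sketched in Python (the sample file content and helper name are illustrative, not repository code):

```python
# Sketch of the workflow's pinned-package check: a dependency counts as
# "pinned" when its spec line in the environment file contains "<" or "==".
SAMPLE_ENV = """\
dependencies:
- pypsa
- atlite==0.2.12
- linopy<0.4
"""

def is_pinned(package: str, env_text: str) -> bool:
    for line in env_text.splitlines():
        if f"- {package}" in line:
            return "<" in line or "==" in line
    return False  # package not listed at all

print(is_pinned("pypsa", SAMPLE_ENV))   # no version spec -> False
print(is_pinned("atlite", SAMPLE_ENV))  # pinned with "==" -> True
print(is_pinned("linopy", SAMPLE_ENV))  # pinned with "<"  -> True
```

In the workflow itself, the result is exported to `$GITHUB_ENV` so later steps can skip updating pinned in-house packages.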
32 changes: 16 additions & 16 deletions .github/workflows/update-fixed-env.yaml
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
name: Fixed-Version Environment Checker
name: Update pinned envs

on:
push:
Expand All @@ -12,19 +12,19 @@ on:


jobs:
update-environment-fixed:
update-pinned-environment:
name: Update pinned envs
runs-on: ${{ matrix.os }}
runs-on: ${{ matrix.os }}-latest
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
os: [ubuntu, macos, windows]
include:
- os: ubuntu-latest
suffix: "ubuntu"
- os: macos-latest
- os: ubuntu
suffix: "linux"
- os: macos
suffix: "macos"
- os: windows-latest
- os: windows
suffix: "windows"

steps:
Expand All @@ -36,7 +36,7 @@ jobs:
activate-environment: ${{ github.event.repository.name }}
environment-file: envs/environment.yaml

- name: Update environment.fixed.yaml per OS
- name: Update pinned environment per OS
run: |
conda env export --name ${{ github.event.repository.name }} --no-builds > envs/${{ matrix.suffix }}-pinned.yaml
Expand All @@ -49,8 +49,8 @@ jobs:
- name: Add SPDX header (windows)
if: ${{ matrix.suffix == 'windows' }}
run: |
$SPDX_HEADER = "# SPDX-FileCopyrightText: 2017-2024 The PyPSA-Eur Authors`n# SPDX-License-Identifier: CC0-1.0`n"
$CurrentContent = Get-Content "envs/${{ matrix.suffix }}-pinned.yaml"
$SPDX_HEADER = "# SPDX-FileCopyrightText: 2017-2024 The PyPSA-Eur Authors`r`n# SPDX-License-Identifier: CC0-1.0`r`n`r`n"
$CurrentContent = Get-Content "envs/${{ matrix.suffix }}-pinned.yaml" -Raw
$NewContent = $SPDX_HEADER + $CurrentContent
$NewContent | Set-Content "envs/${{ matrix.suffix }}-pinned.yaml"
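The PowerShell fix above switches to `Get-Content -Raw` with explicit CRLF endings so the SPDX header is prepended once to the whole file rather than to every line. A Python sketch of the same prepend-as-a-block idea (the file name is a stand-in, not a repository path):

```python
# Prepend an SPDX header to a file as one block -- analogous to reading
# with Get-Content -Raw and writing the header plus the raw content.
import tempfile
from pathlib import Path

SPDX_HEADER = (
    "# SPDX-FileCopyrightText: 2017-2024 The PyPSA-Eur Authors\n"
    "# SPDX-License-Identifier: CC0-1.0\n"
    "\n"
)

def add_spdx_header(path: Path) -> None:
    content = path.read_text()              # whole file, like -Raw
    path.write_text(SPDX_HEADER + content)  # header precedes original text

pinned = Path(tempfile.mkdtemp()) / "example-pinned.yaml"
pinned.write_text("dependencies:\n- pypsa\n")
add_spdx_header(pinned)
print(pinned.read_text().splitlines()[0])  # -> # SPDX-FileCopyrightText: 2017-2024 The PyPSA-Eur Authors
```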
Expand All @@ -61,7 +61,7 @@ jobs:
path: envs/${{ matrix.suffix }}-pinned.yaml

create-pull-request:
needs: update-environment-fixed
needs: update-pinned-environment
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
Expand All @@ -71,15 +71,15 @@ jobs:
- name: Prepare files for commit
run: |
mkdir -p envs
mv ubuntu-pinned/* envs/ubuntu-pinned.yaml
mv linux-pinned/* envs/linux-pinned.yaml
mv macos-pinned/* envs/macos-pinned.yaml
mv windows-pinned/* envs/windows-pinned.yaml
- name: Create Pull Request
uses: peter-evans/create-pull-request@v7
with:
token: ${{ secrets.GITHUB_TOKEN }}
branch: update-environment-fixed
title: "[github-actions.ci] Update fixed environments"
body: "Automatically generated PR to update pinned environment files for Windows, macOS, and Ubuntu."
branch: update-pinned-environment
title: "[github-actions.ci] Update pinned envs"
body: "Automatically generated PR to update pinned environment files for Windows, macOS, and Linux."
commit-message: "Update pinned environment files for all platforms"
36 changes: 21 additions & 15 deletions Makefile
Original file line number Diff line number Diff line change
Expand Up @@ -2,7 +2,7 @@
#
# SPDX-License-Identifier: CC0-1.0

.PHONY: _conda_check install install-fixed test clean-tests reset
.PHONY: _conda_check install install-pinned-linux install-pinned-windows install-pinned-macos test clean-tests reset

# Helper: Check if conda or mamba is installed and set CONDA_OR_MAMBA variable
_conda_check:
Expand All @@ -18,33 +18,39 @@ _conda_check:
exit 1; \
fi

# Install the environment
# Install environment
install: _conda_check
@$(CONDA_OR_MAMBA) env create -f envs/environment.yaml
@$(CONDA_OR_MAMBA) run -n pypsa-eur pre-commit install

# Install fixed environment
install-fixed: _conda_check
@$(CONDA_OR_MAMBA) env create -f envs/environment.fixed.yaml
# Install pinned environment
install-pinned-linux: _conda_check
@$(CONDA_OR_MAMBA) env create -f envs/pinned-linux.yaml
@$(CONDA_OR_MAMBA) run -n pypsa-eur pre-commit install
install-pinned-windows: _conda_check
@$(CONDA_OR_MAMBA) env create -f envs/pinned-windows.yaml
@$(CONDA_OR_MAMBA) run -n pypsa-eur pre-commit install
install-pinned-macos: _conda_check
@$(CONDA_OR_MAMBA) env create -f envs/pinned-macos.yaml
@$(CONDA_OR_MAMBA) run -n pypsa-eur pre-commit install

# Run default tests
test:
set -e
snakemake -call solve_elec_networks --configfile config/test/config.electricity.yaml --rerun-triggers=mtime
snakemake -call all --configfile config/test/config.overnight.yaml --rerun-triggers=mtime
snakemake -call all --configfile config/test/config.myopic.yaml --rerun-triggers=mtime
snakemake -call make_summary_perfect --configfile config/test/config.perfect.yaml --rerun-triggers=mtime
snakemake -call all --configfile config/test/config.scenarios.yaml --rerun-triggers=mtime -n
snakemake solve_elec_networks --configfile config/test/config.electricity.yaml --rerun-triggers=mtime
snakemake --configfile config/test/config.overnight.yaml --rerun-triggers=mtime
snakemake --configfile config/test/config.myopic.yaml --rerun-triggers=mtime
snakemake make_summary_perfect --configfile config/test/config.perfect.yaml --rerun-triggers=mtime
snakemake --configfile config/test/config.scenarios.yaml --rerun-triggers=mtime -n
echo "All tests completed successfully."

# Cleans all output files from tests
clean-tests:
snakemake -call solve_elec_networks --configfile config/test/config.electricity.yaml --rerun-triggers=mtime --delete-all-output
snakemake -call all --configfile config/test/config.overnight.yaml --rerun-triggers=mtime --delete-all-output
snakemake -call all --configfile config/test/config.myopic.yaml --rerun-triggers=mtime --delete-all-output
snakemake -call make_summary_perfect --configfile config/test/config.perfect.yaml --rerun-triggers=mtime --delete-all-output
snakemake -call all --configfile config/test/config.scenarios.yaml --rerun-triggers=mtime -n --delete-all-output
snakemake solve_elec_networks --configfile config/test/config.electricity.yaml --rerun-triggers=mtime --delete-all-output
snakemake --configfile config/test/config.overnight.yaml --rerun-triggers=mtime --delete-all-output
snakemake --configfile config/test/config.myopic.yaml --rerun-triggers=mtime --delete-all-output
snakemake make_summary_perfect --configfile config/test/config.perfect.yaml --rerun-triggers=mtime --delete-all-output
snakemake --configfile config/test/config.scenarios.yaml --rerun-triggers=mtime -n --delete-all-output

# Removes all created files except for large cutout files (similar to fresh clone)
reset:
Expand Down
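The `_conda_check` recipe that the install targets above depend on is collapsed in this hunk. As a hedged illustration (not the Makefile's actual shell recipe), its effect of preferring `mamba` and falling back to `conda` looks like:

```python
# Illustrative stand-in for the Makefile's _conda_check helper: prefer
# mamba, fall back to conda, and fail with a message if neither exists.
import shutil

def detect_conda_or_mamba() -> str:
    for manager in ("mamba", "conda"):
        if shutil.which(manager):  # found on PATH
            return manager
    raise RuntimeError("Neither mamba nor conda is installed.")

try:
    print(f"Using {detect_conda_or_mamba()}")
except RuntimeError as err:
    print(err)
```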
4 changes: 3 additions & 1 deletion config/config.default.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -597,7 +597,9 @@ sector:
overdimension_heat_generators:
decentral: 1.1 #to cover demand peaks bigger than data
central: 1.0
chp: true
chp:
enable: true
fuel: gas # for all fuels the same techno economic data from gas CHP is taken
micro_chp: false
solar_thermal: true
solar_cf_correction: 0.788457 # = >>> 1/1.2683
Expand Down
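With the change above, `chp` becomes a mapping with `enable` and `fuel` keys instead of a plain boolean, and `fuel` may be a single string or a list of fuels. A sketch of how such a value can be normalised for downstream code (helper name and defaults are illustrative, not the repository's actual code):

```python
# Normalise the new-style `chp` config (mapping with enable/fuel) while
# still tolerating the legacy boolean form; names here are illustrative.
def normalise_chp(chp_option):
    if isinstance(chp_option, bool):          # legacy form: chp: true/false
        return {"enable": chp_option, "fuel": ["gas"]}
    fuel = chp_option.get("fuel", "gas")
    if isinstance(fuel, str):                 # accept a string or a list
        fuel = [fuel]
    return {"enable": chp_option.get("enable", True), "fuel": fuel}

print(normalise_chp({"enable": True, "fuel": "gas"}))  # -> {'enable': True, 'fuel': ['gas']}
print(normalise_chp({"enable": True, "fuel": ["gas", "oil"]}))
```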
4 changes: 3 additions & 1 deletion doc/configtables/sector.csv
Original file line number Diff line number Diff line change
Expand Up @@ -93,7 +93,9 @@ biomass_boiler,--,"{true, false}",Add option for transforming biomass into heat
overdimension_heat_generators,,,Add option for overdimensioning heating systems by a certain factor. This allows them to cover heat demand peaks e.g. 10% higher than those in the data with a setting of 1.1.
-- decentral,--,float,The factor for overdimensioning (increasing CAPEX) decentral heating systems
-- central,--,float,The factor for overdimensioning (increasing CAPEX) central heating systems
chp,--,"{true, false}",Add option for using Combined Heat and Power (CHP)
chp,--,,
-- enable,--,"{true, false}",Add option for using Combined Heat and Power (CHP)
-- fuel,--,string or list of fuels,"Possible options are all fuels that have an existing bus and whose CO2 intensity is given in the technology data. Currently possible are ""gas"", ""oil"", ""methanol"", ""lignite"", ""coal"". For all fuels, the techno-economic data from gas CHP is used."
micro_chp,--,"{true, false}",Add option for using Combined Heat and Power (CHP) for decentral areas.
solar_thermal,--,"{true, false}",Add option for using solar thermal to generate heat.
solar_cf_correction,--,float,The correction factor for the value provided by the solar thermal profile calculations
Expand Down
6 changes: 3 additions & 3 deletions doc/configuration.rst
Original file line number Diff line number Diff line change
Expand Up @@ -77,13 +77,13 @@ The ``scenario`` section is an extraordinary section of the config file
that is strongly connected to the :ref:`wildcards` and is designed to
facilitate running multiple scenarios through a single command

.. code:: bash
.. code:: console
# for electricity-only studies
snakemake -call solve_elec_networks
$ snakemake solve_elec_networks
# for sector-coupling studies
snakemake -call solve_sector_networks
$ snakemake solve_sector_networks
For each wildcard, a **list of values** is provided. The rule
``solve_all_elec_networks`` will trigger the rules for creating
Expand Down
40 changes: 21 additions & 19 deletions doc/installation.rst
Original file line number Diff line number Diff line change
Expand Up @@ -17,11 +17,9 @@ Clone the Repository

First of all, clone the `PyPSA-Eur repository <https://github.com/PyPSA/pypsa-eur>`__ using the version control system ``git`` in the command line.

.. code:: bash
.. code:: console
/some/other/path % cd /some/path
/some/path % git clone https://github.com/PyPSA/pypsa-eur.git
$ git clone https://github.com/PyPSA/pypsa-eur.git
.. _deps:
Expand All @@ -30,27 +28,31 @@ Install Python Dependencies
===============================

PyPSA-Eur relies on a set of other Python packages to function.
We recommend using the package manager `mamba <https://mamba.readthedocs.io/en/latest/>`__ to install them and manage your environments.
For instructions for your operating system follow the ``mamba`` `installation guide <https://mamba.readthedocs.io/en/latest/installation/mamba-installation.html>`__.
We recommend using the package manager `mamba <https://mamba.readthedocs.io/en/latest/>`__
to install them and manage your environments. For instructions for your operating
system follow the ``mamba`` `installation guide <https://mamba.readthedocs.io/en/latest/installation/mamba-installation.html>`__.
You can also use ``conda`` equivalently.

The package requirements are curated in the `envs/environment.yaml <https://github.com/PyPSA/pypsa-eur/blob/master/envs/environment.yaml>`__ file.
The environment can be installed and activated using
The package requirements are curated in the ``envs/environment.yaml`` file.
There are also regularly updated pinned environment files for each operating system to
ensure reproducibility (``envs/windows-pinned.yaml``, ``envs/linux-pinned.yaml``, ``envs/macos-pinned.yaml``).
We recommend using the pinned files for a stable environment, but you can also use
the unpinned file.

.. code:: bash
.. code:: console
.../pypsa-eur % mamba env create -f envs/environment.yaml
$ mamba env create -f envs/linux-pinned.yaml # replace for your os
.../pypsa-eur % mamba activate pypsa-eur
$ mamba activate pypsa-eur
.. note::
The equivalent commands for ``conda`` would be

.. code:: bash
.. code:: console
.../pypsa-eur % conda env create -f envs/environment.yaml
$ conda env create -f envs/linux-pinned.yaml # replace for your os
.../pypsa-eur % conda activate pypsa-eur
$ conda activate pypsa-eur
Install a Solver
Expand Down Expand Up @@ -84,10 +86,10 @@ Nevertheless, you can still use open-source solvers for smaller problems.
For an open-source solver setup install for example HiGHS **and** SCIP in your ``conda`` environment on OSX/Linux.
To install the default solver Gurobi, run

.. code:: bash
.. code:: console
mamba activate pypsa-eur
mamba install -c gurobi gurobi
$ mamba activate pypsa-eur
$ mamba install -c gurobi gurobi
Additionally, you need to setup your `Gurobi license <https://www.gurobi.com/solutions/licensing/>`__.

Expand All @@ -105,6 +107,6 @@ the configuration options are in :ref:`config`.
You can also use ``snakemake`` to specify another file, e.g.
``config/config.mymodifications.yaml``, to update the settings of the ``config/config.yaml``.

.. code:: bash
.. code:: console
.../pypsa-eur % snakemake -call --configfile config/config.mymodifications.yaml
$ snakemake --configfile config/config.mymodifications.yaml
4 changes: 2 additions & 2 deletions doc/introduction.rst
Original file line number Diff line number Diff line change
Expand Up @@ -33,9 +33,9 @@ scripts were modified.

For instance, an invocation to

.. code:: bash
.. code:: console
.../pypsa-eur % snakemake -call results/networks/base_s_128_elec_lvopt_.nc
$ snakemake results/networks/base_s_128_elec_lvopt_.nc
follows this dependency graph

Expand Down
12 changes: 12 additions & 0 deletions doc/release_notes.rst
Original file line number Diff line number Diff line change
Expand Up @@ -11,6 +11,18 @@ Release Notes
Upcoming Release
================

* Feature: Allow CHPs to use different fuel sources such as gas, oil, coal, and methanol. Note that the cost assumptions are based on a gas CHP.

* Improve the ``sanitize_carrier`` function by filling in missing carrier colors with the colors mapped after applying the ``rename_techs`` function.

* Bugfix: Adjusted efficiency2 (to atmosphere) for bioliquids-to-oil Link in ``prepare_sector_network`` to exactly offset the corresponding oil emissions.

* Bugfix: Waste CHPs were added to all electricity buses even if they were not connected to a heating network. This is now fixed.

* Bugfix: Duplicates in ``build_transmission_projects`` were caught, but not removed from the network. This is now fixed.

* Replaced the Store representation of biogenic carriers (solid biomass, biogas, bioliquids, MSW) in ``prepare_sector_network`` with the extended Generator component that uses the ``e_sum_min`` and ``e_sum_max`` attributes to enforce minimum usage and limit maximum potential, respectively.

* Added option to reduce central heating forward temperatures by annual percentage (see rule :mod:`build_central_heating_temperature_profiles`). This makes COP profiles and heat pump efficiencies planning-horizon-dependent. Myopic and perfect foresight modes were adjusted accordingly to update COPs of existing heat pumps in preceding years to adjusted temperatures.

* Rearranged workflow to cluster the electricity network before calculating
Expand Down
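The Store-to-Generator change in the release notes above relies on the Generator attributes `e_sum_min` and `e_sum_max`, which bound the energy a generator supplies summed over all snapshots. Their semantics can be illustrated in plain Python (a sketch of the constraint's meaning, not the solver formulation):

```python
# e_sum_min / e_sum_max semantics: total dispatched energy over the
# horizon must lie within [e_sum_min, e_sum_max].
def within_energy_limits(dispatch, e_sum_min=0.0, e_sum_max=float("inf")):
    total = sum(dispatch)  # summed energy over all snapshots
    return e_sum_min <= total <= e_sum_max

biogas = [10.0, 20.0, 15.0]  # MWh per snapshot (illustrative numbers)
print(within_energy_limits(biogas, e_sum_min=30.0, e_sum_max=50.0))  # -> True (total 45)
```

Setting `e_sum_min` enforces minimum usage (e.g. mandated biofuel uptake), while `e_sum_max` limits the annual potential of the biogenic carrier.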