
update readme files #228

Merged (5 commits) on Mar 22, 2024
20 changes: 0 additions & 20 deletions .github/next_steps/03_readthedocs.md

This file was deleted.

60 changes: 31 additions & 29 deletions README.dev.md
@@ -32,24 +32,26 @@ install-nplinker-deps

Afterwards, check that the installation directory is present in the `PATH` environment variable.
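As a sketch of that check, the snippet below tests whether a directory is an entry of `PATH`. The `~/.local/bin` location is only an assumption about where pip placed the console scripts; adjust it to your setup.

```python
import os
from typing import Optional

def on_path(directory: str, path: Optional[str] = None) -> bool:
    """Return True if `directory` is an entry of the given (or current) PATH."""
    entries = (path if path is not None else os.environ.get("PATH", "")).split(os.pathsep)
    return directory in entries

# ~/.local/bin is a common pip install location (assumption; adjust to your setup)
print(on_path(os.path.expanduser("~/.local/bin")))
```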

You can also use [conda](https://docs.conda.io/projects/conda/en/stable/) to manage Python environments.

## Running the tests

**Run unit tests with**
```shell
pytest
# or run them in parallel
pytest -n auto tests/unit
```
Parallel testing is supported by the `pytest-xdist` plugin. To run tests in parallel, use the `-n`
option, e.g. `-n auto` to use as many workers as there are available CPUs.

**Run integration tests with**
```shell
pytest -n 0 tests/integration
```
`-n 0` disables parallel testing.
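To illustrate the layout the unit tests assume, a minimal pytest-style test module might look like the sketch below. The helper function and file name are hypothetical, not part of nplinker.

```python
# tests/unit/test_example.py (hypothetical file name)

def merge_strain_counts(a: dict, b: dict) -> dict:
    """Toy helper: merge two strain -> count mappings by summing counts."""
    merged = dict(a)
    for strain, count in b.items():
        merged[strain] = merged.get(strain, 0) + count
    return merged

def test_merge_strain_counts():
    # pytest discovers test_* functions automatically; `pytest -n auto` can
    # distribute independent tests like this one across CPU workers
    assert merge_strain_counts({"s1": 2}, {"s2": 3}) == {"s1": 2, "s2": 3}
    assert merge_strain_counts({"s1": 2}, {"s1": 1}) == {"s1": 3}
```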

### Test coverage

@@ -117,35 +119,35 @@ For more info about static typing and mypy, see:
- [Static typing with Python](https://typing.readthedocs.io/en/latest/index.html#)
- [Mypy doc](https://mypy.readthedocs.io/en/stable/)
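To illustrate the kind of error mypy catches, consider this annotated helper; the function is hypothetical, not part of the nplinker code base.

```python
from typing import Sequence

def mean_intensity(values: Sequence[float]) -> float:
    """Return the arithmetic mean of spectrum intensities."""
    if not values:
        raise ValueError("values must be non-empty")
    return sum(values) / len(values)

ok = mean_intensity([1.0, 2.0, 3.0])   # type-checks: Sequence[float] -> float
# bad = mean_intensity(["a", "b"])     # mypy error: list[str] is not Sequence[float]
```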

## Docs
We use [MkDocs](https://www.mkdocs.org/) and its theme [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/)
to generate the documentation. MkDocs is configured in the [mkdocs.yml](mkdocs.yml) file.

To watch doc changes in real time, run:
```shell
mkdocs serve
# or to watch both the src and docs directories
mkdocs serve -w docs -w src
```
Then open your browser and go to [http://127.0.0.1:8000/](http://127.0.0.1:8000/).

### Publishing the docs
The docs are published on GitHub Pages. We use [mike](https://github.com/jimporter/mike)
to deploy the docs to the `gh-pages` branch and to manage the versions of the docs.

For example, to deploy version 2.0 of the docs to the `gh-pages` branch and make it the latest
version, run:
```shell
mike deploy -p -u 2.0 latest
```
If you are not happy with the changes, you can run `mike delete [version]`.
All these mike operations are recorded as git commits on the `gh-pages` branch.

`mike serve` shows all versions committed to the `gh-pages` branch, i.e. it previews the
production website. If you have changes that are not committed yet, use `mkdocs serve`
instead of `mike serve` to preview them.

## Versioning

33 changes: 15 additions & 18 deletions README.md
@@ -6,10 +6,10 @@
| (2/5) license | [![github license badge](https://img.shields.io/github/license/NPLinker/nplinker)](https://github.com/NPLinker/nplinker) |
| (3/5) community registry | [![pypi badge](https://img.shields.io/pypi/v/nplinker.svg?color=blue)](https://pypi.python.org/project/nplinker/) ![Docker Image Version (latest by date)](https://img.shields.io/docker/v/nlesc/nplinker?arch=amd64&label=docker&sort=date)|
| (4/5) citation | [![Zenodo](https://zenodo.org/badge/DOI/10.5281/zenodo.4680218.svg)](https://doi.org/10.5281/zenodo.4680218) |
| (5/5) checklist | ![Static Badge](https://img.shields.io/badge/Coming_Soon-lightgrey) |
| how FAIR is | [![fair-software badge](https://img.shields.io/badge/fair--software.eu-%E2%97%8F%20%20%E2%97%8F%20%20%E2%97%8F%20%20%E2%97%8F%20%20%E2%97%8B-yellow)](https://fair-software.eu) |
| **Other best practices** | |
| Documentation | [![Static Badge](https://img.shields.io/badge/Docs_Available-lightgreen)](https://nplinker.github.io/nplinker) [🔗](https://nplinker.github.io/nplinker)|
| Build & Test | [![build](https://github.com/NPLinker/nplinker/actions/workflows/build.yml/badge.svg)](https://github.com/NPLinker/nplinker/actions/workflows/build.yml) |
| Static analysis | [![workflow scq badge](https://sonarcloud.io/api/project_badges/measure?project=NPLinker_nplinker&metric=alert_status)](https://sonarcloud.io/dashboard?id=NPLinker_nplinker) |
| Coverage | [![workflow scc badge](https://sonarcloud.io/api/project_badges/measure?project=NPLinker_nplinker&metric=coverage)](https://sonarcloud.io/dashboard?id=NPLinker_nplinker) |
@@ -18,21 +18,21 @@


# Natural Products Linker (NPLinker)
NPLinker is a Python framework for data mining microbial natural products by integrating genomics and metabolomics data.

NPLinker aims to address a significant bottleneck in realizing the potential of genome-led metabolite discovery: the slow, manual matching of predicted biosynthetic gene clusters (BGCs) with metabolites produced during bacterial culture, i.e. linking phenotype to genotype.

NPLinker implements a data-centric approach to alleviate this linking problem by searching for patterns of strain presence and absence between groups of similar spectra (molecular families, MFs) and groups of similar BGCs (gene cluster families, GCFs). Searches can be performed using a number of available analysis methods, employed in isolation or together.
Original paper: [Ranking microbial metabolomic and genomic links in the NPLinker framework using complementary scoring functions](https://doi.org/10.1371/journal.pcbi.1008920).
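The strain presence/absence idea described above can be sketched with a toy co-occurrence score. This is an illustrative simplification only, not NPLinker's actual Metcalf scoring implementation.

```python
# Toy sketch of strain co-occurrence scoring; illustrative only, NOT the
# real NPLinker/Metcalf implementation.
def copresence_score(gcf_strains: set, mf_strains: set) -> int:
    both = len(gcf_strains & mf_strains)      # strain has both BGC and spectrum
    only_gcf = len(gcf_strains - mf_strains)  # BGC present, spectrum absent
    only_mf = len(mf_strains - gcf_strains)   # spectrum present, BGC absent
    # reward co-presence, penalize presence in only one data type
    return 10 * both - only_gcf - only_mf

print(copresence_score({"s1", "s2"}, {"s1", "s2", "s3"}))  # 10*2 - 0 - 1 = 19
```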

### Requirement
- Linux, MacOS, or Windows with WSL ([Windows Subsystem for Linux](https://learn.microsoft.com/en-us/windows/wsl/about)) enabled
- For Windows without WSL, please use the [docker image](https://hub.docker.com/r/nlesc/nplinker)
- Python version ≥3.9

## Setup and usage

### Installation
NPLinker is a Python package; you can install it as follows:
```shell
# create a new virtual environment
python -m venv env
source env/bin/activate
pip install nplinker
install-nplinker-deps
```

A virtual environment is *required* to install the non-PyPI dependencies. You can also use `conda`
to manage Python environments.

### Usage

If you want to visualize and manipulate NPLinker predictions, check out the [NPLinker Webapp](https://github.com/NPLinker/nplinker-webapp).
See the [documentation](https://nplinker.github.io/nplinker) for more information about how to use NPLinker.

## Contributing

3 changes: 1 addition & 2 deletions bin/install-nplinker-deps
@@ -150,8 +150,7 @@ echo "🔥 Start installing FastTree ..."
fi
echo -e "✅ FastTree installed successfully\n"

#--- Install Hmmer (required by BigScape)
if ! command -v hmmpress &> /dev/null; then
install_it hmmer
fi
28 changes: 4 additions & 24 deletions project_setup.md
@@ -7,19 +7,6 @@ text when the development of the software package takes off.
For a quick reference on software development, we refer to [the software guide
checklist](https://guide.esciencecenter.nl/#/best_practices/checklist).

## Python versions

This repository is set up with Python versions:

- 3.6
- 3.7
- 3.8
- 3.9


Add or remove Python versions based on project requirements. See [the
guide](https://guide.esciencecenter.nl/#/best_practices/language_guides/python) for more information about Python
versions.

## Package management and dependencies

@@ -43,21 +30,15 @@ help you decide which tool to use for packaging.
- The testing framework used is [PyTest](https://pytest.org)
- [PyTest introduction](https://pythontest.com/pytest-book/)
- PyTest is listed as a development dependency
- This is configured in `pyproject.toml`
- The project uses [GitHub action workflows](https://docs.github.com/en/actions) to automatically run tests on GitHub infrastructure against multiple Python versions
- Workflows can be found in [`.github/workflows`](.github/workflows/)
- [Relevant section in the guide](https://guide.esciencecenter.nl/#/best_practices/language_guides/python?id=testing)

## Documentation

- Documentation should be put in the [`docs/`](docs/) directory.
- We recommend writing the documentation using [MkDocs](https://www.mkdocs.org/) and [Google style docstrings](http://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html).
- [Relevant section in the guide](https://guide.esciencecenter.nl/#/best_practices/language_guides/python?id=writingdocumentation)
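A Google-style docstring, as recommended above, looks like the hypothetical example below; doc generators built on MkDocs (e.g. the mkdocstrings plugin, assuming it is enabled) can render it into an API reference.

```python
def score_link(gcf_id: str, mf_id: str) -> float:
    """Compute a toy link score between a GCF and a molecular family.

    Args:
        gcf_id: Identifier of the gene cluster family.
        mf_id: Identifier of the molecular family.

    Returns:
        A score in the range [0, 1); here just a deterministic placeholder.
    """
    return (hash((gcf_id, mf_id)) % 100) / 100.0
```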

## Coding style conventions and code quality
@@ -69,7 +50,7 @@ help you decide which tool to use for packaging.
[Sonarcloud](https://sonarcloud.io/) is used to perform quality analysis and code coverage report

- `sonar-project.properties` is the SonarCloud [configuration](https://docs.sonarqube.org/latest/analysis/analysis-parameters/) file
- `.github/workflows/sonar-cloud.yml` is the GitHub action workflow which performs the SonarCloud analysis

## Package version number

@@ -80,7 +61,6 @@
## Logging

- We recommend using the logging module for getting useful information from your module (instead of using print).
- [Relevant section in the guide](https://guide.esciencecenter.nl/#/best_practices/language_guides/python?id=logging)
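A minimal sketch of module-level logging; the function and path names are illustrative only.

```python
import logging

logger = logging.getLogger(__name__)  # one logger per module, named after it

def load_dataset(path: str) -> None:
    # lazy %-style formatting: the message is only built if the level is enabled
    logger.info("Loading dataset from %s", path)

if __name__ == "__main__":
    # configure handlers in the application entry point, not in library code
    logging.basicConfig(level=logging.INFO)
    load_dataset("data/example.json")
```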

## CHANGELOG.md