Add motus preplong for long reads profiling #559

Draft
wants to merge 45 commits into base: crazy-corgi

Changes from all commits (45 commits)
79d1535
Template update for nf-core/tools version 3.0.0
nf-core-bot Oct 8, 2024
56e023f
Template update for nf-core/tools version 3.0.1
nf-core-bot Oct 9, 2024
69969cf
Template update for nf-core/tools version 3.0.2
nf-core-bot Oct 11, 2024
226abd9
resolve the conflicts
LilyAnderssonLee Oct 17, 2024
ff4d28f
remove max_cpu from test.config
LilyAnderssonLee Oct 17, 2024
f3ef6aa
update schema with new release
LilyAnderssonLee Oct 18, 2024
5e90741
add exists:true to the schema to check the file path
LilyAnderssonLee Oct 18, 2024
a9af810
change db_path format back to the original value path in schema_database
LilyAnderssonLee Oct 18, 2024
9e20ff1
update the type of run_accession into string and integer
LilyAnderssonLee Oct 22, 2024
88c9f46
Update CITATIONS.md
LilyAnderssonLee Oct 22, 2024
2900ae2
Update conf/modules.config
LilyAnderssonLee Oct 22, 2024
b0c1154
address comments from James
LilyAnderssonLee Oct 22, 2024
340b0af
Update nextflow.config
LilyAnderssonLee Oct 22, 2024
05b743f
Add Lili as co-author
LilyAnderssonLee Oct 22, 2024
fb38fb7
Fix CI to run each test separately
jfy133 Oct 24, 2024
353188d
Update main.nf
LilyAnderssonLee Oct 24, 2024
f7c6176
Update .nf-core.yml
LilyAnderssonLee Oct 24, 2024
dbabb5d
[automated] Fix code linting
nf-core-bot Oct 24, 2024
fe84cc3
Fix mOTUs if/else statement
jfy133 Oct 24, 2024
00067f5
update configs and database schema
LilyAnderssonLee Oct 24, 2024
e34437b
Merge pull request #545 from nf-core/nf-core-template-merge-3.0.2
LilyAnderssonLee Oct 24, 2024
8262e0a
Add darkmode version of metromap
jfy133 Oct 24, 2024
93aa499
Merge pull request #548 from nf-core/darkmode-diagram
jfy133 Oct 24, 2024
6c6cd8c
Fix darkmode diagram
jfy133 Oct 31, 2024
2e6f37d
refactor: prepare the additional prefixes input
Midnighter Oct 31, 2024
a1a8289
chore: update krakenuniq/preloadedkrakenuniq
Midnighter Nov 5, 2024
c0311a1
fix: create always unique prefix
Midnighter Nov 6, 2024
30fc22a
fix: make prefix unique across PE and SE
Midnighter Nov 21, 2024
163d600
Update CHANGELOG.md
jfy133 Nov 21, 2024
b2cac44
Merge pull request #549 from nf-core/darkmode-diagram
jfy133 Nov 21, 2024
3022cd2
Update CHANGELOG.md
jfy133 Nov 21, 2024
e58a59c
Merge pull request #550 from nf-core/update-krakenuniq
jfy133 Nov 21, 2024
51b8acb
update nonpareil/curve
LilyAnderssonLee Nov 22, 2024
4da7342
update CHANGELOG.md
LilyAnderssonLee Nov 22, 2024
7d261e8
Merge pull request #552 from nf-core/nonpareil_curve_update
LilyAnderssonLee Nov 22, 2024
716dd78
bump the version
LilyAnderssonLee Nov 25, 2024
42479b0
update changelog
LilyAnderssonLee Nov 25, 2024
1a27aaa
Merge pull request #553 from nf-core/pre_patch_bump_v
LilyAnderssonLee Nov 25, 2024
05e4f2e
Move the modules.config line below the base.config line
LilyAnderssonLee Nov 28, 2024
2a64aed
update configs
LilyAnderssonLee Nov 28, 2024
acd85cc
Merge pull request #556 from nf-core/patch_release_fix
LilyAnderssonLee Nov 28, 2024
0af416a
Merge pull request #554 from nf-core/dev
LilyAnderssonLee Nov 28, 2024
803f928
Bump the version after patch release v1.2.1
LilyAnderssonLee Nov 28, 2024
79055cd
Merge pull request #557 from LilyAnderssonLee/post_release_bump
jfy133 Nov 28, 2024
047808a
add motus preplong for long reads
LilyAnderssonLee Dec 12, 2024
12 changes: 6 additions & 6 deletions .github/CONTRIBUTING.md
@@ -19,7 +19,7 @@ If you'd like to write some code for nf-core/taxprofiler, the standard workflow
1. Check that there isn't already an issue about your idea in the [nf-core/taxprofiler issues](https://github.com/nf-core/taxprofiler/issues) to avoid duplicating work. If there isn't one already, please create one so that others know you're working on this
2. [Fork](https://help.github.com/en/github/getting-started-with-github/fork-a-repo) the [nf-core/taxprofiler repository](https://github.com/nf-core/taxprofiler) to your GitHub account
3. Make the necessary changes / additions within your forked repository following [Pipeline conventions](#pipeline-contribution-conventions)
4. Use `nf-core schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
4. Use `nf-core pipelines schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
5. Submit a Pull Request against the `dev` branch and wait for the code to be reviewed and merged

If you're not used to this workflow with git, you can start with some [docs from GitHub](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests) or even their [excellent `git` resources](https://try.github.io/).
@@ -40,7 +40,7 @@ There are typically two types of tests that run:
### Lint tests

`nf-core` has a [set of guidelines](https://nf-co.re/developers/guidelines) which all pipelines must adhere to.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core lint <pipeline-directory>` command.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core pipelines lint <pipeline-directory>` command.

If any failures or warnings are encountered, please follow the listed URL for more documentation.

@@ -75,7 +75,7 @@ If you wish to contribute a new step, please use the following coding standards:
2. Write the process block (see below).
3. Define the output channel if needed (see below).
4. Add any new parameters to `nextflow.config` with a default (see below).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core schema build` tool).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core pipelines schema build` tool).
6. Add sanity checks and validation for all relevant parameters.
7. Perform local tests to validate that the new code works as expected.
8. If applicable, add a new test command in `.github/workflow/ci.yml`.
@@ -86,11 +86,11 @@ If you wish to contribute a new step, please use the following coding standards:

Parameters should be initialised / defined with default values in `nextflow.config` under the `params` scope.

Once there, use `nf-core schema build` to add to `nextflow_schema.json`.
Once there, use `nf-core pipelines schema build` to add to `nextflow_schema.json`.

### Default processes resource requirements

Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generic with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. A nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/master/nf_core/pipeline-template/conf/base.config), which has the default process as a single core-process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.
Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generic with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. A nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/main/nf_core/pipeline-template/conf/base.config), which has the default process as a single core-process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.

The process resources can be passed on to the tool dynamically within the process with the `${task.cpus}` and `${task.memory}` variables in the `script:` block.

@@ -103,7 +103,7 @@ Please use the following naming schemes, to make it easy to understand what is g

### Nextflow version bumping

If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core bump-version --nextflow . [min-nf-version]`
If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core pipelines bump-version --nextflow . [min-nf-version]`

### Images and figures

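For quick reference, a minimal shell sketch of the renamed nf-core/tools 3.x subcommands this file now points to, run from the root of a pipeline clone. The version pin and the example Nextflow version are illustrative assumptions, not part of this diff:

```bash
# Install a 3.x release of nf-core/tools that provides the `nf-core pipelines ...` commands.
pip install "nf-core>=3.0.2"

# Add new parameters to nextflow.config first, then regenerate nextflow_schema.json interactively.
nf-core pipelines schema build

# Lint the pipeline against the nf-core guidelines.
nf-core pipelines lint --dir .

# Bump the minimum required Nextflow version when a newer core feature is used.
nf-core pipelines bump-version --nextflow . 24.04.2
```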
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -17,7 +17,7 @@ Learn more about contributing: [CONTRIBUTING.md](https://github.com/nf-core/taxp
- [ ] If you've fixed a bug or added code that should be tested, add tests!
- [ ] If you've added a new tool - have you followed the pipeline conventions in the [contribution docs](https://github.com/nf-core/taxprofiler/tree/master/.github/CONTRIBUTING.md)
- [ ] If necessary, also make a PR on the nf-core/taxprofiler _branch_ on the [nf-core/test-datasets](https://github.com/nf-core/test-datasets) repository.
- [ ] Make sure your code lints (`nf-core lint`).
- [ ] Make sure your code lints (`nf-core pipelines lint`).
- [ ] Ensure the test suite passes (`nextflow run . -profile test,docker --outdir <OUTDIR>`).
- [ ] Check for unexpected warnings in debug mode (`nextflow run . -profile debug,test,docker --outdir <OUTDIR>`).
- [ ] Usage Documentation in `docs/usage.md` is updated.
25 changes: 21 additions & 4 deletions .github/workflows/awsfulltest.yml
@@ -1,18 +1,35 @@
name: nf-core AWS full size tests
# This workflow is triggered on published releases.
# This workflow is triggered on PRs opened against the master branch.
# It can be additionally triggered manually with GitHub actions workflow dispatch button.
# It runs the -profile 'test_full' on AWS batch

on:
release:
types: [published]
pull_request:
branches:
- master
workflow_dispatch:
pull_request_review:
types: [submitted]

jobs:
run-platform:
name: Run AWS full tests
if: github.repository == 'nf-core/taxprofiler'
# run only if the PR is approved by at least 2 reviewers and against the master branch or manually triggered
if: github.repository == 'nf-core/taxprofiler' && github.event.review.state == 'approved' && github.event.pull_request.base.ref == 'master' || github.event_name == 'workflow_dispatch'
runs-on: ubuntu-latest
steps:
- uses: octokit/request-action@v2.x
id: check_approvals
with:
route: GET /repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}/reviews
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- id: test_variables
if: github.event_name != 'workflow_dispatch'
run: |
JSON_RESPONSE='${{ steps.check_approvals.outputs.data }}'
CURRENT_APPROVALS_COUNT=$(echo $JSON_RESPONSE | jq -c '[.[] | select(.state | contains("APPROVED")) ] | length')
test $CURRENT_APPROVALS_COUNT -ge 2 || exit 1 # At least 2 approvals are required
- name: Launch workflow via Seqera Platform
uses: seqeralabs/action-tower-launch@v2
with:
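To make the new approval gate easier to follow, here is a small local sketch of the same jq count; the review JSON is fabricated sample data rather than a real API response:

```bash
# Stand-in for the GET /repos/.../pulls/.../reviews response fetched by octokit/request-action.
JSON_RESPONSE='[{"state":"APPROVED"},{"state":"CHANGES_REQUESTED"},{"state":"APPROVED"}]'

# Count only the reviews whose state contains "APPROVED".
CURRENT_APPROVALS_COUNT=$(echo "$JSON_RESPONSE" | jq -c '[.[] | select(.state | contains("APPROVED"))] | length')
echo "Approvals: $CURRENT_APPROVALS_COUNT"   # prints 2 for the sample above

# Fail (exit 1) unless at least 2 approvals were found, as the workflow step does.
test "$CURRENT_APPROVALS_COUNT" -ge 2 || exit 1
```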
63 changes: 47 additions & 16 deletions .github/workflows/ci.yml
@@ -7,27 +7,34 @@ on:
pull_request:
release:
types: [published]
workflow_dispatch:

env:
NXF_ANSI_LOG: false
NXF_SINGULARITY_CACHEDIR: ${{ github.workspace }}/.singularity
NXF_SINGULARITY_LIBRARYDIR: ${{ github.workspace }}/.singularity

concurrency:
group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"
cancel-in-progress: true

jobs:
test:
name: Run pipeline with test data
name: "Run pipeline with test data (${{ matrix.NXF_VER }} | ${{ matrix.test_name }} | ${{ matrix.profile }})"
# Only run on push if this is the nf-core dev branch (merged PRs)
if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/taxprofiler') }}"
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
NXF_VER:
- "23.04.0"
- "24.04.2"
- "latest-everything"
tags:
profile:
- "conda"
- "docker"
- "singularity"
test_name:
- "test"
- "test_nopreprocessing"
- "test_noprofiling"
@@ -39,37 +39,61 @@ jobs:
- "test_alternativepreprocessing"
- "test_bbduk"
- "test_prinseqplusplus"

isMaster:
- ${{ github.base_ref == 'master' }}
# Exclude conda and singularity on dev
exclude:
- isMaster: false
profile: "conda"
- isMaster: false
profile: "singularity"
steps:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4

- name: Install Nextflow
- name: Set up Nextflow
uses: nf-core/setup-nextflow@v2
with:
version: "${{ matrix.NXF_VER }}"

- name: Disk space cleanup
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1
- name: Set up Apptainer
if: matrix.profile == 'singularity'
uses: eWaterCycle/setup-apptainer@main

- name: Run pipeline with test data
# TODO nf-core: You can customise CI pipeline run tests as required
# For example: adding multiple test runs with different parameters
# Remember that you can parallelise this by using strategy.matrix
- name: Set up Singularity
if: matrix.profile == 'singularity'
run: |
sudo locale-gen en_US.UTF-8
sudo update-locale LANG=en_US.UTF-8
mkdir -p $NXF_SINGULARITY_CACHEDIR
mkdir -p $NXF_SINGULARITY_LIBRARYDIR

- name: Set up Miniconda
if: matrix.profile == 'conda'
uses: conda-incubator/setup-miniconda@a4260408e20b96e80095f42ff7f1a15b27dd94ca # v3
with:
miniconda-version: "latest"
auto-update-conda: true
conda-solver: libmamba
channels: conda-forge,bioconda

- name: Set up Conda
if: matrix.profile == 'conda'
run: |
echo $(realpath $CONDA)/condabin >> $GITHUB_PATH
echo $(realpath python) >> $GITHUB_PATH

- name: Clean up Disk space
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1

- name: Run pipeline with test data
run: |
if [[ "${{ matrix.tags }}" == "test_motus" ]]; then
if [[ "${{ matrix.test_name }}" == "test_motus" ]]; then
wget https://raw.githubusercontent.com/motu-tool/mOTUs/master/motus/downloadDB.py
python downloadDB.py --no-download-progress
echo 'tool,db_name,db_params,db_type,db_path' > 'database_motus.csv'
echo "motus,db1_mOTU,,short,db_mOTU" >> 'database_motus.csv'
echo "motus,db2_mOTU,,long,db_mOTU" >> 'database_motus.csv'
echo "motus,db3_mOTU,,short;long,db_mOTU" >> 'database_motus.csv'
nextflow run ${GITHUB_WORKSPACE} -profile docker,${{ matrix.tags }} --databases ./database_motus.csv --outdir ./results_${{ matrix.tags }};
nextflow run ${GITHUB_WORKSPACE} -profile docker,${{ matrix.test_name }} --databases ./database_motus.csv --outdir ./results_${{ matrix.test_name }};
else
nextflow run ${GITHUB_WORKSPACE} -profile docker,${{ matrix.tags }} --outdir ./results_${{ matrix.tags }};
nextflow run ${GITHUB_WORKSPACE} -profile docker,${{ matrix.test_name }} --outdir ./results_${{ matrix.test_name }};
fi
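Because this PR introduces the `long` db_type for mOTUs, a rough local replay of the CI step above may help reviewers test it. It assumes a local clone of the pipeline, Docker, and enough disk space for the mOTUs database; the download URL points at the canonical location of the same downloadDB.py helper fetched in the CI script:

```bash
# Fetch the mOTUs database with the helper script used in CI (large download).
wget https://raw.githubusercontent.com/motu-tool/mOTUs/master/motus/downloadDB.py
python downloadDB.py --no-download-progress

# Declare the same database once per read type: short, long, and both.
cat > database_motus.csv <<'EOF'
tool,db_name,db_params,db_type,db_path
motus,db1_mOTU,,short,db_mOTU
motus,db2_mOTU,,long,db_mOTU
motus,db3_mOTU,,short;long,db_mOTU
EOF

# Run the mOTUs test profile from the pipeline clone, as the CI matrix does.
nextflow run . -profile docker,test_motus \
    --databases ./database_motus.csv \
    --outdir ./results_test_motus
```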
53 changes: 43 additions & 10 deletions .github/workflows/download_pipeline.yml
@@ -1,4 +1,4 @@
name: Test successful pipeline download with 'nf-core download'
name: Test successful pipeline download with 'nf-core pipelines download'

# Run the workflow when:
# - dispatched manually
@@ -8,7 +8,7 @@ on:
workflow_dispatch:
inputs:
testbranch:
description: "The specific branch you wish to utilize for the test execution of nf-core download."
description: "The specific branch you wish to utilize for the test execution of nf-core pipelines download."
required: true
default: "dev"
pull_request:
@@ -39,9 +39,11 @@ jobs:
with:
python-version: "3.12"
architecture: "x64"
- uses: eWaterCycle/setup-singularity@931d4e31109e875b13309ae1d07c70ca8fbc8537 # v7

- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@4bb22c52d4f63406c49e94c804632975787312b3 # v2.0.0
with:
singularity-version: 3.8.3
apptainer-version: 1.3.4

- name: Install dependencies
run: |
@@ -54,33 +56,64 @@
echo "REPOTITLE_LOWERCASE=$(basename ${GITHUB_REPOSITORY,,})" >> ${GITHUB_ENV}
echo "REPO_BRANCH=${{ github.event.inputs.testbranch || 'dev' }}" >> ${GITHUB_ENV}

- name: Make a cache directory for the container images
run: |
mkdir -p ./singularity_container_images

- name: Download the pipeline
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
run: |
nf-core download ${{ env.REPO_LOWERCASE }} \
nf-core pipelines download ${{ env.REPO_LOWERCASE }} \
--revision ${{ env.REPO_BRANCH }} \
--outdir ./${{ env.REPOTITLE_LOWERCASE }} \
--compress "none" \
--container-system 'singularity' \
--container-library "quay.io" -l "docker.io" -l "ghcr.io" \
--container-library "quay.io" -l "docker.io" -l "community.wave.seqera.io" \
--container-cache-utilisation 'amend' \
--download-configuration
--download-configuration 'yes'

- name: Inspect download
run: tree ./${{ env.REPOTITLE_LOWERCASE }}

- name: Count the downloaded number of container images
id: count_initial
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Initial container image count: $image_count"
echo "IMAGE_COUNT_INITIAL=$image_count" >> ${GITHUB_ENV}

- name: Run the downloaded pipeline (stub)
id: stub_run_pipeline
continue-on-error: true
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
- name: Run the downloaded pipeline (stub run not supported)
id: run_pipeline
if: ${{ job.steps.stub_run_pipeline.status == failure() }}
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -profile test,singularity --outdir ./results

- name: Count the downloaded number of container images
id: count_afterwards
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Post-pipeline run container image count: $image_count"
echo "IMAGE_COUNT_AFTER=$image_count" >> ${GITHUB_ENV}

- name: Compare container image counts
run: |
if [ "${{ env.IMAGE_COUNT_INITIAL }}" -ne "${{ env.IMAGE_COUNT_AFTER }}" ]; then
initial_count=${{ env.IMAGE_COUNT_INITIAL }}
final_count=${{ env.IMAGE_COUNT_AFTER }}
difference=$((final_count - initial_count))
echo "$difference additional container images were \n downloaded at runtime . The pipeline has no support for offline runs!"
tree ./singularity_container_images
exit 1
else
echo "The pipeline can be downloaded successfully!"
fi
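The image-count comparison above is the offline-readiness check; a condensed local sketch of the same idea follows. The pipeline path and cache directory are assumptions modelled on the workflow, not exact commands from it:

```bash
# Count cached Singularity/Apptainer images before and after a stub run; any increase
# means containers were pulled at runtime and the download is not offline-complete.
export NXF_SINGULARITY_CACHEDIR=./singularity_container_images
mkdir -p "$NXF_SINGULARITY_CACHEDIR"

initial_count=$(ls -1 "$NXF_SINGULARITY_CACHEDIR" | wc -l)

nextflow run ./taxprofiler/dev -stub -profile test,singularity --outdir ./results

final_count=$(ls -1 "$NXF_SINGULARITY_CACHEDIR" | wc -l)

if [ "$initial_count" -ne "$final_count" ]; then
    echo "$((final_count - initial_count)) additional container images were downloaded at runtime. The pipeline has no support for offline runs!"
    exit 1
else
    echo "The pipeline can be downloaded successfully!"
fi
```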
23 changes: 19 additions & 4 deletions .github/workflows/linting.yml
@@ -1,6 +1,6 @@
name: nf-core linting
# This workflow is triggered on pushes and PRs to the repository.
# It runs the `nf-core lint` and markdown lint tests to ensure
# It runs the `nf-core pipelines lint` and markdown lint tests to ensure
# that the code meets the nf-core guidelines.
on:
push:
@@ -41,17 +41,32 @@ jobs:
python-version: "3.12"
architecture: "x64"

- name: read .nf-core.yml
uses: pietrobolcato/action-read-yaml@1.1.0
id: read_yml
with:
config: ${{ github.workspace }}/.nf-core.yml

- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install nf-core
pip install nf-core==${{ steps.read_yml.outputs['nf_core_version'] }}

- name: Run nf-core pipelines lint
if: ${{ github.base_ref != 'master' }}
env:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core -l lint_log.txt pipelines lint --dir ${GITHUB_WORKSPACE} --markdown lint_results.md

- name: Run nf-core lint
- name: Run nf-core pipelines lint --release
if: ${{ github.base_ref == 'master' }}
env:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core -l lint_log.txt lint --dir ${GITHUB_WORKSPACE} --markdown lint_results.md
run: nf-core -l lint_log.txt pipelines lint --release --dir ${GITHUB_WORKSPACE} --markdown lint_results.md

- name: Save PR number
if: ${{ always() }}
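The new `read .nf-core.yml` step pins the linting tool to the pipeline's own nf-core/tools version; a rough local equivalent is sketched below, where the sed parsing is an assumption standing in for the action-read-yaml step:

```bash
# Read the pinned tools version from .nf-core.yml and install exactly that release.
NF_CORE_VERSION=$(sed -n 's/^nf_core_version:[[:space:]]*//p' .nf-core.yml)
pip install "nf-core==${NF_CORE_VERSION}"

# Same lint invocation the workflow uses on non-master branches.
nf-core -l lint_log.txt pipelines lint --dir . --markdown lint_results.md
```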
2 changes: 1 addition & 1 deletion .github/workflows/linting_comment.yml
@@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Download lint results
uses: dawidd6/action-download-artifact@09f2f74827fd3a8607589e5ad7f9398816f540fe # v3
uses: dawidd6/action-download-artifact@bf251b5aa9c2f7eeb574a96ee720e24f801b7c11 # v6
with:
workflow: linting.yml
workflow_conclusion: completed
2 changes: 1 addition & 1 deletion .github/workflows/release-announcements.yml
@@ -12,7 +12,7 @@ jobs:
- name: get topics and convert to hashtags
id: get_topics
run: |
echo "topics=$(curl -s https://nf-co.re/pipelines.json | jq -r '.remote_workflows[] | select(.full_name == "${{ github.repository }}") | .topics[]' | awk '{print "#"$0}' | tr '\n' ' ')" >> $GITHUB_OUTPUT
echo "topics=$(curl -s https://nf-co.re/pipelines.json | jq -r '.remote_workflows[] | select(.full_name == "${{ github.repository }}") | .topics[]' | awk '{print "#"$0}' | tr '\n' ' ')" | sed 's/-//g' >> $GITHUB_OUTPUT
- uses: rzr/fediverse-action@master
with:
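The only functional change in this file is the added `sed 's/-//g'`, which strips hyphens so multi-word topics become valid hashtags; a tiny illustration with made-up topic names:

```bash
# Made-up topics piped through the same awk/tr/sed chain as the workflow step.
printf 'metagenomics\ntaxonomic-profiling\nlong-reads\n' \
  | awk '{print "#"$0}' | tr '\n' ' ' | sed 's/-//g'
# -> #metagenomics #taxonomicprofiling #longreads
```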