Merge pull request #60 from pinellolab/dev
lingfeiwang committed May 15, 2024
2 parents 7e60cbb + aaa231e commit cb84353
Showing 3 changed files with 20 additions and 7 deletions.
11 changes: 7 additions & 4 deletions .github/workflows/ci-inference.yml
@@ -1,4 +1,4 @@
# Lingfei Wang, 2022. All rights reserved.
# Lingfei Wang, 2022, 2024. All rights reserved.
name: Network inference tests

on:
@@ -23,6 +23,7 @@ jobs:
strategy:
matrix:
python-version: ["3.9","3.10"]
nth: ["2"]
steps:
- uses: actions/checkout@v4
- name: Install
@@ -44,7 +45,7 @@ jobs:
actual-folder: tmp_static
expected-h5: test/output/static_expected.h5
actual-h5: output/static.h5
makefile-params: '{"ENVMODE": "none", "NTH": "2", "DEVICE": "cpu", "GENOME_MACS2": "mm", "JOINT": "1", "KPARAMS-NETWORK-RECONSTRUCT+": " --nstep 10 --nstep_report 3"}'
makefile-params: '{"ENVMODE": "none", "NTH": "${{ matrix.nth }}", "DEVICE": "cpu", "GENOME_MACS2": "mm", "JOINT": "1", "KPARAMS-NETWORK-RECONSTRUCT+": " --nstep 10 --nstep_report 3"}'
exclusions: 'footprints.bed net_weight.tsv.gz net_covfactor.tsv.gz net_meanvar.tsv.gz net_loss.tsv.gz net_stats.tsv.gz net_nweight.tsv.gz net_iweight.tsv.gz net_inweight.tsv.gz'

test-blood1:
@@ -53,6 +54,7 @@ jobs:
strategy:
matrix:
python-version: ["3.9","3.10"]
nth: ["1","2"]
steps:
- uses: actions/checkout@v4
- name: Install
@@ -74,7 +76,7 @@ jobs:
actual-folder: test/tmp_static
expected-h5: test/output/static_expected.h5
actual-h5: test/output/static.h5
makefile-params: '{"ENVMODE": "none", "NTH": "2", "DEVICE": "cpu", "GENOME_MACS2": "hs", "JOINT": "0", "KPARAMS-NETWORK-RECONSTRUCT+": " --nstep 10 --nstep_report 3"}'
makefile-params: '{"ENVMODE": "none", "NTH": "${{ matrix.nth }}", "DEVICE": "cpu", "GENOME_MACS2": "hs", "JOINT": "0", "KPARAMS-NETWORK-RECONSTRUCT+": " --nstep 10 --nstep_report 3"}'
exclusions: 'reads.bam reads.bai net_weight.tsv.gz net_covfactor.tsv.gz net_meanvar.tsv.gz net_loss.tsv.gz net_stats.tsv.gz'

test-blood2:
@@ -83,6 +85,7 @@ jobs:
strategy:
matrix:
python-version: ["3.9","3.10"]
nth: ["2"]
steps:
- uses: actions/checkout@v4
- name: Install
@@ -104,5 +107,5 @@ jobs:
actual-folder: test/tmp_dynamic
expected-h5: test/output/dynamic_expected.h5
actual-h5: test/output/dynamic.h5
makefile-params: '{"ENVMODE": "none", "NTH": "2", "DEVICE": "cpu", "GENOME_MACS2": "hs", "JOINT": "0", "KPARAMS-NETWORK-RECONSTRUCT+": " --nstep 10 --nstep_report 3", "PARAMS-DYNAMIC-SUBSETS_RNA": "1000 10 10"}'
makefile-params: '{"ENVMODE": "none", "NTH": "${{ matrix.nth }}", "DEVICE": "cpu", "GENOME_MACS2": "hs", "JOINT": "0", "KPARAMS-NETWORK-RECONSTRUCT+": " --nstep 10 --nstep_report 3", "PARAMS-DYNAMIC-SUBSETS_RNA": "1000 10 10"}'
exclusions: 'reads.bam reads.bai net_weight.tsv.gz net_covfactor.tsv.gz net_meanvar.tsv.gz net_loss.tsv.gz net_stats.tsv.gz'
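The new ``nth`` matrix axis multiplies the test combinations with the existing Python versions. As a rough sketch in plain shell (not Actions syntax), the ``test-blood1`` job with ``nth: ["1","2"]`` now expands to:

```shell
# Cross product of the test-blood1 matrix: 2 Python versions x 2 thread counts,
# one CI job per combination.
for py in 3.9 3.10; do
  for nth in 1 2; do
    echo "test-blood1: python=$py NTH=$nth"
  done
done
```

This is how a GitHub Actions ``strategy.matrix`` expands in general: one job per element of the cross product, so ``test-blood1`` runs four times while ``test-inference`` and ``test-blood2`` (with ``nth: ["2"]``) keep two runs each.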
4 changes: 4 additions & 0 deletions README.rst
@@ -204,6 +204,10 @@ FAQ

Some visualization functions in Dictys return two or more figures, such as ``figs = net.draw_discover(...)``. You can save them separately with ``figs[0].savefig('output1.pdf'); figs[1].savefig('output2.pdf'); ...``. See `matplotlib.figure.savefig <https://matplotlib.org/stable/api/figure_api.html#matplotlib.figure.Figure.savefig>`_ and `issue 15 <https://github.com/pinellolab/dictys/issues/15>`_.

* **How should I deal with the error ``index: invalid option -- '@'``?**

You see this error because your machine setup does not allow installing Dictys with a recent samtools version. The ``-@`` option enables multi-threaded operation in samtools to improve speed, but it is only recognized by recent samtools versions. If you cannot install a recent samtools version, you can add ``"NTH": "1"`` to the dictionary in the line ``dictys_helper makefile_update.py ../makefiles/config.mk...``. This enforces single-threaded samtools and avoids the ``-@`` option.
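A minimal sketch of the resulting behavior, assuming a shell variable ``nth`` mirrors the ``NTH`` setting (the actual option handling lives in ``chromatin_macs2.sh``):

```shell
# Build the samtools threading flag only when more than one thread is requested.
nth=1
if [ "$nth" = "1" ]; then
    threads=""          # old samtools versions reject -@, so omit it entirely
else
    threads="-@ $nth"
fi
# With nth=1 the assembled command contains no -@ option at all:
echo samtools index $threads input.bam
# → samtools index input.bam
```

With any ``nth`` greater than 1, the same command would carry ``-@ $nth`` and fail on a samtools version that predates that option.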

Issues
==========================
Please raise an issue on `github <https://github.com/pinellolab/dictys/issues/new/choose>`_.
12 changes: 9 additions & 3 deletions src/dictys/scripts/chromatin_macs2.sh
@@ -24,14 +24,20 @@ genome_size=$6
cutoff=$7
nodes=$8

if [ "a$nodes" == "a1" ]; then
nodes2=""
else
nodes2="-@ $nodes"
fi

#Create bam file for custom cells list, filter out chrM/chrUn/chrRandom, sort and index
awk '{printf("%s\n","'$cells_dir/'"$1)}' "$cells_list" > "00-cells.txt"
( samtools view -h -@ "$nodes" "$(head -n 1 "00-cells.txt" )" | grep -v '^@HD' | grep -v '^@PG' ; tail -n +2 "00-cells.txt" | while read l; do samtools view -@ "$nodes" "$l"; done ) | awk '$3!="chrM"' | grep -v chrUn_ | grep -v GL00 | grep -v -e "random" | samtools view -1 -@ "$nodes" -o "02-filtered.bam" -
( samtools view -h $nodes2 "$(head -n 1 "00-cells.txt" )" | grep -v '^@HD' | grep -v '^@PG' ; tail -n +2 "00-cells.txt" | while read l; do samtools view $nodes2 "$l"; done ) | awk '$3!="chrM"' | grep -v chrUn_ | grep -v GL00 | grep -v -e "random" | samtools view -1 $nodes2 -o "02-filtered.bam" -

#filter, sort and index bam file.
samtools sort -o "$output_bam" -@ "$nodes" -l 1 02-filtered.bam
samtools sort -o "$output_bam" $nodes2 -l 1 02-filtered.bam
rm 02-filtered.bam
samtools index -@ "$nodes" "$output_bam" "$output_bai"
samtools index $nodes2 "$output_bam" "$output_bai"

#Step3A. Peak calling on aggregate population [Keep only significant peaks]
OMP_NUM_THREADS=$nodes MKL_NUM_THREADS=$nodes NUMEXPR_NUM_THREADS=$nodes OPENBLAS_NUM_THREADS=$nodes OMP_MAX_THREADS=$nodes MKL_MAX_THREADS=$nodes NUMEXPR_MAX_THREADS=$nodes OPENBLAS_MAX_THREADS=$nodes VECLIB_MAXIMUM_THREADS=$nodes macs2 callpeak -t "$output_bam" -n 04 -g $genome_size --nomodel --shift -75 --extsize 150 --keep-dup all --verbose 4 --call-summits -q $cutoff
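The pipeline above drops mitochondrial reads, unplaced contigs, and random scaffolds before sorting. A self-contained sketch of just that filter, run on fake SAM-like rows (read name, flag, reference name) so no samtools is needed:

```shell
# Column 3 holds the reference name; keep only reads on primary chromosomes.
printf 'r1\t0\tchr1\nr2\t0\tchrM\nr3\t0\tchrUn_KI27\nr4\t0\tchr2_random\n' \
  | awk '$3!="chrM"' | grep -v chrUn_ | grep -v GL00 | grep -v -e "random"
```

Only the ``r1`` row on ``chr1`` survives; ``chrM``, ``chrUn_*``, and ``*_random`` rows are removed, matching the grep/awk chain in the commit.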
