nanorepeat "freezing" #18
Comments
Hi Federico, By the way, please try to use […]
Hi, thanks a lot for the feedback and suggestion, I am now running with […]
I could certainly try to remove this repeat from my input file and re-run it. I have no problem in doing that, but I was wondering whether you have a guess on why this happens. Is this random, or do you think this particular repeat might be problematic? Thanks a lot,
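(A minimal sketch of what that exclusion could look like, assuming the suspect locus is identified by its coordinates, as in the temp-folder name mentioned later in the thread; the file names are hypothetical:)

```bash
# Drop a single suspect locus from the region list before re-running NanoRepeat.
# repeats.bed / repeats.filtered.bed are hypothetical names; the coordinates
# are the chr8 example region from this thread.
awk '!($1 == "chr8" && $2 == 20707547 && $3 == 20707710)' repeats.bed > repeats.filtered.bed
```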
I'm not sure because I haven't seen the reads in this repeat. It might be due to a high sequencing depth, or these reads might contain very long repeats. Is it convenient for you to check the region in IGV? (You can use samtools view to extract the reads of a specific region, so that you don't need to download the whole BAM file from the server.) By the way, it looks like […] Thanks,
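(A minimal sketch of the suggested extraction, assuming the chr8 region named in the temp folder below and a coordinate-sorted, indexed BAM; `input.bam` and `region.bam` are placeholder names:)

```bash
# Extract only the reads overlapping the suspect region into a small BAM
# that can be loaded into IGV for inspection.
samtools view -b -o region.bam input.bam chr8:20707547-20707710
samtools index region.bam
```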
Hi there,
I am running NanoRepeat on some ONT BAM files with the following command:

```bash
nanoRepeat.py -i $bam -t bam -d ont_sup -r $genome -b $bed -c 20 -o $output
```
where `$bed` contains 6,841 entries and `$bam` is a ~100G file.

The issue I am experiencing is that after some hours of running, NanoRepeat seems to freeze. At the beginning of the run I could see tmp subfolders in the output folder, such as `${sample}.NanoRepeat_temp.chr8-20707547-20707710-AAGA`; now the output folder contains only a subfolder named `${sample}.details`. The program is still running, but the last operation appears to have been done more than 6 hours ago and no `*.tsv` file has been generated.

I have tried to repeat the analysis on an SR BED file containing 10 entries and everything ran smoothly. Might this perhaps be linked to the large size of the BAM (~100G) and BED (7k entries) files?
I have increased the memory up to 500 GB, but nothing changed.
Do you have any idea what is going on?
Thanks in advance,
Federico
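One way to test the size hypothesis raised in the post above is to run the 6,841 regions in smaller batches, so that a freeze can be narrowed down to a small set of loci. A minimal sketch, reusing the command from the post with hypothetical file names:

```bash
# Split the full BED into chunks of 500 regions and run NanoRepeat on each
# chunk separately; repeats.bed, input.bam and ref.fa are hypothetical names.
split -l 500 repeats.bed chunk_
for bed in chunk_*; do
    nanoRepeat.py -i input.bam -t bam -d ont_sup -r ref.fa -b "$bed" -c 20 -o "out_${bed}"
done
```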