error in partition step


Chaiwat Naktang

Apr 18, 2016, 5:11:36 AM
to EVidenceModeler-users

After generating GFF files from several programs, I tried to run EVM to produce a combined GFF. However, I get an error in the partition step, as follows:


#####command#######


qsub -cwd -b y -N evm_partition /colossus/home/chaiwat.nak/program_genome_annotation/EVM_r2012-06-25/EvmUtils/partition_EVM_inputs.pl --genome /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/test/Rubber_assemble_v1_remove_redundant.fasta --gene_predictions /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/test/gene_prediction.gff3 --protein_alignments /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/test/protein_alignments.gff3 --transcript_alignments /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/test/newtranscript_alignment.gff3 --segmentSize 100000 --overlapSize 10000 --partition_listing ../partitions_list.out


However, the command fails with an error:



####error message #######

sh: quiver: command not found
sh: /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/partition_evm/falcon_1: Is a directory
sh: quiver/falcon_1: No such file or directory
sh: quiver_1-100000/gene_prediction.gff3: No such file or directory

usage: /colossus/home/chaiwat.nak/program_genome_annotation/EVM_r2012-06-25/EvmUtils/gff_range_retriever.pl seq_id min_lend max_rend [adjust_min_lend_to_1=0] < gff_file > subset_gff_file

error, /colossus/home/chaiwat.nak/program_genome_annotation/EVM_r2012-06-25/EvmUtils/gff_range_retriever.pl falcon_1|quiver 1 100000 ADJUST_TO_ONE < /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/partition_evm/falcon_1|quiver/gene_prediction.gff3 > /colossus/home/chaiwat.nak/project/rubber_pnik/result_evm_ver2/partition_evm/falcon_1|quiver/falcon_1|quiver_1-100000/gene_prediction.gff3, 32512  at /colossus/home/chaiwat.nak/program_genome_annotation/EVM_r2012-06-25/EvmUtils/partition_EVM_inputs.pl line 242, <$filehandle> line 1.


Then I checked my falcon_1 contig and found that its length is 170735, which is larger than the segment size I set (100000).


My question is: does this error come from the length of my contig?

Brian Haas

Apr 18, 2016, 9:01:51 AM
to Chaiwat Naktang, EVidenceModeler-users
The problem here is that it doesn't like pipes in the contig names.  Are you using the latest EVM?  If so, I'll aim to fix this.
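[Editor's note: a minimal demonstration of the failure mode, using made-up paths and not EVM's actual code. When a contig ID such as falcon_1|quiver is interpolated unquoted into a shell command, sh parses the | as a pipe operator and splits one intended command into two:]

```shell
cd "$(mktemp -d)"                        # work in a scratch directory
sh -c 'mkdir falcon_1|quiver' || true    # intended: one mkdir; sh sees a pipeline
# sh runs "mkdir falcon_1" and pipes its output into a program named
# "quiver", producing a message like the one in the error log above:
#   sh: quiver: command not found
ls                                       # only "falcon_1" was created
```

This is exactly why the log shows "quiver: command not found" and paths truncated at the first pipe symbol.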

-Brian
(by iPhone)

--
You received this message because you are subscribed to the Google Groups "EVidenceModeler-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to evidencemodeler-...@googlegroups.com.
To post to this group, send email to evidencemo...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/evidencemodeler-users/27838b67-dfee-4a3e-a16a-0b37cf9096f9%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Chaiwat Naktang

Apr 19, 2016, 3:56:19 AM
to EVidenceModeler-users
Thank you for your quick response. I am now running the latest EVM version and it is working.

On Monday, April 18, 2016 at 4:11:36 PM UTC+7, Chaiwat Naktang wrote:

Brian Haas

Apr 19, 2016, 9:24:30 AM
to Chaiwat Naktang, EVidenceModeler-users
Here's an updated script. Drop it in EvmUtils/.  Let's see if it works for you.  Note we might need to make more changes....  I'm having you test it for me while I'm away.

best,

~b
partition_EVM_inputs.pl

Nikolay Alabi

Jul 11, 2019, 2:44:47 AM
to EVidenceModeler-users
Hello,

I am facing the same issue that Chaiwat had. I am using EVM 1.1.1 and have also tried your updated script, but I still face the same error. Additionally, I am unsure whether or not to pass the TransDecoder gff file under --pasaTerminalExons.

Here is the error: 
sh: size485611451/scaffold1: No such file or directory
sh: size485611451: command not found
sh: size485611451/genome_nf.all.RAW.gff: No such file or directory
sh: size485611451_1-100000/genome_nf.all.RAW.gff: No such file or directory

usage: /project/6001701/nalabi/new/EVidenceModeler-1.1.1/EvmUtils/gff_range_retriever.pl seq_id min_lend max_rend [adjust_min_lend_to_1=0] < gff_file > subset_gff_file

error, /project/6001701/nalabi/new/EVidenceModeler-1.1.1/EvmUtils/gff_range_retriever.pl scaffold1|size485611451 1 100000 ADJUST_TO_ONE < /scratch/nalabi/test2/scaffold1|size485611451/genome_nf.all.RAW.gff > /scratch/nalabi/test2/scaffold1|size485611451/scaffold1|size485611451_1-100000/genome_nf.all.RAW.gff, 32512  at /home/nalabi/projects/def-colautti/nalabi/new/EVidenceModeler-1.1.1/EvmUtils/partition_EVM_inputs.pl line 242, <$filehandle> line 1.

This is my original code: 
~/projects/def-colautti/nalabi/new/EVidenceModeler-1.1.1/EvmUtils/partition_EVM_inputs.pl --genome ../ApGenomeRedun4.fasta --gene_predictions ../genome_nf.all.RAW.gff --protein_alignments ../exonerate_3_all.gff --pasaTerminalExons ../Trinity.fasta.transdecoder.gff3 --pasaTerminalExons ../my_pasa_db.valid_blat_alignments.gff3 --pasaTerminalExons ../my_pasa_db.valid_gmap_alignments.gff3 --repeats ../allRepeats.gff --transcript_alignments ../gmap.spliced_alignments.gff3.completed --segmentSize 100000 --overlapSize 10000 --partition_listing partitions_list.out

Please let me know your thoughts. 

Best, 

Nikolay

Brian Haas

Jul 12, 2019, 2:20:14 AM
to Nikolay Alabi, EVidenceModeler-users
Hi

It looks like it's related to the genome scaffold identifiers.  If you remove the pipe symbol from all identifiers, it might work.  I thought I had addressed this in the code a long while ago.  I'm on vacation and can revisit this when I return if it's still an issue.
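[Editor's note: one way to apply Brian's suggestion, sketched with stand-in filenames (genome.fasta, gene_predictions.gff3) that you would replace with your own. The idea is to substitute | with _ in the FASTA headers and, consistently, in column 1 of every GFF3 evidence file given to EVM:]

```shell
cd "$(mktemp -d)"
# Tiny stand-in inputs for illustration; your real files will differ:
printf '>scaffold1|size485611451\nACGTACGT\n' > genome.fasta
printf 'scaffold1|size485611451\tsrc\tgene\t1\t8\t.\t+\t.\tID=g1\n' > gene_predictions.gff3

# Replace '|' with '_' in FASTA header lines only:
sed '/^>/ s/|/_/g' genome.fasta > genome.nopipe.fasta

# Apply the same substitution to column 1 (seqid) of the GFF3,
# leaving '#' comment/pragma lines untouched:
awk 'BEGIN{FS=OFS="\t"} !/^#/ {gsub(/\|/, "_", $1)} 1' \
    gene_predictions.gff3 > gene_predictions.nopipe.gff3
```

The renamed files can then be passed to partition_EVM_inputs.pl in place of the originals; the same awk line works for the protein and transcript alignment GFF3s.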

-via googleFi
