Segmentation fault (core dumped) error


Juan Carvajal

Nov 18, 2025, 3:33:58 PM (11 days ago) Nov 18
to GeneRax
Hi,
When I try to use GeneRax, I get a "Segmentation fault (core dumped)" error after "Optimizing gene trees". I have run it with these files in the past and it worked fine. Do you have any idea what could be causing it?
Below is the whole output.
Thank you very much!
Juan

(GeneRax) root@Juan:~/generax_juan# generax -s Species_tree.nwk -f config.txt
[Juan:38858] mca_base_component_repository_open: unable to open mca_btl_openib: librdmacm.so.1: cannot open shared object file: No such file or directory (ignored)
[00:00:00] GeneRax 2.1.3
Logs will also be printed into GeneRax/generax.log
GeneRax was called as follow:
generax -s Species_tree.nwk -f config.txt

General information:
- Output prefix: GeneRax
- Families information: config.txt
- Species tree: Species_tree.nwk
- MPI Ranks: 1
- Random seed: 123
- Reconciliation model: UndatedDTL
- DTL rates: global rates
- Infer ML reconciliation: ON
- Unrooted reconciliation likelihood: OFF
- Enforcing gene tree root: OFF
- Prune species tree mode: OFF

Gene tree correction information:
- Gene tree search strategy: SPR
- Max gene SPR radius: 5

[00:00:00] Filtering invalid families...

End of instance initialization
[00:00:00] Starting species tree initialization...
[00:00:00] End of species tree initialization
[00:00:00] Filtering invalid families based on the starting species tree...

[00:00:00] Gathering statistics about the families...
[00:00:00] Input data information:
- Number of gene families: 1
- Number of species: 17
- Total number of genes: 32
- Average number of genes per family: 32
- Maximum number of genes per family: 32
- Species covered with the smallest family coverage: "1396_1096" (covered by 1/1 families)
- Average (over species) species family coverage: 1

[00:00:00] Reconciliation rates optimization...
        D=0.0691137, L=0.0213723, T=0.223418, RecLL= -85.8485

[00:00:00] Optimizing gene trees with radius=1...
Segmentation fault (core dumped)
[00:00:00] JointLL=0 RecLL=0 LibpllLL=0

[00:00:00] Reconciliation rates optimization...
terminate called after throwing an instance of 'LibpllException'
  what():  Could not load open newick file GeneRax/results/Juan/geneTree.newick
[Juan:38858] *** Process received signal ***
[Juan:38858] Signal: Aborted (6)
[Juan:38858] Signal code:  (-6)
[Juan:38858] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x42520)[0x79e581442520]
[Juan:38858] [ 1] /lib/x86_64-linux-gnu/libc.so.6(pthread_kill+0x12c)[0x79e5814969fc]
[Juan:38858] [ 2] /lib/x86_64-linux-gnu/libc.so.6(raise+0x16)[0x79e581442476]
[Juan:38858] [ 3] /lib/x86_64-linux-gnu/libc.so.6(abort+0xd3)[0x79e5814287f3]
[Juan:38858] [ 4] /root/anaconda3/envs/GeneRax/bin/../lib/libstdc++.so.6(_ZN9__gnu_cxx27__verbose_terminate_handlerEv+0xc0)[0x79e581879165]
[Juan:38858] [ 5] /root/anaconda3/envs/GeneRax/bin/../lib/libstdc++.so.6(+0xbb747)[0x79e581877747]
[Juan:38858] [ 6] /root/anaconda3/envs/GeneRax/bin/../lib/libstdc++.so.6(_ZSt10unexpectedv+0x0)[0x79e5818710e3]
[Juan:38858] [ 7] /root/anaconda3/envs/GeneRax/bin/../lib/libstdc++.so.6(__cxa_rethrow+0x0)[0x79e58187794a]
[Juan:38858] [ 8] generax(+0x3f87c)[0x650fc130787c]
[Juan:38858] [ 9] generax(_ZN13LibpllParsers20parallelGetTreeSizesERKSt6vectorI10FamilyInfoSaIS1_EE+0xc6)[0x650fc1363cc6]
[Juan:38858] [10] generax(_ZN16PerCoreGeneTreesC1ERKSt6vectorI10FamilyInfoSaIS1_EEbb+0x153b)[0x650fc137d1db]
[Juan:38858] [11] generax(_ZN8Routines13optimizeRatesEbRKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERK12RecModelInfoRSt6vectorI10FamilyInfoSaISC_EEbR10ParametersRl+0xce)[0x650fc138245e]
[Juan:38858] [12] generax(_ZN11GeneRaxCore25optimizeRatesAndGeneTreesER15GeneRaxInstancebbj+0x66a)[0x650fc132f43a]
[Juan:38858] [13] generax(_ZN11GeneRaxCore19geneTreeJointSearchER15GeneRaxInstance+0x9f)[0x650fc13306af]
[Juan:38858] [14] generax(_Z12generax_mainiPPcPv+0x229)[0x650fc1324349]
[Juan:38858] [15] /lib/x86_64-linux-gnu/libc.so.6(+0x29d90)[0x79e581429d90]
[Juan:38858] [16] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x80)[0x79e581429e40]
[Juan:38858] [17] generax(+0x5a6a4)[0x650fc13226a4]
[Juan:38858] *** End of error message ***
Aborted (core dumped)

Stefan Flaumberg

Nov 20, 2025, 7:05:18 AM (9 days ago) Nov 20
to GeneRax
Hi Juan,

From the following line at the beginning of your report it seems that the problem is not with GeneRax, but with your Open MPI library:
[Juan:38858] mca_base_component_repository_open: unable to open mca_btl_openib: librdmacm.so.1: cannot open shared object file: No such file or directory (ignored)
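One quick way to confirm that the shared library really is missing (a sketch, assuming a glibc-based Linux where ldconfig is reachable from your shell) is:

```shell
# Does the dynamic linker know about librdmacm, the RDMA library that
# Open MPI's openib component is trying to load?
if ldconfig -p 2>/dev/null | grep -q librdmacm; then
  echo "librdmacm is present"
else
  echo "librdmacm is missing"
fi
```

If it reports the library as missing, your distribution's librdmacm package would provide it, though that alone may or may not be the cause of the crash.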

To test this, could you please reinstall GeneRax without MPI support, run the analysis once again, and describe what happens?
To do so, change the 3rd line of the ./install.sh script in the GeneRax directory to
cmake -DDISABLE_MPI=ON ..
and run the modified ./install.sh script. This way your installation will not rely on the MPI libraries.
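As a sketch, the edit above can also be scripted. The install.sh contents below are a stand-in for illustration, not the real GeneRax script, and GNU sed is assumed:

```shell
# Stand-in install.sh for illustration only -- the real GeneRax script differs.
cat > install.sh <<'EOF'
mkdir -p build
cd build
cmake ..
make -j
EOF

# Switch the cmake line to disable MPI, as described above (GNU sed -i).
sed -i 's/^cmake \.\.$/cmake -DDISABLE_MPI=ON ../' install.sh
grep '^cmake' install.sh   # -> cmake -DDISABLE_MPI=ON ..
```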

Please write back with the results.

Best,
Stefan

Juan Carvajal

Nov 20, 2025, 11:53:36 AM (9 days ago) Nov 20
to GeneRax
Hi Stefan,
That solved my problem!
Up until now I have been using the version from Bioconda; do you know if there is a fix for that one?
Thank you very much!
Juan

Stefan Flaumberg

Nov 20, 2025, 6:12:30 PM (9 days ago) Nov 20
to GeneRax
Juan,

That's great! However, not using MPI comes at the cost of not being able to run the tool on medium-to-large datasets.
Do you still get the problem with the unmodified GitHub version? Could you please test an installation without that change in the ./install.sh file (the 3rd line should be just cmake ..)?

I know nothing about the Bioconda package. Its recipe cites OpenMPI as an external dependency and requires a version >=4.1.6, <5.0a0. So maybe you have the wrong version installed on your computer; or maybe your MPI version is wrong for the Bioconda package but right for the GitHub package. You can test this as I described in the paragraph above. I observe no issues with the GitHub package when running with OpenMPI version 4.1.5.
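To illustrate the version constraint, here is a minimal sort -V comparison. The version string is hard-coded as an example; on a real system you would take it from `mpirun --version`, and only the lower bound of the Bioconda constraint is checked:

```shell
# Compare an Open MPI version string against the Bioconda recipe's
# lower bound (>=4.1.6). "4.1.5" is a hard-coded example value.
ver="4.1.5"
min="4.1.6"
lowest=$(printf '%s\n%s\n' "$ver" "$min" | sort -V | head -n1)
if [ "$lowest" = "$ver" ] && [ "$ver" != "$min" ]; then
  echo "$ver is below $min: outside the Bioconda constraint"
else
  echo "$ver satisfies >=$min"
fi
```

With 4.1.5 this reports the version as below the Bioconda minimum, even though that same version works fine with the GitHub build.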

Best,
Stefan
