Hello USPEX Users
I am running into "error: forrtl: severe (41): insufficient virtual memory" when running USPEX with QE.
It says I don't have sufficient memory, even though the calculation uses less than a third of my machine's memory.
My computer specifications are:
Processor: Intel Core i7 CPU 930 @ 2.80GHz
Memory: 5.8 GiB
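For context, this is roughly how the per-process limits and overall memory can be checked on a Linux box (a diagnostic sketch, assuming a standard Linux shell with `ulimit` and `free` available; the comments describe what each command reports, not my actual output):

```shell
#!/bin/sh
# Per-process virtual memory limit in kB; "unlimited" means no cap is set.
# A finite value here can trigger error (41) even when RAM is mostly free.
ulimit -v

# Physical memory and swap in MiB. pw.x allocates virtual memory up to
# RAM + swap, so little or no swap can also cause this error.
free -m
```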
I would be thankful if anyone could help me.
I have attached parts of my log and error files below:
LOG:
forrtl: severe (41): insufficient virtual memory
Image PC Routine Line Source
pw.x 0000000000B2634E Unknown Unknown Unknown
pw.x 0000000000B24DE6 Unknown Unknown Unknown
pw.x 0000000000AC7DE2 Unknown Unknown Unknown
pw.x 0000000000A606DB Unknown Unknown Unknown
pw.x 0000000000A9F443 Unknown Unknown Unknown
pw.x 00000000007E18B4 gvect_mp_gvect_in 105 recvec.f90
pw.x 00000000005B8D0A data_structure_ 68 data_structure.f90
pw.x 0000000000571B17 allocate_fft_ 37 allocate_fft.f90
pw.x 0000000000439B2B init_run_ 47 init_run.f90
pw.x 0000000000407550 MAIN__ 95 pwscf.f90
pw.x 00000000004072BC Unknown Unknown Unknown
libc.so.6 00007FF9D470376D Unknown Unknown Unknown
pw.x 00000000004071B9 Unknown Unknown Unknown
forrtl: severe (41): insufficient virtual memory
Image PC Routine Line Source
pw.x 0000000000B2634E Unknown Unknown Unknown
pw.x 0000000000B24DE6 Unknown Unknown Unknown
pw.x 0000000000AC7DE2 Unknown Unknown Unknown
pw.x 0000000000A606DB Unknown Unknown Unknown
pw.x 0000000000A9F443 Unknown Unknown Unknown
pw.x 00000000007E18B4 gvect_mp_gvect_in 105 recvec.f90
pw.x 00000000005B8D0A data_structure_ 68 data_structure.f90
pw.x 0000000000571B17 allocate_fft_ 37 allocate_fft.f90
pw.x 0000000000439B2B init_run_ 47 init_run.f90
pw.x 0000000000407550 MAIN__ 95 pwscf.f90
pw.x 00000000004072BC Unknown Unknown Unknown
libc.so.6 00007F56E9CD476D Unknown Unknown Unknown
...
And this is my error file in CalcFold1:
Program PWSCF v.5.0.2 (svn rev. 9656) starts on 23Sep2013 at 13:16: 4
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.: Condens. Matter 21 395502 (2009);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote.php
Parallel version (MPI), running on 8 processors
R & G space division: proc/nbgrp/npool/nimage = 8
Current dimensions of program PWSCF are:
Max number of different atomic species (ntypx) = 10
Max number of k-points (npk) = 40000
Max angular momentum in pseudopotentials (lmaxx) = 3
Waiting for input...
Reading input from standard input
Subspace diagonalization in iterative solution of the eigenvalue problem:
scalapack distributed-memory algorithm (size of sub-group: 2* 2 procs)
Parallelization info
--------------------
sticks: dense smooth PW G-vecs: dense smooth PW
Min 847509 308187 77046 *********32409449140511854
Max 847510 308188 77048 *********32409459440511855
Sum 6780073 2465503 616373 **************************
or
Parallelization info
--------------------
sticks: dense smooth PW G-vecs: dense smooth PW
Min 951740 346090 86521 *********35632841444540757
Max 951741 346091 86522 *********35632855044540759
Sum 7613927 2768721 692169 114982171*****************