New Versions of gme2hrm, hrm, hmx2hmy and HRM User's Guide; GME 30 km / L60


Gilberto Bonatti

Jan 13, 2010, 10:24:27 AM1/13/10
to hrm_...@googlegroups.com
Dear HRM Users,

I have compiled hrm_2.8, and when I ran it I got the following error:

decomposition of the HRM
 Processor:   0  j1_start:   2  j1_end: 300  j2_start:   2  j2_end:   5
 Processor:   1  j1_start:   2  j1_end: 300  j2_start:   6  j2_end:  10
 Processor:   2  j1_start:   2  j1_end: 300  j2_start:  11  j2_end:  15
 Processor:   3  j1_start:   2  j1_end: 300  j2_start:  16  j2_end:  19
 Processor:   4  j1_start:   2  j1_end: 300  j2_start:  20  j2_end:  24
 Processor:   5  j1_start:   2  j1_end: 300  j2_start:  25  j2_end:  29
 Processor:   6  j1_start:   2  j1_end: 300  j2_start:  30  j2_end:  33
 Processor:   7  j1_start:   2  j1_end: 300  j2_start:  34  j2_end:  38
 Processor:   8  j1_start:   2  j1_end: 300  j2_start:  39  j2_end:  43
 Processor:   9  j1_start:   2  j1_end: 300  j2_start:  44  j2_end:  47
 Processor:  10  j1_start:   2  j1_end: 300  j2_start:  48  j2_end:  52
 Processor:  11  j1_start:   2  j1_end: 300  j2_start:  53  j2_end:  57
 Processor:  12  j1_start:   2  j1_end: 300  j2_start:  58  j2_end:  61
 Processor:  13  j1_start:   2  j1_end: 300  j2_start:  62  j2_end:  66
 Processor:  14  j1_start:   2  j1_end: 300  j2_start:  67  j2_end:  71
 Processor:  15  j1_start:   2  j1_end: 300  j2_start:  72  j2_end:  76
 Processor:  16  j1_start:   2  j1_end: 300  j2_start:  77  j2_end:  80
 Processor:  17  j1_start:   2  j1_end: 300  j2_start:  81  j2_end:  85
 Processor:  18  j1_start:   2  j1_end: 300  j2_start:  86  j2_end:  90
 Processor:  19  j1_start:   2  j1_end: 300  j2_start:  91  j2_end:  94
 Processor:  20  j1_start:   2  j1_end: 300  j2_start:  95  j2_end:  99
 Processor:  21  j1_start:   2  j1_end: 300  j2_start: 100  j2_end: 104
 Processor:  22  j1_start:   2  j1_end: 300  j2_start: 105  j2_end: 108
 Processor:  23  j1_start:   2  j1_end: 300  j2_start: 109  j2_end: 113
 Processor:  24  j1_start:   2  j1_end: 300  j2_start: 114  j2_end: 118
 Processor:  25  j1_start:   2  j1_end: 300  j2_start: 119  j2_end: 122
 Processor:  26  j1_start:   2  j1_end: 300  j2_start: 123  j2_end: 127
 Processor:  27  j1_start:   2  j1_end: 300  j2_start: 128  j2_end: 132
 Processor:  28  j1_start:   2  j1_end: 300  j2_start: 133  j2_end: 136
 Processor:  29  j1_start:   2  j1_end: 300  j2_start: 137  j2_end: 141
 Processor:  30  j1_start:   2  j1_end: 300  j2_start: 142  j2_end: 146
 Processor:  31  j1_start:   2  j1_end: 300  j2_start: 147  j2_end: 151
 Processor:  32  j1_start:   2  j1_end: 300  j2_start: 152  j2_end: 155
 Processor:  33  j1_start:   2  j1_end: 300  j2_start: 156  j2_end: 160
 Processor:  34  j1_start:   2  j1_end: 300  j2_start: 161  j2_end: 165
 Processor:  35  j1_start:   2  j1_end: 300  j2_start: 166  j2_end: 169
 Processor:  36  j1_start:   2  j1_end: 300  j2_start: 170  j2_end: 174
 Processor:  37  j1_start:   2  j1_end: 300  j2_start: 175  j2_end: 179
 Processor:  38  j1_start:   2  j1_end: 300  j2_start: 180  j2_end: 183
 Processor:  39  j1_start:   2  j1_end: 300  j2_start: 184  j2_end: 188
 Processor:  40  j1_start:   2  j1_end: 300  j2_start: 189  j2_end: 193
 Processor:  41  j1_start:   2  j1_end: 300  j2_start: 194  j2_end: 197
 Processor:  42  j1_start:   2  j1_end: 300  j2_start: 198  j2_end: 202
 Processor:  43  j1_start:   2  j1_end: 300  j2_start: 203  j2_end: 207
 Processor:  44  j1_start:   2  j1_end: 300  j2_start: 208  j2_end: 211
 Processor:  45  j1_start:   2  j1_end: 300  j2_start: 212  j2_end: 216
 Processor:  46  j1_start:   2  j1_end: 300  j2_start: 217  j2_end: 221
 Processor:  47  j1_start:   2  j1_end: 300  j2_start: 222  j2_end: 226
 Processor:  48  j1_start:   2  j1_end: 300  j2_start: 227  j2_end: 230
 Processor:  49  j1_start:   2  j1_end: 300  j2_start: 231  j2_end: 235
 Processor:  50  j1_start:   2  j1_end: 300  j2_start: 236  j2_end: 240
 Processor:  51  j1_start:   2  j1_end: 300  j2_start: 241  j2_end: 244
 Processor:  52  j1_start:   2  j1_end: 300  j2_start: 245  j2_end: 249
 Processor:  53  j1_start:   2  j1_end: 300  j2_start: 250  j2_end: 254
 Processor:  54  j1_start:   2  j1_end: 300  j2_start: 255  j2_end: 258
 Processor:  55  j1_start:   2  j1_end: 300  j2_start: 259  j2_end: 263
 Processor:  56  j1_start:   2  j1_end: 300  j2_start: 264  j2_end: 268
 Processor:  57  j1_start:   2  j1_end: 300  j2_start: 269  j2_end: 272
 Processor:  58  j1_start:   2  j1_end: 300  j2_start: 273  j2_end: 277
 Processor:  59  j1_start:   2  j1_end: 300  j2_start: 278  j2_end: 282
 Processor:  60  j1_start:   2  j1_end: 300  j2_start: 283  j2_end: 286
 Processor:  61  j1_start:   2  j1_end: 300  j2_start: 287  j2_end: 291
 Processor:  62  j1_start:   2  j1_end: 300  j2_start: 292  j2_end: 296
 Processor:  63  j1_start:   2  j1_end: 300  j2_start: 297  j2_end: 300



 Domain decomposition in j3-direction
 Processor:   0  j3_start:   1  j3_end:   0
 Processor:   1  j3_start:   1  j3_end:   0
 Processor:   2  j3_start:   1  j3_end:   0
 Processor:   3  j3_start:   1  j3_end:   0
 Processor:   4  j3_start:   1  j3_end:   0
 Processor:   5  j3_start:   1  j3_end:   0
 Processor:   6  j3_start:   1  j3_end:   0
 Processor:   7  j3_start:   1  j3_end:   1



 ======================================================
 Definition of soil model layers
 INDEX      TOP      BOT     Full Level Level(PDS)  Layer thickness
 cm       cm        cm          cm       cm
 1      0.000     1.000     0.500         1     1.000
 2      1.000     3.000     2.000         2     2.000
 3      3.000     9.000     6.000         6     6.000
 4      9.000    27.000    18.000        18    18.000
 5     27.000    81.000    54.000        54    54.000
 6     81.000   243.000   162.000       162   162.000
 7    243.000   729.000   486.000       486   486.000
 8    729.000  2187.000  1458.000      1458  1458.000
 ======================================================
MPI: MPI_COMM_WORLD rank 57 has terminated without calling MPI_Finalize()
MPI: aborting job
MPI: Received signal 11




Is there a new parameter that has to be set in INPUT_HRM?

I have tested HRM version 2.7 with GME version 2.6 and it works perfectly.

The OUTPUT_HRM file is attached.


Best regards,

 
Gilberto Bonatti

OUTPUT_HRM

D. Bala Subrahamanyam

Jan 13, 2010, 10:26:40 AM1/13/10
to Gilberto Bonatti, hrm_...@googlegroups.com
Dear Dr. Gilberto,

Recently I compiled the hrm2.8 and gme2hrm2.6 code for a single processor and did not encounter any problems.
I do not know much about parallel computing (i.e., the MPI version), so I too would like to know the answer to your question.

My best wishes,

SUBBU.





--
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>> Dr. D. Bala Subrahamanyam, Scientist/Engineer - 'SD'
>> Govt. of India, Dept. of Space, SPACE PHYSICS LABORATORY
>> Indian Space Research Organization, VIKRAM SARABHAI SPACE CENTRE
>> ISRO P.O., THIRUVANANTHAPURAM - 695 022, KERALA, INDIA
>> Ph.: +91-9895656150; Fax: +91-471-2706535
>> E-mail: subb...@yahoo.com  
>> My Homepage: http://www.geocities.com/subbu_dbs/
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Gilberto

Jan 13, 2010, 10:53:31 AM1/13/10
to hrm_help
Dear SUBBU,

I can run HRM 2.8 with a maximum of 32 processors (nproc1=1, nproc2=32)
without error.
If I try with more than 32 processors (still with nproc1=1), I get this
error; with HRM 2.7 it does not happen.
PS: HRM 2.8 with 64 processors split as nproc1=2, nproc2=32 works
perfectly.
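
A side note on the log above: the empty ranges in the j3-direction listing
(j3_start: 1, j3_end: 0 for ranks 0 to 6) are what a standard block
partition produces when a direction has fewer points to hand out than
ranks. The sketch below is purely illustrative Python, not HRM's actual
Fortran; the function name and the rule of giving the remainder to the
highest-numbered ranks are assumptions, chosen only so the output matches
the shape of the listing above.

# Illustrative only: a generic 1-D block decomposition, NOT HRM's routine.
# Assumption: leftover points are absorbed by the highest-numbered ranks.
def block_ranges(npoints, nproc, first=1):
    """Return 1-based inclusive (start, end) pairs, one per rank."""
    ranges = []
    lo = first
    for rank in range(nproc):
        n = npoints // nproc
        if rank >= nproc - npoints % nproc:  # last ranks take the remainder
            n += 1
        ranges.append((lo, lo + n - 1))      # n == 0 gives end < start ("empty")
        lo += n
    return ranges

# 299 rows (j2 = 2..300) over 64 ranks: every rank gets 4 or 5 rows,
# comparable to the j2 decomposition printed in the log.
print(block_ranges(299, 64, first=2)[0])   # (2, 5)

# 1 point over 8 ranks: seven ranks get (1, 0) and the last gets (1, 1),
# the same shape as the j3-direction listing above.
print(block_ranges(1, 8))

In Fortran a DO loop over such an inverted range simply does not execute,
so empty ranges need not be fatal by themselves; but any access that
assumes at least one local point per rank can go out of bounds, which
would fit the signal 11 reported above. This is only a place to look, not
a confirmed diagnosis of what changed in 2.8.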

Thanks for your help.

Best regards,

Gilberto

