version for gfortran 8.0.+ ?


Dan Nagle

Mar 2, 2018, 7:32:17 PM
to openco...@googlegroups.com
Hi,

With the download version, there are nonsense errors (new descriptor vs. old descriptor?).

Is there a version of OpenCoarrays that plays well with gfortran 8.0.1?

--

Cheers!
Dan Nagle


Zaak Beekman

Mar 3, 2018, 3:55:35 PM
to OpenCoarrays
Hi Dan,

Yes, OpenCoarrays is tightly coupled to GFortran, so if you are using an unreleased version of GFortran (i.e., 8.x/trunk) then the probability of encountering problems increases greatly. (We are working on integrating OpenCoarrays into the GFortran testing infrastructure, so, hopefully, this will be improved in the future.)

If you want to try your luck with the current GFortran trunk and OpenCoarrays, your best bet is to use the master branch of OpenCoarrays. You can obtain a zip archive from https://github.com/sourceryinstitute/OpenCoarrays/archive/master.zip, or simply clone the repository at git@github.com:sourceryinstitute/OpenCoarrays.git or https://github.com/sourceryinstitute/OpenCoarrays.git and build and install it from there.
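For example, a clone-and-build along these lines should work; this is just a sketch of the usual flow, and any install.sh options are up to you:

    # Clone the master branch and build/install with the bundled installer
    git clone https://github.com/sourceryinstitute/OpenCoarrays.git
    cd OpenCoarrays
    ./install.sh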

We have made a number of changes on the master branch to match changes in GFortran 8, but testing is limited, so I won't make any promises that everything works as it should. At least it will compile and be an improvement over trying to use 1.9.3 or the 2.0 release candidate.

Hope this helps, and if you hit any bugs please let us know at https://github.com/sourceryinstitute/OpenCoarrays/issues/new so that we can fix them.

Thanks,
Zaak

michael siehl

Mar 16, 2018, 9:59:31 AM
to OpenCoarrays

I can confirm that OpenCoarrays 2.0.0 works well with gfortran 8.0.1 on a shared-memory laptop running Linux Ubuntu 14.04 LTS (64-bit). A first simple test using Fortran 2018 coarray team features worked perfectly. (But please consider the current caveats with OpenCoarrays 2.0.0 described by Damian Rouson, here: https://groups.google.com/forum/#!topic/opencoarrays/hCTBKRVb4n8 ).
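For reference, such a test can be compiled and launched with the OpenCoarrays wrapper scripts roughly like this (the source file name is only a placeholder):

    # Compile a coarray test program and run it on 4 images
    caf teams_test.f90 -o teams_test
    cafrun -np 4 ./teams_test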


Problems may arise during installation, and some of them are unrelated to OpenCoarrays. I used the OpenCoarrays install.sh script for the installation. To do so successfully, I 'manually' preinstalled the GCC 7.3.0 release (C and C++ only), a GCC 8.0.1 snapshot (gfortran only), and MPICH 3.2.1 (using a simple 'trick', see below).


Some problems that I experienced:

  • On my laptop, the GCC snapshot 8-20180304 aborted compilation with an error (I tried twice). The snapshot 8-20180225 worked successfully; I have not tried the most recent snapshot 8-20180311 yet.
  • OpenMPI release 3.0.0 did not work well on my laptop: running a coarray program only succeeded with a small number of coarray images, equal to or less than the number of physical cores on my machine (and even then, hyperthreading of the cores could not be used). A runtime error message asked for more 'slots' to be made available. OpenMPI 2.1.3, on the other hand, did work well, but I did not use it together with coarray teams.

Here is the simple trick for installing MPICH:

3 – Build and Install MPICH 3.2.1 for use with OpenCoarrays 2.0.0

3-0:

Open the Ubuntu Software Center and install g++ (version 4.8.4), but do NOT install gfortran (4.8.4), so that the gfortran 8.0.1 installation stays intact. (The OpenCoarrays installation later requires that the MPICH mpi.mod file is created by the same version of the gfortran compiler.)
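A quick way to confirm that the intended compiler will be picked up when MPICH is configured:

    # gfortran should still resolve to the 8.0.1 snapshot, not 4.8.4
    which gfortran
    gfortran --version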

3-1:

Visit www.mpich.org and download the latest stable MPICH release, version 3.2.1, as a 'tar.gz' archive (e.g. mpich-3.2.1.tar.gz).

3-2:

Right-click and unpack the downloaded archive file.

3-3:

Open a terminal window and change to the directory with the unpacked files (using the cd command).

3-4:

To compile MPICH, enter the following commands:

    ./configure    # this takes a little while
    make           # this also takes a while

Exit the terminal window.
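Optionally, configure also accepts a custom install prefix and make can build in parallel; the prefix below is only an example, and if you use one, remember to add its bin directory to your PATH before installing OpenCoarrays so that this MPICH is the one found:

    # Optional variant: custom prefix and parallel build
    ./configure --prefix=/opt/mpich-3.2.1
    make -j 4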

3-5:

Now, open the Ubuntu Software Center and install gfortran (version 4.8.4). This makes gfortran 4.8.4 the currently installed compiler.

3-6:

Open a new terminal window and change to the directory with the unpacked files (using the cd command).

To install MPICH, enter 'sudo make install' in the terminal window.
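After installation, it is worth checking that the freshly installed MPICH and its Fortran wrapper are the ones being picked up:

    # Verify the MPICH installation and its Fortran wrapper
    mpichversion      # should report MPICH version 3.2.1
    mpifort -show     # prints the underlying Fortran compiler command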


The full instructions to 'Build and Install GCC 7.3.0, gfortran 8.0.1, MPICH 3.2.1, and OpenCoarrays 2.0.0 (to allow for Fortran 2018 coarray team support) on Linux Ubuntu 14.04 LTS (64 bit)' can be found here:

 https://github.com/MichaelSiehl/Install_MPI_GCC_and_OpenCoarrays_on_LINUX/blob/master/Build_and_Install_GCC730_gfortran801_mpich321_OpenCoarrays200_on_Linux_Ubuntu_14_04.pdf


cheers

Zaak Beekman

Mar 21, 2018, 1:26:37 PM
to OpenCoarrays
Hi Michael,

OpenMPI has some quirks/limitations around oversubscribing, and recent releases force the user to create a hosts file in order to run oversubscribed. If you build OpenCoarrays with OpenMPI and then run the tests in verbose mode (`ctest --verbose`), I think (but might be mistaken) that the cafrun command used to run the tests will be shown.

At any rate, to run oversubscribed coarray code with OpenMPI, here are the additional options that need to be passed through the cafrun wrapper to mpirun:

    --oversubscribe
    --hostfile /path/to/host/file

On my Mac laptop with an Intel(R) Core(TM) i7-4850HQ CPU @ 2.30GHz, I have 4 physical cores with 2 logical cores/hyperthreads per physical core: https://ark.intel.com/products/76086/Intel-Core-i7-4850HQ-Processor-6M-Cache-up-to-3_50-GHz. So the hosts file that we generate at CMake configure time looks like:

    $ cat build-dir/hostsfile
    IBB-MBP-PT.local slots=8

It is found in the CMake build directory (prerequisites/builds/opencoarrays... if using install.sh, or wherever you put it if invoking CMake directly); we use CMake to detect the number of logical cores on the build platform and the host name. On a shared-memory desktop/laptop/workstation your hosts file should look like `host.name slots=<num_logical_cores>`. You can just copy it from the CMake build directory.
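If you prefer to write the file by hand on a Linux machine, something like the following should produce an equivalent one-line hosts file (hostname and nproc report the host name and the logical core count; the file name is up to you):

    # Generate a one-line hosts file for the local machine
    echo "$(hostname) slots=$(nproc)" > ./hostfile
    cat ./hostfile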

If you oversubscribe insane quantities of processes per logical core, OpenMPI may still fail. But adding the --hostfile flag (with the path to the hosts file) and the --oversubscribe flag should help get you running.

e.g.:

    cafrun -np 32 --hostfile ./hostfile --oversubscribe ./a.out arg1 arg2

Hope this helps.

-Zaak

michael siehl

Mar 25, 2018, 5:59:51 PM
to OpenCoarrays
Hi Zaak,
thanks a lot for that information.

Michael

Zaak Beekman

Mar 26, 2018, 5:05:47 PM
to OpenCoarrays
FYI Michael and everyone else: We have created a FAQ document at: https://github.com/sourceryinstitute/OpenCoarrays/blob/master/FAQ.md

If you have questions you want answered, please submit them here or as a new issue. Thanks!

michael siehl

Mar 26, 2018, 5:40:39 PM
to OpenCoarrays
Great!