
Modernize MPI interface #363

Closed
phil-blain opened this issue Sep 18, 2019 · 14 comments

phil-blain (Member) commented Sep 18, 2019

I just noticed that the CICE code uses two (very) old ways of accessing MPI functionality, namely a preprocessor 'include' directive:

#include <mpif.h>

(in cicecore/cicedynB/infrastructure/comm/mpi/ice_reprosum.F90 and cicecore/cicedynB/infrastructure/comm/serial/ice_reprosum.F90 )

or a Fortran 'include' statement:

include 'mpif.h'

(everywhere else)

The mpif.h header is strongly discouraged by the MPI standard and will be deprecated in the future; the more modern way is to use a Fortran 'use' statement:

use mpi 
! or 
use mpi_f08 ! (the only way that is compatible with the Fortran standard)

Reference: https://www.mpi-forum.org/docs/mpi-3.1/mpi31-report/node408.htm

I think making this change would make CICE more future-proof.
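As an illustration only (the routine and variable names below are hypothetical, not taken from the CICE sources), the change is typically a one-line swap in each affected compilation unit. Note that a 'use' statement must precede 'implicit none' and all declarations, whereas include 'mpif.h' sits inside the specification part:

```fortran
! Before: legacy header pulls in MPI constants with no interface checking
subroutine broadcast_scalar_legacy(value, root)
   implicit none
   include 'mpif.h'
   real(kind=8), intent(inout) :: value
   integer,      intent(in)    :: root
   integer :: ierr
   call MPI_Bcast(value, 1, MPI_DOUBLE_PRECISION, root, MPI_COMM_WORLD, ierr)
end subroutine broadcast_scalar_legacy

! After: the mpi module provides the same names plus explicit interfaces,
! so argument mismatches are caught at compile time
subroutine broadcast_scalar_module(value, root)
   use mpi
   implicit none
   real(kind=8), intent(inout) :: value
   integer,      intent(in)    :: root
   integer :: ierr
   call MPI_Bcast(value, 1, MPI_DOUBLE_PRECISION, root, MPI_COMM_WORLD, ierr)
end subroutine broadcast_scalar_module
```

Because the mpi module exposes the same integer handles and named constants as mpif.h, the call sites themselves should not need to change in this step.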

phil-blain (Member, Author) commented

As an aside, I'm starting to work on coupling CICE6 with NEMO for our systems, and the #include <mpif.h> statement in ice_reprosum.F90 makes the NEMO build system fail (it is an [unnecessarily] complex, black-box build system).
As a first step I propose changing just cicecore/cicedynB/infrastructure/comm/mpi/ice_reprosum.F90 and cicecore/cicedynB/infrastructure/comm/serial/ice_reprosum.F90 from a preprocessor directive to a Fortran include statement. @apcraig would you be open to that? I can make a PR.

apcraig (Contributor) commented Sep 19, 2019

That would be good. Have you verified that will fix your problem in NEMO?

phil-blain (Member, Author) commented

Yes I did, and it does. However, I have another problem; should I open a new issue?

apcraig (Contributor) commented Sep 19, 2019

I just merged #365. If there are other issues, feel free to add them to this one or open a new issue, whatever you prefer.

phil-blain (Member, Author) commented

I'd vote for reopening, since the code modernization issue was not addressed. Maybe we can talk about it at the next teleconference?

phil-blain (Member, Author) commented

In #389 we switched to the mpi module. We should investigate using the mpi_f08 module as this is the way forward.

Basically, in mpi_f08 handles are no longer integers but derived types, so I don't think the code changes would be that big.
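A hedged sketch of what that conversion looks like (hypothetical routine, not from CICE): under mpi_f08, communicators, datatypes, requests, and other handles become derived types such as type(MPI_Comm), and the ierror argument becomes optional, but the call sites stay nearly identical:

```fortran
subroutine broadcast_scalar_f08(value, root)
   use mpi_f08
   implicit none
   real(kind=8), intent(inout) :: value
   integer,      intent(in)    :: root
   type(MPI_Comm) :: comm   ! was: integer :: comm  under 'use mpi'
   integer :: ierr          ! the ierror argument is optional in mpi_f08
   comm = MPI_COMM_WORLD    ! MPI_COMM_WORLD is itself a type(MPI_Comm)
   call MPI_Bcast(value, 1, MPI_DOUBLE_PRECISION, root, comm, ierr)
end subroutine broadcast_scalar_f08
```

The mechanical part is changing handle declarations; mixing typed handles with integer handles across module boundaries is what would need care in a codebase-wide conversion.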

apcraig (Contributor) commented Dec 19, 2019

I did a bit of research about mpi_f08 the other day; support depends on the MPI library we're building with, per compiler and per machine. Before we move forward with mpi_f08, we should have confidence that the installed versions of MPI on just about any machine/setup the community might use will support mpi_f08. This is the kind of situation where we want to support the greatest common implementation.
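One lightweight way to establish that confidence per machine (a hypothetical configure-style probe, not part of the CICE build) is to try compiling a trivial program with the site's MPI compiler wrapper, e.g. mpif90 -c conftest_f08.F90 (the wrapper name varies by MPI distribution). If it compiles, the installed MPI provides a usable mpi_f08 module:

```fortran
! conftest_f08.F90 (hypothetical probe file name)
! Compiles only if the MPI installation ships the mpi_f08 module.
program conftest_f08
   use mpi_f08
   implicit none
   type(MPI_Comm) :: comm
   comm = MPI_COMM_WORLD   ! exercises the typed-handle API
end program conftest_f08
```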

eclare108213 (Contributor) commented

Is there more to discuss on this issue, or is it ready to be closed? @phil-blain @apcraig

phil-blain (Member, Author) commented

My opinion is that in a few years (maybe more) we'll eventually want to switch to the mpi_f08 module, so I would leave this issue open just as a reminder.

apcraig (Contributor) commented May 13, 2020

I think my preference would be to close this for now. When we are ready to update to mpi_f08, we can open another issue. I think if we have open issues for things we might do in several years, it's going to be difficult to sort out active issues from inactive issues. Maybe we can create a new label called "Type: Inactive" and close this. That label would allow us to search for these kinds of things in all issues later. Just an idea.

eclare108213 (Contributor) commented

I like that idea. There are several other issues that might get that label.

apcraig (Contributor) commented May 13, 2020

I have added an Inactive label to both CICE and Icepack that we can use. I have attached it to this issue, but have not yet closed it. If @phil-blain agrees that using this label and closing these issues for the time being is reasonable, I think we could probably take the same approach with a few other issues as well.

phil-blain (Member, Author) commented

I think that's a good idea.

apcraig (Contributor) commented Jun 3, 2020

Just to add a quick note: we have added a section to the Software Development Practices wiki page summarizing a policy regarding this issue: https://github.com/CICE-Consortium/About-Us/wiki/Software-Development-Practices.
