The OASIS Coupler Forum

Map Grid Size mismatch error

Posted by Anonymous on June 6, 2023

Hi,

I have a coupled NEMO-WAVEWATCH model. When I ran the coupled model, I got this error:

------------------------------------------------------------------------
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
 MCT::m_ExchangeMaps::ExGSMapGSMap_:: MCTERROR, Grid Size mismatch
 MCT::m_ExchangeMaps::ExGSMapGSMap_:: MCTERROR, Grid Size mismatch
 LocalMap Gsize =            0  RemoteMap Gsize =        13189
MCT::m_ExchangeMaps::ExGSMapGSMap_: Map Grid Size mismatch error, stat =3
012.MCT(MPEU)::die.: from MCT::m_ExchangeMaps::ExGSMapGSMap_()
application called MPI_Abort(MPI_COMM_WORLD, 2) - process 18
 LocalMap Gsize =        13189  RemoteMap Gsize =            0
MCT::m_ExchangeMaps::ExGSMapGSMap_: Map Grid Size mismatch error, stat =3
000.MCT(MPEU)::die.: from MCT::m_ExchangeMaps::ExGSMapGSMap_()
application called MPI_Abort(MPI_COMM_WORLD, 2) - process 0
------------------------------------------------------------------------

Could anyone tell me what is wrong and how to solve it? Thanks.

Rui

Posted by Anonymous on June 6, 2023

Hi Rui,
Something is obviously wrong in the definition of your coupling grids. One grid seems to be defined with 13189 points, but the other is not defined at all.
You can check sections 2.2.4 and 5.1 of the User Guide for the definition of the coupling grids.
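
If your models write the coupling grids from the code rather than reading a pre-generated grids.nc, the calling sequence should look roughly like the sketch below (a minimal sketch only; the grid name, sizes and array contents are illustrative, not your actual setup):
-------------------------------------------------------------------
program write_grids
  ! Minimal sketch: writing one coupling grid to grids.nc/masks.nc.
  ! Only the master process of the component should do this.
  use mod_oasis
  implicit none
  integer, parameter :: nx = 115, ny = 99, nc = 4
  integer :: comp_id, il_flag, ierror
  real(kind=8) :: lon(nx,ny), lat(nx,ny)          ! cell centres
  real(kind=8) :: clon(nx,ny,nc), clat(nx,ny,nc)  ! cell corners
  integer :: mask(nx,ny)                          ! 1 = masked, 0 = active

  call oasis_init_comp(comp_id, 'oceanx', ierror)
  ! ... fill lon, lat, clon, clat and mask here ...
  call oasis_start_grids_writing(il_flag)
  call oasis_write_grid('to12', nx, ny, lon, lat)
  call oasis_write_corner('to12', nx, ny, nc, clon, clat)
  call oasis_write_mask('to12', nx, ny, mask)
  call oasis_terminate_grids_writing()
  call oasis_terminate(ierror)
end program write_grids
-------------------------------------------------------------------
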
 Let me know if this helps,
  Sophie

Posted by Anonymous on June 7, 2023

Hi Sophie,

I agree. It seems that the grid of model1 (115*99) is not identified correctly. I checked the grid file and the namcouple but couldn't find where the error comes from.

This is the content of the coupling grid file:
-------------------------------------------------------------------
ncdump -h grids.nc
netcdf grids {
dimensions:
        lon1 = 115 ;
        lat1 = 99 ;
        crn1 = 4 ;
        lon2 = 121 ;
        lat2 = 109 ;
        crn2 = 4 ;
variables:
        float to12.lon(lat1, lon1) ;
        float to12.lat(lat1, lon1) ;
        float to12.clo(crn1, lat1, lon1) ;
        float to12.cla(crn1, lat1, lon1) ;
        float uo12.lon(lat1, lon1) ;
        float uo12.lat(lat1, lon1) ;
        float uo12.clo(crn1, lat1, lon1) ;
        float uo12.cla(crn1, lat1, lon1) ;
        float vo12.lon(lat1, lon1) ;
        float vo12.lat(lat1, lon1) ;
        float vo12.clo(crn1, lat1, lon1) ;
        float vo12.cla(crn1, lat1, lon1) ;
        float ww3t.lon(lat2, lon2) ;
        float ww3t.lat(lat2, lon2) ;
        float ww3t.clo(crn2, lat2, lon2) ;
        float ww3t.cla(crn2, lat2, lon2) ;
}
-------------------------------------------------------------------

And this is the namcouple:
-------------------------------------------------------------------
 $NFIELDS
        3 #15
 $END
#
 $NBMODEL
     2     oceanx wwatch  
 $END
#
 $RUNTIME
      86400
 $END
#
 $NLOGPRT
 0  0
 $END
#
 $STRINGS
#
O_OCurxw WW3_OSSU 1  3600 2 ocean.nc EXPORTED
115 99 121 109 uo12 ww3t LAG=+900 
P  2  P  0
LOCTRANS SCRIPR 
AVERAGE
DISTWGT LR SCALAR LATLON 1 8
#
O_OCuryw WW3_OSSV 1  3600 2 ocean.nc EXPORTED
115 99 121 109 vo12 ww3t LAG=+900 
P  2  P  0
LOCTRANS SCRIPR 
AVERAGE
DISTWGT LR SCALAR LATLON 1 8
#
WW3__BHD O_Bhd 1  3600 2 waves.nc EXPORTED
121 109 115 99 ww3t to12 LAG=+3600
P  0  P  2
LOCTRANS SCRIPR
INSTANT
DISTWGT LR SCALAR LATLON 1 8
#
 $END
-------------------------------------------------------------------
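
For reference, here is how I read the first entry, following the field description in the User Guide (the annotations are my own understanding, so please correct me if I got something wrong):
-------------------------------------------------------------------
# O_OCurxw WW3_OSSU 1 3600 2 ocean.nc EXPORTED
#   source name, target name, field index (kept for compatibility),
#   coupling period (s), number of transformations, restart file, status
# 115 99 121 109 uo12 ww3t LAG=+900
#   source dimensions (115x99), target dimensions (121x109),
#   source grid uo12, target grid ww3t, source lag of +900 s
# P 2 P 0
#   grid characteristics of source and target: P = periodic or
#   R = regional, each followed by the number of overlapping points
# LOCTRANS SCRIPR
#   the transformations, each taking one parameter line below
# AVERAGE
#   LOCTRANS: average of the field over the coupling period
# DISTWGT LR SCALAR LATLON 1 8
#   SCRIPR: distance-weighted remapping with 8 neighbours
-------------------------------------------------------------------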

Rui

Posted by Anonymous on June 7, 2023

Hi Rui,

Could you send your grids.nc, masks.nc and areas.nc files to oasishelp@cerfacs.fr so I can test them with a toy model?
Thanks,
Laure

Posted by Anonymous on June 12, 2023

Hi Laure,

I have sent the grids.nc, masks.nc, namcouple and the output file nout.000000 to oasishelp@cerfacs.fr. I did not create areas.nc because I only use SCRIPR remapping (maybe I was wrong?).

Thanks,
Rui

Posted by Anonymous on June 12, 2023

Hi Rui,

Yes, I got the files. I will have a look at them today and let you know.
Best regards,
Laure

Posted by Anonymous on June 12, 2023

Hi Rui,

Could you try to run your coupled model using the following lines in your namcouple:
R 0 R 0
instead of
P 2 P 0
and
P 0 P 2

and let me know? This line describes the source and target grid characteristics: P means periodic and R means regional, each followed by the number of overlapping grid points. Since your lon and lat are regional, the grids should be declared with R.

Thanks,
Best regards,
Laure

Posted by Anonymous on June 23, 2023

Hi Laure,

I replaced P 2 P 0 and P 0 P 2 with R 0 R 0, but I still got the same error:

$ mpirun -np 4 ./ww3_shel : -np 4 ./opa
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
oasis_init_comp: Calling MPI_Init
 MCT::m_ExchangeMaps::ExGSMapGSMap_:: MCTERROR, Grid Size mismatch
 LocalMap Gsize =            0  RemoteMap Gsize =        13189
MCT::m_ExchangeMaps::ExGSMapGSMap_: Map Grid Size mismatch error, stat =3
000.MCT(MPEU)::die.: from MCT::m_ExchangeMaps::ExGSMapGSMap_()
 MCT::m_ExchangeMaps::ExGSMapGSMap_:: MCTERROR, Grid Size mismatch
 LocalMap Gsize =        13189  RemoteMap Gsize =            0
MCT::m_ExchangeMaps::ExGSMapGSMap_: Map Grid Size mismatch error, stat =3
004.MCT(MPEU)::die.: from MCT::m_ExchangeMaps::ExGSMapGSMap_()
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 4 in communicator MPI_COMM_WORLD
with errorcode 2.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[login:29782] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[login:29782] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Best regards,
Rui

Posted by Anonymous on June 28, 2023

Hi Rui,

Can you tell me which partition you use for each of your grids to12 and ww3t (apple, box, or orange)?

Could you send me the values of the decomposition arrays ig_paral for each grid (see https://www.cerfacs.fr/oa4web/oasis3-mct_5.0/oasis3mct_UserGuide/node15.html in the documentation)?
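
For reference, here is a minimal sketch of what ig_paral contains for each partition type (the offsets and extents below are illustrative values, not your actual decomposition). Note that the local partition sizes summed over all processes of a component must add up to the global grid size; if they do not, you can end up with exactly this kind of size mismatch:
-------------------------------------------------------------------
! Illustrative ig_paral contents (made-up offsets and extents);
! oasis_def_partition is called between oasis_init_comp and oasis_enddef.
integer :: part_id, ierror

! APPLE (ig_paral(1) = 1): global offset and number of points
! of the local segment
integer, dimension(3) :: ig_paral_apple = (/ 1, 0, 6105 /)

! BOX (ig_paral(1) = 2): global offset of the upper-left corner of
! the box, local extent in x, local extent in y, global extent in x
integer, dimension(5) :: ig_paral_box = (/ 2, 0, 115, 50, 115 /)

! ORANGE (ig_paral(1) = 3): number of segments, then a pair
! (global offset, extent) for each segment
integer, dimension(6) :: ig_paral_orange = (/ 3, 2, 0, 121, 6534, 121 /)

call oasis_def_partition(part_id, ig_paral_box, ierror)
-------------------------------------------------------------------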
Thanks,
Laure

Posted by Anonymous on August 28, 2023

Hi Laure,

Sorry for my late reply.

I just found that the partition for to12 is box (paral(1) = 2), while the partition for ww3t is orange (paral(1) = 3). Is it OK to use different partitions in different models?

Best regards,
Rui

Posted by Anonymous on August 29, 2023

Yes, of course, you can have different partitions for the different models. This is not the problem. However, something must be wrong in the way you specify your partition with oasis_def_partition in one of your codes. See section 2.2.3 of the User Guide for details.
You could also try to reproduce the problem with a toy model (i.e. not your real model but just an "empty" program that reproduces your coupling exchanges with your grids and partitions) and send the toy to us. Please have a look at examples/spoc/spoc_communication for an example of such a toy that you could use as a basis.
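
Something like the minimal skeleton below would be enough for one side of the exchange (a sketch only; the component and field names must match your namcouple, and the single-process serial partition is just the simplest case):
-------------------------------------------------------------------
program toy_ocean
  ! Minimal single-process toy component (sizes are placeholders);
  ! a mirror component would do the matching oasis_get.
  use mod_oasis
  implicit none
  integer, parameter :: nx = 115, ny = 99
  integer :: comp_id, local_comm, part_id, var_id, ierror, itime
  integer :: var_nodims(2)
  real(kind=8) :: field(nx,ny)

  call oasis_init_comp(comp_id, 'oceanx', ierror)
  call oasis_get_localcomm(local_comm, ierror)

  ! SERIAL partition: the whole 115x99 grid on one process
  call oasis_def_partition(part_id, (/ 0, 0, nx*ny /), ierror)

  var_nodims = (/ 2, 1 /)   ! field rank, number of bundled fields
  ! OASIS3-MCT 5.0 interface; older versions take an extra shape argument
  call oasis_def_var(var_id, 'O_OCurxw', part_id, var_nodims, &
                     OASIS_Out, OASIS_Real, ierror)
  call oasis_enddef(ierror)

  field = 1.0d0
  do itime = 0, 86400 - 3600, 3600   ! one call per coupling period
     call oasis_put(var_id, itime, field, ierror)
  end do

  call oasis_terminate(ierror)
end program toy_ocean
-------------------------------------------------------------------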
  With best regards,
 Sophie

Posted by Anonymous on September 25, 2023

Hi Sophie,

I did a test with a modified spoc_communication: I shrank the atmos and ocean meshes from global to regional, but I still got the error. Why is this happening?

I also sent an email to oasishelp@cerfacs.fr with all the files attached.

----------------------------------------------------------
namcouple:

 $NFIELDS
  14400
 $NLOGPRT
  30  0
 $STRINGS
FIELD_SEND_OCN FIELD_RECV_ATM 1 3600  1  fdocn.nc EXPOUT
111 111 62 61 torc  lmdz
R 0 R 0
SCRIPR 
BILINEAR LR SCALAR LATLON 1
FIELD_SEND_ATM FIELD_RECV_OCN  1 7200  1  fdatm.nc EXPOUT
62 61 111 111 lmdz torc
R 0 R 0
SCRIPR
BILINEAR LR SCALAR LATLON 1
----------------------------------------------------------
$ cat ocean.out_100
 -----------------------------------------------------------
 I am ocean process with rank :           0
 in my local communicator gathering            2 processes
 ----------------------------------------------------------
 Local partition definition
 il_extentx, il_extenty, il_size, il_offsetx, il_offsety, il_offset =
         111          55        6105           0           0           0
 ig_paral =            1           0        6105
 grid_lat_ocean maximum and minimum   8.02665502955887
  -57.1574302538470
 var_id FRECVOCN, var_id FSENDOCN           1           2
 End of initialisation phase
 Timestep, field min and max value

----------------------------------------------------------
$ cat atmos.out_100
 -----------------------------------------------------------
 I am atmos process with rank :           0
 in my local communicator gathering            2 processes
 ----------------------------------------------------------
 Local partition definition
 il_extentx, il_extenty, il_size, il_offsetx, il_offsety, il_offset =
          62          30        1860           0           0           0
 ig_paral =            1           0        1860
 grid_lat_atmos maximum and minimum   8.87324142456055
  -64.6478881835938
 var_id FRECVATM, var_id FSENDATM           1           2
 End of initialisation phase
 Timestep, field min and max value

----------------------------------------------------------
netcdf atmos_mesh {
dimensions:
        crn_lmdz = 4 ;
        y_lmdz = 61 ;
        x_lmdz = 62 ;
variables:
        double cla(crn_lmdz, y_lmdz, x_lmdz) ;
        double clo(crn_lmdz, y_lmdz, x_lmdz) ;
        int imask(y_lmdz, x_lmdz) ;
        double lat(y_lmdz, x_lmdz) ;
        double lon(y_lmdz, x_lmdz) ;
        double srf(y_lmdz, x_lmdz) ;
----------------------------------------------------------
netcdf ocean_mesh {
dimensions:
        crn_torc = 4 ;
        y_torc = 111 ;
        x_torc = 111 ;
variables:
        double cla(crn_torc, y_torc, x_torc) ;
                cla:long_name = "Latitudes of grid cell corners of torc" ;
                cla:units = "degree_N" ;
        double clo(crn_torc, y_torc, x_torc) ;
                clo:long_name = "Longitudes of grid cell corners of torc" ;
                clo:units = "degree_E" ;
        int imask(y_torc, x_torc) ;
        double lat(y_torc, x_torc) ;
                lat:long_name = "Latitudes of torc" ;
                lat:units = "degree_N" ;
        double lon(y_torc, x_torc) ;
                lon:long_name = "Longitudes of torc" ;
                lon:units = "degree_E" ;
        double srf(y_torc, x_torc) ;
                srf:long_name = "Areas of torc" ;
                srf:units = "m2" ;

----------------------------------------------------------
netcdf fdatm {
dimensions:
        j = 61 ;
        i = 62 ;
variables:
        double FIELD_SEND_ATM(j, i) ;
----------------------------------------------------------
netcdf fdocn {
dimensions:
        j = 111 ;
        i = 111 ;
variables:
        double FIELD_SEND_OCN(j, i) ;
        double torc.lat(j, i) ;
        double torc.lon(j, i) ;

Many thanks,
Rui