CICE C-grid crash on 1/12 degree tripole #941
Comments
Hi Dave, In the end I don't think it is the remapping that is causing the problem. The velocities are huge: 6.67 m/s is unrealistic. There might even be higher velocities elsewhere. What do you get in ice_std_out for the max velocity? I guess there is an instability developing. It could be related to the coupling with the ocean model.
Hi JF,
Thank you. Yes, for sure those are unrealistic velocities. What I am not
sure about is that this case runs without error if I switch to grid_ocn = 'B'
instead of 'C'. So I was wondering if the large velocities could be related
to the C grid somehow.
I'll need to re-run both cases to get diagnostic output. We turn that off
(set a very large diagfreq) because we found the diagnostics with global
sums can take a fair amount of time in operational runs.
Once I have more details I'll post them. Thank you again!
David
Hi Dave, I am adding @dupontf to the discussion. I think a good candidate for the instability is the explicit treatment of Coriolis for the C-grid (it is implicit for the B-grid). This was already identified as a potential problem in issue #917. We didn't have problems in uncoupled mode but it might be different when coupled to an ocean model. What I don't understand is that we followed Kimmritz et al. 2016, which is probably what is used in the MITgcm. Here is what I propose:
- I will contact Martin Losch about the MITgcm.
- I will talk to @dupontf about this. We will try to come up with a semi-implicit approach (Bouillon et al. 2009).
- Could you please rerun the 29 hours and, if possible, plot the max speed (in ice_std_out) as a function of time?
- To make sure it is not the remapping, could you please do a second run with upwind? You could also add the max speed as a function of time (with upwind) to the previous plot.
Sounds good? jf
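For readers unfamiliar with the distinction, here is a minimal single-point sketch of the two Coriolis treatments, assuming a bare rotation with no friction or rheology; the names and values (f, dte, nsub) are illustrative, not CICE variables:

```python
import numpy as np

# Minimal single-point sketch of explicit vs. semi-implicit Coriolis in
# EVP subcycling. All names and values are illustrative, not CICE's.

f    = 1.4e-4   # Coriolis parameter at high latitude (1/s)
dte  = 30.0     # EVP subcycling step (s); hypothetical value
nsub = 120      # subcycles per dynamics step

def step_explicit(u, v):
    # Explicit Coriolis: the update for u^{k+1} uses v^k.
    return u + dte * f * v, v - dte * f * u

def step_semi_implicit(u, v):
    # Semi-implicit Coriolis (Bouillon et al. 2009 style): solve the
    # 2x2 system with u^{k+1}, v^{k+1} in the Coriolis terms.
    a = dte * f
    return (u + a * v) / (1.0 + a * a), (v - a * u) / (1.0 + a * a)

ue, ve = 1.0, 0.0   # start in free drift, 1 m/s eastward
ui, vi = 1.0, 0.0
for _ in range(nsub):
    ue, ve = step_explicit(ue, ve)
    ui, vi = step_semi_implicit(ui, vi)

# Explicit: speed grows by sqrt(1 + (f*dte)^2) every subcycle -- tiny per
# step, but it compounds over millions of steps without friction.
# Semi-implicit: speed decays by the same factor.
print(np.hypot(ue, ve), np.hypot(ui, vi))
```

Without friction, the explicit update multiplies the speed by sqrt(1 + (f*dte)^2) on every subcycle, so any noise grows; the implicit 2x2 solve divides by the same factor, which is why a Bouillon et al. (2009) style scheme is attractive here.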
Thank you JF, Fred, all involved! All sounds good. I will work on setting up
new runs in the coming days. Sorry it will be a little slow while I am on
travel.
Dave
I emailed Martin Losch. Here is what he wrote: "I haven't had any issues with EVP in a very long time, so I am not quite sure how much I can help here. I always use explicit Coriolis (also for implicit solvers) and this was never a problem other than a timestep constraint (~1 hour or so). The EVP subcycling "timesteps" are so short that I don't expect this ever to be an issue. If this is happening in the nearly free drift of the MIZ, I would expect that there is something happening with the ice-ocean/ice-wind stress. Here I usually use a "semi-implicit" treatment, where I compute the linear drag from cDrag = Cd*abs(ui-uo) (with some minimal drag) and then add the "symmetric parts" (the cosine part when you rotate the drag to model some sort of Ekman spiral) to the LHS. But in a Python version of the EVP solver we tried both implicit and explicit treatment and didn't find too much of a difference (in coarse simulations). I hope that helps."
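A sketch of the semi-implicit drag treatment Martin describes, at a single velocity point: the drag coefficient is lagged (cDrag = Cd*abs(ui-uo) with a floor), and only the cosine (symmetric) part of the rotated stress moves to the left-hand side. All names, constants, and sign conventions are illustrative, not MITgcm or CICE code:

```python
import numpy as np

rhow    = 1026.0          # seawater density (kg/m^3)
Cd      = 5.5e-3          # ice-ocean drag coefficient
theta   = np.deg2rad(25)  # turning angle for the Ekman-spiral rotation
mindrag = 5e-3            # hypothetical floor on the relative speed (m/s)

def drag_step(u, v, uo, vo, m, dt):
    """One velocity update with lagged-coefficient, semi-implicit drag."""
    # Lagged linear drag coefficient: cDrag = rhow*Cd*|u_ice - u_ocn|,
    # floored so the drag never vanishes entirely.
    rel = max(np.hypot(u - uo, v - vo), mindrag)
    cdrag = rhow * Cd * rel
    # Rotated stress: the cos terms multiplying u^{n+1}, v^{n+1} go to
    # the LHS; the sin (antisymmetric) terms stay explicit on the RHS.
    # (Sign of the sin terms depends on hemisphere/convention.)
    lhs = m / dt + cdrag * np.cos(theta)
    rhs_u = m / dt * u + cdrag * (np.cos(theta) * uo
                                  - np.sin(theta) * (vo - v))
    rhs_v = m / dt * v + cdrag * (np.cos(theta) * vo
                                  + np.sin(theta) * (uo - u))
    return rhs_u / lhs, rhs_v / lhs

# Example call with made-up values: 300 kg/m^2 of ice, dt = 600 s.
print(drag_step(0.4, 0.0, 0.1, 0.0, m=300.0, dt=600.0))
```

The key point is that the implicit cosine term only ever increases the left-hand-side coefficient, so this part of the drag can never amplify the velocity regardless of the time step.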
One more comment from Martin: "HYCOM is on a C-grid, right? If CICE was on B then there needed to be some averaging to get the velocities onto B-points and the ice-ocean stress right, whereas now this averaging needs to be different, right? So I would put my money on the ice-ocean stress …"
Well, I am not totally ignoring the possibility of a Coriolis instability since, in the absence of friction, the explicit representation is unconditionally unstable. However, we had a coupling issue in the past in NEMO that is worth mentioning: the feedback between the ice-ocean stress computed in the sea-ice component and passed back to NEMO. NEMO uses a leapfrog scheme, which is stable to friction only if the friction uses the velocity at n-1 when computing the velocity at n+1 (leapfrog on n-1, n, and n+1). However, NEMO was passing the ocean velocity at time n to the ice module; the ice-ocean stress computed from it and passed back to NEMO therefore amounts to friction evaluated at time n, which is unstable in the leapfrog framework... just a thought!
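The leapfrog point is easy to reproduce numerically. A sketch for the scalar friction equation du/dt = -k*u, with arbitrary (non-NEMO) values of k and dt:

```python
# Leapfrog with friction: stable when the friction term uses u^{n-1},
# unconditionally unstable when it uses u^n. Values are arbitrary.

k, dt, nsteps = 1e-4, 600.0, 500

def leapfrog(friction_at_nm1):
    u_nm1, u_n = 1.0, 1.0
    for _ in range(nsteps):
        u_fric = u_nm1 if friction_at_nm1 else u_n
        u_np1 = u_nm1 - 2.0 * dt * k * u_fric
        u_nm1, u_n = u_n, u_np1
    return abs(u_n)

print(leapfrog(True))   # decays: friction evaluated at n-1
print(leapfrog(False))  # grows:  friction evaluated at n, i.e. the
                        # coupled-stress feedback described above
```

With the friction at n-1 the solution decays, but with the friction at n one of the two leapfrog roots has magnitude k*dt + sqrt(1 + (k*dt)^2) > 1 for any k*dt > 0, so the feedback grows regardless of the time step.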
Hi Dave, I have implemented a semi-implicit treatment of the Coriolis term in this branch: https://github.com/JFLemieux73/CICE/tree/semi_imp_Coriolis. I ran gx1 for five days in stand-alone mode. The fields (aice, hi, uvelE, vvelN) are very similar between the standard (explicit) and new (semi-implicit) code.
For what it's worth ... I ran the latest CICE with the nuopc/cmeps driver on a 1 degree tripole grid for 12 months without any crashes. The uvel/vvel maxima look OK. The resolution is quite different but the configuration is similar, i.e. all coupling for ocean and atmosphere is done on the A-grid but CICE is calculating on the C-grid.
Thanks Anton for the info.
I have just run CICE6-MOM6 coupled with JRA55 forcing for two cycles (122 years). CICE6 and MOM6 are both on the C-grid, but the coupling is still done at A-grid points. This was the 2/3 degree tripole grid.
Quick update here. We have started running CESM coupled with the C-grid turned on. This ran fine when we initialized with a "spun-up" ocean and sea ice state. However, when we started the ocean from rest and with CICE having ice_ic = 'none', this blows up very quickly with grid_ice = 'C', but is fine with grid_ice = 'B'. Note that this failure occurs in MOM first. This is an extreme case, but it does highlight some potential edge cases for the C-grid.
This is similar to our experience at this point - I get either errors from exceeding the MOM CFL truncation limit too many times or extreme lows in sea surface height. The speculation was that it was close to the stability limit on the B-grid and the small change caused the failure on the C-grid. I'm still investigating, but there was nothing obviously strange going on in the CICE output.
Hi @dabail10 and @anton-seaice, If I understand correctly, the model starts without ice when ice_ic = 'none', right? When it is about to crash, do you get large sea ice velocities in CICE?
I can't speak for the ACCESS-NRI folks, but our test that failed had ice_ic = 'none' and we do get 6 m/s velocities in the Southern Hemisphere.
As there is very little ice, the rheology term is negligible. It should be easier to debug. You can even make a test to see if it still crashes with Pstar=0 (if kstrength=0) or Cf=0 (if kstrength=1). My guess is that it is related to the air and water stresses. Assuming the sea surface tilt is small (it is supposed to be a small term in the sea ice momentum equation), the momentum balance is aice*tauair = aice*tauwater. With the interpolation we are doing right now, I am not convinced that the aice on the left = the aice on the right. This could cause a crash.
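To illustrate the concern, here is a toy calculation, with made-up grid values, of what can happen when the aice factors multiplying the two stresses are interpolated differently at an E-point and no longer cancel:

```python
import numpy as np

# Toy E-point balance: if tau_air = tau_water exactly, the momentum
# residual is zero only when both terms see the same aice. The grid
# values below are invented for illustration, not from the failing run.

aice_T = np.array([6.4e-4, 5.0e-2])   # T-cell aice at (i,j) and (i+1,j)
tau_air, tau_water = 0.12, 0.12        # equal stresses (N/m^2)

aice_left  = aice_T[0]                      # e.g. taken from one T-cell
aice_right = 0.5 * (aice_T[0] + aice_T[1])  # e.g. averaged to the E-point

residual = aice_left * tau_air - aice_right * tau_water

# With so little ice the mass per unit area m = rhoi*hi*aice is tiny,
# so even a small residual gives a large acceleration.
rhoi, hi = 917.0, 0.1
accel = residual / (rhoi * hi * aice_right)
print(accel)               # ~ -1.3e-3 m/s^2
print(accel * 3600.0)      # several m/s of spurious drift per hour
```

Under these made-up numbers the mismatch alone integrates to a few m/s of drift per hour, the same order as the unrealistic velocities reported above.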
Agreed. This is an extreme test. When we initialize with "realistic" ice conditions it runs fine. I will work on some more debugging of this soon.
I am working on updating to CICE 6.5.0 to enable use of CICE on the C-grid. At the moment we export ocean velocities on the 'A' grid. Before moving the ocean coupling location to the C-grid, I wanted to test running CICE on the C-grid with the ocean on the 'A' grid. After 29 hours I get errors that I think look like what J-F was seeing in his testing. Here is the info from the logs. This is not at the tripole seam (the total grid size is 4500x3527, and this is happening at (2561,3033)). (In email I mentioned the southern hemisphere, but I was mistaken, sorry.) Any thoughts on what else to look for? Thank you for your help!
grid_ice = C
grid_ice_thrm = T
grid_ice_dynu = E
grid_ice_dynv = N
grid_atm = A
grid_atm_thrm = T
grid_atm_dynu = T
grid_atm_dynv = T
grid_ocn = A
grid_ocn_thrm = T
grid_ocn_dynu = T
grid_ocn_dynv = T
New mass < 0, i, j = 6 3034
Old mass = 1.009873647647687E-004
New mass = -1.723111112528535E-007
Net transport = -1.011596758760216E-004
istep1, my_task, iblk, cat = 7552661 426 1 1
(print_state) negative area (ice)
(print_state) istep1, my_task, i, j, iblk: 7552661 426 6
3034 1
(print_state) Global block: 427
(print_state) Global i and j: 2561 3033
(print_state) Lat, Lon (degrees): 58.5665063869671
-89.4533113919817
aice 6.447869072451422E-004
aice0 0.999355213092755
uvel(i,j) 3.33410265094825
vvel(i,j) -2.27173681352247
uvelE(i,j) 6.66673187374646
vvelN(i,j) 0.000000000000000E+000
atm states and fluxes
uatm = 10.7973263899519
vatm = -9.60914979831031
potT = 282.715820312500
Tair = 282.715820312500
Qa = 6.291424389928579E-003
rhoa = 1.23230516910553
swvdr = 150.374064941406
swvdf = 128.892055664062
swidr = 166.485571899414
swidf = 91.2985394287109
flw = 310.546752929688
frain = 4.701963916886598E-005
fsnow = 0.000000000000000E+000
ocn states and fluxes
frzmlt = -1000.00000000000
sst = 5.41451142398993
sss = 30.1864523562483
Tf = -1.63006842723741
uocn = 7.606549328525385E-002
vocn = -0.499165868790315
strtltxU= 0.000000000000000E+000
strtltyU= 0.000000000000000E+000
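Two quick checks on the numbers in this log. First, the abort is simple arithmetic: the remap moved slightly more mass out of the cell than it held. Second, a velocity of 6.67 m/s puts the run near or over the advective CFL limit for plausible dynamics time steps; the grid spacing estimate and dt values below are assumptions, not values from this run:

```python
import numpy as np

# The printed "New mass" is just Old mass + Net transport going negative:
old_mass      =  1.009873647647687e-04
net_transport = -1.011596758760216e-04
print(old_mass + net_transport)   # -1.7231e-07, matching the log

# Advective CFL at the reported point, assuming a 1/12 degree spacing
# scaled by cos(latitude); the dt values are guesses, not the run's.
uvelE = 6.66673187374646                               # m/s, from print_state
dx = (111.32e3 / 12.0) * np.cos(np.deg2rad(58.5665))   # ~4.8 km
for dt in (300.0, 600.0, 900.0):
    print(dt, uvelE * dt / dx)    # ~0.41, ~0.83, ~1.24
```

So even before the negative-mass abort, speeds of this size are enough to stress the remapping at this resolution, consistent with the instability hypothesis discussed above rather than a transport bug.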