Technical Working Group Meeting, September 2018


Date: 11th September 2018

  • Marshall Ward (MW) (Chair), NCI
  • Aidan Heerdegen (AH) and Andrew Kiss (AK), CLEX ANU
  • Russ Fiedler (RF), Matt Chamberlain (MC), CSIRO Hobart
  • Nic Hannah (NH) Double Precision
  • Peter Dobrohotoff (PD), CSIRO Aspendale

Clean up Actions list


  • Incorporate RF wave mixing update into MOM5 codebase + bug fix (AH)
  • Code harmonisation updates to ACCESS and ESM meetings (PD, RF)
  • Check red sea fix timing is absolute, not relative (AH)
  • MW liaise with AK about tenth model hangs (AK, MW)
  • Profile ACCESS-OM2-01 (MW)


  • Follow up with Andy Hogg regarding shared codebase (MW)
  • Nudging code test case (RF)


MW: 4 block success. 16 block didn’t work. sectrobin also didn’t work. Limited perspective on problem.

RF: Blow out in time with extra blocks was halo updates. Weakness with round robin: a lot of overhead, no local comms. Maybe 8 tiles/processor might work. Marshall’s profiling showed a small number of processors dominated run time. Want to minimise the maximum; that is the limiter.

AH: Where are the max tiles?

RF: Seasonal ice near Hudson Bay, Sea of Okhotsk and Aleutian Islands.

MW: Nic used total CPU count less than number of blocks

RF: Could run with more, or less. MW: 80 CPUs less, could solve this.

AH: General strategy to concentrate on not assigning CPUs to the low work (blue areas) and let the high work areas take care of themselves?

RF: Only worried about slowest tile. Nice to have even distribution, but hard to achieve that in practice.

AH: Slowest tiles change over time RF: read in a map of expected ice concentration. Or have a heuristic, say weight by latitude. AH: If identify areas that do very little work, say never want to have many processors there, and free up processors for high work areas.

AK: There are five hot stripes and four cold stripes. Some processors have 5 blocks, some have 4. The outlying busiest ranks are on those hot stripes. If we get rid of striping with a more even split, that would leave maybe a spike on a lower baseline.

RF: About half the processors have 5 blocks and about half have 4; request a few more PEs and that would come close to balancing this issue.

NH: First attempt was 1600 PEs with an even 4 blocks across all. With the idealised test case the ocean was not blocking at all. Thought we could save a couple of hundred PEs, and there was not a big difference. However Andrew’s real world config is behaving differently. Worth going back up to 1600 and doing an even 4 or even 8 blocks. Assumed we wanted everything to be even; it seemed roughly the same to have a mix. This profiling shows I was wrong.

RF: Can easily work out to get exactly 5 blocks per PE. AK: If you give me that number I can try it. NH: 5 across the board is better. Don’t want a single PE doing more work. RF: Slowest one kills you.
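RF’s point that the slowest PE sets the pace can be made concrete with a small sketch. This is an illustration only, in Python rather than the model’s Fortran; the block and PE counts below are hypothetical, not taken from the actual CICE configuration:

```python
# Illustrative sketch of a round-robin block distribution, CICE-style.
# All numbers are hypothetical; the real decomposition is in Fortran.

def blocks_per_pe(nblocks, npes):
    """Round-robin: each PE gets floor(nblocks/npes) blocks,
    and the first (nblocks mod npes) PEs get one extra."""
    base, extra = divmod(nblocks, npes)
    return [base + 1 if i < extra else base for i in range(npes)]

# e.g. 6380 blocks over 1385 PEs: a mix of 4- and 5-block PEs,
# so the 5-block PEs set the run time for everyone.
counts = blocks_per_pe(6380, 1385)
print(min(counts), max(counts))  # mix of 4 and 5

# Choosing a PE count that divides the block count evenly gives
# exactly 5 blocks everywhere, removing that imbalance.
npes_even = 6380 // 5
print(set(blocks_per_pe(6380, npes_even)))  # {5}
```

The same arithmetic is why requesting a slightly different PE count can even out the 4/5 split AK and RF discuss above.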

AH: How does the land masking affect it? A thicker stripe in NH? RF: Yes. Did I post a picture of where tiles are allocated? NH: More blocks means getting rid of more land? RF: Lose with communication cost.

NH: In order to get this working I ran into the raijin problem: messages getting lost and deadlocks. When we got 0.1 deg MOM-SIS working we had issues with point to point sends and recvs, and Marshall changed that to a proper gather to get initialisation working. The gather inside CICE is implemented with point to point sends and recvs; assume it is similar. It is doing a send for every block. MW: Andrew’s finished ok? AK: Ran with 30×35. MW: mxm might resolve this problem? NH: Resolved by putting in a barrier after all the sends, otherwise it deadlocks. MW: Did you add barriers? NH: Yes, to the MPI gather code. MW: Clear that CICE is heavily barriered. NH: Could implement properly with MPI_gather. MW: Caveat, it didn’t work with the global field. NH: Only does a global gather once, when writing out restarts. Not too bad. MW: A lot of MPI ranks? NH: 1600 x number of blocks is the number of sends. MW: So number of messages, not number of ranks. MW: Only added barrier for restart? NH: Could have done that, but added in MPI_gather. Maybe that is bad? Actually didn’t add it, just enabled it by defining a preprocessor flag.

AH: Is there an effect where the stripes get wider in the north, so you’re sampling more ice in those areas?

AH: Should we pull out the slowest blocks and see where all the blocks are that contribute to the slowest processors?

RF: Correspond to areas of highest ice concentration. AH: There is ice in Okhotsk in northern summer? RF: Yes.

MW: Arctic and Antarctic are sharing work. RF: How many for this run? MW: 1385. RF: If you run with 1500 or so you get an even distribution.

NH: Should decide what is the next step/run.

MW: Two options: massively increase the number of blocks, but this blows out comms time; or an even division at 5 blocks. RF: Yes, that is the one to do next.

AK: sectrobin should solve the communications issue but couldn’t get it to run. NH: Not sure if code needs to change? RF: Test on 1 degree model.

AK: First step to even up current run with 4 or 5 blocks. MW: Should confirm that many blocks is a comms problem and not a tripole issue for example. But this is a research problem.

AK: Will switch to this for 0.1 deg production as it is already better.

NH: New code with 1 block per PE gives identical answers to old code. 4 blocks does not give identical answers to old code. Not sure if I should expect it to be the same; don’t know how CICE works. In terms of coupling it should be the same whether you’re coupling to individual blocks or multiple blocks. Not ruling out that it should be identical and there is something going wrong. AK: What would make it non-identical? Order of summation? NH: Could be something like that. MW: Might be CICE doing a layer calc before doing vertical? Have to know more about CICE. NH: Might be worth looking into further so at least we know that we’re not making bugs.

AK: How would I switch to this for the production run? Not bitwise identical? Just check fields look physically reasonable? NH: Hard problem. Can’t see physical difference. Only looking at last few bits of a floating point number. MW: Did an MPI sum on a single rank and it changed the last bit. Found it running the FMS diagnostics and that is why they failed. Don’t fail at GFDL. Scary stuff. NH: Scary and time consuming.
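The reproducibility concern here comes down to floating-point addition not being associative: sum the same block contributions in a different order and the last bits can change, which is exactly what shows up in MW’s single-rank MPI sum anecdote. A minimal standalone illustration (not from the model code):

```python
# Floating-point addition is not associative: regrouping the same
# three values changes the result, because intermediate sums round.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # cancellation happens first, exactly: 0.0 + 1.0
right = a + (b + c)  # 1.0 is absorbed by -1e16 before cancellation

print(left, right)   # 1.0 0.0
assert left != right
```

Block-order changes in a parallel sum are just this effect at scale, which is why a 4-blocks-per-PE run can differ from the old code in the last bits without either being wrong.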

MW: Clear strategy. Get rid of bands. Go with 1600 cores. Have a 16 block job running, will keep everyone updated.

Code Harmonisation

AH: My understanding with the ESM harmonisation is that we’re close, as we haven’t yet put in the coupling changes from CM2 that you had to take out of the ESM code. PD: Dave Bi’s iceberg scheme? AH: If we get the WOMBAT code into MOM5 that would be harmonised I think. PD: Maybe Matt has a better handle?

MC: Are the OM and CM almost harmonised except for iceberg information? Are they almost the same? AH: I believe so. Once we get WOMBAT in there we’re good to go. Russ had a different idea about how to handle the case of different coupling fields.

RF: Have to get rid of ACCESS keyword. In many cases redundant. AH: ACCESS keyword can be replaced by ACCESS_CM or ACCESS_OM. RF: Yes!

RF: On CICE side of things (and probably MOM) coupling fields are currently defined as parameters. Can use calls to PRISM, test return code, put some tests for legal code/parameters for icebergs for example. Don’t need ifdef’s, can test on the fly. A lot easier than recompiling every time.
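RF’s suggestion, translated out of Fortran into a small Python analogy (the field names, the `prism_def_var` stand-in, and the return-code convention are all invented here for illustration): instead of compiling separate binaries behind preprocessor ifdefs, the model asks the coupler at startup which fields are actually defined and branches on the result.

```python
# Hypothetical analogy to RF's suggestion: test coupler return codes
# at run time instead of selecting code paths with #ifdef's.
# Field names and the DEFINED/UNDEFINED convention are invented.

DEFINED, UNDEFINED = 0, -1

# Stand-in for the coupler's registry of configured coupling fields.
COUPLING_FIELDS = {"u_star", "heat_flux", "iceberg_calving"}

def prism_def_var(name):
    """Mimic a coupler call that returns a code rather than aborting."""
    return DEFINED if name in COUPLING_FIELDS else UNDEFINED

# One executable covers both configurations: optional code paths
# (e.g. icebergs) run only when the coupler says the field exists.
use_icebergs = prism_def_var("iceberg_calving") == DEFINED
use_waves = prism_def_var("wave_height_10m") == DEFINED

print(use_icebergs, use_waves)  # True False
```

The payoff RF describes is exactly this: legality checks happen on the fly against the coupling configuration, with no recompile needed to switch between CM- and OM-style field sets.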

AH: How do we implement this? Put WOMBAT code in now so we have an ESM harmonised version and then deal with coupling etc as this is ACCESS-CM? RF: Want to bed down ACCESS-CM and OM harmonised first. The WOMBAT stuff will move in quite simply. I’d like to take that on, have been tasked to do this to take some of the load off Matt. Get this first step out of the way and then move on to WOMBAT and ESM. Until the first step done things can be in a state of flux.

MC: Is wind-enhanced mixing in ACCESS-OM? RF: Yes. MC: FAFMIP in ACCESS-OM? RF: They’re in MOM5. MC: They weren’t in the ACCESS-CM code. AH: That is a 3 year old fork. MC: Can we update ESM from ACCESS-OM? AH: This morning I was putting the WOMBAT changes into a MOM5 pull request. Can grab and check if it works. MC: What is the difference in pulling from one direction to the other? AH: ESM is a 3 year old fork with little history in common with current MOM. Merging current code into ESM would be too difficult. Cherry picked your changes into the MOM5 code, but it wouldn’t work the other way. Will liaise with Russ to get ACCESS-CM changes.

AH: Would WOMBAT always be part of MOM5-SIS? MW: Is it big? RF: No, very small. MW: Let’s leave it in MOM5; just executable bloat. RF: Just a few fields. MC: Allocated, so if not turned on then no issues. RF: WOMBAT wants the 10m waves, but we need that for the wave mixing as well.

Travis CI on MOM5

AH: ACCESS-OM no longer compiles because you need libaccessom2 as well. NH: Same before, always needed OASIS. AH: I’ve got CM compiling by pulling in OASIS and building it. All the compilation tests are passing. Could pull in libaccessom2 and compile in a similar way to ACCESS-CM. There is no old ACCESS-OM build anymore; it is ACCESS-OM2. MW: Do we want to do this external to the repo? AH: Nice to have the tests there and passing. OM now has different driver code to CM, so can’t be sure you’ve done it properly without an ACCESS-OM compilation test. NH: There always needs to be a dependency on a coupler. libaccessom2 is more than a coupler; maybe some of it is undesirable, but not worse than having a dependency on OASIS. AH: Just wanted to make sure there wasn’t an ACCESS-OM that was independent of libaccessom2. MW: Can you provide libaccessom2 as a binary and headers? AH: Yes, that is a possibility. NH: Could just be a .a file. MW: That is how you handle dependencies, as a binary, like libc. MW: Do you call OASIS in MOM? NH: Yes. In yatm we don’t directly call OASIS, so we could change the coupler in future without changing models. MW: No problem with wrapping OASIS. AH: Can do the same thing I did with CM: pull in OASIS and build it. Pretty straightforward.



  • Create even 5 blocks per PE map for CICE (RF)
  • Get coupling changes into MOM for harmonisation (RF+AH)


  • Update model name list and other configurations on OceansAus repo (AK)
  • Shared google doc on reproducibility strategy (AH)
  • Pull request for WOMBAT changes into MOM5 repo (MC, MW)
  • Compare OASIS/CICE coupling code in ACCESS-CM2 and ACCESS-OM2 (RF)
  • After FMS moved to submodule, incorporate MPI-IO changes into FMS (MW)
  • Incorporate WOMBAT into CM2.5 decadal prediction codebase and publish to Github (RF)
  • Move FMS to submodule of MOM5 github repo (MW)
  • Make a proper plan for model release — discuss at COSIMA meeting. Ask students/researchers what they need to get started with a model (MW and TWG)
  • Blog post around issues with high core count jobs and mxm mtl (NH)
  • Look into OpenDAP/THREDDS for use with MOM on raijin (AH, NH)
  • Add RF ocean bathymetry code to OceansAus repo (RF)
  • Add MPI barrier before ice halo updates timer to check if slow timing issues are just ice load imbalances that appear as longer times due to synchronisation (NH).
  • Redo SSS restoring with patch smoothing (AH)
  • Get Ben/Andy to endorse provision of MAS to CoE (no-one assigned)
  • CICE and MATM need to output namelists for metadata crawling (AK)
  • Provide 1 deg RYF ACCESS-OM-1.0 config to MC (AK)
  • Update ACCESS-OM2 model configs (AK)

COSIMA 2018 Report

Aims & Goals

The third meeting of the Consortium for Ocean Sea Ice Modelling in Australia (COSIMA) was held in Canberra on 7-8 May 2018. This annual COSIMA workshop aims to:

  • Establish a community around ocean-sea ice modelling in Australia;
  • Discuss recent scientific advances in ocean and sea ice research in a forum that is inclusive and model-agnostic, particularly including observational programs;
  • Agree on immediate next steps in the COSIMA model development plan; and
  • Develop a long-term vision for Australian scientific advances in this area.


The 2018 workshop was our largest yet, with 30 talks and 49 participants.

Attendees included:

Gary Brassington (Bureau of Meteorology), Matt Chamberlain (CSIRO), Chris Chapman (CSIRO), Fabio Dias (UTAS/CSIRO), Prasanth Divakaran (Bureau of Meteorology), Peter Dobrohotoff (CSIRO), Catia Domingues (UTAS), Matthew England (UNSW), Russ Fiedler (CSIRO), Annie Foppert (CSIRO), Leela Frankcombe (UNSW), Bishakhdatta Gayen (ANU), Angus Gibson (ANU), Stephen Griffies (NOAA/GFDL), Nicholas Hannah (COSIMA), Aidan Heerdegen (ANU/CLEX), Petra Heil (AAD & ACE CRC), Andy Hogg (ANU), Ryan Holmes (UNSW), Shane Keating (UNSW), Andrew Kiss (ANU), Vassili Kitsios (CSIRO), Veronique Lago (UNSW), Clothilde Langlais (CSIRO), Andrew Lenton (CSIRO), Kewei Lyu (CSIRO), Jie Ma (CSIRO), Simon Marsland (CSIRO), Paige Martin (University of Michigan), Josue Martinez Moreno (ANU), Richard Matear (CSIRO), Laurie Menviel (UNSW), Mainak Mondal (ANU), Ruth Moorman (ANU), Adele Morrison (ANU), Terry O’Kane (CSIRO), Peter Oke (CSIRO), Ramkrushnbhai Patel (UTAS), Paul Sandery (CSIRO), Abhishek Savita (UTAS-CSIRO), Kate Snow (NCI), Paul Spence (UNSW), Kial Stewart (ANU/UNSW), Veronica Tamsitt (UNSW/CSIRO), Mirko Velic (Bureau of Meteorology), Marshall Ward (NCI), Luwei Yang (IMAS, UTAS), Rui Yang (NCI), Jan Zika (UNSW)


The workshop was structured to focus on scientific questions on Day 1, particularly in the first two sessions. In these sessions, topics ranged from Antarctic shelf processes to oceanic convection, from reversibility of the Earth system to frictional drag. The final session on day 1 focussed more on technical issues, including assessment of the optimisation status of existing models. On Day 2, talks focussed more on strategic issues, including an outline of Bluelink, ACCESS, CAFE and coastal programs. These strategic talks transitioned to small-group discussions (see synthesis below). The workshop finished with a tutorial on the COSIMA Cookbook framework for model analysis.

The Australian landscape in ocean-sea ice research involves a number of interleaving programs, each of which was represented at this workshop.  The figure below outlines the linkages between these programs:

By way of explanation:

ACCESS-CM2/-ESM1.5 will be Australia’s input to CMIP6, and use MOM5 and CICE at 1°.

CAFE is the decadal prediction system in development, which uses MOM5.

ARCCSS/CLEX, ARC CoE programs, use high-resolution ocean-sea ice models for process studies.

Bluelink/OFAM is the ocean forecasting and reanalysis system which will adopt ACCESS-OM2-01 in future versions.

CSHOR is the Centre for Southern Hemisphere Oceanographic Research; it focuses on observational studies but we hope to establish two-way interactions with this program.

Coastal Modelling includes the Australian coastal oceanography community, as well as Antarctic nearshore programs within AAD and ACE-CRC.

A major theme of the workshop was to review the status of the ACCESS-OM2 model which is the focus of COSIMA. In short, we have had success with model releases at 1° and 0.25° resolution – these models are now actively being used for scientific runs, and are available for download and use by the community. They include a recent upgrade to the file-based atmosphere (YATM) and new JRA55-do forcing datasets. The 0.1° version of the model has progressed significantly in the last year; there are outstanding tasks to evaluate model output and further optimise the model configuration.

The COSIMA Cookbook tutorial was attended by about a third of participants, and some progress was made. The aim of this tutorial was to entrain more active users to the system and encourage input from those users. The Cookbook is similar in style to the analysis system being developed for CAFE and it may be possible to merge elements of each framework at some stage in the future.


Where available, talk files are linked from the presenter’s name.

Monday 7 May
10:00 Arrival & Morning tea
10:30 Session 1 (Chair – Andy Hogg)
Stephen M Griffies (NOAA/GFDL): Understanding and projecting global and regional sea level: More reasons to include refined ocean resolution in global climate models
Andrew Kiss (ANU): Overview of the ACCESS-OM2 model suite
Andrew Lenton (CSIRO): Ocean Reversibility in ACCESS-ESM
Catia Domingues (UTAS): Global and spatial temporal changes in upper-ocean thermometric sea level
Fabio Dias (UTAS/CSIRO): Mean and seasonal states of the ocean heat and salt budgets in ACCESS-OM2
Adele Morrison (ANU): Circumpolar Deep Water transport towards Antarctica driven by dense water export
Jan Zika (UNSW): Getting an ocean model to obey: Prescribing and perturbing exact fluxes of heat and fresh water
12:30 Lunch
13:30 Session 2 (Chair – Clothilde Langlais)
Petra Heil (AAD & ACE CRC): ACCESS-OM2-01 sea ice
Paul Sandery (CSIRO): Sea-ice data assimilation and forecasting using an Ensemble Transform Kalman Filter
Paul Spence (UNSW): Does the Southern Ocean have sleep apnea?
Veronique Lago (UNSW): Impact of projected amplification of Antarctic meltwater on Antarctic Bottom Water formation
Ryan Holmes (UNSW): Numerical Mixing in the COSIMA Models
Luwei Yang (IMAS, UTAS): The impacts of bottom frictional drag on the sensitivity of the Southern Ocean circulation to changing wind
Vassili Kitsios (CSIRO): Stochastic subgrid turbulence parameterisation of eddy-eddy, eddy-topographic, eddy-meanfield and meanfield-meanfield interactions
Matt Chamberlain (CSIRO): Using transport matrices to probe circulation in ocean models
15:30 Afternoon tea
16:00 Session 3 (Chair – Petra Heil)
Nicholas Hannah (COSIMA): ACCESS-OM2 Software Development
Marshall Ward (NCI): ACCESS-OM2 performance analysis
Rui Yang (NCI): Parallel IO in MOM5
Angus Gibson (ANU): Towards an adaptive vertical coordinate in MOM6
Jie Ma (CSIRO): Investigating interannual-decadal variability of Indian Ocean temperature transport in an eddy-resolving model
Paige Martin (University of Michigan): Frequency-domain analysis of energy transfer in an idealized ocean-atmosphere model
17:30 Close
19:00 Workshop dinner (Debacle24 Lonsdale St Braddon)
Tuesday 8 May
9:00 Session 4 (Chair – Andrew Kiss)
Andy Hogg (ANU): Are we Redi for 0.25° ocean-climate models?
Kial Stewart (ANU): The Repeat Year Forcing for JRA55-do
Terry O’Kane (CSIRO): Coupled data assimilation and ensemble initialization with application to multi-year ENSO prediction
Gary Brassington (Bureau of Meteorology): Ocean forecasting status and outlook
Peter Oke (CSIRO): Bluelink activities and plans
Matthew England (UNSW): A proposal for future projection simulations using COSIMA ocean-ice models
Richard Matear (CSIRO): CSIRO Decadal Climate Forecasting, update of the project’s progress
Simon Marsland (CSIRO): Preparing ACCESS for CMIP6
Clothilde Langlais (CSIRO): Downscaling towards the coast – a perspective on where the coastal modelling group would like to go
11:00 Morning tea
11:30 Discussion: COSIMA planning and strategy
13:00 Lunch
14:00 Strategy and planning summary
14:30 COSIMA Cookbook tutorial
16:00 Close

Synthesis of Discussion

Tuesday afternoon included discussions of present and future needs and directions of the COSIMA community, via breakout sessions on the topics Sea Ice, Coastal / Forecasting, Coupled Modelling, Process Modelling, Biogeochemistry, and Technical. The overall threads of these discussions are summarised here.

Open and accessible code, configurations, output and analysis

Transparency, accessibility and reproducibility of model code development, run configurations and output data were named as priorities by many groups. Nic Hannah’s proposed REDB (Reproducible Experiment Database) was widely supported as a means to tie together and curate the source code, configurations, output and analysis of model experiments. Using consistent shared codebases was also a priority. Containerisation was suggested as a method to make experiments self-contained. Extension of the database to include idealised experiments was also suggested.

Model evaluation

There is a need for more model evaluation against observations. Several groups highlighted the importance of better integration of observations for model validation and a desire for this functionality to be better supported in the COSIMA Cookbook. Comparison of CICE to SIS-1 at 1 and 1/4 deg was also suggested.

Technical validation is also needed – e.g. BGC, bit reproducibility, broadened test suite, regression testing. Model performance and stability priorities include: resolve crashes, balance load, MPI benchmarks and stress testing.


Suggestions included a glossary for beginners, an online portal for control runs, and to minimise difficulty of running new model configurations. Standardised output files and naming conventions would facilitate analysis. Improved functionality and versatility of the COSIMA Cookbook was also suggested.

Documentation was a priority for many, in particular an ACCESS-OM2 documentation paper, but also open/evolving documentation as the models develop.

Parameter selection was also a concern for many – how to choose appropriate parameters (e.g. for ice or BGC), how to assess model sensitivity to parameters, how to document why parameters were chosen or altered. Data assimilation was suggested as a way to improve ice parameter selection, including assimilation of under-ice observations (e.g. temperature). BGC was suggested as a way to constrain the dynamics.

It was pointed out that the payu run management software underpins model runs, yet formal funding for its continued development is presently lacking.

Model enhancements

Suggestions for enhanced modelling capability included: interannual forcing, WOMBAT BGC, coupling to an atmosphere model, 1-way nesting, coupling to wavewatch, explicit tides, wet/dry cells.

Community coordination, synergies and strategy

Suggestions included a streamlined process for providing community feedback and deciding on priorities, and for community involvement in developing the BGC component. It was also suggested to foster engagement with atmosphere and sea ice specialists, and have a more formalized ice group. The technical team is also seeking more input from scientists, especially regarding sea ice.

Regarding modelling strategy, it was suggested to have intelligent model diversity (not too many versions), a consensus on standard perturbation experiments, and to decide on resources to commit to MOM5 vs. MOM6.

Summary of Priority Tasks

The following list of tasks was identified as a priority for the near term. Volunteers to lead or assist with tasks are much appreciated.

  1. IAF Runs: With the addition of YATM, we now have the facility to run Interannual Forcing (IAF) runs from the JRA55-do forcing dataset in ACCESS-OM2. Once YATM has been tested, we will conduct IAF runs at all resolutions, starting with 1°.
  2. Model Documentation: Production of a model documentation paper is a high priority for the coming months. This will be achieved by:
    1. Writing a larger technical documentation report that will be stripped down to feed into a paper; and
    2. Inviting community evaluation of existing model output.
  3. Model evaluation and analysis: We propose the COSIMA Cookbook as a framework for users to contribute model analyses. In particular, we encourage observational comparisons with existing model output, and also encourage users to submit bug reports and feature requests via
  4. WOMBAT: In the coming months we will look to implement the WOMBAT biogeochemistry model (already running in MOM5) into the ACCESS-OM2 framework.
  5. Capability gaps: The COSIMA community has been able to leverage expertise from a number of different programs. However, our community as a whole remains subcritical in several areas, including sea ice modelling and atmospheric dynamics.
  6. REDB: Nic Hannah proposed a new system for tracking simulations and the output data. This system was identified by many discussion groups as a potential solution to some of our collaboration roadblocks. We will investigate the viability of such a system.
  7. MOM6: Plan is to begin transition to MOM6, building up experience in the latter half of 2018.

Recommendations for COSIMA 2019 workshop

  • Institute a James Munroe award for contributions to COSIMA
  • Extend to a 2.5-day workshop to allow more time for discussion (not extra talks)