Coordinates in cube.dim_coords differ: time


Marga

Oct 25, 2021, 4:54:50 AM
to SciTools (iris, cartopy, cf_units, etc.) - https://github.com/scitools
Hello,

When I merge the CMIP6 FIO-ESM-2-0 model files, the following error appears: "Coordinates in cube.dim_coords differ: time".

The code I use is the following:
import iris
import numpy as np
from iris.experimental.equalise_cubes import equalise_attributes
cube = iris.load(['tos_Omon_FIO-ESM-2-0_ssp245_r1i1p1f1_gn_201501-210012.nc-ts0','tos_Omon_FIO-ESM-2-0_historical_r1i1p1f1_gn_185001-201412.nc-ts0-init'])
equalise_attributes(cube)
print(cube.merge_cube())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/gpfs/projects/bsc32/software/suselinux/11/software/Iris/2.4.0-foss-2019b-Python-3.7.4/lib/python3.7/site-packages/scitools_iris-2.4.0-py3.7.egg/iris/cube.py", line 396, in merge_cube
    proto_cube.register(cube, error_on_mismatch=True)
  File "/gpfs/projects/bsc32/software/suselinux/11/software/Iris/2.4.0-foss-2019b-Python-3.7.4/lib/python3.7/site-packages/scitools_iris-2.4.0-py3.7.egg/iris/_merge.py", line 1279, in register
    error_on_mismatch)
  File "/gpfs/projects/bsc32/software/suselinux/11/software/Iris/2.4.0-foss-2019b-Python-3.7.4/lib/python3.7/site-packages/scitools_iris-2.4.0-py3.7.egg/iris/_merge.py", line 277, in match_signature
    raise iris.exceptions.MergeError(msgs)
iris.exceptions.MergeError: failed to merge into a single cube.
  Coordinates in cube.dim_coords differ: time.

Thank you very much in advance.
Marga



Marga

Oct 25, 2021, 5:02:21 AM
to SciTools (iris, cartopy, cf_units, etc.) - https://github.com/scitools
I am sending the ncdump -hs output:

ncdump -hs tos_Omon_FIO-ESM-2-0_historical_r1i1p1f1_gn_185001-201412.nc-ts0
netcdf tos_Omon_FIO-ESM-2-0_historical_r1i1p1f1_gn_185001-201412 {
dimensions:
        i = 320 ;
        j = 384 ;
        time = UNLIMITED ; // (1 currently)
        bnds = 2 ;
        vertices = 4 ;
variables:
        int i(i) ;
                i:units = "1" ;
                i:long_name = "cell index along first dimension" ;
                i:_Storage = "chunked" ;
                i:_ChunkSizes = 320 ;
                i:_DeflateLevel = 4 ;
                i:_Shuffle = "true" ;
                i:_Endianness = "little" ;
                i:_NoFill = "true" ;
        int j(j) ;
                j:units = "1" ;
                j:long_name = "cell index along second dimension" ;
                j:_Storage = "chunked" ;
                j:_ChunkSizes = 384 ;
                j:_DeflateLevel = 4 ;
                j:_Shuffle = "true" ;
                j:_Endianness = "little" ;
                j:_NoFill = "true" ;
        double latitude(j, i) ;
                latitude:standard_name = "latitude" ;
                latitude:long_name = "latitude" ;
                latitude:units = "degrees_north" ;
                latitude:missing_value = 1.e+20 ;
                latitude:_FillValue = 1.e+20 ;
                latitude:bounds = "vertices_latitude" ;
                latitude:_Storage = "chunked" ;
                latitude:_ChunkSizes = 384, 320 ;
                latitude:_DeflateLevel = 4 ;
                latitude:_Shuffle = "true" ;
                latitude:_Endianness = "little" ;
                latitude:_NoFill = "true" ;
        double longitude(j, i) ;
                longitude:standard_name = "longitude" ;
                longitude:long_name = "longitude" ;
                longitude:units = "degrees_east" ;
                longitude:missing_value = 1.e+20 ;
                longitude:_FillValue = 1.e+20 ;
                longitude:bounds = "vertices_longitude" ;
                longitude:_Storage = "chunked" ;
                longitude:_ChunkSizes = 384, 320 ;
                longitude:_DeflateLevel = 4 ;
                longitude:_Shuffle = "true" ;
                longitude:_Endianness = "little" ;
                longitude:_NoFill = "true" ;
        double time(time) ;
                time:bounds = "time_bnds" ;
                time:units = "days since 0001-01-01" ;
                time:calendar = "365_day" ;
                time:axis = "T" ;
                time:long_name = "time" ;
                time:standard_name = "time" ;
                time:_Storage = "chunked" ;
                time:_ChunkSizes = 1 ;
                time:_DeflateLevel = 4 ;
                time:_Shuffle = "true" ;
                time:_Endianness = "little" ;
                time:_NoFill = "true" ;
        double time_bnds(time, bnds) ;
                time_bnds:_Storage = "chunked" ;
                time_bnds:_ChunkSizes = 1, 2 ;
                time_bnds:_DeflateLevel = 4 ;
                time_bnds:_Shuffle = "true" ;
                time_bnds:_Endianness = "little" ;
                time_bnds:_NoFill = "true" ;
        float tos(time, j, i) ;
                tos:standard_name = "sea_surface_temperature" ;
                tos:long_name = "Sea Surface Temperature" ;
                tos:comment = "Temperature of upper boundary of the liquid ocean, including temperatures below sea-ice and floating ice shelves." ;
                tos:units = "degC" ;
                tos:cell_methods = "area: mean where sea time: mean" ;
                tos:cell_measures = "area: areacello" ;
                tos:missing_value = 1.e+20f ;
                tos:_FillValue = 1.e+20f ;
                tos:history = "2019-11-22T07:16:56Z altered by CMOR: Converted type from \'d\' to \'f\'." ;
                tos:coordinates = "latitude longitude" ;
                tos:_Storage = "chunked" ;
                tos:_ChunkSizes = 1, 384, 320 ;
                tos:_DeflateLevel = 4 ;
                tos:_Shuffle = "true" ;
                tos:_Endianness = "little" ;
                tos:_NoFill = "true" ;
        double vertices_latitude(j, i, vertices) ;
                vertices_latitude:units = "degrees_north" ;
                vertices_latitude:missing_value = 1.e+20 ;
                vertices_latitude:_FillValue = 1.e+20 ;
                vertices_latitude:_Storage = "chunked" ;
                vertices_latitude:_ChunkSizes = 384, 320, 2 ;
                vertices_latitude:_DeflateLevel = 4 ;
                vertices_latitude:_Shuffle = "true" ;
                vertices_latitude:_Endianness = "little" ;
                vertices_latitude:_NoFill = "true" ;
        double vertices_longitude(j, i, vertices) ;
                vertices_longitude:units = "degrees_east" ;
                vertices_longitude:missing_value = 1.e+20 ;
                vertices_longitude:_FillValue = 1.e+20 ;
                vertices_longitude:_Storage = "chunked" ;
                vertices_longitude:_ChunkSizes = 384, 320, 2 ;
                vertices_longitude:_DeflateLevel = 4 ;
                vertices_longitude:_Shuffle = "true" ;
                vertices_longitude:_Endianness = "little" ;
                vertices_longitude:_NoFill = "true" ;


ncdump -hs tos_Omon_FIO-ESM-2-0_ssp245_r1i1p1f1_gn_201501-210012.nc-ts0
netcdf tos_Omon_FIO-ESM-2-0_ssp245_r1i1p1f1_gn_201501-210012 {
dimensions:
        i = 320 ;
        j = 384 ;
        time = UNLIMITED ; // (1 currently)
        bnds = 2 ;
        vertices = 4 ;
variables:
        int i(i) ;
                i:units = "1" ;
                i:long_name = "cell index along first dimension" ;
                i:_Storage = "chunked" ;
                i:_ChunkSizes = 320 ;
                i:_DeflateLevel = 4 ;
                i:_Shuffle = "true" ;
                i:_Endianness = "little" ;
        int j(j) ;
                j:units = "1" ;
                j:long_name = "cell index along second dimension" ;
                j:_Storage = "chunked" ;
                j:_ChunkSizes = 384 ;
                j:_DeflateLevel = 4 ;
                j:_Shuffle = "true" ;
                j:_Endianness = "little" ;
        double latitude(j, i) ;
                latitude:standard_name = "latitude" ;
                latitude:long_name = "latitude" ;
                latitude:units = "degrees_north" ;
                latitude:_FillValue = 1.e+20 ;
                latitude:bounds = "vertices_latitude" ;
                latitude:_Storage = "chunked" ;
                latitude:_ChunkSizes = 384, 320 ;
                latitude:_DeflateLevel = 4 ;
                latitude:_Shuffle = "true" ;
                latitude:_Endianness = "little" ;
        double longitude(j, i) ;
                longitude:standard_name = "longitude" ;
                longitude:long_name = "longitude" ;
                longitude:units = "degrees_east" ;
                longitude:_FillValue = 1.e+20 ;
                longitude:bounds = "vertices_longitude" ;
                longitude:_Storage = "chunked" ;
                longitude:_ChunkSizes = 384, 320 ;
                longitude:_DeflateLevel = 4 ;
                longitude:_Shuffle = "true" ;
                longitude:_Endianness = "little" ;
        double time(time) ;
                time:bounds = "time_bnds" ;
                time:units = "days since 0001-01-01" ;
                time:calendar = "365_day" ;
                time:axis = "T" ;
                time:long_name = "time" ;
                time:standard_name = "time" ;
                time:_Storage = "chunked" ;
                time:_ChunkSizes = 512 ;
                time:_DeflateLevel = 4 ;
                time:_Shuffle = "true" ;
                time:_Endianness = "little" ;
        double time_bnds(time, bnds) ;
                time_bnds:_Storage = "chunked" ;
                time_bnds:_ChunkSizes = 1, 2 ;
                time_bnds:_DeflateLevel = 4 ;
                time_bnds:_Shuffle = "true" ;
                time_bnds:_Endianness = "little" ;
        float tos(time, j, i) ;
                tos:standard_name = "sea_surface_temperature" ;
                tos:long_name = "Sea Surface Temperature" ;
                tos:comment = "Temperature of upper boundary of the liquid ocean, including temperatures below sea-ice and floating ice shelves." ;
                tos:units = "degC" ;
                tos:cell_methods = "area: mean where sea time: mean" ;
                tos:cell_measures = "area: areacello" ;
                tos:missing_value = 1.e+20f ;
                tos:_FillValue = 1.e+20f ;
                tos:coordinates = "latitude longitude" ;
                tos:_Storage = "chunked" ;
                tos:_ChunkSizes = 1, 384, 320 ;
                tos:_DeflateLevel = 4 ;
                tos:_Shuffle = "true" ;
                tos:_Endianness = "little" ;
        double vertices_latitude(j, i, vertices) ;
                vertices_latitude:_Storage = "chunked" ;
                vertices_latitude:_ChunkSizes = 384, 320, 4 ;
                vertices_latitude:_DeflateLevel = 4 ;
                vertices_latitude:_Shuffle = "true" ;
                vertices_latitude:_Endianness = "little" ;
        double vertices_longitude(j, i, vertices) ;
                vertices_longitude:_Storage = "chunked" ;
                vertices_longitude:_ChunkSizes = 384, 320, 4 ;
                vertices_longitude:_DeflateLevel = 4 ;
                vertices_longitude:_Shuffle = "true" ;
                vertices_longitude:_Endianness = "little" ;

Tony Phillips - BAS

Oct 26, 2021, 11:30:03 AM
to Marga, SciTools (iris, cartopy, cf_units, etc.) - https://github.com/scitools

Hi

The immediate cause of the issue that you are facing is that when you want to join cubes into one along an existing dimension, you need to use cubes.concatenate_cube() instead of cubes.merge_cube(). See: https://scitools.org.uk/iris/docs/v2.4.0/userguide/merge_and_concat.html.
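
For context, here is a minimal, self-contained sketch of that distinction using small synthetic cubes (the coordinate names and values below are purely illustrative, not taken from the CMIP6 files):

import numpy as np
from iris.coords import AuxCoord, DimCoord
from iris.cube import Cube, CubeList

lat = DimCoord(np.arange(3, dtype=np.float64), standard_name='latitude', units='degrees')

# merge: each cube carries a *scalar* time coordinate, and merging stacks the
# cubes along a brand-new time dimension.
scalar_cubes = CubeList()
for day in (0.0, 1.0):
    c = Cube(np.zeros(3), standard_name='sea_surface_temperature', units='degC',
             dim_coords_and_dims=[(lat.copy(), 0)])
    c.add_aux_coord(AuxCoord(day, standard_name='time', units='days since 2000-01-01'))
    scalar_cubes.append(c)
merged = scalar_cubes.merge_cube()        # time becomes a new dimension of length 2

# concatenate: each cube already has a time *dimension*, and concatenation
# extends that existing dimension.
dim_cubes = CubeList()
for start in (0.0, 2.0):
    time = DimCoord(np.array([start, start + 1.0]), standard_name='time',
                    units='days since 2000-01-01')
    dim_cubes.append(Cube(np.zeros((2, 3)), standard_name='sea_surface_temperature',
                          units='degC', dim_coords_and_dims=[(time, 0), (lat.copy(), 1)]))
joined = dim_cubes.concatenate_cube()     # existing time dimension grows to length 4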

But if you do that (having removed inconsistent attributes) for the FIO-ESM-2-0 files you have below, this may well fail because the “latitude” and “longitude” auxiliary coordinates are not consistent (at least, this is the case for the versions of these files at BADC). If I do this (on JASMIN):

import iris
from iris.experimental.equalise_cubes import equalise_attributes

cubes = iris.load(['/badc/cmip6/data/CMIP6/CMIP/FIO-QLNM/FIO-ESM-2-0/historical/r1i1p1f1/Omon/tos/gn/v20191122/tos_Omon_FIO-ESM-2-0_historical_r1i1p1f1_gn_185001-201412.nc',
    '/badc/cmip6/data/CMIP6/ScenarioMIP/FIO-QLNM/FIO-ESM-2-0/ssp245/r1i1p1f1/Omon/tos/gn/v20191227/tos_Omon_FIO-ESM-2-0_ssp245_r1i1p1f1_gn_201501-210012.nc'])

equalise_attributes(cubes)
cubes.concatenate_cube()

This reports the following:

ConcatenateError: failed to concatenate into a single cube.
  An unexpected problem prevented concatenation.
  Expected only a single cube, found 2.

Investigating the consistency of coordinates tells me that the longitude and latitude coordinates differ:

for coord in cubes[0].coords():
    print('Coord {} is equal: {}'.format(coord.name(), coord == cubes[1].coord(coord.name())))

Coord time is equal: False
Coord cell index along second dimension is equal: True
Coord cell index along first dimension is equal: True
Coord latitude is equal: False
Coord longitude is equal: False

And further comparisons of the points and bounds of the longitude and latitude coordinates tell me that it is the points for both coordinates that are close, but different:

import numpy as np

for coord in ['longitude', 'latitude']:
    for attr in ['points', 'bounds']:
        print('Quantity {}.{} are all the same?: {}'.format(coord, attr,
            np.all(getattr(cubes[0].coord(coord), attr) == getattr(cubes[1].coord(coord), attr))))
        print('Quantity {}.{} are all close?: {}'.format(coord, attr,
            np.all(np.isclose(getattr(cubes[0].coord(coord), attr), getattr(cubes[1].coord(coord), attr)))))

Quantity longitude.points are all the same?: False
Quantity longitude.points are all close?: True
Quantity longitude.bounds are all the same?: True
Quantity longitude.bounds are all close?: True
Quantity latitude.points are all the same?: False
Quantity latitude.points are all close?: True
Quantity latitude.bounds are all the same?: True
Quantity latitude.bounds are all close?: True

Because the points are all close but not quite the same, one answer is to just copy them from one cube to the other:

for coord in ['longitude', 'latitude']:
    cubes[1].coord(coord).points = cubes[0].coord(coord).points

cubes.concatenate_cube()
Out[58]: <iris 'Cube' of sea_surface_temperature / (degC) (time: 3012; cell index along second dimension: 384; cell index along first dimension: 320)>

And all is well. Be aware that you may also need to use iris.util.unify_time_units() on the cubes for some CMIP6 models before they will concatenate.
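
For example, a rough sketch of that extra step (assuming cubes is still the CubeList loaded above):

import iris.util

# Convert every time coordinate to a common unit/epoch before concatenating.
iris.util.unify_time_units(cubes)
cube = cubes.concatenate_cube()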

Best wishes

Tony


Marga

Nov 4, 2021, 1:15:47 PM
to SciTools (iris, cartopy, cf_units, etc.) - https://github.com/scitools
Hi Tony,

Thank you very much for your kind answer; it solved the issue.

Thanks a lot!
Marga