Appending VIIRS scenes

Rutger

Mar 9, 2012, 11:25:22 AM
to pyt...@googlegroups.com
Hey,

I noticed that the BETA quality data of NPP VIIRS is already publicly available (see: http://ladsweb.nascom.nasa.gov/data/search.html). This encouraged me to do some preliminary tests on the processing. As a start I followed the guide on the Pytroll website. My initial goal was to convert the high resolution thermal band ('I05') to a projected geotiff containing the temperature in Kelvin.
The first thing I encountered was the string substitution used to find the VIIRS HDF files in the folder specified in npp1.cfg. It kept giving me errors about an "invalid string format"; it might be some version/platform related issue. My workaround consists of editing the template filename (in npp1.cfg) from:

SV???_%(satellite)s_d%Y%m%d_t%H%M???_e???????_b%(orbit)s_c*_ops.h5
 
to

SV???_{satellite}_d%Y%m%d_t%H%M???_e???????_b{orbit}_c*_ops.h5

And then editing the code in 'viirs_sdr.py' where the filename is constructed to:

filename_tmpl = satscene.time_slot.strftime(options["filename"].format(satellite=values['satellite'], orbit=values['orbit'])) % values

This solution seems to work fine for me.
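
For reference, here is a stripped-down sketch of the idea behind this workaround (just an illustration, not the actual viirs_sdr.py code): the original '%(satellite)s' placeholders make strftime() choke on some platforms, while {}-style placeholders contain no '%' and are filled in before strftime() ever sees the string.

import datetime

time_slot = datetime.datetime(2012, 3, 2, 12, 27)

# {}-style placeholders survive, the date directives are handled by strftime()
tmpl = "SV???_{satellite}_d%Y%m%d_t%H%M???_e???????_b{orbit}_c*_ops.h5"
filename_tmpl = time_slot.strftime(tmpl.format(satellite="npp", orbit="01789"))
# -> SV???_npp_d20120302_t1227???_e???????_b01789_c*_ops.h5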

The second problem was that the geolocation file referenced in the HDF file containing the imagery band points at GIMGO_*. The only geolocation files available at this time are the GITCO_* files, which are the terrain corrected geolocation files. To overcome this I added the following code at the point where the 'N_GEO_Ref' tag is read:
            if key == 'N_GEO_Ref':
                self.geo_filename = h5f.attrs[key][0, 0]
               
                filename_tmpl = self.geo_filename.replace('GIMGO','GITCO')[:45] + '*_ops.h5'
                file_list = glob.glob(os.path.join('D:\\my_mpop_directory\\', filename_tmpl))
                filename = os.path.basename(file_list[0]).strip('[]')

                self.geo_filename = ', '.join(glob.glob(os.path.join('D:\\my_mpop_directory\\', filename)))

It replaces GIMGO with GITCO, truncates the processing timestamp (which isn't equal, unfortunately), and does a file search in the mpop directory. I had to hardcode the directory because 'options["dir"]' wasn't available at that point; maybe it could be passed to the 'read' function, which would be more robust.

Using these changes enables me to export the HDF data to a geotiff like this:
global_data = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 27), "01789")
global_data.load(['I05'])

local_data = global_data.project("my_area", mode="nearest")
img = GeoImage(local_data['I05'].data, local_data.area, local_data.time_slot, mode='L', fill_value=0)
img.geotiff_save('D:\\my_output_dir\\viirs_test.tif', floating_point=True, compression=0)

This works fine, and the data appears as I would expect.

The next step is to combine multiple swaths (from the same orbit) to fill up my entire area. I figured this could be done using the "append" function, like this:
global_data1 = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 27), "01789")
global_data1.load(['I05'])

global_data2 = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 28), "01789")
global_data2.load(['I05'])

global_data1 = global_data1.append(global_data2)

img = GeoImage(local_data['I05'].data, local_data.area, local_data.time_slot, mode='L', fill_value=0)
# note that it uses the modified geoimage posted in this list to support floating values as output
img.geotiff_save('D:\\01_Algemeen\\Research\\VIIRS\\mpop\\viirs_output_test_combined.tif', floating_point=True, compression=0)

This results in the error: "ValueError: Mismatch between geometry and dataset"

The mpop API documentation doesn't mention anything like appending geometry data separately. What am I doing wrong?

Aside from the above, is it also possible to use anything other than "nearest" when projecting the data? The documentation says:

"mode sets the mode to project in: 'quick' which works between cartographic projections, and, as its denomination indicates, is quick (but lower quality), and 'nearest' which uses nearest neighbour for best projection. A mode set to None uses 'quick' when possible, 'nearest' otherwise."

I noticed that it uses kd_tree.py for the reprojection; would that imply that a gaussian or custom reprojection is also possible?


Thanks in advance,
Rutger



 


Martin Raspaud

Mar 12, 2012, 5:25:03 AM
to pyt...@googlegroups.com

On 09/03/12 17:25, Rutger wrote:
> Hey,

Hi Rutger,

> I noticed that the BETA quality data of NPP VIIRS is already
> publicly available (See:
> http://ladsweb.nascom.nasa.gov/data/search.html). This encourage
> me to do some preliminary tests regarding the processing. As a
> start i followed the guide on the Pytroll website. My initial goal
> was to convert the high resolution thermal band ('I05') to a
> projected geotiff containing the temperature in Kelvin. The first
> thing i encountered was the string substitution used to find the
> VIIRS HDF files in the specified folder in npp1.cfg. It kept giving
> me errors about an "invalid string format", it might be some
> version/platform related issue.

[...]

> This solution seems to work fine for me.

Ok, thanks for the solution. We are still developing with python 2.6
here, on linux platforms, so we didn't notice this. I will try to find
a way to fix this. However, we still have machines running python 2.4
here, so "format" might not be an option...


> The next step is to combine multiple swaths (from the same orbit)
> to fill up my entire area. I figured this could be done using the
> "append" function like:

[...]

> This results in the error: "ValueError: Mismatch between geometry
> and dataset"
>
> The mpop API documentation doesn't mention anything like appending
> geometry data separately. What am i doing wrong?

Could you try using the assemble_segments function instead? Like this:

global_data = assemble_segments(global_data1, global_data2)

> Aside from the above, is it also possible to use anything other
> then "nearest" when projecting the data.

[...]


> I noticed that it uses kd_tree.py for the reprojection, would that
> imply that a gaussian or custom reprojection is also possible?

Yes, gaussian or custom reprojection is possible in pyresample, but it
hasn't been plugged into mpop yet.

Have a look here:
http://pyresample.googlecode.com/git/docs/build/html/API.html#module-kd_tree

For example, a solution using pyresample would be something like:
>>> from pyresample import kd_tree
>>> local_data[10.8] = kd_tree.resample_gauss(global_data[10.8].area,
...                                           global_data[10.8].data,
...                                           my_area,
...                                           radius_of_influence=100000,
...                                           sigmas=1000000,
...                                           fill_value=None,
...                                           nprocs=4)

Best regards,
Martin

Adam Dybbroe

Mar 12, 2012, 5:55:20 AM
to pyt...@googlegroups.com
Hi Rutger,

Nice to see someone outside the pytroll-team testing the viirs-reader.
We have so far tested it on the sample data and real data from the CLASS archive.
We hope to soon test it on direct readout data from Norrköping.

[...]


The second problem was that the geolocation file referenced in the HDF file containing the imagery band points at GIMGO_*. The only geolocation files available at this time are the GITCO_* files, which are the terrain corrected geolocation files. To overcome this I added the following code at the point where the 'N_GEO_Ref' tag is read:
            if key == 'N_GEO_Ref':
                self.geo_filename = h5f.attrs[key][0, 0]
               
                filename_tmpl = self.geo_filename.replace('GIMGO','GITCO')[:45] + '*_ops.h5'
                file_list = glob.glob(os.path.join('D:\\my_mpop_directory\\', filename_tmpl))
                filename = os.path.basename(file_list[0]).strip('[]')

                self.geo_filename = ', '.join(glob.glob(os.path.join('D:\\my_mpop_directory\\', filename)))

It replaces GIMGO with GITCO, truncates the processing timestamp (which isn't equal, unfortunately), and does a file search in the mpop directory. I had to hardcode the directory because 'options["dir"]' wasn't available at that point; maybe it could be passed to the 'read' function, which would be more robust.

Alright, thanks for the feedback! I think as a default we should stick to reading the geo-location file pointed to by N_GEO_Ref.
But it should be possible to override it, specifying another geolocation file, as you indicate. I will have a look at it, if you agree?



Using these changes enables me to export the HDF data to a geotiff like this:
global_data = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 27), "01789")
We should eventually get rid of the "1"; it should be replaced by an empty string in the future. It was my fault.
As there is only one NPP, there is no series, and there should be no numbers. So the config file should just be "npp.cfg" in the future.
Or should it actually be "suomi npp"? I don't like it when agencies change the name of a satellite after launch, and I am not sure what to do about a white space " " in the satellite name. I don't like it. Need to test a bit. It would be nice if the user could enter either "npp" or "suomi npp" as she/he likes.


[...]

There are some issues left on handling granules. It appears the VIIRS SDR data have a scale/offset for the refl/tb to be applied for each granule.
So if you read an aggregated dataset with several granules, we should actually apply the scale/offset applicable to each individual granule.
So far they have been identical in the examples I have seen, but we are not doing it strictly correctly yet.
That is on my to-do list.
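
Roughly what I have in mind is something along these lines (just a sketch with made-up names, not what mpop does today):

import numpy as np

def calibrate_granules(raw_counts, factors, granule_height):
    """Apply each granule's own (scale, offset) pair instead of assuming one
    pair for the whole aggregated dataset.
    raw_counts: array of shape (n_granules * granule_height, n_cols)
    factors: list of (scale, offset) tuples, one per granule
    """
    calibrated = np.empty(raw_counts.shape, dtype=np.float32)
    for i, (scale, offset) in enumerate(factors):
        rows = slice(i * granule_height, (i + 1) * granule_height)
        calibrated[rows] = raw_counts[rows] * scale + offset
    return calibrated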


Thanks again!
Adam




-- 
Adam Dybbroe,
Satellite Remote Sensing Scientist,
Numerical models and Remote Sensing,
Core Services, Swedish Meteorological and Hydrological Institute (SMHI)
www.pytroll.org
nwcsaf.smhi.se
www.smhi.se

Rutger

Mar 12, 2012, 10:37:02 AM
to pyt...@googlegroups.com
Hey Adam,

Thank you for your response.

Alright, thanks for the feedback! I think as a default we should stick to reading the geo-location file pointed to by N_GEO_Ref.
But it should be possible to override it, specifying another geolocation file, as you indicate. I will have a look at it, if you agree?

I think reading it from the geo_ref tag is very convenient as default behavior. I'm not even sure the terrain corrected and general geolocation are interchangeable in the way I used them; I will have to read the VIIRS documentation to be sure. It might be best to wait a bit until all VIIRS products are operational and publicly available. If they are interchangeable, the option for overriding would be a nice addition.
 
We should eventually get rid of the "1"; it should be replaced by an empty string in the future. It was my fault.
As there is only one NPP, there is no series, and there should be no numbers. So the config file should just be "npp.cfg" in the future.
Or should it actually be "suomi npp"? I don't like it when agencies change the name of a satellite after launch, and I am not sure what to do about a white space " " in the satellite name. I don't like it. Need to test a bit. It would be nice if the user could enter either "npp" or "suomi npp" as she/he likes.

Isn't Suomi just an honorary nickname? It seems that NASA also sticks to NPP, see for example the "mission_name" tag in the HDF files. The LAADS Web and CLASS websites also still call it NPP. I wouldn't bother changing everything, I guess.

There are some issues left on handling granules. It appears the VIIRS SDR data have a scale/offset for the refl/tb to be applied for each granule.
So if you read an aggregated dataset with several granules, we should actually apply the scale/offset applicable to each individual granule.
So far they have been identical in the examples I have seen, but we are not doing it strictly correctly yet.

Good point. I have stitched many MODIS granules together, and as long as they are adjacent (time wise), the conversion values have always been the same as far as I can remember. But I know that for MODIS they do change over time, so at some point there should be two granules with different scales. It would be best to read and apply them for each granule separately, especially if you completely automate the whole process without any expert review.
For the VIIRS (I05) data I have collected so far they are all equal, even with a time difference of a couple of days.


Regards,
Rutger

On Monday, March 12, 2012 at 10:55:20 UTC+1, Adam wrote the following:

Rutger

Mar 12, 2012, 10:37:04 AM
to pyt...@googlegroups.com
Martin,


Ok, thanks for the solution. We are still developing with python 2.6
here, on linux platforms, so we didn't notice this. I will try to find
a way to fix this. However, we still have machines running python 2.4
here, so "format" might not be an option...

Maybe there is a way that works in all versions; I'm not very familiar with Python string substitution. A method which is backwards compatible with older Python versions is preferred, I guess. If I find some other way that works on Windows/Python 2.7 I will let you know.


Could you try using the assemble_segments function instead? Like this:

global_data = assemble_segments(global_data1, global_data2)

I tried it as mentioned above. When I write the result to a Geotiff file, the file contains the merged segments as one would expect. Writing the geotiff threw an error about the geotransform, but this makes sense (I think?) because the data is the unprojected raw swath; the output tiff was valid but missing geo-information.

The next thing I tried was reprojecting the global data before writing it to a geotiff. When I use 'scene.project()' on the scene containing the 'assembled segments', I get this error:

Code: local_data = global_data.project('my_area', mode='nearest')
Error:
Traceback (most recent call last):
  File "load_viirs.py", line 33, in <module>
    local_data = global_data.project('my_area', mode='nearest')
  File "C:\Python27\lib\site-packages\mpop-v0.10.0-py2.7.egg\mpop\scene.py", line 566, in project
    if chn.area == dest_area:
  File "C:\Python27\lib\site-packages\pyresample-0.7.10-py2.7.egg\pyresample\geometry.py", line 650, in __eq__
    return super(AreaDefinition, self).__eq__(other)
  File "C:\Python27\lib\site-packages\pyresample-0.7.10-py2.7.egg\pyresample\geometry.py", line 178, in __eq__
    if other.lons.data is None or other.lats.data is None:

When I try doing it 'manually' with kd_tree as you mentioned, I get this error:
Code: local_data = pr.kd_tree.resample_gauss(global_data['I05'].area, global_data['I05'].data, area_def, radius_of_influence=10000, sigmas=10000, fill_value=0, nprocs=4)
Error:
Traceback (most recent call last):
  File "load_viirs.py", line 35, in <module>
    local_data = pr.kd_tree.resample_gauss(global_data['I05'].area, global_data['I05'].data, area_def, radius_of_influence=10000, sigmas=10000, fill_value=0, nprocs=4)
  File "C:\Python27\lib\site-packages\pyresample-0.7.10-py2.7.egg\pyresample\kd_tree.py", line 144, in resample_gauss
    reduce_data=reduce_data, nprocs=nprocs, segments=segments)
  File "C:\Python27\lib\site-packages\pyresample-0.7.10-py2.7.egg\pyresample\kd_tree.py", line 216, in _resample
    segments=segments)
  File "C:\Python27\lib\site-packages\pyresample-0.7.10-py2.7.egg\pyresample\kd_tree.py", line 261, in get_neighbour_info
    if source_geo_def.size < neighbours:
AttributeError: 'NoneType' object has no attribute 'size'

For the last error, my entire script looks like:
from mpop.satellites import PolarFactory
from mpop.imageo.geo_image import GeoImage
from mpop.projector import get_area_def
from mpop.scene import assemble_segments
import pyresample as pr
import datetime


global_data1 = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 27), "01789")
global_data1.load(['I05'])

global_data2 = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 28), "01789")
global_data2.load(['I05'])

global_data = assemble_segments([global_data1, global_data2])

area_def =  get_area_def('my_area')
local_data = pr.kd_tree.resample_gauss(global_data['I05'].area, global_data['I05'].data, area_def, radius_of_influence=10000, sigmas=10000, fill_value=0, nprocs=4)

local_data['I05'].show()

It appears as if the scene returned from the assemble_segments command contains no (or an empty) .area property.

print global_data1['I05'].area returns:
Shape: (1536, 6400)
Lons: [[-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]
 ...,
 [-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]]
Lats: [[-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]
 ...,
 [-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]
 [-- -- -- ..., -- -- --]]

print global_data['I05'].area returns:
None

I'm not sure what's going on. Is this the behavior you would expect from "assemble_segments"? If I look at the 'custom data' example on the website, it appears as if the returned scene could (or even should) be accessed like a 'raw' scene.

Thanks in advance!


Rutger


On Monday, March 12, 2012 at 10:25:03 UTC+1, Martin Raspaud wrote the following:

Adam Dybbroe

Mar 12, 2012, 11:06:17 AM
to pyt...@googlegroups.com
Rutger,


On 2012-03-12 15:37, Rutger wrote:
Hey Adam,

Thank you for your response.

Alright, thanks for the feedback! I think as a default we should stick to reading the geo-location file pointed to by N_GEO_Ref.
But it should be possible to override it, specifying another geolocation file, as you indicate. I will have a look at it, if you agree?

I think reading it from the geo_ref tag is very convenient as default behavior. I'm not even sure the terrain corrected and general geolocation are interchangeable in the way I used them; I will have to read the VIIRS documentation to be sure. It might be best to wait a bit until all VIIRS products are operational and publicly available. If they are interchangeable, the option for overriding would be a nice addition.
It is my understanding that they are interchangeable. It all depends on what your application is. If you focus on clouds I suppose it is not crucial to have the pixels laid out following the terrain, but if your interest is the surface and you want to collocate with other surface observations, I could imagine a geolocation that takes the terrain into account may be crucial.

But this thinking is a bit new to me, so far having worked mainly with AVHRR and SEVIRI data.
Also the MODIS data are terrain corrected, I think by default. Am I wrong?


 
We should eventually get rid of the "1"; it should be replaced by an empty string in the future. It was my fault.
As there is only one NPP, there is no series, and there should be no numbers. So the config file should just be "npp.cfg" in the future.
Or should it actually be "suomi npp"? I don't like it when agencies change the name of a satellite after launch, and I am not sure what to do about a white space " " in the satellite name. I don't like it. Need to test a bit. It would be nice if the user could enter either "npp" or "suomi npp" as she/he likes.

Isn't Suomi just an honorary nickname? It seems that NASA also sticks to NPP, see for example the "mission_name" tag in the HDF files. The LAADS Web and CLASS websites also still call it NPP. I wouldn't bother changing everything, I guess.

Well, in the TLE files it has been "SUOMI NPP" since January 26th!
So using pyorbital, for instance, has looked like this since that date (it was a bit annoying when I discovered this):

"""
import pyorbital.orbital as orb

import datetime
import numpy as np

sat = orb.Orbital('SUOMI NPP',  tle_file="/data/24/saf/polar_in/tle2/tle-20120202.txt")
time_start = datetime.datetime(2012, 2, 2, 1, 58)
pos = sat.get_lonlatalt(time_start)
print pos
"""

I vote for sticking to "npp" in mpop for the time being, but perhaps at least giving some kind of "nice" warning message if the user enters "suomi npp".

Best regards

Rutger

Mar 12, 2012, 12:11:46 PM
to pyt...@googlegroups.com
Hey,

It's even worse: the name in the TLE file depends on what breed you are. ;)
For you meteorologists it is indeed Suomi NPP:
http://celestrak.com/NORAD/elements/weather.txt

For the rest of the world it's still NPP (with some extra tag):
http://celestrak.com/NORAD/elements/noaa.txt

Or just NPP:
ftp://is.sci.gsfc.nasa.gov/ancillary/ephemeris/tle/drl.tle

It would be very hard to generalize the naming; there seems to be no convention in use. In the case of PyOrbital, the user specifies the TLE file, so I'd say let the user also pick the correct name. In the case of an exception, maybe output all lines containing text (i.e. not starting with 1 or 2) as possible names, as in the sketch below.
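
Something like this could do it (a hypothetical helper, not existing PyOrbital code):

def tle_names(tle_file):
    # every non-empty line that is not an element line ('1 ...' or '2 ...')
    # is a candidate satellite name
    with open(tle_file) as fp:
        return [line.strip() for line in fp
                if line.strip() and not line.startswith(('1 ', '2 '))]

# e.g. tle_names("drl.tle") might return ['NPP', ...] or ['SUOMI NPP', ...],
# depending on which provider's file you feed it.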


Regards,
Rutger


On Monday, March 12, 2012 at 16:06:17 UTC+1, Adam wrote the following:

Hróbjartur Þorsteinsson

Mar 12, 2012, 2:33:17 PM
to pyt...@googlegroups.com

I would be quite interested in high resolution terrain correction;
it would be very valuable for snow mapping in alpine landscapes.

The MODIS geolocation files (MOD03) seem to contain a very low
resolution terrain correction, not all that useful.

I wonder if high resolution terrain correction with a DEM will
be easy enough to implement in the future?

Hrob.





Martin Raspaud

Mar 12, 2012, 5:13:43 PM
to pyt...@googlegroups.com

Hi Rutger,

>
> Could you try using the assemble_segments function instead ? Like
> this:
>>
>> global_data = assemble_segments(global_data1, global_data2)
>>
> I tried it as mentioned above. When i write the result to a Geotiff
> file, the file contains the merged segments as one would expect.
> Writing the geotiff threw some error about geotransform, but this
> makes sense (i think?) because the data is the unprojected raw
> swath, the output tiff was valid but missing geo-information.
>
> The next thing i tried was reprojecting the global data before
> writing it to a geotiff. When i use the 'scene.project(' on the
> scene containing the 'assembled segments', i get this error:

Ok, I will try running this myself tomorrow with some npp data, and
I'll see if I can reproduce the problem.
assemble_segments should concatenate the lon/lat arrays, so they
should not be empty/masked...

Best regards,
Martin

Martin Raspaud

Mar 12, 2012, 6:26:32 PM
to pyt...@googlegroups.com

Hi Rutger,


> Im not sure whats going on. Is this behavior you would expect from
> "assemble_segments"? If i look at the 'custom data' example on the
> website, it appears as if the returned scene could (or even should)
> be accessed similar to a 'raw' scene.

Ok, I got it now. The assemble_swath function expects the area of the
scene (not the channel!) to contain the geo info. So a quick fix in
your case is to do

>>> global_data1.area = global_data1["I05"].area
>>> global_data2.area = global_data2["I05"].area

just before you assemble.

Admittedly, this is not a viable solution, so I'll fix that for the next
release...

Thanks for digging up bugs in mpop! ;)

Best regards,
Martin

Rutger

Mar 13, 2012, 5:37:43 AM
to pyt...@googlegroups.com
Martin,

You are correct, this workaround does the trick. I can now correctly append scenes and reproject them, both using the .project method and using pyresample.kd_tree.
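
For reference, the combined script now looks roughly like this (with your .area workaround included):

from mpop.satellites import PolarFactory
from mpop.imageo.geo_image import GeoImage
from mpop.scene import assemble_segments
import datetime

global_data1 = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 27), "01789")
global_data1.load(['I05'])

global_data2 = PolarFactory.create_scene("npp", "1", "viirs", datetime.datetime(2012, 3, 2, 12, 28), "01789")
global_data2.load(['I05'])

# workaround until the next release: copy the channel area onto the scene
global_data1.area = global_data1['I05'].area
global_data2.area = global_data2['I05'].area

global_data = assemble_segments([global_data1, global_data2])

local_data = global_data.project('my_area', mode='nearest')
img = GeoImage(local_data['I05'].data, local_data.area, local_data.time_slot, mode='L', fill_value=0)
img.geotiff_save('D:\\my_output_dir\\viirs_test_combined.tif', floating_point=True, compression=0)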

I must say that I'm very impressed by the whole Pytroll package. This workflow from raw granules to a projected and merged geotiff takes only about 20 lines of code. Much easier than my current workflow for MODIS, which looks a bit like:
- Use NASA's MRT Swath to extract and reproject the HDFs
- Stitch the granules together using IDL or Erdas Imagine
- Convert to radiances/brightness temperatures

So I'm definitely going to try this for MODIS as well. The difference between MRT Swath reprojection and the kd_tree approach would be interesting. It often seems that MRT Swath returns bands that appear misaligned compared to each other, even if the resolution is the same. If it's in the data bands it will of course always fail, regardless of the resampling method.

Another step I'm thinking about is using PyOrbital to predict which granules overlap an area defined in the areas.def file within a certain time range. I'm doing it manually now, based on the overpass maps provided by SSEC. Making overpass plots myself worked out of the box (very nice!), so the next step would be taking the altitude and a maximum sensor zenith angle to calculate the edge of the swath. On a flat earth it would be simple; correcting for the curvature of the earth makes it more difficult, but not impossible I guess. Once you have the edge of the swath, testing whether an edge point (lat, lon) intersects an area wouldn't be that hard. Maybe even use something like Shapely or GEOS to test if it intersects a polygon in a shapefile. I will post an example if I get it working.


Thanks a lot for your help so far!


On Monday, March 12, 2012 at 23:26:32 UTC+1, Martin Raspaud wrote the following:

Martin Raspaud

Mar 13, 2012, 10:00:28 AM
to pyt...@googlegroups.com
On 13/03/12 10:37, Rutger wrote:
> Martin,

Hi Rutger,

> You are correct, this workaround does the trick. I can now correctly append
> scenes and reproject them, both using the .project method or by using
> pyresample.kd_tree.
>
> I must say that im very impressed by the whole Pytroll package. This
> workflow from raw granules to a projected and merged geotiff takes only
> about 20 lines of code. Much easier then my current workflow for MODIS
> which looks a bit like:
> - Use NASA's MRT Swath to extract and reproject the HDF's
> - Stitch the granules together using IDL or Erdas Imagine
> - Convert to radiances/brightness temperature

I'm really glad you like it :)

> So im definitly going to try this for MODIS as well. The difference between
> MRT swath reprojecting and the kd_tree approach would be interesting. It
> often seems that MRT swath returns bands which appear misaligned compared
> to each other, even if the resolution is the same. If its in the data bands
> it will of course always fail, regardless of the resample method.
>
> Another step which im thinking about is using PyOrbital to predict which
> granules overlap an area defined in the areas.def file within a certain
> time range. Im doing it manually now based on the overpass maps provided by
> SSEC. Making overpass plots myself worked out of the box (very nice!), so
> the next step would be taking the altitude and a maximum sensorzenith to
> calculate the edge of the swath. On a flat earth it would be simple,
> correcting for the curvature of the earth makes it more difficult but not
> impossible i guess. Once you have the edge of the swath, testing if the
> edge point (lat, lon) intersects an area wouldnt be that hard. Maybe even
> use something like Shapely or GEOS to test if it intersects a polygon in a
> shapefile. I will post an example if i get it working.

About passes overlapping with an area, we are doing this in operations
here thanks to the following:
- pyresample has a few spherical geometry functions allowing you to check if
a point is within an area or not
- we also use the spherical geometry functions in pyresample to estimate
the corner positions of a 1- or 3-minute granule (using the corners'
azimuth and distance from the ground track), and we then use pyresample's
overlap functions.
- when geolocation is available in the granules, we use it to check
overlapping.
- in the "geoloc" branch of the pyorbital project there is code to
compute pixel positions given the viewing geometry of the instrument and
the time. This is still experimental, but the results so far are quite
ok. Maybe something to try out?

Tell us how it's working!

Best regards,
Martin


Rutger

Mar 15, 2012, 8:30:38 AM
to pyt...@googlegroups.com
Hi,


About passes overlapping with an area, we are doing this in operations
here thanks to the following:
- pyresample has a few spherical geometry functions allowing you to check if
a point is within an area or not
- we also use the spherical geometry functions in pyresample to estimate
the corner positions of a 1- or 3-minute granule (using the corners'
azimuth and distance from the ground track), and we then use pyresample's
overlap functions.
- when geolocation is available in the granules, we use it to check
overlapping.
- in the "geoloc" branch of the pyorbital project there is code to
compute pixel positions given the viewing geometry of the instrument and
the time. This is still experimental, but the results so far are quite
ok. Maybe something to try out?

Tell us how it's working!

Good tip about the geoloc branch; I hadn't noticed it myself and it didn't ship with the tarball I downloaded. This seems to do exactly what I want (and more). I have modified your AVHRR example so it should behave like VIIRS.

I have attached my first result, just for fun. The red dots are the predicted VIIRS pixels at 375 m resolution (so 6400 x 1536). The blue line is the outermost row and column from the latitude and longitude arrays of the VIIRS geolocation file I'm trying to mimic.

It's already pretty good, and very encouraging to start with. Some inaccuracies may come from the sensor characteristics I dug up on the web. The width inaccuracy is not really important for me; when searching for granules I will probably threshold the zenith angle anyway, to make sure a granule has for example at least angles of <45° covering the area of interest. There also seems to be a slight rotation, as if the orbit (or center line) is not completely correct; I will try to validate this by comparing the prediction with some other trackers or the overpass maps from SSEC.

What kind of accuracies do you experience? Given my example, would you say this is as good as it gets, or is it more like 'yikes, what's going wrong there!'? ;)

If I run into some interesting results I will let you know.


Regards,

Rutger

On Tuesday, March 13, 2012 at 15:00:28 UTC+1, Martin Raspaud wrote the following:
granule_test.jpg

Martin Raspaud

Mar 15, 2012, 8:42:17 AM
to pyt...@googlegroups.com

Hi Rutger,

On 15/03/12 13:30, Rutger wrote:

> Good tip about the Geoloc branch, i hadn't noticed it myself and it
> didn't ship with the tarball i downloaded. This seems to be doing
> exactly what i want (and more). I have modified your AVHRR example
> so it should be behave like VIIRS.
>
> I have attached my first result, just for fun. The red dots are the
> predicted VIIRS pixels at 375m resolution (so 6400*1536). The blue
> line is the outer most row and column from the latitude and
> longitude arrays of a VIIRS geolocation file which i'm trying to
> mimic.

Nice picture ! I never got to plot this myself... I have to admit it
looks far better than I expected :)

> Its already pretty good, and very encouraging to start with. Some
> inaccuracies may come from the sensor characteristics i dug up on
> the web. The width inaccuracy is not really important for me, when
> searching for granules i will probably theshold the zenithangle
> anyway to make sure a granule has for example atleast angles of
> <45° covering the area of interest. These also seems to be a slight
> rotation, as if the orbit (or center-line) is not completely
> correct, i will try to validate this by comparing the predicted one
> with some other trackers or the overpass maps from SSEC.

> What kind of accuracies do you experience? Given my example, would
> you say this is as good as it gets or is it more like 'yikes whats
> going wrong there! '? ;)

Well, as stated above, I didn't check into that much detail yet. What
I can say though is that the max error I got when I wrote this for
AVHRR was about 0.3 degrees, IIRC. However, the data I was comparing to
was attitude corrected, while the geoloc code is not taking attitude
into account at the moment (although it should be fairly trivial to add).

So actually, I have to admit it's better than I would have expected.

> If i run into some interesting results i will let you know.

Yes please! I'm already interested in your definition of the viirs
instrument :)

Best regards,
Martin

Rutger

Mar 15, 2012, 9:15:02 AM
to pyt...@googlegroups.com
Hey,


Yes please! I'm already interested in your definition of the viirs
instrument :)


This is the entire code I used. It is placed in the __main__ section of the geoloc.py file, replacing your AVHRR example.

    from mpl_toolkits.basemap import Basemap
    import matplotlib.pyplot as plt
 
    #  TLE comes from ftp://is.sci.gsfc.nasa.gov/ancillary/ephemeris/tle/drl.tle.2012030213
    # NPP
    npp_tle1 = "1 37849U 05001A   12061.00019361  .00000000  00000-0 -31799-4 2    06"
    npp_tle2 = "2 37849 098.7082 000.2437 0000785 084.9351 038.5818 14.19547815017683"

    # start time is taken from the filename of the 'target' granule
    t = datetime.datetime(2012, 3, 2, 12, 27, 23)

    # npp scanlines @ 375m
    # the sensor does 48 scans, each recording 32 pixel rows at once (so the height of a granule is 48 * 32 = 1536 pixels)
    scanline_nb = 48

    # building the npp angles, 6400 pixels from +55.84 to -55.84 degrees zenith
    npp = np.vstack(((np.arange(6400) - 3199.5) / 3200 * np.deg2rad(-55.84), np.zeros((6400,)))).transpose()
    npp = np.tile(npp, [scanline_nb, 1])    
   
    # From the timestamps in the filenames, a granule takes 1:25.400 to record (85.4 seconds),
    # so 1.779166667 s would be the duration of one scanline.
    # Dividing the duration of a single scan by the width of 6400 pixels gives 0.0002779947917 s
    # for each column of 32 pixels in the scanline.
    # What is the 0.0025415?? This still comes from the AVHRR example on GitHub.
    # The individual times per pixel are probably wrong, unless the scanning behaves the same as for AVHRR:
    # the VIIRS sensor rotates to allow internal calibration before each scanline,
    # which would imply that the scanline always moves in the same direction.
    # more info @ http://www.eoportal.org/directory/pres_NPOESSNationalPolarorbitingOperationalEnvironmentalSatelliteSystem.html

    offset = np.arange(scanline_nb) * 1.779166667
    times = (np.tile(np.arange(6400) * 0.0002779947917 + 0.0025415, [scanline_nb, 1]) + np.expand_dims(offset, 1))
   
    # build the scan geometry object
    sgeom = ScanGeometry(npp, times.ravel())

    # get the pixel locations
    pixels_pos = compute_pixels((npp_tle1, npp_tle2), sgeom, t)
    pos_time = get_lonlatalt(pixels_pos, t)

    # Mercator map centered above the target granule, near South Africa
    m = Basemap(projection='merc',llcrnrlat=-45,urcrnrlat=-25,llcrnrlon=0,urcrnrlon=40,lat_ts=-35,resolution='l')
   
    # convert and plot the predicted pixels in red
    x, y = m(pos_time[0],pos_time[1])
    p1 = m.plot(x,y, marker='+', color='red', markerfacecolor='red', markeredgecolor='red', markersize=0.1, markevery=1, zorder=4, linewidth=0.0)
   
    # read the validation data from CSV, its lat and lon from the outer pixel edge in the geolocation file
    # source: GITCO_npp_d20120302_t1227233_e1228475_b01789_c20120303154859293970_noaa_ops.h5
    bound = np.genfromtxt('D:\\GITCO_npp_d20120302_t1227233_e1228475_b01789_c20120303154859293970_noaa_ops_outer_edge.csv', delimiter=',', names=True)
   
    # convert and plot validation data in blue
    tmpx, tmpy = m(bound['Lon'],bound['Lat'])
    p2 = m.plot(tmpx,tmpy, marker='o', color='blue', markerfacecolor='blue', markeredgecolor='blue', markersize=0.1, markevery=1, zorder=5, linewidth=0.0)
       
    # general map beautification
    m.fillcontinents(color='0.85', lake_color=None, zorder=3)
    m.drawparallels(np.arange(-90.,90.,5.), labels=[1,0,1,0],fontsize=10, dashes=[1, 0], color=[0.8,0.8,0.8], zorder=1)
    m.drawmeridians(np.arange(-180.,180.,5.), labels=[0,1,0,1],fontsize=10, dashes=[1, 0], color=[0.8,0.8,0.8], zorder=2)

    plt.title('NPP Granule (Start at 2012-03-02 12:27:23)')
    #    plt.show()
    plt.savefig('granule_test.png', dpi=400)


I'm not sure if the 'anti-deformation' trick VIIRS uses to keep the edges of the swath at a similar resolution has any influence on all of this. If I can find some time I will do some more background reading on the sensor properties.

Regards,
Rutger


On Thursday, March 15, 2012 at 13:42:17 UTC+1, Martin Raspaud wrote the following:

Rutger

Mar 16, 2012, 11:06:54 AM
to pyt...@googlegroups.com
Hey,

A slight addendum to my previous message: there are some mistakes and unknowns in it. First of all, as mentioned in the comments, the VIIRS sensor rotates. This means that recording a single scanline doesn't take ~1.78 seconds; the entire 360 degree spin does. If it spins at constant speed, the recording part only takes 0.5519 seconds for the entire swath of 111.68 degrees (see the quick check below). This actually removes quite a lot of the rotation relative to the validation data. The spinning direction also influences the results, and I don't know which is the correct one. I have tested both by assigning the time using np.arange(6400) and np.linspace(6399, 0, 6399); the influence seems very small, but still relevant of course.
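
As a quick sanity check of those numbers (plain arithmetic, assuming a constant spin rate):

scan_period = 85.4 / 48                  # ~1.779 s for a full 360-degree rotation
active_time = scan_period * 111.68 / 360
# active_time is ~0.5519 s for the 111.68-degree swath
pixel_time = active_time / 6400          # time per column of 32 pixels within a scan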

What remains are the edges of the swath; they seem significantly skewed compared to the validation data. I can't really get my head around even a potential cause of this. Maybe inspecting some other granules will shed some light on it.


Best regards,
Rutger

On Thursday, March 15, 2012 at 14:15:02 UTC+1, Rutger wrote the following:
granule_test_v6_sub.jpg

Martin Raspaud

Mar 19, 2012, 6:07:01 AM
to pyt...@googlegroups.com

On 16/03/12 16:06, Rutger wrote:
> Hey,
>
> A slight addon to my previous message. There are some mistakes and
> uknowns in it. First of all, as mentioned in the comments, the
> VIIRS sensor rotates. This means that recording a single scanline
> doesnt take 1.78~ seconds but the entire 360 degree spin does. If
> it spins at constant speed, the recording part only takes 0.5519
> seconds for the entire swath of 111.68 degrees. This actually
> removes quite alot of the rotation relative to the validation data.
> The spinning direction also influences the results, i dont know
> which is the correct one. I have tested both by assigning the time
> using np.arange(6400) and np.linspace(6399,0,6399), the influence
> seems very small, but still relevant of course.
>
> What remains are the edges of the swath, they seem significantly
> skewed compared to the validation data. I cant really get my head
> around even a potential cause of this. Maybe inspecting some other
> granules might shed some light on it.

Hi Rutger,

I've been thinking a bit over the week-end, and as of now I can see
different reasons for the problem you mention:

- Attitude correction that we don't do in geoloc.
- The fact that in geoloc the satellite z axis is pointing to the
center of the earth, and not the subsatellite point
- The fact that the geolocation from npp is terrain corrected
(although you showed an example over sea...)


I'll try to find some time to investigate this during the week. Keep
me updated if you find something.

Best regards,
Martin



David Hoese

Mar 19, 2012, 2:23:31 PM
to pyt...@googlegroups.com
Martin,

About the string format problem, I noticed it was because you need a %% in the dictionary formatting, so:

SV???_%(satellite)s_d%Y%m%d_t%H%M???_e???????_b%(orbit)s_c*_ops.h5

becomes:

SV???_%%(satellite)s_d%Y%m%d_t%H%M???_e???????_b%%(orbit)s_c*_ops.h5

because strftime will get rid of the first percent signs when it handles the datetime directives. That works for me with Python 2.7 on a Mac.
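
A quick illustration of what happens, assuming the value really does go through strftime() first and the %-dictionary substitution afterwards:

import datetime

tmpl = "SV???_%%(satellite)s_d%Y%m%d_t%H%M???_e???????_b%%(orbit)s_c*_ops.h5"
t = datetime.datetime(2012, 3, 2, 12, 27)

# strftime() consumes the date directives and turns each '%%' into a single '%',
# leaving '%(satellite)s' and '%(orbit)s' for the dictionary substitution:
after_strftime = t.strftime(tmpl)
filename_tmpl = after_strftime % {"satellite": "npp", "orbit": "01789"}
# -> SV???_npp_d20120302_t1227???_e???????_b01789_c*_ops.h5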

-Dave

Martin Raspaud

Mar 19, 2012, 2:31:10 PM
to pyt...@googlegroups.com

On 19/03/12 19:23, David Hoese wrote:
> Martin,
>
> About the string format problem, I noticed it was because you need
> a %% on the dictionary formatting so:
>
> SV???_%(satellite)s_d%Y%m%d_t%H%M???_e???????_b%(orbit)s_c*_ops.h5
>
> becomes:
>
> SV???_%%(satellite)s_d%Y%m%d_t%H%M???_e???????_b%%(orbit)s_c*_ops.h5
>
> because the strftime will get rid of the first percents for the
> datetime stuff. That works for me on 2.7 on a Mac.
>
> -Dave

Hi Dave,

Thanks for the tip!

I'll try this on our systems and modify the code accordingly.

Best regards,
Martin