NetCDF: HDF error when processing MODIST and VIIRSN

gbourdin
Posts: 24
Joined: Wed Nov 04, 2020 8:49 pm
company / institution: University of Maine
Location: Maine, USA

NetCDF: HDF error when processing MODIST and VIIRSN

Post by gbourdin »

Hello,
I have processed about 1200 MODISA images from L1C to L2 without any trouble. I am now trying to process about the same number of MODIST and VIIRSN images for the same dates and locations. It worked for 300 VIIRS images and 348 MODIST images, but after that it throws the following error:

Starting processing at 2021-02-04 09:27:37.091891
Initializing output file "/home/gbourdin/data/T2018174205500.POLY_L2_MODIS.nc"
Processing block: size (500, 400), offset (0, 0)
Processing block: size (500, 400), offset (0, 400)
Processing block: size (500, 400), offset (0, 800)
Processing block: size (500, 154), offset (0, 1200)
Processing block: size (500, 400), offset (500, 0)
Processing block: size (500, 400), offset (500, 400)
Processing block: size (500, 400), offset (500, 800)
Processing block: size (500, 154), offset (500, 1200)
Traceback (most recent call last):
File "POLYbatchL2.py", line 255, in <module>
L2process(singlref, options.instrument, options.force_process)
File "POLYbatchL2.py", line 185, in L2process
process_MODIS_L1C_to_L2(ref, force_process)
File "POLYbatchL2.py", line 159, in process_MODIS_L1C_to_L2
overwrite=force), multiprocessing = 0 # 0 for single thread | -1 for as many thread available | # for multiprocessing with limited number of thread
File "/home/gbourdin/polymer-v4.13/polymer/main.py", line 517, in run_atm_corr
for block in block_iter:
File "/home/gbourdin/polymer-v4.13/polymer/main.py", line 434, in blockiterator
for block in level1.blocks(params.bands_read()):
File "/home/gbourdin/polymer-v4.13/polymer/level1.py", line 175, in blocks
yield self.read_block(size, offset, bands_read)
File "/home/gbourdin/polymer-v4.13/polymer/level1_nasa.py", line 194, in read_block
'rhot_{}'.format(band)][SY, SX], ok=ok)
File "netCDF4/_netCDF4.pyx", line 4452, in netCDF4._netCDF4.Variable.__getitem__
File "netCDF4/_netCDF4.pyx", line 5394, in netCDF4._netCDF4.Variable._get
File "netCDF4/_netCDF4.pyx", line 1928, in netCDF4._netCDF4._ensure_nc_success
RuntimeError: NetCDF: HDF error

I have checked the L1C file I input into Polymer and it looks fine. I have tried to process the next files, but they give the same error and I cannot process any more images.
I am running Polymer on Linux with 24 cores and 240 GB of memory when run with the multiprocessing=-1 option.
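In the meantime, one way to keep the batch running despite such failures is to catch the error per image and record the offending files for later inspection. A minimal sketch, where process_one_image is a hypothetical stand-in for the actual Polymer call in POLYbatchL2.py (not shown here):

import logging

logging.basicConfig(filename="failed_l1c.log", level=logging.WARNING)

def run_batch(l1c_files, process_one_image):
    """Process each L1C file with the supplied function, skipping files whose
    read fails, and return the list of files that could not be processed."""
    failed = []
    for path in l1c_files:
        try:
            process_one_image(path)
        except RuntimeError as err:
            # netCDF4 surfaces low-level HDF5 read failures as
            # RuntimeError("NetCDF: HDF error")
            logging.warning("Skipping %s: %s", path, err)
            failed.append(path)
    return failed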

Best,
Guillaume
Re: NetCDF: HDF error when processing MODIST and VIIRSN

Post by gbourdin »

I think I found the problem: some of my L1C files are corrupted, even though I could read them without any problem in MATLAB ...
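For anyone hitting the same problem: a quick way to flag such files is to force a full read of every variable with netCDF4 and catch the resulting error, since a header-only read can succeed even when data chunks are damaged. A sketch along those lines; the glob pattern is only an example and should be adapted to your own file layout:

import glob
from netCDF4 import Dataset

def walk_vars(group):
    """Yield every variable in a netCDF group and all of its subgroups."""
    yield from group.variables.values()
    for sub in group.groups.values():
        yield from walk_vars(sub)

def is_readable(path):
    """Return True if every variable in the file can be read in full."""
    try:
        with Dataset(path) as nc:
            for var in walk_vars(nc):
                var[...]  # force a full read; corrupted chunks raise
                          # RuntimeError("NetCDF: HDF error")
        return True
    except (OSError, RuntimeError):
        return False

for path in sorted(glob.glob("*.L1C")):  # example pattern; adjust to your files
    if not is_readable(path):
        print("corrupted:", path)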
fsteinmetz
Site Admin
Posts: 306
Joined: Fri Sep 07, 2018 1:34 pm
company / institution: Hygeos
Location: Lille, France
Contact:

Re: NetCDF: HDF error when processing MODIST and VIIRSN

Post by fsteinmetz »

Hi Guillaume,
Good to know that you could identify the issue! Thanks for reporting back.
Cheers,
François
Re: NetCDF: HDF error when processing MODIST and VIIRSN

Post by gbourdin »

Hi François,
As I said before, some of my L1C images were corrupted and I could reprocess some of them, but about 10% of the VIIRS and MODIST images still give me another error. Here is an example with a VIIRS-SNPP image:

Starting processing at 2021-02-16 16:09:16.426121
Initializing output file "/home/gbourdin/process_OCfiles/L2_to_L3/data/transect/V2017113040600.POLY_L2_SNPP.nc"
Processing block: size (500, 400), offset (0, 0)
Processing block: size (500, 400), offset (0, 400)
Processing block: size (500, 400), offset (0, 800)
Processing block: size (500, 400), offset (0, 1200)
Processing block: size (500, 400), offset (0, 1600)
Processing block: size (500, 400), offset (0, 2000)
Processing block: size (500, 400), offset (0, 2400)
Processing block: size (500, 400), offset (0, 2800)
Processing block: size (500, 400), offset (500, 0)
Processing block: size (500, 400), offset (500, 400)
Processing block: size (500, 400), offset (500, 800)
Processing block: size (500, 400), offset (500, 1200)
Processing block: size (500, 400), offset (500, 1600)
Processing block: size (500, 400), offset (500, 2000)
Processing block: size (500, 400), offset (500, 2400)
Processing block: size (500, 400), offset (500, 2800)
Processing block: size (500, 400), offset (1000, 0)
Processing block: size (500, 400), offset (1000, 400)
Processing block: size (500, 400), offset (1000, 800)
Processing block: size (500, 400), offset (1000, 1200)
Processing block: size (500, 400), offset (1000, 1600)
Processing block: size (500, 400), offset (1000, 2000)
Processing block: size (500, 400), offset (1000, 2400)
Processing block: size (500, 400), offset (1000, 2800)
Traceback (most recent call last):
File "POLYbatchL2.py", line 286, in <module>
L2process(singlref, options.instrument, options.force_process)
File "POLYbatchL2.py", line 214, in L2process
process_VIIRS_L1C_to_L2(ref, force_process)
File "POLYbatchL2.py", line 157, in process_VIIRS_L1C_to_L2
overwrite=force), multiprocessing = 0 # 0 for single thread | -1 for as many thread available | # for multiprocessing with limited number of thread
File "/home/gbourdin/polymer-v4.13/polymer/main.py", line 517, in run_atm_corr
for block in block_iter:
File "/home/gbourdin/polymer-v4.13/polymer/main.py", line 434, in blockiterator
for block in level1.blocks(params.bands_read()):
File "/home/gbourdin/polymer-v4.13/polymer/level1.py", line 175, in blocks
yield self.read_block(size, offset, bands_read)
File "/home/gbourdin/polymer-v4.13/polymer/level1_nasa.py", line 211, in read_block
block.ozone[ok] = self.ozone[block.latitude[ok], block.longitude[ok]]
File "/home/gbourdin/polymer-v4.13/polymer/ancillary.py", line 58, in __getitem__
return self.data[Idx(lat), Idx(lon)]
File "/home/gbourdin/polymer-v4.13/polymer/luts.py", line 399, in __getitem__
keys = k.index(self.axes)
File "/home/gbourdin/polymer-v4.13/polymer/luts.py", line 1112, in index
fill_value=fv)(self.value)
File "/home/gbourdin/.conda/envs/Polymer/lib/python3.7/site-packages/scipy/interpolate/polyint.py", line 73, in __call__
y = self._evaluate(x)
File "/home/gbourdin/.conda/envs/Polymer/lib/python3.7/site-packages/scipy/interpolate/interpolate.py", line 659, in _evaluate
below_bounds, above_bounds = self._check_bounds(x_new)
File "/home/gbourdin/.conda/envs/Polymer/lib/python3.7/site-packages/scipy/interpolate/interpolate.py", line 688, in _check_bounds
raise ValueError("A value in x_new is below the interpolation "
ValueError: A value in x_new is below the interpolation range.

I have tried to reprocess them from L1A to L1C, but the same images give this error again.
Thank you,
Best,
Guillaume

Re: NetCDF: HDF error when processing MODIST and VIIRSN

Post by fsteinmetz »

Hi Guillaume,
I have tried the file V2017113040600.L1C and could process it without failure. Which version of SeaDAS are you using?
I suspect that the lat and lon in the L1C may contain wrong values in your case; could you check?
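For example, a quick check along these lines would show any fill or out-of-range values (a sketch; the variable names "latitude" and "longitude" at the root of the file are an assumption about the L1C layout, so adjust them if yours differ):

import numpy as np
from netCDF4 import Dataset

with Dataset("V2017113040600.L1C") as nc:
    # variable names/locations are an assumption; adjust to your L1C layout
    lat = np.ma.filled(nc["latitude"][:], np.nan)
    lon = np.ma.filled(nc["longitude"][:], np.nan)

for name, arr, lo, hi in [("latitude", lat, -90.0, 90.0),
                          ("longitude", lon, -180.0, 180.0)]:
    # flag non-finite values and values outside the physically valid range
    bad = ~np.isfinite(arr) | (arr < lo) | (arr > hi)
    print(f"{name}: min={np.nanmin(arr):.3f}, max={np.nanmax(arr):.3f}, "
          f"suspect pixels: {int(bad.sum())}")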
Cheers,
François