Hello François,
I get an error when I try to batch-process thousands of images, subsetting each one beforehand: the number of open files reaches the limit set by the operating system (currently 1024).
I will increase it to 4096 to see whether that helps, but in the meantime I was wondering if there is a step in polymer that keeps files open.
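As an alternative to changing the limit system-wide, the per-process descriptor limit can be inspected and raised from inside the batch script itself. A minimal sketch using Python's standard `resource` module (Unix only; an unprivileged process can raise its soft limit up to, but not past, the hard limit):

```python
import resource

# Query the current soft and hard limits on open file descriptors.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise (or cap) the soft limit; the hard limit cannot be exceeded
# without privileges, so clamp the target to it.
target = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
```

This only buys headroom, of course; if file handles leak, any limit is eventually reached.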
Following another post, I subset each image according to lat/lon limits that I set:
# I open the full file:
l1full = Level1_OLCI(ref + '.SEN3')
# I find the start and end lines and columns:
startline, endline, startcol, endcol = subset_img(l1full, n, s, e, w)
# and finally run the program on the subset:
run_atm_corr(
    Level1_OLCI(ref + '.SEN3',
                ancillary=Ancillary_ERA5(),
                sline=startline,
                eline=endline,
                scol=startcol,
                ecol=endcol),
    Level2_NETCDF(outdir=PATH_TO_DATA, ext=ref + '.POLY_L2_SEN3.nc',
                  overwrite=force),
    multiprocessing=-1,
)
I obtain this error after several hundred images have been processed without issue:
Process L2 /PATH_TO_DATA/S3A_OL_2_ERR____20171009T201159_20171009T205614_20180718T142801_2655_023_128______LR2_R_NT_002 ...
height=15087, width=1217
Process L2 failed, unknown error:/PATH_TO_DATA/S3A_OL_1_ERR____20171009T201159_20171009T205614_20180718T142801_2655_023_128______LR2_R_NT_002.SEN3 ignored
Traceback (most recent call last):
File "POLYbatchL2.py", line 151, in process_SENT3_L1_to_L2
File "/PATH_TO_POL/polymer-v4.13/polymer/main.py", line 517, in run_atm_corr
File "/PATH_TO_POL/polymer-v4.13/polymer/main.py", line 434, in blockiterator
File "/PATH_TO_POL/polymer-v4.13/polymer/level1_olci.py", line 351, in blocks
File "/PATH_TO_POL/polymer-v4.13/polymer/level1_olci.py", line 261, in read_block
File "/PATH_TO_POL/polymer-v4.13/polymer/level1_olci.py", line 206, in read_band
File "/PATH_TO_POL/polymer-v4.13/polymer/level1_olci.py", line 172, in get_ncroot
File "netCDF4/_netCDF4.pyx", line 2358, in netCDF4._netCDF4.Dataset.__init__
File "netCDF4/_netCDF4.pyx", line 1926, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno 24] Too many open files: b'/PATH_TO_DATA/S3A_OL_1_ERR____20170804T201537_20170804T205952_20170806T002222_2655_020_342______LN1_O_NT_002.SEN3/Oa11_radiance.nc'
Thank you,
Cheers,
Guillaume
OSError: [Errno 24] Too many open files:
- fsteinmetz
- Site Admin
- Posts: 315
- Joined: Fri Sep 07, 2018 1:34 pm
- company / institution: Hygeos
- Location: Lille, France
Re: OSError: [Errno 24] Too many open files:
Hi Guillaume,
It seems that a file is not being closed, maybe due to a garbage reference?
If you are doing your processing in a Python loop, I would suggest defining a function that processes a single file, if that is not already the case. This helps avoid garbage references, since everything created inside the function goes out of scope when it returns.
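A minimal sketch of this pattern, with `process_one` and `process_batch` as illustrative names (not polymer's API) and plain file handles standing in for netCDF datasets:

```python
import gc

def process_one(path):
    # Everything created here (file handles, dataset wrappers, ...) is
    # local: when the function returns, nothing outside holds a reference
    # to it, so it can be closed and collected promptly.
    with open(path) as f:
        return sum(1 for _ in f)

def process_batch(paths):
    results = {}
    for p in paths:
        results[p] = process_one(p)
        # Optionally force a collection pass between files, for objects
        # that only release their descriptors in __del__.
        gc.collect()
    return results
```

The key point is that no per-file object survives past its own iteration of the loop.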
I hope this helps, cheers,
François
- Posts: 24
- Joined: Wed Nov 04, 2020 8:49 pm
- company / institution: University of Maine
- Location: Maine, USA
Re: OSError: [Errno 24] Too many open files:
I was already using a function that processes single files, but you were right: it was due to garbage references I was creating when doing my subsets. Solved now, thank you!
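Concretely, the fix is to drop the full-scene object (`l1full` in my snippet above) once the subset bounds are computed, so its open files can be released before the correction runs. A generic illustration of why that works in CPython, with `Scene` and `subset_bounds` as hypothetical stand-ins:

```python
import weakref

class Scene:
    """Stand-in for a Level1 object that holds open file handles."""
    pass

def subset_bounds(scene):
    # Return only plain values; do not keep a reference to the scene.
    return (0, 100, 0, 50)

full = Scene()
watcher = weakref.ref(full)

bounds = subset_bounds(full)
del full  # drop the full-scene reference once the bounds are known

# With no remaining reference, CPython frees the object immediately,
# which is when its open files would be closed.
assert watcher() is None
```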
Cheers,
Guillaume