[yt-dev] Issue #1328: "Too many files" error when trying to create ExodusII simulation (yt_analysis/yt)

Alexander Lindsay issues-reply at bitbucket.org
Thu Mar 2 11:44:23 PST 2017


New issue 1328: "Too many files" error when trying to create ExodusII simulation
https://bitbucket.org/yt_analysis/yt/issues/1328/too-many-files-error-when-trying-to-create

Alexander Lindsay:

We have a user with a large transient simulation broken up into several Exodus files because of adaptive mesh refinement. In these AMR simulations it is generally not known in advance which file contains a given time point. One can determine the file with a script like the following:
```
import yt

sim = yt.simulation("MOOSE_sample_data", "ExodusII")
sim.get_time_series()

desired_time = 1000.0
for ds in sim:
    # Extract the current time as a plain float scalar.
    time = ds.current_time.value[()]
    if time == desired_time:
        # Record which file holds this time and the step within it.
        file_name = ds.index_filename
        step = ds.step
        break
```
However, the data set is large enough that executing `sim.get_time_series()` fails with:
```
  File "netCDF4/_netCDF4.pyx", line 1811, in netCDF4._netCDF4.Dataset.__init__ (netCDF4/_netCDF4.c:12626)
IOError: Too many open files
```
This likely corresponds to this issue: https://github.com/Unidata/netcdf4-python/issues/113
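
In the meantime, a possible stopgap is to raise the process's open-file limit from within Python before calling `sim.get_time_series()`. This is a minimal sketch using the standard-library `resource` module, assuming a Unix-like system; it can only raise the soft limit up to the hard limit imposed by the OS, and the 4096 here is an arbitrary illustrative value:
```
import resource

# Query the current soft and hard limits on open file descriptors.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise the soft limit; an unprivileged process cannot exceed the
# hard limit, so cap the request there (RLIM_INFINITY means no cap).
new_soft = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
```
This only postpones the failure for a sufficiently large number of files, so a fix that closes handles eagerly would still be preferable.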

Does anyone (@atmyers @ngoldbaum) have an idea for a way around this? As I understand it, the netCDF4 `Dataset` object is persistent and is in general kept open because variable values aren't read from disk until they are needed; my understanding could be off, however.
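
One possible workaround, sketched below, is to locate the right file before involving yt at all: open each Exodus file directly with netCDF4, read its stored output times, and close the handle immediately, so at most one file is open at a time. The `time_whole` record variable and the glob pattern are assumptions about the standard ExodusII layout and the data directory, and the exact float comparison mirrors the script above:
```
import glob

import netCDF4
import numpy as np

desired_time = 1000.0
file_name, step = None, None

# Open each file only long enough to read its output times, then
# close it, so at most one netCDF handle exists at any moment.
for fname in sorted(glob.glob("MOOSE_sample_data/*.e*")):
    nc = netCDF4.Dataset(fname)
    try:
        times = np.asarray(nc.variables["time_whole"][:])
    finally:
        nc.close()
    matches = np.where(times == desired_time)[0]
    if matches.size > 0:
        file_name = fname
        step = int(matches[0])  # step index within this file
        break
```
The matching file could then be loaded on its own, e.g. with `yt.load(file_name, step=step)`, without ever building the full time series.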



