[yt-users] How to fully close a database

Nathan Goldbaum nathan12343 at gmail.com
Tue Feb 7 11:42:29 PST 2017


On Tue, Feb 7, 2017 at 1:13 PM, Yingchao Lu <yingchao.lu at rice.edu> wrote:

> Hi All,
>
>
>
> I want to do a time query over multiple HDF5 files as follows:
>
>
>
> import yt
> from glob import glob
>
> fns = glob('flash_hdf5_plf_*')
> data = []
>
> for fn in fns:
>     ds = yt.load(fn)
>     pt = ds.point([0, 0, 0])
>     data.append([ds.parameters['time'], pt.mean('tele')])
>     ds.close()
>
> print(data)
>
>
>
> At the beginning, it takes about 1 s to process each file. But later, it
> becomes slower and slower. In the end, 100 files take about 10 minutes, much
> longer than 100 * 1 s. I guess the database may not be fully closed.
>

Doing "del ds" should completely close and free the dataset object
(although Python's garbage collector should be doing that on each iteration
of your for loop).
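
For example, here is a minimal sketch of your loop with an explicit del and a
manual garbage-collection pass (the gc.collect() call is my addition and not
something yt requires), assuming the same file pattern and 'tele' field:

    import gc
    import yt
    from glob import glob

    data = []
    for fn in sorted(glob('flash_hdf5_plf_*')):
        ds = yt.load(fn)
        pt = ds.point([0, 0, 0])
        data.append([ds.parameters['time'], pt.mean('tele')])
        del pt, ds    # drop the references so the objects can be freed
        gc.collect()  # force a collection pass between files

    print(data)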

There are some operations in your script (in particular "pt.mean('tele')")
that might be slower later in the time series, especially if the number of
cells in the simulation increases with time.
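
If you want to check whether that is what is happening, you can time each
iteration separately; a rough sketch (nothing yt-specific here):

    import time

    for fn in fns:
        t0 = time.time()
        ds = yt.load(fn)
        pt = ds.point([0, 0, 0])
        pt.mean('tele')
        print(fn, time.time() - t0, 'seconds')
        del pt, ds

If the per-file time grows with the file index, the slowdown is coming from
the data itself rather than from datasets not being closed.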


> When I try ds.field_info() after ds.close(), it still displays the field
> information. Does anyone have the same problem?
>
>
>
>
>
> Yingchao
>

