[Insight-users] Filter buffers and memory limits
Urlek
crapie at email.si
Fri Nov 21 11:40:07 EST 2008
Hi,
even when I push my big 3D images through a fairly simple
pipeline, the memory footprint of the process is several times bigger than
what I would expect from the task. I guess filters are buffering the image
as it passes through the pipeline, so that multiple (different) copies reside
in memory at any given time.
How do you usually go about this, when available memory is a limit? Do
you process the image in single steps, releasing filters once they have been
used? Where could I read more on ITK's memory management and on how to
optimally run a processing pipeline in memory-critical cases?
Thanks a lot!
--
View this message in context: http://www.nabble.com/Filter-buffers-and-memory-limits-tp20624873p20624873.html
Sent from the ITK - Users mailing list archive at Nabble.com.