[Insight-users] Batch resampling: error writing files of different size

Jolinda Smith jolinda@darkwing.uoregon.edu
Mon, 5 May 2003 16:21:46 -0700


I'm still having a problem -- my errors are gone, but now no output file
is being produced. UpdateLargestPossibleRegion() doesn't appear to have
any effect whatsoever. Calling UpdateLargestPossibleRegion() and then
Update() on the writer doesn't work either -- it gives the same errors
as just calling Update(). What else should I be doing?
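
For reference, a sketch of what the loop body looks like with that change
(assuming the same reader/writer/filter objects and typedefs as in the
code quoted below; this is an illustration, not verified to fix the
problem):

  reader->SetFileName(input_filename);
  writer->SetFileName(output_filename);
  filter->SetSize(size);

  try
  {
    // Ask the pipeline to re-negotiate the requested regions for the
    // new (possibly larger) image instead of reusing the requests
    // cached by the previous Update().
    writer->UpdateLargestPossibleRegion();
  }
  catch (itk::ExceptionObject& err)
  {
    cout << err << endl;
    return -1;
  }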


> Call UpdateLargestPossibleRegion() instead of Update() on the Writer.
>
> The first time a pipeline is executed, Update() and
> UpdateLargestPossibleRegion() do the same thing.  After the first
> update, however, Update() always tries to produce the same requests
> as the previous call to Update().  If you change the file attached to
> the pipeline, the image size can change (as you found out), and using
> the "cached" region requests is inappropriate.  Calling
> UpdateLargestPossibleRegion() will force the pipeline to adapt to the
> image sizes.
>
> In general, you want to call Update() when a pipeline or data has
> been modified but you "know" the image sizes have not changed.  After
> a change to the pipeline or data that causes a change in image sizes,
> you need to call UpdateLargestPossibleRegion() or explicitly manage
> the RequestedRegion() on the last node in the pipeline.
>
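
For illustration, a minimal sketch of the other alternative mentioned
above, explicitly managing the RequestedRegion() on the last node in
the pipeline before a plain Update(). Treating filter->GetOutput() as
"the last node" is an assumption here, and this is not verified against
the writer's behavior in this thread:

  // After setting the new filenames and size, re-read the image
  // information so the largest possible region reflects the new file,
  // then explicitly request the whole region before updating.
  filter->GetOutput()->UpdateOutputInformation();
  filter->GetOutput()->SetRequestedRegionToLargestPossibleRegion();
  writer->Update();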
>
> > -----Original Message-----
> > From: Jolinda Smith [mailto:jolinda@darkwing.uoregon.edu]
> > Sent: Wednesday, April 30, 2003 3:19 PM
> > To: insight users
> > Subject: [Insight-users] Batch resampling: error writing files of
> > different size
> >
> >
> > Hello,
> >
> > I'd like to resample several files in a batch process. I've written
> > a program that parses a text file containing input filenames,
> > output filenames, and resampling parameters. I set up the pipeline,
> > call "SetFileName" on the reader and writer, set resampling
> > parameters, and update the writer. Then I read in new filenames and
> > parameters, set them, and update the writer again. However, if the
> > size of the output image is larger than it was for the previous set
> > of files, the writer fails (referencing memory at 0x00000 when
> > writing MetaImage output, an ITK exception: "error writing image
> > data" when writing Analyze output). A stripped-down version of the
> > code is below.
> >
> > Is the buffer for the output image data not getting resized
> > correctly? Is there something else I can do to make this work?
> >
> > ReaderType::Pointer reader = ReaderType::New();
> > WriterType::Pointer writer = WriterType::New();
> > FilterType::Pointer filter = FilterType::New();
> >
> > filter->SetInput( reader->GetOutput() );
> > writer->SetInput( filter->GetOutput() );
> >
> > while (file.good()) {
> >
> >   getline(file, input_filename);
> >   getline(file, output_filename);
> >
> >   OutputImageType::SizeType size;
> >   file >> size[0];
> >   file >> size[1];
> >   file >> size[2];
> >
> >   reader->SetFileName(input_filename );
> >   writer->SetFileName(output_filename);
> >   filter->SetSize(size);
> >
> >   try
> >   {
> >     writer->Update();
> >   }
> >   catch (itk::ExceptionObject& err)
> >   {
> >     cout << err << endl;
> >     return -1;
> >   }
> >
> > }
> >
> > _______________________________________________
> > Insight-users mailing list
> > Insight-users@public.kitware.com
> > http://public.kitware.com/mailman/listinfo/insight-users
> >
>