[IGSTK-Developers] SpatialObject calibration transform

David Gobbi dgobbi at atamai.com
Wed Nov 30 17:33:49 EST 2005


Hi James,

I read through this, and it sounded too complicated: spatial objects 
following other spatial objects, positions of spatial objects used to 
figure out orientations, and so on.  What you proposed at the end of 
this email, and expanded on in your follow-up email, sounded better.

I'm opposed to applying any sort of transformation at the SpatialObject 
level except for the transform that is provided by the Tracker.   I've 
done a lot of work in this area, and if you will pardon the pun, the 
difficulty in writing reliable image-guided surgery code increases 
geometrically with each additional transform that is added to the pipeline.

At the very least, any transform at the SpatialObject level must not be 
accessible via public methods, and ideally if such transforms exist they 
should be hard-coded so that they can't be changed at run time.

 - David

Hui Zhang wrote:

> Hi,
>
> Regarding igstkPivotCalibration: I think it can only return a 3D 
> translation, not a rotation. igstkPivotCalibration calculates the 
> transform from the tip (a 3D point) to the reference tracker pose 
> (5-DOF or 6-DOF), so it can only return a 3D translation.
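As a concrete illustration of why pivot calibration yields only a translation: each pivot pose constrains the tip offset and the fixed pivot point, but the tool's orientation never enters the equations. A minimal least-squares sketch (hypothetical names, not the IGSTK API):

```python
import numpy as np

def pivot_calibrate(rotations, positions):
    """Solve R_i @ t_tip + p_i = p_pivot for all poses i (least squares).

    Stacked linear system:  [R_i  -I] @ [t_tip; p_pivot] = -p_i
    Only the 3D tip offset t_tip (plus the fixed pivot point) is
    recoverable; the tool's orientation does not appear anywhere.
    """
    A = np.vstack([np.hstack([R, -np.eye(3)]) for R in rotations])
    b = np.concatenate([-p for p in positions])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]  # tip offset (sensor frame), pivot point (tracker frame)

# Synthetic check: pivot a virtual sensor about a known fixed point.
def rot_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1,0,0],[0,c,-s],[0,s,c]])
def rot_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c,0,s],[0,1,0],[-s,0,c]])

t_true = np.array([0.0, 0.0, -150.0])   # tip 150 mm along the sensor's -z
pivot = np.array([10.0, 20.0, 30.0])
Rs = [np.eye(3), rot_x(0.5), rot_y(0.5)]
ps = [pivot - R @ t_true for R in Rs]    # sensor position for each pose

t_est, pivot_est = pivot_calibrate(Rs, ps)
```

With three distinct orientations the system has full rank, and `t_est` reproduces the true tip offset exactly.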
>
> To solve this problem, I propose the following:
> 1. Use igstkGroupSpatialObject to build a high-level node such as 
> igstkNeedleSpatialObject. Such objects could be extended to 
> igstkProbeSpatialObject, igstkGuidewireSpatialObject, 
> igstkUltrasoundProbeSpatialObject, and so on, and may finally lead to 
> a surgical tool component group;
> 2. igstkNeedleSpatialObject should contain two kinds of 
> SpatialObjects. The first kind is the fixed SpatialObject group, e.g. 
> one igstkCylinderSpatialObject to represent the needle body and 
> another igstkCylinderSpatialObject to represent the needle hip. The 
> transforms between those nodes are fixed relative to each other and 
> measured from the real surgical tool's shape. The second kind is the 
> moving Sensor/MarkerSpatialObject, which directly represents the 
> sensor's pose as reported by the tracking system. The whole 
> igstkNeedleSpatialObject should have a SetCalibrationTransform member 
> function to set the transform between the fixed SpatialObjects and 
> the moving Sensor/MarkerSpatialObject;
> 3. When we use the AttachTrackerToolToObject or 
> AttachObjectToTrackerTool functions, the poses reported by the 
> tracking system, without any calibration or transform applied, are 
> linked directly to that moving Sensor/MarkerSpatialObject, which 
> drives the whole surgical tool object;
> 4. Since igstkPivotCalibration can mathematically provide only a 3D 
> translation, we need at least two 3D points to recover the rotation. 
> What I did in my software was to use two points (the tip and the hip 
> of the needle) to set the correct CalibrationTransform. The steps 
> are: A) use PivotCalibration to get the 3D translation from the tip 
> to the sensor/marker; B) add an offset from the hip to the tip 
> (generally along the z-axis) to get the second 3D translation (from 
> the hip to the sensor/marker); C) use those two 3D points to set the 
> correct 5-DOF/6-DOF calibration matrix (including translation and 
> rotation).
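The layering in steps 1-3 can be sketched as a tiny transform chain (hypothetical names and a toy scene graph, not the IGSTK API): the tracker pose drives the moving sensor node every frame, and the fixed geometry hangs off it through the calibration transform set once.

```python
import numpy as np

def make_transform(R=np.eye(3), t=np.zeros(3)):
    """Build a 4x4 homogeneous transform from a rotation and translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

class NeedleGroup:
    """Toy sketch of the proposed igstkNeedleSpatialObject layering."""
    def __init__(self):
        self.calibration = np.eye(4)   # sensor -> needle geometry (fixed once calibrated)
        self.tracker_pose = np.eye(4)  # tracker -> sensor (updated every frame)
        self.tip_local = np.array([0.0, 0.0, 0.0, 1.0])  # tip in the needle's own frame

    def set_calibration_transform(self, T):
        self.calibration = T

    def on_tracker_update(self, T):
        self.tracker_pose = T

    def tip_in_tracker_space(self):
        # The raw tracker pose composes with the calibration to place the geometry.
        return self.tracker_pose @ self.calibration @ self.tip_local

needle = NeedleGroup()
needle.set_calibration_transform(make_transform(t=np.array([0.0, 0.0, -150.0])))
needle.on_tracker_update(make_transform(t=np.array([10.0, 20.0, 30.0])))
tip = needle.tip_in_tracker_space()  # -> [10, 20, -120, 1]
```

The point of the arrangement is that only the tracker update changes at run time; the calibration stays fixed between the moving sensor node and the fixed geometry.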
>
> Since step 4 is not clear-cut and will change under different 
> conditions, I am thinking of adding a new calibration class called 
> igstkReferenceCalibration to compute the full calibration transform, 
> including translation and rotation. The working pipeline would be:
> 1. Use igstkPivotCalibration to get the 3D translation from the tip 
> to the sensor/marker;
> 2. Use the tip's 3D translation, plus a second 3D translation 
> obtained by offsetting the tip's translation (the hip, in the general 
> case), as input. That is, feed two 3D points (or any 5-DOF point) 
> into igstkReferenceCalibration, along with the concurrent 5-DOF/6-DOF 
> sensor/marker pose, to get the 5-DOF/6-DOF calibration transform, 
> which includes the 3D translation and the 2D/3D rotation;
> 3. Use the output of igstkReferenceCalibration as the argument to 
> SetCalibrationTransform on either igstkSpatialObject or 
> igstkTrackerTool to drive the surgical tool.
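A sketch of what such a reference calibration might compute (hypothetical helper, not an existing IGSTK class): the tip-to-hip direction gives the needle axis in sensor coordinates, and a rotation aligning the tool model's axis with it, combined with the pivot-calibrated tip translation, gives the full calibration. Roll about the needle axis is unobservable from two points, which is why the result is 5-DOF rather than 6-DOF.

```python
import numpy as np

def rotation_aligning(a, b):
    """Rodrigues rotation taking unit direction a onto unit direction b.

    Undefined when a == -b (any 180-degree flip about a perpendicular
    axis would do); that degenerate case is omitted from this sketch.
    """
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    v, c = np.cross(a, b), a @ b
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def reference_calibrate(tip_offset, hip_offset, model_axis=np.array([0.0, 0.0, 1.0])):
    """Full calibration from two points expressed in the sensor frame.

    Returns (R, t): R aligns the tool model's long axis with the
    measured tip->hip direction, t is the pivot-calibrated tip offset.
    """
    return rotation_aligning(model_axis, hip_offset - tip_offset), tip_offset

# Example: tip from pivot calibration, hip 100 mm further along the
# sensor frame's x axis, so the model's +z must rotate onto +x.
tip = np.array([5.0, 0.0, -150.0])
hip = tip + np.array([100.0, 0.0, 0.0])
R, t = reference_calibrate(tip, hip)
```

The returned `R` is a proper rotation, so it can be composed directly into a homogeneous calibration matrix together with `t`.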
>
> Regards,
>
> James
>
> ---------------------------------------------------------------------------------------------- 
>
> ----- Original Message ----- From: "David Gobbi" <dgobbi at atamai.com>
> To: "Patrick Cheng" <cheng at isis.georgetown.edu>
> Cc: "'IGSTK-developers'" <igstk-developers at public.kitware.com>
> Sent: Wednesday, November 30, 2005 2:38 PM
> Subject: Re: [IGSTK-Developers] SpatialObject calibration transform
>
>
>> Hi Patrick,
>>
>> Thanks for the summary.  The reason that I would prefer to add the 
>> orientation as part of the tool calibration transform is that I think 
>> we should keep the number of coordinate transformations in IGSTK to a 
>> minimum.
>>
>> Each coordinate transformation is a possible source of error.  Even 
>> worse, if two different transformations can be used to achieve the 
>> same result (e.g. a tool calibration transform and a spatial object 
>> calibration transform), then both transforms can have errors that 
>> cancel each other out.  These sorts of errors can be very hard to debug.
>>
>> I think the best solution is if the ToolTipCalibration class has a 
>> method that allows you to specify the orientation of the tool that is 
>> being calibrated.
>>
>> - David
>>
>> Patrick Cheng wrote:
>>
>>> Hi everybody,
>>>
>>> What I was trying to say in the Tcon is:
>>>
>>> The cylinder object's longest axis is aligned with the Y axis in 
>>> image space (inherited from ITK by default), while the probe is 
>>> aligned with the tracker's X axis.
>>>
>>> So although the PatientTransform essentially aligns image space 
>>> with tracker space, the cylinder spatial object and the probe are 
>>> still not aligned correctly. We first need to rotate our spatial 
>>> object to align with the probe, and then use the pivot calibration 
>>> transform to get the tip translation.
>>>
>>> This adjustment of the spatial object's coordinate system can be 
>>> done by adding the rotational information to the tool calibration 
>>> transform, or by adding it to an upper-level transform when we 
>>> construct a spatial object group to model the probe.
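For the specific mismatch Patrick describes, the adjustment is a single fixed rotation: a -90-degree rotation about Z carries the model's +Y axis onto the tracker tool's +X axis, and it can be pre-composed into the tool calibration transform. A minimal sketch (hypothetical values, not the IGSTK API):

```python
import numpy as np

# Cylinder model: long axis along +Y (the ITK default); probe: long axis
# along the tracker tool's +X.  Rz(-90 degrees) maps +Y onto +X.
theta = -np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])

model_axis = np.array([0.0, 1.0, 0.0])   # cylinder's long axis
probe_axis = Rz @ model_axis             # -> [1, 0, 0] up to rounding

# Folding the rotation into the tool calibration transform: the pivot
# translation is kept as-is, only the rotational part is augmented.
tip_translation = np.array([0.0, 0.0, -150.0])  # hypothetical pivot result
calibration = np.eye(4)
calibration[:3, :3] = Rz
calibration[:3, 3] = tip_translation
```

This keeps the number of run-time transforms unchanged, which is in line with David's preference for folding the orientation into the tool calibration rather than adding a separate SpatialObject-level transform.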
>>>
>>> Patrick
>>
>>
>>
>> _______________________________________________
>> IGSTK-Developers mailing list
>> IGSTK-Developers at public.kitware.com
>> http://public.kitware.com/cgi-bin/mailman/listinfo/igstk-developers
>>
>
>
>



