TubeTK/Intra-operative Ultrasound Registration

From KitwarePublic

Related Works

Surface-to-ultrasound registration

Ultrasound simulation

  • Real-Time Simulation of Medical Ultrasound from CT Images
    • Lecture Notes in Computer Science, Proceedings of the 11th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2008), Part II, New York, New York, Pages: 734-741
    • Ramtin Shams, Richard Hartley
      • RSISE, The Australian National University, Canberra, and NICTA, Canberra, Australia
    • Nassir Navab, Computer Aided Medical Procedures (CAMP), TU München, Germany
    • Medical ultrasound interpretation requires a great deal of experience. Real-time simulation of medical ultrasound provides a cost-effective tool for training and easy access to a variety of cases and exercises. However, fully synthetic and realistic simulation of ultrasound is complex and extremely time-consuming. In this paper, we present a novel method for simulation of ultrasound images from 3D CT scans by breaking down the computations into a preprocessing and a run-time phase. The preprocessing phase produces detailed fixed-view 3D scattering images and the run-time phase generates view-dependent ultrasonic artifacts for a given aperture geometry and position within a volume of interest. We develop a simple acoustic model of the ultrasound for the run-time phase, which produces realistic ultrasound images in real-time when combined with the previously computed scattering image.
    • http://portal.acm.org/citation.cfm?id=1483392
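The precompute/run-time split described above can be illustrated with a toy 1-D scanline model: a per-voxel acoustic-impedance map (the precomputable part) and a run-time pass that accumulates reflection and attenuation along the beam. The HU-to-impedance mapping and gain constants below are illustrative assumptions, not the paper's acoustic model.

```python
import numpy as np

def simulate_us_scanline(ct_hu, tgc=0.3):
    """Toy view-dependent ultrasound scanline from a 1-D column of CT
    Hounsfield units. All constants are illustrative, not the paper's."""
    # Map HU to a crude acoustic-impedance proxy (precomputable per voxel).
    z = 1.0 + (ct_hu + 1000.0) / 2000.0
    # Reflection coefficient at each interface along the beam.
    r = np.abs(np.diff(z) / (z[1:] + z[:-1]))
    # Run-time: accumulate attenuation with depth; echoes weaken as the
    # beam loses energy at each interface it crosses.
    transmitted = np.cumprod(1.0 - r)
    echo = r * np.concatenate(([1.0], transmitted[:-1]))
    # Simple time-gain compensation to brighten deep echoes.
    depth = np.arange(echo.size)
    return echo * np.exp(tgc * depth / echo.size)

# Example: a water/soft-tissue/bone column produces echoes at the
# two tissue interfaces, with the strongest at the tissue/bone boundary.
column = np.array([0, 0, 40, 40, 40, 700, 700], dtype=float)
scanline = simulate_us_scanline(column)
```

A real implementation would run this per scanline of the aperture geometry over the 3D volume of interest.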
  • Advanced training methods using an Augmented Reality ultrasound simulator
    • Blum, T.; Heining, S.M.; Kutter, O.; Navab, N.; Comput. Aided Med. Procedures & Augmented Reality (CAMP), Tech. Univ. München, Munich, Germany
    • This paper appears in: Mixed and Augmented Reality, 2009 (ISMAR 2009), 8th IEEE International Symposium on, Pages: 177-178
    • Ultrasound (US) is a medical imaging modality that is extremely difficult to learn, as it is user-dependent, has low image quality, and requires extensive knowledge of US physics and human anatomy. For training US we propose an Augmented Reality (AR) ultrasound simulator where the US slice is simulated from a CT volume. The location of the US slice inside the body is visualized using contextual in-situ techniques. We also propose advanced methods for using an AR simulator for training.
    • http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=5336476
  • Registration of 3D ultrasound to computed tomography images of the kidney
    • http://circle.ubc.ca/handle/2429/27817?show=full
    • The integration of 3D computed tomography (CT) and ultrasound (US) is of considerable interest because it can potentially improve many minimally invasive procedures such as robot-assisted laparoscopic partial nephrectomy. Partial nephrectomy patients often receive preoperative CT angiography for diagnosis. The 3D CT image is of high quality and has a large field of view. Intraoperatively, dynamic real-time images are acquired using ultrasound. While US is real-time and safe for frequent imaging, the images captured are noisy and only provide a limited perspective. Providing accurate registration between the two modalities would enhance navigation and image guidance for the surgeon because it can bring the pre-operative CT into a current view of the patient provided by US.
    • The challenging aspect of this registration problem is that US and CT produce very different images. Thus, a recurring strategy is to use preprocessing techniques to highlight the similar elements between the images. The registration technique presented here goes further by dynamically simulating an US image from the CT, and registering the simulated image to the actual US. This is validated on US and CT volumes of porcine phantom data. Validation on realistic phantoms remains an ongoing problem in the development of registration methods. A detailed protocol is presented here for constructing tissue phantoms that incorporate contrast agent into the tissue such that the kidneys appear representative of in vivo human CT angiography. Registration with 3D CT is performed successfully on the reconstructed 3D US volumes, and the mean TREs ranged from 1.8 to 3.5 mm. In addition, the simulation-based algorithm was revised to consider the shape of the US beam by using pre-scan converted US data. The corresponding CT image is iteratively interpolated along the direction of the US beam during simulation. The mean TREs resulting from registering the pre-scan US data and CT data ranged from 1.4 to 2.6 mm. The results show that both methods yield similar results and are promising for clinical application. Finally, the method is tested on a set of in vivo CT and US images of a partial nephrectomy patient, and the registration results are discussed.
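The simulate-then-register strategy described above can be sketched as a search over candidate transforms, re-simulating an ultrasound-like image from the CT at each pose and scoring it against the measured US. The 1-D "simulator" (gradient magnitude) and correlation metric below are placeholders for the thesis's actual simulation and similarity measure.

```python
import numpy as np

def register_by_simulation(ct_profile, us_profile, shifts, simulate):
    """Minimal sketch of simulation-based registration: for each candidate
    transform (here just a 1-D shift), simulate an ultrasound profile from
    the CT, then score it against the measured ultrasound."""
    best_shift, best_score = None, -np.inf
    for s in shifts:
        sim = simulate(np.roll(ct_profile, s))
        # Normalized cross-correlation as a stand-in similarity metric.
        score = np.corrcoef(sim, us_profile)[0, 1]
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift, best_score

# Toy data: the "ultrasound" is the gradient magnitude of a shifted CT
# profile, so the best transform recovers the shift.
simulate = lambda p: np.abs(np.gradient(p))
ct = np.zeros(64)
ct[20:30] = 100.0
us = simulate(np.roll(ct, 5))
shift, score = register_by_simulation(ct, us, range(-8, 9), simulate)
```

In practice the search over a 1-D shift is replaced by an optimizer over six rigid-transform parameters, and the simulation runs over a full slice or volume.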
  • Rigid registration of segmented volumes in frequency domain using spherical correlation
    • Proceedings of the 12th WSEAS International Conference on Mathematical Methods, Computational Techniques and Intelligent Systems, Pages: 234-238
    • An algorithm for the rigid registration of binary volumes is described in this paper. The binary volumes result from a segmentation of ovarian ultrasound volumes. Rigid registration is performed in the frequency domain, where the rotation and translation can be calculated separately. The rotation is calculated from the amplitude spectrum with the help of spherical correlation. The method was tested on 100 synthetic ultrasonic volume pairs. Registration accuracy was estimated by a ratio ρ that compares the intersection volume of the two registered volumes to the final volume. The average ratio ρ between registered volumes was 0.50 (std. 0.09). For comparison, the ground-truth transformation used to create the synthetic volumes yielded an average ratio ρ of 0.53 (std. 0.08).
    • http://portal.acm.org/citation.cfm?id=1844544&CFID=102173375&CFTOKEN=74693374
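The translation half of this frequency-domain scheme can be illustrated with standard phase correlation; the rotation step, which uses spherical correlation of the amplitude spectrum, is beyond this sketch.

```python
import numpy as np

def phase_correlation_shift(vol_a, vol_b):
    """Recover the translation between two volumes via phase correlation:
    the normalized cross-power spectrum of two shifted volumes is a pure
    phase ramp, whose inverse FFT is a delta at the shift."""
    fa, fb = np.fft.fftn(vol_a), np.fft.fftn(vol_b)
    cross = fa * np.conj(fb)
    cross /= np.maximum(np.abs(cross), 1e-12)  # keep phase only
    corr = np.real(np.fft.ifftn(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Interpret peaks past the midpoint as negative (wrapped) shifts.
    return tuple(p - s if p > s // 2 else p
                 for p, s in zip(peak, corr.shape))

# Toy binary "segmentation": a cube translated by (2, -3, 1).
a = np.zeros((16, 16, 16))
a[4:8, 6:10, 5:9] = 1.0
b = np.roll(a, (2, -3, 1), axis=(0, 1, 2))
shift = phase_correlation_shift(b, a)
```

Because translation only affects the phase of the spectrum, the amplitude spectrum is translation-invariant, which is why the paper's method can estimate rotation from it independently before solving for translation.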

Registration

  • Physically-based deformable image registration with material properties and boundary conditions
    • http://wwwx.cs.unc.edu/~mn/sites/default/files/lee2010_physically-based-deformable-image-registration.pdf
    • We propose a new deformable medical image registration method that uses a physically-based simulator and an iterative optimizer to estimate the simulation parameters determining the deformation field between the two images. Although a simulation-based registration method can enforce physical constraints exactly and account for different material properties, it normally requires hand adjustment of the material properties, and boundary conditions cannot be acquired directly from the images. We treat the material properties and boundary conditions as parameters for the optimizer, and integrate the physically-based simulation into the optimization loop to generate a physically accurate deformation automatically.
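The idea of exposing a material property to the optimizer can be illustrated with a toy 1-D forward model: deform the moving image with each candidate stiffness and keep the value whose result best matches the fixed image. The "simulator", displacement field, and coarse grid search below are stand-ins for the paper's physically-based simulation and iterative optimizer.

```python
import numpy as np

def fit_material_parameter(simulate, moving, fixed, candidates):
    """Sketch of treating a material property as an optimization variable:
    run the forward simulator for each candidate stiffness and keep the
    parameter whose deformed image best matches the fixed image."""
    errors = [np.sum((simulate(moving, k) - fixed) ** 2) for k in candidates]
    best = int(np.argmin(errors))
    return candidates[best], errors[best]

# Toy 1-D "simulator": stiffness k scales a fixed displacement field.
x = np.linspace(0.0, 1.0, 200)
image = np.exp(-((x - 0.5) ** 2) / 0.01)        # a blob to deform

def simulate(img, k):
    # Displace samples by k * u(x); linear interpolation resamples img.
    u = 0.1 * np.sin(np.pi * x)                 # unit displacement field
    return np.interp(x - k * u, x, img)

fixed = simulate(image, 0.4)                    # "observed" deformation
k_hat, err = fit_material_parameter(simulate, image, fixed,
                                    np.linspace(0.0, 1.0, 21))
```

The paper replaces the grid search with an iterative optimizer and the toy resampler with a physical simulation, so the recovered parameters are physically meaningful rather than merely descriptive of the deformation.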
  • Mutual-information-based image to patient re-registration using intraoperative ultrasound in image-guided neurosurgery
    • http://ukpmc.ac.uk/abstract/MED/18975707;jsessionid=5437290B6533DA7FFD4DBA261D257325.jvm4
    • An image-based re-registration scheme has been developed and evaluated that uses fiducial registration as a starting point to maximize the normalized mutual information (nMI) between intraoperative ultrasound (iUS) and preoperative magnetic resonance images (pMR). We show that this scheme significantly (p<0.001) reduces tumor boundary misalignment between iUS pre-durotomy and pMR from an average of 2.5 mm to 1.0 mm in six resection surgeries. The corrected tumor alignment before dural opening provides a more accurate reference for assessing subsequent intraoperative tumor displacement, which is important for brain shift compensation as surgery progresses. In addition, we report the translational and rotational capture ranges necessary for successful convergence of the nMI registration technique (5.9 mm and 5.2 deg, respectively). The proposed scheme is automatic, sufficiently robust, and computationally efficient (<2 min), and holds promise for routine clinical use in the operating room during image-guided neurosurgical procedures.
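As a reference for the similarity measure maximized above, normalized mutual information is commonly defined as nMI = (H(A) + H(B)) / H(A, B), estimated from a joint intensity histogram. The sketch below uses that common definition, which may differ in detail from the paper's implementation.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=32):
    """nMI = (H(A) + H(B)) / H(A, B), estimated from a joint histogram.
    Higher values indicate better intensity correspondence."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pab = joint / joint.sum()
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(pa) + entropy(pb)) / entropy(pab.ravel())

# nMI peaks at alignment: an image against itself scores higher than
# against a shifted copy of itself.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
aligned = normalized_mutual_information(img, img)
shifted = normalized_mutual_information(img, np.roll(img, 7, axis=0))
```

Because the metric only assumes a statistical relationship between intensities, not equal intensities, it suits multi-modal pairs like the iUS and pMR images above; the reported capture ranges (5.9 mm, 5.2 deg) reflect how far the initial fiducial registration can be from the optimum while still converging.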