Development of a smart portable ultrasound system for real-time guidance of minimally invasive procedures.
Abstract
Minimally invasive procedures are crucial in the diagnosis and treatment of non-communicable diseases such as cancer. These procedures require accurate needle placement and are therefore performed under the guidance of medical imaging. Ultrasound is widely used because it is inexpensive, operates in real time, and emits no ionising radiation. Handheld 2D ultrasound devices are now in common use; however, they still suffer from limitations such as the need for a two-person workflow and poor needle visibility caused by anatomical artefacts or steep needle insertion angles. This poses a high risk of injury or post-procedural complications and reduces the efficacy of these procedures.
Solutions have been proposed to tackle these limitations; however, most existing solutions have neither been integrated into ultrasound systems nor evaluated in in-vivo studies. To address this gap, this project develops models for needle segmentation and localization.
The best segmentation model, a custom U-Net, achieves an IoU of 0.7273 with an average inference time of 0.018 seconds. The best localization model, a YOLOR model, achieves an mAP of 0.988 with an average inference time of 0.027 seconds. These models were successfully integrated into the Clarius L7, a handheld ultrasound device, and evaluated through ex-vivo animal studies. The integrated system accurately detected the needle during these studies using both the segmentation and localization models, achieving overall prediction times of 0.13 seconds and 0.142 seconds, respectively.
This work demonstrates that a smart portable ultrasound system that guides minimally invasive procedures using machine learning is a feasible solution to poor needle visibility in ultrasound imaging. As a result, the efficacy of these procedures could be improved and the risk of injury minimized.