Integrating 3D Slicer with a Dynamic Simulator for Situational Aware Robotic Interventions


Authors: Manish Sahu, Hisashi Ishida, Laura Connolly

JOURNAL OF LATEX CLASS FILES, VOL. 14, NO. 8, AUGUST 2023

Integrating 3D Slicer with a Dynamic Simulator for Situational Aware Robotic Interventions

Manish Sahu¹, Hisashi Ishida¹†, Laura Connolly¹,³†, Hongyi Fan¹†, Anton Deguet¹, Peter Kazanzides¹, Francis X. Creighton¹,², and Russell H. Taylor¹,², Life Fellow, IEEE, Adnan Munawar¹

Abstract—Image-guided robotic interventions represent a transformative frontier in surgery, blending advanced imaging and robotics for improved precision and outcomes. This paper addresses the critical need for integrating open-source platforms to enhance situational awareness in image-guided robotic research. We present an open-source toolset that seamlessly combines a physics-based constraint formulation framework, AMBF, with a state-of-the-art imaging platform, 3D Slicer. Our toolset facilitates the creation of highly customizable interactive digital twins that incorporate processing and visualization of medical imaging, robot kinematics, and scene dynamics for real-time robot control. Through a feasibility study, we showcase real-time synchronization of a physical robotic interventional environment in both 3D Slicer and AMBF, highlighting low-latency updates and improved visualization.

Index Terms—Robot-assisted surgery, surgical navigation, image-guided surgery

I. INTRODUCTION

Image-guided robotic interventions represent an evolving frontier in surgical procedures, seamlessly blending robotic and imaging technologies to equip surgeons with enhanced information, refine precision, and improve outcomes [1]. Active imaging serves as a navigation tool for guiding surgical interventions [2], while robotic assistance enhances proficiency by refining tool-tip precision and mitigating the impact of hand tremors [3].
The development of intelligent image-guided robotic systems necessitates the integration of complementary situational awareness [4], incorporating contextual models of patients and intra-operative devices. These models interpret surgical situations, delivering real-time feedback tailored to the ongoing procedure. Physics-based robotics simulations facilitate dynamic interaction with robotic models in a surgical virtual environment [5]. Presently, 3D Slicer [6], an open-source medical imaging platform, stands as a widely adopted tool for research and prototyping in image guidance [7]. Simultaneously, the Robot Operating System (ROS) [8] is a widely popular open-source communication middleware for robotics research. However, 3D Slicer lacks a straightforward interface for surgical robotics applications [9]. To fully leverage the potential of image-guided robotic surgery, there is a crucial need to integrate open-source navigation and robotics platforms.

¹ Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA. ² Department of Otolaryngology-Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA. ³ School of Computing, Queen's University, Kingston, ON, Canada. † These authors contributed equally. Email: manish.sahu@jhu.edu. *Presented as a podium presentation at the HSMR

Developing complementary situational awareness for image-guided robot-assisted research encounters a significant challenge: the absence of flexible open-source frameworks that empower researchers to swiftly develop and test systems tailored to specific surgical contexts. The simulation requirements for a situation-aware system are multi-faceted, encompassing the need for a real-time, physics-based dynamic virtual environment.
This environment should seamlessly integrate various components, including surgical robots and patient anatomy, and extend its capabilities to provide real-time feedback aligned with safety constraints and the surgical context. Moreover, to ensure wide robotics compatibility, the platform must accommodate any robot supported in ROS.

This paper outlines the system development work focused on integrating 3D Slicer and a physics-based simulation environment, the Asynchronous Multibody Framework (AMBF) [10], using ROS as the underlying communication middleware. The objective is to empower researchers by merging the visualization and registration capabilities of 3D Slicer with the dynamic and constraint-based simulation capabilities of AMBF. Through this integration, researchers gain a comprehensive toolset to advance projects in medical robotics, contributing to the overall progression of the field.

To facilitate seamless integration, ROS modules were developed for both AMBF and 3D Slicer, functioning as ROS nodes. These modules act as a bridge, conveying contextual information between 3D Slicer and AMBF. This collaborative approach leverages the strengths of both tools, with 3D Slicer providing calibration, registration, and navigation visualization, and AMBF delivering real-time physics-based constraints. The incorporation of ROS not only facilitates communication but also opens the door to potentially integrating other open-source projects. To showcase the capabilities of our integrated system, we conducted a feasibility study demonstrating low-latency, real-time visualization of robots within the 3D Slicer environment.

II. RELATED WORK

Image guidance in medical robotics: 3D Slicer [6] is the most widely used medical imaging platform for research and prototyping, offering robust functionalities such as segmentation, registration, and three-dimensional visualization of medical image data.
Its dedicated image-guided therapy extension, SlicerIGT [11], further enriches its capabilities by providing image registration and interfacing with external devices to support image guidance. An open-source communication protocol, OpenIGTLink [12], further extends its utility for IGT research by providing a direct interface with commercial tracking systems [13].

Fig. 1. Overview of the integration of 3D Slicer and AMBF. Different features of each software tool are highlighted around their respective logos.

To foster image-guided robotics research, there have been attempts to integrate 3D Slicer and ROS by using a ROS interface with OpenIGTLink [14] or a ROS-IGTL bridge [15] implemented for the KUKA lightweight robot (LWR). However, these interfaces merely facilitate low-level data exchange between the two environments [16]. To enable more synchronous and seamless integration between the two platforms, Connolly et al. [16] developed a SlicerROS2 module to facilitate bi-directional data transfer between the Slicer Medical Reality Markup Language (MRML [6]) scene and the ROS scene graph (tf [17]).

Dynamic simulation: Within the ROS community, RViz is the default visualization tool; however, it does not simulate robot kinematics, dynamics, or interaction. While several open-source simulators such as Gazebo [18] and the Virtual Robot Experimentation Platform (V-REP) [19] support features for robot dynamics, they are not suitable for real-time surgical applications [10]. The Asynchronous Multi-Body Framework (AMBF) [10], which enables the simulation of complex surgical robots and environments, is modular, real-time, and asynchronous, offering direct access to software components (e.g., system libraries or drivers) and hardware components (input and haptic devices, optical tracking, video input, etc.).
Additionally, AMBF provides a hierarchical software plugin interface for the development of customized applications. Leveraging these design choices and capabilities, AMBF has been successfully expanded into an immersive simulation system for surgical purposes [20]–[23].

In this work, we aim to integrate 3D Slicer and AMBF using the Robot Operating System (ROS) to utilize the extensive capabilities of 3D Slicer for visualization, processing, and registration of medical imaging data, and the physics-based constraints of AMBF for simulating the interaction of a robot with the anatomy.

III. METHODS

We provide the system overview and functionality of our toolset in this section.

A. System Design

The functional requirements for prototyping image-guided robotic systems necessitate real-time visualization of the surgical scene in 3D Slicer, coupled with bidirectional data transfer between the simulated scene and the Slicer MRML scene. The system design addresses these requirements by representing the scene in both AMBF and 3D Slicer. A crucial component of the design is providing communication between the MRML scene in 3D Slicer and the scene in AMBF, represented by the AMBF Description Format (ADF) [20]. To this end, we developed two modules: the AMBF-ROS module and the Slicer-ROS module. These modules contribute to the system by providing a high-level, real-time description of the environment.

Fig. 2. Simulated robotic sinus endoscopy in AMBF (left), and real-time synchronized scene in 3D Slicer (right). The physics interaction between the robot and the anatomy in AMBF is reflected in 3D Slicer.

B. Scene Representation and Updates

1) AMBF: In comparison to prior work [16], our system provides support for both URDF and a customized message structure representing scene data, ADF, encompassing body names (rigid bodies) and system filepaths for meshes.
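As a toy illustration of this kind of scene-description handoff, the snippet below builds and parses a minimal message mapping body names to mesh filepaths. The dictionary layout and field names are illustrative assumptions, not the actual ADF schema or AMBF message definition:

```python
# Hypothetical scene-description message: body names (rigid bodies)
# mapped to system filepaths for their meshes. The keys below are
# illustrative, NOT the real ADF schema.
scene_message = {
    "namespace": "/ambf/env/",
    "bodies": [
        {"name": "base_link", "mesh": "/meshes/base_link.STL"},
        {"name": "endoscope", "mesh": "/meshes/endoscope.STL"},
    ],
}

def parse_scene_message(msg):
    """Return (body_name, mesh_path) pairs for loading models into a viewer."""
    return [(b["name"], b["mesh"]) for b in msg["bodies"]]

if __name__ == "__main__":
    for name, mesh in parse_scene_message(scene_message):
        print(name, mesh)
```

In the real system this message would be published to the ROS parameter server by the simulator and consumed by the 3D Slicer extension at startup.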
This adaptation became essential for the AMBF simulator, which handles relative poses of rigid bodies at a lower level, allowing direct queries of rigid body poses through the ADF file. During initialization, the simulator generates a custom message stored on the ROS parameter server. Subsequently, our 3D Slicer extension parses this message to seamlessly load the robot's rigid bodies.

Fig. 3. Higher-level structure of the Slicer-ROS module connected with the AMBF simulator. The AMBF simulator and 3D Slicer communicate scene data and kinematics using ROS as an intermediary middleware.

2) 3D Slicer: The 3D Slicer scene is organized according to MRML nodes that correspond to different data types. For this work, the links of the robot are represented as vtkMRMLModelNodes and their respective positions are stored in vtkMRMLLinearTransformNodes. In addition to the robot, 3D Slicer natively offers several visualization tools for different medical imaging formats including ultrasound, computed tomography (CT), and magnetic resonance imaging (MRI). One of the main utilities of 3D Slicer is that this imaging information can be registered to other data in the scene using the SlicerIGT modules [7].

Updates: The 3D Slicer ROS module utilizes the ROS transformation package (tf) to obtain the robot's state. The AMBF simulator plugin updates each rigid body's position through tf, allowing the 3D Slicer ROS module to query each rigid body's transformation relative to the world frame and update the corresponding models within the MRML scene. To synchronize the 3D viewer in Slicer with the simulator, a custom timer (QTimer from the Qt library) was implemented in the 3D Slicer ROS module.
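A single update step of this kind, composing a world-frame pose and pushing it into a per-body transform table, can be sketched in plain numpy. The arrays stand in for tf transforms and for the matrices held by the linear transform nodes; the helper names are hypothetical, not the module's actual API:

```python
import numpy as np

def pose_to_matrix(translation, rotation_3x3):
    """Compose a 4x4 homogeneous transform from a translation vector
    and a 3x3 rotation matrix (the two pieces a tf lookup provides)."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = translation
    return T

def update_scene(world_poses, transform_nodes):
    """Copy each rigid body's world-frame pose into its transform node
    (here, a plain dict standing in for the MRML scene)."""
    for name, (t, R) in world_poses.items():
        transform_nodes[name] = pose_to_matrix(t, R)

# One update: a single body located at (0.1, 0.0, 0.2) m, no rotation.
world_poses = {"endoscope": (np.array([0.1, 0.0, 0.2]), np.eye(3))}
transform_nodes = {}
update_scene(world_poses, transform_nodes)
```

Each timer tick repeats this step for every rigid body reported by the simulator.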
This timer refreshes at 200 Hz, triggers the ROS callbacks, and updates the MRML scene.

C. System dependencies and compatibility

To successfully build the system and replicate the results presented in this article, the following dependencies must be satisfied:
• 3D Slicer: It is crucial to have 3D Slicer built from source, since the extension was developed within the 3D Slicer platform.
• ROS Noetic: The ROS1 communication infrastructure was utilized for data transmission between various components. Therefore, the installation of ROS Noetic, along with its standard packages, is necessary.
• AMBF and plugin: AMBF and the plugin designed for the task must be built from source. The AMBF plugin plays a key role in constructing the scene simulation and updating its state within the 3D Slicer environment.

Our system can be integrated with other ROS packages provided they use URDF for robot description and tf for scene updates.

Fig. 4. Overview of the experimental setup for the simulated robotic sinus procedure. The Galen Robotics system holds an endoscope aimed at the sinus phantom (Screen A). The AMBF simulator acquires motion data from the Galen robot, achieving real-time synchronization with the 3D Slicer application through the integration of a novel pipeline (Screen B).

IV. EXPERIMENTS

The primary focus of our work is to evaluate the real-time behavior of the system in a surgical robotics use case, specifically assessing communication performance for surgical scene updates.

A. Surgical robotics use case

To demonstrate the applicability of our system and assess its efficacy, we performed a feasibility experiment using a Robotic ENT (Ear, Nose, and Throat) Microsurgery System (REMS), a pre-clinical version developed by Galen Robotics (Baltimore, MD). The virtual models of the REMS are loaded into AMBF, with their movements synchronized by referencing the actual state of the real REMS.
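As a rough, self-contained sketch of this kind of periodic state-mirroring loop (plain Python, no ROS or Qt), the snippet below polls a robot-state source at the module's 200 Hz rate and records a per-update round-trip delay. get_robot_state and apply_to_scene are hypothetical stand-ins, and any timings it prints are machine-dependent, not the paper's measurements:

```python
import statistics
import time

REFRESH_HZ = 200           # matches the module's 200 Hz QTimer
PERIOD = 1.0 / REFRESH_HZ  # 5 ms between updates

def run_sync_loop(n_updates, get_robot_state, apply_to_scene):
    """Poll the robot state n_updates times and mirror it into the scene,
    recording a per-update round-trip delay (RTD) in milliseconds."""
    rtds = []
    for _ in range(n_updates):
        t0 = time.perf_counter()
        state = get_robot_state()   # stand-in for reading the real robot
        apply_to_scene(state)       # stand-in for updating the MRML scene
        rtds.append((time.perf_counter() - t0) * 1000.0)
        # time.sleep(PERIOD)        # enable for true 200 Hz pacing
    return rtds

scene = {}
rtds = run_sync_loop(100, lambda: {"q1": 0.5}, scene.update)
print(f"mean {statistics.mean(rtds):.3f} ms, "
      f"median {statistics.median(rtds):.3f} ms")
```

The mean/median/std summary over such samples is how latency figures like those in Sec. IV-B can be aggregated.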
The phantom was registered using optical tracking and the 3D Slicer registration toolkit. Our system demonstrated efficient real-time tracking of the robotic system and anatomy, updating the virtual representations seamlessly in both AMBF and 3D Slicer.

B. Performance Evaluation

To evaluate the performance of the system, the Galen Surgical Robot model, comprising 25 rigid bodies, and a skull model were loaded into AMBF. The 3D Slicer ROS module was designed to update at a maximum frequency of 200 Hz. The evaluation consisted of actuating the robot within AMBF and, for each update, measuring the round-trip delay (RTD). The results are presented in Fig. 5, indicating mean: 19.98 ms, median: 18.99 ms, std: 4.40 ms. Given that our system focuses on unilateral communication from AMBF to 3D Slicer, we can reasonably assume that there is approximately a 10 ms delay between the components. These statistics show that the RTD for the ROS module is stable at around 19 ms (9.5 ms for one-way delay).

Fig. 5. Latency analysis: Round-Trip Delay (RTD) over 1000 frames.

Considering that the lowest reported latency that impacts human-computer interaction (HCI) tasks is 16 ms [24], we believe the latency of our system will not significantly affect the operator's tasks in real-world applications.

V. DISCUSSION & CONCLUSION

In this work, we addressed an essential need for open-source navigation and robotics platform integration by seamlessly integrating 3D Slicer, a widely adopted medical imaging platform, with the physics-based simulation environment AMBF, using ROS as the underlying communication middleware. The developed ROS modules, acting as nodes for both AMBF and 3D Slicer, serve as a bridge, facilitating real-time synchronization of scene states between the simulation and 3D Slicer.
Our implementation demonstrates the feasibility of achieving low-latency, real-time visualization of robotic interventions within the 3D Slicer environment. This integration represents a substantial contribution to the field of robotic intervention research and development, harmonizing the strengths of 3D Slicer in visualization and registration with AMBF's dynamic and constraint-based simulation capabilities. The collaborative approach leverages ROS to enhance communication and holds the potential for future integration with additional open-source projects, fostering a more comprehensive toolset for researchers in medical robotics.

For future work, we aim to enhance the system by developing assistive shared-control functions tailored for various robot-assisted ENT procedures. Recognizing surgeons' familiarity with the intuitive 3D Slicer interface, we plan to empower them with interactive planning within 3D Slicer, while incorporating assistive functions based on the physics interactions between the anatomy and surgical tools into AMBF. Computational feedback generated within AMBF will be dynamically reflected back to the 3D Slicer interface, offering surgeons a comprehensive and intuitive platform for planning intricate surgical procedures. This integration of computational feedback with the 3D Slicer interface aims to provide surgeons with a cohesive and user-friendly environment that aligns with their preferences and enhances their decision-making processes.

ACKNOWLEDGMENTS AND DISCLOSURES

This work was supported in part by a research contract from Galen Robotics, by NIDCD K08 Grant DC019708, by a research agreement with the Hong Kong Multi-Scale Medical Robotics Centre, and by Johns Hopkins University internal funds. Russell Taylor and Johns Hopkins University (JHU) may be entitled to royalty payments related to the technology discussed in this paper, and Dr.
Taylor has received or may receive some portion of these royalties. Also, Dr. Taylor is a paid consultant to and owns equity in Galen Robotics, Inc. These arrangements have been reviewed and approved by JHU in accordance with its conflict of interest policy.

REFERENCES

[1] G. Fichtinger, J. Troccaz, and T. Haidegger, "Image-guided interventional robotics: Lost in translation?" Proceedings of the IEEE, vol. 110, no. 7, pp. 932–950, 2022.
[2] K. Cleary and T. M. Peters, "Image-guided interventions: technology review and clinical applications," Annual Review of Biomedical Engineering, vol. 12, pp. 119–142, 2010.
[3] R. Taylor, P. Jensen et al., "A steady-hand robotic system for microsurgical augmentation," The International Journal of Robotics Research, vol. 18, no. 12, pp. 1201–1210, 1999.
[4] N. Simaan, R. H. Taylor, and H. Choset, "Intelligent surgical robots with situational awareness," Mechanical Engineering, vol. 137, no. 09, pp. S3–S6, 2015.
[5] H. Choi, C. Crump et al., "On the use of simulation in robotics: Opportunities, challenges, and suggestions for moving forward," PNAS, vol. 118, no. 1, p. e1907856118, 2021.
[6] A. Fedorov, R. Beichel et al., "3D Slicer as an image computing platform for the quantitative imaging network," Magnetic Resonance Imaging, vol. 30, no. 9, pp. 1323–1341, 2012.
[7] T. Ungi, A. Lasso, and G. Fichtinger, "Open-source platforms for navigated image-guided interventions," Medical Image Analysis, vol. 33, pp. 181–186, 2016.
[8] M. Quigley, K. Conley et al., "ROS: an open-source Robot Operating System," in ICRA Workshop on Open Source Software, vol. 3, no. 3.2, Kobe, Japan, 2009, p. 5.
[9] L. Connolly, A. Deguet et al., "An open-source platform for cooperative, semi-autonomous robotic surgery," in IEEE International Conference on Autonomous Systems (ICAS). IEEE, 2021, pp. 1–5.
[10] A. Munawar and G. S. Fischer, "An asynchronous multi-body simulation framework for real-time dynamics, haptics and learning with application to surgical robots," in IROS, Nov 2019.
[11] T. Ungi, A. Lasso, and G. Fichtinger, "Open-source platforms for navigated image-guided interventions," Medical Image Analysis, vol. 33, pp. 181–186, 2016.
[12] J. Tokuda, G. S. Fischer et al., "OpenIGTLink: an open network protocol for image-guided therapy environment," IJMRCAS, vol. 5, no. 4, pp. 423–434, 2009.
[13] A. Lasso, T. Heffter et al., "PLUS: open-source toolkit for ultrasound-guided intervention systems," TBME, vol. 61, no. 10, pp. 2527–2537, 2014.
[14] S. Tauscher, J. Tokuda et al., "OpenIGTLink interface for state control and visualisation of a robot for image-guided therapy systems," IJCARS, vol. 10, pp. 285–292, 2015.
[15] T. Frank, A. Krieger et al., "ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment," IJCARS, vol. 12, pp. 1451–1460, 2017.
[16] L. Connolly et al., "Bridging 3D Slicer and ROS2 for image-guided robotic interventions," Sensors, vol. 22, no. 14, p. 5336, Jul 2022.
[17] T. Foote, "tf: The transform library," in TePRA. IEEE, 2013, pp. 1–6.
[18] N. Koenig and A. Howard, "Design and use paradigms for Gazebo, an open-source multi-robot simulator," in IROS, vol. 3. IEEE, 2004, pp. 2149–2154.
[19] E. Rohmer, S. P. Singh, and M. Freese, "V-REP: A versatile and scalable robot simulation framework," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2013, pp. 1321–1326.
[20] A. Munawar, Z. Li et al., "Fully immersive virtual reality for skull-base surgery: surgical training and beyond," IJCARS, pp. 1–9, 2023.
[21] H. Phalen, A. Munawar et al., "Platform for investigating continuum manipulator behavior in orthopedics," IJCARS, pp. 1–6, 2023.
[22] H. Shu, R. Liang et al., "Twin-S: a digital twin for skull base surgery," IJCARS, pp. 1–8, 2023.
[23] A. Munawar, J. Y. Wu et al., "Open simulation environment for learning and practice of robot-assisted surgical suturing," RAL, vol. 7, no. 2, pp. 3843–3850, 2022.
[24] C. Attig, N. Rauh et al., "System latency guidelines then and now – is zero latency really considered necessary?" in Engineering Psychology and Cognitive Ergonomics. Springer, 2017, pp. 3–14.
