
Dr. IULIAN IORDACHITA
School of Engineering, Department of Mechanical Engineering, Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall 125, 3400 North Charles Street, Baltimore, MD 21218-2682, USA


Research Keywords & Expertise

Computer-Assisted Surgery
Medical Instrumentation
Image-Guided Surgery
Surgical Robotics
Smart Surgical Tools


Feed

Journal article
Published: 30 July 2021 in IEEE Robotics and Automation Letters

Retinal surgery is a complicated and challenging task, even for retina specialists. Image-guided robot-assisted intervention is among the novel and promising solutions that may enhance human capabilities during microsurgery. In this paper, we demonstrate a novel method for 3D guidance of a microsurgical instrument based on the projection of a spotlight during robot-assisted retinal surgery. To test the feasibility and effectiveness of the proposed method, a vessel-tracking task in a phantom with a Remote Center of Motion (RCM) constraint is performed by the Steady-Hand Eye Robot (SHER). Three modes are compared: manual tracking, cooperative-control tracking with the SHER, and spotlight-based automatic tracking with the SHER. Spotlight-based automatic tracking with the SHER achieves an average tracking error of 0.013 mm and a distance-keeping error of 0.1 mm from the desired range, a significant improvement over the manual and cooperative-control methods alone.
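The control idea behind spot-based guidance can be sketched as a simple proportional visual servo: steer the tool so the projected spot reaches the target on the vessel while holding a desired tool-to-retina distance. This is an illustrative sketch, not the paper's controller; the function name, gains, and the assumption that depth is inferred elsewhere (e.g., from the spot's apparent size) are all hypothetical.

```python
import numpy as np

def spot_servo(spot_px, target_px, depth_mm, desired_depth_mm,
               k_lat=0.5, k_ax=0.3):
    """Illustrative proportional controller: drive the detected spotlight
    centre (pixels) toward the vessel target while regulating the estimated
    tool-to-retina distance. Returns a 3-component tip-velocity command."""
    lat_err = np.asarray(target_px, float) - np.asarray(spot_px, float)
    ax_err = desired_depth_mm - depth_mm
    # Two lateral components from the image error, one axial from depth error.
    return np.array([k_lat * lat_err[0], k_lat * lat_err[1], k_ax * ax_err])
```

With zero image and depth error the commanded velocity is zero; otherwise the command points toward the target and restores the desired stand-off distance.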

ACS Style

Mingchuan Zhou; Jiahao Wu; Ali Ebrahimi; Niravkumar Patel; Yunhui Liu; Nassir Navab; Peter Gehlbach; Alois Knoll; M. Ali Nasseri; Ioan Iulian Iordachita. Spotlight-Based 3D Instrument Guidance for Autonomous Task in Robot-Assisted Retinal Surgery. IEEE Robotics and Automation Letters 2021, 6, 7750–7757.

AMA Style

Mingchuan Zhou, Jiahao Wu, Ali Ebrahimi, Niravkumar Patel, Yunhui Liu, Nassir Navab, Peter Gehlbach, Alois Knoll, M. Ali Nasseri, Ioan Iulian Iordachita. Spotlight-Based 3D Instrument Guidance for Autonomous Task in Robot-Assisted Retinal Surgery. IEEE Robotics and Automation Letters. 2021; 6 (4):7750-7757.

Chicago/Turabian Style

Mingchuan Zhou; Jiahao Wu; Ali Ebrahimi; Niravkumar Patel; Yunhui Liu; Nassir Navab; Peter Gehlbach; Alois Knoll; M. Ali Nasseri; Ioan Iulian Iordachita. 2021. "Spotlight-Based 3D Instrument Guidance for Autonomous Task in Robot-Assisted Retinal Surgery." IEEE Robotics and Automation Letters 6, no. 4: 7750-7757.

Journal article
Published: 25 March 2021 in IEEE Robotics and Automation Letters

Focused ultrasound (FUS) technology is attracting increasing interest owing to its non-invasive and painless treatment of tumors. Magnetic resonance imaging (MRI) guidance has been introduced to monitor this procedure, allowing the ultrasound foci to be precisely controlled. However, manual positioning of the FUS transducers is challenging, especially for intra-operative adjustment in the MRI room. Currently, very few devices are capable of providing robotic transducer positioning for the treatment of abdominopelvic organ diseases under MRI. The high-intensity focused ultrasound (HIFU) spot would have to be steered to ablate large (> 3.5 cm) or multiple tumors (e.g., in the liver). To this end, we propose a hydraulic-driven tele-operated robot platform that enables 5-DoF manipulation of the FUS transducer. Even when operated close to the MRI iso-center, the robot guarantees zero electromagnetic artefact in the MR image. Our proof-of-concept robot prototype offers a large workspace (100 × 100 × 35 mm) for FUS foci steering. Accurate manipulation (0.2 mm in translation, 0.4° in rotation) of the FUS transducer holder is achieved using rolling-diaphragm-sealed hydraulic actuators. The robot control responsiveness (from 0.1 to 4 Hz) is also evaluated to show the potential to compensate for the spot-tracking error induced by respiratory motion. We also demonstrate the use of wireless radiofrequency (RF) markers to continuously register the robot task space in MRI coordinates.

ACS Style

Jing Dai; Zhuoliang He; Ge Fang; Xiaomei Wang; Yingqi Li; Chim-Lee Cheung; Liyuan Liang; Ioan Iulian Iordachita; Hing-Chiu Chang; Ka-Wai Kwok. A Robotic Platform to Navigate MRI-guided Focused Ultrasound System. IEEE Robotics and Automation Letters 2021, 6, 5137–5144.

AMA Style

Jing Dai, Zhuoliang He, Ge Fang, Xiaomei Wang, Yingqi Li, Chim-Lee Cheung, Liyuan Liang, Ioan Iulian Iordachita, Hing-Chiu Chang, Ka-Wai Kwok. A Robotic Platform to Navigate MRI-guided Focused Ultrasound System. IEEE Robotics and Automation Letters. 2021; 6 (3):5137-5144.

Chicago/Turabian Style

Jing Dai; Zhuoliang He; Ge Fang; Xiaomei Wang; Yingqi Li; Chim-Lee Cheung; Liyuan Liang; Ioan Iulian Iordachita; Hing-Chiu Chang; Ka-Wai Kwok. 2021. "A Robotic Platform to Navigate MRI-guided Focused Ultrasound System." IEEE Robotics and Automation Letters 6, no. 3: 5137-5144.

Journal article
Published: 01 October 2020 in IEEE Sensors Journal

This article proposes a data-driven, learning-based approach for shape sensing and Distal-end Position Estimation (DPE) of a surgical Continuum Manipulator (CM) in constrained environments using Fiber Bragg Grating (FBG) sensors. The proposed approach uses only the sensory data from an unmodeled, uncalibrated sensor embedded in the CM to estimate the shape and DPE. It serves as an alternative to the conventional mechanics-based, sensor-model-dependent approach, which relies on several sensor and CM geometrical assumptions. Unlike the conventional approach, where the shape is reconstructed from the proximal to the distal end of the device, we propose a reversed approach: the distal-end position is estimated first and, given this information, the shape is then reconstructed from the distal to the proximal end. The proposed methodology yields more accurate DPE by avoiding the accumulation of integration errors seen in conventional approaches. We study three data-driven models, namely a linear regression model, a Deep Neural Network (DNN), and a Temporal Neural Network (TNN), and compare DPE and shape-reconstruction results. Additionally, we test both approaches (data-driven and model-dependent) against internal and external disturbances to the CM and its environment, such as the incorporation of flexible medical instruments into the CM and contacts with obstacles in the task space. Using the data-driven (DNN) and model-dependent approaches, the following maximum absolute errors are observed for DPE: 0.78 mm and 2.45 mm in free bending motion, 0.11 mm and 3.20 mm with flexible instruments, and 1.22 mm and 3.19 mm with task-space obstacles, indicating the superior performance of the proposed data-driven approach.
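The data-driven idea can be illustrated with the simplest of the three models, a linear regression fit straight from raw sensor readings to tip positions. The data below is synthetic; the channel count, noise level, and linear ground truth are assumptions for the sketch, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 200, 9             # hypothetical FBG channel count
W_true = rng.normal(size=(n_channels, 3))  # unknown reading-to-position map
readings = rng.normal(size=(n_samples, n_channels))
tip_xyz = readings @ W_true + 0.01 * rng.normal(size=(n_samples, 3))

# Fit y = [x, 1] W by least squares: no sensor calibration or mechanics
# model is needed, only paired (raw reading, ground-truth position) data.
X = np.hstack([readings, np.ones((n_samples, 1))])
W, *_ = np.linalg.lstsq(X, tip_xyz, rcond=None)

def estimate_tip(raw):
    """Distal-end position estimate directly from uncalibrated readings."""
    return np.append(raw, 1.0) @ W
```

Estimating the distal end first, as the paper proposes, means this single regression output can anchor the shape reconstruction without integrating errors from base to tip.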

ACS Style

Shahriar Sefati; Cong Gao; Iulian Iordachita; Russell H. Taylor; Mehran Armand. Data-Driven Shape Sensing of a Surgical Continuum Manipulator Using an Uncalibrated Fiber Bragg Grating Sensor. IEEE Sensors Journal 2020, 21, 3066–3076.

AMA Style

Shahriar Sefati, Cong Gao, Iulian Iordachita, Russell H. Taylor, Mehran Armand. Data-Driven Shape Sensing of a Surgical Continuum Manipulator Using an Uncalibrated Fiber Bragg Grating Sensor. IEEE Sensors Journal. 2020; 21 (3):3066-3076.

Chicago/Turabian Style

Shahriar Sefati; Cong Gao; Iulian Iordachita; Russell H. Taylor; Mehran Armand. 2020. "Data-Driven Shape Sensing of a Surgical Continuum Manipulator Using an Uncalibrated Fiber Bragg Grating Sensor." IEEE Sensors Journal 21, no. 3: 3066-3076.

Journal article
Published: 08 September 2020 in IEEE/ASME Transactions on Mechatronics

Vitreoretinal surgery is among the most delicate surgical tasks, during which hand tremor may severely degrade surgeon performance. Robotic assistance has been demonstrated to be beneficial in diminishing hand tremor. Among the requirements for reliable assistance from the robot is precise measurement of system states, e.g., sclera forces, tool-tip position, and tool insertion depth. Providing this and other sensing information using existing technology would contribute to the development and implementation of autonomous robot-assisted tasks in retinal surgery, such as laser ablation, guided suture placement, and assisted needle vessel cannulation, among other applications. In the present work, we use state-estimating Kalman filtering (KF) to improve the tool-tip position and insertion-depth estimates, which were previously obtained purely from robot forward kinematics (FWK) and direct sensor measurements, respectively. To improve tool-tip localization, in addition to robot FWK, we also use sclera force measurements along with beam theory to account for tool deflection. For insertion depth, the robot FWK is combined with sensor measurements for the cases where the sensor measurements alone are not reliable enough. The improved tool-tip position and insertion-depth measurements are validated using a stereo camera system through preliminary experiments and a case study. The results indicate that the tool-tip position and insertion-depth measurements are significantly improved, by 77% and 94% respectively, after applying the KF.
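The fusion step can be illustrated with a one-dimensional Kalman update that blends the forward-kinematics prediction with a (deflection-corrected) sensor measurement. The variances q and r below are placeholders, not the paper's values; this is a generic scalar KF, not the authors' full estimator.

```python
def kf_fuse(x_fwk, p_prev, z_sensor, q=0.5, r=0.1):
    """One scalar Kalman step: predict with the forward-kinematics value,
    then correct with the sensor measurement. Smaller r means the sensor is
    trusted more. Returns the fused estimate and its updated variance."""
    p_pred = p_prev + q           # prediction uncertainty grows by q
    k = p_pred / (p_pred + r)     # Kalman gain
    x = x_fwk + k * (z_sensor - x_fwk)
    p = (1.0 - k) * p_pred
    return x, p
```

The fused estimate always lies between the two sources, weighted by their relative uncertainties, which is why combining FWK with the force-based deflection correction can outperform either alone.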

ACS Style

Ali Ebrahimi; Farshid Alambeigi; Shahriar Sefati; Niravkumar Patel; Changyan He; Peter Louis Gehlbach; Iulian Iordachita. Stochastic Force-Based Insertion Depth and Tip Position Estimations of Flexible FBG-Equipped Instruments in Robotic Retinal Surgery. IEEE/ASME Transactions on Mechatronics 2020, 26, 1512–1523.

AMA Style

Ali Ebrahimi, Farshid Alambeigi, Shahriar Sefati, Niravkumar Patel, Changyan He, Peter Louis Gehlbach, Iulian Iordachita. Stochastic Force-Based Insertion Depth and Tip Position Estimations of Flexible FBG-Equipped Instruments in Robotic Retinal Surgery. IEEE/ASME Transactions on Mechatronics. 2020; 26 (3):1512-1523.

Chicago/Turabian Style

Ali Ebrahimi; Farshid Alambeigi; Shahriar Sefati; Niravkumar Patel; Changyan He; Peter Louis Gehlbach; Iulian Iordachita. 2020. "Stochastic Force-Based Insertion Depth and Tip Position Estimations of Flexible FBG-Equipped Instruments in Robotic Retinal Surgery." IEEE/ASME Transactions on Mechatronics 26, no. 3: 1512-1523.

Journal article
Published: 31 August 2020 in IEEE/ASME Transactions on Mechatronics

This paper presents the development and experimental evaluation of a redundant robotic system for the less-invasive treatment of osteolysis (bone degradation) behind the acetabular implant during total hip replacement revision surgery. The system comprises a rigid-link positioning robot and a Continuum Dexterous Manipulator (CDM) equipped with highly flexible debriding tools and a Fiber Bragg Grating (FBG)-based sensor. The robot and the continuum manipulator are controlled concurrently via an optimization-based framework using the Tip Position Estimation (TPE) from the FBG sensor as feedback. Performance of the system is evaluated on a setup that consists of an acetabular cup and a saw-bone phantom simulating the bone behind the cup. Experiments consist of performing the surgical procedure on the simulated phantom setup. CDM TPE using FBGs, target location placement, cutting performance, and the concurrent control algorithm's capability in achieving the desired tasks are evaluated. The mean and standard deviation of the CDM TPE from the FBG sensor and the robotic system are 0.50 mm and 0.18 mm, respectively. Using the developed surgical system, accurate positioning and successful cutting of desired straight-line and curvilinear paths on saw-bone phantoms behind the cup with different densities are demonstrated. Compared to conventional rigid tools, the workspace reach behind the acetabular cup is 2.47 times greater when using the developed robotic system.

ACS Style

Shahriar Sefati; Rachel Hegeman; Farshid Alambeigi; Iulian Iordachita; Peter Kazanzides; Harpal Khanuja; Russell H. Taylor; Mehran Armand. A Surgical Robotic System for Treatment of Pelvic Osteolysis Using an FBG-Equipped Continuum Manipulator and Flexible Instruments. IEEE/ASME Transactions on Mechatronics 2020, 26, 369–380.

AMA Style

Shahriar Sefati, Rachel Hegeman, Farshid Alambeigi, Iulian Iordachita, Peter Kazanzides, Harpal Khanuja, Russell H. Taylor, Mehran Armand. A Surgical Robotic System for Treatment of Pelvic Osteolysis Using an FBG-Equipped Continuum Manipulator and Flexible Instruments. IEEE/ASME Transactions on Mechatronics. 2020; 26 (1):369-380.

Chicago/Turabian Style

Shahriar Sefati; Rachel Hegeman; Farshid Alambeigi; Iulian Iordachita; Peter Kazanzides; Harpal Khanuja; Russell H. Taylor; Mehran Armand. 2020. "A Surgical Robotic System for Treatment of Pelvic Osteolysis Using an FBG-Equipped Continuum Manipulator and Flexible Instruments." IEEE/ASME Transactions on Mechatronics 26, no. 1: 369-380.

Review
Published: 24 August 2020 in IEEE Access

In recent decades, fiber Bragg gratings (FBGs) have become increasingly attractive for medical applications due to their unique properties, such as small size, biocompatibility, immunity to electromagnetic interference, high sensitivity, and multiplexing capability. FBGs have been employed in the development of surgical tools, assistive devices, wearables, and biosensors, showing great potential for medical use. This paper reviews FBG-based measuring systems, their working principles, and their applications in medicine and healthcare. Particular attention is given to sensing solutions for biomechanics, minimally invasive surgery, physiological monitoring, and medical biosensing. Strengths, weaknesses, open challenges, and future trends are also discussed to highlight how FBGs can meet the demands of next-generation medical devices and healthcare systems.
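The basic FBG measurement can be made concrete with the standard strain relation Δλ/λ₀ = (1 − p_e)·ε, where p_e is the effective photo-elastic coefficient of the fibre. The sketch below ignores temperature effects and assumes the typical silica-fibre value p_e ≈ 0.22; both simplifications are assumptions, not specifics from the review.

```python
def strain_microstrain(d_lambda_nm, lambda0_nm=1550.0, p_e=0.22):
    """Axial strain (in microstrain) from a Bragg-wavelength shift,
    assuming constant temperature and p_e ~ 0.22 for silica fibre."""
    return d_lambda_nm / lambda0_nm / (1.0 - p_e) * 1e6
```

At a 1550 nm Bragg wavelength this gives roughly 1 microstrain per ~1.2 pm of shift, which is why picometre-resolution interrogators suffice for the sub-millimetre shape sensing discussed elsewhere on this page.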

ACS Style

Daniela Lo Presti; Carlo Massaroni; Catia Sofia Jorge Leitao; Maria De Fatima Domingues; Marzhan Sypabekova; David Barrera; Ignazio Floris; Luca Massari; Calogero Maria Oddo; Salvador Sales; Iulian Ioan Iordachita; Daniele Tosi; Emiliano Schena. Fiber Bragg Gratings for Medical Applications and Future Challenges: A Review. IEEE Access 2020, 8, 156863–156888.

AMA Style

Daniela Lo Presti, Carlo Massaroni, Catia Sofia Jorge Leitao, Maria De Fatima Domingues, Marzhan Sypabekova, David Barrera, Ignazio Floris, Luca Massari, Calogero Maria Oddo, Salvador Sales, Iulian Ioan Iordachita, Daniele Tosi, Emiliano Schena. Fiber Bragg Gratings for Medical Applications and Future Challenges: A Review. IEEE Access. 2020; 8 (99):156863-156888.

Chicago/Turabian Style

Daniela Lo Presti; Carlo Massaroni; Catia Sofia Jorge Leitao; Maria De Fatima Domingues; Marzhan Sypabekova; David Barrera; Ignazio Floris; Luca Massari; Calogero Maria Oddo; Salvador Sales; Iulian Ioan Iordachita; Daniele Tosi; Emiliano Schena. 2020. "Fiber Bragg Gratings for Medical Applications and Future Challenges: A Review." IEEE Access 8, no. 99: 156863-156888.

Journal article
Published: 07 July 2020 in IEEE Robotics and Automation Letters

This letter reports the improved design, system integration, and initial experimental evaluation of a fully actuated body-mounted robotic system for real-time MRI-guided lower back pain injections. The 6-DOF robot is composed of a 4-DOF needle alignment module and a 2-DOF remotely actuated needle driver module, which together provide a fully actuated manipulator that can operate inside the scanner bore during imaging. The system minimizes the need to move the patient in and out of the scanner during a procedure, and thus may shorten the procedure time and streamline the clinical workflow. The robot is devised with a compact and lightweight structure that can be attached directly to the patient's lower back via straps. This approach minimizes the effect of patient motion by allowing the robot to move with the patient. The robot is integrated with an image-based surgical planning module. A dedicated clinical workflow is proposed for robot-assisted lower back pain injections under real-time MRI guidance. Targeting accuracy of the system was evaluated with a real-time MRI-guided phantom study, demonstrating a mean absolute error (MAE) of 1.50 ± 0.68 mm in tip position and 1.56 ± 0.93° in needle angle. An initial cadaver study was performed to validate the feasibility of the clinical workflow, indicating a maximum position error of less than 1.90 mm and a maximum angle error of less than 3.14°.

ACS Style

Gang Li; Niravkumar A. Patel; Yanzhou Wang; Charles Dumoulin; Wolfgang Loew; Olivia Loparo; Katherine Schneider; Karun Sharma; Kevin Cleary; Jan Fritz; Ioan Iulian Iordachita; Pietro Valdastri; Peter Robert Culmer. Fully Actuated Body-Mounted Robotic System for MRI-Guided Lower Back Pain Injections: Initial Phantom and Cadaver Studies. IEEE Robotics and Automation Letters 2020, 5, 1–1.

AMA Style

Gang Li, Niravkumar A. Patel, Yanzhou Wang, Charles Dumoulin, Wolfgang Loew, Olivia Loparo, Katherine Schneider, Karun Sharma, Kevin Cleary, Jan Fritz, Ioan Iulian Iordachita, Pietro Valdastri, Peter Robert Culmer. Fully Actuated Body-Mounted Robotic System for MRI-Guided Lower Back Pain Injections: Initial Phantom and Cadaver Studies. IEEE Robotics and Automation Letters. 2020; 5 (4):1-1.

Chicago/Turabian Style

Gang Li; Niravkumar A. Patel; Yanzhou Wang; Charles Dumoulin; Wolfgang Loew; Olivia Loparo; Katherine Schneider; Karun Sharma; Kevin Cleary; Jan Fritz; Ioan Iulian Iordachita; Pietro Valdastri; Peter Robert Culmer. 2020. "Fully Actuated Body-Mounted Robotic System for MRI-Guided Lower Back Pain Injections: Initial Phantom and Cadaver Studies." IEEE Robotics and Automation Letters 5, no. 4: 1-1.

Journal article
Published: 22 May 2020 in IEEE/ASME Transactions on Mechatronics

Retinal surgery is a bimanual operation in which surgeons operate with an instrument in their dominant (more capable) hand and simultaneously hold a light pipe (illuminating pipe) with their non-dominant (less capable) hand to provide illumination inside the eye. Manually holding and adjusting the light pipe places an additional burden on the surgeon and increases the overall complexity of the procedure. To overcome these challenges, a robot-assisted automatic light pipe actuating system is proposed. A customized light pipe with force-sensing capability is mounted at the end effector of a follower robot and is actuated through a hybrid force-velocity controller to automatically illuminate the target area on the retinal surface by pivoting about the scleral port (the incision on the sclera). Static following-accuracy evaluation and dynamic light-tracking experiments are carried out. The results show that the proposed system can successfully illuminate the desired area with negligible offset (an average offset of 2.45 mm with a standard deviation of 1.33 mm). The average scleral forces also remain below a specified threshold (50 mN). The proposed system can not only allow increased focus on dominant-hand instrument control, but could also be extended to three-arm procedures (two surgical instruments held by the surgeon plus a robot-held light pipe) in retinal surgery, potentially improving surgical efficiency and outcomes.

ACS Style

Changyan He; Emily Yang; Niravkumar Patel; Ali Ebrahimi; Mahya Shahbazi; Peter Louis Gehlbach; Iulian Iordachita. Automatic Light Pipe Actuating System for Bimanual Robot-Assisted Retinal Surgery. IEEE/ASME Transactions on Mechatronics 2020, 25, 2846–2857.

AMA Style

Changyan He, Emily Yang, Niravkumar Patel, Ali Ebrahimi, Mahya Shahbazi, Peter Louis Gehlbach, Iulian Iordachita. Automatic Light Pipe Actuating System for Bimanual Robot-Assisted Retinal Surgery. IEEE/ASME Transactions on Mechatronics. 2020; 25 (6):2846-2857.

Chicago/Turabian Style

Changyan He; Emily Yang; Niravkumar Patel; Ali Ebrahimi; Mahya Shahbazi; Peter Louis Gehlbach; Iulian Iordachita. 2020. "Automatic Light Pipe Actuating System for Bimanual Robot-Assisted Retinal Surgery." IEEE/ASME Transactions on Mechatronics 25, no. 6: 2846-2857.

Journal article
Published: 19 May 2020 in IEEE Transactions on Automation Science and Engineering

The treatment of malaria is a global health challenge that stands to benefit from the widespread introduction of a vaccine for the disease. A method has been developed to create a live organism vaccine using the sporozoites (SPZ) of the parasite Plasmodium falciparum (Pf), which are concentrated in the salivary glands of infected mosquitoes. Current manual dissection methods to obtain these PfSPZ are not optimally efficient for large-scale vaccine production. We propose an improved dissection procedure and a mechanical fixture that increases the rate of mosquito dissection and helps to deskill this stage of the production process. We further demonstrate the automation of a key step in this production process, the picking and placing of mosquitoes from a staging apparatus into a dissection assembly. This unit test of a robotic mosquito pick-and-place system is performed using a custom-designed microgripper attached to a four-degree-of-freedom (4-DOF) robot under the guidance of a computer vision system. Mosquitoes are autonomously grasped and pulled to a pair of notched dissection blades to remove the head of the mosquito, allowing access to the salivary glands. Placement into these blades is adapted based on output from computer vision to accommodate for the unique anatomy and orientation of each grasped mosquito. In this pilot test of the system on 50 mosquitoes, we demonstrate a 100% grasping accuracy and a 90% accuracy in placing the mosquito with its neck within the blade notches such that the head can be removed. This is a promising result for this difficult and nonstandard pick-and-place task.

ACS Style

Henry Phalen; Prasad Vagdargi; Mariah L. Schrum; Sumana Chakravarty; Amanda Canezin; Michael Pozin; Suat Coemert; Iulian Iordachita; Stephen L. Hoffman; Gregory S. Chirikjian; Russell H. Taylor. A Mosquito Pick-and-Place System for PfSPZ-Based Malaria Vaccine Production. IEEE Transactions on Automation Science and Engineering 2020, 18, 299–310.

AMA Style

Henry Phalen, Prasad Vagdargi, Mariah L. Schrum, Sumana Chakravarty, Amanda Canezin, Michael Pozin, Suat Coemert, Iulian Iordachita, Stephen L. Hoffman, Gregory S. Chirikjian, Russell H. Taylor. A Mosquito Pick-and-Place System for PfSPZ-Based Malaria Vaccine Production. IEEE Transactions on Automation Science and Engineering. 2020; 18 (1):299-310.

Chicago/Turabian Style

Henry Phalen; Prasad Vagdargi; Mariah L. Schrum; Sumana Chakravarty; Amanda Canezin; Michael Pozin; Suat Coemert; Iulian Iordachita; Stephen L. Hoffman; Gregory S. Chirikjian; Russell H. Taylor. 2020. "A Mosquito Pick-and-Place System for PfSPZ-Based Malaria Vaccine Production." IEEE Transactions on Automation Science and Engineering 18, no. 1: 299-310.

Journal article
Published: 16 April 2020 in IEEE Transactions on Medical Robotics and Bionics

High-resolution, real-time intraocular imaging of the retina at the cellular level is very challenging due to the vulnerable and confined space within the eyeball as well as the limited availability of appropriate modalities. A probe-based confocal laser endomicroscopy (pCLE) system can be a potential imaging modality for improved diagnosis. The ability to visualize the retina at the cellular level could provide information that may predict surgical outcomes. The adoption of intraocular pCLE scanning is currently limited by the narrow field of view and the micron-scale range of focus. In the absence of motion compensation, physiological tremors of the surgeon's hand and patient movements also contribute to the deterioration of image quality. Therefore, an image-based hybrid control strategy is proposed to mitigate the above challenges. The proposed hybrid control strategy enables shared control of the pCLE probe between surgeon and robot to scan the retina precisely, free of hand tremor and with the advantages of an image-based auto-focus algorithm that optimizes the quality of pCLE images. The hybrid control strategy is deployed on two frameworks: cooperative and teleoperated. Better image quality, smoother motion, and reduced workload are all achieved in a statistically significant manner with the hybrid control frameworks.
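An image-based auto-focus loop of the kind described can be sketched with a common sharpness metric (variance of a discrete Laplacian) driving a hill-climbing focus search. This is a generic stand-in for illustration; the paper's actual metric, step size, and search strategy are not specified here.

```python
import numpy as np

def sharpness(img):
    """Focus metric: variance of a 4-neighbour Laplacian over the image.
    Sharper (higher-contrast) images score higher."""
    img = np.asarray(img, float)
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def focus_step(metric_now, metric_prev, direction, step_um=2.0):
    """Hill-climbing: keep stepping the probe in the same direction while
    sharpness improves; reverse direction when it drops."""
    if metric_now < metric_prev:
        direction = -direction
    return direction, direction * step_um
```

Running this loop on each new frame keeps the probe inside its micron-scale focal range without the operator micro-adjusting depth by hand.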

ACS Style

Zhaoshuo Li; Mahya Shahbazi; Niravkumar Patel; Eimear O’ Sullivan; Haojie Zhang; Khushi Vyas; Preetham Chalasani; Anton Deguet; Peter L. Gehlbach; Iulian Iordachita; Guang-Zhong Yang; Russell H. Taylor. Hybrid Robot-Assisted Frameworks for Endomicroscopy Scanning in Retinal Surgeries. IEEE Transactions on Medical Robotics and Bionics 2020, 2, 176–187.

AMA Style

Zhaoshuo Li, Mahya Shahbazi, Niravkumar Patel, Eimear O’ Sullivan, Haojie Zhang, Khushi Vyas, Preetham Chalasani, Anton Deguet, Peter L. Gehlbach, Iulian Iordachita, Guang-Zhong Yang, Russell H. Taylor. Hybrid Robot-Assisted Frameworks for Endomicroscopy Scanning in Retinal Surgeries. IEEE Transactions on Medical Robotics and Bionics. 2020; 2 (2):176-187.

Chicago/Turabian Style

Zhaoshuo Li; Mahya Shahbazi; Niravkumar Patel; Eimear O’ Sullivan; Haojie Zhang; Khushi Vyas; Preetham Chalasani; Anton Deguet; Peter L. Gehlbach; Iulian Iordachita; Guang-Zhong Yang; Russell H. Taylor. 2020. "Hybrid Robot-Assisted Frameworks for Endomicroscopy Scanning in Retinal Surgeries." IEEE Transactions on Medical Robotics and Bionics 2, no. 2: 176-187.

Commentary
Published: 16 December 2019 in International Journal of Retina and Vitreous

Eye surgery, specifically retinal microsurgery, involves sensory and motor skill that approaches human boundaries and physiological limits for steadiness, accuracy, and the ability to detect the small forces involved. Despite assumptions as to the benefit of robots in surgery, and despite great development effort, numerous challenges to the full development and adoption of robotic assistance in surgical ophthalmology remain. Historically, the first in-human robot-assisted retinal surgery occurred nearly 30 years after the first experimental papers on the subject. Similarly, artificial intelligence emerged decades ago and is only now being more fully realized in ophthalmology. The delay between conception and application has in part been due to the technological advances required to implement new processing strategies. Chief among these has been the well-matched processing power of specialty graphics processing units for machine learning. Transcending the classic concept of robots performing repetitive tasks, artificial intelligence and machine learning are related concepts that have proven their ability to design concepts and solve problems. The implication of such abilities is that future machines may further intrude on the domain of heretofore “human-reserved” tasks. Although the potential of artificial intelligence and machine learning is profound, present marketing promises and hype exceed their stage of development, analogous to the seventeenth-century mathematical “boom” with algebra. Nevertheless, robotic systems augmented by machine learning may eventually improve robot-assisted retinal surgery and could potentially transform the discipline. This commentary analyzes advances in retinal robotic surgery, its current drawbacks and limitations, and the potential role of artificial intelligence in robotic retinal surgery.

ACS Style

Müller G. Urias; Niravkumar Patel; Changyan He; Ali Ebrahimi; Ji Woong Kim; Iulian Iordachita; Peter L. Gehlbach. Artificial intelligence, robotics and eye surgery: are we overfitted? International Journal of Retina and Vitreous 2019, 5, 1–4.

AMA Style

Müller G. Urias, Niravkumar Patel, Changyan He, Ali Ebrahimi, Ji Woong Kim, Iulian Iordachita, Peter L. Gehlbach. Artificial intelligence, robotics and eye surgery: are we overfitted? International Journal of Retina and Vitreous. 2019; 5 (1):1-4.

Chicago/Turabian Style

Müller G. Urias; Niravkumar Patel; Changyan He; Ali Ebrahimi; Ji Woong Kim; Iulian Iordachita; Peter L. Gehlbach. 2019. "Artificial intelligence, robotics and eye surgery: are we overfitted?" International Journal of Retina and Vitreous 5, no. 1: 1-4.

Journal article
Published: 29 October 2019 in IEEE Transactions on Robotics

In this article, we present a novel stochastic algorithm called simultaneous sensor calibration and deformation estimation (SCADE) to address the problem of modeling the deformation behavior of a generic continuum manipulator (CM) in free and obstructed environments. In SCADE, using a novel mathematical formulation, we introduce an a priori model-independent filtering algorithm to fuse the continuous but inaccurate measurements of an embedded sensor (e.g., magnetic or piezoelectric sensors) with the intermittent but accurate data of an external imaging system (e.g., optical trackers or cameras). The main motivation of this article is the crucial need to obtain an accurate shape/position estimate of a CM utilized in a surgical intervention. In these robotic procedures, the CM is typically equipped with an embedded sensing unit (ESU) while an external imaging modality (e.g., an ultrasound or fluoroscopy machine) is also available in the surgical site. To evaluate the efficacy of the SCADE algorithm, we performed two different sets of experiments in free and obstructed environments. In these experiments, we utilized a CM specifically designed for orthopedic interventions, equipped with an inaccurate fiber Bragg grating (FBG) ESU, and an overhead camera. The experimental results demonstrated the successful performance of the SCADE algorithm in simultaneously estimating the unknown deformation behavior of the unmodeled CM while also identifying the time-varying drift of the poorly calibrated FBG sensing unit. Moreover, the results showed the marked outperformance of the SCADE algorithm in estimating the CM's tip position compared to FBG-based position estimates.
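The flavour of this fusion can be illustrated by treating the embedded sensor's error as a slowly varying drift that is re-estimated whenever an intermittent external fix arrives. This is a heavy simplification of SCADE's stochastic filter, written only to show the continuous-plus-intermittent structure; the smoothing gain alpha is an assumed constant.

```python
def fuse_with_drift(sensor_pos, drift_est, external_fix=None, alpha=0.3):
    """Corrected estimate = continuous sensor reading minus estimated drift.
    When an (accurate but intermittent) external measurement is available,
    blend it into the drift estimate; otherwise coast on the last estimate."""
    if external_fix is not None:
        drift_est = (1.0 - alpha) * drift_est + alpha * (sensor_pos - external_fix)
    return sensor_pos - drift_est, drift_est
```

Between imaging updates the estimate follows the embedded sensor, and each external fix pulls the drift estimate toward the sensor's current bias, so the correction persists even when the imaging system is unavailable.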

ACS Style

Farshid Alambeigi; Sahba Aghajani Pedram; Jason L. Speyer; Jacob Rosen; Iulian Iordachita; Russell H. Taylor; Mehran Armand. SCADE: Simultaneous Sensor Calibration and Deformation Estimation of FBG-Equipped Unmodeled Continuum Manipulators. IEEE Transactions on Robotics 2019, 36, 222–239.

AMA Style

Farshid Alambeigi, Sahba Aghajani Pedram, Jason L. Speyer, Jacob Rosen, Iulian Iordachita, Russell H. Taylor, Mehran Armand. SCADE: Simultaneous Sensor Calibration and Deformation Estimation of FBG-Equipped Unmodeled Continuum Manipulators. IEEE Transactions on Robotics. 2019; 36 (1):222-239.

Chicago/Turabian Style

Farshid Alambeigi; Sahba Aghajani Pedram; Jason L. Speyer; Jacob Rosen; Iulian Iordachita; Russell H. Taylor; Mehran Armand. 2019. "SCADE: Simultaneous Sensor Calibration and Deformation Estimation of FBG-Equipped Unmodeled Continuum Manipulators." IEEE Transactions on Robotics 36, no. 1: 222-239.

Journal article
Published: 01 July 2019 in IEEE Transactions on Biomedical Engineering

Objective: Robot-assisted retinal microsurgery provides several benefits, including improved manipulation precision. The assistance provided to surgeons by current robotic frameworks is, however, “passive” support, e.g., damping of hand tremor. Intelligent assistance and active guidance are lacking in existing robotic frameworks. In this paper, an active interventional control framework (AICF) is presented to increase operation safety by actively intervening in the operation to avoid exertion of excessive forces on the sclera. Methods: AICF consists of the following four components: first, the Steady-Hand Eye Robot as the robotic module; second, a sensorized tool to measure tool-to-sclera forces; third, a recurrent neural network to predict the occurrence of undesired events based on a short history of time series of sensor measurements; and finally, a variable admittance controller to command the robot away from the undesired instances. Results: A set of user studies was conducted involving 14 participants (including four surgeons). The users were asked to perform a vessel-following task on an eyeball phantom with the assistance of AICF as well as two benchmark approaches, i.e., auditory feedback (AF) and real-time force feedback (RF). Statistical analysis shows that AICF results in a significant reduction of the proportion of undesired instances to about 2.5%, compared with 38.4% and 26.2% using AF and RF, respectively. Conclusion: AICF can effectively predict excessive-force instances and augment the performance of the user to avoid undesired events during robot-assisted microsurgical tasks. Significance: The proposed system may be extended to other fields of microsurgery and may potentially reduce tissue injury.
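The variable-admittance idea can be sketched as a hand-force-to-velocity map whose compliance shrinks as the network's predicted probability of an excessive-force event rises. The gain, the compliance floor, and the linear attenuation law are all illustrative assumptions, not the paper's controller.

```python
def admittance_velocity(hand_force_n, risk, gain=0.02, min_scale=0.05):
    """Cooperative-control velocity command: proportional to the force the
    operator applies to the handle, attenuated toward a small floor as the
    predicted risk (0..1) of an undesired event approaches 1."""
    risk = min(max(risk, 0.0), 1.0)        # clamp the network output
    scale = max(min_scale, 1.0 - risk)     # reduce compliance near danger
    return gain * scale * hand_force_n
```

Because the robot becomes stiffer as the predicted risk grows, the operator feels increasing resistance before the sclera-force limit is reached, which is the "active intervention" the framework is named for.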

ACS Style

Changyan He; Niravkumar Patel; Mahya Shahbazi; Yang Yang; Peter Gehlbach; Marin Kobilarov; Iulian Iordachita. Toward Safe Retinal Microsurgery: Development and Evaluation of an RNN-Based Active Interventional Control Framework. IEEE Transactions on Biomedical Engineering 2019, 67, 966-977.

AMA Style

Changyan He, Niravkumar Patel, Mahya Shahbazi, Yang Yang, Peter Gehlbach, Marin Kobilarov, Iulian Iordachita. Toward Safe Retinal Microsurgery: Development and Evaluation of an RNN-Based Active Interventional Control Framework. IEEE Transactions on Biomedical Engineering. 2019; 67(4):966-977.

Chicago/Turabian Style

Changyan He; Niravkumar Patel; Mahya Shahbazi; Yang Yang; Peter Gehlbach; Marin Kobilarov; Iulian Iordachita. 2019. "Toward Safe Retinal Microsurgery: Development and Evaluation of an RNN-Based Active Interventional Control Framework." IEEE Transactions on Biomedical Engineering 67, no. 4: 966-977.

Journal article
Published: 28 January 2019 in IEEE Transactions on Medical Robotics and Bionics

Accurate placement and stable fixation are the main goals of internal fixation of bone fractures using traditional medical screws. These goals are necessary to expedite healing and to avoid improper fracture healing due to misalignment of the bone fragments. However, the rigidity of the screw, the geometry of the fractured anatomy (e.g., femur and pelvis), and osteoporosis may cause an array of complications. To address these challenges, we propose the use of a continuum manipulator and a bendable medical screw (BMS) to drill curved tunnels and fixate the bone fragments. This novel approach gives clinicians freedom in selecting the drilling entry point as well as in navigating the drill through complex anatomical and osteoporotic bones. The technique can also facilitate the treatment of osteonecrosis and the augmentation of the hip to prevent osteoporotic fractures. In this paper, (1) we evaluated the performance of the curved drilling technique on human cadaveric specimens by making several curved tunnels with different curvatures, and (2) we demonstrated the feasibility of internal fixation using the BMS versus a rigid straight screw by performing a finite element simulation of fracture fixation in an osteoporotic femur.

ACS Style

Farshid Alambeigi; Mahsan Bakhtiarinejad; Shahriar Sefati; Rachel Hegeman; Iulian Iordachita; Harpal Khanuja; Mehran Armand. On the Use of a Continuum Manipulator and a Bendable Medical Screw for Minimally Invasive Interventions in Orthopedic Surgery. IEEE Transactions on Medical Robotics and Bionics 2019, 1, 14-21.

AMA Style

Farshid Alambeigi, Mahsan Bakhtiarinejad, Shahriar Sefati, Rachel Hegeman, Iulian Iordachita, Harpal Khanuja, Mehran Armand. On the Use of a Continuum Manipulator and a Bendable Medical Screw for Minimally Invasive Interventions in Orthopedic Surgery. IEEE Transactions on Medical Robotics and Bionics. 2019; 1(1):14-21.

Chicago/Turabian Style

Farshid Alambeigi; Mahsan Bakhtiarinejad; Shahriar Sefati; Rachel Hegeman; Iulian Iordachita; Harpal Khanuja; Mehran Armand. 2019. "On the Use of a Continuum Manipulator and a Bendable Medical Screw for Minimally Invasive Interventions in Orthopedic Surgery." IEEE Transactions on Medical Robotics and Bionics 1, no. 1: 14-21.

Journal article
Published: 27 December 2018

The performance of retinal microsurgery often requires the coordinated use of both hands. During bimanual retinal surgery, dominant-hand performance may be negatively impacted by poor non-dominant-hand assistance. Therefore, understanding the latent determinants of bimanual performance and establishing safety criteria for bimanual manipulation are relevant to robotic development and to eventual patient care. In this paper, we present a preliminary study to quantitatively evaluate one aspect of bimanual tool use in retinal surgery. Two force-sensing tools were designed and fabricated using fiber Bragg grating sensors. Tool-to-sclera contact force was measured using the developed tools and analyzed. The tool forces were recorded during five basic surgical maneuvers typical of retinal surgery. Two subjects were involved in the experiments: one clinician and one engineer. For comparison, all manipulations were replicated under robot-assisted conditions. The results indicate that the average tool-to-sclera force recorded from the dominant-hand tool is significantly higher than that from the non-dominant-hand tool (p = 0.004). Moreover, the average forces under robot-assisted conditions with the present steady-hand robot are notably higher than under freehand conditions (p = 0.01). The forces obtained from the dominant- and non-dominant-hand instruments show a weak correlation.
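The weak correlation reported at the end of the abstract can in principle be quantified with a sample correlation coefficient; whether the authors used Pearson's r is not stated, and the paired force values below are invented for illustration:

```python
import math

def pearson(x, y):
    """Sample Pearson correlation coefficient between two force series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired sclera-force samples (mN): dominant vs. non-dominant tool.
dominant = [12.0, 15.0, 11.0, 18.0, 14.0]
non_dominant = [7.0, 9.0, 8.0, 8.0, 10.0]
print(round(pearson(dominant, non_dominant), 2))  # a value near 0 indicates weak correlation
```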

ACS Style

Changyan He; Marina Roizenblatt; Niravkumar Patel; Ali Ebrahimi; Yang Yang; Peter L. Gehlbach; Iulian Iordachita. Towards Bimanual Robot-Assisted Retinal Surgery: Tool-to-Sclera Force Evaluation. 2018, 2018, 1.

AMA Style

Changyan He, Marina Roizenblatt, Niravkumar Patel, Ali Ebrahimi, Yang Yang, Peter L. Gehlbach, Iulian Iordachita. Towards Bimanual Robot-Assisted Retinal Surgery: Tool-to-Sclera Force Evaluation. 2018; 2018:1.

Chicago/Turabian Style

Changyan He; Marina Roizenblatt; Niravkumar Patel; Ali Ebrahimi; Yang Yang; Peter L. Gehlbach; Iulian Iordachita. 2018. "Towards Bimanual Robot-Assisted Retinal Surgery: Tool-to-Sclera Force Evaluation." 2018: 1.

Preprint
Published: 20 December 2018

Conventional shape-sensing techniques using fiber Bragg gratings (FBGs) involve finding the curvature at discrete FBG active areas and integrating curvature over the length of the continuum dexterous manipulator (CDM) for tip position estimation (TPE). However, due to the limited number of sensing locations and many geometrical assumptions, these methods are prone to large error propagation, especially when the CDM undergoes large deflections. In this paper, we study the complications of using conventional TPE methods that depend on a sensor model and propose a new data-driven method that overcomes these challenges. The proposed method consists of a regression model that takes raw FBG wavelength data as input and directly estimates the CDM's tip position. This model is trained preoperatively (offline) on position information from optical trackers/cameras (as the ground truth) and intraoperatively (online) estimates the CDM tip position using only the FBG wavelength data. The method's performance is evaluated on a CDM developed for orthopedic applications, and the results are compared to conventional model-dependent methods during large-deflection bending. Mean absolute TPE errors (and standard deviations) of 1.52 (0.67) mm and 0.11 (0.1) mm, with maximum absolute errors of 3.63 mm and 0.62 mm, were obtained for the conventional and the proposed data-driven techniques, respectively. These results demonstrate that the proposed data-driven approach significantly outperforms the conventional estimation technique.
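The offline-train/online-estimate split described above reduces, in its simplest form, to fitting a regressor from raw wavelength readings to tip position. A toy one-sensor linear sketch; the paper uses multiple FBG channels and a more capable model, and all numbers below are fabricated:

```python
def fit_linear(wavelengths, tip_positions):
    """Least-squares fit of tip_position ≈ slope * wavelength + intercept.

    Offline step: ground-truth tip positions (e.g. from an optical
    tracker) are paired with raw FBG wavelength readings to train
    the model, bypassing any curvature-integration sensor model.
    """
    n = len(wavelengths)
    mw = sum(wavelengths) / n
    mp = sum(tip_positions) / n
    cov = sum((w - mw) * (p - mp) for w, p in zip(wavelengths, tip_positions))
    var = sum((w - mw) ** 2 for w in wavelengths)
    slope = cov / var
    return slope, mp - slope * mw

# Hypothetical calibration data: wavelength shift (nm) vs. tip deflection (mm).
wl = [0.0, 0.1, 0.2, 0.3, 0.4]
tip = [0.0, 2.0, 4.0, 6.0, 8.0]
slope, intercept = fit_linear(wl, tip)

# Online step: estimate tip position from a new wavelength reading alone.
estimate = slope * 0.25 + intercept
print(round(estimate, 3))  # 5.0
```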

ACS Style

Shahriar Sefati; Rachel Hegeman; Farshid Alambeigi; Iulian Iordachita; Mehran Armand. FBG-Based Position Estimation of Highly Deformable Continuum Manipulators: Model-Dependent vs. Data-Driven Approaches. 2018, 1.

AMA Style

Shahriar Sefati, Rachel Hegeman, Farshid Alambeigi, Iulian Iordachita, Mehran Armand. FBG-Based Position Estimation of Highly Deformable Continuum Manipulators: Model-Dependent vs. Data-Driven Approaches. 2018: 1.

Chicago/Turabian Style

Shahriar Sefati; Rachel Hegeman; Farshid Alambeigi; Iulian Iordachita; Mehran Armand. 2018. "FBG-Based Position Estimation of Highly Deformable Continuum Manipulators: Model-Dependent vs. Data-Driven Approaches." 1.

Conference paper
Published: 01 October 2018 in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

This paper presents a body-mounted, four-degree-of-freedom (4-DOF) parallel-mechanism robot for image-guided percutaneous interventions. The design of the robot is optimized to be lightweight and compact so that it can be mounted on the patient's body. Its modular design can be adapted to assist various image-guided, needle-based percutaneous interventions such as arthrography, biopsy, and brachytherapy seed placement. The robot mechanism and the control system are designed and manufactured with components compatible with imaging modalities including Magnetic Resonance Imaging (MRI) and Computed Tomography (CT). The version of the robot presented in this paper is optimized for shoulder arthrography under MRI guidance; a Z-shaped fiducial frame is attached to the robot, providing accurate and repeatable registration of the robot with the MR scanner coordinate system. Here we present the mechanical design of the manipulator, the robot kinematics, the robot calibration procedure, and a preliminary bench-top accuracy assessment. The bench-top evaluation of the robotic manipulator shows average translational errors of 1.01 mm and 0.96 mm along the X and Z axes, respectively, and average rotational errors of 3.06 degrees and 2.07 degrees about the X and Z axes, respectively.
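The reported accuracy figures are per-axis averages of commanded-versus-measured differences; a minimal sketch of that computation, with fabricated trial data rather than the paper's measurements:

```python
def mean_abs_error(commanded, measured):
    """Average absolute difference between commanded and measured values
    along one axis (translation in mm or rotation in degrees)."""
    assert len(commanded) == len(measured)
    return sum(abs(c - m) for c, m in zip(commanded, measured)) / len(commanded)

# Hypothetical bench-top trial: commanded vs. measured X positions (mm).
cmd_x = [10.0, 20.0, 30.0, 40.0]
meas_x = [10.8, 19.2, 31.1, 39.1]
print(round(mean_abs_error(cmd_x, meas_x), 3))  # average translational error along X
```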

ACS Style

Niravkumar Patel; Jiawen Yan; David Levi; Reza Monfaredi; Kevin Cleary; Iulian Iordachita. Body-Mounted Robot for Image-Guided Percutaneous Interventions: Mechanical Design and Preliminary Accuracy Evaluation. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2018, 2018, 1443-1448.

AMA Style

Niravkumar Patel, Jiawen Yan, David Levi, Reza Monfaredi, Kevin Cleary, Iulian Iordachita. Body-Mounted Robot for Image-Guided Percutaneous Interventions: Mechanical Design and Preliminary Accuracy Evaluation. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2018; 2018:1443-1448.

Chicago/Turabian Style

Niravkumar Patel; Jiawen Yan; David Levi; Reza Monfaredi; Kevin Cleary; Iulian Iordachita. 2018. "Body-Mounted Robot for Image-Guided Percutaneous Interventions: Mechanical Design and Preliminary Accuracy Evaluation." 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2018: 1443-1448.

Accepted manuscript
Published: 18 September 2018 in Physics in Medicine & Biology

Purpose: While the interaction between a needle and the surrounding tissue is known to cause a significant targeting error in prostate biopsy, leading to false-negative results, few studies have demonstrated how it impacts the actual procedure. We performed a pilot study on robot-assisted MRI-guided prostate biopsy with an emphasis on an in-depth analysis of the in-vivo needle-tissue interaction. Methods: The data were acquired during in-bore transperineal prostate biopsies in patients using a 4-degree-of-freedom (DOF) MRI-compatible robot. The anatomical structures in the pelvic area and the needle path were reconstructed from MR images and quantitatively analyzed. We analyzed each structure individually and also proposed a mathematical model to investigate the influence of those structures on the targeting error using mixed-model regression. Results: The median targeting error in 188 insertions (27 patients) was 6.3 mm. Both the individual anatomical-structure analysis and the mixed-model analysis identified the deviation resulting from contact between the needle and the skin as the main source of error. In contrast, needle bending inside the tissue (expressed as needle curvature) did not vary between insertions with targeting errors above and below the average. The analysis indicated that insertions crossing the bulbospongiosus presented a targeting error lower than the average. The mixed-model analysis demonstrated that the distance between the needle guide and the patient's skin, the deviation at the entry point, and the path length inside the pelvic diaphragm made statistically significant contributions to the targeting error (p < 0.05). Conclusions: Our results indicate that the errors associated with the elastic contact between the needle and the skin were more prominent than those from needle bending along the insertion. Our findings will help improve the preoperative planning of transperineal prostate biopsies.
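The per-insertion targeting error is the distance between the planned target and the needle tip reconstructed from MR images; a minimal sketch of how the median error could be computed, with invented coordinates for illustration:

```python
import math
import statistics

def targeting_error(target, tip):
    """Euclidean distance (mm) between the planned target and the
    actual needle-tip position reconstructed from MR images."""
    return math.dist(target, tip)

# Hypothetical insertions: (planned target, reconstructed tip), in mm.
insertions = [
    ((10.0, 20.0, 30.0), (13.0, 24.0, 30.0)),   # error 5.0
    ((12.0, 18.0, 28.0), (12.0, 18.0, 34.3)),   # error ~6.3
    ((11.0, 21.0, 29.0), (11.0, 28.2, 29.0)),   # error ~7.2
]
errors = [targeting_error(t, p) for t, p in insertions]
print(round(statistics.median(errors), 1))  # median of the per-insertion errors
```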

ACS Style

Pedro Moreira; Niravkumar Patel; Marek Wartenberg; Gang Li; Kemal Tuncali; Tamas Heffter; Everette C Burdette; Iulian Iordachita; Gregory S Fischer; Nobuhiko Hata; Clare M C Tempany; Junichi Tokuda. Evaluation of robot-assisted MRI-guided prostate biopsy: needle path analysis during clinical trials. Physics in Medicine & Biology 2018, 63, 20NT02.

AMA Style

Pedro Moreira, Niravkumar Patel, Marek Wartenberg, Gang Li, Kemal Tuncali, Tamas Heffter, Everette C Burdette, Iulian Iordachita, Gregory S Fischer, Nobuhiko Hata, Clare M C Tempany, Junichi Tokuda. Evaluation of robot-assisted MRI-guided prostate biopsy: needle path analysis during clinical trials. Physics in Medicine & Biology. 2018; 63(20):20NT02.

Chicago/Turabian Style

Pedro Moreira; Niravkumar Patel; Marek Wartenberg; Gang Li; Kemal Tuncali; Tamas Heffter; Everette C Burdette; Iulian Iordachita; Gregory S Fischer; Nobuhiko Hata; Clare M C Tempany; Junichi Tokuda. 2018. "Evaluation of robot-assisted MRI-guided prostate biopsy: needle path analysis during clinical trials." Physics in Medicine & Biology 63, no. 20: 20NT02.

Conference
Published: 01 August 2018 in 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)

Retinal microsurgery is technically demanding and requires high surgical skill with very little room for manipulation error. The introduction of robotic assistance has the potential to enhance and expand a surgeon's manipulation capabilities during retinal surgery, i.e., to improve precision, cancel physiological hand tremor, and provide sensing information. However, surgeon performance may also be negatively impacted by robotic assistance due to robot structural stiffness and nonintuitive controls. In complying with robotic constraints, the surgeon loses the dexterity of the human hand. In this paper, we present a preliminary experimental study to evaluate user behavior under robotic assistance during mock retinal surgery. In these experiments, user behavior is characterized by measuring the forces applied by the user to the sclera, the tool insertion/retraction speed, the tool insertion depth relative to the scleral entry point, and the duration of surgery. The behavior data were collected during three mock retinal surgery tasks with four users. Each task was conducted using both freehand and robot-assisted techniques. The univariate user behavior and the correlations among multiple behavior parameters are analyzed. The results show that robot assistance prolongs the duration of the surgery and increases the manipulation forces applied to the sclera, but refines the insertion velocity and eliminates hand tremor.

ACS Style

Changyan He; Ali Ebrahimi; Marina Roizenblatt; Niravkumar Patel; Yang Yang; Peter L. Gehlbach; Iulian Iordachita. User Behavior Evaluation in Robot-Assisted Retinal Surgery. 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) 2018, 2018, 174-179.

AMA Style

Changyan He, Ali Ebrahimi, Marina Roizenblatt, Niravkumar Patel, Yang Yang, Peter L. Gehlbach, Iulian Iordachita. User Behavior Evaluation in Robot-Assisted Retinal Surgery. 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). 2018; 2018:174-179.

Chicago/Turabian Style

Changyan He; Ali Ebrahimi; Marina Roizenblatt; Niravkumar Patel; Yang Yang; Peter L. Gehlbach; Iulian Iordachita. 2018. "User Behavior Evaluation in Robot-Assisted Retinal Surgery." 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) 2018: 174-179.

Conference
Published: 01 July 2018 in 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)

One of the major yet little-recognized challenges in robotic vitreoretinal surgery is the matter of tool forces applied to the sclera. Tissue safety, coordinated tool use, and interactions between tool-tip and shaft forces are little studied. The introduction of robotic assistance has further diminished the surgeon's ability to perceive scleral forces. Microsurgical tools capable of measuring such small forces, integrated with robot manipulators, may therefore improve functionality and safety by providing sclera-force feedback to the surgeon. In this paper, using a force-sensing tool, we conducted robot-assisted eye-manipulation experiments to evaluate the utility of providing scleral force feedback. The work assesses (1) passive audio feedback and (2) active haptic feedback, and evaluates the impact of these feedback modes on scleral forces in excess of a boundary. The results show that in the presence of passive or active feedback, the duration of the experiment increases, while the duration for which scleral forces exceed a safe threshold decreases.
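The safety metric discussed here, the duration for which scleral forces exceed a threshold, can be computed directly from a sampled force trace. A minimal sketch; the sampling rate and threshold below are illustrative assumptions, not the values used in the study:

```python
def time_above_threshold(forces, dt, threshold):
    """Total time (s) during which the measured sclera force exceeds a
    safety threshold, given force samples taken every dt seconds."""
    return sum(dt for f in forces if f > threshold)

# Hypothetical force trace (mN) sampled at 100 Hz; the 120 mN threshold
# is an illustrative value, not the safety limit used in the study.
trace = [80, 95, 130, 145, 110, 125, 90]
print(round(time_above_threshold(trace, dt=0.01, threshold=120), 3))
```

Comparing this quantity across freehand, audio-feedback, and haptic-feedback conditions is what the user study above reports.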

ACS Style

Ali Ebrahimi; Changyan He; Marina Roizenblatt; Niravkumar A. Patel; Shahriar Sefati; Peter Gehlbach; Iulian Iordachita. Real-Time Sclera Force Feedback for Enabling Safe Robot-Assisted Vitreoretinal Surgery. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2018, 2018, 3650-3655.

AMA Style

Ali Ebrahimi, Changyan He, Marina Roizenblatt, Niravkumar A. Patel, Shahriar Sefati, Peter Gehlbach, Iulian Iordachita. Real-Time Sclera Force Feedback for Enabling Safe Robot-Assisted Vitreoretinal Surgery. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2018; 2018:3650-3655.

Chicago/Turabian Style

Ali Ebrahimi; Changyan He; Marina Roizenblatt; Niravkumar A. Patel; Shahriar Sefati; Peter Gehlbach; Iulian Iordachita. 2018. "Real-Time Sclera Force Feedback for Enabling Safe Robot-Assisted Vitreoretinal Surgery." 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2018: 3650-3655.