Research Papers

Attention-Aware Robotic Laparoscope Based on Fuzzy Interpretation of Eye-Gaze Patterns

Author and Article Information
Songpo Li

Department of Mechanical Engineering,
Colorado School of Mines,
Golden, CO 80401
e-mail: soli@mines.edu

Xiaoli Zhang

Mem. ASME
Department of Mechanical Engineering,
Colorado School of Mines,
Golden, CO 80401
e-mail: xlzhang@mines.edu

Fernando J. Kim

Department of Urology,
Denver Health Medical Center,
Denver, CO 80204
e-mail: Fernando.Kim@dhha.org

Rodrigo Donalisio da Silva

Department of Urology,
Denver Health Medical Center,
Denver, CO 80204
e-mail: Rodrigo.DonalisiodaSilva@dhha.org

Diedra Gustafson

Department of Urology,
Denver Health Medical Center,
Denver, CO 80204
e-mail: Diedra.Gustafson@dhha.org

Wilson R. Molina

Department of Urology,
Denver Health Medical Center,
Denver, CO 80204
e-mail: Wilson.Molina@dhha.org

Corresponding author.

Manuscript received November 29, 2014; final manuscript received May 4, 2015; published online August 6, 2015. Assoc. Editor: Rafael V. Davalos.

J. Med. Devices 9(4), 041007 (Aug. 6, 2015) (10 pages); Paper No: MED-14-1278; doi: 10.1115/1.4030608

Laparoscopic robots have been widely adopted in modern medical practice. However, explicitly interacting with these robots may increase the surgeon's physical and cognitive load. An attention-aware robotic laparoscope system has been developed to free the surgeon from the technical limitations of visualizing the operating field through the laparoscope. The system implicitly recognizes the surgeon's visual attention by interpreting the surgeon's natural eye movements using fuzzy logic, and then automatically steers the laparoscope to focus on that viewing target. Experimental results show that the system makes surgeon-robot interaction more effective and intuitive, and that it has the potential to make the execution of surgery smoother and faster.
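The abstract describes the approach only at a high level: natural eye movements are interpreted with fuzzy logic to decide when the surgeon is attending to a new target, and the laparoscope is then steered toward it. To make the fuzzy-inference idea concrete, the sketch below shows a minimal Mamdani-style engine in Python. The input features (fixation duration and gaze dispersion), the membership-function shapes, the rule base, and the numeric ranges are all illustrative assumptions for exposition; they are not the membership functions or rules reported in the paper.

    # Hedged sketch: Mamdani-style fuzzy inference for gaze-based attention.
    # Feature names, ranges, and rules below are illustrative assumptions,
    # not the paper's calibrated design.

    def tri(x, a, b, c):
        """Triangular membership function: 0 at a, peak 1 at b, 0 at c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def attention_level(fix_dur_ms, dispersion_px):
        """Return a fuzzy attention score in [0, 1] for one gaze cluster."""
        # 1. Fuzzify the inputs (hypothetical universes of discourse).
        dur_short  = tri(fix_dur_ms, -1, 0, 400)         # brief glance
        dur_long   = tri(fix_dur_ms, 200, 800, 10000)    # sustained fixation
        disp_tight = tri(dispersion_px, -1, 0, 60)       # stable gaze point
        disp_wide  = tri(dispersion_px, 30, 120, 10000)  # scattered gaze

        # 2. Apply the rule base (min for AND, max for OR/aggregation):
        #    R1: IF fixation is long AND dispersion is tight THEN attention HIGH
        #    R2: IF fixation is short OR dispersion is wide  THEN attention LOW
        high = min(dur_long, disp_tight)
        low  = max(dur_short, disp_wide)

        # 3. Defuzzify by centroid over a discretized output universe [0, 1].
        num = den = 0.0
        for i in range(101):
            x = i / 100.0
            mu = max(min(low,  tri(x, -0.01, 0.0, 0.6)),   # LOW set, peak at 0
                     min(high, tri(x, 0.4, 1.0, 1.01)))    # HIGH set, peak at 1
            num += x * mu
            den += mu
        return num / den if den else 0.0

    # Example: a 650 ms fixation within a 25-pixel spread scores high
    # (roughly 0.77), which could trigger the robot to recenter the view.
    print(attention_level(650, 25))

Centroid defuzzification is used in this sketch because it yields a smooth score that can be thresholded or rate-limited before commanding robot motion; the engine actually used in the paper may differ in features, rules, and defuzzification method.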

Copyright © 2015 by ASME


Figures

Fig. 1: The flowchart of the attention-aware gaze-guided robotic laparoscope system

Fig. 2: The setup of the gaze-guided automated robotic laparoscope system. The image of the artificial muscle in the surgery simulator is projected onto a monitor through the robotic laparoscope.

Fig. 3: Two kinds of historical eye-gaze behaviors in a surgical operation

Fig. 4: The overall procedure of the fuzzy inference engine

Fig. 5: Fuzzy logic membership functions

Fig. 7: The experiment setup with a virtual simulator

Fig. 8: Comparison between the raw gaze data (left) and the refined gaze data (right). The small squares are the gaze points; the large circles are the on-screen targets that the subject gazed at; the lines are the trajectories of the eye movements.

Fig. 9: Comparison of the response time between the dwell-time method and the fuzzy inference method

Fig. 10: Summary of the system usability scores for the two methods; a higher score indicates better usability. ASoFI: average score of the fuzzy inference method. ASoDT: average score of the dwell-time method.

Fig. 11: Summary of user experience over repetitive tests for each method. The average score of the dwell-time method is 38.5, comparable to 38 for the fuzzy inference method.

Fig. 12: User experience comparison of the fuzzy inference method against the dwell-time method. The total score ranges from 0 to 40; a score closer to 0 indicates that the dwell-time method is superior, and a score closer to 40 indicates that the fuzzy inference method is superior.
