Gaze controlled robotic camera system


Gaze Controlled Robotic Camera System
Anuj Awasthi
Anand Sivadasan
Veeral Patel
Outline
• Background
• Significance
• Problem Statement
• Concept
• Methodology
• Specific Aims
• Budget
• Project Participation
• Time Frame
Background
• Laparoscopic robotic surgery
• Eye tracker application
  • Visual mouse
• Human factors
  • Computer vision based control
  • Face mouse
  • Voice control
Requirements in Laparoscopic Surgery
• Maintain the surgical point of interest in the centre of the image.
• Provide the required magnification of the area.
• Produce and maintain a horizontal image of the point of interest.
• Perform the preceding actions automatically, although they can be modulated by the surgeon.
Visual Mouse Application
• Obtaining the horizontal and vertical coordinates with the eye tracker
• Live streaming of the horizontal and vertical coordinates (a sketch follows this list)
• Interfacing the eye tracker with the computer
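A minimal sketch of what the coordinate-streaming step could look like, assuming the gaze point is sent as plain "x,y" text over UDP. The eye-tracker read call, destination address and port are illustrative placeholders, and POSIX sockets are used for brevity (the original Visual C++ 6.0 implementation would use the equivalent Winsock calls).

```cpp
// Sketch only: stream (x, y) gaze coordinates from the eye-tracker host to a client.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

struct Gaze { int x, y; };

// Placeholder for the real eye-tracker SDK call (not specified in the slides).
Gaze readGaze() { return Gaze{512, 384}; }

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(5000);                          // assumed port
    inet_pton(AF_INET, "192.168.0.10", &dest.sin_addr);   // assumed client address

    for (int i = 0; i < 1000; ++i) {                      // stream ~16 s of samples
        Gaze g = readGaze();
        char msg[32];
        int n = std::snprintf(msg, sizeof(msg), "%d,%d", g.x, g.y);
        sendto(sock, msg, n, 0, reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
        usleep(16000);                                    // ~60 Hz update rate
    }
    close(sock);
    return 0;
}
```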
Eye Tracker System
Human Factors
Computer vision based control of robotic camera
• Camera control based on computer vision tracking of the surgical tools.
• Image processing is used to differentiate the surgical tool of interest from its surroundings.
• No input is required from the surgeon.
Disadvantages
• The surgeon's area of interest is not taken into consideration.
• Assumes the surgical area is always the surgeon's area of interest.
• The surgeon often ends up looking at the corners of the screen.
Human Factors
Face Mouse control for robotic camera
• Image-based system
• Tracks the surgeon's facial features in real time
• Controls the camera based on the pitch, yaw and roll of the surgeon's face
Disadvantages
• Constant face movements cause strain
• Difficult to keep pace with the movement of the tools
Human Factors
Voice control of robotic camera
• Uses voice and pedal controls
• Uses voice recognition techniques
• A set of voice commands drives the camera
Disadvantages
• Considerable burden on the surgeon
• Difficult to perform dual inputs
Significance
• Reduction in workload on the surgeon
• Accuracy of surgical tasks
• Impact on surgical time
• Hands-free control
Problem Statement
"To develop a camera control system which reduces the workload on the surgeon without compromising the quality of the surgeon's video display."
Concept
Gaze-based robotic camera
• Acquire the gaze of the surgeon with the eye tracker.
• Manipulate the camera using the eye tracker data interfaced with the robot controls (a sketch follows below).
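A minimal sketch of that interface, under assumptions not stated in the slides: the gaze point is reported in the 320 x 240 coordinates of the displayed camera image, the offset from the image centre is turned into proportional pan/tilt increments, and setServoAngle() is a stub standing in for whatever command the servo controller actually accepts.

```cpp
// Sketch: move the camera so the surgeon's point of gaze drifts back to the image centre.
#include <algorithm>
#include <cstdio>

const int IMG_W = 320, IMG_H = 240;   // wireless camera resolution from the hardware list
const double GAIN = 0.05;             // degrees of servo motion per pixel of error (assumed)

// Stub for the servo-controller interface listed under Robotic Hardware.
void setServoAngle(int channel, double degrees) {
    std::printf("servo %d -> %.1f deg\n", channel, degrees);
}

void trackGaze(int gazeX, int gazeY, double& pan, double& tilt) {
    double ex = gazeX - IMG_W / 2.0;   // horizontal error from the image centre
    double ey = gazeY - IMG_H / 2.0;   // vertical error from the image centre

    // Proportional update, clamped to the HS-422 travel (~0-180 degrees).
    pan  = std::clamp(pan  + GAIN * ex, 0.0, 180.0);
    tilt = std::clamp(tilt - GAIN * ey, 0.0, 180.0);

    setServoAngle(0, pan);   // channel 0: pan
    setServoAngle(1, tilt);  // channel 1: tilt
}

int main() {
    double pan = 90.0, tilt = 90.0;    // start centred
    trackGaze(260, 60, pan, tilt);     // example gaze point off-centre
    return 0;
}
```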
Robotic Hardware
• A small wireless 320 x 240 resolution camera with an inbuilt transmitter
• A receiver set
• Two servomotors (HS-422)
• Links
• Usbor Servo Controller
• Pivot post
• Gripper
• Washer, set of clamps, bolts, nuts
• Eye tracker system
Methodology
Operation site: surgeon site and surgical site
Surgical Site
• Server system (HOST computer)
• Usbor Servo Controller
• Servo motors
• Robot arm
• End effector
• Inverse kinematics to be followed (a sketch follows this list)
• Wireless camera
• Visual C++ 6.0 coding
• AAA battery supplied
• Receiver
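Since the slides only note that inverse kinematics is to be followed, here is a minimal closed-form sketch for a planar two-link arm, which matches the two servomotors and links in the hardware list; the link lengths and the example target are assumed values, not taken from the proposal.

```cpp
// Sketch: closed-form inverse kinematics for a planar two-link arm.
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979323846;
const double L1 = 10.0, L2 = 8.0;   // assumed link lengths in cm

// Find joint angles (radians) placing the end effector at (x, y).
// Returns false when the target is out of reach.
bool inverseKinematics(double x, double y, double& theta1, double& theta2) {
    double c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
    if (c2 < -1.0 || c2 > 1.0) return false;            // unreachable target

    theta2 = std::atan2(std::sqrt(1.0 - c2 * c2), c2);  // elbow-down branch
    theta1 = std::atan2(y, x) -
             std::atan2(L2 * std::sin(theta2), L1 + L2 * std::cos(theta2));
    return true;
}

int main() {
    double t1, t2;
    if (inverseKinematics(12.0, 6.0, t1, t2))           // example target point
        std::printf("theta1 = %.1f deg, theta2 = %.1f deg\n",
                    t1 * 180.0 / PI, t2 * 180.0 / PI);
    return 0;
}
```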
Surgeon's Site
• Dedicated system (client)
• Image acquisition through the Internet
  • Streaming video
  • Live Motion JPEG system
• Image processing (a sketch follows this list)
  • Intel's OpenCV library
  • Improve brightness and contrast
• Eye tracker system
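A minimal sketch of that client-side image path, written against the modern OpenCV API rather than the original Intel OpenCV release used with Visual C++ 6.0: frames are pulled from the camera's Motion-JPEG stream over the network and brightness/contrast are lifted with a linear transform. The stream URL and the gain/offset values are illustrative assumptions.

```cpp
// Sketch: grab frames from the wireless camera's Motion-JPEG stream and
// improve brightness and contrast with a linear transform.
#include <opencv2/opencv.hpp>

int main() {
    // Placeholder URL for the MJPEG stream delivered over the network.
    cv::VideoCapture cap("http://192.168.0.20:8080/video.mjpg");
    if (!cap.isOpened()) return 1;

    cv::Mat frame, enhanced;
    while (cap.read(frame)) {
        // out = alpha * in + beta: alpha > 1 raises contrast, beta > 0 raises brightness.
        frame.convertTo(enhanced, -1, 1.3 /*alpha*/, 20 /*beta*/);   // assumed values
        cv::imshow("Surgeon's display", enhanced);
        if (cv::waitKey(1) == 27) break;   // Esc to quit
    }
    return 0;
}
```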
Fuzzy Based Control
[Figure: eye tracker screen divided into six clusters (Cluster 1 to Cluster 6) around the pupil]
Fuzzy C-Means Algorithm
• The point of gaze keeps fluctuating.
• The entire eye tracker screen is divided into clusters.
• The Fuzzy C-Means algorithm is used.
• The degree of belongingness of the point of gaze to a cluster is taken as the degree of membership of the fuzzy function.
• The point-of-gaze coordinates are taken to be the coordinates of the cluster centres (a sketch follows this list).
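A minimal sketch of the membership step described above, assuming six fixed clusters tiling a 1024 x 768 eye-tracker screen and a standard fuzzifier of m = 2; a full fuzzy c-means run would also iterate the cluster centres, which is omitted here. All numeric values are illustrative.

```cpp
// Sketch: fuzzy membership of a fluctuating gaze point to fixed screen clusters.
// The reported point of gaze snaps to the centre of the strongest cluster,
// damping small fluctuations.
#include <cmath>
#include <cstdio>
#include <vector>

struct Point { double x, y; };

const double M = 2.0;   // fuzzifier (assumed)

// Membership of point p in each cluster: u_i = 1 / sum_k (d_i / d_k)^(2/(m-1)).
std::vector<double> memberships(const Point& p, const std::vector<Point>& centres) {
    std::vector<double> d(centres.size()), u(centres.size());
    for (size_t i = 0; i < centres.size(); ++i)
        d[i] = std::hypot(p.x - centres[i].x, p.y - centres[i].y) + 1e-9;
    for (size_t i = 0; i < centres.size(); ++i) {
        double s = 0.0;
        for (size_t k = 0; k < centres.size(); ++k)
            s += std::pow(d[i] / d[k], 2.0 / (M - 1.0));
        u[i] = 1.0 / s;
    }
    return u;
}

int main() {
    // Six clusters tiling a 1024 x 768 eye-tracker screen (assumed layout).
    std::vector<Point> centres = {
        {171, 192}, {512, 192}, {853, 192},
        {171, 576}, {512, 576}, {853, 576}};

    Point gaze = {490, 210};                        // example fluctuating gaze sample
    std::vector<double> u = memberships(gaze, centres);

    size_t best = 0;
    for (size_t i = 1; i < u.size(); ++i)
        if (u[i] > u[best]) best = i;

    std::printf("cluster %zu, centre (%.0f, %.0f), membership %.2f\n",
                best + 1, centres[best].x, centres[best].y, u[best]);
    return 0;
}
```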
Specific Aims
• To cover the surgical area with the camera.
• To obtain the point of gaze of the surgeon with the eye tracker.
• To control the robotic camera based on the point-of-gaze coordinates.
• To facilitate the surgeon's view.
Budget
Project Participation
• Robot assembling: Veeral, Anuj & Anand
• Inverse kinematics: Anuj & Anand
• Software for kinematics control: Anuj & Veeral
• Interfacing eye tracker and robot: Anuj, Veeral & Anand
• Eye tracker output: Anand & Veeral
Time Frame
Task                     Time Duration
Conceptualization        Jan 10th - Jan 25th
Literature Review        Jan 26th - Feb 10th
Ordering Hardware        Feb 15th
Proposal Writing         Feb 15th - Feb 27th
Robot Assembling         Feb 28th - March 5th
Software Development     March 5th - March 25th
Final Report Writing     March 25th - April 10th
Testing                  April 10th - April 20th
References
• M. Farid, F. Murtagh, J.L. Starck, "Computer Display Control and Interaction Using Eye-Gaze," School of Computer Science, Belfast, UK.
• A. Nishikawa, "Face Mouse: A Novel Human-Machine Interface for Controlling the Position of a Laparoscope," IEEE Transactions on Robotics and Automation, Vol. 19, No. 5, October 2003.
• F. Murtagh, "Eye-Gaze Tracking System - Visual Mouse Application Development," 3rd Year Training Report, E.N.P.S. Engineering Degree, March-August 2001.
References (contd.)
• M.E. Allaf, "Laparoscopic Visual Field - Voice vs. Foot Pedal Interfaces for Control of AESOP Robot," Surgical Endoscopy, Feb 1998.
• A. Casals, J. Amat, E. Laporte, "Automatic Guidance of an Assistant Robot in Laparoscopic Surgery," IEEE International Conference on Robotics and Automation, 1996.
• R. Hurteau, S. DeSantis, "Laparoscopic Surgery Assisted by a Robotic Cameraman: Concept and Experimental Results," IEEE, 1994.
References (contd.)
• G.P. Mylonas, D. Stoyanov, "Gaze-Contingent Soft Tissue Deformation Tracking for Minimally Invasive Robotic Surgery," MICCAI 2005, LNCS 3749, pp. 843-850, 2005.
• S.T. Iqbal, B.P. Bailey, "Using Eye Gaze Patterns to Identify User Tasks," GHC 2004.
Thank You!
Questions?