Senior Capstone Project Proposal Presentation

Transcript: Senior Capstone Project Proposal Presentation

Team IRALAR
Breanna Heidenburg -- Michael Lenisa -- Daniel Wentzel
Advisor: Dr. Malinowski

The Project
◦ Why is it important?

The Goals
◦ System breakdown
 Image recognition
 Point transformation
 User Interface

The Results
What is our project?

Track a user’s eye and use the information to
control a computer cursor.
 Enhances Human-Computer Interaction
◦ Speed of use
◦ Hands-free use

3 Part System
◦ Image Processing Application
◦ Calibration and Mapping system
◦ GUI designed for gaze-based interaction


Systems developed concurrently and
independently
Separate Applications at run-time
Hardware and Image
Processing Application
[System diagram: Hardware (input) → Image Processing Application (Image Processing & Cursor Control Thread, UDP Server Thread, Shared Process Data) ↔ OS channel / UDP channel ↔ User Interface Application (UI Thread) → Hardware (output)]

Hardware
◦ Camera
 QuickCam Pro for Notebooks
 Visible Spectrum Camera
◦ Polarizer
 Tiffen 25mm polarizing filter
 Removes glare from eye reflections
◦ Lighting
 Diffused LED
 Slightly distracting to the user, but necessary to provide light
for the camera

LitEye LE-500 head-mounted display
◦ High resolution (SVGA)
◦ Color display
◦ Translucent or opaque operation
◦ Stationary relative to the user's eye
Image Processing
Application

Real-time pupil tracking system
◦ Developed in C using OpenCV image processing libraries
◦ Traditional image processing and blob tracking

Capabilities
◦ Locate and determine center of pupil in image
◦ Low-light and high-reflection environments
◦ All eye colors
◦ Data logging and static test modes
◦ Packaged into self-contained Windows installer for easy deployment onto any computer
[Image processing pipeline: Capture Image (query frame from camera) → Extract Red Channel → Contrast Stretch → Smooth Image → Locate Blobs → Reject False Positives → Determine Center of Pupil Blob → Calculate Final Center → Adapt Algorithm, then repeat]
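
The slides state only that the tracker was written in C with OpenCV using traditional image processing and blob tracking. The following is a minimal sketch of such a pipeline using the OpenCV 1.x C API; the threshold value, kernel size, and blob-size limits are illustrative assumptions rather than the project's actual parameters, and the "adapt algorithm" step is omitted.

/* Hypothetical sketch of the pupil-tracking pipeline using the OpenCV 1.x C API.
 * The threshold, kernel size, and blob-size limits are illustrative guesses,
 * not the values used by the actual IRALAR application. */
#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <math.h>
#include <stdio.h>

int main(void)
{
    CvCapture *cam = cvCaptureFromCAM(0);            /* query frames from camera */
    CvMemStorage *storage = cvCreateMemStorage(0);
    if (!cam) return 1;

    for (;;) {
        IplImage *frame = cvQueryFrame(cam);         /* capture image */
        if (!frame) break;

        CvSize sz = cvGetSize(frame);
        IplImage *red  = cvCreateImage(sz, IPL_DEPTH_8U, 1);
        IplImage *tmpB = cvCreateImage(sz, IPL_DEPTH_8U, 1);
        IplImage *tmpG = cvCreateImage(sz, IPL_DEPTH_8U, 1);

        cvSplit(frame, tmpB, tmpG, red, NULL);            /* extract red channel (BGR order) */
        cvNormalize(red, red, 0, 255, CV_MINMAX, NULL);   /* contrast stretch */
        cvSmooth(red, red, CV_GAUSSIAN, 5, 5, 0, 0);      /* smooth image */

        /* The pupil appears dark in the red channel: threshold it and locate blobs. */
        cvThreshold(red, red, 40, 255, CV_THRESH_BINARY_INV);
        CvSeq *contours = NULL;
        cvFindContours(red, storage, &contours, sizeof(CvContour),
                       CV_RETR_LIST, CV_CHAIN_APPROX_SIMPLE, cvPoint(0, 0));

        /* Reject false positives: keep the largest blob in a plausible size range,
         * then take its centroid as the pupil center. */
        double best_area = 0.0, cx = 0.0, cy = 0.0;
        CvSeq *c;
        for (c = contours; c != NULL; c = c->h_next) {
            CvMoments m;
            cvMoments(c, &m, 0);
            double m00 = cvGetSpatialMoment(&m, 0, 0);
            double area = fabs(m00);
            if (m00 != 0.0 && area > 200.0 && area < 20000.0 && area > best_area) {
                best_area = area;
                cx = cvGetSpatialMoment(&m, 1, 0) / m00;  /* blob centroid x */
                cy = cvGetSpatialMoment(&m, 0, 1) / m00;  /* blob centroid y */
            }
        }
        if (best_area > 0.0)
            printf("pupil center: (%.1f, %.1f)\n", cx, cy);

        cvClearMemStorage(storage);
        cvReleaseImage(&red);
        cvReleaseImage(&tmpB);
        cvReleaseImage(&tmpG);
    }
    cvReleaseMemStorage(&storage);
    cvReleaseCapture(&cam);
    return 0;
}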

Summary
◦ The Good
 Dynamically adapts to changing lighting
conditions and eye types
 Maintains performance in low-light and
specularly noisy conditions
◦ The Bad
 Still relies on Logitech camera drivers
 Extreme reflections still cause problems

[Example images of performance in poor conditions: low light, difficult conditions, and a false positive. Note: image brightness and contrast artificially enhanced for human visibility]
Calibration and point mapping


System for mapping the location of the center of
the pupil to a pixel on a computer screen
Must Calibrate for each User
◦ Geometry
 The eye is not flat but a screen is
◦ User Customization
 All eyes are different
 Everyone wears the HMD differently
◦ User Training
 Calibration system also acts as a quick tutorial

3-dimensional best-fit plane
◦ Currently using a 4th-degree best fit
Xpix = A1 + Xeye*B1 + Yeye*C1
Ypix = A2 + Xeye*B2 + Yeye*C2

Calibration sub-system determines these
coefficients

How do we solve the problem?
◦ Multiple Variable Linear Regression – Least Squares
Y = B0 + B1*x1 + … + Bk*xk
◦ Uses matrix algebra to obtain a coefficient matrix
B = (X'X)^-1 * X'Y
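
As a worked illustration of the regression above, the sketch below builds the 3x3 normal equations for one screen axis (pix = A + B*Xeye + C*Yeye) from the calibration samples and solves B = (X'X)^-1 X'Y with Cramer's rule. The function and variable names are hypothetical, not taken from the project code.

/* Hypothetical least-squares fit for one screen axis of the mapping
 * pix = A + B*x_eye + C*y_eye.  Builds the 3x3 normal equations
 * (X'X) b = X'y from the calibration samples and solves them with
 * Cramer's rule.  Names are illustrative, not from the project code. */

static double det3(double m[3][3])
{
    return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
         - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
         + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
}

/* xe, ye: pupil-center coordinates for each calibration point;
 * pix: the corresponding screen coordinate on one axis; n: sample count.
 * Writes A, B, C and returns 0, or -1 if the system is singular. */
int fit_axis(const double *xe, const double *ye, const double *pix, int n,
             double *A, double *B, double *C)
{
    double XtX[3][3] = {{0.0}};
    double Xty[3] = {0.0};
    int i, j, r, c;

    for (i = 0; i < n; i++) {
        double row[3] = {1.0, xe[i], ye[i]};     /* one row of the design matrix X */
        for (j = 0; j < 3; j++) {
            XtX[j][0] += row[j] * row[0];
            XtX[j][1] += row[j] * row[1];
            XtX[j][2] += row[j] * row[2];
            Xty[j]    += row[j] * pix[i];
        }
    }

    double d = det3(XtX);
    if (d == 0.0) return -1;                     /* degenerate calibration data */

    double coef[3];
    for (j = 0; j < 3; j++) {                    /* Cramer's rule: replace column j */
        double tmp[3][3];
        for (r = 0; r < 3; r++)
            for (c = 0; c < 3; c++)
                tmp[r][c] = (c == j) ? Xty[r] : XtX[r][c];
        coef[j] = det3(tmp) / d;
    }
    *A = coef[0]; *B = coef[1]; *C = coef[2];
    return 0;
}

The calibration sub-system would call such a routine twice with the same eye samples: once with the recorded Xpix targets and once with the Ypix targets.
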
Error Mesh

 Cursor position error: actual vs. determined position
 Horizontal and vertical error in screen pixels

Accuracy varies with position and skill of user
 Corners of screen most difficult to calibrate
 Focusing on a rapidly changing location requires skill

Limitations
 1° Accuracy of human vision system
 Eye Saccades

Original error goal
 2% of screen dimension on both axes

Achieved error
 1.18% Horizontal
 1.46% Vertical

How do we click?
◦ Monitor eye movements
◦ Identify pauses
◦ “Dwell time” - when eye position is focused on a single area for a period of time
◦ Currently set at 5 frames (~200 ms); a sketch of this logic follows below
◦ Generally, it takes 230 ms for a hand to click a mouse
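
A minimal sketch of the dwell-click idea in C: the 5-frame count matches the figure quoted above, while the 40-pixel radius and all names are illustrative assumptions.

/* Hypothetical dwell-click detector: if the mapped gaze point stays inside a
 * small radius for DWELL_FRAMES consecutive frames (about 200 ms at the
 * camera's frame rate), report a click.  The 40-pixel radius and all names
 * are illustrative assumptions. */
#include <math.h>

#define DWELL_FRAMES 5
#define DWELL_RADIUS 40.0   /* pixels */

typedef struct {
    double anchor_x, anchor_y;  /* where the current fixation started */
    int frames_held;            /* consecutive frames inside the radius */
} DwellState;

/* Call once per frame with the mapped cursor position (state zeroed at start).
 * Returns 1 exactly once per fixation, when the dwell time is reached. */
int dwell_update(DwellState *s, double x, double y)
{
    double dx = x - s->anchor_x;
    double dy = y - s->anchor_y;

    if (sqrt(dx * dx + dy * dy) > DWELL_RADIUS) {
        /* Gaze moved away: restart the dwell count at the new location. */
        s->anchor_x = x;
        s->anchor_y = y;
        s->frames_held = 1;
        return 0;
    }
    if (++s->frames_held == DWELL_FRAMES)
        return 1;
    return 0;
}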

Improved interaction speed
 Trackpad vs. Mouse vs. Gaze Tracking
 53% increase over trackpad
 12% increase over traditional mouse
Custom GUI interface and
Communications

Custom GUI for Gaze Tracking Applications

Why?
◦ Gaze tracking accuracy limited by inherent
properties of the human vision system
◦ Traditional GUI too small and intrusive for use with
transparent HMD
◦ Demonstrate applications of gaze tracking
Modern GUI Design
◦ WPF (Windows Presentation Foundation) using XAML (eXtensible Application Markup Language) layout
◦ XAML is similar to HTML
◦ Uses tags and ‘code-behind’ in a similar style to JavaScript
◦ GUI coded in C#


Multiple pages within the interface
Screens for functionality testing
◦ Even games
Ability to minimize the interface
Main Menu Screen




Large text and buttons
Wide spacing between options
Simple layout
Placement of features in high-accuracy areas

Why?
◦ Allows processes to communicate
◦ Allows relay of time-sensitive information

2 Communication Channels
◦ OS Channel
 Uni-directional (Image Processing to User Interface)
◦ UDP Channel
 Multi-directional
 Separate thread in the Image Processing Application (see the sketch below)
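
A minimal sketch of what the UDP server thread's setup and request loop could look like using Winsock; the port number, buffer sizes, and reply format are assumptions, and reading the shared data is deferred to the critical-section sketch further below.

/* Hypothetical Winsock skeleton for the UDP server thread: wait for a datagram
 * request and reply to the sender.  The port number and message format are
 * assumptions; reading the shared gaze data is shown in the critical-section
 * sketch further below. */
#include <winsock2.h>
#include <stdio.h>
#pragma comment(lib, "ws2_32.lib")

#define UDP_PORT 5005   /* illustrative port, not taken from the project */

DWORD WINAPI udp_server_thread(LPVOID param)
{
    WSADATA wsa;
    (void)param;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;

    SOCKET sock = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
    if (sock == INVALID_SOCKET) return 1;

    struct sockaddr_in addr;
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(UDP_PORT);
    if (bind(sock, (struct sockaddr *)&addr, sizeof(addr)) == SOCKET_ERROR)
        return 1;

    for (;;) {
        char request[64], reply[64];
        struct sockaddr_in client;
        int client_len = sizeof(client);

        int n = recvfrom(sock, request, sizeof(request), 0,
                         (struct sockaddr *)&client, &client_len);
        if (n == SOCKET_ERROR) break;

        /* ...read shared gaze/cursor data here, under a critical section... */
        int len = sprintf(reply, "x=%d y=%d", 0, 0);   /* placeholder payload */
        sendto(sock, reply, len, 0, (struct sockaddr *)&client, client_len);
    }
    closesocket(sock);
    WSACleanup();
    return 0;
}
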
[System diagram: the Image Processing & Cursor Control Thread and the UDP Server Thread both access the Shared Process Data, which creates a multi-threading issue]

Multi-threading
◦ Public variable usage
◦ Solution: Critical Section
 Raises thread priority (thread is uninterruptible)
[Flow: Receive request for data (over UDP) → Raise thread priority → Read variable → Lower thread priority → Reply to request; sketched below]
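
A minimal sketch of that flow using the Windows API: the reader raises its thread priority, takes a CRITICAL_SECTION shared with the image-processing thread, copies the cursor position, then restores its priority. Names and the chosen priority levels are illustrative, not the project's actual code.

/* Hypothetical sketch of the flow above: the UDP server thread raises its
 * priority, reads the shared cursor position inside a CRITICAL_SECTION it
 * shares with the image-processing thread, then lowers its priority and
 * replies.  Names and priority levels are illustrative. */
#include <windows.h>

typedef struct {
    int cursor_x;
    int cursor_y;
} SharedProcessData;

static SharedProcessData g_shared;       /* written by the image-processing thread */
static CRITICAL_SECTION  g_shared_lock;  /* InitializeCriticalSection() at startup */

/* Safely snapshot the shared cursor position before building the UDP reply. */
void read_shared_cursor(int *x, int *y)
{
    HANDLE self = GetCurrentThread();

    SetThreadPriority(self, THREAD_PRIORITY_HIGHEST);  /* raise thread priority */
    EnterCriticalSection(&g_shared_lock);              /* mutual exclusion with writer */

    *x = g_shared.cursor_x;                            /* read variable */
    *y = g_shared.cursor_y;

    LeaveCriticalSection(&g_shared_lock);
    SetThreadPriority(self, THREAD_PRIORITY_NORMAL);   /* lower thread priority */
}
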
[System diagram: complete system, now including a Controller and a Controller Thread alongside the Image Processing & Cursor Control Thread, UI Thread, and UDP Server Thread]
Goals met and future projects




Image Processing based eye tracking system
 Correlates eye position to location on screen
 Adapts to wide variety of eye types
 Ability to function as a hands-free input device
Cursor control via eye-tracking system
Display visual output to user
Visually controlled demo programs




Augmented reality system
 Transparent heads-up display over the user's vision
 Portable system for everyday use
 Forward-looking camera to correlate the user's real-world vision with eye position
Real-world applications
Network integration
Sound output
Installation Screen

Windows-compatible software package
 Standard Windows installer
 Contains the image processing application and GUI




Pupil tracking with a neural network
Implementation of a camera driver with DirectShow
Implement head tracking for gaze tracking without a head-mounted display
Augmented reality
◦ Front-facing camera
◦ Object/face recognition

Implement real-world applications

Any one of these is a senior project in itself!
Who helped us out?



Dr. Malinowski and the EE faculty
Mr. Mattus & Mr. Schmidt
Our test subjects