Activity Recognition
Taiwoo Park
May 7, 2013
Bao, Ling, and Stephen S. Intille. "Activity recognition from user-annotated acceleration data." Pervasive Computing. Springer Berlin
Heidelberg, 2004. 1-17.
Park, Taiwoo, et al. "E-gesture: a collaborative architecture for energy-efficient gesture recognition with hand-worn sensor and
mobile devices." Proceedings of the 9th ACM Conference on Embedded Networked Sensor Systems. ACM, 2011.
Some slides are from CSCI 546 course materials by James Reinebold, USC
1
Activity?
• Higher level activities
– Giving a lecture, having a breakfast, playing soccer…
• Lower level activities
– Lying on a bed, standing still, running, walking, …
2
An easy example
• Assumptions:
– Sensors on smartphones are only available
• Accelerometer, compass, gyroscope, light, …
• You can attach smartphones on your body
– Only three target activities to recognize
• Running
• Standing still
• Lying on a bed
• How can we recognize activities?
3
An easy example (cont’d)
Is the phone being shaken?   Phone orientation   Activity
No                           Upright             Standing still
Yes                          Upright             Running
No                           Lying down          Lying on a bed
Yes                          Lying down          Nothing

[Figure: phone coordinate axes x, y, z]

Features:
• Average value of accelerometer y-axis sensor signals for the last 3 seconds (phone orientation)
• Variance of the accelerometer sensor signal for the last 3 seconds (is the phone being shaken?)
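With only these two features, the example can be handled by a couple of threshold rules. Below is a minimal sketch; the 3-second window comes from the slide, but the variance and orientation thresholds (and the use of signal magnitude for the shake test) are illustrative assumptions.

```python
import numpy as np

GRAVITY = 9.8  # m/s^2

def classify(window_y, window_mag):
    """window_y: y-axis accel samples from the last 3 seconds,
    window_mag: accel magnitude samples from the last 3 seconds."""
    shaken = np.var(window_mag) > 2.0            # "Is the phone being shaken?" (assumed threshold)
    upright = np.mean(window_y) > 0.5 * GRAVITY  # y-axis carries gravity when upright (assumed threshold)
    if upright:
        return "Running" if shaken else "Standing still"
    return "Nothing" if shaken else "Lying on a bed"
```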
4
Activity recognition pipeline
Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and
Intention Recognition." (2011).
5
An easy example (revisited)
The same example, mapped onto the pipeline stages:

Is the phone being shaken?   Phone orientation   Activity
No                           Upright             Standing still
Yes                          Upright             Running
No                           Lying down          Lying on a bed
Yes                          Lying down          Nothing

• Windowing: keep the last 3 seconds of accelerometer signal
• Feature extraction: average of the y-axis signal and variance of the signal over that window
• Classification: map the two feature values to an activity using the table above
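The windowing stage is the only part not spelled out above; a minimal sketch follows. The 3-second window matches the example, while the 50 Hz sampling rate and 50% overlap are illustrative assumptions.

```python
import numpy as np

def sliding_windows(signal, rate_hz=50, window_sec=3.0, overlap=0.5):
    """Cut a continuous sensor stream into fixed-length, overlapping windows."""
    length = int(rate_hz * window_sec)
    step = max(1, int(length * (1 - overlap)))
    for start in range(0, len(signal) - length + 1, step):
        yield np.asarray(signal[start:start + length])

# Each window is then passed to feature extraction and classification.
```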
6
Data collection
• Semi-Naturalistic, User-Driven
Data Collection
– Obstacle course / worksheet
– No researcher supervision while
subjects performed the tasks
• Timer synchronization
• Discard data within 10 seconds of
start and finish time for activities
Bao, Ling, and Stephen S. Intille. "Activity recognition from user-annotated acceleration data." Pervasive Computing. Springer Berlin
Heidelberg, 2004. 1-17.
Activities
• Walking
• Walking Carrying Items
• Sitting and Relaxing
• Working on Computer
• Standing Still
• Eating or Drinking
• Watching TV
• Reading
• Running
• Bicycling
• Stretching
• Strength-training
• Scrubbing
• Vacuuming
• Folding Laundry
• Lying down & relaxing
• Brushing Teeth
• Climbing stairs
• Riding Elevator
• Riding escalator
Data collection
Source: Bao 2004
Sensors Used
• Five ADXL210E accelerometers (manufactured by
Analog Devices)
– Range of +/- 10g
– 5mm x 5mm x 2mm
– Low Power, Low Cost
– Measures both static and dynamic acceleration
• Sensor data was stored on a memory card using the "Hoarder Board"
Source: http://vadim.oversigma.com/Hoarder/LayoutFront.htm
Example Signals
Source: Bao 2004
Activity recognition pipeline
Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and
Intention Recognition." (2011).
12
Classification
Question:
Which of the previously collected (labeled) data samples is most similar to the current one?

[Figure: an unlabeled sample "?" compared against samples collected in advance for 'Running', 'Standing still', 'Lying on a bed', and 'Walking']

Methods:
Naïve Bayes, nearest neighbor, decision table/tree, HMM (hidden Markov models), …
13
Decision Table
Is the phone being shaken?   Phone orientation   Activity
No                           Upright             Standing still
Yes                          Upright             Running
No                           Lying down          Lying on a bed
Yes                          Lying down          Nothing

[Figure: phone coordinate axes x, y, z]
14
Decision Trees
• Build a tree where each non-leaf node tests a feature, each edge represents a value range of that feature, and each leaf node is a classification.
• To classify a sample, move through the tree until you arrive at a leaf node.
• Generally, the smaller the tree, the better.
– Finding the smallest tree is NP-hard.
Source: http://pages.cs.wisc.edu/~dyer/cs540/notes/learning.html
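As a concrete illustration, here is a tiny hand-written tree over the two features from the earlier example; the thresholds are assumptions chosen only to make the sketch runnable, not learned values.

```python
def tree_classify(mean_y, variance):
    """Non-leaf nodes test a feature; each branch is a value range; leaves are activities."""
    if mean_y < 3.0:          # phone is not upright -> lying down
        return "Lying on a bed"
    if variance > 2.0:        # upright and being shaken
        return "Running"
    return "Standing still"   # upright and still
```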
Decision Tree Example
• Phone orientation?
  – Lying down → Lying on a bed
  – Not lying down → Is the phone being shaken?
    • Yes → Running
    • No → Standing still
Nearest Neighbor
• Treat each feature as one dimension of the feature space, so every sample becomes a point.
• Classify an unknown point by having its K nearest neighbors "vote" on which class it belongs to.
• A simple, easy-to-implement algorithm, but it does not work well when the classes do not form clusters.
Source: http://pages.cs.wisc.edu/~dyer/cs540/notes/learning.html
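A minimal k-nearest-neighbor sketch over labeled feature vectors (e.g. [mean of accel y, variance of accel]); k = 3 and the Euclidean distance are illustrative choices.

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=3):
    """train_X: list of feature vectors, train_y: their labels, x: unknown sample."""
    dists = np.linalg.norm(np.asarray(train_X) - np.asarray(x), axis=1)
    nearest = np.argsort(dists)[:k]              # indices of the k closest training samples
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)      # majority vote
```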
Nearest Neighbor Example
[Figure: training samples plotted in feature space (x-axis: average value of accel y, y-axis: variance of accel), forming clusters for Running, Standing still, and Lying on a bed]
Naïve Bayes Classifier
• Scores each class by combining the likelihood of the observed data point under that class with the class's prior probability, both estimated from the training set.
– Bayes' rule: P(B|A) = P(A|B) * P(B) / P(A)
• Assumes that the features are independent of each other (given the class).
• Relatively fast.
Source: cis.poly.edu/~mleung/FRE7851/f07/naiveBayesianClassifier.pdf
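A sketch of a Gaussian naïve Bayes classifier over per-window features; the Gaussian per-feature likelihood is a common modeling assumption, not necessarily the exact variant used in the papers above.

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def nb_classify(class_stats, priors, x):
    """class_stats[c] = (per-feature means, per-feature variances); priors[c] = P(c)."""
    x = np.asarray(x, dtype=float)
    best, best_score = None, -np.inf
    for c, (means, variances) in class_stats.items():
        # log P(c) + sum_i log P(x_i | c): the naive independence assumption
        score = np.log(priors[c]) + np.sum(
            np.log(gaussian_pdf(x, np.asarray(means), np.asarray(variances))))
        if score > best_score:
            best, best_score = c, score
    return best
```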
Activity recognition pipeline
Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and
Intention Recognition." (2011).
20
Feature Extraction
• Time-domain features [Maurer 2006]
– Mean (average), Root Mean Square, Variance, …
• FFT-based feature computation [Bao 2004]
– Sample at 76.25 Hz
– 512 sample windows (about 6.71 sec)
– Extract mean energy, entropy, and correlation features
Maurer, Uwe, et al. "Activity recognition and monitoring using multiple sensors on different body
positions." Wearable and Implantable Body Sensor Networks, 2006. BSN 2006. International Workshop on. IEEE, 2006.
Source: Bao 2004
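A sketch of per-window feature computation combining the time-domain features from Maurer 2006 with FFT-based energy and entropy in the spirit of Bao 2004 (correlation between axis pairs is omitted for brevity); the exact normalizations in those papers may differ.

```python
import numpy as np

def window_features(w):
    """w: one window of accelerometer samples (e.g. 512 samples at 76.25 Hz)."""
    w = np.asarray(w, dtype=float)
    mean = np.mean(w)
    rms = np.sqrt(np.mean(w ** 2))
    var = np.var(w)

    spectrum = np.abs(np.fft.rfft(w - mean)) ** 2     # power spectrum with DC removed
    energy = np.sum(spectrum) / len(w)                # mean spectral energy
    p = spectrum / (np.sum(spectrum) + 1e-12)         # normalized spectral distribution
    entropy = -np.sum(p * np.log2(p + 1e-12))         # frequency-domain entropy
    return [mean, rms, var, energy, entropy]
```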
Results
Classifier          Classification accuracy (%, leave-one-subject-out training)
Decision Table      46.75 +/- 9.296
Nearest Neighbor    82.70 +/- 6.416
Decision Tree       84.26 +/- 5.178
Naïve Bayes         52.35 +/- 1.690
• Decision tree was the best performer, but…
Per-activity accuracy breakdown
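The accuracies above come from leave-one-subject-out training; a sketch of that protocol follows, using scikit-learn as an assumed tool (not the tooling used in Bao 2004). Each fold trains on all subjects but one and tests on the held-out subject.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.tree import DecisionTreeClassifier

def loso_accuracy(X, y, subjects):
    """X: window feature vectors, y: activity labels, subjects: subject id per window."""
    scores = cross_val_score(DecisionTreeClassifier(), X, y,
                             groups=subjects, cv=LeaveOneGroupOut())
    return np.mean(scores), np.std(scores)
```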
Trying With Fewer Sensors
Accelerometer(s) left in   Difference in recognition accuracy (%)
Hip                        -34.12 +/- 7.115
Wrist                      -51.99 +/- 12.194
Arm                        -63.65 +/- 13.143
Ankle                      -37.08 +/- 7.601
Thigh                      -29.47 +/- 4.855
Thigh and Wrist             -3.27 +/- 1.062
Hip and Wrist               -4.78 +/- 1.331
With only two accelerometers (thigh and wrist, or hip and wrist) we can still get good performance.
Lessons
• Accelerometers can be used to effectively distinguish
between everyday activities.
• Decision trees and nearest neighbor algorithms are
good choices for activity recognition.
• Some sensor locations are more important than
others.
• Selecting a feature set is important to increase
recognition accuracy.
E-Gesture:
A Collaborative Architecture
for Energy-efficient Gesture Recognition
with Hand-worn Sensor and Mobile Devices
28
Motivation
• Mobile applications using hand gestures
• Mobile gesture interaction framework:
  – Wristwatch-type hand-worn motion sensor (accelerometer, gyroscope)
  – Smartphone
• Key challenge: mobility!
29
Challenge: Energy and Accuracy
• Conventional gesture processing pipeline (for gesture recognition in stationary setting)
Sensor (accelerometer + gyroscope) → continuous raw data → gesture segmentation (button press or segmentation algorithms) → candidate gesture samples → classification on the mobile device (HMM, DTW) → result: gesture 'A' or 'B', or non-gesture
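DTW is one of the classifiers named above; below is a minimal sketch of template matching by DTW distance (plain O(n·m) dynamic programming, no warping-window constraint). The templates and labels are assumptions for illustration, not the E-Gesture implementation.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two gesture sample sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(np.asarray(a[i - 1], dtype=float) -
                               np.asarray(b[j - 1], dtype=float))
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify_gesture(templates, segment):
    """templates: {label: reference sequence}; label the segment with the closest template."""
    return min(templates, key=lambda label: dtw_distance(templates[label], segment))
```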
30
Challenge: Energy and Accuracy
With the conventional pipeline under mobility, both energy and accuracy suffer:
• Energy: the sensor lasts only 20 hrs on a 250 mAh battery, and continuous processing cuts the smartphone's battery life from 24 hrs to 17 hrs.
• Accuracy: mobility noise in the continuous accel/gyro stream causes over 90% false segmentation, and classification accuracy drops to only 70%.
31
E-Gesture Architecture
1. Device-wise collaboration
  – Gesture sensing and segmentation run on the wristwatch sensor device; classification runs on the smartphone.
  – The wristwatch sends only candidate gesture samples (a trigger) to the phone, which classifies them with an adaptive, multi-situation HMM and returns the result (e.g., "laydown").
2. Sensor-wise collaboration
  – The accelerometer turns the gyroscope on only when needed, for energy efficiency.
  – The gyroscope adapts the accelerometer's sensitivity as the user's mobility changes.
  – Accelerometer: (+) energy-efficient, (-) vulnerable to mobility noise.
  – Gyroscope: (-) energy-hungry, (+) robust to mobility noise.
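A sketch of the sensor-wise collaboration idea: the always-on, low-power accelerometer wakes the energy-hungry gyroscope only while a candidate gesture may be in progress. The variance threshold and the gyro power-control calls are illustrative assumptions, not the E-Gesture implementation.

```python
import numpy as np

ACCEL_TRIGGER_VAR = 1.5  # assumed variance threshold on the accelerometer signal

def on_accel_window(accel_window, gyro):
    """accel_window: latest accelerometer samples; gyro: hypothetical gyroscope driver."""
    if np.var(accel_window) > ACCEL_TRIGGER_VAR:
        gyro.power_on()    # hypothetical call: start gyro sampling for robust segmentation
    else:
        gyro.power_off()   # keep the gyro off while the hand is still, saving energy
```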
32
Sensor-side Energy Savings
Sensor node battery: 250 mAh Li-ion

Configuration                                         Power          Lifetime
Continuous sensing + transmission                     46 mW          20 hrs
Device-wise collaboration (reduced transmission)      39 mW (↓15%)   23.7 hrs (1.2x)
Device-wise + sensor-wise collaboration
(gyroscope power control, reduced transmission)       19 mW (↓59%)   48.7 hrs (2.4x)

→ 59% less energy consumption, 2.4x longer lifetime
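The lifetimes follow from battery energy divided by average power; the sketch below assumes a nominal 3.7 V Li-ion cell voltage (an assumption, since the slides only give mAh and mW). The same arithmetic roughly reproduces the Nexus One numbers on the next slide.

```python
def lifetime_hours(capacity_mah, power_mw, voltage_v=3.7):
    """Battery lifetime = stored energy (mWh) / average power draw (mW)."""
    return capacity_mah * voltage_v / power_mw

print(lifetime_hours(250, 46))    # ~20.1 h: continuous sensing + transmission
print(lifetime_hours(250, 19))    # ~48.7 h: device-wise + sensor-wise collaboration
print(lifetime_hours(1400, 122))  # ~42.5 h: all processing on the Nexus One (slide reports 42.1 h)
```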
33
Mobile-side Energy Savings
Smartphone: Nexus One, 1400 mAh Li-ion battery, 3G/WiFi on

Configuration                                         Power          Lifetime
All gesture processing on the mobile                  122 mW         42.1 hrs
Device-wise collaboration (reduced transmission)      70 mW (↓43%)   74 hrs (1.8x)
34
Implementation
• Sensor node
  – Atmega128L MCU
  – Bluetooth, ZigBee radios
  – Sensors: 3-axis accelerometer (ADXL335), 3-axis gyroscope (3× XV-3500CB), sampled at 40 Hz
  – Vibration motor
• Smartphones
  – Nokia N96, Google Nexus One
  – Bluetooth radio
[Figure: prototype hardware: sensor node, Bluetooth headset, Nokia N96, Google Nexus One]
35
Sample Applications
• Swan Boat [Ubicomp09][MM09][ACE09]
– Collaborative boat-racing exertion game
– Utilizes hand gestures as additional game input
• Punching together, flapping together
• Mobile Music Player, Phone Call Manager
– Featuring eye-free, touch-free controls
– Users can control these applications with hand gestures
36
Conclusion
• Mobile gestural interaction platform
– Collaborative gesture processing
• 1.8x longer battery lifetime for Smartphone
• 2.4x longer battery lifetime for Hand-worn Sensor
+ preserving gyroscope’s detection performance
– Mobility-robust gesture classification using HMM
• Up to 94.6% classification accuracy during mobile usage, enabled by the mobility-aware classification architecture design
– It will greatly facilitate gesture-based mobile applications
• Provided a novel sensor fusion scheme
• Serial fusion + feedback control
– Saves energy + preserves detection accuracy
37