
Eye Movement Analysis for Activity Recognition Using Electrooculography

Abstract. In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals (saccades, fixations, and blinks) and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method using an eight-participant study in an office environment using an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.

Index Terms: Ubiquitous computing, feature evaluation and selection, pattern analysis, signal processing

1 INTRODUCTION

HUMAN activity recognition has become an important application area for pattern recognition. Research in computer vision has traditionally been at the forefront of this work [1], [2]. The growing use of ambient and body-worn sensors has paved the way for other sensing modalities, particularly in the domain of ubiquitous computing. Important advances in activity recognition were achieved using modalities such as body movement and posture [3], sound [4], or interactions between people [5]. There are, however, limitations to current sensor configurations. Accelerometers or gyroscopes, for example, are limited to sensing physical activity; they cannot easily be used for detecting predominantly visual tasks, such as reading, browsing the Web, or watching a video. Common ambient sensors, such as reed switches or light sensors, are limited in that they only detect basic activity events, e.g., entering or leaving a room or switching an appliance on or off. Further to these limitations, activity sensing using subtle cues, such as user attention or intention, remains largely unexplored.

A rich source of information, as yet unused for activity recognition, is the movement of the eyes. The movement patterns our eyes perform as we carry out specific activities have the potential to reveal much about the activities themselves, independently of what we are looking at. This includes information on visual tasks, such as reading [6], information on predominantly physical activities, such as driving a car, but also on cognitive processes of visual perception, such as attention [7] or saliency determination [8]. In a similar manner, location or a particular environment may influence our eye movements. Because we use our eyes in almost everything that we do, it is conceivable that eye movements provide useful information for activity recognition. Developing sensors to record eye movements in daily life is still an active topic of research. Mobile settings call for highly miniaturized, low-power eye trackers with real-time processing capabilities. These requirements are increasingly addressed by commonly used video-based systems, some of which can now be worn as relatively light headgear. However, these remain expensive, and their demanding video processing tasks require bulky auxiliary equipment. Electrooculography (EOG), the measurement technique used in this work, is an inexpensive method for mobile eye movement recording; it is computationally lightweight and can be implemented using wearable sensors [9]. This is crucial with a view to long-term recordings in mobile real-world settings.

. A. Bulling is with the Computer Laboratory, University of Cambridge, 15 JJ Thomson Avenue, William Gates Building, Cambridge CB3 0FD, UK, and the School of Computing and Communications, Lancaster University, InfoLab 21, South Drive, Lancaster LA1 4WA, UK. E-mail: andreas.bulling@acm.org.
. G. Tröster is with the Department of Information Technology and Electrical Engineering, Swiss Federal Institute of Technology (ETH) Zurich, Gloriastrasse 35, 8092 Zurich, Switzerland. E-mail: troester@ife.ee.ethz.ch.
. J.A. Ward and H. Gellersen are with the School of Computing and Communications, Lancaster University, InfoLab 21, South Drive, Lancaster LA1 4WA, UK. E-mail: {j.ward, hwg}@comp.lancs.ac.uk.

Manuscript received 26 Nov. 2009; accepted 26 Feb. 2010; published online 30 Mar. 2010. Recommended for acceptance by B. Schiele. For information on obtaining reprints of this article, please send e-mail to tpami@computer.org, and reference IEEECS Log Number TPAMI-2009-11-0785. Digital Object Identifier no. 10.1109/TPAMI.2010.86.

1.1 Paper Scope and Contributions

The aim of this work is to assess the feasibility of recognizing human activity using eye movement analysis, so-called eye-based activity recognition (EAR). The specific contributions are:

1. the introduction of eye movement analysis as a new sensing modality for activity recognition,
2. the development and characterization of new algorithms for detecting three basic eye movement types from EOG signals (saccades, fixations, and blinks) and a method to assess repetitive eye movement patterns,
3. the development and evaluation of 90 features derived from these eye movement types, and
4. the implementation of a method for continuous EAR and its evaluation using a multiparticipant EOG data set involving a study of five real-world office activities.

1.2 Paper Organization

We first survey related work, introduce EOG, and describe the main eye movement characteristics that we identify as useful for EAR. We then detail and characterize the recognition methodology: the methods used for removing drift and noise from EOG signals, and the algorithms developed for detecting saccades, fixations, and blinks, and for analyzing repetitive eye movement patterns. Based on these eye movement characteristics, we develop 90 features; some are directly derived from a particular characteristic, others are devised to capture additional aspects of eye movement dynamics. We rank these features using minimum redundancy maximum relevance (mRMR) feature selection and a support vector machine (SVM) classifier. To evaluate both algorithms on a real-world example, we devise an experiment involving a continuous sequence of five office activities, plus a period without any specific activity (the NULL class). Finally, we discuss the findings gained from this experiment and give an outlook on future work.

2 RELATED WORK

2.1 Electrooculography Applications

Eye movement characteristics such as saccades, fixations, and blinks, as well as deliberate movement patterns detected in EOG signals, have already been used for hands-free operation of static human-computer [11] and human-robot [12] interfaces. EOG-based interfaces have also been developed for assistive robots [13] or as a control for an electric wheelchair [14]. Such systems are intended to be used by physically disabled people who have extremely limited peripheral mobility but still retain eye-motor coordination. These studies showed that EOG is a measurement technique that is inexpensive, easy to use, reliable, and relatively unobtrusive when compared to the head-worn cameras used in video-based eye trackers. While these applications all used EOG as a direct control interface, our approach is to use EOG as a source of information on a person's activity.

2.2 Eye Movement Analysis

A growing number of researchers use video-based eye tracking to study eye movements in natural environments. This has led to important advances in our understanding of how the brain processes tasks, and of the role that the visual system plays in this [15]. Eye movement analysis has a long history as a tool to investigate visual behavior. In an early study, Hacisalihzade et al. used Markov processes to model visual fixations of observers recognizing an object [16]. They transformed fixation sequences into character strings and used the string edit distance to quantify the similarity of eye movements. Elhelw et al. used discrete time Markov chains on sequences of temporal fixations to identify salient image features that affect the perception of visual realism [17]. They found that fixation clusters were able to uncover the features that most attract an observer's attention. Dempere-Marco et al. presented a method for training novices in assessing tomography images [18]. They modeled the assessment behavior of domain experts based on the dynamics of their saccadic eye movements. Salvucci and Anderson evaluated means for automated analysis of eye movements [19]. They described three methods based on sequence matching and hidden Markov models that interpreted eye movements as accurately as human experts but in significantly less time. All of these studies aimed to model visual behavior during specific tasks using a small number of well-known eye movement characteristics. They explored the link between the task and eye movements, but did not recognize the task or activity using this information.

2.3 Activity Recognition

In ubiquitous computing, one goal of activity recognition is to provide information that allows a system to best assist the user with his or her task [20]. Traditionally, activity recognition research has focused on gait, posture, and gesture. Bao and Intille used body-worn accelerometers to detect 20 physical activities, such as cycling, walking, and scrubbing the floor, under real-world conditions [21]. Logan et al. studied a wide range of daily activities, such as using a dishwasher or watching television, using a large variety and number of ambient sensors, including RFID tags and infrared motion detectors [22]. Ward et al. investigated the use of wrist-worn accelerometers and microphones in a wood workshop to detect activities such as hammering or cutting wood [4]. Several researchers investigated the recognition of reading activity in stationary and mobile settings using different eye tracking techniques [6], [23]. Our work, however, is the first to describe and apply a general-purpose architecture for EAR to the problem of recognizing everyday activities.

3 BACKGROUND

3.1 Electrooculography

The eye can be modeled as a dipole with its positive pole at the cornea and its negative pole at the retina. Assuming a stable corneoretinal potential difference, the eye is the origin of a steady electric potential field. The electrical signal that can be measured from this field is called the electrooculogram (EOG). If the eye moves from the center position toward the periphery, the retina approaches one electrode while the cornea approaches the opposing one. This change in dipole orientation causes a change in the electric potential field and thus in the measured EOG signal amplitude. By analyzing these changes, eye movements can be tracked. Using two pairs of skin electrodes placed at opposite sides of the eye and an additional reference electrode on the forehead, two signal components (EOGh and EOGv), corresponding to a horizontal and a vertical movement component, can be identified. EOG typically shows signal amplitudes ranging from 5 to 20 uV/degree and an essential frequency content between 0 and 30 Hz [24].

Fig. 1. Denoised and baseline-drift-removed horizontal (EOGh) and vertical (EOGv) signal components. Examples of the three main eye movement types are marked in gray: saccades (S), fixations (F), and blinks (B).

3.2 Eye Movement Types

To be able to use eye movement analysis for activity recognition, it is important to understand the different types of eye movement. We identified three basic eye movement types that can be easily detected using EOG: saccades, fixations, and blinks (see Fig. 1).

3.2.1 Saccades

The eyes do not remain still when viewing a visual scene. Instead, they have to move constantly to build up a mental "map" from interesting parts of that scene. The main reason for this is that only a small central region of the retina, the fovea, is able to perceive with high acuity. The simultaneous movement of both eyes is called a saccade. The duration of a saccade depends on the angular distance the eyes travel during this movement: the so-called saccade amplitude. Typical characteristics of saccadic eye movements are 20 degrees for the amplitude, and 10 to 100 ms for the duration [25].

3.2.2 Fixations

Fixations are the stationary states of the eyes during which gaze is held upon a specific location in the visual scene. Fixations are usually defined as the time between two successive saccades. The average fixation duration lies between 100 and 200 ms [26].

3.2.3 Blinks

The frontal part of the cornea is coated with a thin liquid film, the so-called "precorneal tear film." To spread this fluid across the corneal surface, regular opening and closing of the eyelids, or blinking, is required. The average blink rate varies between 12 and 19 blinks per minute while at rest [27]; it is influenced by environmental factors such as relative humidity, temperature, or brightness, but also by physical activity, cognitive workload, or fatigue [28]. The average blink duration lies between 100 and 400 ms [29].

4 METHODOLOGY
We first provide an overview of the architecture for EAR used in this work. We then detail our algorithms for removing baseline drift and noise from EOG signals, for detecting the three basic eye movement types, and for analyzing repetitive patterns of eye movements. Finally, we describe the features extracted from these basic eye movement types and introduce the minimum redundancy maximum relevance feature selection and the support vector machine classifier.

Fig. 2. Architecture for eye-based activity recognition on the example of EOG. Light gray indicates EOG signal processing; dark gray indicates use of a sliding window.

4.1 Recognition Architecture

Fig. 2 shows the overall architecture for EAR. The methods were all implemented offline using MATLAB and C. Input to the processing chain are the two EOG signals capturing the horizontal and the vertical eye movement components. In the first stage, these signals are processed to remove any artifacts that might hamper eye movement analysis. In the case of EOG signals, we apply algorithms for baseline drift and noise removal. Only this initial processing depends on the particular eye tracking technique used; all further stages are completely independent of the underlying type of eye movement data. In the next stage, three different eye movement types are detected from the processed eye movement data: saccades, fixations, and blinks. The corresponding eye movement events returned by the detection algorithms are the basis for extracting different eye movement features using a sliding window. In the last stage, a hybrid method selects the most relevant of these features and uses them for classification.
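The four stages of this chain can be sketched as a simple function composition. This is an illustrative skeleton only: the stage names and placeholder bodies are our own and do not reproduce the paper's MATLAB/C implementation; the window size of 30 s and the 90-feature vector come from the feature extraction described later.

```python
import numpy as np

def remove_artifacts(eog_h, eog_v):
    """Stage 1: baseline drift and noise removal (EOG-specific placeholder)."""
    return eog_h - np.median(eog_h), eog_v - np.median(eog_v)

def detect_events(eog_h, eog_v):
    """Stage 2: saccade, fixation, and blink detection (placeholder)."""
    return {"saccades": [], "fixations": [], "blinks": []}

def extract_features(events, win_size_s, step_s):
    """Stage 3: sliding-window feature extraction (placeholder, 90 features)."""
    return np.zeros((1, 90))

def classify(features):
    """Stage 4: feature selection and SVM classification (placeholder)."""
    return ["NULL"] * len(features)

def ear_pipeline(eog_h, eog_v):
    """Chain the four stages exactly in the order shown in Fig. 2."""
    h, v = remove_artifacts(eog_h, eog_v)
    events = detect_events(h, v)
    feats = extract_features(events, win_size_s=30.0, step_s=0.25)
    return classify(feats)
```

Only this skeleton's first stage would change if a different eye tracker supplied the data; the later stages operate on eye movement events, as noted above.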

4.2 EOG Signal Processing

4.2.1 Baseline Drift Removal


Baseline drift is a slow signal change superposed on the EOG signal but mostly unrelated to eye movements. It has many possible sources, e.g., interfering background signals or electrode polarization [30]. Baseline drift only marginally influences the EOG signal during saccades; however, other eye movements are subject to baseline drift. In the five-electrode setup used in this work (see Fig. 8), baseline drift may also differ between the horizontal and vertical EOG signal components. Several approaches to remove baseline drift from electrocardiography (ECG) signals have been proposed (for example, see [31], [32], [33]). As ECG shows repetitive signal characteristics, these algorithms perform sufficiently well at removing baseline drift. However, for signals with nonrepetitive characteristics such as EOG, developing algorithms for baseline drift removal is still an active area of research. We used an approach based on the wavelet transform [34]. The algorithm first performs an approximated multilevel 1D wavelet decomposition at level nine using Daubechies wavelets on each EOG signal component. The reconstructed decomposition coefficients give a baseline drift estimation. Subtracting this estimation from each original signal component yields the corrected signal with reduced drift offset.
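The estimate-then-subtract idea can be sketched as follows. Note this is a simplified stand-in: it builds the coarse approximation with a pyramid of pairwise (Haar-style) averages rather than the level-nine Daubechies decomposition used in the paper, which one would normally compute with a wavelet library.

```python
import numpy as np

def estimate_baseline_drift(signal, levels=9):
    """Approximate the slow baseline of a signal by a multilevel Haar-style
    approximation: repeatedly halve the resolution by pairwise averaging,
    then interpolate the coarse approximation back to the original length."""
    approx = np.asarray(signal, dtype=float)
    n = len(approx)
    for _ in range(levels):
        if len(approx) < 2:
            break
        if len(approx) % 2:               # pad to even length before pairing
            approx = np.append(approx, approx[-1])
        approx = 0.5 * (approx[0::2] + approx[1::2])
    # upsample the coarse approximation back onto the original sample grid
    x_coarse = np.linspace(0, n - 1, num=len(approx))
    return np.interp(np.arange(n), x_coarse, approx)

def remove_baseline_drift(signal, levels=9):
    """Subtract the estimated drift from the original signal."""
    return np.asarray(signal, dtype=float) - estimate_baseline_drift(signal, levels)
```

Because nine levels of averaging smooth away everything faster than the drift, fast saccadic edges pass through largely untouched while the slow offset is removed.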

4.2.2 Noise Removal

EOG signals may be corrupted by noise from different sources, such as the residential power line, the measurement circuitry, electrodes and wires, or other interfering physiological sources such as electromyographic (EMG) signals. In addition, simultaneous physical activity may cause the electrodes to lose contact or move on the skin. As mentioned before, EOG signals are typically nonrepetitive. This prohibits the application of denoising algorithms that make use of structural and temporal knowledge about the signal. Several EOG signal characteristics need to be preserved by the denoising. First, the steepness of signal edges needs to be retained to be able to detect blinks and saccades. Second, EOG signal amplitudes need to be preserved to be able to distinguish between different types and directions of saccadic eye movements. Finally, denoising filters must not introduce signal artifacts that may be misinterpreted as saccades or blinks in subsequent signal processing steps. To identify suitable methods for noise removal, we compared three different algorithms on real and synthetic EOG data: a low-pass filter, a filter based on wavelet shrinkage denoising [35], and a median filter. By visual inspection of the denoised signals, we found that the median filter performed best; it preserved the edge steepness of saccadic eye movements, retained EOG signal amplitudes, and did not introduce any artificial signal changes. It is crucial, however, to choose a window size Wmf that is small enough to retain short signal pulses, particularly those caused by blinks. A median filter removes pulses of a width smaller than about half of its window size. By taking into account the average blink duration reported earlier, we fixed Wmf to 150 ms.
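A denoising step of this kind can be sketched with a plain sliding-window median. The sampling rate used in the example call and the edge-padding choice are our assumptions, not the paper's:

```python
import numpy as np

def median_denoise(signal, fs_hz, win_ms=150):
    """Median-filter a signal with a window given in milliseconds.
    The window length in samples is forced to be odd so the median is
    centred. A median filter suppresses pulses narrower than roughly half
    its window while preserving step edges (saccades) and amplitudes."""
    win = int(round(win_ms * fs_hz / 1000.0))
    win = max(3, win | 1)                      # odd, at least 3 samples
    pad = win // 2
    padded = np.pad(np.asarray(signal, dtype=float), pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, win)
    return np.median(windows, axis=-1)
```

For instance, with a hypothetical 128 Hz recording, a 150 ms window is 19 samples: a 3-sample glitch is flattened, while a saccade-like step survives with its edge location intact.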

Fig. 3. Continuous Wavelet Transform-Saccade Detection algorithm. (a) Denoised and baseline-drift-removed horizontal EOG signal during reading, with an example saccade amplitude (SA); (b) the transformed wavelet signal (EOGwl), with application-specific small (th_small) and large (th_large) thresholds; (c) marker vectors for distinguishing between small (Msmall) and large (Mlarge) saccades; and (d) example character encoding for part of the EOG signal.

4.3 Detection of Basic Eye Movement Types

Different types of eye movements can be detected from the processed EOG signals. In this work, saccades, fixations, and blinks form the basis of all eye movement features used for classification. The robustness of the algorithms for detecting these is key to achieving good recognition performance. Saccade detection is particularly important because fixation detection, eye movement encoding, and the wordbook analysis are all reliant on it (see Fig. 2). In the following, we introduce our saccade and blink detection algorithms and characterize their performance on EOG signals recorded under constrained conditions.

4.3.1 Saccade and Fixation Detection

For saccade detection, we developed the so-called Continuous Wavelet Transform-Saccade Detection (CWT-SD) algorithm (see Fig. 3 for an example). Input to CWT-SD are the denoised and baseline-drift-removed EOG signal components EOGh and EOGv. CWT-SD first computes the continuous 1D wavelet coefficients at scale 20 using a Haar mother wavelet. Let s be one of these signal components and \psi the mother wavelet. The wavelet coefficient C^a_b of s at scale a and position b is defined as

C^a_b(s) = \frac{1}{\sqrt{a}} \int_{\mathbb{R}} s(t)\, \psi\!\left(\frac{t-b}{a}\right) dt.

By applying an application-specific threshold th_{sd} to the coefficients C_i(s), CWT-SD creates a vector M with elements M_i:

M_i = \begin{cases} 1, & \forall i: C_i(s) < -th_{sd}, \\ -1, & \forall i: C_i(s) > th_{sd}, \\ 0, & \forall i: -th_{sd} \le C_i(s) \le th_{sd}. \end{cases}

This step divides EOGh and EOGv into saccadic (M = 1, -1) and nonsaccadic (fixational) (M = 0) segments. Saccadic segments shorter than 20 ms and longer than 200 ms are removed. These boundaries approximate the typical physiological saccade characteristics described in the literature [25]. CWT-SD then calculates the amplitude and direction of each detected saccade. The saccade amplitude SA is the difference in EOG signal amplitude before and

after the saccade 8c.f. /ig. (;. $he direction is derived from the sign of the corresponding elements in M. /inally" each saccade is encoded into a character representing the combination of amplitude and direction. /or e+ample" a small saccade in E"#h with negative direction gets encoded as DrE and a large saccade with positive direction as DA.E Humans typically alternate between saccades and fi+a%tions. $his allows us to also use .-$%?5 for detecting fi+ations. $he algorithm e+ploits the fact that ga6e remains stable during a fi+ation. $his results in the corresponding ga6e points" i.e." the points in a visual scene that the ga6e is directed at" to cluster together closely in time. $herefore" fi+ations can be identified by thresholding on the dispersion of these ga6e points [(1!. /or a segment , of length n comprised of a hori6ontal ,h and a vertical ,v 79: signal component" the dispersion is calculated as 'nitially" all nonsaccadic segments are assumed to contain a fi+ation. $he algorithm then drops segments for which the dispersion is above a ma+imum threshold thfd of >">>> or if its duration is below a minimum threshold thfdt of #>> ms. $he value of thfd was derived as part of the .-$%?5 evaluation, that of thfdt appro+imates the typical average fi+ation duration reported earlier. A particular activity may re&uire saccadic eye movements of different distance and direction. /or e+ample" reading involves a fast se&uence of small saccades while scanning each line of te+t" while large saccades are re&uired to Cump back to the beginning of the ne+t line. -e opted to detect saccades with two different amplitudes" DsmallE and Dlarge.E $his re&uires two thresholds" thsdsmall and thsdlarge " to divide the range of possible values of . into three bands 8see /ig. (;= no
saccade 8 th
7 N th
_

:ig > Evaluation of the 7%3.,; algorithm for both E"# signal components using a sweep of its main parameter, the threshold th sd 3he figure plots the mean :1 score over all five participantsH vertical lines show the standard deviation for selected thsd Ma&imum :1 score is indicated by a dashed line

sdsmall

N 7 N th
sdsmall

87 N th _ sdlarge

sdsmall

or th

N 7 N th

sdsmall

;" small saccade 8 th


_
sdlarge

;" and large saccade

sdlarge

or 7 O thsdlarge ;. 5epending on its peak value" each saccade is then assigned to one of these bands. $o evaluate the .-$%?5 algorithm" we performed an e+periment with five participants0one female and four males 8age= #*%*< years" mean M 90)C" sd M 12)>;. $o cover effects of differences in electrode placement and skin contact" the e+periment was performed on two different days, in between days the participants took off the 79: electrodes. A total of #> recordings were made per participant" > per day. 7ach e+periment involved tracking the participantsB eyes while they followed a se&uence of flashing dots on a computer screen. -e used a fi+ed se&uence to simplify labeling of individual saccades. $he se&uence was comprised of > eye movements consisting of five hori6ontal and eight vertical saccades. $his produced a total of *< hori6ontal and 3** vertical saccades. 4y matching saccade events with the annotated ground truth" we calculated true positives 83Is;" false positives 8:Is;" and false negatives 8:*s;" and from these" precision
3I 3I precision_recall

similar thresholds were used to achieve the top / scores of about >.<). 't is interesting to note that the standard deviation across all participants reaches a minimum for a whole range of values around this ma+imum. $his suggests that thresholds that are also close to this point can be selected that still achieve robust detection performance.

/+(+2 )"in. Detection


/or blink detection" we developed the 7ontinuous %avelet 3ransform6link ;etection 8.-$%45; algorithm. ?imilarly to .-$%?5" the algorithm uses a threshold thbd on the wavelet coefficients to detect blinks in E"#v. 'n contrast to a saccade" a blink is characteri6ed by a se&uence of two large peaks in the coefficient vector directly following each other= one positive" the other negative. $he time between these peaks is smaller than the minimum time between two successive saccades rapidly performed in opposite direction. $his is because" typically" two saccades have at least a short fi+ation in between them. /or this reason" blinks can be detected by applying a ma+imum threshold thbdt on this time difference. -e evaluated our algorithm on 79: signals recorded in a stationary setting from five participants looking at different pictures 8two females and three males" age= #*%#< years" mean M =0)>" sd M 1)/;. -e labeled a total of 2>1 blinks by visual inspection of the vertical 79: signal component. -ith an average blink rate of # blinks per minute" this corresponds to about one hour of eye movement data. -e evaluated .-$%45 over sweeps of its two main parameters= thbd M
1(( 2(H((( 8in *>> steps; and thbdt M 1(( 1H((( ms 8in

;" recall 3I _ ;. 8 ;" and the / score 8= precisionPrecall P:* -e then evaluated the / score across a sweep on the .-$%?5 threshold thsd M 1 2( 8in *> steps; separately for the hori6ontal and vertical 79: signal components. /ig. ) shows the mean / score over all five participants with vertical lines indicating the standard deviation for selected values of thsd. -hat can be seen from the figure is that
3I P:I

steps;. $he / score was calculated by matching blink events with the annotated ground truth. /ig. 1 shows the / scores for five selected values of thbdt over all participants. .-$%45 performs best with thbdt between )>> and 1>> ms while reaching

>

top performance 8/ score= >.<); using a thbdt of *>> ms. $ime differences outside this range" as e+emplarily shown for (>> and ">>> ms" are already subCect to a considerable drop in performance. $his finding nicely reflects the values for the average blink duration cited earlier from the literatur

e
movements of different direction and distance) dark gray indicates basic and light gray diagonal directions !b$ ,accades detected in both E"# signal

components and mapped to the eye movement se4uence of the Fumping point stimulus ,imultaneous saccades in both components are combined according to their direction and amplitude !e g , QlR and QuR become Qn,R and QRR and QUR become Q6R$

).) Ana"$sis o2 Repetiti&e E$e %o&e'ent Patterns


Activities such as reading typically involve characteristic se&uences of several consecutive eye movements [1!. -e propose encoding eye movements by mapping saccades with different direction and amplitude to a discrete" character%based representation. ?trings of these characters are then collected in wordbooks that are analy6ed to e+tract se&uence information on repetitive eye movement patterns.

/+/+1 E$e %o&e'ent Encoding


9ur algorithm for eye movement encoding maps the individual saccade information from both 79: components onto a single representation comprised of #) discrete characters 8see /ig. *a;. $his produces a representation that can be more efficiently processed and analy6ed. $he algorithm takes the .-$%?5 saccades from the hori6ontal and vertical 79: signal components as its input. 't first checks for simultaneous saccades in both components as these represent diagonal eye movements. ?imultaneous saccades are characteri6ed by overlapping saccade segments in the time domain. 'f no simultaneous saccades are detected"

:ig 2 !a$ 7haracters used to encode eye

neighboring character 8e.g." DlE and DUE become D9E;. $his procedure encodes each eye movement into a distinct character" thus" mapping saccades of both 79: signal components into one eye movement se&uence.

/+/+2

ordboo. Ana"$sis

4ased on the encoded eye movement se&uence" we propose a wordbook analysis to assess repetitive eye movement patterns 8see /ig. 2;. An eye movement pattern is defined as a string of l successive characters. As an e+ample with l M >" the pattern DAr4dE translates to large left 8A; S small right 8r; S large diagonal right 84; S small down 8d;. A sliding window of length l and a step si6e of one is used to scan the eye movement se&uence for these patterns. 7ach newly found eye movement pattern is added to the corresponding wordbook %bl. /or a pattern that is already included in %bl" its occurrence count is increased by one.
:ig 0 Evaluation of the 7%3.6; algorithm over a sweep of the blink threshold thbd, for five different ma&imum time differences th bdt 3he figure plots the mean :1 score over all participantsH vertical lines show the standard deviation for selected th bd Ma&imum :1 score is indicated by a dashed line

).* ,eature E-traction


-e e+tract four groups of features based on the detected saccades" fi+ations" blinks" and the wordbooks of eye movement patterns. $able details the naming scheme used for all of these features. $he features are calculated using a sliding window 8window si6e %fe and step si6e ,fe; on both E"#h and E"#v. /rom a pilot study" we were able to fi+ %fe at (> s and ,fe at >.#* s. /eatures calculated from saccadic eye movements make up the largest proportion of e+tracted features. 'n total" there are 1# such features comprising the mean" variance" and ma+imum 79: signal amplitudes of saccades and the normali6ed saccade rates. $hese are calculated for both

the saccadeBs character is directly used to denote the eye movement. 'f two saccades are detected" the algorithm combines both according to the following scheme 8see /ig. *b;= $he characters of two saccades with e&ually large 79: signal amplitudes are merged to the character e+actly in between 8e.g." DlE and DuE become Dn"E DRE and DUE become D4E;. 'f simultaneous saccades differ by more than *> percent in 79: signal amplitude" their characters are merged to the closest

3A6+E 1 *aming ,cheme for the :eatures Used in this %ork

9ur ?@M implementation uses a fast se&uential dual method for dealing with multiple classes [(<!" [)>!. $his reduces training time considerably while retaining recognition performance. $hese two algorithms are combined into a hybrid feature selection and classification method. 'n a first step" mRMR ranks all available features 8with , M '(;. 5uring classifica%tion" the si6e of the feature set is then optimi6ed with respect to recognition accuracy by sweeping ,.

:ig / E&ample wordbook analysis for eye movement patterns of length l M 9 A sliding window scans a se4uence of eye movements encoded into characters for repetitive patterns *ewly found patterns are added to the wordbookH otherwise, only the occurrence count !last column$ is increased by one

E"#h and E"#v" for small and large saccades" for saccades in positive or negative direction" and for all possible combinations of these. -e calculate five different features using fi+ations= the mean and variance of the 79: signal amplitude within a fi+ation", the mean and the variance of fi+ation duration" and the fi+ation rate over window %fe. /or blinks" we e+tract three features= blink rate and the mean and variance of the blink duration. -e use four wordbooks. $his allows us to account for all possible eye movement patterns up to a length of four 8 l M >;" with each wordbook containing the type and occurrence count of all patterns found. /or each wordbook we e+tract five features= the wordbook si6e" the ma+imum occurrence count" the difference between the ma+imum and minimum occurrence counts" and the variance and mean of all occurrence counts.

3.6 Feature Selection and Classification


For feature selection, we chose a filter scheme over the commonly used wrapper approaches because of the lower computational costs and thus shorter runtime given the large data set. We use minimum redundancy maximum relevance (mRMR) feature selection for discrete variables [37], [38]. The mRMR algorithm selects a feature subset of arbitrary size N that best characterizes the statistical properties of the given target classes based on the ground truth labeling. In contrast to other methods such as the F-test, mRMR also considers relationships between features during the selection. Among the possible underlying statistical measures described in the literature, mutual information was shown to yield the most promising results and was thus selected in this work. Our particular mRMR implementation combines the measures of redundancy and relevance among classes using the mutual information difference (MID). For classification, we chose a linear support vector machine.
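The authors used the mRMR toolbox [38]; the following is only a compact sketch of greedy mRMR selection with the MID criterion for discrete features (hypothetical names, tiny toy data):

```python
import math
from collections import Counter

def mutual_info(xs, ys):
    """Mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def mrmr_mid(features, labels, k):
    """Greedy mRMR with the mutual information difference (MID):
    repeatedly pick the feature maximizing relevance to the labels
    minus the mean redundancy with the already selected features."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        def mid(name):
            rel = mutual_info(features[name], labels)
            red = (sum(mutual_info(features[name], features[s])
                       for s in selected) / len(selected)) if selected else 0.0
            return rel - red
        best = max(remaining, key=mid)
        selected.append(best)
        remaining.remove(best)
    return selected

# Feature "a" matches the labels perfectly, "b" is uninformative.
ranking = mrmr_mid({"a": [0, 0, 1, 1], "b": [0, 1, 0, 1]}, [0, 0, 1, 1], 2)
```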

For a particular feature, e.g., S-rateSPHor, the capital letter represents the group: saccadic (S), blink (B), fixation (F), or wordbook (W). The combination of abbreviations after the dash describes the particular type of feature and the characteristics it covers.

4 EXPERIMENT
We designed a study to establish the feasibility of EAR in a real-world setting. Our scenario involved five office-based activities (copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web) and periods during which participants took a rest (the NULL class). We chose these activities for three reasons. First, they are all commonly

performed during a typical working day. Second, they exhibit interesting eye movement patterns that are both structurally diverse and have varying levels of complexity. We believe they represent the much broader range of activities observable in daily life. Finally, being able to detect these activities using on-body sensors such as EOG may enable novel attentive user interfaces that take into account cognitive aspects of interaction such as user interruptibility or level of task engagement. Originally, we recorded 10 participants, but two were withdrawn due to poor signal quality: One participant had strong pathologic nystagmus, a form of involuntary eye movement characterized by alternating smooth pursuit in one direction and saccadic movement in the other. The horizontal EOG signal component turned out to be severely affected by the nystagmus and no reliable saccadic information could be extracted. For the second participant, most probably due to bad electrode placement, the EOG signal was completely distorted. All of the remaining eight participants (two females and six males), aged between 23 and 31 years, were daily computer users, reporting 6 to 14 hours of use per day. They were asked to follow two continuous sequences, each composed of the five different, randomly ordered activities and a period of rest (see Fig. 8b). During the rest periods, no activity was required of the participants, but they were asked not to engage in any of the other activities. Each activity (including NULL) lasted about five minutes, resulting in a total data set of about eight hours.


4.2 Procedure
For the text copying task, the original document was shown on the right screen with the word processor on the left screen. Participants could copy the text in different ways. Some touch typed and only checked for errors in the text from time to time; others continuously switched attention between the screens or the keyboard while typing. Because the screens were more than half a meter from the participants' faces, the video was shown full screen to elicit more distinct eye movements. For the browsing task, no constraints were imposed concerning the type of Website or the manner of interaction. For the reading and writing tasks, a book (12 pt, one column with pictures) and a pad with a pen were provided.

4.3 Parameter Selection and Evaluation


Fig. 8. (a) Electrode placement for EOG data collection (h: horizontal, v: vertical, and r: reference). (b) Continuous sequence of five typical office activities: copying a text, reading a printed paper, taking handwritten notes, watching a video, browsing the Web, and periods of no specific activity (the NULL class).

The same saccade and blink detection parameters (thbd, thbdt, thsdsmall, and thsdlarge) were used throughout the evaluation. The selection of thsdsmall was based on the typical length of a short scan saccade
4.1 Apparatus

We used a commercial EOG device, the Mobi8, from Twente Medical Systems International (TMSI). It was worn on a belt around each participant's waist and recorded a four-channel EOG at a sampling rate of 128 Hz. Participants were observed by an assistant who annotated activity changes with a wireless remote control. Data recording and synchronization were handled by the Context Recognition Network Toolbox [41]. EOG signals were picked up using an array of five 24 mm Ag/AgCl wet electrodes from Tyco Healthcare placed around the right eye. The horizontal signal was collected using one electrode on the nose and another directly across from this on the edge of the right eye socket. The vertical signal was collected using one electrode above the right eyebrow and another on the lower edge of the right eye socket. The fifth electrode, the signal reference, was placed in the middle of the forehead. Five participants (two females and three males) wore spectacles during the experiment. For these participants, the nose electrode was moved to the side of the left eye to avoid interference with the spectacles (see Fig. 8a). The experiment was carried out in an office during regular working hours. Participants were seated in front of two adjacent 17 inch flat screens with a resolution of 1280 × 1024 pixels on which a browser, a video player, a word processor, and text for copying were on-screen and ready for use. Free movement of the head and upper body was possible throughout the experiment.

during reading, and thsdlarge on the length of a typical newline movement. Classification and feature selection were evaluated using a leave-one-person-out scheme: We combined the data sets of all but one participant and used these for training; testing was done using both data sets of the remaining participant. This was repeated for each participant. The resulting train and test sets were standardized to have zero mean and a standard deviation of one. Feature selection was always performed solely on the training set. The two main parameters of the SVM algorithm, the cost C and the tolerance of the termination criterion ε, were fixed to C = 1 and ε = 0.1. For each leave-one-person-out iteration, the prediction vector returned by the SVM classifier was smoothed using a sliding majority window. Its main parameter, the window size Wsm, was obtained using a parameter sweep and fixed at 2.4 s.
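The majority smoothing step can be sketched as follows (a minimal illustration; in practice the window size would be Wsm converted from seconds to frames, and the truncation at the sequence borders is an assumption):

```python
from collections import Counter

def majority_smooth(pred, w):
    """Smooth a frame-wise prediction vector with a sliding majority
    window of w frames (w odd); the window is truncated at the borders."""
    half = w // 2
    out = []
    for i in range(len(pred)):
        window = pred[max(0, i - half): i + half + 1]
        out.append(Counter(window).most_common(1)[0][0])  # majority vote
    return out

# A single mislabeled frame inside a stable activity is voted away.
smoothed = majority_smooth(["read", "read", "web", "read", "read"], 3)
```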

5 RESULTS

5.1 Classification Performance


SVM classification was scored using a frame-by-frame comparison with the annotated ground truth. For specific results on each participant or on each activity, class-relative precision and recall were used. Table 2 shows the average precision and recall, and the corresponding number of features selected for each participant. The number of features used varied from only nine features (P8) up to 81 features (P1). The mean performance over all participants was 76.1 percent precision and 70.5 percent recall. P4 reported the worst result, with both precision and recall below 50 percent. In contrast, P7 achieved the best result, indicated by recognition

TABLE 2 Precision, Recall, and the Corresponding Number of Features Selected by the Hybrid mRMR/SVM Method for Each Participant

The participants' gender is given in brackets; best and worst case results are indicated.

Fig. 10. Summed confusion matrix from all participants, normalized across ground truth rows.

Fig. 9. Precision and recall for each activity and participant. Mean performance (P1 to P8) is marked by a star.

performance in the 80s and 90s and using a moderate-sized feature set. Fig. 9 plots the classification results in terms of precision and recall for each activity and participant. The best results approach the top right corner, while the worst results are close to the lower left. For most activities, precision and recall fall within the top right corner. Recognition of reading and copying, however, completely fails for P4, and browsing also shows noticeably lower precision. Similar but less strong characteristics apply to the reading, writing, and browsing tasks for P5.
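The class-relative scores above can be computed from the frame-wise label vectors as follows (a generic sketch, not the authors' evaluation code; the labels are hypothetical):

```python
def class_precision_recall(pred, truth, cls):
    """Frame-by-frame class-relative precision and recall for cls."""
    tp = sum(p == cls and t == cls for p, t in zip(pred, truth))
    fp = sum(p == cls and t != cls for p, t in zip(pred, truth))
    fn = sum(p != cls and t == cls for p, t in zip(pred, truth))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy frame sequences: one "web" frame is missed (tp = 2, fp = 0, fn = 1).
p, r = class_precision_recall(["read", "read", "web", "web"],
                              ["read", "web", "web", "web"], "web")
```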

The summed confusion matrix from all participants, normalized across ground truth rows, is given in Fig. 10. Correct recognition is shown on the diagonal; substitution errors are off-diagonal. The largest between-class substitution errors not involving NULL fall between 12 and 13 percent of their class times. Most of these errors involve browsing that is falsely returned during 13 percent each of the read, write, and copy activities. A similar amount is substituted by read during browse time.

5.2 Eye Movement Features


We first analyzed how mRMR ranked the features on each of the eight leave-one-person-out training sets. The rank of a feature is the position at which mRMR selected it within a set. The position corresponds to the importance with which mRMR assesses the feature's ability to discriminate between classes in combination

with the features ranked before it. Fig. 11 shows the top 15 features according to the median rank over all sets (see Table 1 for a description of the type and name of the features). Each vertical bar represents the spread of mRMR ranks: For each feature, there is one rank per training set. The most useful features are those found with the highest rank (close to one) for most training sets, indicated by shorter bars. Some features are not always included in the final result (e.g., feature 60 only appears in five sets). Equally, a useful feature that is ranked lowly by mRMR might be the one that improves a classification (e.g.,

Fig. 11. The top 15 features selected by mRMR for all eight training sets. The x-axis shows feature number and group; the key on the right shows the corresponding feature names as described in Table 1; the y-axis shows the rank (top = 1). For each feature, the bars show: the total number of training sets for which the feature was chosen (bold number at the top), the rank of the feature within each set (dots, with a number representing the set count), and the median rank over all sets (black star). For example, a useful feature is 47 (S), a saccadic feature selected for all sets, in seven of which it is ranked one or two; less useful is 60 (B), a blink feature used in only five sets and ranked lower.

TABLE 3 The Top Five Features Selected by mRMR for Each Activity over All Training Sets (See Table 1 for Details on Feature Names)

for example, as part of a reading assistant, or for monitoring workload to assess the risk of burnout syndrome. For such applications, recognition performance may be further increased by combining eye movement analysis with additional sensing modalities.

feature 68 is spread between rank five and 26, but is included in all eight sets). This analysis reveals that the top three features, as judged by high ranks for all sets, are all based on horizontal saccades: 47 (S-rateSPHor), 56 (S-maxAmpPHor), and 10 (S-meanAmpSHor). Feature 68 (F-rate) is used in all sets, seven of which rank it highly. Feature 60 (B-rate) is selected for five out of the eight sets, only one of which gives it a high rank. Wordbook features 77 (W-maxCount-l2) and 85 (W-maxCount-l3) are not used in one of the sets, but they are highly ranked by the other seven. We performed an additional study into the effect of optimizing mRMR for each activity class. We combined all training sets and performed a one-versus-many mRMR for each non-NULL activity. The top five features selected during this evaluation are shown in Table 3. For example, the table reveals that reading and browsing can be described using wordbook features. Writing requires additional fixation features. Watching video is characterized by a mixture of fixation and saccade features for all directions and, as with reading, the blink rate, while copying involves mainly horizontal saccade features.

6 DISCUSSION

6.1 Robustness across Participants


The developed algorithms for detecting saccades and blinks in EOG signals proved robust and achieved F1 scores of up to 0.94 across several people (see Figs. 4 and 6). For the experimental evaluation, the parameters of both algorithms were fixed to values common for all participants; the same applies to the parameters of the feature selection and classification algorithms. Under these conditions, despite person-independent training, six out of the eight participants returned best average precision and recall values of between 69 and 93 percent. Two participants, however, returned results that were lower than 50 percent. On closer inspection of the raw eye movement data, it turned out that for both, the EOG signal quality was poor. Changes in signal amplitude for saccades and blinks, upon which feature extraction and thus recognition performance directly depend, were not distinctive enough to be reliably detected. As was found in an earlier study [6], dry skin or poor electrode placement are the most likely culprits. Still, the achieved recognition performance is promising for eye movement analysis to be implemented in real-world applications,

6.2 Results for Each Activity


As might have been expected, reading is detected with comparable accuracy to that reported earlier [6]. However, the methods used are quite different. The string matching approach applied in the earlier study makes use of a specific "reading pattern." That approach is not suited to activities involving less homogeneous eye movement patterns. For example, one would not expect to find as unique a pattern for browsing or watching a video as there exists for reading. This is because eye movements show much more variability during these activities, as they are driven by an ever-changing stimulus. As shown here, the feature-based approach is much more flexible and scales better with the number and type of activities that are to be recognized. Accordingly, we are now able to recognize four additional activities (Web browsing, writing on paper, watching video, and copying text) with almost, or above, 70 percent precision and 70 percent recall. Particularly impressive is video, with an average precision of 88 percent and recall of 80 percent. This is indicative of a task where the user might be concentrated on a relatively small field of view (like reading), but follows a typically

unstructured path (unlike reading). Similar examples outside the current study might include interacting with a graphical user interface or watching television at home. Writing is similar to reading in that the eyes follow a structured path, albeit at a slower rate. Writing also involves more eye "distractions," when the person looks up to think, for example. Browsing is recognized less well over all participants (average precision 79 percent and recall 63 percent), but with a large spread between people. A likely reason for this is that browsing is not only unstructured, but also involves a variety of subactivities, including reading, that may need to be modeled. The copy activity, with an average precision of 76 percent and a recall of 66 percent, is representative of activities with a small field of view that include regular shifts in attention (in this case, to another screen). A comparable activity outside the chosen office scenario might be driving, where the eyes are on the road ahead with occasional checks to the side mirrors. Finally, the NULL class returns a high recall of 81 percent. However, there are many false returns (activity false negatives) for half of the participants, resulting in a precision of only 66 percent.

Three of these activities (writing, copying, and browsing) all include sections of reading. From quick checks over what has been written or copied to longer perusals of online text, reading is a pervasive subactivity in this scenario. This is confirmed by the relatively high rate of confusion errors involving reading, as shown in Fig. 10.

6.3 Feature Groups


The feature groups selected by mRMR provide a snapshot of the types of eye movement features useful for activity recognition. Features from three of the four proposed groups (saccade, fixation, and wordbook) were all prominently represented in our study. The fact that each group covers complementary aspects of eye movement is promising for the general use of these features for other EAR problems. Note that no one feature type performs well alone. The best results were obtained using a mixture of different features. Among these, the fixation rate was always selected. This result is akin to that of Canosa, who found that both fixation duration and saccade amplitude are strong indicators of certain activities [42]. Features derived from blinks are less represented in the top ranks. One explanation for this is that for the short activity duration of only five minutes, the participants did not become fully engaged in the tasks, and were thus less likely to show the characteristic blink rate variations suggested by Palomba et al. [43]. These features may be found to be more discriminative for longer duration activities. Coupled with the ease with which they were extracted, we believe blink features are still promising for future work.

6.4 Features for Each Activity Class


The analysis of the most important features for each activity class is particularly revealing. Reading is a regular pattern characterized by a specific sequence of saccades and short fixations of similar duration. Consequently, mRMR chose mostly wordbook features describing eye movement sequencing in its top ranks, as well as a feature describing the fixation duration variance. The fifth feature, the blink rate, reflects that, for reading as an activity of high visual engagement, people tend to blink less [43]. Browsing is structurally diverse and, depending on the Website being viewed, may be comprised of different activities, e.g., watching a video, typing, or looking at a picture. In addition to the small, horizontal saccade rate, mRMR also selected several wordbook features of varying lengths. This is probably due to our participants' browsing activities containing mostly reading sequences of variable length, such as scanning headlines or searching for a product in a list. Writing is similar to reading, but requires greater fixation duration (it takes longer to write a word than to read it) and greater variance. mRMR correspondingly selected average fixation duration and its variance as well as a wordbook feature. However, writing is also characterized by short thinking pauses, during which people invariably look up. This corresponds extremely well to the choice of the fixation feature that captures variance in vertical position. Watching a video is a highly unstructured activity, but is carried out within a narrow field of view. The lack of wordbook features reflects this, as does the mixed selection of features based on all three types: variance of both horizontal and vertical fixation positions, small positive and negative saccadic movements, and blink rate. The use of blink rate likely reflects the tendency toward blink inhibition when performing an engaging yet sedentary task [43]. Finally, copying involves many back and forth saccades between screens.
mRMR reflects this by choosing a mixture of small and large horizontal saccade features, as well as variance in horizontal fixation positions. These results suggest that for tasks that involve a known set of specific activity classes, recognition can be optimized by only choosing features known to best describe these classes. It remains to be investigated how well such prototype features discriminate between activity classes with very similar characteristics.

6.5 Activity Segmentation Using Eye Movements


Segmentation, the task of spotting individual activity instances in continuous data, remains an open challenge in activity recognition. We found that eye movements can be used for activity segmentation on different levels depending on the timescale of the activities. The lowest level of segmentation is that of individual saccades that define eye movements in different directions: "left," "right," and so on. An example of this is the end-of-line "carriage return" eye movement performed during reading. The next level includes more complex activities that involve sequences composed of a small number of saccades. For these activities, the wordbook analysis proposed in this work may prove suitable. In earlier work, such short eye movement patterns, so-called eye gestures, were successfully used for eye-based human-computer interaction [44]. At the highest level, activities are characterized by complex combinations of eye movement sequences of potentially arbitrary length. Unless wordbooks are used that span these long sequences, dynamic modeling of activities is required. For this, it would be interesting to investigate methods such as hidden Markov models (HMM), Conditional Random Fields (CRF), or an approach based on eye movement grammars. These methods would allow us to model eye movement patterns at different hierarchical levels, and to spot composite activities from large streams of eye movement data more easily.

6.6 Limitations
One limitation of the current work is that the experimental scenario considered only a handful of activities. It is important to note, however, that the recognition architecture and feature set were developed independently of these activities. In addition, the method is not limited to EOG. All features can be extracted equally well from eye movement data recorded using a video-based eye tracker. This

suggests that our approach is applicable to other activities, settings, and eye tracking techniques. The study also reveals some of the complexity one might face in using the eyes as a source of information on a person's activity. The ubiquity of the eyes' involvement in everything a person does means that it is challenging to annotate precisely what is being "done" at any one time. It is also a challenge to define a single identifiable activity. Reading is perhaps one of the easiest to capture because of the intensity of eye focus that is required and the well-defined paths that the eyes follow. A task such as Web browsing is more difficult because of the wide variety of different eye movements involved. It is challenging, too, to separate relevant eye movements from momentary distractions. These problems may be solved, in part, by using video and gaze tracking for annotation. Activities from the current scenario could be redefined at a smaller timescale, breaking browsing into smaller activities such as "use scroll bar,"


Dread"E Dlook at image"E or Dtype.E $his would also allow us to investigate more complicated activities outside the office. An alternative route is to study activities at larger timescales" to perform situation analysis rather than recognition of specific activities. Aong%term eye movement features" e.g." the average eye movement velocity and blink rate over one hour" might reveal whether a person is walking along an empty or busy street" whether they are at their desk working" or whether they are at home watching television. Annotation will still be an issue" but one that maybe alleviated using unsupervised or self%labeling methods [# !" [)*!.

6.7 Considerations for Future Work

Additional eye movement characteristics that are potentially useful for activity recognition, such as pupil dilation, microsaccades, the vestibulo-ocular reflex, or smooth pursuit movements, were not used here because of the difficulty of measuring them with EOG. These characteristics are still worth investigating in the future as they may carry information that complements that available in the current work. Eye movements also reveal information on cognitive processes of visual perception, such as visual memory, learning, or attention. If it were possible to infer these processes from eye movements, this may lead to cognition-aware systems that are able to sense and adapt to a person's cognitive state [46].

7 CONCLUSION
This work reveals two main findings for activity recognition using eye movement analysis. First, we show that eye movements alone, i.e., without any information on gaze, can be used to successfully recognize five office activities. We argue that the developed methodology can be extended to other activities. Second, good recognition results were achieved using a mixture of features based on the fundamentals of eye movements. Sequence information on eye movement patterns, in the form of a wordbook analysis, also proved useful and can be extended to capture additional statistical properties. Different recognition tasks will likely require different combinations of these features. The importance of these findings lies in their significance for eye movement analysis to become a general tool for the automatic recognition of human activity.

REFERENCES

[1] S. Mitra and T. Acharya, "Gesture Recognition: A Survey," IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Rev., vol. 37, no. 3, pp. 311-324, May 2007.
[2] P. Turaga, R. Chellappa, V.S. Subrahmanian, and O. Udrea, "Machine Recognition of Human Activities: A Survey," IEEE Trans. Circuits and Systems for Video Technology, vol. 18, no. 11, pp. 1473-1488, Nov. 2008.
[3] B. Najafi, K. Aminian, A. Paraschiv-Ionescu, F. Loew, C.J. Bula, and P. Robert, "Ambulatory System for Human Motion Analysis Using a Kinematic Sensor: Monitoring of Daily Physical Activity in the Elderly," IEEE Trans. Biomedical Eng., vol. 50, no. 6, pp. 711-723, June 2003.
[4] J.A. Ward, P. Lukowicz, G. Tröster, and T.E. Starner, "Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1553-1567, Oct. 2006.
[5] N. Kern, B. Schiele, and A. Schmidt, "Recognizing Context for Annotating a Live Life Recording," Personal and Ubiquitous Computing, vol. 11, no. 4, pp. 251-263, 2007.
[6] A. Bulling, J.A. Ward, H. Gellersen, and G. Tröster, "Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography," Proc. Sixth Int'l Conf. Pervasive Computing, pp. 19-37, 2008.
[7] A. Dempere-Marco, X. Hu, S.L.S. MacDonald, S.M. Ellis, D.M. Hansell, and G.-Z. Yang, "The Use of Visual Search for Knowledge Gathering in Image Decision Support," IEEE Trans. Medical Imaging, vol. 21, no. 7, pp. 741-754, July 2002.
[8] D.D. Salvucci and J.R. Anderson, "Automated Eye-Movement Protocol Analysis," Human-Computer Interaction, vol. 16, no. 1, pp. 39-86, 2001.
[9] S.P. Liversedge and J.M. Findlay, "Saccadic Eye Movements and Cognition," Trends in Cognitive Sciences, vol. 4, no. 1, pp. 6-14, 2000.
[10] A. Bulling, J.A. Ward, H. Gellersen, and G. Tröster, "Eye Movement Analysis for Activity Recognition," Proc. 11th Int'l Conf. Ubiquitous Computing, pp. 41-50, 2009.
[11] J.M. Henderson, "Human Gaze Control during Real-World Scene Perception," Trends in Cognitive Sciences, vol. 7, no. 11, pp. 498-504, 2003.
[12] A. Bulling, D. Roggen, and G. Tröster, "Wearable EOG Goggles: Seamless Sensing and Context-Awareness in Everyday Environments," J. Ambient Intelligence and Smart Environments, vol. 1, no. 2, 2009.
[13] Q. Ding, K. Tong, and G. Li, "Development of an EOG (Electro-Oculography) Based Human-Computer Interface," Proc. 27th Int'l Conf. Eng. in Medicine and Biology Soc., pp. 6829-6831, 2005.
[14] Y. Chen and W.S. Newman, "A Human-Robot Interface Based on Electrooculography," Proc. IEEE Int'l Conf. Robotics and Automation, vol. 1, pp. 243-248, 2004.
[15] M.M. Hayhoe and D.H. Ballard, "Eye Movements in Natural Behavior," Trends in Cognitive Sciences, vol. 9, pp. 188-194, 2005.
[16] S.S. Hacisalihzade, L.W. Stark, and J.S. Allen, "Visual Perception and Sequences of Eye Movement Fixations: A Stochastic Modeling Approach," IEEE Trans. Systems, Man, and Cybernetics, vol. 22, no. 3, pp. 474-481, May/June 1992.
[17] M. Elhelw, M. Nicolaou, A. Chung, G.-Z. Yang, and M.S. Atkins, "A Gaze-Based Study for Investigating the Perception of Visual Realism in Simulated Scenes," ACM Trans. Applied Perception, vol. 5, no. 1, 2008.
[18] W.S. Wijesoma, K.S. Wee, O.C. Wee, A.P. Balasuriya, K.T. San, and K.K. Soon, "EOG Based Control of Mobile Assistive Platforms for the Severely Disabled," Proc. IEEE Int'l Conf. Robotics and Biomimetics, pp. 490-494, 2005.
[19] R. Barea, L. Boquete, M. Mazo, and E. Lopez, "System for Assisted Mobility Using Eye Movements Based on Electrooculography," IEEE Trans. Neural Systems and Rehabilitation Eng., vol. 10, no. 4, pp. 209-218, Dec. 2002.
[20] D. Abowd, A. Dey, R. Orr, and J. Brotherton, "Context-Awareness in Wearable and Ubiquitous Computing," Virtual Reality, vol. 3, no. 3, pp. 200-211, 1998.
[21] L. Bao and S.S. Intille, "Activity Recognition from User-Annotated Acceleration Data," Proc. Second Int'l Conf. Pervasive Computing, pp. 1-17, 2004.
[22] B. Logan, J. Healey, M. Philipose, E.M. Tapia, and S.S. Intille, "A Long-Term Evaluation of Sensing Modalities for Activity Recognition," Proc. Ninth Int'l Conf. Ubiquitous Computing, pp. 483-500, 2007.
[23] F.T. Keat, S. Ranganath, and Y.V. Venkatesh, "Eye Gaze Based Reading Detection," Proc. IEEE Conf. Convergent Technologies for the Asia-Pacific Region, vol. 2, pp. 825-828, 2003.
[24] M. Brown, M. Marmor, and Vaegan, "ISCEV Standard for Clinical Electro-Oculography (EOG)," Documenta Ophthalmologica, vol. 113, no. 3, pp. 205-212, 2006.
[25] A.T. Duchowski, Eye Tracking Methodology: Theory and Practice. Springer-Verlag New York, Inc., 2007.
[26] B.R. Manor and E. Gordon, "Defining the Temporal Threshold for Ocular Fixation in Free-Viewing Visuocognitive Tasks," J. Neuroscience Methods, vol. 128, nos. 1/2, pp. 85-93, 2003.
[27] C.N. Karson, K.F. Berman, E.F. Donnelly, W.B. Mendelson, J.E. Kleinman, and R.J. Wyatt, "Speaking, Thinking, and Blinking," Psychiatry Research, vol. 5, no. 3, pp. 243-246, 1981.
[28] R. Schleicher, N. Galley, S. Briest, and L. Galley, "Blinks and Saccades as Indicators of Fatigue in Sleepiness Warnings: Looking Tired?" Ergonomics, vol. 51, no. 7, pp. 982-1010, 2008.
[29] H.R. Schiffman, Sensation and Perception: An Integrated Approach, fifth ed. John Wiley & Sons, 2001.
[30] J.J. Gu, M. Meng, A. Cook, and G. Faulkner, "A Study of Natural Eye Movement Detection and Ocular Implant Movement Control Using Processed EOG Signals," Proc. IEEE Int'l Conf. Robotics and Automation, vol. 2, pp. 1555-1560, 2001.
[31] N. Pan, V.M. I, M.P. Un, and P.S. Hang, "Accurate Removal of Baseline Wander in ECG Using Empirical Mode Decomposition," Proc. Joint Meeting Sixth Int'l Symp. Noninvasive Functional Source Imaging of the Brain and Heart and the Int'l Conf. Functional Biomedical Imaging, pp. 177-180, 2007.
[32] V.S. Chouhan and S.S. Mehta, "Total Removal of Baseline Drift from ECG Signal," Proc. 17th Int'l Conf. Computer Theory and Applications, pp. 512-515, 2007.
[33] L. Xu, D. Zhang, and K. Wang, "Wavelet-Based Cascaded Adaptive Filter for Removing Baseline Drift in Pulse Waveforms," IEEE Trans. Biomedical Eng., vol. 52, no. 11, pp. 1973-1975, Nov. 2005.

Andreas Bulling received the MSc degree in computer science from the Technical University of Karlsruhe, Germany, in 2006, and the PhD degree in information technology and electrical engineering from the Swiss Federal Institute of Technology (ETH) Zurich, Switzerland, in 2010. His research interests are in cognition-aware systems and multimodal activity and context recognition with applications in ubiquitous computing and human-computer interaction. He is currently a postdoctoral research associate at the University of Cambridge, United Kingdom, and Lancaster University, United Kingdom, funded by a Feodor Lynen Research Fellowship of the Alexander von Humboldt Foundation, Germany. He is a student member of the IEEE. For more details, see http://www.andreas-bulling.eu/

[()! [(*! [(1!

M.A. $inati and 4. Mo6affary" DA -avelet Fackets Approach to 7lectrocardiograph 4aseline 5rift .ancellation"E IntTl 8 6iomedical Imaging, vol. #>>1" pp. %<" #>>1. 5.A. 5onoho" D5e%Noising by ?oft% $hresholding"E IEEE 3rans Information 3heory, vol. ) " no. (" pp. 1 (%1#2" May <<*.

5.5. ?alvucci and I.H. :oldberg" D'dentifying /i+ations and ?accades in 7ye% $racking Frotocols"E Iroc ,ymp Eye 3racking Research X Applications, pp. 2 %23" #>>>.

8amie A %ard received the 6Eng degree with Foint honors in computer science and electronics from the University of Edinburgh, ,cotland, in =(((, and the Ih; degree in activity recognition !AR$ using body.worn sensors from the ,wiss :ederal Institute of 3echnology !E3A$ Burich in =((0 Ae spent a year working as an analogue circuit designer in Austria before Foining the %earable 7omputing +ab at E3A in Burich, ,witDerland In addition to his work on AR, he is

[(2! [(3! [(<!

H. Feng" /. Aong" and .. 5ing" D/eature ?election 4ased on Mutual 'nformation .riteria of Ma+%5ependency" Ma+%Relevance" and Min%Redundancy"E IEEE 3rans Iattern Analysis and Machine Intelligence, vol. #2" no. 3" pp. ##1% #(3" Aug. #>>*. H. Feng" DmRMR /eature ?election $oolbo+ for MA$AA4"E http=GGresearch.Canelia.orgGpengGproCGmRMRG" /eb. #>>3. H. .rammer and L. ?inger" DUltraconservative 9nline Algorithms for Multiclass Froblems"E 8 Machine +earning Research, vol. ("

actively involved in developing improved methods for evaluating AR performance Ae is currently a research associate at +ancaster University, United <ingdom
Aans #ellersen received the M,c and Ih; degrees in computer science from the 3echnical University of <arlsruhe, #ermany, in 1''0 :rom 1''9 to 1''0, he was a research assistant at the 3elematics Institute, and from 1''0 to =(((, was a director of the 3elecooperation "ffice ,ince =((1, he has been a professor for interactive systems at +ancaster University, United <ing.dom Ais research interest is in ubi4uitous computing and conte&t.aware systems Ae is

pp. <* %<< " #>>(. [)>! ..%I. Ain" DA'4A'N7AR0A Aibrary for Aarge Ainear .lassifica%
tion"E http=GGwww.csie.ntu.edu.twGQcClinGliblinearG" /eb. #>>3.

[) ! [)#! [)(!

5. 4annach" F. Aukowic6" and 9. Amft" DRapid Frototyping of Activity Recognition Applications"E IEEE Iervasive 7omputing, vol. 2" no. #" pp. ##%( " Apr.%Iune #>>3. R.A. .anosa" DReal%-orld @ision= ?elective Ferception and $ask"E A7M 3rans Applied Ierception, vol. 1" no. #" pp. %()" #>><. 5. Falomba" M. ?arlo" A. Angrilli" A. Mini" and A. ?tegagno" D.ardiac Responses Associated with Affective Frocessing of Unpleasant /ilm ?timuli"E IntTl 8 Isychophysiology, vol. (1" no. " Roggen" and :. $roJster" D'tBs in Lour 7yes0 $owards .onte+t%Awareness and Mobile H.' Using -earable 79: :oggles"E Iroc 1(th IntTl 7onf Ubi4uitous 7omputing, pp. 3)%<(" #>>3. $. Huynh" M. /rit6" and 4. ?chiele" D5iscovery of Activity Fatterns Using $opic Models"E Iroc 1(th IntTl 7onf Ubi4uitous 7omputing,

actively involved in the formation of the ubi4uitous computing research community and has initiated the Ubi7omp conference series Ae also serves as an editor of the Iersonal and Ubi4uitous 7omputing 8ournal #erhard 3ro@ster received the M,c degree in electrical engineering from the 3echnical University of <arlsruhe, #ermany, in 1'/C, and the Ih; degree in electrical engineering from the 3echnical University of ;armstadt, #ermany, in 1'C> ;uring eight years at 3elefunken, #ermany, he was responsible for various research proFects focused on key components of digital telephony ,ince 1''9, he has been a professor for wearable computing at the ,wiss :ederal Institute of 3echnology !E3A$ in Burich, ,witDerland Ais fields of research include wearable computing, smart te&tiles, electronic packaging, and miniaturiDed digital signal processing Ae is a senior member of the IEEE

pp. )*%*2" #>>>. [))! A. 4ulling" 5.

[)*!

pp. >% <" #>>3. [)1! A. 4ulling" 5. Roggen" and :. $roJster" D-hatBs in the 7yes for
.onte+t%AwarenessOE IEEE doi= >. ><GMFR@.#> >.)<. Iervasive 7omputing,

:or more information on this or any other computing topic, please visit our ;igital +ibrary at www computer orgJpublicationsJdlib

#> >"
