
High Transfer Rate, Real-time
Brain-Computer Interface

Machine learning techniques towards a practical
spelling device for the completely paralyzed
Agenda
 Brain Computer Interfaces – brief intro.
 Our system
– Overview, technical details
– Machine learning – Support Vector Machines
– Additional Bandwidth – Word Prediction
– Results
 Future Improvements, Q&A
 Demonstration at Psychology Lab

April, 2005 ThinQ Innovation 2


BCIs – the Need

 ‘Locked-in’ patients
Example: J.D. Bauby, “The Diving Bell and the Butterfly”
 Persistence of life – “butterfly”
 Extreme physical disability – “diving bell”

April, 2005 ThinQ Innovation 3


BCIs – the Need
 Amyotrophic Lateral Sclerosis (ALS), aka Lou Gehrig’s disease
– Degeneration of motor neurons, paralysis of voluntary muscles
– 120,000 diagnosed each year worldwide
– 2,000 Canadians live with ALS right now
– Can leave patients ‘locked-in’
– Cognitive and sensory functions remain intact

April, 2005 ThinQ Innovation 4


BCI(1): Slow Cortical Potentials (SCPs)
• Extensive training (~3 months) using a biofeedback mechanism
• Tested on ALS patients, who learned to control their SCPs

Ref: N. Birbaumer et al., “The thought translation device (TTD) for completely paralyzed patients,” IEEE Trans. Rehab. Eng., vol. 8, pp. 190–193, June 2000.

April, 2005 ThinQ Innovation 5


BCI(1): SCPs cont.
• Most successful subject – artificially fed and ventilated for 4 years
• After 3 months of training, wrote the letter below
– Took 16 hours to write, at ~2 letters/minute
– Expresses thanks, wants to have a party

April, 2005 ThinQ Innovation 6


BCI(2): Implants - Cyberkinetics Inc.

• BrainGate Neural Interface System: market cap ~$45 million
• Control of a cursor on a PC using an implant in the motor cortex
• Undergoing limited clinical trials
• Possibilities for limb movement control

April, 2005 ThinQ Innovation 7


P300 Spelling Device – the P300 Event
Related Potential
• Known as the ‘oddball’ or ‘surprise’ paradigm
• Inherent response: average deflection of 8–40 µV, peaking roughly 300 ms after the stimulus

April, 2005 ThinQ Innovation 8


P300 Spelling Device – the System
• Non-invasive
• Inherent response

April, 2005 ThinQ Innovation 9


P300 Speller Terminology
 Epoch = one flash of any row or column
 Trial = one complete set of epochs (all rows and columns)
 Symbol = an alphanumeric character or picture

April, 2005 ThinQ Innovation 10


BCI Competition 2003

• Provided pre-collected data sets for the competition
• P300 spelling paradigm:
– Winners included Kaper et al.
– Used Support Vector Machines
– Achieved a high transfer rate with real-time implementation possibilities

April, 2005 ThinQ Innovation 11


System Operation
 Steps
– Training (approximately 1 hr)
• Provide visual stimuli (flashing of rows/columns)
• Record data with known classification labels
• Run data through the pattern recognition algorithm (SVM)
• Create a customized model for each individual
– Spelling
• Load the individual’s customized model
• Provide visual stimuli (flashing of rows/columns)
• Record data with unknown classification labels
• Run data through the SVM classifier
• Sum up the decision values
• Feed back the most probable letter (see the sketch below)
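
A minimal sketch of that final selection step, assuming the decision values have already been collected into a trials-by-flash-index matrix; the variable names, the 1–6 = column / 7–12 = row flash convention, and the symbol layout are illustrative assumptions, not the project code:

% Illustrative sketch: pick the most probable symbol from summed decision values.
% 'dec' is assumed to be a [numTrials x 12] matrix of SVM decision values,
% one column per flash index (1-6 = column flashes, 7-12 = row flashes).
scores = sum(dec, 1);                    % sum decision values over all trials
[colScore, col] = max(scores(1:6));      % most probable column flash
[rowScore, row] = max(scores(7:12));     % most probable row flash
symbols = ['ABCDEF'; 'GHIJKL'; 'MNOPQR'; 'STUVWX'; 'YZ1234'; '56789_'];   % example 6x6 layout
chosenLetter = symbols(row, col);        % letter fed back to the user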

April, 2005 ThinQ Innovation 12


Display
 Flexible matrix size
 Flexible matrix contents
– Alphanumeric Characters
– Words
– Symbols

April, 2005 ThinQ Innovation 13


Display cont…
 Random and exhaustive flashing of all rows and columns on the display
 Flashing cycle: 300 ms
– 100 ms intensification period
– 200 ms de-intensification period
 10-second rest period at the end of each symbol
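
With a 6×6 matrix this means 12 flashes of 300 ms each, i.e. about 3.6 s per trial; at 15 trials per symbol plus the 10 s rest, each symbol takes roughly a minute. A minimal sketch of one trial’s flashing schedule, assuming hypothetical intensify/deintensify helper routines (the real display is driven from VB, so this is illustrative only):

% Illustrative sketch of one trial (not the actual display code).
flashOrder = randperm(12);              % random, exhaustive order of 6 rows + 6 columns
for k = 1:numel(flashOrder)
    intensify(flashOrder(k));           % hypothetical helper: highlight the row/column
    pause(0.100);                       % 100 ms intensification period
    deintensify(flashOrder(k));         % hypothetical helper: restore normal intensity
    pause(0.200);                       % 200 ms de-intensification period
end
% pause(10) would follow the last trial of a symbol (rest period).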

April, 2005 ThinQ Innovation 14


Data Collection
 Collect data from the DAQ, sampled at 240 Hz
 Record for 600 ms after each intensification
 Buffers overlap (flashes are only 300 ms apart)
 Flexible data collection delay
 Flexible data recording time

April, 2005 ThinQ Innovation 15


Data Collection – cont.
 10 channels collected simultaneously
 Data from each channel concatenated together
 Data stored in program memory
 Collected until the end of a symbol
– Converted to an array
– Memory cleared for the next symbol
 System is timing-critical
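
At 240 Hz, the 600 ms post-flash window is 144 samples per channel, so concatenating the 10 channels gives a 1440-element feature vector per epoch. A minimal sketch of that step, assuming the raw buffer is a samples-by-channels matrix ('raw' and 'flashSample' are illustrative names):

% Illustrative sketch of building one epoch's feature vector.
fs         = 240;                                            % sampling rate (Hz)
winLen     = round(0.600 * fs);                              % 600 ms window -> 144 samples
epoch      = raw(flashSample : flashSample + winLen - 1, :); % [144 x 10] window
featureVec = epoch(:)';                                      % channels concatenated -> [1 x 1440]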

April, 2005 ThinQ Innovation 16


Timing Issue
 Purpose
– Process within the 300 ms window
 Bottleneck
– Online SVM processing
• Old design: 340 ms/epoch
• New design: 17.67 ms/epoch
 Requirement
– A Pentium 4 or equivalent is sufficient

April, 2005 ThinQ Innovation 17


Matlab Interface
 Why we use Matlab
 VB–Matlab interface using APIs
 Common functions
– Pass a matrix array to the Matlab workspace
– Get a matrix array from the Matlab workspace
– Execute a command line or script

April, 2005 ThinQ Innovation 18


Support Vector Machines
 Pattern recognition algorithm
 SVM is used for:
– Creating models for different individuals (training)
– Getting discriminant scores (spelling)
 Detailed information covered later

April, 2005 ThinQ Innovation 19


Score Matrix
[Score matrix example: summed SVM decision values for each coordinate, with rows indexed 7–12 and columns indexed 1–6; the largest positive score (13.234, at index 8, column 5) marks the most probable symbol]

April, 2005 ThinQ Innovation 20


Word Prediction
 Idea: predict intended words based on previous spelling, similar to cellular-phone ‘smart text’
 Extract top-ranked words
– SQL for fast searching
– Dynamic database
 Selection updated at the bottom of the display
 Words are chosen the same way as letters

April, 2005 ThinQ Innovation 21


System Design
 Modular Design Approach

April, 2005 ThinQ Innovation 22


What is SVM?
 Developed by Vapnik in 1992 at Bell Labs
 Broad applications
 Based on the concept of ‘learning from examples’
 Key concepts:
– Linear decision boundary with margin
– Nonlinear feature transformation

April, 2005 ThinQ Innovation 23


Basic Concept

 Let {x1, ..., xn} be our training data set
 Let yi ∈ {+1, −1} be the class label of xi
 Find a decision boundary separating the two classes
 Make a decision on disjoint test data

April, 2005 ThinQ Innovation 24


Decision Boundary (linear)

 Infinitely many possible separating boundaries
[Figure: linearly separable samples of Class 1 and Class −1]

April, 2005 ThinQ Innovation 25


Bad Decision Boundary

[Figures: two examples of poorly chosen separating boundaries between Class 1 and Class −1]

April, 2005 ThinQ Innovation 26


Good Decision Boundary

 Want to maximize the margin m
 The boundary is found by solving a constrained optimization problem

[Figure: maximum-margin separating boundary between Class 1 and Class −1, with margin width m]

April, 2005 ThinQ Innovation 27


Optimization Problem
 Optimization Problem
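
For reference, the standard hard-margin formulation behind this step (maximizing the margin m = 2/‖w‖ is equivalent to minimizing ‖w‖²):

\min_{w,\,b} \; \tfrac{1}{2}\lVert w\rVert^{2}
\quad \text{subject to} \quad
y_i\,(w \cdot x_i + b) \ge 1, \qquad i = 1,\dots,n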

April, 2005 ThinQ Innovation 28


After Training
 The xi’s that lie on the margin are called SUPPORT VECTORS
 The support vectors and b define the decision boundary
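
In the usual dual formulation the classifier depends only on the support vectors (the xi with nonzero multipliers αi) and the bias b; for a new sample x:

f(x) = \operatorname{sign}\!\Big( \sum_{i \in \mathrm{SV}} \alpha_i \, y_i \, K(x_i, x) + b \Big)

with K(xi, x) = xi · x in the linear case.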

April, 2005 ThinQ Innovation 29


Geometrical Interpretation
[Figure: Class 1 and Class −1 training samples labelled with their Lagrange multipliers: α1 = 0.8, α6 = 1.4, α8 = 0.6 (the support vectors); α2 = α3 = α4 = α5 = α7 = α9 = α10 = 0]

April, 2005 ThinQ Innovation 30


Non-separable Samples
 Use of Soft Margin Separation
 Kernel Transformation

April, 2005 ThinQ Innovation 31


Soft Margin Separation

[Figure: Class 1 and Class −1 samples that are not perfectly separable]

April, 2005 ThinQ Innovation 32


Soft Margin Separation
 Idea: simultaneous maximization of margin and
minimization of training error
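
The standard soft-margin formulation that captures this trade-off introduces slack variables ξi together with the cost parameter C (the same C that is later set to 20.007 in the implementation):

\min_{w,\,b,\,\xi} \; \tfrac{1}{2}\lVert w\rVert^{2} + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
y_i\,(w \cdot x_i + b) \ge 1 - \xi_i, \quad \xi_i \ge 0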

April, 2005 ThinQ Innovation 33


Nonlinear Samples
 Some sample sets are inherently not linearly separable in the input space

 No linear boundary is sufficiently accurate

April, 2005 ThinQ Innovation 34


Solution?

April, 2005 ThinQ Innovation 35


Kernel Transformation
 Idea: map the input space into a feature space in which the samples become linearly separable

April, 2005 ThinQ Innovation 36


Gaussian Kernel
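
The Gaussian (RBF) kernel has the standard form, written here with the γ parameterization that libsvm uses:

K(x, x') = \exp\!\big( -\gamma \, \lVert x - x'\rVert^{2} \big), \qquad \gamma = 1/(2\sigma^{2})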

April, 2005 ThinQ Innovation 37


SVM Implementation
 Matlab interface to libsvm
 Kernel: RBF with γ = 6.6799e-4
 C parameter: 20.007
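
A minimal sketch of how these settings map onto the libsvm Matlab interface (-t 2 selects the RBF kernel, -g is γ, -c is C); the variable names are illustrative, not the project’s wrapper code:

% Train one subject's model from labelled training epochs.
model = svmtrain(trainLabels, trainFeatures, '-t 2 -g 6.6799e-4 -c 20.007');

% During spelling: classify new epochs; the third output holds the decision
% values that are summed in the score matrix.
[predLabels, accuracy, decValues] = svmpredict(testLabels, testFeatures, model);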

April, 2005 ThinQ Innovation 38


SVM Implementation
 Average Method (61.538%)
 Multi-Model Method (65.22%)
 Concatenation Method (82.418%)
 Weighted Concatenation Method
(max. 86.264%)

April, 2005 ThinQ Innovation 39


Possible Improvements
 Weighted concatenation method
[Chart: accuracy (%) vs. weighting factor (0.50x–2.00x) applied at the Pz, PO7, and PO8 electrode sites; accuracy ranges from roughly 80% to 86%]

 Customized Kernel Parameters

April, 2005 ThinQ Innovation 40


Measure of Performance
 Bit Rate (bits per minute)

B = \frac{60}{t}\left[\log_2 N + p\,\log_2 p + (1-p)\,\log_2\frac{1-p}{N-1}\right]

– N: number of available symbols
– p: prediction accuracy
– t: number of seconds taken to choose one symbol
 Letters per minute
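
A small helper implementing the bit-rate formula above; the example call uses illustrative values (a 6×6 matrix of 36 symbols), not measured results:

function b = bitrate(N, p, t)
% Bits per minute for an N-symbol speller with accuracy p and t seconds per symbol.
if p >= 1
    b = (60 / t) * log2(N);          % perfect accuracy: the error terms vanish
else
    b = (60 / t) * (log2(N) + p*log2(p) + (1 - p)*log2((1 - p)/(N - 1)));
end
end

% Example (illustrative numbers): bitrate(36, 0.86, 54)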

April, 2005 ThinQ Innovation 41


Cont…
 Resulting Transfer Rates
– Without the dictionary

– With the dictionary

April, 2005 ThinQ Innovation 42


More Accurate Measure
 Resulting Transfer Rates
– Without the dictionary

– With the dictionary

April, 2005 ThinQ Innovation 43


Cont…
 Mechanism
– Receives a chosen letter from the control module
– Appends the letter to the current letters in the word
– Searches the SQL database
– Returns a list of the most probable target words, based on ranking (see the sketch below)
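
A sketch of what that lookup might look like; the table and column names (dictionary, word, rank) and the Database Toolbox-style call are assumptions, since the actual schema is not shown:

% Hypothetical word-prediction query: top-ranked words starting with the
% letters spelled so far.
prefix   = 'HEL';                                  % letters chosen so far
sqlQuery = ['SELECT word FROM dictionary ' ...
            'WHERE word LIKE ''' prefix '%'' ' ...
            'ORDER BY rank DESC LIMIT 5'];         % top 5 candidates
% words = fetch(dbConn, sqlQuery);                 % dbConn: an open database connection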

April, 2005 ThinQ Innovation 44


Result Analysis
1. Accuracy across subjects
2. Accuracy over time, same subject
3. Accuracy over number of trials
4. Accuracy versus model size

April, 2005 ThinQ Innovation 45


Accuracy Across Subjects

Date        Subject      # of Trials   Accuracy (letters)   Percentage
March 26    Jack         15            12/14                86%
March 28    Min          15            12/21                57%
March 30    Brian        15            19/19                100%
March 31    Jyh-Liang    15            26/26                100%
April 2     Lucky        15            25/28                89%

April, 2005 ThinQ Innovation 46


Accuracy Across Subjects

Date        Subject      # of Trials   Accuracy (letters)   Percentage
March 31    Jyh-Liang    3             13/13                100%
March 31    Jyh-Liang    2             18/20                90%
March 31    Jyh-Liang    1             10/21                48%

April, 2005 ThinQ Innovation 47


Accuracy Over Time, Same Subject
 Subject: Jack
Date        Time Since Model Made   # of Trials   Accuracy (letters)   Percentage
March 26    0 days                  15            12/14                86%
March 28    2 days                  15            19/19                100%
March 31    5 days                  15            19/22                86%

April, 2005 ThinQ Innovation 48


Accuracy Over Number of Trials
 Subject: Jyh-Liang

[Chart: Accuracy vs. Number of Trials – accuracy (%) plotted for 1 to 15 trials]

April, 2005 ThinQ Innovation 49


Accuracy Versus Model Size
 Subject: Jyh-Liang
[Chart: Accuracy vs. Model Size (5 trials) – accuracy (%) plotted for model sizes of 6 to 30 symbols]

April, 2005 ThinQ Innovation 50


Questions?

April, 2005 ThinQ Innovation 51
