
DEVELOPMENT OF A REAL-TIME VISION SYSTEM
FOR AN AUTONOMOUS MODEL AIRPLANE

Danko Antolovic

Submitted to the faculty of the University Graduate School
in partial fulfillment of the requirements
for the degree
Master of Science
in the Department of Computer Science
Indiana University

October, 2001

Accepted by the Graduate Faculty, Indiana University, in partial fulfillment of the
requirements for the degree of Master of Science.

________________________________________

Prof. Steven D. Johnson, Ph.D.

________________________________________

Thesis committee                         Prof. Florin Cutzu, Ph.D.

________________________________________

Prof. Michael E. Gasser, Ph.D.

September 21, 2001.

Copyright © 2001
Danko Antolovic
ALL RIGHTS RESERVED

ACKNOWLEDGMENTS

This thesis is a description of a hardware/software system constructed in a series of Y790
independent study courses in the Department of Computer Science at Indiana University.
The work was done under the supervision of Professor Steven D. Johnson, to whom I am
grateful for his support, his interest in the progress of the project, and for his insightful
and critical comments. Professor Johnson constructed the camera gimbal currently in use.

I am happy to have had the help of Mr. Bryce Himebaugh, engineer and pilot
extraordinaire. Beside constructing the A/D converter and the servo circuit, Bryce has
shared his knowledge and skill through many helpful and enjoyable discussions.

I also wish to thank Professor Robert DeVoe of the IU School of Optometry. His
expertise on animal vision has helped me establish a broader context for some of the
problems encountered in robotic perception.

Finally, my thanks go to Laurie, my spouse, for her patience during this, the latest of my
academic stints.

ABSTRACT

Danko Antolovic

DEVELOPMENT OF A REAL-TIME VISION SYSTEM
FOR AN AUTONOMOUS MODEL AIRPLANE

This thesis describes a real-time embedded vision system capable of tracking two-
dimensional objects in a relatively simple (uncluttered) scene, in live video. This vision
system is intended as a component of a robotic flight system, used to keep a model
airplane in a holding pattern above an object on the ground. The system uses a two-
pronged approach to object tracking, taking into account the motion of the scene and the
graphic “signature” of the object. The vision system consists of these main components:
a motion-detection and filtering ASIC, implemented on FPGAs, a scene-analysis program
running on a Motorola ColdFire processor, a dual-port RAM holding the image data, and
a digital camera on a motorized gimbal.

________________________________________

________________________________________

________________________________________

CONTENTS

                                                                        Page

Acknowledgments                                                           iv
Abstract                                                                   v

1. Introduction to the Skeyeball Vision Project                            1
   1.1 History of the vision system                                        1
   1.2 Structure of this document                                          4
2. Functional Overview of the Vision System                                8
   2.1 Vision methodology                                                  8
   2.2 Biological parallels                                               11
3. Project Status                                                         13
   3.1 Capabilities and limitations                                       13
   3.2 Measurements of the tracking speed                                 13
   3.3 Summary remarks on the perception problem                          16
4. Hardware Architecture                                                  19
   4.1 Architectural components                                           19
   4.2 Biomorphic approach to architecture                                20
5. Design Summary                                                         24
   5.1 XC4010 digital design                                              24
      5.1.1 Front-end FPGA                                                24
      5.1.2 Back-end FPGA                                                 25
   5.2 MCF5307 (ColdFire) code                                            26
6. NTSC Video Signal                                                      28
   6.1 Even field                                                         28
   6.2 Odd field                                                          31
7. Formatting the Image Scan                                              33
   7.1 Vertical formatting                                                34
   7.2 Horizontal formatting                                              37
   7.3 Auxiliary components                                               39
   7.4 Signals                                                            40
8. Digitizing and Thresholding                                            42
   8.1 Black-and-white inversion                                          42
9. Setting the Threshold Automatically                                    43
   9.1 Heuristic procedure                                                43
   9.2 Threshold calculation on the back-end FPGA                         45
      9.2.1 Data path                                                     45
      9.2.2 Control                                                       47
      9.2.3 Signals                                                       47
10. Digital Zoom                                                          51
   10.1 Zoom implementation on the FPGA                                   52
11. Round Robin Procedure for Data Sharing                                55
   11.1 Status byte                                                       57
   11.2 Round robin on the front-end FPGA                                 59
   11.3 Round robin on the MCF5307 processor                              63
12. Pixel Read/Write Cycle                                                64
13. Frame Comparison and the Motion Vector                                68
   13.1 Methodology                                                       68
   13.2 Computation                                                       69
   13.3 Design components                                                 69
   13.4 Signals                                                           70
14. Writing the Motion Vector to DPRAM                                    72
15. Parameters of the Front-End FPGA                                      75
16. IRQ5/Parallel Port Complex                                            80
   16.1 IRQ5 handler                                                      80
   16.2 Duty-cycle generator                                              81
   16.3 Servo motion feedback                                             81
   16.4 Displacement vector                                               81
   16.5 Saccadic blanking                                                 82
   16.6 IRQ/PP circuit on the back-end FPGA                               83
17. Auxiliary Features                                                    88
   17.1 Serial communication with the MCF5307                             88
   17.2 Diagnostic data logging                                           88
   17.3 Soft restart of the vision program                                88
   17.4 Radio controls                                                    89
      17.4.1 Radio decoder's signals                                      90
18. Feature Recognition on the MCF5307 Processor                          91
   18.1 Main data structures in the MCF5307 code                          93
19. Initialization of the SBC5307 Board                                   95
20. Characteristics of the Camera/Servo System                            96
21. Supplementary Diagrams                                                98

References                                                               101

1. INTRODUCTION TO THE SKEYEBALL VISION PROJECT

Skeyeball is an ongoing project in the Department of Computer Science at Indiana
University. It is centered around a radio-controlled model airplane, which is being
converted into a semi-autonomous vehicle. Its primary perception is a computer vision
system, and it will also be equipped with attitude sensors, digital video and telemetry
downlink, and digital command uplink.

The objective is to give the airplane the autonomy to fly beyond the line of sight,
navigate, and find objects of interest by their visual appearance rather than by location.
The objective of the work described here was to build a vision system that follows an
object in a relatively simple (uncluttered) scene, in live video. This vision system will be
integrated into a larger robotic navigation system used to steer the airplane into a holding
pattern above a selected feature on the ground.

1.1 History of the vision system

The Skeyeball vision was first envisioned as a subsystem implemented on a
microcontroller chip. Soon it became obvious that a fast (and not too costly)
implementation of the early processing stages was needed: vision became an ASIC-cum-
microprocessor system, and it is still such a system today.

Picture 1: Aerial view of a target overflight

The development has gone through two distinct phases. The first phase yielded strictly a
laboratory prototype: the hardware was built from proto boards, and the processors were
a Xilinx XC4010 FPGA and a Motorola MC68332. Data were shared through an SRAM
on the common bus. This architecture required considerable data copying, and the 25
MHz MC68332 processor was rather too slow for the task. Nevertheless, the system was
capable of (slow) object tracking, moving a camera on a simple gimbal. Pictures 2 and 3
show the gimbal and the circuitry of the first phase.

We then obtained some realistic footage by flying the airplane with the immobile camera.
The laboratory prototype was capable of detecting target features in overflight sequences,
but tracking an object reliably at flight speeds was very problematic. Picture 1 shows a
typical aerial view: the plane casts its shadow next to the bright square target (a brightly
colored blanket on the grass).

This first phase gave us a fairly good insight into the minimal requirements of such a
system. The second (current) phase is described in the rest of the thesis. Two major
architectural improvements are a faster microprocessor (90 MHz Motorola ColdFire) and
a dual-port RAM for shared data. Elimination of one very cumbersome proto board has
also made the ASIC (application-specific integrated circuit) implementation much easier.

The fundamental vision algorithm has not seen much change over time, except for the
addition of the threshold calculation in the second phase - the improvement has been the
increased speed. Much greater modifications had to be made to the data flow procedures,
to take advantage of the dual-port memory and the better bus architecture.

Finally, in the second phase, the system was given the proper startup procedure and radio
controls, and the entire circuitry was built so as to be suitable for mounting inside the
airplane. Pictures 4-7 show the equipment built in the second phase: three circuit boards,
the new camera gimbal and the radio-controlled power switch. Picture 7 also shows the
radio and TV links connected to the vision system.

1.2 Structure of this document

This document serves a dual purpose: it describes the constructed system as a solution to
an engineering/computational problem in broad terms; it also describes it at the level of
detail needed for modification and further development.

The system divides itself naturally into several subsystems. The first five sections of this
document provide an overview, and the remaining sections describe the subsystems
separately, with the level of detail increasing within each subsystem description. We have
tried to make it clear where the broad description ends and the detailed one begins:
typically, detailed descriptions are grouped into specialized subsections.

The ultimate level of detail - schematics, pinout lists and the source code - has been
relegated to electronic form. This document contains summary descriptions of those
materials, as well as passages referring directly to source details. Interested readers should
become familiar with the circuit schematics and the source code.

Picture 2: Gimbal, phase 1          Picture 3: Circuit boards, phase 1

Pictures 4, 5: Circuit boards and gimbal, phase 2

Pictures 6, 7: Vision system, radio and TV links, power switch

2. FUNCTIONAL OVERVIEW OF THE VISION SYSTEM

2.1 Vision methodology

As we stated in the introduction, the objective of this work was to build a vision system
that will follow an object in a relatively simple scene, in live video. We have used a two-
pronged approach to object tracking, taking into account the motion of the scene and the
graphic “signature” of the object.

This approach was motivated by the fact that object recognition is computationally
intensive, and impossible to accomplish on a frame-by-frame basis with the available
hardware. Nevertheless, the vision system must operate fast enough not to allow the
object to drift out of the field of vision. Picture 9 illustrates this point.

The objective is to recognize the small dark object as the target, obtain its offset from the
center of the vision field, and move the camera to bring the object to the center.
Obviously, the vision system must take a still frame and base its calculation on it. In the
meantime, the object will change location, perhaps even drift out of the field. By the time
it is calculated, the displacement vector may well be irrelevant.

To avoid this, the motion of the scene is tracked frame by frame, and the camera moves
to compensate for it. The center of the field moves along with the target, and the
displacement vector, when available, will still be meaningful. It should be said, however,
that the importance of the drift compensation decreased as we used a much faster
processor in the second phase of the project (see Section 1.1).
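
The interplay of the two tracking paths can be summarized as a control loop. The C sketch
below is illustrative only and does not reproduce the actual Skeyeball modules; the
function names and the vector type are hypothetical. It shows the selection logic of
Picture 8: the per-frame motion vector steers the camera by default, and the object's
displacement vector overrides it whenever a recognition pass completes.

/* Illustrative sketch of the two-pronged tracking loop (Picture 8).
 * Function names and types are hypothetical, not the actual modules.   */
typedef struct { int dx, dy; } vec2;

extern int  frame_ready(void);              /* new video frame available   */
extern vec2 scene_motion_vector(void);      /* shallow, per-frame estimate */
extern int  recognition_done(vec2 *offset); /* deep, spans several frames  */
extern void move_camera(vec2 v);            /* gimbal servo command        */

void tracking_loop(void)
{
    for (;;) {
        if (!frame_ready())
            continue;

        vec2 cmd = scene_motion_vector();   /* default: compensate drift   */

        vec2 target_offset;
        if (recognition_done(&target_offset))
            cmd = target_offset;            /* override: center the target */

        move_camera(cmd);
    }
}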

[Picture 8: Method overview. Tracking the selected object proceeds along two paths:
tracking the overall motion of the scene (shallow, pixel-specific computation, producing a
frame-by-frame motion vector) and recognizing the tracked object within the scene (deep,
object-specific computation, producing an object location vector as available). The object
location overrides the motion vector; the camera moves to compensate for scene motion,
or to bring the object into the center of the vision field.]

Picture 8: Method overview

Object recognition requires several stages by which redundant and non-essential visual
information is parsed out, until we are left with a selection of well-defined objects, also
referred to as the features of the scene. In our case, the stages are: thresholding, edge
detection, segmentation into connected components, clean-up and signature/location
calculation.

Picture 9: Drift compensation (the vision field at the start and at the end of the calculation)

Picture 10 gives a functional overview of the vision system, where the progressively
thinner arrows signify the reduction in bulk of the visual information. This is the typical
“funnel” of the vision problem, leading from simple computations on a large volume of
data to complex ones on a small volume, yielding a number or two as the result.*

* At 30 frames per second, 483 lines per frame, and 644 byte-sized pixels per line, raw camera output
amounts to 9.33 Mbytes/sec. The 644 samples per line of continuous signal conform to the 4:3 aspect ratio
prescribed by NTSC, and one byte per pixel is a realistic choice of color depth.

To identify a target object within the scene, we used the second moments about the
object's principal axes of inertia as the “signature.” Second moments are invariant under
rotations and translations, fairly easy to calculate, and work well in simple scenes.
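
As an illustration only (this is not the code in FeatureDetector.c), the following C sketch
computes such a signature for one binary feature: the central second moments of its pixel
set, reduced to the two principal moments, which are unchanged by translating or rotating
the feature in the image plane.

#include <math.h>

/* Generic sketch: second moments of a binary feature about its principal
 * axes.  xs/ys are the pixel coordinates of the n pixels belonging to one
 * connected component.                                                   */
void second_moment_signature(const int *xs, const int *ys, int n,
                             double *m_major, double *m_minor)
{
    if (n <= 0) { *m_major = *m_minor = 0.0; return; }

    double cx = 0.0, cy = 0.0;
    for (int i = 0; i < n; i++) { cx += xs[i]; cy += ys[i]; }
    cx /= n;  cy /= n;                      /* centroid: translation invariance */

    double sxx = 0.0, syy = 0.0, sxy = 0.0; /* central second moments           */
    for (int i = 0; i < n; i++) {
        double dx = xs[i] - cx, dy = ys[i] - cy;
        sxx += dx * dx;  syy += dy * dy;  sxy += dx * dy;
    }
    sxx /= n;  syy /= n;  sxy /= n;

    /* Principal moments = eigenvalues of [[sxx, sxy], [sxy, syy]];
     * these are invariant under rotation of the image plane.               */
    double mean = 0.5 * (sxx + syy);
    double diff = 0.5 * (sxx - syy);
    double root = sqrt(diff * diff + sxy * sxy);
    *m_major = mean + root;
    *m_minor = mean - root;
}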

2.2 Biological parallels

It is not entirely surprising that certain functional analogies should develop between
robotic perception, such as this real-time vision system, and perception in animals. While
such analogies must not be taken too literally, they provide a glimpse at useful
generalizations to be made about perception problems, and we will sketch them out as
appropriate.

For example, camera motion based on feature recognition bears a similarity to the
(involuntary) saccadic motion of the eyes, the one-shot movement that brings a detail of
interest into the center of the vision field.¹ Eyes' saccades are fast, have a large amplitude
(as large as the displacement of the object of interest), and they are ballistic movements,
i.e. they are not corrected along the way. Such a motion is compatible with a need for
speed over precision: if the object identification is computationally intensive, use the
result to full extent, and as fast as possible.

¹ At the end of the process, the vision produces a few dozen bytes every couple of frames - the actual rate
depends on the contents of the image.

[Picture 10: Functional overview. Digital camera on a 2-servo gimbal -> composite video
-> sync separator, A/D converter and sampler -> 8-bit video and sync signals ->
thresholding (with current threshold calculation) -> B/W image -> motion detector and
data sharing (round robin), on the FPGA's -> on the MCF5307: edge detector -> B/W
edge trace -> segmentation, removal of small features, signature calculation -> feature
list -> target selection and target recognition -> motion vector / displacement vector ->
vector selection and servo driver -> servo duty cycles, with blanking of the servo motion.]

Picture 10: Functional overview

3. PROJECT STATUS

3.1 Capabilities and limitations

We have tested the lab vision system by manually moving irregular shapes in a vision
field fairly free of random clutter, and also by means of a rotating table (Section 3.2
below).

The recognition system tracks its target reliably under translation and rotation, and in the
presence of several shapes introduced as interference. Excessive skew breaks off the
tracking, since the vision algorithm makes no provision for it, but an oblique view of ca.
20 degrees is still acceptable (Section 3.2). Likewise, occlusion is interpreted as a change
of shape, and the target is lost when partially occluded (e.g. by drifting beyond the edge
of the vision field). These limitations are obvious consequences of the vision
methodology described in Section 2.1 and Picture 10.

The tracking and motion detection work equally well with the zoom engaged. As
expected, zoom makes the system more reliable in tracking small targets, at the expense
of limiting the field of vision to the central one-ninth.

3.2 Measurements of the tracking speed

The Skeyeball airplane flies within certain ranges of speed and altitude; our fixed-camera
flights have ranged from 18 to 60 mph, with a typical speed of ca. 35 mph. Likewise,
target-overflight altitudes have been from 60 to 320 ft. Consequently, the line of sight to
the target feature changes direction relative to the body of the airplane with a certain
angular velocity, and the vision system must be able to keep up with it. In the test
flights, the target passed through the vision field of the fixed camera in time intervals
ranging from 0.8 to 4.2 seconds, depending on the altitude and velocity of the airplane.

In order to obtain some quantitative measure of the vision's tracking abilities, we have
constructed a test rig - a rotating table with features to track. The vision system locks
successfully onto the (largest) presented feature and the camera turns following the
rotation of the table. The rotation speed is gradually increased, until the tracking breaks
off or target acquisition becomes impossible. Picture 11 shows the experimental setup:
the camera gimbal, the rotating table with two features, and the TV screen showing the
camera's view.

Picture 12 shows the geometry of the setup. The speed of the table's motor was regulated
by applying a variable voltage, and the angular speed of the table was measured as the
time needed for ten turns. A simple formula relates the table's angular speed ω to the
camera's sweeping speed θ:

    θ = ω / (1 + (d/r)²)

Picture 11: Laboratory set-up for the tracking-speed measurements

Picture 12: Table speed vs. the camera's sweep

d is the elevation of the camera above the table, and r is the distance of the target feature
from the center of the table.

At d = 26.5 cm and r = 10 cm, we found that the tracking was still reliable with the
camera sweeping an arc at the maximum speed of

    θ(max) = 45 degrees/second.

The camera's field of vision is ca. 47 degrees high and ca. 60 degrees wide (see Section
20), which puts this vision system within the range of speeds required to keep up with
the overflight speeds that were quoted above.

Of course, the tracking speed depends on the complexity of the scene. These measurements
were performed with two or three shapes, plus an intermittently visible edge of the table.
The scene observed in a real flight is richer in features, but at least for grassland and
trees, features tend to have low contrast and disappear below the threshold, which in turn
is set to single out high-contrast targets.

3.3 Summary remarks on the perception problem

Real-time perception can be envisioned as a funnel in which the data volume is reduced,
but the algorithmic complexity increases. Typically, there will be several stages with
fairly different breadth/depth ratios.

This is intrinsically not a problem amenable to single-architecture processing. Of course,
a speed tradeoff is in principle always possible, but engineering considerations such as
power consumption and heat dissipation place a very real limit on that approach. We
believe that it is better to use several processor architectures, each suitable for a different
stage of the perception process. The appearance on the scene of configurable microchips
makes this goal both realistic and appealing.

Robotic perception is also a problem in embedded computing. Requirements imposed by
the small model airplane are a bit on the stringent side, and one can envision a much
more relaxed design for an assembly line or security system. However, the need for
perception is naturally the greatest in mobile robots. In such applications the vision
system will always have to be compact and autonomous, because it bestows autonomy on
a mobile device whose primary function is something other than carrying a vision system
around.

Architecture should follow function, starting at a fairly low level. For example, data
collection in this system is done with a digital camera which serializes the (initially)
parallel image input. This choice was dictated by good practical reasons, but the system
lost a great deal of processing power because of that serialization. Image input should
have been done in parallel, which in turn would have required a specialized device and a
much broader data path in the initial processing stage.

The segmentation stage is better suited for implementation on general-purpose processors
because of the smaller data volume and more "sequential" algorithms. An architectural
alternative may be possible here: segmentation could be attempted on a highly connected
neural net circuit, trading off an exact algorithm for an approximate, but parallelizable,
search procedure. Neural net searches, on the other hand, are usually slow to converge
and may not improve the overall speed.

Animal vision cannot be separated from cognitive functions and motor coordination, and
this must be true for robotic vision as well. How much “intelligence” is built into high-
level processing of visual information depends on the ultimate objectives of Skeyeball:
for example, searching for a 3D shape in a complex panorama is a problem different from
that of hovering above a prominent feature on the ground.

In terms of steering and motor coordination, biological parallels are relevant. It is known
that inertial motion sensors play a large role in the gaze control of mobile animals [1].
Since the input from the motion sensors is simpler, and the processing presumably faster,
this sensory pathway provides the supporting motion information much faster than can be
obtained by visual processing. The airplane may very well benefit from an eventual
integration of its vision and attitude/motion sensors.

Visual perception is an ill-posed problem, and examples of functioning compromises may
be more valuable than exact results. Throughout this document, we point out similarities
with biological systems which strike us as interesting, although we do not pursue them in
depth for lack of expertise on the subject. A few pitfalls notwithstanding, we believe that
a synthesis of computational, physiological and engineering knowledge will be necessary
for the eventual development of reliable and versatile perception systems.

4. HARDWARE ARCHITECTURE

4.1 Architectural components

Picture 13 is an overview of the architecture of the vision system. Picture 14 shows all
the signals pertaining to the flow of data from the camera, through the processors and
back to the servo motors, but it omits some peripheral details.

Camera - The “eye” of the system is a small digital camera, producing a grayscale (non-
color) NTSC video signal; its other characteristics are largely unknown. The camera is
mounted on a gimbal driven by two servo motors, with a 50-degree range of motion in
each direction, and is permanently focused on infinity.

Sync separator - the NTSC grayscale video signal contains three synchronization signals.
These are extracted by means of the video sync separator LM1881 by National
Semiconductor, mounted on a prototyping board along with supporting circuitry.

A/D converter - we use Analog Devices' AD876, which is a pipelined 10-bit converter. It
is mounted on the same proto board, with supporting circuitry for its reference voltages.

Sampling control, thresholding and threshold calculation, motion detection and zoom are
implemented as digital designs on a synchronous pair of Xilinx XC4010 FPGA's, running
at 33.3 MHz. Start-up configuration is done with two Atmel AT17LV config ROMs.

The entire object recognition is implemented as code, running on a 90 MHz Motorola
MCF5307 ColdFire integrated microprocessor. We use a commercial evaluation board,
SBC5307, with 8 megabytes of DRAM, start-up flash ROM, expansion bus and
communication ports.

The two processors share image data through a 32K dual-port SRAM, CY7C007AV by
Cypress Semiconductor. Data access is implemented as a round-robin procedure, with the
objective of speeding up high-volume data transfer in the early stages of the vision
process.

The driver for the servo motors that move the camera is implemented on one of the two
FPGA's. Motion feedback from the servos is generated by a PIC16F877 microprocessor,
on the basis of the servos' analog position signals.

4.2 Biomorphic approach to architecture

Nervous systems of animals utilize specialized hardware almost by definition. There is
much evidence that biological architecture follows function: for example, the retina, with
its layers of many specialized types of cells, is apparently a structure which has evolved
to deal with the initial stages of the vision funnel, from the cellular level up.

The architecture of this robotic system follows the same “biomorphic” principle as much
as possible. In order to increase overall speed and throughput, we have opted for ASICs
and dedicated data paths, even at the cost of under-utilizing some components. Multitasking
and time-multiplexing are systematically avoided. Biological systems follow this
principle because of evolutionary constraints, but they solve the real-time perception
problem well, and the trade-offs they make appear to be the right ones.

[Picture 13: Architectural overview. The camera's composite video passes through the
LM1881 sync separator and the AD876 A/D converter to the front-end XC4010 FPGA;
the front end exchanges video, threshold and control signals with the back-end XC4010
FPGA and shares image data with the MCF5307 (8M DRAM, serial and parallel ports,
IRQ5) through the CY7C007AV DPRAM (32K); servo duty cycles drive the gimbal
servos, and a PIC16F877 generates servo motion feedback from the servos' analog
position signals.]

Picture 13: Architectural overview

[Picture 14: Architectural overview - detailed. Signal-level connections among the FPGA
board, the SBC5307 evaluation board, the gimbal board, the LM1881 sync separator, the
AD876 converter, and the left and right ports of the CY7C007AV DPRAM.]

Picture 14: Architectural overview - detailed

5. DESIGN SUMMARY

This section provides a top-level summary of the schematics and code modules which
comprise the functional configuration of the hardware described in Section 4.

5.1 XC4010 digital design

Configuration of the two FPGA processors is implemented with the Xilinx Foundation
development tool, either as schematics or as Abel HDL code. This list gives an
overview of the functionalities contained in the highest-level modules. Design
components and signals are described throughout the text and in the schematics
themselves.

5.1.1 Front-end FPGA

This design consists of seven top-level schematics and a number of macros. It utilizes
about 60% of the logical blocks (CLBs) of the XC4010 FPGA.

VISION_IN - contains the entry point for the video sync signals, vertical and horizontal
image framing and sampling, and the zoom.

ANALOG_IN - input from the A/D converter, normalization to the black reference level,
black-and-white thresholding.

FIELD_END - placeholder schematic, invoking macros for vector output, run-time
parameters and the round robin.

VISION_OUT - input/output to the DPRAM.

RR_ADDRESS - address counters for image buffers, round-robin address multiplexer.

MOTION_VECT - invokes the pixel read/write cycle, calculation of the motion vector.

CLOCKS - entry point for the external clock and reset signal, generation of internal
clocks.

5.1.2 Back-end FPGA

The design consists of five top-level schematics and a number of macros. It utilizes 96%
of the CLBs of the XC4010 FPGA.

DUTY_CYC_PP - IRQ5/parallel port communication, duty cycle generator, servo-motion
signal.

HIST_DP - data path for the threshold search in the histogram.

HIST_IO_RAM - histogram storage and management.

HIST_MINMAX - comparison logic for the threshold search, control ASM.

HIST_INTEGRAL - calculation of the histogram and features area.

5.2 MCF5307 (ColdFire) code

Programs running on the MCF5307 ColdFire processor are written in C and ColdFire
assembler, and compiled/assembled with GNU “gcc” and “as.” This list groups the code
modules by system function; further description is provided in the text and in the source
comments.

main.c - initialization and main loop for feature recognition

Configuration and startup:

cache.s - cache initialization
ConfigRegs.s - MCF5307 configuration, running from flash
ConfigRegs2.s - MCF5307 configuration, running from DRAM
crt0.s - setup for the C language
globals.c - init. of global variables for functional code
glue.c - heap setup and other book-keeping
start.s - processor startup sequence
vector.s - vector table

Feature recognition:

ConnectedComponents.c - segmentation algorithm
Diagnostics.c - vision system's error reporting
FeatureDetector.c - feature “signatures”
FeaturePoints.c - maintenance of heap data structures
Features.c - driver modules for acquisition and tracking
GraphDFS.c - depth-first graph traversal
SimpleEdge.c - edge detector

Inter-process communication:

CyclesPP.s - communication through the parallel port
IRQHandler.s - handler for Interrupt 5 (Process 1)
roundRobin.s - round-robin DPRAM access

Servo motion:

servo.s - translating displacements to servo duty cycles

Auxiliary:

Datalog.c - interface library for the diagnostic data log
DataOutput.c - diagnostic data output
serial.c - communication library for the serial port(s)
SerialHandler.s - UART interrupt handler
IRQ7Handler.s - handler for Interrupt 7 (soft restart)
TermInput.c - stub for the terminal command input

6. NTSC AND THE EVENTS SYNCHRONOUS WITH THE VIDEO SIGNAL

Frames of the NTSC television signal consist of two interleaved fields, marked by an
even/odd synchronization signal. At the beginning of each field there is a period of time
when the beam retraces back to the top of the image (at low intensity), a period marked
by the vertical synchronization signal. Also, horizontal retracing between video lines is
marked by the horizontal sync. Pictures 15 to 18 show some details of the operations
synchronous with the video signal.

We sample the content of the video signal during the even field of each frame, and at
variable resolution - every third line in the absence of zooming, every line when the
zoom is engaged. Motion detection, which compares adjacent video frames, is performed
simultaneously with the sampling, between the pixels, as it were. An early fraction of the
odd field is used for communication between processes and for the threshold calculation.

6.1 Even field

Picture 15 shows the beginning of the even field, the synchronization signals, the A/D
sampling clock, and the composite analog video signal. Notice the long vertical blanking
period before the beginning of the actual image transmission.

* Pictures 15-18, 27, 28, 32, 35, 38 and 42 are screen shots from a Hewlett-Packard 16500B logic
analyzer.

Picture 15: The even field (the blanking period, followed by the image transmission)

The sampling clock operates in bursts, during the sampled portion of the even field (see
Pictures 15, 16, 19 and 26). The first sample of each line is taken during the so-called back
porch of the horizontal sync, at the black reference intensity. We use the first sample as
the zero reference for the remaining grayscale values in that line: there is quite a bit of
intensity wobble in the camera signal, and this referencing makes the image steadier.

Picture 16 shows the video content of one line, the sampling clock and the thresholded
digital data obtained from the video signal. The values of one (black pixels) in the middle
of the line correspond to the dip in the video signal, which was in turn caused by a dark
object at the top of the camera's vision field.

Picture 16: Video signal carrying one line (with the back porch marked)

Picture 17, at the 1.7 MHz sampling rate, shows the delay between the sampling clock and
the return of digital data. The A/D converter is pipelined, introducing a delay of four
clock periods, which we account for by delaying the beginning of the pixel read/write
cycles. The I/O at the end of the line extends past the sampling clock by the same amount.

The field AR is the pixel's address in the DPRAM (right port), VID is the converter's
output. The first tick of the AD clock occurs at the end of the back porch, and the
resulting black reference value is latched four ticks later, as 0x31 in the VID signal. HD is
the normalized grayscale value: notice that VID - HD = 0x31 past the black reference.

DR is the thresholded signal, showing in this case a black object in the first line. OE_R,
WE_R and CE_R are the memory control signals of the DPRAM's right port.

Picture 17: A/D pipeline delay

Motion vector calculations and storage of the digital image are described in Section 12,
dealing with the pixel read/write cycle (p. 64).

6.2 Odd field

Picture 18 shows the beginning of the odd field and the DPRAM I/O associated with it.
At this time, parameters are updated, the motion vector has been calculated and is written
out (notice the twelve dips in the WE signal), and Process 1 updates the status byte
(notice that the semaphore operation SEM_R brackets the status byte read and write).
IRQ5 is asserted, triggering the handler on the MCF5307 and starting the parallel-port
communication (see Section 16, on IRQ/PP).

Picture 18: The odd field (parameter update, motion vector, status byte)

7. FORMATTING THE IMAGE SCAN

The design receives the sync signals already separated from the total video signal. It uses
the syncs to control the digitization and framing, to assign coordinates to pixels, and to
count the total numbers of pixels and black pixels in the image. For convenience in
analyzing the VCR video signal, which does not have the even/odd sync, this sync signal
is being generated internally.

Deciding at which points to sample the video signal, i.e., generating a sampling clock for
the A/D converter, is the main functionality derived from the synchronization signals.
Vertical formatting means the selection of video lines, while horizontal formatting means
a selection of discrete sampling points on the continuous video signal. The two formats
differ in details, and are made somewhat more complex by the presence of the zoom (see
Section 10, on the digital zoom). Also, in this system, sampling is limited to the even
field of the frame.

Picture 19 shows the formatting geometry and the signals involved, in the absence of
zooming. Zoom geometry is shown in Picture 26.

[Picture 19: Formatting and sampling. Vertical blanking (VERT_SYNC); scanned line
density and scanned lines per field (SCAN_LINE, fixed count); the digitized area within
the even field; sampling rate 1.83 MHz (signal 1_7_MHZ); DLY_END = 0; samples per
line (fixed); black reference sample (BP_END).]

Picture 19: Formatting and sampling

7.1 Vertical formatting

Briefly, the vertical formatting circuit (in the schematic VISION_IN) determines three
things:

- at which line in the even field to start sampling
- at what line density to scan (how many lines to skip between scans, if any)
- how many lines to scan (i.e. when to stop)

7.1.1 Starting line

Obviously, sampling must be suppressed during the vertical retrace (vertical blanking),
and, when the 3X zoom is engaged, over the top third of the image as well. This is
accomplished by extending the duration of the vertical sync to the first scanned line,
counting the requisite number of lines. A loadable counter and a small ASM, triggered
by the signal VSNC and clocked by LINE_SYNC, produce the signal VERT_SYNC,
which extends to the first scanned line.

Picture 20: VERT_SYNC ASM

7.1.2 Line density

Component LOAD_CNT2 is a counter which reloads an external value D_IN whenever it
runs out, then continues running to that value. It has a provision for the zero count, and
two term count signals, full period and half period. It runs while its TRG input is high.
This component is used to set the line density, by clocking it with LINE_SYNC.

Picture 21: LOAD_CNT2 ASM

7.1.3 Line count

Component SCAN_LPF is a stopping counter which is reset asynchronously on the rising
edge of its TRG input. Its output GATE remains high for the duration of the count;
afterwards the counter sleeps until the next reset. This component is used to count the
scanned lines in the even field.

Picture 22: SCAN_LPF ASM

7.2 Horizontal formatting

The horizontal formatting circuit (also in VISION_IN) does these four things:

- decides at what pixel density to sample
- produces the sampling signal for the black reference, at a fixed position in the line
- decides where in the scanned line to start sampling pixels
- and how many pixels to sample per line (i.e. when to stop)

7.2.1 Pixel density

An ASM and a loadable counter, identical to those in LOAD_CNT2, generate a clock
signal at the pixel sampling frequency, by reducing the system clock by a factor
dependent on the zoom.

7.2.2 Black reference sampling

Component LOAD_CNS is a counter which synchronously loads an external value D_IN,
runs to that value and stops. It has a provision for the zero count, and two term count
signals, full period and half period. It starts when its SYNC_LOAD input goes high.
This component is used to generate the black reference sampling signal (BPE), by
counting off the length of the back porch in intervals of the sampling clock.

7.2.3 Starting pixel

A second LOAD_CNS counts off the delay from BPE to the first pixel (zero, or one third
of the line for the 3X zoom). It raises the signal EN_SAMP, during which pixel sampling
is enabled.

7.2.4 Pixel count

An ordinary counter counts the sampled pixels in the line, and lowers EN_SAMP when
the full number of pixels is reached.

7.3 Auxiliary components

CLK_DELAY - this component creates bursts of its input clock CLK, for the duration of
the input CTRL, only delayed by a fixed number of clock periods. CLK is assumed to be
a continuous clock. This component is used to create a clock burst delayed by four clock
periods, which is needed to latch the output from the A/D converter's pipeline.

HCP (half-clock pulse) - passes to Q the first high half-period of CLK following the
rising edge of its input D, and only that. It is used to convert the term-count signals
(which last a full clock period) into half-period pulses. The component is asynchronous,
and uses three FF's (A, B and C) which mutually clear each other, according to this
timing diagram:

Picture 23: HCP timing diagram (the shaded symbol signifies a flip-flop in continuous clear)

7.4 Signals

Sync signals are active low, inverted and used as active high through the design. In order
to process still-frame output from VCRs, which lacks the odd/even signal, this signal is
generated internally.

LINE_SYNC - active high delimiter between video lines, typically 4.7 microseconds.
Suppressed during the odd field.

VSNC - active high delimiter between fields, typically 230 microseconds. Suppressed
during the odd field.

VERT_SYNC - derivative of VSNC. Extends from the beginning of VSNC to the first
sampled line in the field, covering the vertical retrace and the vertical zooming delay.
Covers up the unused synchronization intervals in LINE_SYNC.

SCAN_LINE - active high during each scanned line; reflects the vertical sampling density
(every line or every third line, set by the zoom level). Its derivative SCAN_L1 is low
during horizontal syncs. Both signals are active only during even fields.

EN_SAMP - when this signal is high, pixel sampling of the video line is permitted. This
signal is high in every n-th video line, as set by SCAN_LINE, and covers either the entire
line or the middle third of it, as set by the zoom level.

1_7_MHZ - clock which controls the density of pixel sampling of the video lines. At 108
pixels per line, this clock runs at 1.83 MHz or 5.49 MHz, depending on the zoom level
(see the clock-reducing counter).

BP_END - the single pulse indicating the end of the back porch. Its delayed derivative,
LATE_BPE, is used to latch the black reference level. Unlike the rest of the sampling
signals, these are not affected by the zoom.

AD_CLK - triggering signal sent to the A/D converter. It comprises BP_END and the
1_7_MHZ line sampling burst covered by EN_SAMP.

LATE_CLK - line sampling burst, delayed by several periods (4) of the sampling clock,
to allow for the pipeline delay in the A/D converter. This signal clocks the utilization of
the digitized signal.

DLY_END - single pulse indicating the end of the horizontal zooming delay.

8. DIGITIZING AND THRESHOLDING (schematic ANALOG_IN)

The video signal from the camera is digitized with the AD876 converter chip. The circuit
generates the sampling trigger for the converter, AD_CLK, and receives an 8-bit grayscale
signal, VIDEO, in return.

The video signal is corrected for intensity fluctuations by subtracting the black reference
value from it. The corrected video is thresholded to a strictly black and white signal,
C_BIT, which is both stored in the RAM and passed to the frame comparators.

The corrected video is also passed to the back-end FPGA for the histogram/threshold
calculation, during the even field. In the odd field, the signal THR_RDY from the back
end latches the calculated threshold into a data register, to be used in the next frame.

Presently, the sampling is done on 81 lines of every even field of the video signal, at 108
pixels per line. This yields a 108x81 b/w digitized image frame, at the correct NTSC
width-to-height ratio of 4:3.
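
As a software analogy only (the deployed implementation is combinational logic in the
ANALOG_IN schematic), the per-pixel arithmetic can be sketched in C as follows; the
function name is hypothetical, while black_ref, threshold and the inversion flag
correspond to the quantities described above and in Section 8.1.

#include <stdint.h>

/* Sketch of the per-pixel path: black-reference normalization followed by
 * black-and-white thresholding.  Illustrative only.                       */
static inline uint8_t threshold_pixel(uint8_t video, uint8_t black_ref,
                                      uint8_t threshold, int invert_bw)
{
    /* normalize to the black reference sampled on the back porch */
    uint8_t corrected = (video > black_ref) ? (uint8_t)(video - black_ref) : 0;

    /* one-bit C_BIT value; invert_bw selects which side counts as a feature */
    uint8_t c_bit = (corrected > threshold) ? 1 : 0;
    return invert_bw ? (uint8_t)(1 - c_bit) : c_bit;
}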

8.1 Black-and-white inversion

Which side of the threshold is considered active, or a feature, is a matter of convention,
and can be set by a run-time parameter. The choice does not affect the edge detection,
although it affects the sensitivity of the motion detector somewhat.

9. SETTING THE THRESHOLD AUTOMATICALLY

9.1 Heuristic procedure

This vision system operates on the assumption that the scene consists of relatively
luminous target feature(s) and a relatively dark, uninteresting background (or the reverse).
An early and important step is to set a black-and-white threshold that will separate the
features from the background, greatly reducing the complexity of the scene.

Setting this threshold manually is a delicate task. The selection is guided by the apparent
simplicity of the b/w image, and once set, the threshold usually works well for a range of
similar images. It would be difficult for the navigator to adjust the threshold in flight, and
small errors in the threshold can alter the result dramatically. Automatic thresholding
was implemented to make the vision more robust.

Finding the threshold follows this heuristic procedure:

a) Construct the grayscale histogram of the image. This is a straightforward pixel count,
accumulated in an array of 256 grayscale levels.

b) Find the highest maximum in the histogram and assume that it is located in the middle
of a large "hump" representing the background.

c) Find the lowest minimum on one side of the highest maximum (in this case, the
brighter side). Set the threshold to that grayscale level: the area opposite (brighter than)
the background hump represents features.

d) Disregard threshold choices which define a very small feature area in the histogram,
since they typically have no visual significance.

e) If the search for a meaningful minimum fails, reverse the grayscale and look for
features on the opposite end of the histogram in the next video frame.

f) Clear the histogram.

The assumption here is that the lowest minimum gives the best separation of the histogram
into background and features of interest, and the success ultimately depends on the
grayscale separability of the image. Picture 24 gives an illustration of the procedure.
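
For illustration, steps a) through f) can be expressed as a compact C routine; the deployed
implementation is the hardware design of Section 9.2, and the function and parameter
names below are ours, not part of the system.

/* Illustrative software version of the threshold heuristic (steps a-f).   */
#define GRAY_LEVELS 256

int find_threshold(const unsigned char *pixels, int n_pixels,
                   double min_feature_fraction)
{
    unsigned long hist[GRAY_LEVELS] = {0};
    unsigned long total = 0, feature_area = 0;

    /* a) grayscale histogram */
    for (int i = 0; i < n_pixels; i++) hist[pixels[i]]++;
    for (int g = 0; g < GRAY_LEVELS; g++) total += hist[g];

    /* b) highest maximum = middle of the background hump */
    int max_loc = 0;
    for (int g = 1; g < GRAY_LEVELS; g++)
        if (hist[g] > hist[max_loc]) max_loc = g;

    /* c) lowest minimum on the brighter side of the hump */
    int min_loc = -1;
    unsigned long min_val = (unsigned long)-1;
    for (int g = GRAY_LEVELS - 1; g > max_loc; g--)
        if (hist[g] < min_val) { min_val = hist[g]; min_loc = g; }

    /* d) reject thresholds that leave a negligible feature area */
    if (min_loc >= 0) {
        for (int g = min_loc + 1; g < GRAY_LEVELS; g++) feature_area += hist[g];
        if (feature_area < min_feature_fraction * total) min_loc = -1;
    }

    /* e) on failure (-1), the caller inverts the grayscale for the next frame */
    /* f) the histogram is a local array here, so it clears itself            */
    return min_loc;
}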

The grayscale version of the image is not currently used in the later vision stages; neither
is the value of the threshold. For that reason, the histogram/threshold process is
implemented in hardware, on the back-end XC4010, connected to the front-end vision
processor via a dedicated data bus. While the histogram/threshold algorithm is well
suited for implementation in code, running it on the ColdFire processor would have
complicated the data flow and slowed down the feature recognition.

Picture 24: Sample histogram (the threshold at the lowest minimum separates the
background hump from the features, on the grayscale from 0 to 255)

9.2 Description of the threshold calculation on the back-end FPGA

9.2.1 Data path

The histogram is stored in a synchronous RAM component, SYNC_RAM, which is
contained in the schematic HIST_IO_RAM, along with elements which build (and clear)
the histogram. There are 256 word-sized locations in SYNC_RAM; the histogram is
maintained by presenting the grayscale value to the RAM as the address, and
incrementing the corresponding location by one (or setting it to zero).

During the histogram build (even field), histogram addresses are the video data coming
from the front-end FPGA through the bus HIST_IO. Depending on the nature of the
image, these grayscale values may be inverted by subtracting them from 255. During the
threshold calculation (odd field), HIST_ADDR is generated internally by a counter, and
histogram values appear on the bus HIST_OUT.

The threshold calculation sweeps the histogram twice, by convention in the downward
direction (255 to 0, white to black). The sweep of histogram addresses is generated by
the counter C(P); specific count values are latched in the MIN_LOC and MAX_LOC
registers, as the locations of the histogram extrema (schematic HIST_DP).

Minima and maxima are detected by comparing three adjacent histogram values, which
flow through the comparison registers PL, P and PR during the sweep. These are the
relevant comparisons, with black dots representing the relative heights of the adjacent
bars of the histogram:

[Four diagrams of the adjacent bars PL, P and PR define the cases detected during the
sweep: left-biased maximum, right-biased maximum, left-biased minimum and right-biased
minimum.]

We use the left-biased comparisons. Current extremes are latched into the registers
CUR_MIN and CUR_MAX. Since we are interested in the global extremes, locally found
extremal values must be compared with the current largest/smallest values. All the
comparison logic is contained in the schematic HIST_MINMAX.

Step d) from the previous section is implemented in the schematic HIST_INTEGRAL.
The histogram integral and the features integral are accumulated in registered adders, and
the features integral is compared with an appropriate fraction of the total histogram
integral.

Since the search for a meaningful minimum can fail, the success/failure is recorded in the
flip-flop MIN_FOUND and passed to the control ASM.

9.2.2 Control

The control ASM is implemented in the Abel code component HIST_ASM, shown in
Picture 25 (two pages). The algorithm is by its nature sequential, and can be roughly
divided into these six steps: initialize the address and the comparison registers, search for
the maximum, re-initialize, search for the minimum, notify the front end or invert the
grayscale, clear the histogram. Individual states and logic are explained in the ASM chart.

9.2.3 Signals

CP_SYNC_LD - synchronously load the value 255 into the address counter C(P).

CLR_REG - synchronously clear the registers PR, CUR_MAX, MAX_LOC.

PL_LD, P_LD, PR_LD - enable loading of the comparison registers PL, P and PR.

CUR_MAX_LD, MAX_LOC_LD, CUR_MIN_LD, MIN_LOC_LD - enable loading of the
maximum/minimum registers.

PR_ASYNC_LD - asynchronously set the registers PR and CUR_MIN to an "infinite" value.

THR_RDY - notify the front end that the threshold is ready.

INV_GRAY - set the grayscale inversion for the next frame.

CLR_WE - enable writing zeros into the histogram.

NORM_EN, FT_EN - enable the accumulation of the histogram and feature integrals.

FFR - the feature integral is meaningful relative to the entire histogram area.

[Picture 25 (two pages): Control ASM for the histogram/threshold calculation. The WAIT
state holds the histogram location at 255 until TRG starts the calculation; IN11 and IN12
clear CUR_MAX, MAX_LOC and PR, load the first histogram values into PL and P, and
start accumulating the histogram area; S1 decrements the histogram location, shifting data
through PL, P and PR and latching new maxima into CUR_MAX and MAX_LOC, until
C(P) reaches zero. IN21 and IN22 re-initialize the registers (PR and CUR_MIN to
"infinity", MIN_FOUND cleared) and start accumulating the feature area; S2 decrements
the location down to MAX_LOC, latching new minima into CUR_MIN and MIN_LOC; if
a minimum was found, THR_RDY signals the front end that the threshold is ready,
otherwise INV_GRAY inverts the grayscale for the opposite search in the next frame;
finally, the CLR state writes zeros through the entire histogram and returns to WAIT.]

Picture 25: Control ASM for the histogram/threshold calculation

10. DIGITAL ZOOM

It became very obvious during the test flights that the human navigator and the vision
steering system need two different perspectives on the ground scene. At altitudes of 200-
300 ft, the view through the wide-angle camera was adequate for general orientation, but
the target was impractically small for the vision system to handle. Using a narrow-field
(f=16 mm) lens, or equivalently, flying close to the ground, produced good target images
if and when the target was ever located. The camera was fixed to the body of the plane,
but even with an independently movable camera the navigator would have difficulties
spotting areas of interest through the narrow field.

A camera with a zoom lens, or even two different cameras on the same gimbal, would
solve this difficulty at the cost of additional mechanical equipment. However, since the
vision system originally used only one sixth of the total image information (sampling
every third line of the even field), there was room for electronic, instead of optical,
zooming. Only the vision system sees the effect of the zoom, since the navigator
currently receives no digitized image feedback.

The digital zoom, as implemented, is a 3X zoom. It amounts to using the full line density
of the even field, sampling pixels at three times the "no zoom" rate. In order to maintain
the same data volume, only the central one-ninth of the video image is actually digitized
and passed to the vision system. A 6X zoom could be implemented by using the lines in
the odd field also, but some care would have to be exercised regarding the end-of-field
communication between processes.
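
The data volume stays constant under the zoom because the higher sampling density is
confined to the central third in each dimension. A minimal sketch of the two sampling
configurations follows; the rates, line step and frame size are the ones quoted in this
document, while the structure and its names are only illustrative (the FPGA actually
selects between two sets of five constants, Section 10.1).

/* The two sampling configurations, as described in the text. */
struct scan_format {
    int    line_step;        /* scan every n-th line of the even field */
    double pixel_clock_mhz;  /* pixel sampling rate                    */
    int    lines, pixels;    /* resulting frame size                   */
};

static const struct scan_format no_zoom = { 3, 1.83, 81, 108 };
static const struct scan_format zoom_3x = { 1, 5.49, 81, 108 };
/* 3X zoom: three times the line density and three times the pixel rate,
 * but only the central one-third of the lines and of each line is kept,
 * so both formats digitize the same 108 x 81 frame.                     */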

10.1 Zoom implementation on the FPGA

The zoom has been implemented on the front-end FPGA, as part of the overall digitizing
circuit. The zoom level is passed to the FPGA as a run-time parameter, and a selection is
made between two sets of five constants. These five constants define the resolution and
framing of the image.

In order to cope with the high sampling frequency, the FPGA's clock rate is set to 33.3
MHz, and the pixel read/write cycle was made as short and pipelined as feasible (see the
description in Section 12). Picture 26 shows the zoom's geometry and the signals
involved. Refer back to Section 7, Formatting the image scan, for details (p. 33).

Pictures 27 and 28 give an overview of the formatting, modified by the zoom. Picture 27
shows the sampling clock active in the central one-third of the lines of the even field.
Greater magnification shows the sampling clock also limited to the central one-third of
one line, with the black reference sample following the horizontal sync (Picture 28). The
non-zero data signal (DR) is due to a dark object in the camera's field of vision.

When the zoom is engaged, rotations of the camera produce larger displacements in the
image. Therefore, the procedure that calculates the servo duty cycles must also take the
zoom into account and turn the camera by smaller angles (see the program module
Servo.s).

[Picture 26: Formatting and zoom. Vertical blanking (VERT_SYNC); scanned line density
(SCAN_LINE) and scanned lines per field (fixed); the digitized area reduced to the central
part of the even field; sampling rate 5.49 MHz (signal 1_7_MHZ); DLY_END; samples
per line (fixed); black reference sample (BP_END).]

Picture 26: Formatting and zoom

Picture 27: Vertical formatting (zoom)

Picture 28: Horizontal formatting (zoom; black reference and sampled portion marked)

11. ROUND ROBIN PROCEDURE FOR DATA SHARING

The presence of two processes (motion detection and feature recognition), running on
separate processors, makes heavy demands on the memory containing the image data. In
this system, access conflicts and bus logjams are avoided by using a dual-port SRAM
chip and a round-robin data access procedure.

The RAM chip used is the CY7C007AV, an asynchronous 32K x 8 part by Cypress
Semiconductor. It has two address/data ports, which can read simultaneously from the
same memory location. The part arbitrates read/write access conflicts in hardware,
although that feature is not used here. The chip also has a bank of hardware semaphores,
with their own chip-select signals and arbitration logic. This feature was essential in
implementing the round robin procedure.

In this scheme, the motion detection (process P1) reads from one memory area, say M1,
and writes into another (M2). It swaps these areas on each new video frame.

Feature recognition (P2) takes a few frames' time to complete one calculation. When P2
needs an update, it reads P1's read frame, say M1 (P2 never writes). On the next video
frame, P1 reads M2 and writes to M3, then swaps M2 and M3 until P2 claims whichever
of these is P1's read frame at the moment (see Picture 29).

Picture 29: Round robin

Picture 30: DPRAM's memory buffers

Buffers M1-M3 are implemented as distinct memory areas on the dual-port memory chip,
and waiting is eliminated completely. P2 can start reading P1's read frame through its
own bus, and the round-robin motion is performed by switching the starting addresses of
M1-M3 (see Picture 30). Read/write conflicts cannot occur on buffer access, only double
reads, which are permitted by the DPRAM.

At any moment, each buffer is assigned to one of these three states: P1 writes, P1 reads,
P2 reads; any buffer can be in any of them, and no two buffers are ever the same. The
record of the current state is maintained in a dedicated location, the status byte, which is
updated by P1 and P2 on every turn of the round robin.

Since P1 and P2 are mutually asynchronous, genuine access conflicts will occur on the
status byte. These are avoided by protecting the status byte with the semaphore: only one
port can hold the semaphore (this is arbitrated by the memory chip), and that port updates
the status before releasing the semaphore. The other process stays in a polling loop until
access is granted, but the duration of the busy wait is no more than a one-byte I/O
operation, which is insignificant on either processor.

11.1 Status byte

The status byte, at the DPRAM address 0x08, contains three two-bit fields corresponding
to the buffer states P1W, P1R and P2R, and the value in each field is the number of the
buffer assigned to that state.

    0x08:   | 0 | 0 | P2R | P1R | P1W |

Maintenance of the status byte is very simple. On reset, its value is set to 0b00100100
(0x24), which means that:

- buffer zero is P1's write buffer
- buffer one is P1's read buffer
- buffer two is P2's read buffer

Process 1 swaps the contents of fields P1W and P1R (interchanges its working buffers).
Process 2 swaps the contents of fields P1R and P2R (releases the buffer it just read and
takes up the reading buffer of Process 1). Between updates, each process maintains a
private copy of the status information; otherwise, its working buffer(s) could change in
mid-cycle, with disagreeable results. Picture 31 shows the allowed swaps of the status
byte values.

Picture 31: Status byte values (the six values 0x24, 0x21, 0x09, 0x06, 0x12 and 0x18,
connected by the P1 and P2 swaps)
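
The field layout makes the two swaps easy to express in code. The following C sketch is
illustrative (it is not the contents of roundRobin.s or of the FPGA design): it packs and
swaps the two-bit fields, and starting from the reset value it walks through exactly the six
legal values of Picture 31.

#include <stdint.h>

/* Status byte layout (DPRAM address 0x08): bits 1:0 = P1W, 3:2 = P1R,
 * 5:4 = P2R.  Sketch only.                                               */
#define P1W(s)  ((uint8_t)((s)       & 0x3))
#define P1R(s)  ((uint8_t)(((s) >> 2) & 0x3))
#define P2R(s)  ((uint8_t)(((s) >> 4) & 0x3))
#define PACK(p2r, p1r, p1w)  ((uint8_t)(((p2r) << 4) | ((p1r) << 2) | (p1w)))

/* Process 1: interchange its write and read buffers. */
static uint8_t p1_swap(uint8_t s) { return PACK(P2R(s), P1W(s), P1R(s)); }

/* Process 2: release the buffer just read, take over P1's read buffer. */
static uint8_t p2_swap(uint8_t s) { return PACK(P1R(s), P2R(s), P1W(s)); }

/* Starting from the reset value 0x24 (P2R=2, P1R=1, P1W=0), alternating
 * p1_swap()/p2_swap() calls visit the six values of Picture 31:
 * 0x24 -> 0x21 -> 0x09 -> 0x06 -> 0x12 -> 0x18 -> 0x24.                  */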

11.2 Round robin on the front-end FPGA

The procedure by which the two processes share buffers of image data in the DPRAM
has already been described earlier. This section deals with the implementation of the
round robin on the FPGA side, as the component ROUND_ROBIN.

The private copy of the status byte resides in the register SBYTE_REG, which is read
from and written to the DPRAM address 0x08. Notice the peculiar ordering of the input
bus leads, which accomplishes the swapping of fields P1R and P1W.

Current addresses for the three image buffers reside in three counters driven by
LATE_CLK (see schematic RR_ADDRESS). Fields P1R and P1W are used to operate
the multiplexer which selects the current buffer for read or write operations.

The ASM (see Picture 33) is straightforward: it polls the semaphore for access, reads and
writes the status byte, then releases the semaphore. Picture 34 shows the timing diagram.

SEM_IN - read value of the semaphore.

SEM_OUT - written value of the semaphore.

RR_SE - semaphore enable (the semaphore's chip select).

SEM_WE, SEM_OE, SEM_D_TSB - control signals for the semaphore I/O.

RR_CE - chip enable for the regular RAM area.

SB_WE, SB_OE - control signals for the RAM I/O.

STAT_A_TSB - address TSB control for the entire round robin sequence.

Picture 32: Semaphore-protected update of the status byte

[Picture 33: Round-robin ASM. WAIT idles until TRG starts the update of the status byte;
WSEM tentatively writes zero into the semaphore (SEM_WE, RR_SE); RSEM reads the
semaphore back (SEM_OE, RR_SE) and polls until zero is read, i.e. until the write has
succeeded; WR pauses one clock period to avoid a glitch on the OE signal; RSB reads the
status byte (SB_OE, RR_CE); WSB writes the status byte back with fields P1R and P1W
swapped (SB_WE, RR_CE); WW pauses one clock period to avoid a glitch on the WE
signal; QSEM releases the semaphore by writing one into it.]

Picture 33: Round-robin ASM

Picture 34: Round-robin timing

11.3 Round Robin on the MCF5307 processor

Implementation of the round robin procedure in code is straightforward: the flowchart is almost identical to the ASM chart in Picture 33. The procedure is invoked once in each pass of the feature recognition loop (see module main.c). It polls the semaphore for access and reads the status byte when the access is granted; swaps the fields P1R and P2R, updates the status byte and releases the semaphore. The new P2R (in the local copy of the status byte!) is used to select the starting address of the read buffer, and that address is made available to the vision code in the global variable IMAGE_FRAME.

The subroutine round_robin is contained in the program module RoundRobin.s, along with the subroutine config_cs, which configures the left port (ColdFire side) of the DPRAM.

Note on the DPRAM addresses on MCF5307: The hardware is configured so that the ColdFire processor uses its chip select 4 for the semaphore bank of the DPRAM, and chip select 5 for the regular storage area. ColdFire chip selects are assigned blocks of address space 2 MB in size [9]; consequently, the base addresses for the semaphores and the regular RAM become 0xFF800000 and 0xFFA00000 respectively, even though they are contiguous in the DPRAM's address space (see Picture 48 on p. 98). The ColdFire generates the proper address in the lower 15 bits, and the high bits serve only to activate the right chip select.
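The actual subroutine is written in ColdFire assembly (RoundRobin.s); the C sketch below only illustrates the sequence of operations just described. The buffer base offsets are taken from the memory map in Picture 48; the semaphore offset within the semaphore bank and the variable names are assumptions for illustration only.

#include <stdint.h>

#define DPRAM_SEM_BASE  0xFF800000u    /* chip select 4: semaphore bank  */
#define DPRAM_RAM_BASE  0xFFA00000u    /* chip select 5: regular storage */
#define STATUS_BYTE     (*(volatile uint8_t *)(DPRAM_RAM_BASE + 0x08))
#define SEMAPHORE       (*(volatile uint8_t *)(DPRAM_SEM_BASE + 0x00)) /* illustrative offset */

volatile uint8_t *IMAGE_FRAME;         /* read buffer used by the vision code */

void round_robin(void)
{
    static const uint32_t buffer_base[3] = { 0x20, 0x2710, 0x4E20 }; /* M0, M1, M2 */
    uint8_t status, p1w, p1r, p2r;

    /* Poll the semaphore: write zero, read back, repeat until the zero sticks. */
    do {
        SEMAPHORE = 0;
    } while (SEMAPHORE != 0);

    /* Read the status byte and swap the fields P1R and P2R. */
    status = STATUS_BYTE;
    p1w = status & 0x3;  p1r = (status >> 2) & 0x3;  p2r = (status >> 4) & 0x3;
    STATUS_BYTE = (uint8_t)((p1r << 4) | (p2r << 2) | p1w);

    /* Release the semaphore by writing one into it. */
    SEMAPHORE = 1;

    /* The new P2R (the old P1R) selects the starting address of the read buffer. */
    IMAGE_FRAME = (volatile uint8_t *)(DPRAM_RAM_BASE + buffer_base[p1r]);
}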

63
12. THE PIXEL READ/WRITE CYCLE

The pixel read/write cycle is central to the early vision processing: it stores the digitized B/W image and performs the motion detector's frame comparison. This section describes the cycle suitable for the 33.3 MHz system clock and the 5.4 MHz pixel sampling rate.

Events within the cycle are sequenced by the state machine PIXEL_CYCLE, which is clocked by the system clock and runs one full sequence per period of the sampling clock.

The pixel cycle accesses two memory buffers, P1R and P1W; the corresponding addresses of the pixel in these two buffers are determined by the round robin algorithm. Pixel calculation is triggered by the delayed sampling clock, LATE_CLK, which also increments the buffer addresses.

The pixel's read address is calculated during the high time of the sampling clock, and the value is stable on the signal M_BIT one clock period later; this is the pixel's value from the previous frame. The thresholded signal, C_BIT, becomes available around the same time. Both bits are presented to the comparator/accumulator (see Section 13.3, the description of the FPGA design, p. 70, for details) and the incremented value of the motion vector is clocked in one period later.

The address is now switched to the pixel's write address, and on the rising edge of WE, one clock after the address switch, the new pixel value is written to the write buffer. The entire read/write cycle takes five clock cycles: at 33.3 MHz, this is sufficiently fast to complete all pixel processing at the sampling rate.

In addition, there are RAM-controlling signals in the cycle: output enable, write enable, and the three-state output buffer on the zero bit of the data line. All of these are active low. Picture 37 shows the timing diagram of the read/write cycle, one column per state.

Picture 35 shows the read/write cycle driven by the 33.3 MHz clock, and running at the full sampling rate of the 3X zoom.

Picture 35: Pixel cycle

65
The PIXEL_CYCLE ASM states (state code CBA):

- 000 WAIT: wait for TRG.
- 001 CALC: ZB_OE_, ZB_CS_.
- 011 SW: CALC_EN, ZB_OE_, ZB_CS_.
- 111 WR: CALC_EN, ZB_WR_BUF, ZB_CS_.
- 110 DONE: ZB_WR_BUF, ZB_WE_, ZB_CS_.

Picture 36: PIXEL_CYCLE ASM

66
[Timing diagram: one column per state, in the order WAIT, CALC, SW, WR, DONE, WAIT.]

Picture 37: Pixel cycle timing. Signals:

TRG – pulls the ASM out of wait.
ADDR – read address, then write address, then read again.
C_BIT – data bit from the video stream.
M_BIT – data bit from memory.
ZB_OE_ – OE to read M_BIT.
CALC_EN – enables the bit comparison.
ZB_WR_BUF – 0: access P1R; 1: access P1W.
ZB_WE_ – WE to write C_BIT.
ZB_TSB_ – opens the TSB to write C_BIT.
ZB_CS_ – CS for the pixel cycle I/O.

67
13. FRAME COMPARISON AND THE MOTION VECTOR

13.1 Methodology

Motion detection in this system is limited to linear displacements of the entire scene, since we are interested only in detecting changes due to the movements of the plane (ego-motion). Drift detection would perhaps be a more accurate term.

The formula used to calculate the motion vector is as follows:

\Delta = \frac{\sum_i x_i \, \Delta\mathrm{pix}_i}{\sum_i \mathrm{pix}_i}

This is essentially the formula for the dipole moment of the displacement, with the previous frame's pixels counting as the negative charge, and the current frame as positive. The summations are over the entire image, pix is the pixel value (zero or one), Δpix is the difference between consecutive frames, and x_i is the i-th pixel's position. The normalization constant in the denominator is simply the number of black pixels in the image.

Motion detection in the vision of insects with composite eyes utilizes the principle of consecutive activation/deactivation of receptors; the direction of motion is determined by the pattern of neural wiring between adjacent receptors (eyelets in the composite eye) [10, 11].

Our detector has no pixel adjacency information, and cannot detect local motion within the image. Instead, it obtains an integrated value of spatial distances between activated/deactivated pixels. For simple drift motion, this is an adequate motion vector, with the caveat that the detector is sensitive to the appearance of new objects in the periphery, which it interprets as motion.

13.2 Computation

Pixels are processed in real time, and each pixel cycle contains the following steps:

- the corresponding pixel from the previous frame is read from the DPRAM buffer P1R;
- the current and previous pixel are presented to two comparators, which calculate the components of the motion vector;
- the current pixel is stored in the DPRAM buffer P1W.

Each b/w pixel is stored in the zero-th bit of a byte, which makes addressing simpler and faster. The higher bits of these bytes are not used.
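In hardware these steps are carried out by the counters and the comparator/accumulator described in the next subsection. The following C sketch is only a software model of the same per-pixel arithmetic, written to make the dipole-moment formula concrete; the function and type names are illustrative (in the real system the normalization is done later, on the ColdFire, with integer division).

#include <stdint.h>

typedef struct { double dx, dy; } Drift;

/* Software model of one frame of motion detection: prev[] and curr[] hold the
   thresholded pixels (0 or 1) in row-major order. Returns the drift estimate. */
Drift motion_vector(const uint8_t *prev, const uint8_t *curr, int width, int height)
{
    long sum_x = 0, sum_y = 0, black = 0;

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int i = y * width + x;
            int c = curr[i] & 1;          /* C_BIT: current pixel               */
            int m = prev[i] & 1;          /* M_BIT: pixel from the previous frame */

            if (c && !m) {                /* pixel turned on: add its position   */
                sum_x += x;  sum_y += y;
            } else if (!c && m) {         /* pixel turned off: subtract it       */
                sum_x -= x;  sum_y -= y;
            }
            black += c;                   /* normalization: black pixel count    */
        }
    }

    Drift d = { 0.0, 0.0 };
    if (black > 0) {
        d.dx = (double)sum_x / (double)black;
        d.dy = (double)sum_y / (double)black;
    }
    return d;
}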

13.3 Design components

line-in-frame counter – counts scanned video lines within one frame. Used as the vertical (Y) coordinate of the current pixel.

pixel-in-line counter – counts the sampling clock (LATE_CLK), starting at the beginning of each video line. Used as the horizontal (X) coordinate of the current pixel.

pixel-in-frame counter – counts the sampling clock (LATE_CLK), starting at the beginning of a frame. It resets to the starting address of the frame in the SRAM, and its value is used as the address of the SRAM byte that contains the current pixel.

69
COMP_ACCUM – comparator/accumulator; this component adds/subtracts the value on the input bus NUM[31:0] to/from the current value in its internal register. The current value is always available on the output bus SUM[31:0]. The sign of the operation depends on the values of CBIT and MEM, as follows:

    CBIT  MEM  operation
     0     0   none
     1     0   add
     0     1   subtract
     1     1   none

This operation is designed to capture the differences in pixels of adjacent frames. It is enabled by the EN signal. ASYNC_CTRL resets the register value to zero asynchronously, and no operations take place while ASYNC_CTRL is high.

black pixel counter – counts the sampling clock (LATE_CLK), starting at the beginning of a frame, only if C_BIT is high on the rising edge of the clock. This is the count of black pixels in one frame.

13.4 Signals

C_BIT – single-bit output of the digitizing/thresholding circuit ANALOG_IN. This is the current pixel of the current frame.

M_BIT – current pixel of the previous frame, retrieved from DPRAM and compared with the C_BIT to detect motion.

CALC_EN – enable signal for the comparison; output of the PIXEL_CYCLE sequencing ASM.

71
14. WRITING THE MOTION VECTOR TO DPRAM

At the beginning of the odd field, the vector components and the normalization constant are written in DPRAM, at the address 0x0C, as three long words in the big endian order. The component VECT_OUT handles that procedure.

VECT_OUT has an address counter, a word counter and a byte-in-word counter. The latter two counters operate the multiplexers which select the proper byte for output, and the whole procedure consists of a straightforward double loop, corresponding to three words and four bytes per word. The ASM chart is shown in Picture 39.
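On the other side of the DPRAM, the ColdFire reads these values back in the IRQ5 handler (Section 16.1). A minimal C sketch of that read follows; the ordering X component, Y component, normalization constant is an assumption of this sketch, and the variable names follow Section 18.1.

#include <stdint.h>

#define DPRAM_RAM_BASE  0xFFA00000u
#define XS_VECT_ADDR    (DPRAM_RAM_BASE + 0x0C)   /* three big-endian long words */

static int32_t X_MVECT, Y_MVECT, NORM_MVECT;       /* local copies */

static void read_motion_vector(void)
{
    /* The ColdFire is itself big-endian, so the long words written by
       VECT_OUT can be read directly, without byte swapping. */
    volatile int32_t *v = (volatile int32_t *)XS_VECT_ADDR;
    X_MVECT    = v[0];
    Y_MVECT    = v[1];
    NORM_MVECT = v[2];
}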

TRG – the trigger signal.

…_CNT_EN, …_CNT_LD – control signals for the counters.

WE_OUT, TSB – DPRAM control signals.

DONE – ending signal; signal to the next stage to proceed.

72
Picture 38: Vector output

73
The VECT_OUT ASM states (state code BA):

- 00 WAIT: assert ADDR_CNT_EN, ADDR_CNT_LD and WORD_CNT_LD to initialize the address and word counters; a high TRG starts the vector output.
- 01 W: assert WORD_CNT_EN and BYTE_CNT_LD; count the word and restart the byte count.
- 11 B1: assert WE_OUT; put out a byte.
- 10 B2: assert ADDR_CNT_EN and BYTE_CNT_EN; increment the byte's address and count the byte in the word. If BYTE_TC is low (not finished with one word), return to B1; if WORD_TC is low (not finished with all the words), return to W; otherwise DONE.

Picture 39: VECT_OUT ASM

74
15. PARAMETERS OF THE FRONT-END FPGA

The component PAR_IO1 handles the parameter logistics. Currently, there are three byte-size parameters allocated to the front-end FPGA, starting at the DPRAM address 0x09:

- b/w threshold (0–255)

- one byte of bitwise parameters:

  o bit 0 – b/w inversion; zero stands for dark features, one for bright features
  o bit 1 – zoom level: zero for no zoom, one for 3X zoom

- mailbox, written to DPRAM on each frame:

  o bit 0 – zoom level indicator: zero for no zoom, one for 3X zoom

Parameters can be added as needed, by a fairly straightforward extension of this component.

Default values of the parameters are contained in the circuit, as byte-size constants. On the first high TRG after power-up, that is, on the first odd field, the default parameters are written to DPRAM at consecutive addresses. On subsequent TRGs, each parameter's address is presented on the bus ADDR_OUT, and the corresponding parameter selection signal goes high, for the duration of two clocks. This allows the current parameter values to be read from DPRAM and latched into parameter registers in the circuit. The mailbox parameter is written to DPRAM on each frame, to be read by the microprocessor.

The ColdFire program updates the parameters in the DPRAM during the IRQ5 handler, and reads the mailbox. This mechanism allows for changes in the parameters to be made at run time, e.g. by operator commands, as well as for the FPGA to send signals to the ColdFire. Currently, the mailbox is used to notify the ColdFire when the zoom command has been radioed to the front-end FPGA.
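As an illustration of this parameter traffic, consider the C sketch below. The threshold lives at 0x09 as stated above; placing the bitwise parameter and the mailbox at the next two consecutive bytes is an assumption (consistent with the memory map, which shows the motion vectors starting at 0x0C), and the function name is hypothetical.

#include <stdint.h>

#define DPRAM_RAM_BASE  0xFFA00000u
#define PAR_THRESHOLD   (*(volatile uint8_t *)(DPRAM_RAM_BASE + 0x09)) /* b/w threshold, 0-255 */
#define PAR_BITWISE     (*(volatile uint8_t *)(DPRAM_RAM_BASE + 0x0A)) /* assumed offset */
#define PAR_MAILBOX     (*(volatile uint8_t *)(DPRAM_RAM_BASE + 0x0B)) /* assumed offset */

#define BW_INVERT_BIT   0x01    /* 0: dark features, 1: bright features */
#define ZOOM_BIT        0x02    /* 0: no zoom, 1: 3X zoom               */

/* Called from the IRQ5 handler: push new parameter values to the FPGA
   and read back the zoom indicator from the mailbox. */
static int update_fpga_parameters(uint8_t threshold, int bright, int zoom)
{
    PAR_THRESHOLD = threshold;
    PAR_BITWISE   = (uint8_t)((bright ? BW_INVERT_BIT : 0) | (zoom ? ZOOM_BIT : 0));

    return (PAR_MAILBOX & 0x01) != 0;   /* 1 if the FPGA reports 3X zoom */
}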

The sequencer ASM for PAR_IO1 is shown in Picture 40. Its subcomponent, INIT_BOX, raises PAR_INIT once after the power-up, and its ASM is shown in Picture 41.

TRG – the trigger signal; high once per frame, at the start of the odd field.

TRG_MACH – derivative trigger, produced by the initialization component INIT_BOX.

ADDR_CNT_EN, ADDR_CNT_LD – signals that control the address counter.

PAR_INIT – high on the first occurrence of the TRG; passes the default parameter values onto the PAR_IO bus.

A_TSB, D_TSB, OE_OUT, WE_OUT – RAM control signals involved in parameter I/O to and from the DPRAM.

PAR_SEL0, PAR_SEL1, PAR_SEL2 – parameter selectors; enable latching of the corresponding parameter in the appropriate data register in the circuit.

PAR_IO – data bus which carries the parameters.

76
[ASM chart, state code CBA: from the WAIT state (000), TRG_MACH starts one pass through the states W1 (001), A1 (011), W2 (010), W3 (111), A3 (101) and A2 (110); in each W state one of PAR_SEL0/1/2 is asserted together with A_TSB, and the PAR_INIT branch selects between writing the default values to DPRAM (WE, D_TSB) and reading the current values back (OE). DONE is raised at the end of the pass.]

Picture 40: PAR_IO1 ASM

77
[ASM chart, state code BA: from the reset state SLEEP (00), the first TRG raises PAR_INIT (PAR_INIT_SET) and asserts TRIG_MACH to trigger one cycle of parameter updates; the machine then waits for the DONE signal announcing that the update is finished, lowers PAR_INIT (PAR_INIT_CLR), and waits for the next TRG (states WAKE 01, WAIT 10, NEXT 11).]

Picture 41: INIT_BOX ASM

78
Picture 42: FPGA parameters

79
16. THE IRQ5/PARALLEL PORT COMPLEX

A tightly coupled hardware/software subsystem, centered around the Interrupt 5 and the parallel port of the MCF5307, coordinates the back-end data flow in the vision system. Here we describe that subsystem.

16.1 IRQ5 handler

Early in the odd field, after the motion vector has been written to the DPRAM, the back-end FPGA generates an IRQ5, as a hardware signal to the MCF5307. When the processor enters the IRQ5 handler, the interrupt is acknowledged by a handshake on the parallel port, via two signals, ACK and RDY (bits 14 and 15).

The motion vector is read from DPRAM (it was written before the IRQ5, so there is no access conflict), and it is normalized by dividing by the black pixel count. This operation is performed here and not in the front-end FPGA, where the vector is generated, because of the long integer divisions.

Subject to some size restrictions, the vector is translated into increments in the servo cycle's pulse widths, and these increments are used to update the current pulse widths.

Notice that the desired position of the camera is always known to the vision system (in the form of calculated pulse widths), but that the actual position may not be known in real time.
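A condensed C sketch of this part of the handler is given below. The ticks-per-pixel constant is rounded from the value derived in Section 20; the clamp standing in for the "size restrictions" and the sign conventions are assumptions of the sketch, not the actual handler code.

#include <stdint.h>

#define TICKS_PER_PIXEL   28     /* see Section 20 (28.3 ticks/pixel, rounded) */
#define MAX_STEP_TICKS    200    /* assumed clamp on one update                */

static int16_t pulse_width_x, pulse_width_y;   /* current servo pulse widths, in ticks */

static int16_t clamp(int32_t v, int32_t lim)
{
    if (v >  lim) return (int16_t) lim;
    if (v < -lim) return (int16_t)-lim;
    return (int16_t)v;
}

/* Normalize the raw motion vector and fold it into the servo pulse widths. */
static void apply_motion_vector(int32_t x_mvect, int32_t y_mvect, int32_t norm)
{
    if (norm == 0)
        return;                              /* empty scene: nothing to track   */

    int32_t dx = x_mvect / norm;             /* long integer division, done here */
    int32_t dy = y_mvect / norm;             /* rather than on the FPGA          */

    pulse_width_x += clamp(dx * TICKS_PER_PIXEL, MAX_STEP_TICKS);
    pulse_width_y += clamp(dy * TICKS_PER_PIXEL, MAX_STEP_TICKS);
}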

80
16.2 Duty-cycle generator

The generator of the pulse modulated servo signals resides on the back-end FPGA. It receives the pulse widths from the IRQ5 handler, and produces the corresponding waveforms. Pulse widths are passed as 14-bit numbers on the parallel port, in a protocol synchronized by the ACK/RDY signals.

16.3 Servo motion feedback

When the pulse width changes, the servos start moving into the new position. The servo circuit, built around a PIC16F877 microprocessor, detects the pulse change and begins to monitor the angle of the servo shaft, as an analog signal. It asserts the servo-move signal, which remains high until the servos have stopped.

In this fashion, the instantaneous information about the camera position is decoupled from the vision system. The vision merely issues the desired position, and receives confirmation when that position is reached.

While the servo motors are moving to their new position, the IRQ5 is not being generated, since the motion detector would counteract the displacement motion, leading to unsteady movement of the camera.

16.4 Displacement vector

When it is not communicating with the IRQ5 handler, the duty-cycle generator waits for the displacement vector transfer, initiated by the MCF5307. The displacement vector is calculated by the Process 2, and when ready, it is transferred to the back-end FPGA in the same way as the motion vector.

In this transfer sequence, however, the servo-motion signal (the response to the new displacement vector) is passed back to the MCF5307, forcing the Process 2 into a busy wait until the servos have stopped moving. This might appear wasteful at first, but it is easy to see that Process 2 really must pause during the servo motion. The snapshot for feature recognition must not be taken until the camera has moved to the new location. Otherwise, the change in the image would not be registered, and the next cycle of feature recognition would end up working with stale data.

16.5 Saccadic blanking

On the face of it, this suppression of image sampling during camera motion resembles the phenomenon of saccadic blanking in human/animal vision [12]. It is well known that the sensitivity of the optic nerve is suppressed while a saccade (a rapid eye movement) is in progress, and it is plausible that the purpose of this suppression is to prevent visual confusion in biological systems as well.

Interestingly, there is some question whether the suppression of the optic nerve signal is triggered by the blurring of the retinal image or by a signal from tension sensors in the eye muscles [13]. In a robotic system it is much easier to detect servo motion than image blur, and the choice of mechanism is obvious.

82
16.6 Description of the IRQ/PP circuit on the back-end FPGA

The circuit design is contained in the schematic DUTY_CYC_PP. Control of the process is carried out by the ASM component IRQ_PP1, described in the previous section (see also the ASM chart, Picture 43, and the timing diagrams, Pictures 44 and 45).

The data path leads from the parallel port to two 14-bit registers, which hold the current pulse-width values. These values are in turn available to the square-pulse generators.

The pulse generator component, CYCLE_GEN, contains a fixed-value counter, which measures the 20 ms period of the servo's duty cycle, and a loadable counter, which measures the current pulse width. A simple two-state ASM switches between high and low signal levels.

The servo-move signal passes through the component DIP_FILTER, which eliminates the short (less than a CLK cycle) dips in the signal (noise).

ACK, RDY – handshake signals on the parallel port:

- ACK: pin 14, output
- RDY: pin 15, input

IRQTRG – signal from the front-end FPGA to start the IRQ5 communication at the start of the odd field (see Picture 18).

IRQ – hardware request for Interrupt 5 on the ColdFire.

SERVO_MOVE – the cleaned-up servo motion signal; also sent out to pin 0 on the parallel port.

PIN0_TSB – signal which reverses the sense of pin 0: output when high, input when low. Pins 1–13 are all input pins.

LATCH1, LATCH2 – register-enable signals to capture the pulse width values.

PWM1, PWM2 – generated servo duty waveforms.

84
[ASM chart, state code DCBA: from the wait state P2 (0000), a high IRQTRG starts the communication with the IRQ5 handler (states IRQ 0010, IDLE1 0110, P0 0111, P1 0011): the ASM asserts IRQ5, acknowledges the handler with ACK, latches the two pulse-width values into the data registers (LATCH1, LATCH2), and terminates the IRQ5 signal. A high RDY instead starts the communication with Process 2 (states P3 1000, TSB 1100, SVS 1110, SVM 1111, P4 1101, IDLE2 1001): the ASM latches the pulse widths into the data registers, opens PIN0_TSB, and polls the servo motion signal until it goes high and then low again, idling during that time.]

Picture 43: IRQ_PP1 ASM

85
[Timing diagram of the IRQ5 exchange, state sequence P2, IRQ, IDLE1, P0, P1, P2. FPGA side: IRQTRG goes high once per frame and starts the exchange while RDY = 0; IRQ5 is asserted, throwing the ColdFire into the IRQ5 handler routine; the ASM polls for RDY = 1 (the ColdFire acknowledges the IRQ5), drops IRQ5 and polls RDY = 0; it then latches value 1 on RDY = 1 and value 2 on RDY = 0 (LATCH1, LATCH2), generating ACK as the acknowledgment signal. Corresponding events on the MCF5307: enter the IRQ5 handler; assert RDY and poll ACK = 0; deassert RDY; assert RDY, put value 1 on the parallel port, poll ACK = 1; deassert RDY, put value 2 on the parallel port, poll ACK = 0; exit the handler.]

Picture 44: Communication with the IRQ5 handler

86
[Timing diagram of the displacement-vector transfer from Process 2, state sequence P2, P3, TSB, SVS, SVM, P4, IDLE2, P2: while IRQTRG = 0, the ColdFire puts the two pulse-width values on the parallel port under the RDY/ACK handshake (LATCH1, LATCH2); the FPGA then opens PIN0_TSB and reports the servo-move signal SM, and the ColdFire polls SM = 1 (servos are moving) and then SM = 0 (end of servo motion, usually a long interval) before a final RDY/ACK exchange ends the transfer.]

Notes: 1) PIN0_TSB reverses the sense of pin 0 on the parallel port. When low, pin 0 is the input bit zero; when high, pin 0 outputs the SM (servo move) signal to the ColdFire. 2) SM is the filtered and synchronized input from the camera servo board. When high, at least one servo motor is in motion.

Picture 45: Communication with Process 2

87
17. AUXILIARY FEATURES

The system has several auxiliary features, which do not pertain to its main function, but which are necessary for the deployment on a flying airplane.

17.1 Serial communication with the MCF5307

The ColdFire processor has two built-in UARTs, and the SBC5307 board has two serial ports. One port is configured and active, and the other can easily be made so as well. There is a small I/O utility library, which allows transmission of byte strings, and transmission/reception of individual bytes.

17.2 Diagnostic data logging

The lower bank of the SBC5307's DRAM is organized as a four-megabyte circular buffer, to be used for logging data during the system's operation. This buffer is initialized upon board reset, so the system should not be casually reset (or turned off!) before the diagnostic data are retrieved. A utility library allows for recording of bytes and strings of bytes, and for dumping the buffer's contents to the serial port. Data should be captured to a file using terminal software. Note: press the "D" key to start the data dump to the terminal.

17.3 Soft restart of the vision program

The non-maskable Interrupt 7 is used to implement a soft restart of the vision program on the MCF5307. When IRQ7 is received (as a hardware signal), the corresponding handler cleans up the cache and the parallel port, releases all heap allocations, resets the stack and returns to the main entry point. In this way, the vision algorithm can be pulled out of some confused state without resetting the entire ColdFire board.

17.4 Radio controls

The system contains a decoder (implemented on an XC9572 CPLD) which accepts four PWM signals from a radio receiver, and converts them into four logic signals. The decoder uses the flight control's power supply and receives no input from the rest of the system. Its purpose is to implement radio control over the following functions:

- power on/off: circuitry unrelated to the manual controls of the plane can be shut off from the ground in an emergency. An electronic switch is attached to the battery pack for that purpose.

- reset: the vision boards (and possibly other components, in the future) can be reset to a clean state in case of irreparable malfunction.

- soft restart: the vision system can be brought back to its clean state without reset and the consequent loss of diagnostic data. The FPGA part of the system is reset without harm. The vision can also be held in restart continuously, in which case the camera's gaze is fixed, the tracking is stopped and no data are logged. This is useful to keep the camera from turning around aimlessly during takeoff and landing.

- zoom: the zoom of the vision system can be turned on or off by the operator.

89
The radio decoder ASM states (state code BA):

- 00 WAIT: wait for the rise of the radio signal (CHNL); load the 1.5 ms counter (LD).
- 01 RUN: if the counter expires while the signal is still up (CHNL and TCN), decode to logic 1 (ONE); if the signal drops while the counter is still running, decode to logic 0 (ZERO).
- 11 WCH: wait for the signal to drop before resetting for the next cycle.

Picture 46: Radio decoder ASM (module CHANNEL)

17.4.1 Radio decoder's signals in the schematic CHANNEL

CHNL – pulse-width modulated signal from the radio receiver.

LD – synchronous load signal for the 1.5 ms counter.

TCN – the counter's expiration signal (terminal count).

ZERO/ONE – clear/set the logic output.
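In software terms, the decode rule amounts to comparing the pulse width against 1.5 ms. The C sketch below models the counter-and-ASM behavior one clock tick at a time; the clock rate (and therefore the tick count) is an assumption for illustration, not a statement about the CPLD design.

#include <stdbool.h>

#define TICKS_1_5_MS 3686   /* 1.5 ms, assuming a 2.4576 MHz clock */

/* Software model of the CHANNEL decoder: called once per clock tick with the
   current level of the radio signal; returns the decoded logic level. */
static bool decode_tick(bool chnl)
{
    static bool     prev = false, out = false;
    static unsigned count = 0;

    if (chnl && !prev)            /* rising edge: load the 1.5 ms counter      */
        count = TICKS_1_5_MS;
    else if (chnl && count > 0)   /* signal still up, counter running          */
        count--;

    if (chnl && count == 0)       /* counter expired while signal up: logic 1  */
        out = true;
    else if (!chnl && prev && count > 0)  /* signal dropped early: logic 0     */
        out = false;

    prev = chnl;
    return out;
}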

90
18. FEATURE RECOGNITION ON THE MCF5307 PROCESSOR

Functioning of the programs running on the ColdFire processor is best understood in terms of the time periods involved. One component, centered around the motion vector, runs in response to IRQ5, which is issued by the FPGA once per video frame. It has already been described earlier, in Section 16 on the IRQ5/parallel port subsystem.

The other component, feature recognition, runs in the background relative to the interrupts. It has no time requirements imposed on it by the input signal, only the general consideration that faster processing leads to better tracking. Here we describe that procedure.

The feature recognition consists of the following computations:

- the thresholded black-and-white image is scanned for edges. A pixel is defined as an edge point if it is black in color, and has between 2 and 7 black neighbors. The output of edge detection is another b/w image: edges are traced in black, and the redundant interior of the features has been removed.

- segmentation: edge points are logically grouped into connected threads or loops, and each connected component is assumed to represent a distinct feature (object) in the scene. The output is a collection of arrays, each containing the edge coordinates of one feature. We use a two-pass algorithm described by Lumia et al. [14] (Algorithm 3 in the reference). For each line, adjacent points in that and the previous line are labeled as belonging to the same component, and the labeling equivalences are resolved by means of a depth-first graph search. Here is a simple example: at a pixel X where two partially labeled edge fragments a and b meet, the equivalence of labels a and b will be recognized, and the changes in labeling carried out in the second (bottom-up) pass. The procedure works well for simple scenes, but the connectedness problem is combinatorial in nature, and the equivalence search will eventually get bogged down in complex images.

- insignificantly small features are removed, and an invariant signature is calculated, which is used for the actual recognition of the object in the scene. We use the second moments about the principal axes of inertia, which are invariant under rotation and translation and relatively easy to obtain. The calculation involves one square root per feature, but requires no adjacency information about the feature points. As a preliminary, this step also calculates the feature's center of mass (see the sketch after this list).

- target recognition: signatures of the objects in the scene are compared to those of a selected target feature. If the target is found, the displacement vector is the difference between its center of mass and the center point of the image. If the target is lost, an error is returned.
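The C sketch below illustrates two of these steps: the edge-point test (a black pixel with 2 to 7 black neighbors, here taken over the 8-connected neighborhood) and the center of mass and second moments about the principal axes. It is a simplified model for illustration, not the code from main.c; the closed-form eigenvalue expression for the 2x2 inertia matrix accounts for the one square root per feature mentioned above.

#include <math.h>

/* Edge test: black pixel with between 2 and 7 black neighbors. */
static int is_edge(const unsigned char *img, int w, int h, int x, int y)
{
    if (!(img[y * w + x] & 1))
        return 0;
    int nb = 0;
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++) {
            if (dx == 0 && dy == 0) continue;
            int xx = x + dx, yy = y + dy;
            if (xx >= 0 && xx < w && yy >= 0 && yy < h)
                nb += img[yy * w + xx] & 1;
        }
    return nb >= 2 && nb <= 7;
}

/* Invariant signature of one feature: center of mass and the second moments
   about the principal axes of inertia. */
static void feature_signature(const int *x, const int *y, int n,
                              double *cx, double *cy, double *m1, double *m2)
{
    double sx = 0, sy = 0;
    for (int i = 0; i < n; i++) { sx += x[i]; sy += y[i]; }
    *cx = sx / n;  *cy = sy / n;                 /* center of mass */

    double sxx = 0, syy = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        double dx = x[i] - *cx, dy = y[i] - *cy;
        sxx += dx * dx;  syy += dy * dy;  sxy += dx * dy;
    }
    sxx /= n;  syy /= n;  sxy /= n;

    /* Principal moments: eigenvalues of the 2x2 inertia matrix. */
    double mean = 0.5 * (sxx + syy);
    double diff = 0.5 * (sxx - syy);
    double r    = sqrt(diff * diff + sxy * sxy);
    *m1 = mean + r;
    *m2 = mean - r;
}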

92
18.1 Main data structures in the MCF5307 code

Digitized Image:

IMAGE_FRAME – starting address of the read buffer P2R on DPRAM
FRAME_WIDTH, FRAME_HEIGHT – image dimensions in pixels

Motion/Displacement Vectors:

XS_VECT_ADDR – DPRAM address of the vector components
X_MVECT, Y_MVECT – local copy of the motion vector
NORM_MVECT – normalization constant of the motion vector
X_DVECT, Y_DVECT – the displacement vector

Semaphores:

These variables carry messages between (and within) the two main processes on the MCF5307. They are integers, set to one and cleared to zero.

DISP_VECT_AVAIL – feature recognition announces that a new displacement vector is available
TARGET_AVAIL – the main process announces that a target feature has been selected for tracking

Post-Segmentation Description of Image Features:

Each individual connected edge is stored in a structure of the type Fpoint. The structure contains the point count, arrays of coordinate values, the center of mass and the second moments. These structures are allocated dynamically as the features emerge from image analysis.

compResult – a structure of the type ComponentList*, holds the overall result of the image segmentation. It contains the feature count, and an array of pointers to structures of type Fpoint, which contain the details of each feature.

targetFeature – the feature that is being tracked is stored in this Fpoint structure.

The Fpoint and ComponentList data types are specified in the header file FeaturePoints.h.
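For orientation, the two types might look roughly like the C sketch below. This is a reconstruction from the description above, not the actual contents of FeaturePoints.h; the field names are assumptions.

/* Hypothetical shape of the feature structures; the real definitions
   live in FeaturePoints.h. */
typedef struct {
    int     nPoints;            /* number of edge points in this feature     */
    int    *x, *y;              /* arrays of edge coordinates                */
    double  cmX, cmY;           /* center of mass                            */
    double  moment1, moment2;   /* second moments about the principal axes   */
} Fpoint;

typedef struct {
    int      nFeatures;         /* feature count                             */
    Fpoint **features;          /* one dynamically allocated Fpoint per feature */
} ComponentList;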

Parameters of the Servo Duty Cycles:

See the data section of the program module Servo.s for the description and current values.

94
19. INITIALIZATION OF THE SBC5307 BOARD

Pre-initialization conditions:

The entire code (init and functional, ca. 45K in size) resides in flash ROM, and the flash ROM is associated with the global chip select. The vector table and initialization code are linked to, and reside at, the zero address; the vector base register (VBR) points to zero. The functional code is linked to the upper bank of DRAM, and resides in the flash, just above the init code. See the linker script sbc5307vis.ld for the details of the linking procedure.

Initialization sequence:

- on the reset exception, the initial PC and SP are loaded from ROM, as long words at addresses zero and four. PC points to the start-up code in ROM.

- the ColdFire processor is configured: the cache is disabled and turned off; the SIM and the upper bank of DRAM are configured; chip selects are configured and the system is pulled out of the global chip select.

- the contents of the flash ROM are copied to upper DRAM, and VBR is set to the base of upper DRAM.

- program control jumps to the DRAM starting point.

- chip select zero is reconfigured to the top of the address space (to get the flash memory out of the way).

- the lower bank of DRAM is configured.

- control jumps to main and starts running functional code.

95
20. CHARACTERISTICS OF THE CAMERA/SERVO SYSTEM (GIMBAL)

A technical description of the camera is not available, but by measuring the screen image of an object of known size, we have determined the following:

- the angle corresponding to a one-pixel displacement (the camera ratio) is 0.62 degrees/pixel, at our given image resolution;

- the camera's field is ca. 60 degrees wide and ca. 47 degrees high.

The servo motors which move the camera have an angular motion range of ca. 90 degrees. Their duty cycle is 20 milliseconds, with the recommended high time ranging from 1 to 2 ms. These duty cycles are generated by the back-end FPGA.

The camera's platform is driven by a lever mechanism (see Picture 47), and its angular motion is not entirely linear relative to the motion of the servos. However, within the above high-time range the relationship is reasonably linear, and the platform covers an angle of ca. 46 degrees in each coordinate direction.

In the current circuit design, the 1 to 2 ms range corresponds to a range of 2048 clock ticks of high time, or 45.3 ticks per degree of platform motion. Combined with the above camera ratio, this yields 28.3 ticks per one pixel of displacement, the constant of proportionality between the motion/displacement vectors and the motor movement.
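As a quick consistency check of these constants (small differences from the quoted figures come from rounding the camera ratio and the platform range):

\frac{2048\ \text{ticks}}{45.3\ \text{ticks/degree}} \approx 45\ \text{degrees of platform motion},
\qquad
45.3\ \tfrac{\text{ticks}}{\text{degree}} \times 0.62\ \tfrac{\text{degrees}}{\text{pixel}} \approx 28\ \tfrac{\text{ticks}}{\text{pixel}}.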

96
Picture 47: View of the camera gimbal

97
Memory map of the ColdFire board:

    0x00        ROM / Low DRAM
    0x400000    High DRAM (code and data, heap, stack)
    0x800000    top of DRAM
    0x10000000  Config.
    0xFF800000  Semaphores
    0xFFA00000  DPRAM
    0xFFE00000  High ROM

FPGA/DPRAM memory map:

    0x00    Semaphores
    0x08    Status byte
    0x09    FPGA parameters
    0x0C    Motion vectors
    0x20    Buffer M0
    0x2710  Buffer M1
    0x4E20  Buffer M2

Picture 48: Memory maps
98
[Board layout: the two XC4010 FPGAs (back end and front end), the CY7C007AV DPRAM, the AT17LV devices (front and back), the 33.333 MHz and 2.4576 MHz oscillators, AU2904, LM317T regulators (3.3 V and 5 V), MAX6225ACPA, the XC9572 CPLD, the LM1881 sync separator, the AD876AR A/D converter, the RCA video jack, jumpers J1–J11, and the RST, SW/POWER, ZOOM, RESET and NMI switches.]

Picture 49: Layout of the FPGA board

99
[System overview: the battery pack, switch and power on/off logic supply +12 V and +7.5 V to the SBC5307, the FPGA board, the camera gimbal and the servo board; duty cycles and servo positions (and the servo-move signal) pass between the boards and the servo board; the video signal comes from the camera; the radio receiver and TV transmitter are on board, and a diagnostic serial connection is available.]

Picture 50: Airplane system's overview

100
REFERENCES

1. R. W. Rodieck, "The First Steps in Seeing," Sinauer, 1998.
2. David VandenBout, "The Practical Xilinx Designer Lab Book, Version 1.5," Prentice-Hall, 1999.
3. "ColdFire Microprocessor Family Programmer's Reference Manual," Motorola, 1997 (MCF5200PRM/AD).
4. Leon W. Couch, "Digital and Analog Communication Systems," third edition, MacMillan, 1990, Chapters 5-9.
5. Data sheet for the video sync separator LM1881, National Semiconductor, February 1995.
6. Data sheet for the A/D converter AD876, Analog Devices, December 1997.
7. Data sheet for the CY7CnnnAV Dual-Port Static RAM, Cypress Semiconductor, December 7, 2000.
8. "Understanding Asynchronous Dual-Port RAMs," technical note, Cypress Semiconductor, rev. November 7, 1997.
9. MCF5307 User's Manual, Motorola, 1998 (MCF5307UM/AD).
10. Nicolas Franceschini, "Early Processing of Colour and Motion in a Mosaic Visual System," Neuroscience Research, Supplement 2, pp. S17-S49, Elsevier, 1985.
11. Jean-Marc Pichon, Christian Blanes, Nicolas Franceschini, "Visual Guidance of a Mobile Robot," SPIE, v. 1195, 1989.
12. Brian A. Wandell, "Foundations of Vision," Sinauer, 1995.
13. S. Yu and T. S. Lee, "What Do V1 Neurons Tell Us about Saccadic Suppression?," Journal of Neural Computing, 2000.
14. Ronald Lumia, Linda Shapiro, Oscar Zuniga, "A New Connected Components Algorithm for Virtual Memory Computers," Computer Vision, Graphics and Image Processing, 22, 287-300 (1983).

101
CURRICULUM VITAE

Danko Antolovic

Born in 1955, in Zagreb, Croatia

Education:

• B.S. Chemistry, University of Zagreb, Croatia (1978)
• M.S. Chemistry, Johns Hopkins University, Baltimore, MD (1981)
• Ph.D. Chemistry, Johns Hopkins University, Baltimore, MD (1983)
• M.S. Computer Science, Indiana University, Bloomington, IN (2001)

Work experience:

• 25 years in systems analysis and computer software development
  Platforms: desktop, servers, mainframes and supercomputers
  Applications: Robotics, Computer Vision, Industrial Programming, Solar and Gas Combustion Modeling, Quantum Chemical Modeling, Molecular Modeling
• 10 years in scientific research
• 5 years in teaching at the undergraduate level
• 5 years in running a software development business

102
