4th International Conference on Advances in Control and Optimization of Dynamical Systems
February 1-5, 2016. NIT Tiruchirappalli, India

Available online at www.sciencedirect.com

ScienceDirect

IFAC-PapersOnLine 49-1 (2016) 407–412
A Brief Survey on Bio-inspired Algorithms for Autonomous Landing ⋆

Amritesh Maitra ∗ Sri Ram Prasath ∗∗ Radhakant Padhi ∗∗∗

∗ Aerospace Engineering Department, Indian Institute of Science
∗∗ Aerospace Engineering Department, Indian Institute of Science
∗∗∗ Aerospace Engineering Department, Indian Institute of Science,
Bangalore (e-mail: padhi@aero.iisc.ernet.in)
Abstract: The paper briefly introduces a few bio-inspired algorithms which can be applied to autonomous landing for Unmanned Aerial Vehicles (UAVs). A number of vision based data acquisition methodologies used for UAV navigation and guidance are described. Monocular camera based vision data can be used to extract a plethora of information about the dynamics of a target plane in landing problems and also about the pose of the camera itself. Some recent developments in the areas of autonomous landing and perching based on vision data from a monocular camera are introduced. Some possible relevant extensions are also provided.

© 2016, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
Keywords: Autonomous Vehicles, Trajectory Planning, Bio-Inspired Navigation and Guidance, Unmanned Aerial Vehicles
1. INTRODUCTION

Unmanned Aerial Vehicles are becoming ubiquitous for surveillance, reconnaissance missions, search and rescue etc. Research directions have been towards autonomous navigation and guidance of unmanned aerial vehicles (UAVs) for applications ranging from surveillance, rescue missions and inspection to transportation (Visiongain (2009)). Manual control of UAVs has been quite in vogue in military and civil applications. Normally, expensive sensors are required for sensing the environment, and control computation is largely off-line. However, autonomy is a vital requirement for UAV operations. Moreover, large scale deployment calls for inexpensive, less sophisticated sensory and processing units. It is to be noted that successful autonomous operation includes autonomous take-off and landing. Surveillance requires "perch-and-stare" functionality for intelligence gathering at a strategic location and also for recharging. One of the latest trends in autonomous navigation of UAVs has been to look towards nature, the birds and the insects, for inspiration. All established control and guidance methodologies for UAV manoeuvring require accurate target data and UAV states. But birds and insects do not make extrinsic measurements as is possible with sophisticated sensors. Yet they are able to execute complex manoeuvres while taking off, landing, pursuing a prey and evading a predator, even under the presence of uncertainties.

2. BIO-INSPIRED GUIDANCE

Ecological science researchers provide results on the flying, hunting and evasion strategies of birds. A preliminary step will be to implement these strategies in UAV engagement scenarios to obtain similar capabilities.

Pursuit Methodologies: The following methodologies are used by birds and insects in nature to capture their prey:

2.1 Classical Pursuit (CP)

In this strategy the predator aligns its velocity vector Vp towards the prey position, and continues to change the magnitude of its velocity towards the target to achieve interception (Nahin (2012)). This law is akin to Velocity Pursuit in missile guidance. Although this strategy leads to an optimum trajectory between the predator and the prey, update of the velocity vector in classical pursuit is not always achieved, so this strategy is limited to stationary targets only. See Fig. 1.

Fig. 1. Predator velocity vector directed towards prey in Classical Pursuit

⋆ The work is supported under IISc-DRDO collaborative project TR-DRDO-PAME-2015-05. The authors acknowledge Mr Ashutosh Simha, SERC, IISc for technical inputs and DRDO-FIST for the infrastructure provided.

2405-8963 © 2016, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
Peer review under responsibility of International Federation of Automatic Control.
10.1016/j.ifacol.2016.03.088

2.2 Constant Interception Angle

This strategy is equivalent to the deviated pursuit approach of missile guidance, where a trajectory is predicted based on
the velocity vector of the prey and predator (equivalent to target and projectile respectively). The velocity vector of the predator subtends a constant angle (say, δ) with the line-of-sight (LOS) between the predator and the prey. See Fig. 2 (Nahin (2012), Collett and Land (1978)). The limitation of the strategy is that if the prey changes its initial heading after engagement starts, effective interception may not take place.

In realistic situations, birds and other predators in nature take a finite amount of time in sensing their prey, control formulation and actuation. So the predator includes a finite deviation in its trajectory, resulting in a curved path (Shima (2007)). Hence this is not exactly the deviated pursuit strategy of missile guidance.
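Classical pursuit (Section 2.1) and the constant interception angle law differ only in the lead angle δ: with δ = 0 the predator steers straight at the prey, while δ ≠ 0 leads the LOS by a fixed amount. A minimal planar sketch, with speeds, geometry and the capture radius chosen purely for illustration (not from the paper):

```python
import math

def pursue(delta_deg=0.0, steps=4000, dt=0.01):
    """Planar pursuit: the predator's velocity leads the line-of-sight
    by a fixed angle delta (delta = 0 -> classical pursuit,
    delta != 0 -> constant interception angle / deviated pursuit)."""
    delta = math.radians(delta_deg)
    px, py = 0.0, 0.0            # predator position
    qx, qy = 5.0, 5.0            # prey position
    vp, vq = 2.0, 1.0            # illustrative speeds (predator is faster)
    for _ in range(steps):
        los = math.atan2(qy - py, qx - px)   # LOS angle to the prey
        hdg = los + delta                    # lead the LOS by delta
        px += vp * math.cos(hdg) * dt
        py += vp * math.sin(hdg) * dt
        qx += vq * dt                        # prey flies straight along +x
        if math.hypot(qx - px, qy - py) < 0.05:
            return True                      # captured
    return False

print(pursue(0.0))    # True: classical pursuit intercepts a slower prey
print(pursue(15.0))   # True: a modest lead angle also intercepts
```

Because the predator is faster, the range rate stays negative for both settings, so both runs end in capture; the lead angle only bends the resulting trajectory.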
Fig. 2. Predator velocity vector directed ahead of LOS to prey in Constant Interception Angle

2.3 Constant Absolute Interception Angle

This strategy is mostly equivalent to the constant interception angle strategy, in the sense that the velocity vector of the predator is adjusted throughout the engagement to maintain a constant LOS angle θ (Olberg (2012)). When the prey changes its direction intermittently, the predator also changes its velocity vector such that it continues to orient itself ahead of the LOS, achieving similar results to the Constant Interception Angle strategy (Ghose and Moss (2006), Reddy and Krishnaprasad (2006)).

Fig. 3 shows that initially the predator is heading at an angle δ1 ahead of the prey and the corresponding LOS makes an angle θ with the z-axis. With the prey continuously manoeuvring in the x-y plane, the predator changes its heading accordingly by keeping θ constant. When the target is stationary (Vt = 0), this approach is basically equivalent to classical pursuit.

Fig. 3. Constant absolute LOS angle maintained in Constant Absolute Interception Angle

Recent studies on goshawks revealed that most of the interception path taken by the goshawk follows this approach; however, at the final phase of the capture they approach the target along what appears to be a different trajectory (Suzanne Amador Kane and Rosenthal (2015)).

2.4 Visual Constraint Approach

In predators which have visual constraints, like falcons, head movement at the time of approach to the target induces drag. Hence falcons fly in a logarithmic spiral path to intercept the target while maintaining the target in their visual range. See Fig. 4 (Tucker and Enderson (2000)). This situation is typical of seeker missiles where, due to the limited sensing range of the seeker, the target has to remain within that range. This strategy can be employed in UAV landing or docking scenarios where the target is moving.

Fig. 4. Interception based on logarithmic spiral path under constraint on field-of-view

All the strategies mentioned above ensure successful interception but do not explicitly ensure the safety of the agents involved. UAV landing should ideally be a soft landing, unlike a missile impact. This vital requirement is not typically met by the above methods. To ensure "soft" interception, terminal velocities and accelerations should be as low as possible, which may be ensured by the following strategies.

2.5 Height Descent Approach

This approach was inspired by studies on the flight trajectories of honeybees. It is surmised that the velocity of the predator directed towards land is approximately proportional to the height of its current position (Baird E (2005)). The honeybees' visual cue gives the angular velocity of the image. If a constant angular rate is maintained, then in order to achieve a smooth landing, the forward velocity Vfb and the descent velocity Vdb should be adjusted based on the height. See Fig. 5. The angle at which the descent should approach is

ϕ = tan⁻¹(Vdb/Vfb)    (1)
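Holding both speed components proportional to the current height, as the honeybee strategy suggests, keeps the descent angle ϕ of Eq. (1) constant and drives both speeds smoothly to zero at touchdown. A small sketch; the gains k_f and k_d are illustrative assumptions, not values from the paper:

```python
import math

def height_descent(h0=10.0, k_f=0.5, k_d=0.25, dt=0.01, steps=2000):
    """Honeybee-style descent: forward speed Vf and descent speed Vd are
    held proportional to the current height h, so the descent angle
    phi = atan(Vd/Vf) of Eq. (1) stays constant and touchdown is soft."""
    h, x = h0, 0.0
    phis = []
    for _ in range(steps):
        vf, vd = k_f * h, k_d * h          # both speeds shrink with height
        phis.append(math.atan2(vd, vf))    # descent angle, Eq. (1)
        x += vf * dt
        h -= vd * dt
    return h, min(phis), max(phis)

h_end, phi_min, phi_max = height_descent()
print(h_end < 0.1)               # True: height has decayed close to zero
print(phi_max - phi_min < 1e-9)  # True: descent angle stayed constant
```

The height decays exponentially, so forward and descent velocities vanish together at the ground, which is exactly the "soft" terminal condition the section asks for.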

Fig. 5. Landing by constant angular velocity of image in Height Descent Approach

2.6 General Tau theory

General Tau theory was proposed by David N Lee in the eighties to analyse braking control in cars by the human visual system, based on time-to-collision. Time-to-collision (also known as time-to-contact) (Hecht and Savelsbergh (2004)) can be defined as the time remaining for two systems to collide with one another. The general tau theory focusses on closing the motion gap between two systems, which can be a distance, force, angle etc. See Fig. 6. Tau (τ) is defined as the ratio of the motion gap χ(t) to the rate at which the motion gap closes (Lee (2009)):

τ(t) = χ(t)/χ̇(t)    (2)

General Tau theory states that for successful gap closing, the rate of change of τ should remain constant, i.e., τ̇ = k. The degree of collision is defined by the value of k.

The tgo of missile guidance stems from the constant velocity consideration throughout the engagement, resulting in interception at constant velocity, which is equivalent to a crashing scenario in UAV landing. It can be noted that tau theory is equivalent to velocity pursuit in missile guidance; however, in tau theory the degree of collision can be adjusted by using the constant k (Lee (2009)).

General Tau theory (Lee (2009)) implies that convergence of the motion gap occurs depending on the initial motion gap, the initial magnitude of the rate of closure and the constant k. Under these conditions, the direction of the velocity vector is not specified. Moreover, the τ formulation will be undefined if the predator is at rest. However, this limitation is overcome by using τ coupling.

Fig. 6. Tau based approach

τ coupling requires that the closure of two motion gaps should occur simultaneously. It is defined by

τx = kx,y τy    (3)

where k is a constant, x refers to motion gap 1 and y refers to motion gap 2. In intrinsic coupling, the closure of the distance motion gap is coupled with an intrinsic variable associated with the predator, say the kinematics of gravity. However, in extrinsic coupling the variables can be sensed by the predator, such as distance coupled with angle, force coupled with angle etc. For instance, in aerospace applications, the angle of approach of the aircraft plays a vital role in the landing scenario, such as vertical take-off and landing in rotary wing UAVs (VTOL) and conventional take-off and landing in fixed wing UAVs (CTOL). See Fig. 6.

2.7 Summary

Table 1. Usage of Methodologies

Biological Beings               Methodologies
Bees, flies, bats, beetles      Classical Pursuit
Dragonflies, bats, humans       Constant Absolute Interception Angle
Falcons, eagles                 Visual Constraint Approach
Pigeons, hummingbirds, humans   Tau theory based approach
Honeybees                       Height Descent Approach

3. RELEVANT NAVIGATION AND GUIDANCE METHODOLOGIES FOR BIO-INSPIRED LANDING

As mentioned in the previous section, for insects like honeybees and birds like hummingbirds, the landing manoeuvre depends on the time-to-contact (TTC), rather than the velocity or distance to the target. It is to be noted that birds/insects probably do not have perceptible mechanisms to explicitly calculate the action gap or the velocity of closure of the action gap. However, they can estimate the TTC value, which is a function of image dilation (Lee (2009)).

Srinivasan noted that while passing through narrow gaps, bees pass precisely through the centre of the hole despite being equipped with monocular vision only, hence no depth perception (Mandyam V Srinivasan (2012)). The author conjectured that bees perform this action by following a simple strategy of balancing the speeds of motion of the two images of the two edges of the gap. If one edge appears to be moving faster than the other, then the bee is moving towards the former edge, and it changes its direction till equilibrium between the two speeds is achieved (Srinivasan (2011b)). The concept of ventral optic flow is based on identical considerations. The angular velocity of the image, measured in degrees per second, is defined as the rate of change of the angle produced by the point in the image with the horizontal (Srinivasan (2011b)). By this formulation, the angular velocity of the object will be proportional to the flight speed and inversely proportional to the distance between them.

Moreover, it implies that the image produced in the eye expands when an object is approached; hence the rate of expansion increases as the distance to the object decreases. It has been experimentally found that bees use visual odometry, integrating the optic flow over time to navigate back to their hives (Srinivasan (2011a)).
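The edge-speed balancing conjectured by Srinivasan can be written as a one-line feedback rule: steer towards the side whose image appears to move more slowly. Here the image angular speed of each wall is modelled simply as flight speed divided by lateral distance, an assumption made purely for illustration:

```python
def centre_in_gap(y0=0.2, gap=1.0, v=1.0, k=0.05, steps=200):
    """Bee-inspired centring: y is the lateral offset in a corridor of
    width `gap`. The perceived angular speed of each wall scales as
    v / distance, so steering on the left-right difference drives y -> 0."""
    y = y0
    for _ in range(steps):
        left, right = gap / 2 + y, gap / 2 - y   # distances to the walls
        omega_l, omega_r = v / left, v / right   # modelled image speeds
        y += k * (omega_l - omega_r)             # steer to the slower side
    return y

print(abs(centre_in_gap()) < 1e-3)  # True: the agent ends up centred
```

Near the centre the update contracts the offset by a fixed factor each step, so the rule converges from either side of the corridor without any depth measurement.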
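The constant-τ̇ closure law of Section 2.6 is easy to probe numerically: integrating χ̇ = χ/τ with τ(t) = τ0 + kt shows the soft contact obtained for small k. All numbers below are illustrative, not from the paper:

```python
def tau_closure(k, chi0=10.0, chidot0=-2.0, dt=1e-4):
    """Close a motion gap chi under the constant tau-dot law: with
    tau = chi/chidot and tau-dot = k, tau(t) = tau0 + k*t and hence
    chidot = chi / tau. Small k gives a soft contact (rate dies out),
    larger k a harder one."""
    tau0 = chi0 / chidot0            # negative while the gap is closing
    chi, t, chidot = chi0, 0.0, chidot0
    while chi > 0.01 and tau0 + k * t < 0.0:
        chidot = chi / (tau0 + k * t)   # enforce the tau law
        chi += chidot * dt
        t += dt
    return abs(chidot)               # closure rate at (near-)contact

print(tau_closure(0.2) < 0.05)   # True: soft, the rate nearly vanishes
print(tau_closure(0.8) > 0.2)    # True: harder contact at the same gap
```

Along this law χ ∝ τ^(1/k), so the residual closure rate at a given small gap grows with k, which is the "degree of collision" the text attributes to the constant k.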

The concept of measuring the optical flow of the two edges and comparing them can achieve landing and manoeuvring in UAVs. Several optical flow measurement methodologies for different scenarios have been proposed by researchers for computer vision applications. Some relevant ones applied to UAV landing are described below.

4. VISION-BASED DATA ACQUISITION USING OPTIC FLOW

Vision based data acquisition requires projection of the 3D relative velocity vectors (between camera and target) onto the 2D camera plane. The motion field provides the 3D relative velocity information from dynamical information of the target point (which is a part of the desired landing site in our case). This is reflected in the change of brightness patterns of the corresponding image points in the 2D image plane. In most cases, optic flow provides a faithful approximation of relative motion.

4.1 Pose Estimation for UAV Navigation and Guidance

Proper estimation of the pose of the quadrotor is essential, as it is used in the computation of control for tracking a trajectory and also for landing. Metric pose estimation in aerial robots or UAVs has variously been carried out by monocular vision, stereo vision, and sensor data fusion, besides conventional distance measuring instruments. A laser scanner provides a more accurate estimate of distance or altitude than vision based systems. However, camera based vision systems are capable of extracting a wide range of data not possible with laser scanners, viz. 6-DOF pose estimation, slope of the image plane, metric distance measurement etc.

4.2 Vision Based Algorithm for Landing Pad Recognition

Shaowu Yang (2013) introduced a monocular vision based system which computes the 5-DOF pose of the quadrotor from the elliptic projection of a circular landing pad (encompassing a letter "H") by using projective geometry. The landing pad, consisting of an "H" letter, is identified by successive binarization of the camera image, finding connected components and classifying the connected components using an artificial neural network, which classifies each connected component as circle, letter H or other. The 5-DOF pose estimation is subsequently accomplished from the elliptic projection of the circular contour of the landing site.

Each letter H which is surrounded by a circle is considered to be a part of the landing pad. Once the landing pad is satisfactorily identified, it is processed further for metric pose estimation. Now comes the concern of projection of the image point through the lens (of the camera). It is to be noted that the perspective projection of a circle, when viewed from a wide range of perspective views, is an ellipse. Canny edge detection is performed on the gray scale image pattern. Subsequently, ellipse fitting along the edge contours is done by the direct least square fitting algorithm.

Let us define the augmented vector X = [x, y, 1]T, where (x, y) are the coordinates on the target plane, and express the ellipse equation in compact form as

X^T [A B D; B C E; D E F] X = 0    (4)

If f is the focal length of the camera, then the image plane can be defined to be at z = f. An oblique elliptical cone of the form P = k(x, y, f)T will be subtended by the bundle of straight lines passing through the optical centre; k is the scale factor describing the distance from the origin to P. The oblique elliptical cone is defined by

P^T Q P = 0    (5)

where Q = [A B D/f; B C E/f; D/f E/f F/f²]

The 5-DOF pose information will follow from Q. The unit vector n of the ZW axis and the origin tCW of the world frame described in the camera frame are calculated as

n = S1 √((λ2 − λ1)/(λ2 − λ3)) u2 + S2 √((λ1 − λ3)/(λ2 − λ3)) u3    (6)

tCW = z0 (S1 λ3 √((λ2 − λ1)/(λ2 − λ3)) u2 + S2 λ2 √((λ1 − λ3)/(λ2 − λ3)) u3)    (7)

Here r is the radius of the circle which is projected as the ellipse, λ1, λ2 and λ3 are the eigenvalues of Q, while u2 and u3 are the eigenvectors corresponding to λ2 and λ3 respectively. Here z0 = S3 r/√(−λ2 λ3), and the Si are undetermined signs. The yaw angle (ψ) is evaluated from the orientation of the major axis of the ellipse fitted around the letter "H". The ellipse fitting measure, along with the yaw angle estimation on suitable rotational transformation, gives the 6-DOF pose estimation.

4.3 Vision Based Motion Estimation

Egomotion is defined as the motion of the camera, which has 6 degrees-of-freedom. The associated visual motion parameters are estimated from optic flow. Neglecting the rotational components of the optic flow, the translational components of the optic flow uI, vI are expressed as (Mohamad T Alkowatly (2015))

uI = −f u/h + x w/h,    vI = −f v/h + y w/h    (8)

where f is the focal length, h is the depth of the image, u, v, w are the translational velocities of the visual system, and x, y are the projections of the world co-ordinates on the image plane. Least squares regression is used to form the optic flow estimates at multiple points, which can be defined by
The authors also provided a look-up table based method uI = b1 + b2 x, vI = b3 + b2 y (9)
to account for lens distortion.
Finally the visual parameters can be found by using
5-DOF pose estimation from the fitted ellipse includes 3
u bˆ2 v bˆ3 w
position co-ordinates of the quadrotor with respect to the =− , =− , = bˆ2 (10)
world frame, along with roll (φ) and pitch angles (θ). h f h f h
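The regression of Eq. (9) and the recovery of Eq. (10) can be sketched numerically. The helper below is a minimal illustration, not the authors' implementation: the function name and the synthetic flow samples are ours, and the flow measurements are assumed already free of rotational components, as in Eq. (8).

```python
import numpy as np

def visual_egomotion(x, y, uI, vI, f):
    """Fit uI = b1 + b2*x and vI = b3 + b2*y jointly by least squares
    (Eq. 9) and recover the depth-scaled velocities of Eq. (10)."""
    n = len(x)
    # One linear system A @ [b1, b2, b3] = rhs; b2 is shared by both
    # flow components, so the two regressions are solved jointly.
    A = np.zeros((2 * n, 3))
    A[:n, 0] = 1.0   # b1 (uI intercept)
    A[:n, 1] = x     # b2 multiplies x in the uI rows
    A[n:, 1] = y     # b2 multiplies y in the vI rows
    A[n:, 2] = 1.0   # b3 (vI intercept)
    rhs = np.concatenate([uI, vI])
    b1, b2, b3 = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return -b1 / f, -b3 / f, b2   # u/h, v/h, w/h  (Eq. 10)

# Synthetic check against Eq. (8) with u/h = 0.2, v/h = -0.1, w/h = 0.5
rng = np.random.default_rng(0)
f = 1.0
x, y = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
uI = -f * 0.2 + x * 0.5
vI = -f * (-0.1) + y * 0.5
u_h, v_h, w_h = visual_egomotion(x, y, uI, vI, f)
print(round(u_h, 6), round(v_h, 6), round(w_h, 6))  # 0.2 -0.1 0.5
```

Because the slope b₂ (= w/h) appears in both flow components, fitting the two regressions jointly, rather than separately, keeps the estimate of w/h consistent.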
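Returning to the landing-pad pose recovery of Section 4.2, the eigen-decomposition behind Eqs. (6)–(7) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: the function name is ours, the sign ambiguities S₁–S₃ are left to the caller, and the eigenvalue convention λ₃ < 0 ≤ λ₁ ≤ λ₂ (a valid oblique cone) is assumed.

```python
import numpy as np

def pose_from_cone(Q, r, S1=1.0, S2=1.0, S3=1.0):
    """Recover the pad-plane normal n (Eq. 6) and the pad centre t (Eq. 7)
    from the oblique-cone matrix Q of Eq. (5), for a circle of radius r.
    S1..S3 are the undetermined signs, to be fixed by external constraints
    (e.g. the camera must face the landing pad)."""
    lam, U = np.linalg.eigh(Q)      # eigenvalues in ascending order
    lam3, lam1, lam2 = lam          # convention: lam3 < 0 <= lam1 <= lam2
    u3, u2 = U[:, 0], U[:, 2]       # eigenvectors for lam3 and lam2
    a = np.sqrt((lam2 - lam1) / (lam2 - lam3))
    b = np.sqrt((lam1 - lam3) / (lam2 - lam3))
    n = S1 * a * u2 + S2 * b * u3                       # Eq. (6)
    z0 = S3 * r / np.sqrt(-lam2 * lam3)
    t = z0 * (S1 * lam3 * a * u2 + S2 * lam2 * b * u3)  # Eq. (7)
    return n, t

# Sanity check on an arbitrary symmetric matrix with cone signature (+, +, -)
O, _ = np.linalg.qr(np.random.default_rng(1).normal(size=(3, 3)))
Q = O @ np.diag([2.0, 1.0, -3.0]) @ O.T
n, t = pose_from_cone(Q, r=0.5)
print(round(float(np.linalg.norm(n)), 6))  # 1.0 (a^2 + b^2 = 1, u2 and u3 orthonormal)
```

Note that n comes out as a unit vector by construction, since the two radicands in Eq. (6) sum to one and u₂, u₃ are orthonormal eigenvectors of the symmetric matrix Q.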
4.4 Visual Servo Control: Optic Flow Approach

Mahony et al. (R Mahony (2008), Bruno Herisse (2012)) have produced a rich body of research on image-based servo control. In their approach, image-plane kinematics and dynamics are derived from the coordinates of target points (which may be part of the desired landing site). The 2D pixel locations of the images of the points are calculated and projected on a spherical image plane. The target point image kinematics includes information on the kinematics of the UAV itself, as well as the distance of the target surface from the origin of the body frame. The dynamics of the image feature (the average landmark vector) is, in turn, a function of the image and UAV kinematics. The visual servo control task is to track a desired image feature (corresponding to a desired pose). Translational optic flow (the ratio of the UAV velocity in the body-fixed frame to the height of the UAV) is the visual velocity measure which serves as the control input, while the difference between the current and desired image features serves as the error signal. The authors formulated a Lyapunov-stable control law for tracking the pose of the UAV while manoeuvring, which can be extended to landing problems as well.

5. BIO-INSPIRED LANDING AND PERCHING

Cai Luo (2014) et al. designed a UAV landing mechanism using τ-based backstepping control. They extracted pose estimation data and a 3D ground map by a VSLAM approach called Parallel Tracking And Mapping (PTAM). From the 3D map, a safe landing area was identified via the Recursive Multi-Frame Planar Parallax (RMFPP) algorithm. Once the UAV reaches a particular altitude above the landing site, it employs a tau theory based control algorithm for soft landing. Pu Xie (2013) et al. conducted detailed studies of tau coupling and intrinsic tau gravity guidance. It was demonstrated that tau coupling can result in an approach trajectory for fixed-wing UAVs which can close the gap in attitude that is vital for safe landing. Mohamad T Alkowatly (2015) implemented a near-ground landing strategy based on tau theory using only visual information from a monocular camera and an inertial measurement unit. Dario Izzo (2012) proposed a combined ventral optic flow and variable time-to-contact based model to yield a smooth descent profile for an autonomous spacecraft using only visual cues. Although this method increased fuel expenditure in the spacecraft landing scenario, the UAV extension may be inviting, as fuel considerations are not essential in landing. Farid Kendoul (2012) propounded a hybrid τ-controller and nonlinear ratio τ-control law for 4D docking and landing of a rotorcraft on a moving target.

Justin Thomas and Kumar (2014), Justin thomas (2013) recently developed a methodology for dynamic grasping on a cylindrical perch inspired by eagle grasping. They framed the grasping problem in the sagittal plane, so only two image features (the points of contact of the two tangents drawn from the image to the cylinder) are necessary to establish a relationship between the vision system and the pose of the robot. The authors previously established that the norm of the input to the quadrotor can be minimized by minimizing the snap of the trajectory (Daniel Mellinger (2011), Daniel Mellinger (2012)). Successful grasping by the quadrotor-grasper ensemble entails that the gripper be oriented vertically. So the required trajectory is obtained by minimizing a functional of the snap of the quadrotor subject to initial, pickup and goal position constraints. The velocity and higher derivatives of the trajectory are zero at the desired initial and goal locations, while they are free and required to be continuous at the pick-up point.

Perching of UAVs on lines or thin tubes has been taken up by some research groups: Courtney E. Doyle and Minor (2013), Mohta et al. (2014). It is to be noted that significant efforts have been made towards developing avian-inspired grasping mechanisms, e.g. Cai Luo (2014), Courtney E. Doyle and Minor (2013), and towards the relevant aerodynamic aspects (Rick Cory (2008)). On-board estimation of a perching line based on vision and Inertial Measurement Unit (IMU) data only has been formulated by Mohta et al. (2014). The image optical flow is shown to be related to the camera velocity by a scale factor which is evaluated by continuous homography.

6. CONCLUSION

In conclusion, a wide array of information can be obtained from various monocular camera based vision data acquisition systems: pose estimation of the UAV with respect to an image point, altitude estimation, information about the normal to the image plane (which basically defines the plane), time-to-contact estimation, and recognition of the landing site based on certain optical cues, among others.

An analytic framework to obtain smooth trajectories for quadrotors, with continuous resultant velocity, acceleration, jerk and snap profiles at points of interest, exists in Matthew Turpin (2012), Daniel Mellinger (2011). If the bio-inspired guidance laws can be mathematically manipulated to yield smooth snap trajectories, then the desirable attributes of both analytical and bio-inspired concepts can be fused. Optical flow based methods, especially the ventral flow based approaches, depend on intrinsic sensing only, whereas the snap minimization methods are typically dependent on extrinsic measurements. An initial attempt has been pointed out (Justin thomas (2013)). If this conflict is resolved by proper formulation, elegant landing methods can be generated. Fusion of IMU and visual information, which typically evolve at different operating frequencies, will make hierarchical control possible (Mohamad T Alkowatly (2015)).

Another largely unexplored area is hybrid bio-inspired algorithms, which may be obtained by fusing valuable attributes of different approaches. While searching for a suitable landing site or a moving target, the strategy should focus on optimal coverage by the cameras and other sensors, while after locking on to a target, the UAV should focus on smooth landing. More efficient estimation algorithms are also to be developed, so that the different metric distances and relative velocities do not drift away with time while tracking. A pivotal aspect of autonomous navigation and guidance is to identify the proper cues or triggers for initiating a particular manoeuvre. There is much scope for bio-mimicry (Srinivasan (2011a)) in identifying these cues. So continued synergy between biology (especially neurobiology and psychology), guidance theory and computer vision science is poised to solve and propose many sophisticated problems in aerial robotics, especially landing.

REFERENCES

Baird E, Srinivasan MV, Z.S.C.A. (2005). Visual control of flight speed in honeybees. Journal of Experimental Biology, 208, 3895–3905.
Bruno Herisse, Tarek Hamel, R.M.F.X.R. (2012). Landing a VTOL unmanned aerial vehicle on a moving platform using optical flow. IEEE Transactions on Robotics, 28(1).
Cai Luo, Xiu Li, Y.L.Q.D. (2014). Biomimetic design for unmanned aerial vehicle safe landing in hazardous terrain. IEEE/ASME Transactions on Mechatronics. doi:10.1109/TMECH.2015.2438328.
Collett, T.S. and Land, M.F. (1978). How hoverflies compute interception courses. Journal of Comparative Physiology, 125, 191–204.
Courtney E. Doyle, Justin J. Bird, T.A.I.J.C.K.D.F.B.D.J.D.R.J.K.J.J.A. and Minor, M.A. (2013). An avian-inspired passive mechanism for quadrotor perching. IEEE/ASME Transactions on Mechatronics, 18(2), 506–517.
Daniel Mellinger, V.K. (2011). Minimum snap trajectory generation and control of quadrotors. In 2011 IEEE International Conference on Robotics and Automation.
Daniel Mellinger, Nathan Michael, V.K. (2012). Trajectory generation and control for precise aggressive maneuvers with quadrotors. The International Journal of Robotics Research, 31(5), 664–674. doi:10.1177/0278364911434236.
Dario Izzo, G.d.C. (2012). Landing with time-to-contact and ventral optic flow estimates. Journal of Guidance, Control, and Dynamics, 35(4), 1362–1367. doi:10.2514/1.56598.
Farid Kendoul, B.A. (2012). Bio-inspired taupilot for automated aerial 4D docking and landing of unmanned aircraft systems. In 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems.
Ghose, K., H.T.K.K.P.S. and Moss, C.F. (2006). Echolocating bats use a nearly time-optimal strategy to intercept prey. PLoS Biology, 4, e108.
Hecht, H. and Savelsbergh, G. (2004). Time-to-contact. Eds. Elsevier.
Justin Thomas, Giuseppe Loianno, K.S. and Kumar, V. (2014). Toward image based visual servoing for aerial grasping and perching. In 2014 IEEE International Conference on Robotics & Automation (ICRA), 2113–2118.
Justin thomas, Joe Polin, K.S.V.K. (2013). Avian inspired grasping for quadrotor micro UAVs. In Proceedings of the ASME 2013 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference (IDETC/CIE 2013).
Lee, D. (2009). General tau theory: evolution to date. Perception, 38, 837–858.
Mandyam V Srinivasan, Richard JD Moore, S.T.D.S.D.b. (2012). Frontiers in Sensing, chapter From biology to engineering: insect vision and applications to robotics, 19–39. Springer Verlag, Wien.
Matthew Turpin, Nathan Michael, V.K. (2012). Trajectory design and control for aggressive formation flight with quadrotors. Autonomous Robots, 33, 143–156. doi:10.1007/s10514-012-9279-y.
Mohamad T Alkowatly, Victor M Becerra, W.H. (2015). Bioinspired autonomous visual vertical control of a quadrotor unmanned aerial vehicle. Journal of Guidance, Control, and Dynamics, 38(2), 249–262. doi:10.2514/1.G000634.
Mohta, K., Kumar, V., and Daniilidis, K. (2014). Vision-based control of a quadrotor for perching on lines. In 2014 IEEE International Conference on Robotics and Automation (ICRA), 3130–3136. IEEE, Hong Kong, China. doi:10.1109/ICRA.2014.6907309.
Nahin, P.J. (2012). Chases and Escapes. Princeton, NJ: Princeton University Press.
Olberg, R.M. (2012). Visual control of prey-capture flight in dragonflies. Current Opinion in Neurobiology, 22, 267–271.
Pu Xie, Ou Ma, Z.Z. (2013). A bio-inspired approach for UAV landing and perching. In Guidance, Navigation, and Control and Co-located Conferences. AIAA Guidance, Navigation, and Control (GNC) Conferences.
R Mahony, P Corke, T.H. (2008). Dynamic image-based visual servo control using centroid and optic flow features. Journal of Dynamic Systems, Measurement, and Control, 130.
Reddy, P.V., J.E.W. and Krishnaprasad, P.S. (2006). Motion camouflage in three dimensions. In Proceedings of the 45th IEEE Conference on Decision and Control (IEEE Cat. No. 06CH37770), 3327–3332. New York, NY: IEEE, San Diego, CA.
Rick Cory, R.T. (2008). Experiments in fixed-wing UAV perching. In AIAA Guidance, Navigation and Control Conference and Exhibit. AIAA.
Shaowu Yang, Sebastian A Scherer, A.Z. (2013). An onboard monocular vision system for autonomous takeoff, hovering and landing of a micro aerial vehicle. Journal of Intelligent & Robotic Systems, 69, 499–515. doi:10.1007/s10846-012-9749-7.
Shima, T. (2007). Deviated velocity pursuit. In AIAA Guidance, Navigation and Control Conference and Exhibit, AIAA 2007-6782, Hilton Head, SC.
Srinivasan, M.V. (2011a). Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics. Physiological Reviews, 91, 389–411.
Srinivasan, M.V. (2011b). Visual control of navigation in insects and its relevance for robotics. Current Opinion in Neurobiology, 21, 535–543. doi:10.1016/j.conb.2011.05.020.
Suzanne Amador Kane, A.H.F. and Rosenthal, L.J. (2015). When hawks attack: animal-borne video studies of goshawk pursuit and prey-evasion strategies. Journal of Experimental Biology, 218, 212–222. doi:10.1242/jeb.108597.
Tucker, V.A., T.A.E.A.K. and Enderson, J.H. (2000). Curved flight paths and sideways vision in peregrine falcons (Falco peregrinus). Journal of Experimental Biology, 203, 3755–3763.
Visiongain (2009). The unmanned aerial vehicles (UAV) market 2009–2019.