Martin White
Zeeshan Patoli
Department of Informatics, University of Sussex, Brighton, UK
Emails: t.pascu@sussex.ac.uk, m.white@sussex.ac.uk, m.z.patoli@sussex.ac.uk
I. INTRODUCTION
Early smartphones contained low-power accelerometers and
magnetometers to measure device tilt. Tilt tracking allows
users to interact with applications by simply rotating the
device. For example, turning the phone on its side will rotate
its interface from portrait to landscape. Modern smartphones
include additional micro-electro mechanical gyroscopes. This
combination of sensing hardware can also be found in motion
capture systems in the form of inertial measurement units
(IMU). This paper draws a parallel between smartphones
and IMUs to answer one question: are motion capture and
activity tracking feasible using smartphone-driven body sensor
networks (BSNs)?
The motion capture software environment [1] has been
ported to the Android platform while targeting small handheld
devices in terms of interface dimensions and touchscreen
interaction. The proposed mobile application, previously introduced in the context of medicine and healthcare [2], embeds
a 3D engine that is optimised to render kinematic models on
the smartphone's screen. The phone acts as an IMU by fusing
the gyroscope, accelerometer and magnetometer data. The
processed motion is applied to the kinematic rig and rendered
on the smartphone's screen to give the user an interpretable
visualisation of the data.
B. Smartphone Sensing
The smartphone has become the most ubiquitous [7] type
of wearable computing. Its worldwide uptake has prompted
many advances in micro-electro mechanical sensors (MEMS)
technologies. Modern smartphones embed a variety of sensors:
optical cameras, gyroscopes, accelerometers, magnetometers,
pressure sensors, thermometers and light sensors. These components are designed to be robust and withstand everyday use
for the lifespan of the device. The smartphone's innate ability to
sense while sustaining an Internet connection has encouraged
new application areas focused on aggregating sensor data
in online repositories. Multiple smartphones connected to the
Internet can form large sensor networks [8] that generate big
data in accordance with the Internet of Things model. For example, networks of smartphones can be used to derive road
condition maps from accelerometer data [9]. Another popular
use is activity tracking, whereby sensor data is amalgamated to
deduce the overall behaviour of smartphone users [10], [11].
Smartphones can thereby stimulate physical activity and
improve the overall health of their users [12], [13].
III. ARCHITECTURE OVERVIEW
This section provides an architectural overview of the application by introducing its main components. As shown in Fig. 1,
the back-end's constituent objects are distributed across
five main packages: mathematics, kinematics, animation, recorder and connectivity.
1) The mathematics package provides the quaternion, vector and matrix operations used throughout the application: rotations are stored as quaternions (for performance reasons) while vectors and matrices are used primarily by the OpenGL ES renderer.
2) The kinematics package generates virtual bone objects based on BVH hierarchies. Each bone object is assigned a rotational and positional constraint to form an articulated skeletal hierarchy.
3) The animation package is a temporary store for recorded data. Its primary function is to feed data to the skeleton during animation playback. Additionally, it provides the basic functionality for the animation controls.
4) The recorder package extracts data from the device's gyroscope, accelerometer and magnetometer. The three sets of data are fused to produce a result that has little drift or noise. The recorder checks the data for errors and applies post-processing filters.
5) The connectivity package allows the application to function as a node in a BSN by establishing a connection to the cloud server. The authenticator identifies the phone while the uploader and downloader transfer data between the smartphone and an online database. The trigger listener awaits server events so that the device can be controlled remotely by other smartphones or by the server.
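The articulated skeletal hierarchy built by the kinematics package can be sketched as a minimal data structure. The names below are illustrative, not the application's actual classes:

```python
from dataclasses import dataclass, field

@dataclass
class Bone:
    """One segment of a BVH-style skeletal hierarchy.

    A real implementation would also carry the rotational and
    positional constraints mentioned in the text.
    """
    name: str
    parent: "Bone | None" = None
    children: list = field(default_factory=list)

    def attach(self, child: "Bone") -> "Bone":
        """Link a child bone under this one and return it."""
        child.parent = self
        self.children.append(child)
        return child

# Build a tiny hierarchy: Hips -> Spine -> Head
hips = Bone("Hips")
spine = hips.attach(Bone("Spine"))
head = spine.attach(Bone("Head"))
```

Walking such a tree from the root gives the order in which local rotations must be composed during rendering.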
IV. INTERFACE OVERVIEW
B. Drift Cancellation
In this context, drift is the difference between the
gyroscope's rotation and the combined accelerometer/magnetometer rotation. The two streams of data
are compared to determine a drift coefficient that is
expressed as a percentage ratio. Controlled drift is
induced so that the gyroscope rotation gradually matches
the accelerometer/magnetometer rotation. This process is
achieved using quaternion linear interpolation and the drift
coefficient. The fused rotation replaces the gyroscope's
accumulated rotation before the next frame of motion is
computed. The result contains some unwanted noise that can
be masked using filters. A popular alternative to this approach
is the Kalman filter [14].
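The drift-cancellation step can be illustrated with a short sketch. Normalised quaternion linear interpolation nudges the gyroscope's accumulated rotation towards the accelerometer/magnetometer reference; the coefficient `k` stands in for the drift coefficient described above (function names are illustrative):

```python
import math

def nlerp(q0, q1, t):
    """Normalised linear interpolation between two unit quaternions (w, x, y, z)."""
    # Take the shorter arc: flip q1 if the quaternions point opposite ways.
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
    q = tuple((1.0 - t) * a + t * b for a, b in zip(q0, q1))
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def cancel_drift(gyro_q, accmag_q, k=0.02):
    """Blend the gyroscope rotation towards the accel/mag reference by k."""
    return nlerp(gyro_q, accmag_q, k)
```

With a small `k`, the gyroscope's low-noise motion dominates frame to frame while its long-term drift is slowly pulled back to the reference.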
C. Post-Processing
Fig. 2. Main user interface (a) rendering kinematic motion and recorder
interface (b).
V. SENSOR FUSION
The application's sensor fusion relies on quaternion linear
interpolation to combine the gyroscope, accelerometer and
magnetometer data. Gyroscopes suffer from drift but produce
noise-free data at high frequencies. Accelerometers and magnetometers can be combined to produce a rough approximation
of rotation. The resulting signal contains no noticeable drift
but oscillates in excess of one degree per frame of motion.
The three sensors can therefore be fused so that the accelerometer and
magnetometer serve as a reference point for the gyroscope
to cancel drift. The entire motion processing can be divided into
three separate stages: data preparation, drift cancellation and
post-processing.
A. Data Preparation
The first step of the preparation stage is to combine the
accelerometer and magnetometer readings using the Android
native API. The result is a quaternion representing a rough
approximation of the device's orientation. Its rotation contains
noise that can be attenuated by applying a low-pass filter. A
set of transformations is applied so that the data matches the
3D engine's orthogonal configuration.
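The low-pass attenuation mentioned above can be sketched as first-order exponential smoothing, applied per quaternion component or per axis. This is a common choice, assumed here since the paper does not name the exact filter:

```python
def low_pass(samples, alpha=0.1):
    """First-order low-pass filter: y[n] = y[n-1] + alpha * (x[n] - y[n-1]).

    Smaller alpha means stronger smoothing but more lag.
    """
    out, y = [], samples[0]
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out
```

The trade-off is the usual one: the filter suppresses the oscillation of the accelerometer/magnetometer rotation at the cost of a slight delay in the reference signal.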
The second step is to gather the gyroscope data and compute
its world space rotation. The gyroscope outputs three angular
speeds corresponding to azimuth, pitch and roll. Angular
speeds are converted into rotation increments when multiplied
by the time step. The time step is the time interval between
two consecutive samplings. A summation of the rotation
increments produces a world space rotation. The two rotations
are converted from Euler angles to quaternions to avoid gimbal lock
and for performance reasons (quaternion multiplications are
much cheaper to compute than trigonometric operations).
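The integration step can be sketched as follows: each angular-velocity sample (rad/s) is scaled by the time step to form a rotation increment, converted to a small quaternion, and accumulated by quaternion multiplication (illustrative names; a fixed 100 Hz time step is assumed for the example):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def step(q, omega, dt):
    """Accumulate one gyroscope sample (rad/s per axis) into rotation q."""
    wx, wy, wz = (w * dt for w in omega)          # rotation increment (rad)
    angle = math.sqrt(wx*wx + wy*wy + wz*wz)
    if angle < 1e-12:
        return q
    s = math.sin(angle / 2.0) / angle
    dq = (math.cos(angle / 2.0), wx * s, wy * s, wz * s)
    return quat_mul(q, dq)

# Integrating 90 deg/s about z for one second at 100 Hz
# should yield a 90-degree rotation about z.
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    q = step(q, (0.0, 0.0, math.radians(90)), 0.01)
```

Building the increment directly as an axis-angle quaternion sidesteps the Euler-angle gimbal-lock problem noted above.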
Fig. 3. Flow chart of the application's two operating modes: inertial measurement unit and networked.
Fig. 4.
Every smartphone receives an identical copy of the processed motion from the server. The rotational data is applied to
the kinematic model using the skeletal segment IDs. Notably,
the fused sensor data is a stream of world space rotations
while the kinematic model requires local space. Each skeletal
segment's quaternion must be divided by its kinematic predecessor's in accordance with the hierarchy specification. This
process is repeated for every frame of motion.
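The quaternion "division" above amounts to multiplying by the inverse of the parent's world rotation, which for unit quaternions is the conjugate. A minimal sketch (illustrative names):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_conj(q):
    """Conjugate, i.e. the inverse of a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def world_to_local(world_q, parent_world_q):
    """local = parent_world^-1 * world, so that world = parent_world * local."""
    return quat_mul(quat_conj(parent_world_q), world_q)
```

Applying this from the root downwards converts the fused world-space stream into the local rotations the kinematic model expects.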
Fig. 5.
Fig. 6. Motion cloud web portal displaying two channels streaming data
from two smartphones.
B. Web Portal
As shown in Fig. 6, the motion cloud's data can be
accessed through a web portal. The web portal has
two primary functions. First, the interface allows users to
visualise data as a set of interactive graphs and to export data as CSV
tables or as BVH motion files. Second, the portal's control
panel provides remote control functionality for enabling or
disabling the recording process throughout a BSN. Future
iterations of the interface will adopt WebGL
to facilitate 3D visualisations in the web browser.
Fig. 7. Hand wave gesture recorded using three Samsung Galaxy S3 smartphones.
Fig. 9. Recorded motion data illustrating sedentary (green) behaviour overlaying active (gray) behaviour.
TABLE I
ACTIVITY RESULTS SUMMARIZED

                 Sedentary     Active       Ratio
X Activity        57441.11    158525.17     2.75
Y Activity        26935.33     90022.84     3.34
Z Activity        55560.02    167786.56     3.02
Total Activity   139936.45    416334.57     2.96
Steps Taken           1249         3717     2.96
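The Ratio column in Table I compares active against sedentary totals per axis. A hedged sketch of that computation follows; the paper's exact activity metric is not specified, so summed absolute accelerometer readings are assumed here:

```python
def axis_activity(samples):
    """Accumulate activity per axis as the sum of absolute readings.

    Returns [x, y, z, total]. The metric is an assumption; the
    paper does not define how per-axis activity is computed.
    """
    per_axis = [sum(abs(s[i]) for s in samples) for i in range(3)]
    return per_axis + [sum(per_axis)]

def activity_ratio(active, sedentary):
    """Element-wise active/sedentary ratio, as in the Ratio column."""
    return [a / s for a, s in zip(active, sedentary)]
```

On synthetic data the functions behave as the table suggests: doubling the motion doubles every per-axis total and yields a uniform ratio of 2.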
X. CONCLUSION
The aim of this study has been to investigate the smartphone's relevance to the field of inertial motion capture
and activity tracking. The paper's developments present solutions for sensor fusion, server-side data synchronization, data
streaming protocols and remote control functionality using
event triggers. When combined to form one application, these