PROJECT ID: 05
TITLE: WSN Nodes for Health Monitoring of Players on Football Field Through Collaborative Communication
NUMBER OF MEMBERS: 04
SUPERVISOR NAME: Dr. Ali Nawaz Khan

MEMBER NAME           | PROJECT ID        | LU ID    | EMAIL ADDRESS
Abdul Mannan          | DDP-FA11-BTE-145  | 33065292 | abdulmannan1117@hotmail.com
Shan-e-Fazal-e-Haq    | DDP-FA11-BTE-123  | 33065270 | shanefazal123@gmail.com
Abdul Qadir Chauhan   | DDP-FA11-BTE-127  | 33065274 | aqchauhan.11@gmail.com
M. Osman Riaz         | DDP-FA11-BTE-094  | 33065241 | malik_osman01@hotmail.com
CHECKLIST:
Number of pages in this report: 84
I/We confirm to state that this project is free from any type of plagiarism and misuse of copyrighted material: YES / NO
MEMBERS SIGNATURES
Supervisor's Signature
This work, entitled "WSN Nodes for Health Monitoring of Players on Football Field Through Collaborative Communication", has been approved for the award of
Spring 2015
External Examiner:
Head of Department:
DECLARATION
No portion of the work referred to in the dissertation has been submitted in support of an
application for another degree or qualification of this or any other university/institute or other
institution of learning.
MEMBERS SIGNATURES
ACKNOWLEDGEMENTS
In the name of Almighty Allah, the most kind and most merciful. We are grateful to Almighty Allah, who provides us with all the resources so that we may put them to proper use for the benefit of mankind. We would like to extend our immense gratitude to our parents and family members, who kept backing us up at all times, both financially and morally.
We are tremendously thankful and highly obliged to our project supervisor as well as mentor Dr.
Ali Nawaz Khan for his guidance, enlightenment and encouragement to work hard and smart. We
have found him supremely helpful while discussing the optimization issues in this dissertation as
well as practical work. His critical comments on our work have certainly made us think of new
ideas and techniques in the fields of optimization as well as hardware/ software integration and
simulation.
We are eminently grateful to all of our group members who efficiently managed to complete this
project within the given time. This project could not be completed without the individual efforts
and team co-operation from our group members, Mr. Abdul Qadir Chauhan, Mr. Shan-e-Fazal,
Mr. Abdul Mannan and Mr. Muhammad Osman Riaz.
We would like to express our recognition of and appreciation for the staff of the final-year project lab, as well as the facilities provided to us there. We also thank our friends and respondents for their support and willingness to spend some of their valuable time with us filling in the questionnaires, which proved very helpful for the fulfilment of our goal. This project would not have been possible without the kind support and help of many individuals and experts, and we would like to extend our sincere regards to all of them.
ABSTRACT
Wireless Sensor Networks (WSNs) consist of spatially distributed autonomous devices which are capable of interacting with their environment through various sensors, processing the gathered information, and communicating this information wirelessly to their neighbour nodes. We have designed an ad-hoc system that can be used for continuous measurement and observation of football players' physiological and location statistics (e.g. heart rate, blood pressure, temperature, speed) through biomedical sensors attached to a wearable kit, creating a sophisticated wireless sensor network (WSN) that transports this data back to the sink/base-station node through collaborative communication for further processing. By assessing the acquired statistics for the purpose of guiding management decisions, including when to make therapeutic interventions and how to assess those interventions, the designed and implemented ad-hoc system shall prove to be a life saver for football players on the field.
Table of Contents

1 INTRODUCTION
1.1 Wireless Sensor Networks
1.1.1 Energy Efficient Operation
1.1.2 WSN Anatomy
1.2 Design Challenges
1.2.1 WSN Deployment
1.2.2 Spatial-Temporal Correlation of WSNs
1.3 Applications of WSNs
2 LITERATURE REVIEW
2.1 Heterogeneous Wireless Sensor Networks
2.2 Network Architecture
2.2.1 Single-Tier Architectures in WSNs
2.2.2 Multi-Tier Architectures in WSNs
2.3.7 Coverage Range
2.5.2 Mic-on-board Sensor Board
2.8.1 SensEye Testbed Platform
2.8.3 IrisNet Software Platform
3 DESIGN CHALLENGES
3.1 Methodology
3.1.1 Collaborative Communication
3.2 Hardware Components
3.2.1 Processing Unit
4.3.2 Features
5.1 Conclusions
Table of Figures
FIGURE 1-1 OPERATING PRINCIPLE OF THE ARTIFICIAL RETINA .....................................................................5
FIGURE 1-2 MEDICAL SENSOR NODES USED IN CODEBLUE ............................................................................5
FIGURE 2-1 ARCHITECTURE OF HETEROGENEOUS WSNS ........................................................................... 10
FIGURE 2-2 DEPICTION OF THE COVERAGE IN WSNS .................................................................................. 16
FIGURE 2-3 MTS310 SENSOR BOARD ......................................................................................................... 17
FIGURE 2-4 CYCLOPS VIDEO SENSOR MOTE ............................................................................................... 19
FIGURE 2-5 HIERARCHY OF INTEGRATED CMUCAM CAMERA SYSTEMS................................................... 19
FIGURE 2-6 MESHEYE VIDEO SENSOR MOTE .............................................................................................. 20
FIGURE 2-7 PANOPTES VIDEO SENSOR PLATFORM ...................................................................................... 21
FIGURE 2-8 GARCIA MOBILE ROBOT INTEGRATED WITH PAN-TILT CAMERA .......................................... 22
FIGURE 2-9 SENSEYE: MULTI-TIER ARCHITECTURE DEPICTION.................................................................. 23
FIGURE 2-10 GRAPH DEPICTING AVERAGE CURRENT CONSUMPTION FOR VARIOUS TASKS ....................... 24
FIGURE 3-1 SYSTEM DIAGRAM OF IMPLEMENTED AD HOC NETWORK ........................................................ 28
FIGURE 3-2 ARDUINO MEGA 2560 MCU BOARD ........................................................................................ 29
FIGURE 3-3 ZIGBEE BASED XBEE SERIES 2 MODULE ................................................................................. 30
FIGURE 3-4 SKM53 GPS MODULE ............................................................................................................. 31
FIGURE 3-5 SEN-11574 - PULSE SENSOR .................................................................................................... 31
FIGURE 3-6 DS18B20 - A 1-WIRE DIGITAL TEMPERATURE SENSOR ........................................................... 32
FIGURE 3-7 X-CTU SOFTWARE ................................................................................................................... 33
FIGURE 3-8 ANDROID BASED MOBILE APP ................................................................................................. 36
FIGURE 3-9 MICROSOFT SQL SERVER MANAGEMENT STUDIO ................................................................... 37
FIGURE 4-1 LOS AND NLOS TRANSMISSION UNDER NORMAL CONDITIONS .............................................. 40
FIGURE 4-2 LOS AND NLOS TRANSMISSION DURING COLLABORATIVE COMMUNICATION ....................... 40
FIGURE 4-3 WEARABLE WSN KIT............................................................................................................... 42
FIGURE 5-1 PIE CHART DEPICTING BATTERY CONSUMED BY VARIOUS TASKS IN WSN NODE ................... 45
List of Tables
TABLE 1-1 APPLICATIONS OF WSNS .............................................................................................................4
TABLE 2-1 SPECIFICATIONS OF VIDEO SENSORS .......................................................................................... 18
TABLE 3-1 PERFORMANCE SPECIFICATIONS OF XBEE SERIES 2 MODULE ................................................... 30
CHAPTER 1
INTRODUCTION
1 INTRODUCTION
1.1 Wireless Sensor Networks
Wireless Sensor Networks (WSNs) consist of tiny sensing nodes, which can act both as data generators and as network relays. Each node consists of a microprocessor, sensor(s) and a transceiver, and is capable of interacting with its environment through its sensors, processing the gathered information and communicating this information wirelessly to its neighbour nodes. Through their on-board microprocessors, sensor nodes can be programmed to accomplish complex tasks rather than merely transmitting what they observe. The transceiver provides the wireless connectivity needed to communicate the observed phenomena of interest. Sensor nodes are powered by limited-capacity batteries and are generally stationary [1]. The wireless protocol selected depends on the application requirements; available standards include IEEE 802.15.4 (LR-WPANs), IEEE 802.11 (WLAN) [2], Bluetooth, and proprietary radios, which usually operate around 900 MHz.
1.1.1 Energy Efficient Operation
To save energy, nodes aggressively switch their transceivers off and essentially become disconnected from the network. Therefore, although the locations of the nodes do not change, the network topology changes dynamically because of the power-management activities of the sensor nodes. Maintaining network connectivity while minimizing energy consumption in this dynamic environment is a major challenge. The energy-efficient operation of WSNs, however, provides significantly long lifetimes that surpass those of any other system that relies on batteries [3].
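As a simple illustration of this duty-cycling idea, the sketch below periodically asserts a sleep-request line to the radio so that the transceiver stays off most of the time. It is purely illustrative and is not taken from the project firmware; the pin number, the timing values and the use of the XBee's pin-sleep (SLEEP_RQ) input are all assumptions.

// Illustrative radio duty-cycling sketch (not part of the project firmware).
// Assumes the radio's sleep-request input (e.g. the XBee SLEEP_RQ pin with
// pin-sleep enabled) is wired to digital pin 7.
const int SLEEP_RQ_PIN = 7;            // assumed wiring
const unsigned long AWAKE_MS = 200;    // transmit window
const unsigned long SLEEP_MS = 1800;   // radio off for roughly 90% of the cycle

void setup() {
  pinMode(SLEEP_RQ_PIN, OUTPUT);
  Serial.begin(9600);                  // serial link to the radio
}

void loop() {
  digitalWrite(SLEEP_RQ_PIN, LOW);     // wake the radio
  delay(10);                           // give it time to rejoin the network
  Serial.println("sensor sample");     // send the buffered reading
  delay(AWAKE_MS);
  digitalWrite(SLEEP_RQ_PIN, HIGH);    // put the radio back to sleep
  delay(SLEEP_MS);                     // the node is effectively disconnected here
}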
Recent advances in micro-electro-mechanical systems (MEMS) technology, wireless communications and digital electronics have made feasible the design and development of low-power, low-cost, multifunctional sensor nodes that are small in size and communicate untethered over short distances [1].
1.1.2 WSN Anatomy
Wireless sensor networks (WSNs) are realized through the collaborative effort of a large number of sensor nodes, each capable of sensing, data processing and communication, and it is this collaboration that enables the ever-increasing capabilities of these tiny devices. Sophisticated and extremely efficient communication protocols are required in order to realize the existing and potential applications of WSNs [4]. WSNs are composed of a large number of sensor nodes, which are densely deployed either inside a physical phenomenon or very close to it. In order to enable reliable and efficient observation and to initiate the right actions, the physical features of the phenomenon should be reliably detected or estimated from the collaborative information provided by the sensor nodes.
1.2 Design Challenges
Sensor nodes use their processing capabilities to carry out simple computations locally and transmit only the required, partially processed data, instead of sending raw data to the nodes responsible for fusion. These properties of WSNs present unique challenges for the development of communication protocols.
Additional challenges are posed to the communication protocols in terms of energy consumption due to the intrinsic properties of individual sensor nodes [1]. Because sensor nodes carry limited power sources, WSN applications and communication protocols are mainly tailored to provide high energy efficiency: while traditional networks are designed to improve performance metrics such as throughput and delay, WSN protocols focus primarily on power conservation.
1.2.1 WSN Deployment
Another factor considered in developing WSN protocols is the deployment of WSNs. The positions of the sensor nodes need not be engineered or predetermined, permitting random deployment in inaccessible terrains or disaster-relief operations. This random deployment, however, requires the development of self-organizing protocols for the communication protocol stack. In addition to the placement of nodes, the density of the network is also exploited in WSN protocols: because of the dense deployment of large numbers of sensor nodes and their short transmission ranges, neighbouring nodes may be very close to each other.
1.2.2 Spatial-Temporal Correlation of WSNs
Multi-hop communication is exploited between WSN nodes because it leads to lower power consumption than traditional single-hop communication. Furthermore, the dense deployment, coupled with the physical properties of the sensed phenomenon, introduces correlation in the spatial and temporal domains; spatio-temporal correlation-based protocols have therefore emerged to improve the efficiency of networking wireless sensors.
1.3 Applications of WSNs
WSNs have a wide range of applications in various fields of life [5, 6, 7]. A considerable amount of research in the last decade has enabled the actual implementation and deployment of sensor networks tailored to the unique requirements of certain sensing and monitoring applications. In accordance with this vision, WSNs are slowly becoming an integral part of our lives [8, 9, 10]. These ever-increasing applications of WSNs can be organized into four main categories:
Table 1-1 Applications of WSNs [11, 12, 13].
Health Monitoring: Patient Monitoring, Artificial Retina, Emergency Response
Environmental Sensing: Disaster Relief Operations, Water/Waste Monitoring, Biodiversity Mapping (Wildlife Observation)
Industrial Applications: Traffic Avoidance, Enforcement and Control; Structural Health Monitoring; Preventive Maintenance
Military Applications: Sniper Detection, VigilNet, Smart Dust
In patients suffering from retinal degeneration, vision is lost because of damaged photoreceptors. The Artificial Retina (AR) project aims to replace these damaged photoreceptors with an array of microsensors. The ultimate goal of the prosthetic device is a lasting implant that will enable facial recognition and the ability to read large print.
CodeBlue equips wearable medical sensor nodes with pulse-oximetry, ECG and motion sensors; accordingly, pulse rate, blood oxygen saturation, the electrical activity of the heart, patient movements and muscular activity can be monitored continuously. The CodeBlue software platform enables these nodes to be operated in a networked setting, where medical personnel can monitor patients through a PDA.
CHAPTER 2
LITERATURE REVIEW
2 LITERATURE REVIEW
2.1 Heterogeneous Wireless Sensor Networks
Heterogeneous WSNs consist of multiple sensing nodes with a diverse range of integrated sensors that generate data traffic with varying, dynamic data rates depending on the nature of the sensed phenomenon, whereas homogeneous WSNs consist of nodes carrying scalar sensing devices that consequently generate similar amounts of data traffic. The design of heterogeneous Wireless Sensor Networks (WSNs) needs expertise from a wide variety of research areas including communication, signal processing, networking, control theory and embedded systems. By incorporating these design properties, robust and long-lasting networks can be deployed while enabling more sophisticated and meaningful acquisition of useful data than conventional data-only wireless sensor networks.
The development of sensing devices consisting of single-chip modules that can easily be integrated with inexpensive transceivers has become possible only with ground-breaking innovations in CMOS technology. Moreover, ongoing research in the networking of these inexpensive communication devices has enabled the utilization of heterogeneous sensing devices in areas of networking that were not possible before [16, 17]. The expertise drawn from all these areas, which allows heterogeneous data traffic to be acquired easily and delivered in real time, helps in designing and implementing heterogeneous WSNs.
Besides retrieving heterogeneous data, heterogeneous WSNs can also store and process data in real time, and merge and superimpose heterogeneous data traffic obtained from heterogeneous sources by correlating it. These networks enable a wide variety of applications that were not imaginable before with scalar sensor networks. The most important application areas can be summarized as:
Multimedia surveillance.
Environmental monitoring.
2.2 Network Architecture
The architecture of a WSN depends on many dominating factors arising from the nature of the application for which the WSN is being deployed, i.e. ad hoc networks, mobile networks or self-organizing infrastructures. WSNs are envisioned to be composed of heterogeneous sensor devices having diverse capabilities in terms of processing, sensing and communication. These heterogeneous sensing devices can thus produce miscellaneous forms of traffic. Conventional WSN architecture design problems have centred on scalable network architectures in which each sensing node has the same sensing capabilities, i.e. homogeneous and flat architectures. The innate heterogeneity of these sensors, on the other hand, requires a different perspective because of the diverse technologies that sustain different traffic and dynamic sensor types.
The heterogeneous network architectures that result from these intrinsic differences of WSNs are mainly classified into the following two categories:
2.2.1 Single-Tier Architectures in WSNs
In a single-tier architecture, one node of the deployed sensor network can have higher processing capabilities than the other sensing nodes. Such nodes, which can be used for local processing of the sensed media information, are referred to as processing hubs. Furthermore, several processing hubs can form a distributed processing architecture, which may be used to execute a particular processing job distributively.
The stored and processed heterogeneous data is relayed to a remote wireless gateway through a multi-hop path formed by intermediate sensing nodes in the single-tier architecture. For subsequent retrieval, a storage hub, which is responsible for storing local heterogeneous data, is interconnected to the remote wireless gateway. Consequently, this central location can be used to store acquired and processed data easily, to perform more sophisticated processing tasks offline, and to mitigate the storage constraints on the sensing nodes.
For networks comprising heterogeneous devices, the single-tier architecture may be implemented to reduce the overall load. In this scenario, all the sensing nodes are organized into clusters, and a central cluster head is responsible for controlling the diverse types of sensors in its cluster. Extra resource-hungry jobs, such as aggregation or intensive heterogeneous traffic processing, can be carried out by the cluster head; as a result, the cluster heads can also be utilized as processing hubs in this type of architecture. The gathered content is then relayed by the cluster head to a remote wireless gateway with dedicated and enhanced processing capabilities for further processing and storage, in order to minimize the overall load on each individual sensing node.
2.2.2 Multi-Tier Architectures in WSNs
A multi-tiered hierarchical network structure with heterogeneous sensing devices delivers the adaptive scalability and flexibility to utilize network resources more efficiently. The multi-tier architecture incorporates low-end scalar sensors at the lower hierarchical levels to perform elementary jobs. The data traffic gathered by these sensing nodes can then be used to trigger more complicated sensing functions, such as real-time observation of the acquired content. These functions are performed at the higher hierarchical levels by high-end sensing nodes equipped with high-data-rate sensors. Moreover, storage and processing can be triggered, based on reports from the low-end devices, only when there is adequate activity in the sensed phenomenon.
Such a hierarchical architecture enhances the network lifetime by boosting the efficiency and robustness of the deployed heterogeneous sensors. In the multi-tier architecture, each tier may include clusters for enhanced and dedicated processing. Communication among the sensors of a cluster allows autonomous operation of each cluster, which reduces energy consumption. Furthermore, the traffic load in the network may be reduced by aggregating the useful data at the cluster heads and transporting it to the higher tiers.
2.3 Design Challenges in Heterogeneous WSNs
There are several design challenges in the realization of heterogeneous WSNs in terms of communication capability, digital signal processing, networking infrastructure and coverage range, a few of which are explained as follows.
Application-specific quality-of-service requirements, in terms of bounds on delay, energy consumption, distortion, reliability and network lifetime, are imperative for the design and development of heterogeneous WSNs.
WSN Coverage
The coverage of the network also impacts the architecture of a WSN. The architecture should be developed by considering the coverage issues of each component in the network, in order to provide the coverage necessary for a particular application. Deriving the coverage factors of scalar sensor types, such as humidity or temperature sensors, as compared to those of heterogeneous traffic sensors, such as microphones and cameras, is an important and challenging problem.
The range that a node can reach through wireless communication defines a trade-off between communication range and sensing range. As a result, connectivity in the network is usually limited by the coverage problem in WSNs [19]. Heterogeneous WSNs present fundamentally different coverage properties in terms of range, directivity, line of sight and dynamic view.
For example, two video sensors may be fixed at locations far apart from each other; as a result, the difference in the view of one of these sensing nodes has to be communicated efficiently to the other sensor through a multi-hop path.
To collect external and internal sounds, multiple sensors can be attached to different parts of the body. Communication between the sensors and a wearable reader is performed through magnetic induction: the sensor modulates an electromagnetic field generated by the transceiver in the reader, changing its capacitance according to the observed acoustic wave. As a result, the sensors operate battery-less and only the reader is powered by a battery. Since no heavy batteries or wires are necessary for the sensors, this approach provides flexibility for operation on the human body.
2.5.2 Mic-on-board Sensor Board
The Panasonic WM-62A microphone is used on some of the most common sensor boards, Crossbow's MTS300/310 and MTS510. Only a small current of 500 µA is required for its operation. The microphone circuitry is composed of a preamplifier and a second-stage amplifier with a digital-pot control [21]. The microphone is omnidirectional, so sounds coming from any direction can be recorded. These audio sensors can receive sound waves with frequencies below 5 kHz.
Acoustic ranging is the most common use of the on-board microphone. To estimate the distance between two nodes, the difference between the propagation speed of an acoustic signal and that of a radio wave is exploited. Audio signal transmission and speech recognition are other promising applications for audio sensors. Thus, many potential applications, such as sound detection, are possible even with low-end microphones like this one.
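To make the ranging idea concrete, the function below (an illustrative sketch, not code from this project) computes the distance from the gap between the arrival time of a radio packet and the arrival time of the acoustic pulse emitted with it; since the radio signal propagates essentially instantaneously over node-to-node distances, the gap is dominated by the acoustic time of flight.

// Illustrative acoustic-ranging calculation (not part of the project firmware).
// t_radio_us and t_sound_us are the locally recorded arrival times, in
// microseconds, of the radio packet and of the acoustic pulse.
const float SPEED_OF_SOUND_M_PER_S = 343.0;   // at roughly 20 degrees Celsius

float estimateDistanceMeters(unsigned long t_radio_us, unsigned long t_sound_us) {
  // Radio propagation over a few metres takes nanoseconds, so the measured
  // delay is essentially the acoustic time of flight.
  float dt_s = (t_sound_us - t_radio_us) / 1.0e6;
  return SPEED_OF_SOUND_M_PER_S * dt_s;
}

// Example: a gap of about 29,200 microseconds corresponds to roughly 10 m.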
2.6 Video Sensors
Charge-coupled device (CCD) technology has traditionally been used to build video cameras. Recently, however, complementary metal oxide semiconductor (CMOS) technology, which is generally used to manufacture computer processors, has started to be used for video cameras as well. Using CMOS technology, an image sensor can be implemented with the image-processing circuit, the lens and the sensor on the same chip. This composition significantly reduces the cost and size of image sensors. The reduction in size does not affect quality, since CMOS image quality closely follows CCD quality for low- and medium-resolution sensors. CMOS sensors have therefore become the most suitable candidates for heterogeneous WSNs, since they consume much less energy than their CCD counterparts.
Table 2-1 Specifications of video sensors.
Name                      | Resolution | CPU (MHz) | Radio rate (kbps) | Frame rate (fps) | Embedded transceiver
Cyclops                   | 352 x 288  | 7.3       | N/A               | N/A              | No
CMUcam                    | 80 x 143   | 75        | N/A               | 16.7             | No
CMUcam2                   | 176 x 255  | 75        | N/A               | 26               | No
CMUcam3                   | 352 x 288  | 60        | N/A               | 50               | No
MeshEye (Low-Resolution)  | 30 x 30    | 50        | 250               | N/A              | Yes
MeshEye (High-Resolution) | 352 x 288  | 50        | 250               | 15               | Yes
Panoptes                  | 320 x 240  | 206       | 1024              | 20               | Yes
Acroname Garcia           | 640 x 480  | 400       | 250, 1024         | 30               | Yes
(a) CMUcam  (b) CMUcam2  (c) CMUcam3
Following are some recent experimental studies which are mostly limited to video sensor
networks:
2.8.1 SensEye Testbed Platform
Object detection, object recognition and object tracking are the three tasks accomplished by the SensEye application. The SensEye network follows a multi-tier architecture: a hierarchy of heterogeneous components with different processing and sensing capabilities. The resulting structure provides continuous and adaptive surveillance while minimizing energy consumption. The SensEye application is composed of three tiers, which is why it is called a multi-tier architecture, and the testbed is an example of how heterogeneous components can be used in a WSN to provide a surveillance application [27].
Sensors in the lowest tier provide continuous sensing of the surveillance area and record any activity, which is then communicated to the second tier, where low-resolution video sensors perform object detection and recognition. Medium-resolution webcams, interfaced with Crossbow Stargate boards, are located in the third tier.
Both steady-state and transient energy-consumption behaviour are obtained by direct measurement of current with a digital multimeter. The figure below shows the energy consumption for sleep mode as well as for the five major operations; the breakdown of the energy consumption among the sensor, the processor and the radio can also be observed.
An important observation, which contradicts the conventional balance of energy consumption between processing and communication, is that reading from or writing to flash memory, as well as the image-processing applications, consumes more energy than communication in heterogeneous WSNs. Furthermore, visual sensing and communication consume almost the same energy, which is another important characteristic of heterogeneous WSNs. The additional energy consumed and the delays due to transitions (e.g., going to sleep mode) are not negligible and must be accounted for in network and protocol design for heterogeneous WSNs.
Figure 2.10 Graph Depicting Average Current Consumption for Various Tasks [28]
2.8.3 IrisNet Software Platform
Internet-scale Resource-Intensive Sensor Network Services (IrisNet) is an example software platform for deploying heterogeneous services on heterogeneous WSNs [29]. IrisNet allows a global, wide-area sensor network to be harnessed on this infrastructure by issuing Internet-like queries. Scalar sensors and video sensors are spread throughout the environment to collect potentially useful data, and IrisNet provides users with Internet-like queries to these sensors. The sensor network can be queried by the user through a high-level language, i.e. the Extensible Markup Language (XML), which allows easy organization of hierarchical data.
IrisNet is implemented as a two-tiered architecture. In the first tier, the heterogeneous sensors implement a common shared interface called sensing agents (SAs), while in the second tier the produced data is stored in a distributed database implemented on organizing agents (OAs). Different sensing services can run simultaneously on this architecture, so the same hardware infrastructure can provide different sensing services; for instance, a set of video sensors can provide a surveillance service as well as a parking-space-finder service. A group of OAs is responsible for a sensing service: it answers the class of relevant queries, collects the data produced by the sensing service and organizes the information in a distributed database.
CHAPTER 3
DESIGN CHALLENGES
3 DESIGN CHALLENGES
3.1 Methodology
In our project we apply a collaborative communication approach for monitoring the physiological condition of on-field football players. Owing to the drastic increase in the number of matches played, it is necessary for players to keep their physical condition and fitness up to the requirements of the game. In order to avoid serious physical injury during an ongoing match, their physicians and coaches have to monitor and analyse body parameters continuously, such as fatigue (through temperature, heart rate, etc.), the distance run during matches, and location data that can be used to improve team strategy.
For this purpose, a range of biomedical sensors for the continuous observation and measurement of different physiological functions of the targeted player can be embedded in the designed WSN node. The biomedical sensors we use are the SEN-11574 Pulse Sensor, to measure the heart rate of the football player, and the DS18B20 1-wire digital temperature sensor; both provide real-time statistics of the football player on the field.
To establish an autonomous, efficient, effective and intelligent data management system, we design and implement an ad-hoc system which may prove to be a life saver for football players on the field. The basic idea is to collect the players' basic physiological and location statistics, for example heart rate, blood pressure, speed and temperature, through biomedical sensors attached to a wearable kit, and to create a sophisticated wireless sensor network (WSN) that transports this data back to the sink/base-station node through collaborative communication for further processing.
3.1.1 Collaborative Communication
Collaborative communication [30, 31] is an effective physical-layer approach to extend the transmission range and increase energy efficiency. In collaborative communication, a group of collaborating nodes participates in transmitting or receiving a common signal when communicating with an isolated node located far away from them. The key point is to adjust the carrier phase of the collaborating nodes so that the multiple signals are synchronized [32]. For instance, in transmit mode, the multiple signals arrive at the isolated node synchronously and combine constructively to increase the signal quality, or equivalently to extend the transmission range. Collaborative communication is a feasible alternative for controlling the network topology in wireless networks with resource-restricted nodes, since its performance depends mainly on the number of collaborating nodes rather than on the capabilities of each individual node. In other words, an increase in the number of collaborating nodes can compensate for the weak communication capabilities of the individual nodes.
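The gain from constructive combining can be illustrated numerically: if N nodes transmit the same unit-amplitude carrier and their phases are aligned at the receiver, the received amplitude grows like N (so the received power grows like N squared), whereas with independent random phases the average received power only grows like N. The short program below is purely illustrative and is not part of the node firmware.

// Illustrative comparison of coherent vs. incoherent combining (not project code).
#include <cstdio>
#include <cstdlib>
#include <complex>

int main() {
    const int N = 8;                 // number of collaborating nodes
    const int TRIALS = 10000;        // Monte-Carlo trials for the random-phase case
    const double PI = 3.141592653589793;

    // Coherent case: all phases aligned at the receiver.
    std::complex<double> coherent(0.0, 0.0);
    for (int i = 0; i < N; ++i) coherent += std::polar(1.0, 0.0);
    std::printf("coherent power  : %.1f (N*N = %d)\n", std::norm(coherent), N * N);

    // Incoherent case: independent random phases, averaged over many trials.
    double avgPower = 0.0;
    for (int t = 0; t < TRIALS; ++t) {
        std::complex<double> sum(0.0, 0.0);
        for (int i = 0; i < N; ++i) {
            double phase = 2.0 * PI * std::rand() / RAND_MAX;
            sum += std::polar(1.0, phase);
        }
        avgPower += std::norm(sum) / TRIALS;
    }
    std::printf("incoherent power: %.1f (approximately N = %d)\n", avgPower, N);
    return 0;
}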
3.2 Hardware Components
The following are the tools we use in order to achieve our desired objectives:
3.2.1 Processing Unit [34]
The Arduino Mega 2560 is a microcontroller board based on the ATmega2560. It has 54 digital
input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs
(hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP
header, and a reset button. It contains everything needed to support the microcontroller; simply
connect it to a computer with a USB cable or power it with an AC-to-DC adapter or battery to get
started. The Mega is compatible with most shields designed for the Arduino Duemilanove or
Diecimila.
Table 3-1 Performance specifications of the XBee Series 2 module.
XBee Specification    | Value
Transmit Power        | 2 mW (+3 dBm)
RF Data Rate          | 250,000 bps
Data Throughput       | Up to 35,000 bps
Receiver Sensitivity  | -96 dBm
Supply Voltage        |
Operating Temperature | -40 to 85 °C
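Because the Mega provides several hardware UARTs, the XBee module can be attached to one of them and driven directly from the firmware. The sketch below is a minimal, illustrative forwarding loop; the wiring (XBee on Serial1, transparent AT mode, 9600 baud) and the sample values in the frame are assumptions, although the semicolon-separated frame format itself mirrors the node firmware reproduced in the appendix.

// Minimal illustration of relaying a sensor frame through the XBee
// (assumed wiring: XBee DOUT/DIN on the Mega's Serial1, transparent AT mode).
void setup() {
  Serial.begin(9600);    // USB/debug link
  Serial1.begin(9600);   // XBee Series 2 module
}

void loop() {
  // Frame layout used by the node firmware in the appendix:
  // nodeId;latitude;longitude;BPM;temperature;
  Serial1.println("2;31.399726;74.208944;72;36.9;");   // sample values (assumed)
  delay(1000);

  // Echo any command received from the coordinator to the debug console.
  while (Serial1.available()) {
    Serial.write(Serial1.read());
  }
}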
The DS18B20 digital temperature sensor measures temperatures from -55 °C to +125 °C and is accurate to ±0.5 °C over the range of -10 °C to +85 °C. In addition, the DS18B20 can derive power directly from the data line ("parasite power"), eliminating the need for an external power supply. Each DS18B20 has a unique 64-bit serial code, which allows multiple DS18B20s to function on the same 1-Wire bus; it is therefore simple to use one microprocessor to control many DS18B20s distributed over a large area. Applications that can benefit from this feature include HVAC environmental controls, temperature-monitoring systems inside buildings, equipment or machinery, and process monitoring and control systems.
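Reading the DS18B20 from the Arduino is typically done through the OneWire and DallasTemperature libraries, which are also what the firmware excerpt in the appendix relies on. A minimal stand-alone sketch is shown below; the choice of digital pin 2 for the data line is an assumption.

#include <OneWire.h>
#include <DallasTemperature.h>

OneWire oneWire(2);                    // DS18B20 data line on digital pin 2 (assumed)
DallasTemperature sensors(&oneWire);   // 1-Wire temperature sensor driver

void setup() {
  Serial.begin(9600);
  sensors.begin();
}

void loop() {
  sensors.requestTemperatures();             // trigger a temperature conversion
  float tempC = sensors.getTempCByIndex(0);  // read the first sensor on the bus
  Serial.println(tempC);
  delay(1000);
}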
To push the statistical data into the server for both online and Android-mobile-app-based access, we moved on to interfacing the various biomedical and location-tracking sensors. We interfaced the SEN-11574 Pulse Sensor to measure the heart rate and, for location tracking, the SKM53 GPS module to acquire GPS coordinates from the satellites, so that the movements of the different football players on the field can be tracked with all the configured WSN nodes. A Microsoft SQL Server database has also been created to store the physiological parameters and the GPS coordinates of the football players.
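The SEN-11574 produces an analogue waveform that rises with every pulse. The actual firmware samples it inside a 2 ms timer interrupt (see the appendix); a simplified, polled beat counter along the following lines conveys the idea, with the analogue pin and the detection threshold being assumptions.

// Simplified, illustrative beat counter for the SEN-11574 (the real firmware
// uses the Pulse Sensor interrupt routine instead). A0 and the threshold of
// 550 (on the 0-1023 ADC scale) are assumptions.
const int PULSE_PIN = A0;
const int THRESHOLD = 550;

void setup() {
  Serial.begin(9600);
}

void loop() {
  static bool above = false;
  static unsigned long lastBeatMs = 0;

  int signal = analogRead(PULSE_PIN);     // raw sensor value, 0-1023
  if (signal > THRESHOLD && !above) {     // rising edge = one heartbeat
    above = true;
    unsigned long now = millis();
    if (lastBeatMs > 0) {
      float bpm = 60000.0 / (now - lastBeatMs);  // beats per minute from the beat interval
      Serial.println(bpm);
    }
    lastBeatMs = now;
  } else if (signal < THRESHOLD) {
    above = false;
  }
  delay(2);                               // ~2 ms sampling, as in the real firmware
}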
X-CTU includes new tools that make it easy to set up, configure and test XBee RF modules; it provides all of the tools a developer needs to get up and running with XBee quickly. Unique features such as the graphical network view, which graphically represents the XBee network along with the signal strength of each connection, and the XBee API frame builder, which intuitively helps to build and interpret API frames for XBees being used in API mode, combine to make development on the XBee platform easier than ever.
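As an illustration only (the values below are examples, not the project's actual settings), a minimal point-to-point setup for two Series 2 modules in transparent (AT) mode involves loading coordinator firmware on the sink module and router firmware on the node module in X-CTU, and then matching a handful of parameters:

ID (PAN ID) = 2015 on both modules (must be identical across the network)
DH/DL = 0/0 on the router, so that it addresses the coordinator
DH/DL = SH/SL of the target node on the coordinator
BD (baud rate) = 9600, matching the Arduino serial port
WR issued at the end to save the settings to non-volatile memory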
2. A sign-up page is then designed, providing editable text fields for the coach's login email and password, a brief introduction, and a photograph of the coach.
3. Player profiles are then created, in which the bio-data and other necessary information, such as the position and playing style of each individual player in the football team, are entered.
4. After all the players have been registered, when the coach of the football team logs in using his email and password, the following information relating to the coach and players is shown in the app:
5. A player mobility map, which tracks the location of the football player on the field using the GPS coordinates, is portrayed using Google Maps.
CHAPTER 4
4.3.2 Features
The designed and implemented ad hoc network offers a solution to serious health issues by monitoring critical health conditions. The acquired statistics of the football players can also be used to compare their performance on the field, and can support a detailed grading process that gauges how players execute their roles over the course of a game. The implemented system gives the team manager easy access, from anywhere, to the statistics used to guide management decisions. Energy-efficient communication methods are employed in the implemented system through the use of collaborative communication, and it is a cost-effective method for health monitoring of football players. The designed ad hoc network is scalable and flexible in nature, with the aim of future expansion, and the WSN nodes have an increased life cycle because of their inherently low power consumption.
CHAPTER 5
Conclusions
Our primary goal was to interface a range of biomedical sensors for continuous observation and measurement of the targeted player's physiological functions, so that the acquired statistics can be assessed for the purpose of guiding management decisions, including when to make therapeutic interventions and how to assess those interventions. Furthermore, the acquired statistics of the football players can be used to compare their performance on the field. These statistics can help identify players who are demonstrating sustained form and consistently delivering over a number of weeks, rather than just a big haul in one game.
These statistics can also support a detailed grading process that gauges how players execute their roles over the course of a game by looking at the performance of each individual on each play. Looking beyond the box score, the statistics can be used to grade how well each player performs on each play, and consequently over the course of a game and a season: how well a lineman blocks on a given play, how much space and help a runner receives, how effectively a pass rusher brings pressure, or how well a defender covers a receiver. Many additional statistics, such as yards after catch, yards after contact, missed tackles and dropped passes, can also be collected for grading individual performance on each play.
The designed and implemented ad hoc network offers a solution to serious health issues by monitoring critical health conditions. Energy-efficient communication methods are employed in the implemented system through the use of collaborative communication, due to which the network life cycle is increased. The designed ad hoc network is also scalable and flexible in nature, with the aim of future expansion. By assessing the statistics acquired by the wearable WSN kit for the purpose of guiding management decisions, including when to make therapeutic interventions and how to assess those interventions, the designed and implemented ad-hoc system shall prove to be a life saver for football players on the field.
Figure 5-1 Pie chart depicting the battery consumed by various tasks in a WSN node (cost of processing, cost of communication, cost of location tracking and cost of sensors).
The lifetime of a WSN node is bounded by the capacity of its power source and by the power consumption of the node. By harvesting energy from the environment, we can not only equip the WSN node with a practically unlimited source of energy but also increase its lifetime.
We can further implement the current designed system over Bluetooth Low Energy (BLE), a wireless communication system aimed at novel applications in the healthcare, fitness, security and home-entertainment industries [41]. BLE is intended to replace the cables connecting many types of devices, from mobile phones and headsets to heart monitors and medical equipment. BLE, or Bluetooth Smart, can provide considerably reduced power consumption and cost while maintaining a communication range similar to conventional Bluetooth wireless technology. In revamped WSN nodes, all the wires connecting the various devices to each other would be replaced with wireless connectivity, ultimately delivering an improved product with better capabilities. The resulting system would be more feasible in terms of miniaturization, on-body placement, commercial application and power consumption, the management of which is a major issue in WSNs.
REFERENCES
[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, 1st ed. Chichester, United Kingdom: Wiley, 2010, ch. 1, 2 and 15, sec. 1.1, 2.2-2.5 and 15.1-15.3, pp. 1, 17-33 and 350-357.
[2]
[3]
[4]
[5]
[6]
[7]
[8] Center for Coastal Margin Observation and Prediction. Available: http://www.ccalmr.ogi.edu/corie.
[9]
[10]
[11]
[12]
[13] J. M. Kahn et al., "Next Century Challenges: Mobile Networking for Smart Dust," in Proc. of 5th Annu. ACM/IEEE International Conf. on Mobile Computing and Networking, New York, NY, 1999, pp. 271-278.
[14]
[15]
[16]
[17]
[18] B. Girod et al., "Distributed Video Coding," in Proceedings of the IEEE, vol. 93, no. 1, January 2005, pp. 71-83.
[19]
[20]
[21]
[22] M. Rahimi et al., "Cyclops: In Situ Image Sensing and Interpretation in Wireless Sensor Networks," in Proceedings of the ACM Conference on Embedded Networked Sensor Systems (SenSys), San Diego, CA, USA, November 2005.
[23] A. Rowe et al., "A Low Cost Embedded Color Vision System," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Lausanne, Switzerland, October 2002.
[24]
[25]
[26]
[27]
[28]
[29]
[30]
[31]
[32]
[33] Sensor Technology, 1st ed., Elsevier, Burlington, MA, 2005, pp. 575-589.
[34]
[35]
[36]
[37]
[38] DS18B20 1-Wire Digital Temperature Sensor. Available: https://www.sparkfun.com/products/245.
[39]
[40]
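The firmware excerpt that follows relies on the standard Pulse Sensor example routines (interruptSetup(), serialOutput(), serialOutputWhenBeatHappens(), ledFadeToBeat()) and on the OneWire, DallasTemperature, TinyGPS and SoftwareSerial libraries. The includes and global definitions below are assumed for completeness, since they are not part of the excerpt; the pin assignments in particular are illustrative only.

#include <OneWire.h>
#include <DallasTemperature.h>
#include <TinyGPS.h>
#include <SoftwareSerial.h>

TinyGPS gps;                          // NMEA parser for the SKM53 GPS module (assumed)
SoftwareSerial ss(10, 11);            // RX, TX wired to the GPS module (assumed pins)
OneWire oneWire(2);                   // 1-Wire bus for the DS18B20 (assumed pin)
DallasTemperature sensors(&oneWire);  // temperature sensor driver
int blinkPin = 13;                    // LED that blinks with each heartbeat (assumed)
int fadePin = 5;                      // PWM LED that fades with each heartbeat (assumed)
int fadeRate = 0;                     // used by ledFadeToBeat() from the Pulse Sensor example
int led = 8;                          // indicator LED driven by coordinator commands (assumed)
int BPMS = 0;                         // last reported beats per minute
String msg;                           // command string received from the coordinator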
volatile boolean Pulse = false;      // "True" when a live heartbeat is detected, "False" when not a "live beat"
volatile boolean QS = false;         // becomes true when the Arduino finds a beat
// Regarding serial output -- set this up to your needs
static boolean serialVisual = true;  // set to 'false' by default; re-set to 'true' to see the Arduino Serial Monitor ASCII visual pulse
void setup(){
  sensors.begin();
  pinMode(blinkPin,OUTPUT);        // pin that will blink to your heartbeat!
  pinMode(fadePin,OUTPUT);         // pin that will fade to your heartbeat!
  Serial.begin(9600);              // we agree to talk fast!
  interruptSetup();                // sets up to read Pulse Sensor signal every 2 ms
  // UN-COMMENT THE NEXT LINE IF YOU ARE POWERING The Pulse Sensor AT LOW VOLTAGE,
  // AND APPLY THAT VOLTAGE TO THE A-REF PIN
  // analogReference(EXTERNAL);
  Serial.print("Testing TinyGPS library v. "); Serial.println(TinyGPS::library_version());
  Serial.println("by Dr. Ali Nawaz Khan and their FYP students in 2015");
  Serial.println();
  Serial.println("----    ----    Latitude    Longitude");
  Serial.println("                 (deg)       (deg)   ");
  Serial.println("--------------------------------------------------");
  pinMode(led,OUTPUT);
  ss.begin(9600);
}
// Where the Magic Happens
void loop(){
  float flat, flon;
  //unsigned long age, date, time, chars = 0;
  //unsigned short sentences = 0, failed = 0;
  gps.f_get_position(&flat, &flon);
  serialOutput();
  if (QS == true){                        // a heartbeat was found
    // BPM and IBI have been determined
    // Quantified Self "QS" is true when the Arduino finds a heartbeat
    digitalWrite(blinkPin,HIGH);          // blink the LED, we got a beat
    fadeRate = 255;                       // set 'fadeRate' to 255 to fade the LED with the pulse
    BPMS = serialOutputWhenBeatHappens(); // a beat happened, output that to serial
    QS = false;                           // reset the Quantified Self flag for next time
  }
  else {
    digitalWrite(blinkPin,LOW);           // there is no beat, turn off the pin 13 LED
  }
  ledFadeToBeat();                        // makes the LED fade effect happen
  sensors.requestTemperatures();          // send the command to get temperatures
  //Serial.println("DONE");
  //Serial.print("Temperature for Device 1 is: ");
  float n = sensors.getTempCByIndex(0);
  //Serial.println(n);
  //smartdelay(1000);
  //delay(1000);
  // Emit one semicolon-separated frame: nodeId;latitude;longitude;BPM;temperature;
  Serial.print("2"); Serial.print(";");
  print_float(flat, TinyGPS::GPS_INVALID_F_ANGLE, 10, 6); Serial.print(";");
  print_float(flon, TinyGPS::GPS_INVALID_F_ANGLE, 11, 6); Serial.print(";");
  Serial.print(BPMS); Serial.print(";"); Serial.print(n); Serial.print(";");
  // Handle commands sent back by the coordinator/base station.
  while(Serial.available() > 0) {
    msg = Serial.readString();
    if(msg[0]=='2') {
      digitalWrite(led,HIGH);
      Serial.println(msg);
      delay(1000);
      //delay(1000);
      //delay(1000);
    }
    if(msg[0]=='1') {
      digitalWrite(led,HIGH);
      Serial.println(msg);
      delay(1000);
      delay(1000);
      //delay(1000); // take a break
    }
  }
}
static void smartdelay(unsigned long ms)
{
unsigned long start = millis();
do
{
while (ss.available())
gps.encode(ss.read());
} while (millis() - start < ms);
}
static void print_float(float val, float invalid, int len, int prec)
{
if (val == invalid)
{
while (len-- > 1)
Serial.print('*');
Serial.print(' ');
}
else
{
Serial.print(val, prec);
int vi = abs((int)val);
int flen = prec + (val < 0.0 ? 2 : 1); // '.' and '-'
flen += vi >= 1000 ? 4 : vi >= 100 ? 3 : vi >= 10 ? 2 : 1;
for (int i=flen; i<len; ++i)
Serial.print(' ');
}
smartdelay(0);
}
static void print_int(unsigned long val, unsigned long invalid, int len)
{
char sz[32];
if (val == invalid)
strcpy(sz, "*******");
else
sprintf(sz, "%ld", val);
sz[len] = 0;
for (int i=strlen(sz); i<len; ++i)
sz[i] = ' ';
if (len > 0)
sz[len-1] = ' ';
Serial.print(sz);
smartdelay(0);
}
Coach:
package com.example.shanu.myapp;
import android.content.Intent;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
import com.parse.ParseFile;
import com.parse.ParseImageView;
import com.parse.ParseUser;
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_coach, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.logout) {
ParseUser.getCurrentUser().logOut();
startActivity(new Intent(this,Login.class));
this.finish();
return true;
}
return super.onOptionsItemSelected(item);
}
}
CustomAdaptor:
package com.example.shanu.myapp;
import android.app.Activity;
import android.content.Context;
import android.content.res.Resources;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.BaseAdapter;
import android.widget.LinearLayout;
import android.widget.TextView;
import java.util.ArrayList;
/**
* Created by Shanu on 4/5/2015.
*/
public class CustomAdaptor extends BaseAdapter
{
ArrayList<String> data = new ArrayList<String>();
private static LayoutInflater inflater=null;
Activity activity;
public Resources res;
public CustomAdaptor(Activity a, ArrayList<String> d, Resources reslocal){
activity = a;
data=d;
res = reslocal;
inflater = (LayoutInflater) activity.getSystemService(Context.LAYOUT_INFLATER_SERVICE);
}
@Override
public int getCount() {
return data.size();
}
@Override
public Object getItem(int position) {
return position;
}
@Override
public long getItemId(int position) {
return position;
}
@Override
public View getView(int position, View convertView, ViewGroup parent) {
View vi=convertView;
ViewHolder holder;
if(vi==null){
vi=inflater.inflate(R.layout.player_name,parent,false);
holder=new ViewHolder();
holder.name =(TextView) vi.findViewById(R.id.playerName);
//
holder.button =(LinearLayout) vi.findViewById(R.id.Buttons);
vi.setTag(holder);
}else{
holder=(ViewHolder) vi.getTag();
}
holder.name.setText(data.get(position));
return vi;
}

// ViewHolder referenced in getView(); its definition is assumed here since it is not shown in the excerpt.
static class ViewHolder {
    TextView name;
    LinearLayout button;
}
}
EnterTeam:
package com.example.shanu.myapp;
import android.app.Dialog;
import android.content.Intent;
import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
import android.widget.Toast;
import com.parse.ParseUser;
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_enter_team, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_skip) {
startActivity(new Intent(this,Coach.class));
this.finish();
return true;
}
return super.onOptionsItemSelected(item);
}
}
Login:
package com.example.shanu.myapp;
import android.content.Intent;
import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Toast;
import com.parse.LogInCallback;
import com.parse.Parse;
import com.parse.ParseException;
import com.parse.ParseUser;
signup=(Button) findViewById(R.id.signup);
submit=(Button) findViewById(R.id.submit);
submit.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if(dataCheck(user.getText().toString(),pass.getText().toString() )) {
ParseUser.logInInBackground(user.getText().toString(), pass.getText().toString(), new LogInCallback() {
@Override
public void done(ParseUser parseUser, ParseException e) {
if (e==null){
startActivity(new Intent(Login.this,Coach.class));
Login.this.finish();
}else{
Toast.makeText(getApplicationContext(),"Incorrect
Password",Toast.LENGTH_SHORT).show();
}
}
});
}
}
});
signup.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(Login.this,Signup.class));
}
});
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_login, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
private boolean dataCheck(String user, String pass){
boolean ret = true;
if (user.isEmpty()){
ret = false;
}
else if (pass.isEmpty()){
ret = false;
}
return ret;
}
}
MainActivity:
package com.example.shanu.myapp;
import android.content.Intent;
import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import com.parse.ParseUser;
}
}
}
};
timer.start();
}
@Override
protected void onPause() {
super.onPause();
this.finish();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
}
PlayerMovement:
package com.example.shanu.myapp;
import android.graphics.Point;
import android.os.Handler;
import android.os.SystemClock;
import android.support.v4.app.FragmentActivity;
import android.os.Bundle;
import android.view.animation.Interpolator;
import android.view.animation.LinearInterpolator;
import com.google.android.gms.maps.CameraUpdate;
import com.google.android.gms.maps.CameraUpdateFactory;
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.Projection;
import com.google.android.gms.maps.SupportMapFragment;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;
import com.google.android.gms.maps.model.MarkerOptions;
public class PlayerMovement extends FragmentActivity {
private GoogleMap mMap; // Might be null if Google Play services APK is not available.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_player_movement);
setUpMapIfNeeded();
}
@Override
protected void onResume() {
super.onResume();
setUpMapIfNeeded();
}
/**
* Sets up the map if it is possible to do so (i.e., the Google Play services APK is correctly
* installed) and the map has not already been instantiated.. This will ensure that we only ever
* call {@link #setUpMap()} once when {@link #mMap} is not null.
* <p/>
* If it isn't installed {@link SupportMapFragment} (and
* {@link com.google.android.gms.maps.MapView MapView}) will show a prompt for the user to
* install/update the Google Play services APK on their device.
* <p/>
* A user can return to this FragmentActivity after following the prompt and correctly
* installing/updating/enabling the Google Play services. Since the FragmentActivity may not
* have been completely destroyed during this process (it is likely that it would only be
* stopped or paused), {@link #onCreate(Bundle)} may not be called again so we should call this
* method in {@link #onResume()} to guarantee that it will be called.
*/
private void setUpMapIfNeeded() {
// Do a null check to confirm that we have not already instantiated the map.
if (mMap == null) {
// Try to obtain the map from the SupportMapFragment.
mMap = ((SupportMapFragment) getSupportFragmentManager().findFragmentById(R.id.map))
.getMap();
// Check if we were successful in obtaining the map.
if (mMap != null) {
setUpMap();
}
}
}
/**
* This is where we can add markers or lines, add listeners or move the camera. In this case, we
* just add a marker near Africa.
* <p/>
* This should only be called once and when we are sure that {@link #mMap} is not null.
*/
private void setUpMap() {
double lat=31.3997258;
double lon=74.2089438;
final Marker marker = mMap.addMarker(new MarkerOptions().position(new LatLng(lat,
lon)).title("Player"));
final double finalLat = lat+0.00079;
final double finalLon = lon+0.00089;
mMap.animateCamera(CameraUpdateFactory.newLatLngZoom(new LatLng(lat, lon), 17), new
GoogleMap.CancelableCallback() {
@Override
public void onFinish() {
animateMarker(marker, new LatLng(finalLat, finalLon), false);
//                animateMarker(marker, new LatLng(finalLat - 0.00079, finalLon + 0.00089), false);
            }
            @Override
            public void onCancel() {
            }
        });
//        for (int i = 0; i < 20; i++) {
//            final double finalLat = lat;
//            final double finalLon = lon;
//            Thread timer = new Thread() {
//                public void run() {
//                    try {
//                        sleep(3000);
//                    } catch (InterruptedException e) {
//                        e.printStackTrace();
//                    } finally {
////                        mMap.clear();
////                        mMap.addMarker(new MarkerOptions().position(new LatLng(finalLat, finalLon)).title("Player"));
//                    }
//                }
//            };
//            timer.start();
//        }
}
public void animateMarker(final Marker marker, final LatLng toPosition,
final boolean hideMarker) {
final Handler handler = new Handler();
PlayerProfileData:
package com.example.shanu.myapp;
/**
* Created by Farhan Ahmed on 5/24/2015.
*/
public class PlayerProfileData {
private String name;
private String age;
private String position;
PlayerProfileData(String name,String age,String position){
setName(name);
setAge(age);
setPosition(position);
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getAge() {
return age;
}
public void setAge(String age) {
this.age = age;
}
public String getPosition() {
return position;
}
public void setPosition(String position) {
this.position = position;
}
}
Signup:
package com.example.shanu.myapp;
import android.app.AlertDialog;
import android.content.DialogInterface;
import android.content.Intent;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.provider.MediaStore;
import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.ImageView;
import android.widget.Toast;
import com.parse.ParseException;
import com.parse.ParseFile;
import com.parse.ParseUser;
import com.parse.ProgressCallback;
import com.parse.SignUpCallback;
import java.io.ByteArrayOutputStream;
import java.io.File;
intent.setType("image/*");
startActivityForResult(
Intent.createChooser(intent, "Select File"),1);
} else if (items[item].equals("Cancel")) {
dialog.dismiss();
}
}
});
builder.show();
}
});
enterTeam.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if(dataEntered()){
final ParseUser user=new ParseUser();
user.setEmail(email.getText().toString());
user.setPassword(password.getText().toString());
user.setUsername(email.getText().toString());
user.put("name", name.getText().toString());
user.put("introduction", intro.getText().toString());
if(bm!=null){
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bm.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] byteArray = stream.toByteArray();
final ParseFile file=new ParseFile("dp.jpeg",byteArray);
try {
file.save();
} catch (ParseException e) {
e.printStackTrace();
}
user.put("image", file);
}
user.signUpInBackground(new SignUpCallback() {
@Override
public void done(ParseException e) {
if (e == null) {
Intent enterT = new Intent(Signup.this, EnterTeam.class);
// enterT.putExtra("name",name.getText().toString());
// enterT.putExtra("intro",intro.getText().toString());
// enterT.putExtra("imgPath",intro.getText().toString());
startActivity(enterT);
Signup.this.finish();
} else {
e.printStackTrace();
Toast.makeText(getApplicationContext(), "Something went wrong. Cannot Signup",
Toast.LENGTH_SHORT).show();
}
}
});
}
}
});
}
private boolean dataEntered() {
boolean ret=true;
String nam=name.getText().toString();
String em=email.getText().toString();
String pass=password.getText().toString();
String into=intro.getText().toString();
if(nam.isEmpty()){
Toast.makeText(getApplicationContext(),"Kindly Enter your name",Toast.LENGTH_SHORT).show();
ret=false;
}else if(into.isEmpty()){
Toast.makeText(getApplicationContext(),"Kindly Enter your introduction",Toast.LENGTH_SHORT).show();
ret=false;
}else if(!nam.contains(" ")){
Toast.makeText(getApplicationContext(),"Kindly Enter your full name",Toast.LENGTH_SHORT).show();
ret=false;
}else if(em.isEmpty()){
Toast.makeText(getApplicationContext(),"Kindly Enter your Email",Toast.LENGTH_SHORT).show();
ret=false;
}else if(pass.isEmpty()){
Toast.makeText(getApplicationContext(),"Kindly Enter your Password",Toast.LENGTH_SHORT).show();
ret=false;
}else if(!em.contains("@")){
Toast.makeText(getApplicationContext(),"Email is incorrect",Toast.LENGTH_SHORT).show();
ret=false;
}else if(pass.length()<6){
Toast.makeText(getApplicationContext(),"Password too short. Must have 6 letters",Toast.LENGTH_SHORT).show();
ret=false;
}
return ret;
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_signup, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (resultCode == RESULT_OK) {
if(requestCode==0){
Bundle extras = data.getExtras();
Uri selectedImageUri = data.getData();
String tempPath = fromCam;
//get orientation here
selectedImagePath=tempPath;
BitmapFactory.Options btmapOptions = new BitmapFactory.Options();
btmapOptions.inSampleSize=4;
bm=(Bitmap) extras.get("data");
bm = BitmapFactory.decodeFile(tempPath, btmapOptions);
img.setImageBitmap(bm);
} else if (requestCode == 1) {
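// Sketch only (not part of the original listing): the gallery image picked with
// createChooser() could be decoded using the getPath() helper defined below, e.g.
//   Uri selectedImageUri = data.getData();
//   selectedImagePath = getPath(selectedImageUri);
//   BitmapFactory.Options opts = new BitmapFactory.Options();
//   opts.inSampleSize = 4;                       // downsample to keep memory use low
//   bm = BitmapFactory.decodeFile(selectedImagePath, opts);
//   img.setImageBitmap(bm);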
}
}
}
/**
* helper to retrieve the path of an image URI
*/
public String getPath(Uri uri) {
// just some safety built in
if( uri == null ) {
Team:
package com.example.shanu.myapp;
import android.app.Dialog;
import android.content.Intent;
import android.graphics.Color;
import android.graphics.drawable.ColorDrawable;
import android.os.Handler;
import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.util.TypedValue;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.view.animation.Animation;
import android.view.animation.AnimationUtils;
import android.widget.AdapterView;
import android.widget.Button;
import android.widget.TextView;
import android.widget.Toast;
import com.parse.ParseUser;
import java.util.ArrayList;
import com.example.shanu.myapp.helper.SwipeMenu;
import com.example.shanu.myapp.helper.SwipeMenuCreator;
import com.example.shanu.myapp.helper.SwipeMenuItem;
import com.example.shanu.myapp.helper.SwipeMenuListView;
public class Team extends ActionBarActivity {
String[] teamRefined;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_team);
String teamGannd= ParseUser.getCurrentUser().getString("team");
teamRefined=teamGannd.split(";");
ArrayList<String> arrayList=new ArrayList<>();
for (int i=0;i<teamRefined.length;i++){
arrayList.add(teamRefined[i].split(",")[0]);
}
for (int j=teamRefined.length+1;j<=11;j++){
arrayList.add("Player "+j);
}
// arrayList.add("Abdul Mannan");
// arrayList.add("Abdul Qadir");
// arrayList.add("Usman Riaz");
// arrayList.add("Shan e Fazal");
// arrayList.add("Player 5");
// arrayList.add("Player 6");
// arrayList.add("Player 7");
// arrayList.add("Player 8");
// arrayList.add("Player 9");
// arrayList.add("Player 10");
// arrayList.add("Player 11");
CustomAdaptor adaptor=new CustomAdaptor(this,arrayList,getResources());
final SwipeMenuListView listView=(SwipeMenuListView) findViewById(R.id.listyn);
listView.setAdapter(adaptor);
// ... (the start of the SwipeMenuCreator that builds deleteItem and openItem is omitted here in the listing)
deleteItem.setBackground(new ColorDrawable(Color.parseColor("#ececec")));
// set item width
deleteItem.setWidth(dp2px(90));
// set a icon
deleteItem.setIcon(R.drawable.ic_action_action_account_box);
// add to menu
menu.addMenuItem(deleteItem);
// set item background
openItem.setBackground(new ColorDrawable(Color.parseColor("#444333")));
// set item width
openItem.setWidth(dp2px(90));
// set item title
openItem.setIcon(R.drawable.ic_action_device_dvr);
// set item title fontsize
openItem.setTitleSize(18);
// set item title font color
openItem.setTitleColor(Color.WHITE);
// add to menu
menu.addMenuItem(openItem);
}
private int dp2px(int dp) {
return (int) TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, dp,
getResources().getDisplayMetrics());
}
};
// set creator
listView.setMenuCreator(creator);
// step 2. listener item click event
listView.setOnMenuItemClickListener(new SwipeMenuListView.OnMenuItemClickListener() {
@Override
public boolean onMenuItemClick(final int position, SwipeMenu menu,int index) {
switch (index) {
case 0:
Animation anim = AnimationUtils.loadAnimation(
Team.this, R.anim.slide_out_left);
anim.setDuration(800);
listView.getChildAt(position).startAnimation(anim);
new Handler().postDelayed(new Runnable() {
public void run() {
// dialog edit Player
dialogPlayer(position);
}
}, anim.getDuration());
break;
case 1:
// open
startActivity(new Intent(Team.this,PlayerMovement.class));
break;
}
return false;
}
});
// set SwipeListener
listView.setOnSwipeListener(new SwipeMenuListView.OnSwipeListener() {
@Override
public void onSwipeStart(int position) {
// swipe start
}
@Override
public void onSwipeEnd(int position) {
// swipe end
}
});
// other setting
// listView.setCloseInterpolator(new BounceInterpolator());
// test item long click
listView.setOnItemLongClickListener(new AdapterView.OnItemLongClickListener() {
@Override
public boolean onItemLongClick(AdapterView<?> parent, View view,
int position, long id) {
Toast.makeText(getApplicationContext(),
position + " leave it", Toast.LENGTH_LONG).show();
return false;
}
});
}
private void dialogPlayer(final int pos) {
final Dialog enterPlayer=new Dialog(this);
enterPlayer.setContentView(R.layout.new_player_entry);
final TextView playerName=(TextView) enterPlayer.findViewById(R.id.dialogName);
final TextView playerAge=(TextView) enterPlayer.findViewById(R.id.dialogAge);
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_team, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_newPlayer) {
startActivity(new Intent(this,EnterTeam.class));
this.finish();
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
protected void onStop() {
String team="";
for (int a=0;a<teamRefined.length;a++){
team=team+teamRefined[a]+";";
}
ParseUser.getCurrentUser().put("team",team);
ParseUser.getCurrentUser().saveEventually();
super.onStop();
}
}
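onCreate() above splits the stored team string on ';' and ',', and onStop() re-joins it with ';', which implies an encoding of roughly "name,age,position;name,age,position;...". A hedged sketch of that round trip (the class and method names are illustrative, not from the project code):

// Illustration only: encoding/decoding the "name,age,position;..." team string.
public class TeamStringCodec {

    // Encode a list of players into the single string stored on the ParseUser.
    static String encode(java.util.List<PlayerProfileData> players) {
        StringBuilder sb = new StringBuilder();
        for (PlayerProfileData p : players) {
            sb.append(p.getName()).append(',')
              .append(p.getAge()).append(',')
              .append(p.getPosition()).append(';');
        }
        return sb.toString();
    }

    // Decode the stored string back into player records.
    static java.util.List<PlayerProfileData> decode(String team) {
        java.util.List<PlayerProfileData> players = new java.util.ArrayList<>();
        for (String entry : team.split(";")) {
            if (entry.isEmpty()) continue;
            String[] fields = entry.split(",");
            players.add(new PlayerProfileData(fields[0],
                    fields.length > 1 ? fields[1] : "",
                    fields.length > 2 ? fields[2] : ""));
        }
        return players;
    }
}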
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tblTrackingInfo](
[intCode] [int] IDENTITY(1,1) NOT NULL,
[fltLat] [float] NULL,
[fltLang] [float] NULL,
[fltTemp] [float] NULL,
[fltHeart] [float] NULL,
[fltExtra1] [float] NULL,
[fltExtra2] [float] NULL,
[fltExtra3] [float] NULL,
[fltExtra4] [float] NULL,
[strExtra1] [nvarchar](250) NULL,
[strExtra2] [nvarchar](250) NULL,
[strExtra3] [nvarchar](250) NULL,
C Sharp Code:
// SQL SERVER CODE
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.IO.Ports;
using System.Configuration;
using System.Threading;
using System.Data.SqlClient;
namespace ArdineoTesting
{
public partial class Form1 : Form
{
SerialPort currentPort;
bool portFound;
String RxString = String.Empty;
static string connetionString = "Data Source=AQCHAUHAN-VAIO;Initial Catalog=dbTracking;User ID=sa;Password=sedese";
SqlConnection con = new SqlConnection(connetionString);
Int32 intCounter = 0;
public Form1()
{
InitializeComponent();
}
// port.Write(s + '\n');
//}
//port.Close();
//SetComPort();
//DetectArduino();
}
private void arduinoBoard_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
String str = arduinoBoard.ReadLine();
string[] dataArray = str.Split(';');
if (dataArray.Length > 4)
{
//RxString = arduinoBoard.ReadExisting();
}
}
private void DisplayText(object sender, EventArgs e)
{
textBox1.AppendText(RxString);
}
// ... (the remainder of a truncated helper method was spliced in here by the page break)
//   return false;
// }
// catch (Exception e)
// {
//   return false;
// }
private void textBox1_TextChanged(object sender, System.EventArgs e)
{
}
private void buttonStop_Click(object sender, System.EventArgs e)
{
}
}
}
PROJECT ID: 05
TITLE:
TOTAL NUMBER OF WEEKS IN PLAN: 40

No.  STARTING WEEK   DESCRIPTION OF MILESTONE   DURATION
1    September       -                          2 Weeks
2    October         Literature Review          5 Weeks
3    November        -                          5 Weeks
4    December        -                          6 Weeks
5    January         -                          3 Weeks
6    February        -                          7 Weeks
7    March           -                          4 Weeks
8    April           -                          2 Weeks
9    May             -                          4 Weeks
10   June            -                          2 Weeks