
Multi User live video streaming of events via mobile applications

Jonathan Hussey

Submitted in partial fulfilment of the requirements of Edinburgh Napier University for the Degree of MEng in Software Engineering

School of Computing

April 2014

Abstract

Social media platforms have acted as a catalyst for users to share their life experiences by means of video, images and text. Live video streams are becoming increasingly commonplace, causing an explosion in content generation, and viewing patterns are changing just as quickly. The aim of this dissertation is to build a proof of concept which allows users to broadcast and view live video streams over the internet from an event, and for the broadcasts to be consolidated in real time in a central location.

This is accomplished by creating a mobile application (an Android application using the Libstreaming library), a streaming server composed of several technologies (VMware, Ubuntu, FFMPEG, Node.js, Socket.IO, Nginx and the Nginx RTMP module) and a web application using JW Player and jQuery. A component-based, iterative approach is used to develop the project, which is then evaluated using integration testing, usability testing and software quality metrics. The project outcomes are appraised against its objectives and recommendations are provided for future work.

Honours Project - SOC10101

Contents

Acknowledgements 7
Authorship Declaration 8
Data Protection Declaration 9
1. Introduction 10
1.1. Project Overview 10
1.2. Aims & Objectives 10
1.3. Report Structure 11
2. Literature & Technology Review 14
2.1. Introduction 14
2.2. Existing Applications 15
2.2.1. Skype 16
2.2.2. Google Hangouts 16
2.2.3. Apple Facetime 17
2.2.4. Ustream.tv 17
2.2.5. Twitch.tv 18
2.2.6. Established and emerging sharing applications 18
2.2.7. Conclusion 19
2.3. Video Streaming 20
2.3.1. Introduction 20
2.3.2. Video and Audio Coding 20
2.3.3. Compression 21
2.3.4. Network Protocols 22
2.3.5. Conclusion 22
2.4. Cellular Networks 23
2.4.1. Introduction 23
2.4.2. Evolution of cellular technologies 23
2.4.3. Conclusion 26
2.5. The Rapid Evolution of Mobile Technologies 26
2.6. Ethics & Legal Issues 28
2.7. Conclusion 30
3. Requirements Engineering 31
3.1. Introduction 31
3.2. Architecture 31
3.2.1. Mobile Development 32
3.2.2. Streaming Server 34
3.2.3. Web Application 35
3.2.4. Real Time Monitoring 36
4. System Modelling 37
4.1. Mobile Application 37
4.1.1. The Android OS Architectural Overview 38
4.1.2. The Android API 40
4.1.3. Developer libraries 42
4.2. The Streaming Server 43
4.2.1. Streaming media server 44
4.2.2. FFMPEG 45
4.2.3. Socket IO 47
4.2.4. NodeJS 47
4.3. Web Application 48
5. System Design 51
6. Implementation 56
6.1. The first iteration 57
6.2. The second iteration 64
6.3. The third iteration 65
6.4. The fourth iteration 69
6.5. Final Iteration 75
6.6. Results 77
7. Product Evaluation 80
7.1. Integration testing 80
7.1.1. Testing approach 81
7.2. Usability Testing 82
7.2.1. Preparation 82
7.2.2. Testing Session 83
7.2.3. Findings 84
7.2.4. Feedback and Questions 84
7.3. Software Quality Metrics 86
7.4. Conclusion 88
8. Process Evaluation 89
8.1. Key processes and tools 89
8.2. Project Management 91
8.3. Reflection 91
9. Conclusion 93
9.1. Introduction 93
9.2. Achievement of objectives 93
9.2.1. Objective 1 93
9.2.2. Objective 2 93
9.2.3. Objective 3 94
9.2.4. Objective 4 94
9.2.5. Objective 5 94
9.2.6. Objective 6 95
9.2.7. Objective 7 95
9.2.8. Comparison of application to the work of others 96
9.3. Further Work 97
References 99
Appendix 1 – IPO 102
Appendix 2 – Report on the Interim Review Meeting 106
Appendix 3 – Project Diary 108
Appendix 4 – Report on the IPO 112
Appendix 5 – Network Protocols 114
Appendix 6 – Development Tools 115
Appendix 7 – Node.js Server application 117
Appendix 8 – Mobile Application 119
Appendix 9 – Web Application 124
Appendix 10 – Testing session log 127
Appendix 11 – Project Management 128


List of Figures

Figure 1 - Transport protocol network hierarchy 22
Figure 2 - Progression of Cellular Technologies 25
Figure 3 - Hardware components of popular mobile devices 27
Figure 4 - Smartphone Sales 2007-2013 28
Figure 5 - Proposed System Composition 32
Figure 6 - Distribution of Android devices 38
Figure 7 - Android System Structure 39
Figure 8 - The Activity lifecycle 41
Figure 9 - Use Cases for Viewers and Streamers 52
Figure 10 – Web Application - Events page 52
Figure 11 – Mobile Application - User Interface - log in | stream | event 53
Figure 12 - Mobile Application – User interface class diagram 54
Figure 13 - Activity Diagram of the Streaming Process 55
Figure 14 - Bitbucket - project management through version control 56
Figure 15 - Testing Management 58
Figure 16 - Command to launch an Android virtual device 59
Figure 17 - Mobile application on an emulated Android device 60
Figure 18 - JW Player live stream configuration 66
Figure 19 - Nginx RTMP configuration 67
Figure 20 - FFMPEG command 67
Figure 21 - SDP configuration for live streams 70
Figure 22 - Send SDP configuration to server on socket 8089 70
Figure 23 - FFMPEG method for streaming 73
Figure 24 - Random port selection 77
Figure 25 - System results step 1 77
Figure 26 - Retrieval of a live stream 78
Figure 27 - Live stream displayed on multiple browsers 78
Figure 28 - Successfully added second live stream 79
Figure 29 - Testing process on development tasks 81
Figure 30 - Quality metrics 86
Figure 31 - ServerToMobile_alt.js quality metrics 87
Figure 32 - ServerToMobile.js metrics report 87
Figure 33 - Git development timeline 89

List of Tables

Table 1 - Transfer rates of cellular technologies 26


Acknowledgements


I would like to thank Rob Kemmer for his support and guidance throughout the entire project; his advice and enthusiasm shaped the direction and outcome of the project.

I would also like to thank the team at Winterwell Associates, who agreed to test the application and take part in a usability testing and feedback session.

I would like to thank Kameshia Hussey and Roscoe McInerney for providing Android devices to develop and test the system.


Authorship Declaration


I, Jonathan Hussey, confirm that this dissertation and the work presented in it are my own achievement.

Where I have consulted the published work of others this is always clearly attributed;

Where I have quoted from the work of others the source is always given. With the exception of such quotations this dissertation is entirely my own work;

I have acknowledged all main sources of help;

If my research follows on from previous work or is part of a larger collaborative research project I have made clear exactly what was done by others and what I have contributed myself;

I have read and understand the penalties associated with Academic Misconduct.

I also confirm that I have obtained informed consent from all people I have involved in the work in this dissertation following the School's ethical guidelines.

Signed:

Date: 25 April 2014

Matriculation no: 10010000


Data Protection Declaration


Under the 1998 Data Protection Act, the University cannot disclose your grade to an unauthorised person. However, other students benefit from studying dissertations that have their grades attached.

Please sign your name below one of the following options to state your preference.

The University may make this dissertation, with indicative grade, available to others.

The University may make this dissertation available to others, but the grade may not be disclosed.

The University may not make this dissertation available to others.


1. Introduction

1.1. Project Overview


Social media platforms have acted as a catalyst for users to share their life experiences by means of video, images and text. In addition, IT infrastructures have improved tremendously with the advent of the internet, the explosive rise in smartphone use and the movement from 3G to 4G telecommunications.

Different formats of social sharing exist across a wide spectrum of services such as Facebook, Twitter, Google+ and Instagram, with each attempting to deliver a unique type of sharing experience to its users. This is further supplemented by the availability of smartphones across all demographics, with the majority of smartphones providing access to a camera, the internet and sharing applications.

The focus of this project is on video sharing services and how a combination of cutting-edge technologies can be used to create an even richer experience through the live sharing of video streams.

1.2. Aims & Objectives

The aim of this project is to build a system which allows users to broadcast and view live video streams over the internet from an event. This will involve creating a mobile application that allows users to record video and broadcast a feed, a streaming server that captures and automatically rebroadcasts the stream, and a web application that updates dynamically when new streams arrive.

This automatic sharing of live video streams is the novelty of the system and will be accomplished by:

1. Investigating, researching and analysing existing social sharing applications with a focus on live streaming and video sharing services.


2. Identifying the requirements of the system based on existing technologies needed to accomplish the project objectives.

3. Selecting and justifying technologies that will be used to build the system architecture based on the requirements.

4. Designing the structure of the system and mapping the components of the different technologies identified from the requirements into a functional system.

5. Implementing the system using the technologies identified based on the design of the system and the functionality identified in the requirements.

6. Testing the system through integration and user testing.

7. Evaluating the software system using applicable metrics and analysis tools, and evaluating the conduct of the project including achievement of aims, project management and choice of technologies.

1.3. Report Structure

Chapter 1: Introduction

This chapter will present an overview of the purpose of the project including the aims and objectives.

Chapter 2: Literature and Technology Review

This chapter aims to provide the context for the aims and objectives of the project. An overview of the smartphone market, social sharing services and cellular technologies will be provided.


Chapter 3: Requirements Engineering


In this chapter the functional and non-functional requirements of each aspect of the system will be specified.

Chapter 4: System Modelling

This chapter will describe each section of the system, mapping the components of each section to a technology.

Chapter 5: Design

Each component of the system identified in Chapter 4 will be designed in this chapter. The system processes and lifecycles for each component will be modelled using UML.

Chapter 6: Implementation

In this chapter the implementation process will be discussed and the challenges and obstacles encountered will be described, along with the decisions taken to overcome them.

Chapter 7: Product Evaluation

This chapter discusses the testing process throughout the development lifecycle as well as the user testing performed on the system. Software metrics will be used to assess the quality of the resulting system.

Chapter 8: Process Evaluation

The processes, project activities and the management of the project are reviewed in this chapter.


Chapter 9: Conclusion


Further work, analysis of the project against its objectives and comparison with other work are discussed in this chapter.

References

This will include a list of all the resources used for the project.

Appendices

This will contain project information, source code, the project diary and screenshots.


2. Literature & Technology Review

2.1. Introduction


This review discusses existing applications for the live streaming of video across different platforms. It looks at trending applications and the techniques underpinning them. Video streaming, codecs, compression and network protocols are discussed, along with cellular networks and the improvement in transfer rates. The evolution of mobile technologies is reviewed, with a final look at the ethics and legal issues surrounding social media and live streaming.

By 2017 it is expected that over 4.7 petabytes of video data will be consumed per month by users of mobile devices (“Cisco Visual Networking Index,” 2013). Smartphone sales are expected to reach over one billion units (CCS Insight, n.d.) with over 1.7 billion subscribers to 3G services and over 126 million subscribers to 4G globally (“Global mobile statistics 2013 Part B,” n.d.).

Services such as YouTube receive over 1 billion unique visitors per month and over a million new subscriptions per day (“Statistics – YouTube,” n.d.). Facebook, another social media service, has over 1.23 billion monthly active users and over 100 million active mobile users (“Company Info | Facebook Newsroom,” n.d.). Skype has over 299 million users (Swider, 2013) and Instagram over 100 million (Systrom, n.d.). Twitter has over 241 million monthly active users, who send over 500 million tweets per day (“About Twitter, Inc. | About,” n.d.).

Users of social media services have varied and interesting ways in which to share their life experiences. There are numerous sharing sites and applications that place their emphasis on video, images or text. These services are at the core of social sharing experiences, and with the trends of increasing smartphone sales, growing data usage and shifts to higher data packages on cellular networks, newer and more innovative applications are emerging.

The next section reviews existing sharing applications with a focus on live video streaming and video sharing services. The evolution of the smartphone market and cellular networks will be discussed, as the improvements to these technologies have contributed vastly towards enabling users' access to the internet with quick and simple methods to distribute content. Social media sharing services play an integral role in connecting users globally; however, copyright infringement and legal considerations present new challenges for this fast-paced, ever-changing environment. The legal implications and challenges will be presented in the final section of this chapter.

2.2. Existing Applications

Social media is bringing a new dimension to the sharing of user experiences. With a mobile phone, access to the internet and the click of a button, a user can share video, text or images. The services offered by social media providers deliver these experiences in varying ways. Despite this, there has been no attempt to provide a real-time video sharing platform which consolidates video streams from broadcasters and groups them into similar events.

Cha et al. (2007) state that User Generated Content (UGC) sites are creating new viewing patterns and social interactions. This trend is certainly apparent in the myriad of social sharing sites allowing users to create content and openly broadcast to a list of subscribers, a general audience or the public domain.

The most notable services for the streaming of live video are Skype, Google Hangouts, Ustream.tv, Justin.tv, Twitch.tv and Apple's Facetime, all of which, in different ways, allow users to stream to one or more other users from a mobile device or computer. Some of these services offer direct mobile-to-mobile live streaming, while others use a mediator to pass a video stream from the broadcaster to an audience.

These are not the only video sharing services; others have taken a different route as to how video, as well as text and audio, is shared, with each application appealing to a different demographic or targeting users across a broader area.


Video sharing services such as Snapchat, Vine and Tumblr provide users with unique ways of sharing video. These can be seen as the young companies within the video sharing context; the next section will therefore focus on the larger, more established brands before progressing to more recent services.

2.2.1. Skype

Founded in 2003 and acquired by Microsoft in 2011 (Bright, 2011), Skype is a voice-over-IP service that allows users or groups of users to contact each other and share text, voice and video. It allows for the live streaming of voice and video to other mobile or desktop users with access to the internet.

Skype uses a peer-to-peer service (“What are P2P communications?,” n.d.) that allows users of different devices to communicate directly with each other without the overhead of server-based processing and interpretation.

Users are required to register before communicating through Skype. Once registered, users are able to add contacts, make calls, instant message, video message and conference, send files and more.

In order to communicate between devices Skype uses its own proprietary transfer protocol. It operates over IPv4 using audio codecs such as G.729, SVOPC and SILK (Wurm, 2013). Video codecs include VP7 on older versions and Google's open source VP8 codec as the default on current versions (Roettgers, 2011a), with H.264 used for high definition video (Ladha, 2014).

2.2.2. Google Hangouts

Released in 2013 (Wiesen, 2013), Google Hangouts follows the same principle as Skype in that it is a service that allows users to contact other users through voice and video, with messaging and group video chat. Its approach is slightly different: it live streams video using H.264/SVC, with an aim to switch to Google's open source VP8 video codec (Roettgers, 2011b). For text, voice and video, Google Hangouts utilises WebRTC, a framework for real-time communications.

Its cloud video conferencing platform has incorporated live streaming for events and live conversations, with some of the most renowned celebrities using the platform. This allows users with an account to broadcast live, using the platform not just for conference calls but for other live streaming tasks. Live broadcasts include user commentary and events such as concerts, sermons, gaming and more.

2.2.3. Apple Facetime

Another service offering live video streaming is Apple's Facetime. Limited to Apple products, Facetime allows users to video chat with other Apple users.

Facetime uses the H.264 video codec and the AAC audio codec, with SIP, STUN, TURN, ICE, RTP and SRTP used for managing and transferring video streams between Apple users (Dilger, 2010).

2.2.4. Ustream.tv

Founded in 2007, Ustream.tv is a live social video streaming service offered on multiple platforms that allows users to broadcast from mobile phones, tablets, computers and the web. It offers live chat integration as well as recording of streams, with an administration interface to manage the production of streams.

Its mobile platform allows for streaming on iPhone, iPad and Android devices over 3G, 4G and Wi-Fi connections. Viewers of streams can search and filter categories of broadcasts and be notified of broadcasts happening within their vicinity. The service also integrates with social media services such as Twitter and Facebook.

The service spans a multitude of industries, including government, education, the private sector, media, religion and more. Within the mobile application the user has the option to select the quality level, receive feedback on network speeds and create new broadcasts.


The architecture of Ustream.tv consists of the Ustream Media Server (UMS), the Ustream TCP Congestion Control Algorithm (UTCP) and the Ustream Content Delivery Network (UCDN). Ustream.tv's cloud platform automatically finds the best route to the end-user, avoiding congested paths, using an optimised TCP congestion control algorithm to serve content without buffering (Levy, 2013).

2.2.5. Twitch.tv

Twitch.tv is a live streaming service targeted at the gaming community, as well as general broadcasting, and has existed since June 2011. It is a platform which allows gamers to stream from a console device to an audience of gamers or viewers.

Users are able to watch live streams over Wi-Fi or a mobile network and their mobile application allows streams to be viewed with the ability to control the quality of content received.

Twitch.tv has over 28 million viewers, over 600,000 broadcasters and more than 3,500 members on the Twitch partner program (“Twitch Announces Record-Breaking Growth for Live Streamed Video Game Content and Audience | Business Wire,” 2013).

2.2.6. Established and emerging sharing applications

Snapchat is a service that allows users to capture photo, video and text and share them with a group of friends. The service deletes a message after ten seconds, which is unorthodox; however, the premise behind this functionality is that it allows users to send different types of messages without backlash from authorities or the national media. This is appealing, as intimate or offensive content can be shared between users while the impact of such content dissipates almost immediately.

Vine is another social sharing service focused on the generation of video content. Vine is provided by Twitter and allows users to create and share short looping videos of up to six seconds with close Twitter integration (Sippey, 2013).


2.2.7. Conclusion


Existing applications offer users a variety of ways in which video, images and text can be shared, and each application delivers content to users using different approaches and technologies. Considering these sharing services and the functionality they provide, it is apparent that their capabilities are being fuelled by other factors, such as improvements in internet services, cellular networks and mobile technologies. The next section focuses on the technological improvements and architectures which have enabled the explosion of video sharing services.


2.3. Video Streaming

2.3.1. Introduction


Streaming technologies provide users with an alternative means of distributing content over the internet. This has led to a paradigm shift in the way people view content, with live streaming and video on demand being provided by content generators such as the BBC, Netflix, AppleTV, YouTube and Ustream.tv. Streaming is used in video conferencing, distance learning, digital libraries, telepresence and video on demand (Dapeng Wu et al., 2000). The consumption of video content has changed with the advent of new viewing habits, distribution channels, content providers, transport mechanisms, web-based content and social networks (Minoli, 2012, p. 1).

Traditional means of content provision have declined, with corporations embracing these technologies to function alongside their original provision methods. What was once provided by broadcast networks, satellite backbone providers, IPTV and stored media (DVD, Blu-ray) is now accessible through the internet. Televisions now deliver content over internet connections through services such as Netflix and other video on demand services. Video content can be viewed on gaming consoles, smartphones and portable media devices such as the iPad.

To achieve the transmission of video over the internet, a streaming system must first encode an audio/video stream with an audio/video codec, combine the streams into a single container, and then deliver the container over a network using a network protocol. Many formats exist for audio/video codecs and their containers, and a number of open source and proprietary protocols exist for transferring the container streams to viewers. These are examined throughout this section as they play an important role in streaming media systems.
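The three stages above (encode, containerise, deliver) map directly onto a single FFMPEG invocation of the kind used later in this project. The sketch below only builds the command line rather than running it; the input file, host and stream name are illustrative placeholders, not values from this project.

```javascript
// The delivery pipeline as ffmpeg arguments:
// encode the audio/video, mux into a container, hand the container to a protocol.
const pipeline = {
  encode: ['-c:v', 'libx264', '-c:a', 'aac'],   // H.264 video + AAC audio codecs
  container: ['-f', 'flv'],                      // mux both streams into an FLV container
  deliver: ['rtmp://example.com/live/stream1'],  // deliver over the RTMP protocol
};

const command = ['ffmpeg', '-i', 'camera.mp4',
  ...pipeline.encode, ...pipeline.container, ...pipeline.deliver];

console.log(command.join(' '));
// ffmpeg -i camera.mp4 -c:v libx264 -c:a aac -f flv rtmp://example.com/live/stream1
```

Passing this argument list to a process spawner would start the full encode-and-publish pipeline, assuming ffmpeg is installed.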

2.3.2. Video and Audio Coding

Video coding and audio coding technologies allow for video and audio to be delivered in a particular format at a fraction of the size of the original capture. Compression techniques such as rate reduction (Rupp, 2009, p. 102) allow for video to be transferred at lower bandwidth rates, and video quality can be improved through error concealment and redundancy detection. This is based on techniques such as spatial and temporal prediction within video streams, which allow recurring pixels to be remembered rather than repeatedly processed.

Video streams can be significantly reduced by current video coding formats, as is evident from the movement from H.264 (Advanced Video Coding) to H.265 (High Efficiency Video Coding), which achieves double the data compression ratio.

Multiple codecs have been created and incorporated into operating systems and devices. These include MPEG-4 Part 10/H.264/AVC, VP6, VC-1, MPEG-2 Part 2 and many more. In today's streaming industry the most common codecs in use are H.264 AVC/SVC and MPEG. Lee et al. (2009) clarify this further by highlighting the benefits of H.264 AVC and H.264 SVC and why these codecs are suited to today's demands for mobile video streaming.

Video streaming involves methods such as Video on Demand (VOD), video multicast, video chat and video uplink (Pande et al., 2013). Delivery mechanisms such as progressive download account for 60% of cellular traffic. Streaming digital video involves the processes of encoding, transmission and decoding in the delivery pipeline. Lee et al. (2009) state that users of digital video perceive quality in three ways: temporal, spatial and picture quality, all closely tied to the frame rate of a video and the number of pixels in a video frame.
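A back-of-the-envelope calculation makes the relationship between frame rate, pixel count and bandwidth concrete, and shows why the coding techniques above are essential. The function names and figures below are illustrative, not taken from the dissertation.

```javascript
// Raw bit rate of uncompressed video:
// pixels per frame x bits per pixel x frames per second.
function rawBitrateBps(width, height, bitsPerPixel, fps) {
  return width * height * bitsPerPixel * fps;
}

// Compression ratio a codec must achieve to fit the raw stream
// into a target network bit rate.
function requiredCompressionRatio(rawBps, targetBps) {
  return rawBps / targetBps;
}

// 640x480 at 24-bit colour and 30 fps against a 500 kbit/s mobile uplink.
const raw = rawBitrateBps(640, 480, 24, 30);          // 221,184,000 bit/s
const ratio = requiredCompressionRatio(raw, 500000);  // roughly 442:1
```

Even a modest VGA stream needs a compression ratio in the hundreds before it will fit a cellular uplink, which is what spatial and temporal prediction in codecs such as H.264 provide.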

2.3.3. Compression

Compression has played a pivotal role in a number of computing tasks, ranging from data transfers over a network to file compression. Dodd (2012, p. 15) identifies a number of compression methods such as MPEG, MP3, Advanced Audio Coding (AAC) and WinZip, which pertain to video, audio and file compression. “Compression reduces the size of video, data, image, and voice files. This decreases the amount of network capacity needed to transmit files and increases throughput without changing the actual speed of the line” (Dodd, 2012, p. 14).


2.3.4. Network Protocols


Encoded video can travel over a network to a receiver by means of a network protocol. Many streaming technologies incorporate network protocols such as the Real-Time Transport Protocol (RTP), Real-Time Streaming Protocol (RTSP), Apple HTTP Live Streaming (HTTP Adaptive Streaming) and HTTP Flash progressive download (Minoli, 2012, p. 242).


Figure 1 - Transport protocol network hierarchy

Streaming protocols are commands, processes and procedures that can be used to select, set up and start the playing, as well as the pausing, recording and tear-down, of streaming sessions (Minoli, 2012, p. 241). Streaming can happen over different architectures such as peer-to-peer, server to a single client (unicast) and server to multiple clients (multicast).
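As a concrete example of the server side of such protocols, the fragment below sketches a minimal RTMP ingest configuration for the nginx-rtmp-module, a technology used later in this project. The application name and port shown are common defaults, not this project's actual settings.

```nginx
rtmp {
    server {
        listen 1935;     # default RTMP port

        application live {
            live on;     # accept live streams published to
        }                # rtmp://<host>/live/<stream-key>
    }
}
```

A broadcaster publishes to the `live` application over RTMP, and the server can then relay the same stream to any number of viewers.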

2.3.5. Conclusion

Advances in the aforementioned topics revolve around the development and improvement of video and audio codecs, compression techniques and network protocols. In a streaming system, video and audio are transferred over a network. This was not practical until the last decade, when the demand for data-intensive services influenced cellular network progression and infrastructural changes on a global scale.


2.4. Cellular Networks

2.4.1. Introduction


Demand for rich media services is driving the need for high performance networks. Over the past two decades technological innovations have allowed cellular infrastructures to move from analogue to digital and from voice-only services to voice and data. There are over 1.7 billion 3G subscribers and 126 million 4G subscribers globally (“Global mobile statistics 2013 Part B,” n.d.). Part of the success of this technology comes from advances in hardware as well as improved techniques for transmitting, managing and manipulating data over radio waves. Some of these changes involved adapting new technologies to existing ones due to the immense cost involved, whilst the influence of the internet forced cellular technologies to incorporate new system architectures.

This section gives a brief history of cellular networks, discusses the evolving technologies and their data rate improvements, and considers the evolution of related technologies and their influence.

2.4.2. Evolution of cellular technologies

Mobile networks evolved through different stages of infrastructure, beginning with A-Netz in 1958, followed by B-Netz in 1972 and C-Netz in 1986 (Rupp, 2009, p. 5). It was not until the 1990s that the second generation (2G) of mobile communication technology, known as the Global System for Mobile Communications (GSM), was introduced. 2G services were designed for voice traffic, and due to the influence of the internet the General Packet Radio Service (GPRS) was created, a packet-switched-only service enabling data transfers.

The movement towards third generation (3G) mobile networks was instigated by the need for more capacity and for providers to offer more revenue-generating services (Dodd, 2012, p. 324). 3G technologies such as the Universal Mobile Telecommunications System (UMTS), introduced in the early 2000s, offered support for higher rates of data transfer, with data rates moving from 10-60 kbit/s on GPRS to 384 kbit/s on UMTS. Round trip times were reduced from 1000 ms to 140 ms and further capabilities could be achieved, including live video streaming and video chat (Rupp, 2009, p. 6).

The demand for data services continued to increase and 3G was extended with High Speed Downlink Packet Access (HSDPA), offering data rates of up to 14.4 Mbit/s. Through new coding schemes and modulation techniques, this service enabled even more features, with a focus on better scheduling algorithms, retransmission and adaptive modulation.

Introduced in 2004, Long Term Evolution (LTE), a fourth generation (4G) technology, transcends the capabilities of 3G cellular networks with the capacity to provide broadband-level services to mobile users. LTE is intended to achieve higher download rates, an all-IP packet infrastructure, support for mobile TV, better use of the signalling spectrum and worldwide roaming, delivering up to 100 Mbps on downlink transmissions (Dodd, 2012, pp. 332–333).

These cellular technologies can be found in the latest mobile devices, and most mobile networks offer at least 3G services. Most digital mobile handsets are tri-mode or quadruple-mode. Tri-mode phones operate with three air interfaces, for instance GSM, Wideband Code-Division Multiple Access (WCDMA) and High-Speed Packet Access (HSPA); quadruple-mode phones operate with four, such as GSM, WCDMA, HSPA and LTE (Dodd, 2012). In time, cellular networks will move exclusively to 4G technologies and users will be able to enjoy the full richness of available media services. Data rates play an important role for users of mobile services, and improvements in how cellular technologies handle potential problems such as latency, jitter and quality of service will greatly enhance the user experience.

Through the evolution of cellular networks other technologies have benefitted and, when combined, unique service offerings emerge for commercial services. Figure 2 demonstrates the parallel progression of system applications, network technologies, hardware platforms and radio technology from Raychaudhuri and Mandayam (2012).


Figure 2 - Progression of Cellular Technologies

It can be observed that as network technologies progressed, system applications became increasingly versatile, moving from telephony, text and SMS in the early 2000s to location-aware apps, mobile video and mobile applications circa 2010.

These observations are relevant to this project as the convergence of multiple technologies has created an opportunity: today's mobile devices, current data transfer rates over cellular networks and improved camera quality within mobile devices allow for richer applications to be developed. Table 1 demonstrates the improvement of transfer rates over the course of evolving services. The movement from 3G to 4G will see an increase in upload rates from 384 Kbps to speeds of up to 86 Mbps. The global market is predominantly 3G, which provides users with support for HSUPA upload rates of up to 1.4Mbps.


Service                                                   Peak download   Average download   Peak upload   Average upload
CDMA2000 1X                                               153.6Kbps       64Kbps             153.6Kbps     64Kbps
CDMA2000 1xEV-DO                                          2.4Mbps         500Kbps–1Mbps      384Kbps       144Kbps
CDMA450                                                   2.4Mbps         500Kbps–1Mbps      384Kbps       144Kbps
CDMA2000 1xEV-DO (Revision A)                             3.1Mbps         1.8Mbps            1.8Mbps       630Kbps / 1.325Mbps
HSDPA/HSUPA (high-speed downlink/uplink packet access)    14Mbps          2Mbps              1.4Mbps       –
Long Term Evolution (LTE)                                 326Mbps         –                  86Mbps        –

Table 1 - Transfer rates of cellular technologies

2.4.3. Conclusion

Cellular networks have evolved from analogue to digital services. Improvements to these networks alongside other technologies have enabled new applications to emerge with the added benefit of higher transfer rates.

2.5. The Rapid Evolution of Mobile Technologies

In 1918 the German national railway trialled a mobile radio system that enabled first class passengers to communicate through a radio-based telephone link in Berlin (Feyerbend et al. 1927, p. 872, cited in Rupp, 2009, p. 5). Martin Cooper, a senior engineer at Motorola, made the first telephone call through a cellular device in 1973 (BBC, 2013). Smartphones have been on the market since 1993, with the first recognised smartphone being the IBM Simon from BellSouth and IBM. It offered paging, faxing and PDA functionality, and included applications such as a calendar, calculator and an address book (Sager, 2012).

In the following years Nokia, Ericsson, Palm, Microsoft and Blackberry released smartphones with additional capabilities. These included the integration of operating systems, second generation (2G), third generation (3G) and fourth generation (4G) cellular services, hardware components and sensors.


Cellular phones have moved from containing no dedicated application Central Processing Unit (CPU) to dual-core and quad-core processors running at up to 2.5GHz. Cameras contained within cellular devices have progressed from video graphics array (VGA) sensors, which provide a resolution of 640x480 pixels, to 16 megapixel cameras providing resolutions of up to 5312x2988 pixels.

Figure 3 represents the most popular devices over the past 15 years and their hardware components. As time progresses each device includes previous technologies and, at times, minimum support for technologies that were cutting edge at its release. Mobile devices moved from supporting a single network to supporting multiple networks, including 2G and 3G, by 2008. Based on Figure 3, devices began supporting wireless services from 2004 and by 2008 contained both front-facing and rear-facing cameras. This provides insight into hardware feature availability in conjunction with cellular improvements. Figure 3 is not, however, an exhaustive representation of the hardware components of all devices released in a particular year.


Figure 3 - Hardware components of popular mobile devices

In 2007 Apple released the iPhone, with its integrated multi-touch screen, and took the cellular industry by storm; by 2008 Apple had sold over 2.3 million units (Cheng, 2008). Google joined the smartphone race in 2008 with the release of the Android operating system and climbed to over 7 million units sold by 2009 (Wells, 2010). Smartphone sales for 2013 were expected to reach 1.01 billion units, or 55% of total mobile phone shipments (CCS Insight, n.d.). Figure 4 demonstrates the surge in mobile usage as more people purchase smartphones, with Android-based smartphones accounting for over 80% of shipments (Amadeo, 2013).



Figure 4 - Smartphone Sales 2007-2013

Users have the capability to capture different forms of media, whether video or photographs, on their mobile devices. It is clear that the progression of technologies within cellular networks and mobile devices has revolutionised the way in which users communicate and share information. This revolution, however, has led to a grey area of activity in which users, content providers, governments and corporations are faced with new challenges in ethics and law.

2.6. Ethics & Legal Issues

There have been many legal and ethical challenges in the area of media delivery services, with actions taken against individuals for sharing videos or images and voicing their opinion. Accounts have been terminated on social media web sites for criminal activity and violation of terms of service. Governments are restricting user access to the internet during domestic uprisings to prevent the sharing of recorded media with a global audience (Arthur, 2011). In 2013 Google released a transparency report which revealed the number of requests to remove content due to copyright infringement on their search platform. This peaked at over 28 million requests, with over 4,400 copyright owners requesting the removal of URLs from the search engine (Google, 2014).


Twitter users who retweeted a photograph of a man accused of breaking the law were charged with contempt of court (Wheeler, 2013). Social media users can face criminal charges for comments on social media services, and the repercussions vary depending on the laws of the country in question. What is posted, shared and viewed on social media is bound by the laws of the country in which that activity takes place.

YouTube has come under fire for the introduction of a system which automatically scans videos for audio and video content deemed to infringe copyright (Tassi, 2013, para. 5). This stems from content providers creating videos using audio, animations, video and graphics created by others and monetising the content without payment to the originators.

In the case of live streaming, copyright infringement relates to the rights of holders of broadcast events being infringed by platforms providing live streaming services, such as the live streaming of pirated sports content. Justin.tv, for example, was subject to a copyright claim from Zuffa for failing to address the illegal uploading of live pay-per-view UFC events (Holland, 2012). Justin.tv was found not guilty, but had introduced a mechanism by which pirated content can be detected using anti-piracy technologies (Albrecht, 2009).

In an interview with Elliot Zimmerman, streamingmedia.com's Geoff Daily asks the question "What's the most important issue related to intellectual property for an online content publisher?", to which he replies:

When you're talking about streaming video, things get especially complicated. You've got several separate elements to investigate, including text, art, music, video, photographs, trade and service marks, rights of publicity, and more. For example, if video is going to be synched with music and streamed over the internet, you'll need a new-media license, a video-sync license, and the rights of publicity releases for those who appear in the file.

(Daily, 2007)

This can be observed in the case of Viacom versus Google, in which Viacom filed a $1 billion lawsuit in 2007 against YouTube for broadcasting copyrighted videos. The ruling went against Viacom, and the judge stated that the burden of identifying what must be taken down rests with the copyright owner (Musil, 2013).

Digital media laws present many challenges for today's rising content generators. Two examples have been reviewed concerning actions taken by corporations against media delivery providers for infringements made by users. Little is said of the users who infringed the content in question, and given that laws vary across regions of the world, this topic is a complex problem with no transparent resolution in sight.

2.7. Conclusion

This chapter has reviewed existing applications in video streaming, the concept of video streaming and its components, evolving mobile technologies, cellular networks, legal ramifications and the state of the social environment around these technologies. This groundwork provides a solid base of knowledge from which to continue pursuing the objectives of the project.

Understanding how these technologies are composed, as well as the history behind the evolution of better networks, better encoding and decoding strategies and better algorithms to minimise the data flow within the communication life cycle, provides the necessary perspective on the drivers behind the project.

Through this research the requirements needed to develop the components of the proposed system have been ascertained, and the project can proceed with the necessary foundations in place.


3. Requirements Engineering

3.1. Introduction


This chapter discusses considerations surrounding the development of the mobile application, the streaming server and the web application. In the field of front-end technologies there is an increasing number of useful and sophisticated libraries that provide a rich experience for online users and give developers effective tools to build better applications. These libraries deliver cutting-edge techniques such as 3D animation, real-time communication via web sockets, data persistence without server-side intervention, data structures and algorithms, cryptography and more.

A Component-Based Software Engineering (CBSE) method will be used to develop the project. This approach involves defining, implementing and integrating/composing loosely coupled, independent components into systems (Sommerville, 2011, p. 453). The development of this project focuses on those libraries that resolve particular aspects of the proposed system and allow for the integration between different system components across a network. The libraries considered will also address the usability and functionality requirements.

3.2. Architecture

The architecture of the system can be classified into three main segments: the mobile application, the streaming server and the web application. Each segment is relevant to the two types of users defined previously (content generators and viewers) and the interconnection between them.

Content generators interact primarily with the mobile application and secondarily with the web application for administration of generated content. Viewers are connected closely to the web application, and the binding factor between these two segments is the streaming server.

Figure 5 illustrates the proposed architecture of the project.



Figure 5 - Proposed System Composition

Understanding how the system is connected will allow for the relationships to be correctly defined as well as which libraries to choose that will effectively satisfy each segment of the infrastructure.

Different challenges will arise for each segment and selecting the library which is most suited, well documented and supported with an open source license will allow for the smooth progression of the development of the system.

Each segment will be analysed by identifying both functional and non-functional requirements. Then the platform, interactive development environment (IDE), programming language and libraries of choice will be declared followed by a justification for their selection.

3.2.1. Mobile Development

The first stage of the requirements phase of the mobile application was to test alternative platforms that currently exist. Custom functionality is determined at a later stage however amongst all the platforms tested a common group of requirements emerged.


Users or content generators:


will need to be able to login and logout of the application in order to synchronise the stream with their online account

should be able to select a stream quality that the mobile device can support

will need feedback on the status of their connection

will need to be informed of errors that affect or terminate the stream

with more than one camera on the device should be able to switch between them

will be able to tag and describe events

should be able to use the camera flash or other features of the camera whilst recording

will be able to choose whether high quality video is stored locally then pushed to the server when a better connection is available

should be able to change the default settings of their streaming environment

The mobile application should:

be able to communicate with the server

parse a data structure containing information about the user, device, location, event and network connectivity as well as any other information that needs to be exchanged

display the video feed to the user

exhibit robust error handling so as not to degrade the experience for viewers

operate on multiple device types with variations in hardware

At all times the mobile application will be required to know the location of the streaming server, otherwise a stream cannot be established. If the location is unknown, the user should be offered a solution or the application should resolve the error automatically.


The application will need to be able to adapt to the changing connectivity rates of roaming mobile users. Performance and data integrity are important factors as relationships are built between streams, locations and users.

Having identified the requirements of the application, the next step involves selecting a platform which will provide the necessary libraries, development environment, documentation and support in order to develop the application. Chapter four discusses this in more detail.

3.2.2. Streaming Server

The streaming server represents the hub of communication between content providers and content viewers. A multitude of technologies and applications will be utilised whilst the server is running in order to retrieve streams and broadcast streams to viewers.

The following functional and non-functional requirements were identified. The streaming server should:

handle multiple audio and video formats

allow thousands of users to stream in parallel

accommodate thousands of viewer requests

be flexible, adaptable and scalable

allow users to store video

convert videos to high resolution and low resolution alternatives

maintain a list of active streams in real time

broadcast active streams and newly added streams to all viewers

add user broadcasts to a database

track device, resolution and connectivity quality

check user access details before enabling streaming

ensure port availability for incoming streams


These requirements constitute a foundation of functions the streaming server should satisfy. The streaming server must be able to serve live streams to an audience and is closely linked to the web application.

3.2.3. Web Application

Viewers of streams will need to interact with the web application in order to view streamed content. The streamed content will need to be connected to a data structure containing relationships between words and sets of words and users will need to be able to navigate throughout the application.

Streaming providers such as Twitch.tv or Ustream.tv allow users to view streams at different qualities; however, this is a requirement that will only be addressed in the event that further development is undertaken on the project.

It is clear that in order to accomplish the desired real-time, synchronised interactivity, cutting-edge front-end technologies will need to be investigated.

The requirements of the web application include:

receiving automatic updates on video feeds related to a particular event

allowing users to select a main stream and discard sub streams

rating streams based on quality

searching for streams based on criteria such as quality and type

providing content providers with access to manage their account

implementing emerging JavaScript libraries

Essentially the front-end web application should be able to monitor updates and adjust immediately to changes in data stored within a shared data structure. This is a crucial aspect of the application. There are technologies available today that are able to monitor an environment for changes in real time, and this concept will be considered next.

3.2.4. Real Time Monitoring

Earlier it was identified that there are three segments to the system architecture. These segments may or may not be composed of clusters of applications containing a combination of multiple libraries.

Communication between such a varied range of technologies, and the requirement to monitor changes in as close to real time as possible, will form the basis of the connecting points represented in Figure 5.

The real time monitoring aspect of the system should:

notify viewers of newly added events

capture a structure of formatted data retrieved from the streaming server

listen for connections and disconnections

monitor system events and changes in the system state

process thousands of requests

manage errors gracefully

The real time monitoring aspect of the system works in unison with data persistence and storage. A considerable amount of data will be passed across environments, and several aspects of that data should be tracked for future purposes. Now that the requirements of each segment of the system are defined, it is time to consider which technologies will meet them.


4. System Modelling


This chapter will discuss the tools, libraries, applications, servers, database technologies and frameworks and the reasons for selecting these. The mobile application will be addressed first.

4.1. Mobile Application

In the current market there are a number of operating systems utilised in smartphones, including Apple iOS, Google Android, Windows Phone, BlackBerry OS and Ubuntu. For this application the selection criteria are heavily influenced by developer experience and the availability of open source tools to develop the application.

There are other supporting reasons for the choice of mobile operating system (OS), such as familiarity with the programming language, which offers a gentler learning curve. Extensive documentation and literature about the chosen platform are also a deciding factor.

The mobile application will be developed using the Android OS. Android was first devised by Android Inc., who set out to develop an operating system targeted at mobile devices built on the Linux kernel, and the company was purchased by Google in 2005 (Elgin, 2005). The Android Software Development Kit (SDK) was released in 2008 (Morrill, 2008) and allowed developers to integrate Android libraries into their projects.

With over one billion android devices activated (“1 Billion Android Devices Were Activated in Just A Few Years, Making it The Fastest-Growing Platform Ever,” 2013) the prospects of exposing the application to a large audience solidified the motivation for selecting the Android OS.

According to statistics published by Google for Android devices that have accessed the Google Play Store app, the project application would be exposed to almost 80% of Android devices; this approximates to eight hundred million (800,000,000) devices.



Figure 6 - Distribution of Android devices

The Android OS provides a framework for developers to build applications or “apps” by providing libraries that allow for the communication and interaction between hardware and software components within the OS.

How the Android OS is composed and how it satisfies the streaming functionality defined in the requirements will be discussed next.

4.1.1. The Android OS Architectural Overview

The Android OS provides an application programming interface (API) for developers to communicate with a multitude of software and hardware components. Applications are written primarily in the Java programming language, and the framework provides libraries that interact with multimedia, graphics, animation, user information, location, connectivity, data and more.



Figure 7 - Android System Structure

This section will focus on those aspects of the Android framework which fulfil the requirements defined in chapter 3. The framework is built around a combination of Java and XML. The XML layer provides a loosely coupled relationship between the GUI and the code, enabling developers to create different layout files for varying screen sizes. This allows for the manageable extensibility of applications, as code independence supplements modifiability and scalability.

A significant example of this in action is an application targeted at both local and international markets, with the need to alter text content according to the user's locale. Using XML resources, developers can define text in different languages and the OS will load the XML file specific to the language presets of the device.
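As a sketch of this mechanism (the resource name shown is illustrative, not from the project), the same string is defined once per locale and the framework selects the matching file at runtime:

```xml
<!-- res/values/strings.xml (default, English) -->
<resources>
    <string name="stream_start">Start streaming</string>
</resources>

<!-- res/values-de/strings.xml (loaded automatically on German-locale devices) -->
<resources>
    <string name="stream_start">Übertragung starten</string>
</resources>
```

In code the string is then referenced once, as R.string.stream_start, regardless of the device locale.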

The OS revolves around five components that offer developers the ability to interact with the environment in a secure and non-restrictive way; these are represented as intents, activities, services, content providers and broadcast receivers.

These concepts are best understood through simple examples. An activity is essentially the user interface element of the framework; a service can run independently in the background; if there is a need to start another component, this is initiated by an intent; regardless of the data storage method used on the device, access to data is managed using a content provider. Finally, if there is a requirement to listen for system-wide changes within the OS, the developer will use a broadcast receiver.
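As a hedged illustration of how several of these components would fit together in a streaming app (the class name StreamingService is hypothetical, and this sketch only compiles inside an Android project):

```java
// Illustrative sketch only: StreamingService is a hypothetical background
// service, and the android.* APIs below are assumed from the framework.
import android.app.Activity;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.net.ConnectivityManager;
import android.os.Bundle;

public class StreamActivity extends Activity {

    // A broadcast receiver listening for system-wide connectivity changes,
    // which a streaming application must react to.
    private final BroadcastReceiver connectivityReceiver = new BroadcastReceiver() {
        @Override
        public void onReceive(Context context, Intent intent) {
            // e.g. lower the stream quality or warn the user
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // An intent starts another component; here a hypothetical background
        // service that would manage the outgoing stream.
        startService(new Intent(this, StreamingService.class));

        // Register for system-wide connectivity broadcasts.
        registerReceiver(connectivityReceiver,
                new IntentFilter(ConnectivityManager.CONNECTIVITY_ACTION));
    }
}
```

The activity supplies the user interface, the intent starts the service, and the broadcast receiver reacts to OS-level events, mirroring the component roles described above.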

The development of the application will revolve around these components and the libraries that will be used will be defined in the next segment.

4.1.2. The Android API

The activity component of the Android framework will allow for the design and development of the application. An activity is similar in concept to a web page: it is a place of interaction and content provision, and it represents a location within the application's make-up.

This is where development efforts will be focused, as this iteration of the development cycle is dedicated to producing an application that provides the minimum functionality. In essence, the mobile application is connected to the streaming server through activities and actions taken on the user interface: this is where users will start or stop streaming, or enter information about an event being streamed. It is within activities that the majority of user functionality will be developed.

The activity lifecycle allows for runtime adaptation and adjustments to changes in the user interface as events happen. These include changes to the screen orientation, interruptions by higher priority applications such as incoming calls and other configuration changes. For future development work the activity lifecycle offers a degree of flexibility with the OS environment as events are frequently broadcast.



Figure 8 - The Activity lifecycle


Figure 8 shows the activity lifecycle which is essentially the lifecycle of the interface a user currently interacts with.

One of the predominant functions defined within the requirements is the ability to stream audio and video to a streaming server. Streaming involves the transfer of video and audio content over a network using a network protocol such as HTTP, UDP, RTSP or RTP. In addition, the video and audio being captured must be encoded in a suitable format, such as H.264, H.263 or MPEG-4 for video and AAC, FLAC or MP3 for audio.

The Android framework provides support for a number of network protocols and media formats (“Supported Media Formats | Android Developers,” n.d.), satisfying the capacity for the framework to support live streaming of media.

The format in which the media should be streamed is an essential early design choice for the developer. The difficulty lies in effectively aggregating a set of coherent steps: capturing video from the user's camera and audio from the sound hardware, then converting the captured data into a stream of packets in a format suitable for transfer over a network.

The difficulty of this requirement lies in the time constraints on developing the functionality needed to implement the process of streaming audio and video. Before deciding to develop this part of the application from scratch, research was conducted into whether libraries existed that met the project requirements.

4.1.3. Developer libraries

Research was conducted into what current methods and libraries could be used to stream data from a mobile device to a destination. One option considered was that the feed captured from the user's camera could be transformed into chunks of data, sent over the network and rebuilt at the destination point. This would introduce the complexity of data management, reassembling segments of data and additional computation, as opposed to existing methods which have solved this problem more efficiently.

GitHub acts as the first point of consolidating the above design decision; it is a repository of code projects built in a variety of languages by developers throughout the world.

There are applications within the Android marketplace that allow users to stream from their device. Some developers of these applications maintain a code repository on Github which will allow them to freely develop their application and allow other developers to utilise the libraries subject to the limitations imposed by their licensing choice.

For example, "libstreaming" ("libstreaming - A solution for streaming H.264, H.263, AMR, AAC using RTP on Android," n.d.) allows developers to integrate live streaming services into their application. It provides numerous options and contains classes that handle building a stream as well as classes for handling audio and video. The supported formats are currently limited; however, a stream can be sent using a generic format and the streaming server will be able to take the stream and convert it to the relevant formats for different users if necessary.

The developer has focused on the RTP, UDP, SDP and RTSP network protocols, with support for the H.264 and H.263 video codecs and the AAC and AMR audio codecs. The library is built around the Internet Engineering Task Force's (IETF) Request For Comments (RFC) documents describing the packetization of H.264, H.263, AAC and AMR.

Considering that the H.264 and AAC codecs are becoming the standardised formats for video and audio, this library supports the requirements of the application to be developed. These formats are used by some of the larger providers of video on demand, such as Netflix, MLB and iTunes (Minoli, 2012, p. 241).
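A minimal sketch of how the library is wired up, based on its public API at the time of writing (exact class and method names may differ between versions; the server address is a placeholder, and this only compiles inside an Android project with a libstreaming SurfaceView named surfaceView):

```java
// Sketch only: assumes the libstreaming classes below
// (net.majorkernelpanic.streaming) are available to the project.
import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.rtsp.RtspClient;

private void startBroadcast() {
    // Build a session that captures camera and microphone input and
    // packetizes it as H.264 video and AAC audio over RTP.
    Session session = SessionBuilder.getInstance()
            .setContext(getApplicationContext())
            .setSurfaceView(surfaceView)          // camera preview surface
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            .setAudioEncoder(SessionBuilder.AUDIO_AAC)
            .build();

    // Hand the session to an RTSP client that pushes it to the server.
    RtspClient client = new RtspClient();
    client.setSession(session);
    client.setServerAddress("streaming.example.com", 8554); // placeholder
    client.startStream();
}
```

The library therefore hides the capture and packetization steps behind a builder, leaving the application to decide where the stream is sent.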

The captured stream in H.264 and AAC must transfer from the user’s device to the streaming server; this will be the focus of the next section.

4.2. The Streaming Server

The streaming server must be able to satisfy the functionality described in the requirements engineering section. It must also be able to accept communications from a mobile device and also communicate with the web application. This implies utilising libraries and applications that support the integration between these components.

There are a number of streaming systems available which are either open source or proprietary. These systems enable users to interact with a web application when requesting live streams or video on demand.

A number of systems were identified and the following were selected for consideration: Wowza Media Server, Unreal Media Server, the Nginx RTMP module and Flash Media Server.


4.2.1. Streaming media server


Wowza Media Systems provides a platform which allows for the live streaming of audio and video. The service is based on a monthly subscription model, with a thirty-day free trial. A developer licence is available which enables developers to use the system beyond the trial period but limits the deployment of commercial applications. This streaming system essentially fulfils all the requirements necessary to begin development for the project; however, the trial and developer limitations would have created future obstacles, and the time invested may not have allowed for the granular control or flexibility necessary to integrate future ideas. One advantage of the system was its handling of multiple streaming formats and a Java API that allows interaction with the streaming service.

Unreal Media server operates mainly in a Windows environment and provides a graphical user interface (GUI) for users to manually connect streams to the system. They offer IP audio/video streaming using a multi-protocol, low latency streaming server platform.

This was a possible solution; however, the documentation was unclear about how to integrate the streaming system with other systems to allow for the dynamic setup of streams and the choice of output when recording live streams. Again, control and flexibility are strongly desired non-functional requirements, so this solution was unsuitable.

Nginx is a web server, similar to Apache, which allows system and web administrators to deploy web sites. The Nginx RTMP module is a plugin integrated into the Nginx engine that enables live streaming through the RTMP protocol in Adobe Flash format. The benefit of this module is that, in the current state of the streaming industry, Flash video is the predominant live streaming delivery format.

Unfortunately the Flash technology was abandoned by Adobe in favour of HTML5 and will in due course become a legacy format in browsers. The abandonment of the Flash format by Apple, and more recently Android, as well as Adobe's move towards HTML5 (Peckham, 2011), will in due course lead to increased support for open source formats in live streaming services.

The Nginx RTMP module is an open source module which gives developers the opportunity to live stream as well as broadcast video on demand. It operates in a similar fashion to Wowza, since the majority of development work on live streaming services has centred on media players able to deliver high-quality streams of a kind that today's open source formats are not yet capable of.

It was chosen because it is open source and, unlike Flash Media Server, there is no cost associated with its use. The main difficulty with the RTMP module is that its documentation is spread across a blog of articles; however, the main repository on GitHub contains some informative examples and guidelines.
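For illustration, a minimal Nginx RTMP configuration for accepting pushed live streams might look as follows. This is a sketch only: the application name "live", the port and the directive choices are assumptions, not the project's actual configuration.

```nginx
# Illustrative fragment of nginx.conf using the Nginx RTMP module.
rtmp {
    server {
        listen 1935;          # default RTMP port
        chunk_size 4096;

        application live {
            live on;          # accept live streams pushed by FFMPEG
            record off;       # no recording in this sketch
        }
    }
}
```

A stream pushed to `rtmp://host/live/stream1` would then be playable by any RTMP-capable player.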

The Nginx RTMP module depends on a framework that can capture video-on-demand or live streams and forward them to the server. This implies that an intermediary is required to capture streams from any device or input format and forward them to the Nginx server. For this task FFMPEG was reviewed and chosen: it acts as the application that connects incoming streams to the streaming server, with the flexibility to choose how each stream is handled.

4.2.2. FFMPEG

FFMPEG is a cross-platform solution that allows users to record, convert and stream audio and video, providing functionality such as encoding, decoding and transcoding with support for both antiquated and current formats. It comes with command line tools to convert between formats, a streaming server for live broadcasts, a media player and a stream analyser.

It is composed of developer libraries that include decoders and encoders for audio and video codecs, input device grabbing, media filters, image scaling and audio resampling.


As an open source project there has been much development around FFMPEG, with developers creating libraries that allow the system to be integrated into other programming languages such as Java, PHP and JavaScript. Wrappers have been created around the command line interface which parse its output to handle errors and analyse streaming information. With the ability to customise factors such as stream dimensions, bitrates, latency management and maximum transfer rates, this library offers the flexibility and performance required.

It is this flexibility, and the availability of extensions to the FFMPEG library, that led to the FFMPEG platform being chosen. As a streaming engine it fulfils the majority of the functionality required to capture a live stream and rebroadcast it to a streaming server.
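As a sketch of this role, the Node.js layer could assemble the FFMPEG invocation that reads an incoming stream (described by an SDP file) and republishes it to the Nginx RTMP module. The function name, codec presets and stream names below are illustrative assumptions, not the project's actual code.

```javascript
// Builds the argument list for an FFMPEG call that pulls an RTP/RTSP stream
// described by an SDP file and republishes it as FLV over RTMP.
// All names and presets here are assumptions for illustration.
function buildFfmpegArgs(sdpFile, rtmpBase, streamKey) {
  return [
    '-protocol_whitelist', 'file,udp,rtp', // allow reading the local SDP file
    '-i', sdpFile,                         // incoming stream described by SDP
    '-c:v', 'libx264',                     // transcode video to H.264
    '-preset', 'veryfast',                 // favour latency over compression
    '-c:a', 'aac',                         // AAC audio for Flash playback
    '-f', 'flv',                           // FLV container expected by RTMP
    rtmpBase + '/' + streamKey             // e.g. rtmp://host/live/stream1
  ];
}

// The list would then be handed to child_process.spawn('ffmpeg', args);
// that call is deliberately not executed here.
const args = buildFfmpegArgs('stream1.sdp', 'rtmp://localhost/live', 'stream1');
console.log(args.join(' '));
```

Keeping the argument construction in one function makes it easy to log successful command combinations, as described later in the implementation chapter.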

Through the analysis of solutions it was decided that FFMPEG and the Nginx RTMP module would fulfil the functional requirements, as they provide greater control over the streaming process with heavy customisation. Being able to fine-tune and analyse incoming streams will provide additional benefits for future work.

Now that the mobile application can send live streams to FFMPEG, which captures the incoming stream and forwards the output to the Nginx RTMP module, users will need to be able to see updated video feeds in real time whilst observing a stream.

There are several options for integrating the back end of a system with the front end that users see. The system will need to receive incoming streams, broadcast to all users that a new stream has been received, and have the web application automatically update with the latest stream.

This introduces further architectural considerations: how can a group of services listen for incoming connections sent from a mobile device, capture the stream information that FFMPEG will then use to capture the incoming stream, and distribute the stream to the streaming server application? Bearing this in mind, the system will also need to immediately broadcast the new live stream to the web application.


This is essentially the key element of the system, and with advances in web-based services such as WebSockets in HTML5 a new level of interactivity can be introduced. WebSockets will form the basis of communication between the streaming server and the web application. Techniques such as polling, long polling, server-sent events and WebSockets allow dynamic communication between a server and client, in some cases maintaining an open connection between the two endpoints.

A library, Socket IO, was identified for its potential to communicate in real time. This library is the final bridge between the server and its clients, enabling live updates for all users connected to the web application and supporting cross-browser usage as well as fallbacks for legacy browsers.

4.2.3. Socket IO

Socket IO is a cross-browser WebSocket library for real-time applications. The library is written in JavaScript and allows real-time communication in all browsers and on mobile devices. It selects a transport mechanism based on browser support; the list of techniques includes WebSockets, Adobe Flash Socket, AJAX long polling, AJAX multipart streaming, Forever Iframe and JSONP polling.

Since Socket IO was developed in JavaScript it needs a runtime environment in which to execute. The most suitable JavaScript-based platform supporting Socket IO, as well as the other components to be implemented, is the JavaScript-based server Node.js.

4.2.4. Node.js

Node.js is a platform built on Chrome's JavaScript runtime and uses an event-driven, non-blocking I/O model, enabling data-intensive real-time applications across different devices (Joyent, n.d.). Among its many advantages, one of the most significant is that the application avoids blocking on I/O, allowing many connections to be handled concurrently. Never needing to block system processing, thanks to JavaScript's callback methodology, Node.js provides the necessary solutions for the objectives of this project.

With an impressive number of modules already available to extend its functionality, as well as support for reading streams and for Socket IO, Node.js was selected for its performance, ease of scalability and familiarity with the JavaScript language.

Node.js can also act as the sole server application serving HTML content to browsers, in place of Apache or Nginx. Being able to develop the majority of an application in a single language offers benefits on top of Node.js' event-based system. Like FFMPEG, it has an active community, with extensible and advanced features being created frequently. Many libraries have cemented their presence for developers to utilise, and in many cases a library such as Underscore.js can be used both in a browser and in Node.js.

The deciding factor for using Node.js was the functionality offered by Socket IO for real-time communications and the simplicity of the Node.js architecture.

The final segment of the streaming system is the web application in which users will be able to observe streams as they are retrieved from mobile devices.

4.3. Web Application

Visitors to the website will need to experience live updates of video feeds according to the type of stream being broadcast. Using a suitable set of technologies to allow communication between the streaming server and the web application was one of the prime functions determined previously.

Options are available among web application frameworks that provide live updates as part of their structural composition. These frameworks are predominantly based on Asynchronous JavaScript and XML (AJAX), and through research this was identified as an ideal technology to implement for the web application.


Frameworks which offer this capability include AngularJS, a web development framework by Google which allows developers to bind data to HTML elements and dynamically update values as a data structure changes. Other frameworks such as EmberJS provide a full stack of options to create an MVC-type architecture based on JavaScript. These frameworks are extensible and augment the functionality provided by JavaScript for new applications.

Considering the level of development needed to integrate multiple technologies, the choice of framework should be reassessed at a later stage once the main functionality is achieved. For the initial development phase a simple HTML5 web page will be developed using libraries such as jQuery together with the Bootstrap CSS framework. Developing a web application that integrates responsive layouts from the outset will prove beneficial for layout control and the presentation of the final application. jQuery will be used to manipulate HTML elements when populating the application with new streams.
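As a sketch of that manipulation, a small helper could build the Bootstrap markup for a new stream placeholder before jQuery inserts it into the page, for example with `$('#streams').append(streamTile(...))`. The id scheme, column classes and element names here are assumptions for illustration.

```javascript
// Builds the HTML for one stream placeholder tile in a Bootstrap grid.
// The id prefix "stream-" and the class names are illustrative assumptions.
function streamTile(streamId, title) {
  return '<div class="col-sm-4" id="stream-' + streamId + '">' +
           '<div class="panel panel-default">' +
             '<div class="panel-heading">' + title + '</div>' +
             '<div class="panel-body video-placeholder"></div>' +
           '</div>' +
         '</div>';
}

const tile = streamTile('s1', 'Main stage');
console.log(tile);
```

Keeping the markup in one function means the same tile can be produced whether a stream arrives at page load or is pushed later over the socket.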

In addition to layout and functionality, users will need to be able to view streams using a media player. The web application will need to display live feeds across multiple devices and support the broadcasting of varying media formats such as WebM, MP4, OGG and Flash.

A number of media players were reviewed, including JW Player, VideoJS and Flowplayer. These libraries offer media playback based on Flash and HTML5 technologies. JW Player and Flowplayer integrate both HTML5 and Flash, and each provides an API for communicating with the media player, including initialising it with presets such as muted playback and the streaming source.

At this stage the two players of interest are JW Player and Flowplayer. With such similar functionality on offer it is difficult to separate them now, so the final choice will be made at the time of implementation.
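To illustrate the kind of presets mentioned above, a small builder could assemble the settings object that JW Player's `jwplayer(elementId).setup(...)` call accepts. The option names follow the Flash-era JW Player conventions, and the whole snippet is an illustrative sketch rather than the project's final player code.

```javascript
// Builds a JW Player-style settings object for an RTMP live stream.
// Option names follow JW Player's documented configuration; the URL and
// dimensions are example values.
function playerConfig(rtmpUrl, width, height) {
  return {
    file: rtmpUrl,        // e.g. 'rtmp://host/live/stream1'
    width: width,
    height: height,
    mute: true,           // start muted, one of the presets described above
    autostart: true,      // begin playback as soon as the stream is attached
    primary: 'flash'      // prefer the Flash renderer for RTMP playback
  };
}

const cfg = playerConfig('rtmp://localhost/live/stream1', 640, 360);
console.log(JSON.stringify(cfg));
```

The same object shape can be regenerated per stream, which suits a page whose players are added dynamically as broadcasts arrive.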


Now that the components of the system are selected the next stage of the project is to map the functionality using design techniques in order to grasp the general functionality performed by the different sections of the system.


5. System Design


In chapter 3 the architecture of the system was illustrated, providing an overview of the components of the system. Understanding the interconnectivity between system components, and mapping the functionality from the requirements stage of the project to the implementation stage, is a key process within the software development lifecycle. This interconnectivity was defined in chapter 4, with the results of the research providing the foundation for the design of the system.

The Unified Modelling Language (UML) will be used to illustrate the processes within each segment of the system in order to model the requirements. To begin, the primary actors of the system, broadcasters and users, were considered, and a set of actions was defined relating to the mobile application and the web application.

Users of the web application will be able to view streams based on events as well as remove streams from view and update their settings based on their viewing activity.

A myriad of actions could be provided to users of each platform; however, the objective of the application is to allow broadcasters to stream and users to view the supplied stream. The use case diagrams in figure 9 portray these requirements for each user, with supplemental prototypical representations of the mobile user interface and the web application interface displayed in figure 11 and figure 10 respectively.



Figure 9 - Use Cases for Viewers and Streamers

As illustrated in figure 9, the most common tasks are identified, such as logging in, starting a stream and setting predefined values for streaming.

Users of the web application will be able to observe all events and view specific events within the streaming events page, as illustrated in figure 10. The predominant function of the events page, which represents the minimum deliverable of the project for the web application, is to dynamically load streams as they are broadcast by mobile devices.


Figure 10 – Web Application - Events page


The smaller video placeholders within figure 10 represent active live streams with a low-quality configuration. The larger video represents a high-quality stream, selected on the basis of a number of parameters determining the stream's quality, which will be accessible within a data structure.

Mobile users will have the functionality defined in the requirements engineering section, actionable in the mobile user interface illustrated in figure 11.


Figure 11 – Mobile Application - User Interface - log in | stream | event

The minimum deliverable of the mobile application is to allow users to stream from their device. Although the login page and the event description page of the mobile application are included in the UI design, the main focus is the stream page, which allows users to start and stop streaming as well as select the front or back camera. For development purposes an IP address is required for the user to stream live video to a server, although the production version of the application would not require this.

The mobile application is based on a library provided by fyhertz, as mentioned in chapter 4, section 1.3. The class diagram represents the main components of the code used to initiate a streaming session. This library handles the majority of the interaction with the Android device's camera as well as the codecs needed to capture audio and video, and it includes functionality for streaming over RTP or RTSP. The streaming application is built upon the class diagram in figure 12, with users able to toggle streaming and switch between cameras if there is more than one.


Figure 12 - Mobile Application – User interface class diagram

These processes are interconnected, as seen in figure 13, which demonstrates the flow from the broadcaster who begins streaming to the web application which displays the stream to the user. The mobile application begins the recording process and broadcasts data to the streaming server, which listens on a particular port. The streaming server, or more precisely Node.js, captures the data and processes the parameters provided. The parameters are then used to capture the stream from the mobile device and to initiate other applications such as FFMPEG, which begins transcoding the incoming stream and broadcasting it to the streaming server. Once the broadcast is initiated, the details are added to a data structure which tracks the active users of the different streams, and this information is broadcast in near real time to the web application. The stream is then added automatically, and users can view the newly added stream.
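The bookkeeping step in that flow can be sketched as a pure function: when Node.js receives the connection parameters from a device, it records the stream in a data structure and produces the payload that would then be broadcast to web clients. The field names and URL scheme below are assumptions for illustration.

```javascript
// Registers a newly announced stream in the active-streams structure and
// returns the message that would be pushed to browsers over the socket.
// Parameter and field names are illustrative assumptions.
function registerStream(activeStreams, params) {
  const entry = {
    id: params.deviceId,
    url: 'rtmp://' + params.serverHost + '/live/' + params.deviceId,
    startedAt: params.startedAt
  };
  activeStreams[entry.id] = entry;               // track the active stream
  return { event: 'new-stream', stream: entry }; // message for socket clients
}

const streams = {};
const msg = registerStream(streams, {
  deviceId: 'phone42', serverHost: 'localhost', startedAt: 0
});
console.log(msg.stream.url);
```

Keeping registration separate from transport means the same structure can serve both the real-time broadcast and the initial page load.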



Figure 13 - Activity Diagram of the Streaming Process

Now that the functionality of the system is modelled, the implementation stage of the project can begin. It was stated previously that the USDP method will be used to implement the project. This is a necessary consideration, as there are a multitude of components which may introduce obstacles to the development process, and an Agile methodology will be required to adapt accordingly.


6. Implementation


The first step in the implementation process was to set up a version control system for the ease of management of the code. Git was selected as the version control system and a Bitbucket account was set up to track changes. This provided a great degree of flexibility as the development of the project could be completed on any platform and tracked with updates being pushed to the central repository hosted by Bitbucket.

Another important contribution that Git provided was the ability to develop the features of the system defined in the requirements stage in parallel. Git’s branching scheme allowed for the system at working states to be detached from the central repository and functionalities such as capturing a video stream or calling FFMPEG commands to start streaming could be developed separately then merged.

Using a version control system also meant changes could be reverted easily to previous working states if the system were to fail. This allowed for the Agile development of the system across all applications being developed including the mobile application, the streaming server and the web application. Git will continually be mentioned throughout the explanation of the system development as it has heavily influenced decision making throughout.


Figure 14 - Bitbucket - project management through version control

Three repositories were set up for the development of the mobile application, web application and the streaming server. With the version control system in place the first iteration of development could commence.

For the sake of clarity the components of the system will be defined, as these will frequently be referred to in different contexts. The mobile application is straightforward: it is an application running either in a virtual device or on a mobile device. A virtual device is an emulated Android mobile device that mimics the environment of an actual mobile device. The streaming server is a term encompassing multiple technologies: it is a Linux OS operating in the context of a virtual machine, composed of the applications needed to execute the functionality of the system, such as FFMPEG, Node.js, Nginx and the Nginx RTMP module. The streaming server contains two server applications which function as web servers in the manner of Apache: Node.js and Nginx. For a Node.js server application to run, a JavaScript file must be executed by Node.js, which then runs as a background process awaiting incoming connections. The web application is a website composed of all the elements defined in the requirements section, including the JavaScript and CSS libraries necessary to run as well as the media player.

6.1. The first iteration

To begin with, the mobile application was implemented using the Android software development kit (SDK) in the Android Studio integrated development environment (IDE). The options available included the Eclipse and IntelliJ IDEs, which also integrate the Android SDK; however, Android Studio was selected as it is developed by Google. This meant that features tailored to the platform, such as refactoring suggestions and library call assistance, were available within Android Studio, which made the development process far smoother. Once the IDE was selected and the necessary libraries installed, the user interface was created using the built-in modelling tools.

Google provides a virtual device manager that allows developers to create emulated devices with different configurations for a broad range of mobile phones. This allowed the application to be tested across these different emulators for stability and performance, with the main benefit being that a physical device was not needed for testing.


In the requirements stage Libstreaming was identified as a library which could satisfy the functionality of the mobile application. The difficulties experienced during development included the learning curve of Android Studio as well as the limited documentation provided by the Libstreaming library. Examples were provided on the GitHub repository as a springboard for testing, and the majority of development time on the mobile application was spent debugging the sample application with Android Studio's debugging tools and monitoring changes through the logging utility Logcat. When debugging the application, Logcat constantly relayed information about the status of the emulated device, and pinpointing errors during development became cumbersome.

Several exceptions were thrown at the beginning of development that were mostly related to the project set-up and configuration. The most common issues were resolved using knowledge resources such as stackoverflow.com, and these were logged on a Google spreadsheet for reference, as shown in figure 15, since the same issues quite often recurred.


Figure 15 - Testing Management

The Libstreaming library handled the majority of the functionality to capture a live video and audio feed. The different virtual devices can be configured with a number of parameters enabling them to act as if they were operating on a mobile network.

One of the obstacles in debugging the application was capturing the video feed from the camera and connecting this process to the streaming server. The emulators, when actively running, do not operate on the local network and are therefore not assigned an IP address within the range that would let them communicate with other devices on the network. This was a crucial requirement whilst developing and testing the application. To overcome this obstacle, instead of launching the emulator through the Android Studio interface, the executable of the configured emulator had to be called manually with a set of parameters. Most notably, the DNS server had to be specified to allow the device to communicate over the local network.

Ascertaining which network the emulated device was on was accomplished through a process of elimination. It was discovered that the device could still access the internet as well as locally hosted websites, but could not communicate with other devices. To investigate, a secure shell (SSH) client, PuTTY, was used to log into the virtual device, and the ifconfig command was used to capture the IP address, since the Android ecosystem is based on Linux. Once this value was captured, a simple Google search on how to bind the emulated device to the local network produced the necessary results: the emulated device could operate on the local network as long as the DNS server was defined, at launch time, as the IP address of the wireless router.


Figure 16 - Command to launch an Android virtual device

This process of binding emulated devices to the local network was necessary in order to gain control over the conditions for testing and to integrate other components of the system within a common network. At this stage the emulation tools were the only Android devices that could be tested, as a real device was not available to bypass the aforementioned issue of emulated devices operating within their own IP environment.
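The manual launch described above boils down to calling the emulator binary with the AVD name and an explicit DNS server (the wireless router's address). Keeping the argument construction in one place makes the workaround repeatable; the AVD name and address below are example values, not the project's actual configuration.

```javascript
// Builds the argument list for launching an Android virtual device with an
// explicit DNS server, using the emulator's documented -avd and -dns-server
// flags. Values here are illustrative.
function emulatorArgs(avdName, routerIp) {
  return ['-avd', avdName, '-dns-server', routerIp];
}

const args = emulatorArgs('Nexus4_API19', '192.168.0.1');
// e.g. child_process.spawn('emulator', args) -- deliberately not executed here.
console.log(args.join(' '));
```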

There was as yet no streaming server or web application with which to determine whether the mobile application could successfully stream a live video feed. To test whether live video was observable, VLC media player was used to capture the live stream, functioning as a feedback mechanism for future action. Not knowing what to expect, as the camera feed was emulated (represented in figure 17), further research was required to determine the correct output format so that the live stream could be captured.


Figure 17 - Mobile application on an emulated Android device

The developer of Libstreaming recommends that the Session Description Protocol (SDP) be used; since the library already produces the stream configuration as an SDP string, the simplest solution was to begin the stream and send the configuration to the streaming server. The mobile application at this point was effectively capturing streams, and VLC media player was able to play the emulated stream based on the generated SDP configuration. The mobile application was now at a sufficient point at which to move to the development of the streaming server, the next stage of the development process.
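For reference, an SDP description of this kind is a small plain-text file. The fragment below is an illustrative sketch only: the addresses, ports and payload types are assumptions, not the exact output of Libstreaming.

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Libstreaming session
c=IN IP4 192.168.0.10
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
m=audio 5004 RTP/AVP 97
a=rtpmap:97 AMR/8000
```

Saving such a file and opening it in VLC (or passing it to FFMPEG as the input) is what allows the otherwise unannounced RTP stream to be located and decoded.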

The streaming server, as defined in the requirements phase, is not just a server with the ability to stream live video; it is essentially a Linux distribution hosting the technologies identified. To interlink the mobile application with the streaming server, the Ubuntu OS was downloaded and installed on a virtual machine using VMware. This offered numerous benefits for the development process, such as sending the virtual machine into a paused state that could be revisited, enabling continuous development. This benefit would be further realised when the components of the system were being integrated.

To begin, FFMPEG was installed from the Linux distribution's built-in application repository. The FFMPEG library is a vastly complex yet powerful system for performing an array of actions on video and audio. Becoming acquainted with the commands and testing the results of different combinations of them was the first step. Overcoming the learning curve of the application and researching how to convert a video into the desired formats took up a considerable amount of development time. A default video was added to a directory and FFMPEG commands were run against the file. Creating different formats of the file, changing its dimensions and manipulating other aspects of the video were all part of the learning process. This step enabled familiarity with the command line and the type of problems that would be experienced.

Working with FFMPEG was in essence a series of trial-and-error experiments until the desired effect was accomplished. Commands that were successful were logged and became a point of reference for future video work. A number of problems occurred whilst trying to use FFMPEG: missing libraries, incorrect command line arguments, and the lack of a central source of information catering for beginners. Forums and articles were scattered about without clear instructions on resolving some of the technical issues that commonly occur when first using the library. FFMPEG's command line output was also challenging for an inexperienced user when determining the source of an error.

With a beginner's level of knowledge of Linux commands and inexperience in setting up applications within the OS, the Linux distribution became disorganised and bloated. Installing libraries to support the development process required "symlinking", that is, manually installing an application and creating a symbolic link so that a terminal command points to the installed application. The reason for this is that, during the experimentation with FFMPEG, there were visual cues suggesting that the version of FFMPEG installed from the Linux repository was in fact not the official FFMPEG distribution. It was discovered that this was a forked version of the library from a developer who had left the project, which would certainly explain some of the performance and other issues that transpired while using it.

Re-installing the FFMPEG library manually from the official distribution on GitHub, and uninstalling the forked version, became more than a trivial task. The OS grew increasingly fragmented to the point where the development process stagnated. The best option at this time was to re-install Ubuntu in a separate virtual machine and begin again. This was not entirely a setback, as the reinstallation took a short period of time and any code developed could be pulled from the repository. This level of flexibility in the development process aided in overcoming trivial obstacles as well as complex problems.

Manually installing the FFMPEG library introduced another set of problems, despite following instructions from the resources available online. Missing tools meant systematically installing different applications until the manual installation progressed without error. Compared to the time it would have taken to repair the previous virtual machine instance, this method was certainly more effective.

The new installation was tested against a default video and the results were immediately noticeable. This greatly enhanced confidence in the development process, as a more reliable and less cryptic response could be observed in the output of the video being manipulated. Now that the library was installed correctly, some integration testing could be conducted on the system. This meant beginning a live stream from the mobile application running inside an emulated Android device and capturing the stream on the Linux virtual machine using FFMPEG. Again, VLC media player was used to verify whether the stream was being captured correctly within Linux.

Since the mobile application did not automatically create an SDP file, the generated string containing the streaming parameters was captured through the debugging process within the Android Studio IDE. The file was then created manually with the SDP configuration information, and whilst the emulator streamed the simulated video feed, VLC media player was used to load the stream based on the newly created SDP file.

This process represented a milestone in the development of the mobile application, as the streamed video content could be viewed in a media player or captured by FFMPEG. The new installation of FFMPEG provided less cryptic feedback during streaming, which made problems easier to debug and fix. At this iteration of the development lifecycle, where video streamed from an emulated mobile device could be broadcast and viewed after some manual set-up of the configuration file, it was decided that the development focus would move towards forwarding the stream to the web application for users to view.

The process of development was to iteratively integrate each functional component of the system at a working state. This approach adopts the idea that, however the functionality is achieved, the focus is to integrate a close-to-working model of the components; further refinements can be applied to the core at a later time.

Now that the stream was being captured, the next step was to forward it to a streaming server application that would make it accessible to the web application, so that the web application could use the broadcast address to display the live video feed to users. Before developing the web application, the streaming server application selected for the first iteration of development was FFServer, which is bundled with FFMPEG.

FFServer is a streaming server for both audio and video and supports several live feeds. In order to interact with FFServer, FFMPEG will need to capture the stream and forward the stream to FFServer. FFServer will contain a configuration file that allows for the captured stream to be converted or rebroadcasted over RTP/RTSP or HTTP. Several attempts were made to interlink the incoming stream to the server and this process became quite arduous. In addition the streams needed to be manually configured for each type of expected input.


Given that the proposed concept is to capture a number of incoming streams and automatically broadcast them to a web application, FFServer was not ideal for the continued development of the project. What was critical to the interconnection of incoming streams and the streaming server application was the ability of the user to view live streams. As highlighted in chapter 3, Nginx was the chosen streaming server application, and RTMP with Flash video remains the pragmatic route in the current climate until open source media formats become mainstream for video streaming.

6.2. The second iteration

The second iteration focused on dynamically connecting the mobile application to the streaming server and automatically forwarding the stream to a streaming server application. The functional requirements defined previously include being able to dynamically capture and broadcast streams automatically, so further research was needed to satisfy this requirement. After reviewing the available video streaming players, as well as the limitations of streaming different formats live, such as MP4, it was concluded that the best course of action was to use a streaming server application that could broadcast Flash-based media over RTMP. The format's viability has already been proven, with a majority of live video streams observable on many live streaming websites being delivered this way. This is in part due to Adobe's investment in its Flash Media Server, which delivers live streaming video at great scale but for an inordinate sum of money, beyond the scope of this project.

Research was necessary to discover open source alternatives and a number of options were available. Examples of RTMP streaming applications were found on GitHub and programming forums providing recommendations as well as shortfalls on a majority of the options. An open source alternative to Adobe’s Flash Media Server was discovered and contained a considerable amount of documentation to help beginners implement the library. This was the Nginx RTMP module that was developed by Roman Arutyunyan (Arutyunyan, n.d.) for Nginx. Not only would this module fulfil the requirements but it also enabled other options for further development.


The instructions to configure the Nginx RTMP module were clear; however, as with FFMPEG and the mobile application development, there was a learning curve to overcome.

Once Nginx was installed and the RTMP module was correctly configured, another attempt was made to capture the live incoming stream. This was again tested with VLC Media Player and, after constant debugging of the FFMPEG command used to capture the incoming stream and transcode it with the correct parameters into the format the Nginx RTMP module expects, the media player displayed the video.

Incoming mobile streams could now be broadcast to a streaming server application and observed within a media player; another milestone in the progression of the project had been attained. In essence, the second iteration of the project began with researching an alternative to the streaming server originally identified, and then proceeded with the development and integration of Nginx and the installation of the Nginx RTMP module.

The connection between the two systems had been established; whether this process could happen automatically, and whether the web application could display the live video feed, was the next phase of development.

6.3. The third iteration

Now that the live video feed was being pushed to the streaming server application, and it had been determined that the feed was being broadcast and was observable, the next phase was to develop the web application and attempt to display the live stream on a web page.

Chapter 4 has already defined which elements would be included within the web application, and many elements such as the CSS libraries and the JavaScript libraries were trivial to implement. The complexity of the web application lay in selecting the correct media plugin to display the live video feed. Two libraries were selected as they supported the RTMP streaming protocol for Flash-based media as well as video on demand: JW Player and Flowplayer. The approach to implementing these plugins involved creating a simple web page that incorporated the plugin files and setting up the JavaScript configuration for the default player to capture the live stream.

Then, within the Git repository for the web application, a new branch was created for each media player. By switching between branches a different HTML structure would become active, and the contents displayed on the web page would reflect those of the active branch. Switching between branches is very simple, and the goal of this approach was to test which media player would best suit the needs of the system.

JW Player proved the easiest to configure, supporting a wide array of configuration commands. JW Player was used as a free media plugin, with the ability to purchase a licence at a future date to enable further capabilities. To enable the plugin to work on the web page, an unusual requirement was imposed by the plugin: the developer had to register with the website and retrieve an access key. Once this key was embedded within the HTML page of the web application, the free version of JW Player could be utilised.

When testing JW Player it was important to know the URL of the server that was streaming the live video. In a real-world application this would simply be the domain name; however, the development environment involved a virtual machine hosting Ubuntu running over a local network. Having little experience in DNS management, and given the short amount of time available, the only alternative was to work with the local IP addresses of the machines. For each wireless router that was used, the streaming server was configured with a hard address of 192.168.0.4.


Figure 18 - JW Player live stream configuration


This allowed the media player to capture the stream from the IP address, with the additional configuration of the streaming server broadcast location for the active stream and the stream format.

In order for the player to capture the stream, the streaming server application must listen on a particular port, in this case port 1935. The broadcast should be sent to the URL of the RTMP module, including the application name, which in this case is “live”, as shown in figure 19.


Figure 19 - Nginx RTMP configuration
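As the configuration in figure 19 appears as an image, a minimal sketch of an equivalent nginx.conf fragment is given below. The port and application name come from the text; the remaining directives are illustrative assumptions based on the standard Nginx RTMP module:

```nginx
rtmp {
    server {
        listen 1935;          # RTMP port referenced in the text

        application live {    # the application name "live"
            live on;          # accept live streams pushed to this application
        }
    }
}
```

Each incoming stream pushed to `rtmp://<server>/live/<name>` then becomes available to players under the same URL.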

This allows the FFMPEG command to connect to the streaming server application and broadcast the stream based on the server IP, the application name (live) and the type of broadcast (flv). The command to broadcast the stream using FFMPEG is represented by the following:


Figure 20 - FFMPEG command

Figure 20 represents an FFMPEG command which takes the streaming file, created after capturing the configuration from the mobile application, and pushes the stream to the streaming application ‘live’ with a stream name of ‘test’ in FLV format. For this development cycle the incoming audio was ignored and the size of the broadcast was limited to a resolution of 320x240.
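As figure 20 appears as an image, a hedged reconstruction of the kind of command described is sketched below. The SDP file name, stream name, resolution and FLV output come from the text; the exact flags used in the project may have differed:

```sh
ffmpeg -i stream_0.sdp \                     # capture the incoming RTP stream via its SDP description
       -an \                                 # ignore the incoming audio
       -s 320x240 \                          # limit the broadcast resolution
       -f flv rtmp://192.168.0.4/live/test   # push to the "live" application as stream "test"
```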


Most mobile devices on a 3G network will only be able to obtain an upload rate of up to around 120 kbit/s; unless a user is on a faster network such as 4G, or connected to a wireless network with high bandwidth, this is the rate they can expect. It also allowed users of the web application to watch the video without interruption, as in most cases the download rate far exceeds the upload rate. Streamed video would also remain within a reasonable size of around 44 MB of data transfer for one hour of constant streaming, a figure consistent with that upload rate (44 MB per hour corresponds to roughly 100 kbit/s). This became more evident with the use of an Android device as opposed to an emulated device, which is discussed later.

By successfully connecting the mobile application, streaming server application and web application, the fundamental requirements of the system had been achieved: with a combination of technologies, a user with a mobile device can live stream to the streaming server and the stream is then broadcast to all users observing the web application. Each iteration cycle represents, at a fundamental level, the interconnection processes between each section of the overall system. Up to this point the challenges and obstacles of creating a simplified version of the final output of the project had been overcome.

A key element of the functional requirements is to achieve the above in an automated and dynamic way. This is where the ‘glue’, a set of applications, plays an integral part in expanding the functionality of the system through dynamic and automatic streaming. The ‘glue’ enables the system to retrieve a stream from multiple mobile devices, automatically push all streams to the streaming server application and, finally, automatically update the web application without the need to reload.

Chapter four identified two applications that would enable this functional requirement. Node.js and Socket IO will be used to enhance the user experience and form the basis of the final iteration of the development cycle of the system.

6.4. The fourth iteration

Up to this point the three segments of the overall system had been connected manually, and in order to achieve the dynamism required for multi user live streaming the connection processes between devices needed to be revisited. Node.js, a JavaScript-based server, would provide the necessary integration tools for the desired dynamic element. It would also enable the automatic processing of incoming streams, create the necessary SDP files for FFMPEG, call FFMPEG through a predefined set of commands and broadcast the newest stream to all users connected to the web application. This would be aided by Socket IO, which maintains an open connection between the Node.js server and the active web applications. Node.js would handle a majority of the functionality with a combination of external plugins and internal utilities.

The aim for the structural composition of the Node.js server was to keep the design modular, so that systems could be built within systems and dependencies minimised. Modular frameworks are available through external plugins, and one plugin which played an integral role in the development of the Node.js server was RequireJS. The RequireJS module significantly reduced development time by allowing a highly cohesive and loosely coupled architecture to be created.

At this juncture it is important to note that both the web application and the Node.js code follow the same implementation methodology: the RequireJS module used on the front end can also be used on the server side for developing modular code. RequireJS implements the Asynchronous Module Definition (AMD) pattern, which allows developers to encapsulate code into independent units of functionality.

If the system is to scale, perform well and remain easily maintainable, then this framework will aid in eliminating the complexities that can arise from a purely functional approach. To begin, the mobile application was refactored to broadcast the stream configuration to the Node.js server. The description of the stream contains information about how an application can retrieve the stream for audio and video; this is how FFMPEG is able to capture the stream and decode it in the correct format.


Figure 21 - SDP configuration for live streams

In order for the mobile application to communicate with the Node.js server, they need a predefined port; port 8089 was selected. Now, when the user starts streaming from the mobile application, the application sends a string of information to the server on this port.


Figure 22 - Send SDP configuration to server on socket 8089

To receive this information, Node.js needs to be running a JavaScript configuration file that is listening on port 8089. The configuration file essentially uses the ‘net’ module provided by Node.js, which allows for the creation of a server that listens on an assigned port. It is an asynchronous network wrapper which contains methods for creating both servers and clients (Joyent Inc, n.d.). Once the data is captured through the net module, a file is generated through the file system module, which is a wrapper around the standard Portable Operating System Interface (POSIX) functions.

Since Node.js is an event-based system, a number of events are triggered while the file is being created, including open, close and finish. These are executed in a particular order and become extremely useful when combined with Socket IO for taking action once the streaming file has been generated.

Now that the SDP file was being generated automatically, the next step was to take the FFMPEG command used to forward the stream to the streaming application and apply a level of automation to the process. Node.js contains a module which allows external commands to be executed in parallel. The module, known as ‘Child Process’, has a ‘spawn’ method which executes commands as if they had been entered into a command prompt or terminal application. By passing the command and an array of arguments, the command is executed without blocking the system. One of the advantages of this is that information and errors can be captured via stderr, stdout and stdin, the references to the input and output of the commands being executed, just as one would see them in a terminal application on Ubuntu.

Part of the functional requirements included allowing users to log in and log out of the system, but the core focus took precedence over this requirement and as a result the login mechanism was not developed. The data persistence process defined in the requirements engineering chapter was also not developed, due to insufficient time and the amount of work required to develop multiple applications. As each broadcaster would normally be assigned a unique identifier, allowing them to broadcast without overwriting other streams, a simple counter was employed to simulate this process.

That is, when a user first starts broadcasting, their stream name is assigned a default value of ‘test’ with the counter value appended; for each additional user that began streaming from their mobile device on the network, ‘test0’ would increment to ‘test1’. In the fully developed version of the application this would, as mentioned previously, be a unique identifier for the broadcaster. This means a dynamic streaming environment could be mimicked rather than fully implemented, a design decision based on the limited availability of time whilst still being able to demonstrate the core functionality of the system. As a result, the name of the streaming file and the name of the broadcasted stream were tied together by the counter value; an example would be ‘stream_0.sdp’ and ‘192.168.0.4/live/test0’.
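This counter-based naming can be expressed as a small helper. The pairing below is taken from the example in the text; in a full implementation the identifiers would come from a user-management component, so this function is only illustrative:

```javascript
// Sketch: tie the SDP file name and the broadcast URL together via a counter.
function streamNames(counter) {
  return {
    sdpFile: 'stream_' + counter + '.sdp',
    broadcastUrl: '192.168.0.4/live/test' + counter
  };
}

// streamNames(0) → { sdpFile: 'stream_0.sdp', broadcastUrl: '192.168.0.4/live/test0' }
```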

During the development process some architectural decisions had to be made. It is accepted that these may not have been anticipated: as this was the first time using many of the technologies, some implementation considerations could not be made until development of the functionality began.

Continuing from the ability of the Node.js server to broadcast the incoming stream dynamically, the method by which the Node.js server begins the FFMPEG command is based on the event model of the streaming file’s creation. When the ‘close’ event is emitted, this signifies that the file generation has completed and further action can be taken. Within this code block the FFMPEG command is called using a function that requires the location of the generated file and the name of the stream. To test this, the web application was manually edited with the name and location of the stream, which were passed to JW Player. Since the FFMPEG command had already been tested and proven to work, the automated setup was quickly confirmed, with viewers being able to watch live streams.


Figure 23 - FFMPEG method for streaming

When the close event for the created file is executed, the ffmpegInit method is called, which then captures the incoming stream based on the SDP configuration file; figure 23 highlights this process. Since the automatic capture of incoming streams from mobile devices was now complete, the next step was to dynamically update all users when a new stream had been created.

This is where Socket IO plays a pivotal role in achieving the objective of updating all viewers of the web application for a particular stream. Socket IO maintains a consistently open connection with the users connected to the site, and the communication between the server and all connected users is a simple but extremely effective process.

The key to integrating this application into Node.js is that all that is required is for events to be defined on both the Node.js server and the web application; when a particular action occurs, the information is broadcast either to all users connected to the web application, or to the server, depending on the action. Two main functions emerged from this. The first was that when a new stream is received from a mobile device, all users connected to the web application need to be updated with the stream location and the stream name.


The second function was that if a mobile application disconnects, the web application needs to be updated. Socket IO fulfils these functions by acting as the bridge between the web application and the Node.js server.

Once a mobile device begins streaming, the file has been created and the FFMPEG process has started, Socket IO sends the parameters to the web application using WebSockets or one of a plethora of fallback alternatives. When the data is received by the web application, based on the event generated by the server application, jQuery is used to append the stream to the web page and JW Player is then initialised with the data provided by the server.

There were obstacles in setting up this process dynamically, mainly that if development spanned multiple domains then Socket IO would throw an error regarding cross-domain rights; this includes different directories in which the application is running, as governed by the access origin policy. To overcome this issue it was decided that the web page could be served by the Node.js server listening for incoming connections on a different port, such as 8087. It essentially meant creating another web server within the same configuration file and serving the index.html file, so that users could view the web page at 192.168.0.4 but would have to include the port number to view the page.

The flexibility and capacity of Node.js to do multiple things at once allowed this problem to be resolved with the aforementioned solution. Before this solution, the web page was served by the installed Nginx server as a means to view the index.html file in a browser. This responsibility was removed from Nginx and given to the running Node.js file, which would now be listening on port 8089 for incoming connections from mobile devices and port 8087 for web page requests. The cross-origin policy error had now been overcome with this strategy, and several tests were done to ensure that the data was being passed between the Node.js server and the web application. When the data was received by the web application it needed to be parsed before further action was taken; this is the benefit of communicating through sockets between two JavaScript applications running on the front end and back end. The received data would be used to automatically add HTML elements to the DOM of the web application, and JW Player would then be initialised with the stream name based on this information.

This was successfully achieved through constant incremental development, with short bursts of coding combined with small units of testing to ensure that the application would operate as expected. Once the process behaved as expected, tests were conducted using mobile devices.

To test the newly added automation and stream dynamically, a controlled environment was created consisting of a D-Link DIR-615 wireless router, a Samsung Galaxy S3 and all the relevant components of the application running within the virtual machine. The first attempt at streaming from the mobile device was successful, with the stream auto-loading on the web application. Since the application is intended for multiple users, and multiple physical devices were unavailable, a virtual device was started to simulate more than one user.

The virtual device initiated the stream process successfully by creating a new SDP file, auto-pushing the stream with FFMPEG to the Nginx RTMP module and broadcasting the stream name and stream location to the web application. The web application received the configuration parameters and initialised JW Player, which then seemed to remain in an initialisation state. A command was executed in the terminal application within Ubuntu to check for running processes; a quick search confirmed that an FFMPEG process was running for each of the two devices. More investigation was required if the final objectives of the streaming system were to be achieved.

6.5. The final iteration

The system was at an extremely pivotal point in the project; a considerable amount of time had been spent integrating and testing the different applications. Not all elements worked as expected and multiple obstacles had been encountered. They were overcome thanks to the flexibility of the code and the availability of support from the development community.


To capture the output of the error being generated, the logging facilities of Node.js were used. The error stated that FFMPEG could only bind to one port at a time for incoming streams. Within the SDP file both audio and video are sent from the mobile device, each bound to a separate port. By analysing both SDP files generated automatically by the streaming process (stream_0.sdp and stream_1.sdp), it was discovered after much debugging that the mobile application which created the SDP configuration was using the same port numbers for every device, so each FFMPEG process attempted to bind to the same ports. This meant that all broadcasts from mobile users were contending for the same ports and therefore parallel streams were not possible.

One positive finding was that if multiple browser windows were open when a stream began, all open browsers connected to the web application would simultaneously update with the new video feed. The final development task was therefore to set up mobile devices so that they caused FFMPEG to bind to separate ports. To overcome this challenge, the mobile application needed to select a port on which no other mobile application was streaming, for both audio and video, and include this within the configuration file broadcast to the Node.js server application. This would entail managing a list of available ports within the application on Ubuntu so that cross-binding did not occur. A substantial amount of development would be required to complete this final task, integrating OS commands as well as managing connected mobile devices.

The framework was in place to enable this functionality given a longer development cycle, and a majority of the core functionality had been achieved. After further reviewing the Libstreaming library within Android Studio, the point at which the ports for audio and video were assigned offered a springboard for a quick solution. Instead of assigning the same port for all mobile devices, an option for the initial stages of the application was to randomly select a port within a range of ports and then send the configuration file to the Node.js server. A range between 30,000 and 50,000 was selected and the selection was randomised, as shown in figure 24.


Figure 24 - Random port selection
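The randomised selection in figure 24 was implemented in the Android (Java) code within Libstreaming; an equivalent sketch in JavaScript, using only the port range stated in the text, would be:

```javascript
// Sketch: pick a random port in the 30,000-50,000 range for each stream,
// so concurrent devices do not ask FFMPEG to bind the same port.
function randomStreamPort(min = 30000, max = 50000) {
  return min + Math.floor(Math.random() * (max - min));
}
```

With a small number of simultaneous devices the chance of two devices picking the same port in a 20,000-port range is low, which is why this was acceptable for the prototype but not for the centrally managed solution the text goes on to describe.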

Armed with the randomised process, the test was repeated using a Samsung Galaxy S3 and an emulated Android device. This solution correctly solved the multi user streaming aspect of the system, as FFMPEG was now able to bind to multiple ports. Demonstrating the multi-streaming aspect of the system with this process is acceptable when a small number of users are streaming simultaneously. A better solution would be for the Node.js server to maintain a list of available ports and, when the mobile application requests a connection, send it two available ports.

Centrally managing this process is the commercially viable solution; however, at the beta stage of the project, and with a small number of devices connected to the network, the multi user aspect was successfully achieved. With all the components communicating and a basic level of robustness achieved, the next step was to test the application with multiple users in a real-world environment, which is reviewed in the next chapter.

6.6. Results

The results are shown below in figure 25 and demonstrated as a three-step process. Two aspects of the system are demonstrated: that each browser represents a user on a different computer awaiting a connection, and that more than one user can stream live to the system.


Figure 25 - System results step 1


The first step was to show three browsers awaiting an incoming stream. A Samsung Galaxy S3 was used to begin streaming to the streaming server. When the stream begins from the mobile device, the expected result is for the video to load in all browser windows. Figure 26 shows this effect at the initialisation stage, whilst the stream is buffering.


Figure 26 - Retrieval of a live stream

Once the buffering stage completed the web application began to display the live feed from the mobile device as shown in Figure 27.


Figure 27 - Live stream displayed on multiple browsers

To ensure that the streaming system supports multi user streaming a virtual device was started and the mobile application was launched with a streaming session initiated from the device. The stream is successfully received from the mobile device and is broadcasted to the streaming server application as shown in Figure 28.


Figure 28 - Successfully added second live stream

Some conclusions can be drawn from the implementation of the different components of the system. A stream set at a resolution of 320x240, with only video streamed for one hour, results in a total data transfer of 44 MB. This can scale very quickly if multiple users are streaming simultaneously; the virtual machine, assigned a single processor with one core, was nonetheless able to support this process. The mobile application will also need to be revisited, and perhaps re-implemented, to optimise resources and performance.

With all the components communicating with each other, the next step was to test the application with multiple users in a real-world environment; this is reviewed in the next chapter.

7. Product Evaluation

The aim of this project was to create a proof of concept by which users could stream live from a mobile device to a streaming server, with viewers able to observe the stream through a web application. In terms of the objectives of the project this was successfully fulfilled. The end product is best characterised as a working prototype and is by no means a complete streaming system.

To evaluate the system two testing methods were used: integration and usability testing. The system was also evaluated against a set of software quality metrics, which provides insight into the maintainability, complexity and reliability of the system.

The first form of testing existed throughout the entirety of the development lifecycle: integration testing and debugging. Multiple applications communicating with each other meant making small incremental changes to code and acquiring feedback. As has been seen, development evolved through several iterations, each completed cycle representing an aspect of the overall system that was now functional. The final iteration focused on tying the sections of the system together, and this would not have succeeded without testing being conducted throughout.

The second approach involved usability testing in which multiple users would attempt to stream with different Android devices in a controlled setting to the system. The results were recorded and a feedback session was conducted at the end of the testing session.

7.1. Integration testing

As mentioned previously, an integration-testing-based approach was utilised at the end of each development cycle throughout the entirety of the project. With so many different applications in use, an integration testing method was appropriate. To guarantee robustness, each iteration needed to be verified and deemed functional before other systems could be integrated. Through testing the results at the end of each iteration, the necessary adjustments could be made to enhance the application.

7.1.1. Testing approach

Given the inexperience with developing for these platforms, no formal testing method was implemented; however, expected behaviour was identified at each stage. In the first iteration of the development lifecycle, where the goal was for the mobile application to stream video and for that video to be viewable, a number of behaviours were identified.

These behaviours were originally defined in the requirements engineering section in chapter 3, and the core functionalities were the predominant focus of the testing process. During development of the mobile application, for example, a number of errors were thrown while implementing the chosen library. Each error was reviewed and researched, a change was applied, and the result was observed and noted. If the change fixed the issue, development progressed to the next stage; if not, further research was required until a resolution was achieved.

Figure 29 - Testing process on development tasks

The cycle of development and testing at each stage is represented by figure 29. This iterative approach was repeated throughout each development cycle. A successfully completed iteration was normally indicated by a visual confirmation or by the desired output from the execution of a command or set of processes.

This could usually be confirmed from the output of the section of the system under test. Feedback mechanisms were provided by the platform on which each technology was deployed: for the mobile application, output was viewed using Logcat; for the streaming server, output was read from the execution of commands in the terminal; and for the web application, Chrome's browser console was used to monitor output.

7.2. Usability Testing

The end product of development was also tested through a usability testing session. A group of five individuals was gathered and given an introduction to the project and its purpose. Each user received prior instructions highlighting that they must own an Android device to take part in the testing session. After a short presentation providing background to the project, users were instructed to follow a sequence of steps to connect to the network and begin the streaming process.

7.2.1. Preparation

The setup of the testing session involved a D-Link DIR-615 wireless router and a laptop running an Ubuntu 12.04 distribution within VMware on Windows 8. The VMware configuration for Ubuntu contained 1GB of memory and a single 2.7GHz processor using one of four cores.

The prerequisites for the system involved starting the Nginx server and running a Node.js command on a JavaScript configuration file to listen for incoming connections. The web page was then loaded so that users could view the streams being automatically added to the web application.

Users were then instructed to connect to the wireless router and to browse to 192.168.0.4/downloads, where the application was available. Once all users had downloaded and opened the application they were instructed to wait until further notice.

7.2.2. Testing Session

The testing of the application began with users being instructed to start streaming one at a time. The user with the Samsung Galaxy Mini began streaming but the stream immediately failed: a new video was added to the screen and then removed from the web application as the stream had failed.

The next user, with a Samsung Galaxy S3, was instructed to stream and the stream was successfully added to the web page. Once it began playing, which was noted, the next user, who owned a Google Nexus S, was instructed to begin streaming. Having used the device to stream before, this user was instructed to change the camera with which the stream would be captured; the stream was successfully added to the web application. The fourth user, who owned a Google Nexus S running Cyanogenmod, was then instructed to stream. This stream was also added successfully, and three streams were now displayed simultaneously. The user with the Samsung Galaxy Mini attempted to stream again and on this occasion the stream succeeded; however, the Google Nexus S stream failed after a period of time.

One user, who owned an HTC Desire S, was unable to download the application although all other users could. This may have been due to the browser used (Opera), as the OS version met the minimum requirements; this user was therefore unable to test the application.

Active users were then instructed to disconnect and reconnect their streams, whereupon the streams were removed from the web application. One “ghost” stream remained after all users had disconnected: it belonged to a stream that had failed earlier, and it persisted throughout the testing session because the remove command was never sent to the web page.

The user with the Samsung Galaxy Mini re-attempted to stream and was able to do so. However, when a fourth stream was added other devices failed and the active streams were reduced to two.

A log was taken of the streaming session by announcing the occurrence of events within the execution of the code blocks. Appendix 7 contains the full log list.

7.2.3. Findings

A virtual machine running on a single core was able to serve an impressive number of streams and handle multiple incoming connections. Not all users experienced immediate success, but by reconnecting multiple times they eventually became able to stream. The active streams exhibited a delay of 3 to 10 seconds, which grew as the stream time increased. This is highlighted by Dufaux et al (2013, p. 193), who note that streaming applications accumulate delay and incur higher packet loss rates over time.

Streams that were added but had failed still remained on the web application and were not automatically removed, since no signal was sent from the server to indicate that the stream had terminated. Streams that failed could be restarted and re-added to the web application for other viewers to see. Users who rotated the application whilst streaming caused their session to crash and needed to restart the streaming session. The number of streams that the system can serve in parallel is unknown, as using different devices introduces the need for further testing; in addition, the Android market is highly fragmented, with a multitude of brands running different versions of the Android OS.

7.2.4. Feedback and Questions

After the testing session and providing users with time to absorb the experience, a feedback and question session commenced.

Users felt that the concept was “very cool” and that they “really like it”. One user saw it as an opportunity to sell the concept to Google Glass, as it would tie into their strategy. Users highlighted that they had seen a similar concept but not in the same format, and stated that they could see its application in a social and commercial context. Users enquired about the target market of the application and were advised that it is for anyone with a mobile device who intends to share their experience, whether at a football game, a concert, a conference or a convention, as well as for other commercial applications.

The testing group discussed the implications of streaming concerts live without recording, since broadcasting rights are typically held by a corporation. They suggested that it may be acceptable because the event is not being recorded but rather relayed to viewers. One user asked how the broadcasters of events could be grouped together; it was explained that users would have the option to enter a description of their broadcast before beginning. The testing session only included the stream page of the application, which allowed users to begin streaming but not to enter event details.

Another concern raised was that the multi user live streaming concept could introduce undesired streams from broadcasters. It was recommended that the streams be monitored and administered in some way for unwanted content. In the discussion one user noted that this is a hard problem, and that tasks such as testing would be difficult across Android's fragmented ecosystem of mobile devices as well as legacy versions of the operating system. One member identified the technology as part of the future of television, whilst another wanted to be able to time-shift within the live stream to a previous point in the broadcast.

Users of the system were provided with the most basic functionality to stream from their mobile phone. The testing session aided in discovering bugs within the application and provided insight into what users expect from the system. The users who tested the system had not experienced this before and a flood of questions ensued with positive feedback on the implications of the technology.

7.3. Software Quality Metrics

Quality metrics are a measure used to obtain statistical data about an application or system: data by which the complexity, reliability and maintainability of a code base can be measured. For this project it was difficult to obtain metrics for the JavaScript-based implementations, given that the language differs from traditional programming languages. Other languages, as well as IDEs such as Eclipse and Microsoft Visual Studio, already have features through which quality metrics can be used to measure a system.

Utilising quality metrics for a JavaScript-based server as well as the front end required some investigation and creative problem solving. A library that integrates with Node.js, called Plato, was discovered: a JavaScript source code visualisation, static analysis and complexity tool. An analysis was conducted against the source code of the web application and the streaming server running on Node.js, and the results were obtained from Plato.

Figure 30 - Quality metrics

The key outcome of these metrics is the ease of maintainability of the project. The most important files are ServerToMobile_alt.js, Global.js and Stream.js.

In the case of ServerToMobile_alt.js, this file is the execution environment for Node.js and handles the interaction between the mobile devices and the web application. It is only 81 source lines of code, yet carries a maintainability index of 75%. This is a remarkable result given that, as mentioned previously, this file is responsible for retrieving input from a mobile device, creating a configuration file to be used by FFMPEG to begin streaming, forwarding the stream to the streaming server application, and updating all users connected to the web application with the live feed.

Figure 31 - ServerToMobile_alt.js quality metrics

The estimated error count based on Halstead's Delivered Bugs metric was only 0.79, and the maximum number of distinct paths through any given function, based on McCabe's cyclomatic complexity, was 2.
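For context on the 0.79 figure: Halstead's delivered-bugs estimate is commonly computed as B = V / 3000, where the volume V = N · log2(n) for N total and n distinct operators and operands. The sample numbers below are invented for illustration, not taken from the project's source.

```javascript
// Halstead volume: total token count scaled by the information content of
// the vocabulary (log2 of the number of distinct operators and operands).
function halsteadVolume(totalTokens, distinctTokens) {
  return totalTokens * Math.log2(distinctTokens);
}

// Classic delivered-bugs estimate: one expected bug per 3000 units of volume.
function deliveredBugs(volume) {
  return volume / 3000;
}

// A reported estimate of ~0.79 bugs therefore corresponds to a volume of
// roughly 0.79 * 3000 = 2370.
```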

Figure 32 - ServerToMobile.js metrics report

From figure 32 it can be observed that the command to initiate the FFMPEG process, a reusable function, is only 23 lines of code and has low complexity. The web application revolves around Global.js, which is mainly used to receive updates from the server for new and deleted streams. Again, it is surprising that so few lines of code can accomplish such functionality and interaction with other systems.
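The pattern described for Global.js can be sketched as follows. The names are illustrative, and the DOM and JW Player work is abstracted behind callbacks, so only the bookkeeping that mirrors the server's add/remove notifications is shown.

```javascript
// Keep a registry of players keyed by stream name and react to server pushes
// for new and deleted streams. onAdd/onRemove stand in for the jQuery and
// JW Player code that would create or tear down a video element.
function createStreamRegistry(onAdd, onRemove) {
  const players = new Map(); // streamName -> whatever onAdd returns (a player)
  return {
    // Server says a new stream is live: create and remember its player.
    addStream(streamName) {
      if (players.has(streamName)) return; // ignore duplicate notifications
      players.set(streamName, onAdd(streamName));
    },
    // Server says a stream ended: tear down and forget its player.
    removeStream(streamName) {
      const player = players.get(streamName);
      if (player !== undefined) {
        onRemove(streamName, player);
        players.delete(streamName);
      }
    },
    count() { return players.size; },
  };
}
```

In the real page the add/remove events would arrive over Socket IO and the callbacks would insert or remove a JW Player instance in the document.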

7.4. Conclusion

The testing approach used validates the integrity of the system. Ensuring that components could communicate with each other at the end of each iteration cycle enabled the project to progress through its stages.

The metrics have provided insight into the quality of the development techniques used to build the system. Low error estimates, high maintainability scores and low complexity values signify that the quality of the product is of an acceptable standard.

8. Process Evaluation

Selecting the right methodology was one of the key factors in the project outcome. The component-based approach served as a flexible model by which the system could be developed with the freedom and independence necessary to integrate components at a later point.

8.1. Key processes and tools

This was not the only project management decision: as highlighted in chapter 6, Git was used to manage the source code during development. This was an excellent choice, as it gave more control over the development process and made it easier to experiment with other libraries, as seen in the case of implementing the two media players in the web application.

Figure 33 - Git development timeline

Figure 33 represents the timeline of the development life cycle of the server-based application (streamserver) and the web-based application (streamapp); the final working files were combined into a separate repository (streamweaver).

The components defined in the system modelling section proved to fulfil the functionalities identified in the requirements section. Integrating these components required a number of solutions, and the decisions behind them were motivated by research, time constraints and requirements gathering. Because of inexperience with all of the selected technologies apart from those used in the web application, the choices made were driven by a best-fit approach.

The tools used in the project included:

ProcessOn – a free online diagramming tool
Zotero – a reference management tool
Git – source code and version control
Plato – a software metrics analysis tool for JavaScript
Code Pro Tools – metrics analysis for Java
Microsoft Project – project management
Google Drive – file synchronisation across devices
Scrivener – an efficient writing tool

Alongside the applications chosen in the development process, these tools contributed to better project management, research, development and administration.

8.2. Project Management

At the beginning, attention was placed on the management of the project and attempts were made to adhere to the timelines defined in the project Gantt chart. The challenge with this approach was that it was difficult to estimate the time necessary for research and, more specifically, for the implementation process: how the outcomes of the project would be achieved, and with which technologies, changed as the project progressed. The Gantt chart in Appendix 11 was therefore used as a reference point, with some detachment from processes that exceeded their allotted time. The initially determined completion date of March proved unrealistic, as the workload had increased substantially once the implementation technologies were selected, especially for the streaming server configuration.

The project size grew quickly: each selected technology introduced its own learning curve, unforeseen setbacks occurred, and the availability of community support was limited. Management of the project accounted for a reasonable amount of time, while design, development and the report were the main activities that consumed the majority of the project time.

Frequent meetings were held with the supervisor to receive guidance on the large amount of work required to implement the project. Deciding which features to focus on and which to remove increased the chances of success and the manageability of the project, and these meetings helped to ensure that focus on the objectives was maintained.

8.3. Reflection

The most important features of the system were implemented with reasonable success. The learning curve for each of the applications meant an inevitable delay in the completion of the project, and problems unrelated to the development of the system itself, such as OS application management and installation, consumed a disproportionate amount of debugging and testing time.

The project was ambitious and led to acquiring knowledge of technologies never used before. It was difficult to determine how much work would be needed to meet the objectives, which caused considerable anxiety and, at times, a feeling of despair. The research domain was large and the number of probable solutions equally so; some development tasks ended in failed states and needed to be revisited, re-researched and redeveloped.

Overall, the feeling of accomplishment and the positive feedback received about the concept are extremely gratifying. One of the testers from the user testing session was so impressed by the application that they offered a server on which to develop it further. Others asked when the application would start making money, and were impressed by the size of the project and its successful outcome; they could grasp the implications of the concept and envision how large the application could potentially become.

9. Conclusion

9.1. Introduction

The aim of this project was to build a system which allows users to broadcast and view live video streams over the internet from an event. This chapter critically analyses the objectives against the outcomes of the project and outlines future work.

9.2. Achievement of objectives

9.2.1. Objective 1

The first objective was to identify other live streaming services and social video sharing applications. This was addressed in the literature and technology review, which gave an understanding of the types of applications that exist, the technologies used for video streaming, the types of cellular networks and the limitations of each generation, and the evolution of mobile technologies. Since the proposed system provides the versatility of live streaming while remaining mobile, other factors such as the legal implications were also investigated; existing streaming systems face difficult challenges when confronted with copyright claims.

This objective was successfully achieved, and a strong technical background was gained by identifying the technologies involved in video streaming.

9.2.2. Objective 2

The second objective was to identify the requirements of the system based on the existing technologies needed to accomplish the project. By reviewing, testing and navigating existing applications, the requirements of the system were identified; the multi user live streaming concept is not something that exists in the available application domain. Requirements were also determined through observation and trends within the events industry.

Overall this objective was achieved, as a substantial number of requirements were gathered for the implementation of the system by piecing together results from research, testing and observation.

9.2.3. Objective 3

The third objective was to select and justify the technologies used to build the system architecture identified in the requirements stage. The system modelling chapter covers the selection of technologies for each segment of the system.

Technology selection was driven by the results of the requirements stage. Choosing which platform to develop for, the fastest route to development while maintaining the desired quality attributes, and which components would fulfil the functional and non-functional requirements were fundamental to the outcome of the project.

This objective was met successfully.

9.2.4. Objective 4

The fourth objective involved designing the structure of the system and mapping the components of the different technologies identified to build it. The selection criteria for these components leaned heavily towards the experience of the developer: a Java-based solution was used for the mobile application, a JavaScript-based solution for the server application, and an HTML/CSS/JavaScript solution for the web application.

These choices were driven by the appropriateness of the functionality offered, the opportunity to learn new technologies, and prior experience. The selection of these components led to the completion of the aims of the project.

9.2.5. Objective 5

The fifth objective was to implement the system using the identified technologies, based on the design of the system and the functionalities identified in the requirements.

This was a challenging task that led to much experimentation, and several difficulties were experienced during development. The design choices, applications and components selected ensured that the project would fulfil, in part, the core functionalities of the system; not all functional requirements identified were fulfilled for each section of the application.

The implementation met with mixed success: the core functionality connecting the components was achieved, but not all functional requirements were completed. The functionality that was missed was mainly the login system for users; to work around this, an alternative implementation which simulated a unique streamer was built. On due consideration, these functionalities were identified as “shoulds” in a MoSCoW-based prioritisation of requirements.

The core functionalities were identified as “musts” and were the primary focus of combining multiple applications and technologies. This objective was fulfilled, as the end product demonstrated the core functionality.

9.2.6. Objective 6

The sixth objective was to test the system through integration and user testing. Integration testing played a key role throughout the development of the project, as is evident in the product evaluation section as well as in the implementation section. Using the iterative method to piece together each component, which would then become part of a larger system, was the correct approach. The system became large very quickly, and caution was exercised so as not to become absorbed in its complexity.

Testing verified a number of concerns and considerations, and the feedback retrieved from the user testing session provided insight and encouragement, with users responding positively to the application.

9.2.7. Objective 7

The seventh objective was to evaluate the system using applicable metrics and analysis tools, and to evaluate the conduct of the project including the achievement of aims, project management and choice of technologies.

Using software quality metrics provided a statistical perspective on the quality of the product. The results from the metric analysis tools were positive, with changes required for future development. Overall, the objectives of the system were achieved with a strong degree of success.

One area for improvement is the management of the project, more specifically time management. At the beginning a timeline was mapped for the entire project; to accommodate the uncertain nature of the processes involved, the timeline was used as a guide, but the actual project progressed quite differently.

Overall this objective was achieved as a metrics analysis was conducted and the process evaluation was completed.

9.2.8. Comparison of application to the work of others

For this project there are no other applications which utilise a combination of technologies in this way to deliver a live sharing experience for multiple simultaneous users. Applications such as Ustream.tv allow users to stream from a mobile device to a central server, which then displays the stream in a media player in a browser. This project has taken live streaming to a new level by introducing the unique approach of multi user live streaming, consolidated in a central location and broadcast instantly to all viewers.

The difference between this project and commercial applications is the near-real-time updates viewers receive when watching streams and the consolidation of similar streams into a single location or web page. Given that the implementation allows this to happen automatically with dynamic updates, the sharing experience is pushed even further.

9.3. Further Work

The project has opened a considerable number of avenues for future work, including improving the functionality of each component of the system. The delivered application is in essence a prototype which has achieved much success in the form of a completed system; it is by no means a commercially ready product, and there are numerous improvements that would contribute towards future work.

For the mobile application, users would benefit from a better analysis of their environment in order to determine the best streaming quality. Mobile users suffer from limited upload rates on 3G networks as well as variations in the capabilities of their devices. The ideal approach would be to take these factors into account before allowing the user to broadcast to the system. This may lead to developing algorithms that optimise the user's experience based on past broadcasts within the same area, the performance of similar hardware, and the type of network the user is on.
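A purely hypothetical sketch of such a heuristic is given below; the thresholds, network labels and bitrate ladder are invented for illustration, not derived from any measurement in the project.

```javascript
// Pick a video bitrate from the measured upload rate and network type before
// allowing a broadcast. Thresholds and values here are illustrative only.
function chooseBitrateKbps(networkType, measuredUploadKbps) {
  // Leave generous headroom: streaming at the full measured rate invites
  // buffering as conditions fluctuate, especially on cellular links.
  const headroom = networkType === 'wifi' ? 0.8 : 0.6;
  const budget = measuredUploadKbps * headroom;
  const ladder = [1500, 800, 400, 200]; // candidate encoder rates, high to low
  for (const rate of ladder) {
    if (budget >= rate) return rate;
  }
  return 0; // below the lowest rung: refuse rather than fail mid-stream
}
```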

The system would have benefited from data persistence through a database such as MySQL or MongoDB. This would have aided in fulfilling the login functionality described in the requirements section.

The system in its current form exhibits characteristics of instability. Testing has shown the need for finer-grained control over the streaming server, especially over FFMPEG. This may require the development of a wrapper that is better at handling the output from FFMPEG: a greater degree of control can be obtained by managing that output and making decisions based on it.
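Such a wrapper might begin as sketched below. The patterns used to classify FFMPEG's stderr are illustrative guesses at typical output, not an exhaustive grammar of its logging.

```javascript
// Classify FFMPEG stderr lines so the server can distinguish routine
// progress from failures and react (restart the relay, notify viewers).
function classifyFfmpegLine(line) {
  if (/^frame=/.test(line.trim())) return 'progress'; // periodic status lines
  if (/error|failed|invalid/i.test(line)) return 'error';
  return 'info';
}

// In use, the wrapper would attach this to a spawned process's stderr, e.g.:
//   const child = require('child_process').spawn('ffmpeg', args);
//   child.stderr.on('data', (buf) => {
//     for (const line of buf.toString().split('\n')) {
//       if (classifyFfmpegLine(line) === 'error') { /* restart / clean up */ }
//     }
//   });
```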

Users may wish to view an event after the fact, with all the streams related to that event. This was not developed; however, the flexibility of FFMPEG allows the incoming stream to be output in multiple formats for all platforms while also being broadcast to the streaming server application.
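FFMPEG accepts several outputs in a single invocation, which is the flexibility referred to here: the same incoming stream can be relayed live and, say, written to a file for later viewing at once. A hedged sketch of such an argument list (the paths and URL are placeholders):

```javascript
// Build an argument list with two outputs: a live RTMP relay and a
// simultaneous recording. FFMPEG applies the options preceding each output
// file to that output.
function buildRelayAndRecordArgs(inputPath, rtmpUrl, archivePath) {
  return [
    '-i', inputPath,
    // Output 1: live relay to the RTMP server, no re-encoding.
    '-c', 'copy', '-f', 'flv', rtmpUrl,
    // Output 2: simultaneous recording for later playback.
    '-c', 'copy', archivePath,
  ];
}
```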

Further to this, one of the most exciting opportunities for further development is to analyse all incoming streams and synchronise them simultaneously based on the surroundings of the event. This could lead to image analysis and the design of algorithms that analyse the contents of a stream, providing a more complete experience for users.

There are limitations on the number of streams that can be broadcast simultaneously, as well as on which devices stream successfully. Further investigation would be required to improve the overall stability of the system and to extend the mobile application to support more devices.

A more sophisticated method is required for handling active streams and how they relate to other streams being broadcast. Data persistence would be useful here, and it could be combined with Socket IO's ability to section users of the web application into separate events.
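Socket IO's rooms map naturally onto grouping viewers by event. The sketch below demonstrates the pattern (socket.join and io.to(room).emit) against a minimal in-memory stand-in for the Socket IO API, so the grouping logic runs without the library; a real server would make the same calls on actual sockets.

```javascript
// Minimal stand-in for the parts of the Socket IO API the pattern needs.
function makeFakeIo() {
  const rooms = new Map(); // room name -> Set of sockets
  return {
    connect() {
      const socket = {
        received: [],
        join(room) {
          if (!rooms.has(room)) rooms.set(room, new Set());
          rooms.get(room).add(socket);
        },
      };
      return socket;
    },
    to(room) {
      return {
        emit(event, payload) {
          for (const s of rooms.get(room) || []) s.received.push({ event, payload });
        },
      };
    },
  };
}

// Grouping by event: each viewer joins the room for the event they are
// watching, and stream updates are emitted only to that room.
function watchEvent(socket, eventId) {
  socket.join(`event:${eventId}`);
}
function announceStream(io, eventId, streamName) {
  io.to(`event:${eventId}`).emit('addStream', { streamName });
}
```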

The server application would benefit from tools which listen for failures of the Node.js server, and the streaming system would require a substantial set of system administration tools within the Linux environment to ensure a robust experience for users.

Generally, the web application requires a design and development overhaul for front-facing users if it is to be deployed in a usable context. The complexity of the interaction between all components of the system requires further development, testing and fixes to take the system to where it needs to be.

The streaming system is at an extremely early stage: it is able to demonstrate multi user live streaming with live updates for all users watching. This project was an ambitious attempt at combining multiple components to fill a gap in social video sharing. A great deal has been learned, multiple languages and components were utilised, the aim of the project was achieved, and the objectives were fulfilled.

References

1 Billion Android Devices Were Activated in Just A Few Years, Making it The

Fastest-Growing Platform Ever [WWW Document],

Review. URL http://www.technologyreview.com/graphiti/520491/mobile-

MIT Technology

makeover/ (accessed 11.7.13). About Twitter, Inc. | About [WWW Document], n.d. Twitter About. URL https://about.twitter.com/company (accessed 3.29.14). Albrecht, C., 2009. Justin.tv Placates Copyright Holders With Anti-piracy Tech. Gigaom. Amadeo, R., 2013. Google’s iron grip on Android: controlling open source by any means necessary (Wired UK) [WWW Document]. Wired UK. URL

http://www.wired.co.uk/news/archive/2013-10/21/googles-iron-grip-on-android (accessed 1.5.14).

Arthur, C., 2011. Egypt blocks social media websites in attempted clampdown on unrest. The Guardian.

Arutyunyan, R., n.d. Streaming with nginx-rtmp-module.

BBC, 2013. Mobile phone hits 40th birthday. BBC.

Bright, P., 2011. Microsoft Buys Skype for $8.5 Billion. Why, Exactly? [WWW Document]. Wired Business. URL http://www.wired.com/business/2011/05/microsoft-buys-skype-2/ (accessed 3.30.14).

CCS Insight, n.d. 4G Gathers Momentum as Smartphones Smash One Billion Units in 2013 [WWW Document]. URL http://www.ccsinsight.com/press/company-news/1724-4g-gathers-momentum-as-smartphones-smash-one-billion-units-in-2013 (accessed 11.6.13).

Cha, M., Kwak, H., Rodriguez, P., Ahn, Y.-Y., Moon, S., 2007. I tube, you tube, everybody tubes: analyzing the world’s largest user generated content video system. ACM Press, p. 1. doi:10.1145/1298306.1298309

Cheng, J., 2008. The truth about the iPhone’s sales numbers [WWW Document]. Ars Technica. URL http://arstechnica.com/apple/2008/01/the-truth-about-the-iphones-sales-numbers/ (accessed 4.22.14).

Cisco, n.d. Cisco Visual Networking Index: Forecast and Methodology, 2012–2017 [Visual Networking Index (VNI)] [WWW Document]. URL http://www.cisco.com/en/US/solutions/collateral/ns341/ns525/ns537/ns705/ns827/white_paper_c11-481360_ns827_Networking_Solutions_White_Paper.html (accessed 10.22.13).

Company Info | Facebook Newsroom, n.d.

Daily, G., 2007. Ten Questions: Intellectual Property, Copyright, and Streaming Media [WWW Document]. Streaming Media Magazine. URL http://www.streamingmedia.com/Articles/ReadArticle.aspx?ArticleID=65175 (accessed 4.24.14).

Dapeng Wu, Yiwei Thoms Hou, Ya-Qin Zhang, 2000. Transporting real-time video over the Internet: challenges and approaches. Proceedings of the IEEE 88, 1855–1877. doi:10.1109/5.899055

Dilger, D.E., 2010. Inside iPhone 4: FaceTime video calling [WWW Document]. URL http://appleinsider.com/articles/10/06/08/inside_iphone_4_facetime_video_calling (accessed 3.30.14).


Dodd, A.Z., 2012. The essential guide to telecommunications. Prentice Hall, Upper Saddle River, NJ.

Dufaux, F., Pesquet-Popescu, B., Cagnazzo, M., 2013. Emerging technologies for 3D video: creation, coding, transmission and rendering.

Elgin, B., 2005. Google Buys Android for Its Mobile Arsenal. BusinessWeek.

Global mobile statistics 2013 Part B: Mobile Web; mobile broadband penetration; 3G/4G subscribers and networks [WWW Document], n.d. URL http://mobithinking.com/mobile-marketing-tools/latest-mobile-stats/b#mobilebroadband (accessed 10.22.13).

Google, 2014. Copyright Removal Requests – Google Transparency Report [WWW Document]. URL https://www.google.com/transparencyreport/removals/copyright/ (accessed 4.23.14).

Holland, J., 2012. Courts find Justin.tv not guilty of “stealing cable” in lawsuit filed by UFC [WWW Document]. MMAmania.com. URL http://www.mmamania.com/2012/3/22/2893430/ufc-lawsuit-justin-tv-not-guilty-internet-piracy-online-video-stream-sopa (accessed 4.23.14).

Joyent, n.d. node.js [WWW Document]. URL http://nodejs.org/ (accessed 4.13.14).

Joyent Inc, n.d. net Node.js v0.10.26 Manual & Documentation [WWW Document]. URL http://nodejs.org/api/net.html#net_net (accessed 4.18.14).

Ladha, M., 2014. Skype’s Pursuit of the Perfect Video Call. Skype Blogs.

Lee, H., Lee, Y., Lee, J., Lee, D., Shin, H., 2009. Design of a mobile video streaming system using adaptive spatial resolution control. IEEE Transactions on Consumer Electronics 55, 1682–1689. doi:10.1109/TCE.2009.5278043

Levy, V., 2013. Ustream Cloud Platform Delivers Global Audiences to Any Broadcaster, Anytime | Ustream Newsroom.

libstreaming – A solution for streaming H.264, H.263, AMR, AAC using RTP on Android [WWW Document], n.d. URL https://github.com/fyhertz/libstreaming (accessed 3.17.14).

Minoli, D., 2012. Linear and nonlinear video and TV applications using IPv6 and IPv6 multicast. Wiley, Hoboken, N.J.

Morrill, D., 2008. Announcing the Android 1.0 SDK, release 1 | Android Developers Blog.

Musil, S., 2013. YouTube defeats Viacom copyright lawsuit -- again [WWW Document]. CNET. URL http://www.cnet.com/uk/news/youtube-defeats-viacom-copyright-lawsuit-again/ (accessed 4.24.14).

Pande, A., Ahuja, V., Sivaraj, R., Baik, E., Mohapatra, P., 2013. Video Delivery Challenges and Opportunities in 4G Networks. IEEE MultiMedia 20, 88–94. doi:10.1109/MMUL.2013.44

Peckham, M., 2011. Mobile Flash Abandoned for HTML5: Adobe Surrenders, Apple Wins? Time.

Raychaudhuri, D., Mandayam, N.B., 2012. Frontiers of Wireless and Mobile Communications. Proceedings of the IEEE 100, 824–840. doi:10.1109/JPROC.2011.2182095

Roettgers, J., 2011a. Skype goes VP8, embraces open video codec. Gigaom.

Roettgers, J., 2011b. The technology behind Google+ Hangouts. Gigaom.

Rupp, M., 2009. Video and multimedia transmissions over cellular networks: analysis, modeling, and optimization in live 3G mobile networks. Wiley, Chichester, West Sussex, U.K.; Hoboken, NJ.


Sager, I., 2012. Before IPhone and Android Came Simon, the First Smartphone. BusinessWeek: technology.

Sippey, M., 2013. Vine: A new way to share video [WWW Document]. Twitter Blogs. URL https://blog.twitter.com/2013/vine-a-new-way-to-share-video (accessed 4.11.14).

Sommerville, I., 2011. Software engineering. Pearson, Boston.

Statistics – YouTube [WWW Document], n.d. URL http://www.youtube.com/yt/press/en-GB/statistics.html (accessed 11.7.13).

Supported Media Formats | Android Developers [WWW Document], n.d. URL http://developer.android.com/guide/appendix/media-formats.html (accessed 3.17.14).

Swider, M., 2013. Microsoft highlights 299M Skype users, 1.5B Halo games played [WWW Document]. TechRadar. URL http://www.techradar.com/news/software/operating-systems/xbox-live-upgrade-includes-300-000-servers-600-times-more-than-its-debut-1161749 (accessed 3.29.14).

Systrom, K., n.d. Instagram Blog [WWW Document]. URL http://blog.instagram.com/post/44078783561/100-million (accessed 3.29.14).

Tassi, P., 2013. The Injustice Of The YouTube Content ID Crackdown Reveals Google’s Dark Side [WWW Document]. Forbes. URL http://www.forbes.com/sites/insertcoin/2013/12/19/the-injustice-of-the-youtube-content-id-crackdown-reveals-googles-dark-side/ (accessed 1.9.14).

Wells, A., 2010. Nearly 8 Million Android Phones Sold in 2009 [WWW Document]. URL http://technorati.com/technology/android/article/nearly-8-million-android-phones-sold/ (accessed 4.22.14).

What are P2P communications? [WWW Document], n.d. URL https://support.skype.com/en/faq/FA10983/what-are-p2p-communications (accessed 3.30.14).

Wheeler, B., 2013. Twitter users: A guide to the law. BBC.

Wiesen, A., 2013. Official Gmail Blog: Making calls from Hangouts — in Gmail and across the web.

Wurm, K., 2013. Skype and a New Audio Codec. Skype Blogs.


Appendix 1 – IPO

Project Title: Multi User Event Streaming Via Mobile Applications

Overview of Project Content and Milestones

The aim of this project is to develop a mobile application that allows users to stream an event from their mobile device. This will be developed in conjunction with a streaming server whose purpose is to retrieve and transfer streamed content. A web application will be developed that will allow viewers to view streamed content whilst the content provider is streaming. A database will also be implemented and hosted on a web server alongside the web application.

The milestones for the project include the completion of the research stage, which contributes towards the literature review. This will provide a basis of support and direction for developing the project. The design and implementation stages of the project are also key milestones, as they capture the considerations surrounding the functionality of the systems, illustrated using aspects of UML.

The Main Deliverables

The main deliverables of the project are an Android application, a database and a web application. These will be provided as a proof of concept with the intent to scale if applied in a real world scenario.

Providing content providers with an alternative technique for streaming events and providing viewers with a dynamic streaming experience is the ultimate achievement of this project. Additional achievements include learning how to develop mobile applications, working with network programming and integrating interactive front end applications.

The Target Audience For The Deliverables

This project is targeted at content providers and viewers. A content provider is any individual or organisation with the intent to provide streamed data of an event to viewers.

Content providers may be audience members at a concert, supporters at a sporting event, gamers at tournaments recording off stream matches, festival attendees streaming activities, music and entertainment, corporation employees streaming meetings and events within a secure environment, attendees at major functions such as ComicCon, Defcon and other events throughout the world. A viewer is an individual or group of people interested in the event being streamed.

The Work To Be Undertaken

The project will require investigation into streaming services; this includes considerations around the implementation of an open source streaming server, the Android API for mobile development, the choice of database, and the back end and front end language choices for the development of the web application on the web server.

There will need to be research into streaming services and the streaming industry in its current state. This will provide a basis for the literature review by establishing an understanding of the technologies involved in streaming content as well as the hardware and software required to meet streaming requirements.

As part of the work to be undertaken, aspects of the software development lifecycle such as design, implementation and testing will be utilised. This includes modelling aspects of the system and establishing the interaction with users using use cases, activity diagrams and class modelling.

The implementation phase will involve the development of the mobile application, web application and interaction of the web server and streaming server with the end user applications.

The testing stage will include unit and integration testing as well as system testing for performance, security and stress. User testing will play a crucial role which will contribute towards the evaluation stage of the project.

The evaluation phase will inspect the process and methods used to manage the workflow as well as a reflection on the tools used to implement the project.

Additional Information / Knowledge Required

New knowledge will be acquired in mobile development and working with the Android API. This involves learning transfer protocols, encoding and decoding techniques, file formats for video and audio, device detection, and working with the device hardware including the camera, as well as the Android class libraries.

A variety of technologies will be utilised to create the project. For the mobile application, Android and Java will be used; the database will be MySQL, with the web application utilising JavaScript, HTML, CSS, JSON, AJAX and jQuery. This will involve a JavaScript framework to allow for the rapid development of the front end of the web application. The back end will be created using Java, PHP or Python.

Mobile development is where current skills will be extended by working with the User Interface library as well as the API. Integrating the streaming server with the mobile application will also be an extension of current skills.

Information Sources That Provide A Context For The Project

Information will be sourced from books on Android development, streaming services and the history of streaming, as well as technical documentation and APIs for Android and Java. These include websites, journal articles and web articles of relevance. Alternative streaming services will be reviewed in order to provide the context for what is currently available.

The Importance of the project

These applications will provide content providers and viewers with a new level of interaction, allowing content providers to reach audiences in an innovative way. Despite the existence of alternative applications, this mobile app aims to connect content providers via a common medium which will be delivered to a viewer or group of viewers. The difference between this project and existing streaming platforms is the way in which viewers have control over what they choose to watch. This can be likened to a security guard in a CCTV room viewing multiple positions within an environment; the viewer will be able to observe multiple streams from content providers streaming from a location and select which versions to watch.

The Key Challenge(S) To Be Overcome

The design and implementation stages of the project are the key challenges that will require attention. This is because the Android development platform is being used for the first time and will therefore require time padding to allow for mistakes and the learning curve to be experienced. This must take into account the interaction with the streaming server, as testing and debugging this process is dependent on the set up of the systems.

Appendix 2 – Report on the Interim Review Meeting


Appendix 3 – Project Diary

18-11-2013

Progress

No work was undertaken on the dissertation – coursework for Software Architecture took priority

This poses a concern as this will affect time allocated to further tasks

Literature review not complete – requires further content

Need to prioritize time without compromising project

02-12-2013

Progress

Working on completing literature review before end of semester

06-01-2014

Progress

Researched alternatives

Set up development environment and UML tools

Tested current applications offering mobile streaming and documented features

Set up libraries and android development environment for testing and debugging

Set up database and started creating database tables and relationships

Continued polishing of literature review – further background information gathering and addition to literature review on some topics

Corrected references and layout

27-01-2014

Progress:

Mobile application being developed.

streaming server set up

Investigated web-based libraries supporting FFmpeg, such as PHP/Java

Progress

Phase 1 of mobile application streaming activated

Web application server set up and libraries added

Written more within honours project.

Currently within implementation phase of project scheduled to be completed by 2nd week in February

All semantics of application considered and under development

17-02-2014

Progress

mobile app streaming

JavaScript server picking up configuration file

ffmpeg capturing feed - able to create multiple formats

web application framework created and sectioned

Challenges

Frequent errors passing live feed to ffserver

Find a suitable solution for rebroadcasting stream once forwarded by ffmpeg.

Solutions

Look into server alternatives, e.g. rtmpd – flash server

FFmpeg can still rebroadcast the stream to alternatives, however options are limited

Version of ffmpeg installed is not current version – reinstalling for performance reasons

Reinstalling ubuntu distribution with minimal configuration

Action

Reinstalled ffmpeg and ffserver based on git repository

This resolved my problem of passing the live stream to the server

Need to use RTMP server to broadcast livestream with cross-browser support

24-02-2014

Progress

Installed rtmp server to capture livestream

Not a lot of work completed due to coursework

Challenges

n/a

Solutions

n/a

Action

Will revisit on or after the 28/04

Will use 2 weeks of final development from next week, other plans will be completed during this time including writing more of honours and completing further sections

03-03-2014

Progress

Modelling system through activity diagram

Installed Nginx and the nginx-rtmp-module to allow live streaming via Flash

Reinstalled FFmpeg etc. on a new virtual machine due to frequent errors

Challenges

ffserver cannot dynamically broadcast a live stream

Installation of new applications unstable, with many dependencies, e.g. Linux version, compiler options, etc.

Inexperience with the Nginx server (solely Apache experience)

Solutions

Minimal integration testing to be executed

To try and achieve a (crude) working model of the desired result

Action

Write and review more areas of honours project

test streaming to rtmp server

verify working and then implement dynamic streaming

10-03-2014

Progress

Dissertation re-organised

The rtmp server is working and receiving streams

The three important areas of the application function

Utilised a feature of Git in which I could test other pieces of code by branching, leaving the working code unaffected

Challenges

To combine the three components of the application to work in a dynamic fashion

This means creating a login process to identify users

Solutions

Action

Work more on the dissertation

work on the dynamic aspects of the streaming server

Appendix 4 – Report on the IPO


Appendix 5 – Network Protocols

Hypertext Transfer Protocol (HTTP)

An application-level, stateless, object-oriented protocol for distributed, collaborative, hypermedia information systems.

Real-Time Streaming Protocol (RTSP)

An IETF application-level protocol used for continuous (streaming) audio and video sessions. It provides the control for playing, stopping and media position control (e.g., fast forward) via bidirectional communication sessions, and embodies an extensible framework to enable controlled, on-demand delivery of real-time audio and video data. It uses the Transmission Control Protocol (TCP) and/or the User Datagram Protocol (UDP), depending on function.
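To illustrate the bidirectional control sessions described above, a typical RTSP exchange between client and server follows the pattern below; the URL, ports and session identifier are hypothetical:

```
C->S: DESCRIBE rtsp://example.com/stream RTSP/1.0
      CSeq: 1
S->C: RTSP/1.0 200 OK
      CSeq: 1
      Content-Type: application/sdp
      (SDP body describing the media follows)
C->S: SETUP rtsp://example.com/stream/track1 RTSP/1.0
      CSeq: 2
      Transport: RTP/AVP;unicast;client_port=5006-5007
S->C: RTSP/1.0 200 OK
      CSeq: 2
      Session: 12345678
C->S: PLAY rtsp://example.com/stream RTSP/1.0
      CSeq: 3
      Session: 12345678
```

After PLAY, media flows as RTP packets on the negotiated ports; a TEARDOWN request ends the session.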

Real-Time Transport Protocol (RTP)

An IETF protocol (a set of commands and processes) that is used to add timing and sequence information to each packet to allow the reassembly of packets to reproduce real-time audio and video information. It is a UDP-based packet format and set of conventions that provides end-to-end network connectivity functions suitable for applications transmitting real-time data, such as audio and video, over multicast or unicast network services.
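The timing and sequence information mentioned above lives in RTP's fixed 12-byte header, as specified by RFC 3550. As a sketch only, not part of the dissertation's implementation, the header can be packed in Node.js as follows; all field values here are hypothetical:

```javascript
// Pack the fixed 12-byte RTP header (RFC 3550) into a Buffer.
function rtpHeader(opts) {
  var buf = Buffer.alloc(12);
  buf.writeUInt8(0x80, 0);                      // V=2, P=0, X=0, CC=0
  buf.writeUInt8(opts.payloadType & 0x7f, 1);   // M=0, 7-bit payload type
  buf.writeUInt16BE(opts.sequence, 2);          // sequence number
  buf.writeUInt32BE(opts.timestamp, 4);         // timestamp (media clock)
  buf.writeUInt32BE(opts.ssrc, 8);              // synchronisation source id
  return buf;
}

// Hypothetical dynamic payload type 96, commonly used for H.264.
var h = rtpHeader({ payloadType: 96, sequence: 1, timestamp: 0, ssrc: 0x1234 });
console.log(h.length);          // 12
console.log(h[0].toString(2));  // "10000000" -> leading bits 10 = version 2
```

The sequence number lets the receiver reorder or detect lost packets; the timestamp drives playout timing.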

Real-Time Transport Control Protocol (RTCP)

(Also known as RTP Control Protocol) Control protocol that works in conjunction with RTP to control performance and for diagnostic purposes. RTCP control packets are periodically transmitted by each participant in an RTP session to all other participants.

Session Description Protocol (SDP)

A media description specification used for describing multimedia sessions for the purposes of session announcement, session invitation and session initiation.
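For illustration, a minimal SDP description for a single H.264 video stream might read as follows; the addresses, port and payload type are hypothetical:

```
v=0
o=- 0 0 IN IP4 192.168.1.20
s=Mobile stream
c=IN IP4 192.168.1.20
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
```

A description of this shape is what the Node.js server in Appendix 7 writes to disk before handing it to FFmpeg as an input.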

Appendix 6 – Development Tools

Android Development Studio

Sublime Text 2

Terminal - Ubuntu

Appendix 7 – Node.js Server application

/**
 * File: Main.js
 * Main configuration
 * Module management handled by requirejs
 */
var requirejs = require('requirejs');

requirejs.config({
    baseUrl: __dirname,
    nodeRequire: require
});

requirejs(['app/ServerToMobile_alt'], function() {
    console.log('main called');
});

/**
 * File: ServerToMobile_alt.js
 */
require(['net', 'http', 'fs', 'module', 'path', 'os', 'events', 'underscore',
    'socket.io', 'app/Events', 'child_process'
], function(net, http, fs, module, path, os, events, _, io, EventManager, cp) {

    var serverDir = path.dirname(module.uri);
    var count = 0;
    var ffmpegexecList = [];
    var activePorts = [];

    // http://stackoverflow.com/questions/3653065/get-local-ip-address-in-node-js/3742915#3742915
    var serverip = _.chain(os.networkInterfaces()).flatten().filter(function(val) {
        return (val.family == 'IPv4' && val.internal == false);
    }).pluck('address').first().value();

    var mainindex = fs.readFileSync('index.html');

    var plainHttpServer = http.createServer(function(request, response) {
        response.writeHead(200, { 'Content-type': 'text/html' });
        response.end(mainindex);
    }).listen(8087);

    var sio = io.listen(plainHttpServer);

    function ffmpegInit(path, user) {
        // Transcode the incoming stream to 320x240 FLV and push it
        // to the local RTMP server under the user's stream name.
        var ffmpegProcess = cp.spawn('ffmpeg', ['-i', path, '-vcodec', 'libx264',
            '-an', '-threads', '2', '-s', '320x240', '-f', 'flv',
            'rtmp://localhost/live/' + user]);

        ffmpegexecList[ffmpegProcess.pid] = user;

        ffmpegProcess.stdout.on('data', function(data) {
            // console.log("stdout: " + data);
        });
        ffmpegProcess.stderr.on('data', function(data) {
            // console.log("stderr: " + data);
        });
        ffmpegProcess.on('exit', function(code) {
            if (ffmpegexecList[this.pid]) {
                sio.sockets.emit('removeStream', JSON.stringify({
                    user: ffmpegexecList[this.pid]
                }));
                // ffmpegexecList is keyed by pid, so remove the entry
                // rather than splicing (which would shift indices).
                delete ffmpegexecList[this.pid];
            }
        });
    }

    var serverToMobile = net.createServer(function(cxnlsnr) {
        cxnlsnr.on('data', function(data) {
            // set up stream location
            var path = serverDir + "/users/" + "stream_" + count + ".sdp";
            var user = 'test' + count;
            var serverurl = serverip;

            // create file to stream to
            var stream = fs.createWriteStream(path);

            stream.once('open', function() {
                stream.write(data.toString());
                stream.end();
            });

            stream.once('close', function() {
                ffmpegInit(path, user);
                sio.sockets.emit('getStream', JSON.stringify({
                    id: user, user: user
                }));
            });

            count++;
        });
    });

    serverToMobile.listen(8089);
});
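The serverip lookup above chains underscore helpers to pick the first external IPv4 address. As a sketch, using a mocked interface table with hypothetical addresses, the same selection can be written in plain Node.js:

```javascript
// Pick the first non-internal IPv4 address from an
// os.networkInterfaces()-shaped map of { name: [addressInfo, ...] }.
function firstExternalIPv4(interfaces) {
  var all = Object.keys(interfaces).reduce(function(acc, name) {
    return acc.concat(interfaces[name]);
  }, []);
  var match = all.filter(function(val) {
    return val.family === 'IPv4' && val.internal === false;
  })[0];
  return match ? match.address : undefined;
}

// Mocked interface table standing in for os.networkInterfaces():
var mock = {
  lo:   [{ address: '127.0.0.1',    family: 'IPv4', internal: true  }],
  eth0: [{ address: '192.168.1.20', family: 'IPv4', internal: false }]
};
console.log(firstExternalIPv4(mock)); // -> 192.168.1.20
```

In the server itself this would be called as firstExternalIPv4(os.networkInterfaces()), skipping the loopback entry so that mobile clients are given a reachable address.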

Appendix 8 – Mobile Application

package com.libstreaming;

import android.os.Bundle;
import android.app.Activity;
import android.util.JsonWriter;
import android.view.Menu;

import java.io.DataOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.